Quantifying introgression risk with realistic population genetics
Ghosh, Atiyo; Meirmans, Patrick G.; Haccou, Patsy
2012-12-07
Introgression is the permanent incorporation of genes from the genome of one population into another. This can have severe consequences, such as extinction of endemic species, or the spread of transgenes. Quantification of the risk of introgression is an important component of genetically modified crop regulation. Most theoretical introgression studies aimed at such quantification disregard one or more of the most important factors concerning introgression: realistic genetical mechanisms, repeated invasions and stochasticity. In addition, the use of linkage as a risk mitigation strategy has not been studied properly yet with genetic introgression models. Current genetic introgression studies fail to take repeated invasions and demographic stochasticity into account properly, and use incorrect measures of introgression risk that can be manipulated by arbitrary choices. In this study, we present proper methods for risk quantification that overcome these difficulties. We generalize a probabilistic risk measure, the so-called hazard rate of introgression, for application to introgression models with complex genetics and small natural population sizes. We illustrate the method by studying the effects of linkage and recombination on transgene introgression risk at different population sizes. PMID:23055068
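The hazard rate of introgression described in this abstract can in principle be estimated from repeated stochastic simulations of invasion. A minimal sketch, assuming a toy model with a constant per-generation introgression probability — this is not the authors' genetic model, and every parameter here is hypothetical:

```python
import random

def simulate_first_introgression(p_per_gen, max_gen=10_000, rng=random):
    """Generation of the first permanent introgression event under a toy
    model with a constant per-generation probability (hypothetical)."""
    for gen in range(1, max_gen + 1):
        if rng.random() < p_per_gen:
            return gen
    return max_gen  # right-censored at max_gen

def estimate_hazard_rate(waiting_times):
    """For geometric waiting times, the constant hazard rate is roughly
    the reciprocal of the mean waiting time."""
    return len(waiting_times) / sum(waiting_times)

rng = random.Random(42)
times = [simulate_first_introgression(0.02, rng=rng) for _ in range(5_000)]
hazard = estimate_hazard_rate(times)
```

In the paper's setting the hazard rate depends on genetics and population size, so it would be estimated per scenario rather than assumed constant as in this sketch.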
Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias
2007-01-10
The performances of the GC-MS determination of suspected allergens in fragrance concentrates have been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion or once every 144 analyses for three ions in common.
Detection and Quantification of Human Fecal Pollution with Real-Time PCR
Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described ...
A fault tree model to assess probability of contaminant discharge from shipwrecks.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I
2014-11-15
Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
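Fault-tree quantification of an annual discharge probability, as described above, combines basic-event probabilities through OR and AND gates. A minimal sketch under an independence assumption; the event names and probabilities below are purely hypothetical, not the study's tree:

```python
# Basic events combined through OR/AND gates, assuming independence.
def or_gate(probs):
    """P(at least one event occurs) = 1 - prod(1 - p_i)."""
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

def and_gate(probs):
    """P(all events occur) = prod(p_i)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

# Hypothetical annual basic-event probabilities:
hull_corrosion, storm_damage, anchor_strike = 0.05, 0.02, 0.01
containment_failure = or_gate([hull_corrosion, storm_damage, anchor_strike])
substance_still_present = 0.8  # hypothetical
p_discharge = and_gate([containment_failure, substance_still_present])
```

The independence assumption is what makes the gate algebra this simple; correlated hazards (e.g. storms driving anchor strikes) would need a joint model.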
DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR
Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...
Integrating public risk perception into formal natural hazard risk assessment
NASA Astrophysics Data System (ADS)
Plattner, Th.; Plapp, T.; Hebel, B.
2006-06-01
An urgent need to take perception into account in risk assessment has been pointed out in the relevant literature; its impact on individuals' risk-related behaviour is obvious. This study represents an effort to overcome the broadly discussed question of whether risk perception is quantifiable or not by proposing a simple yet applicable methodology. A novel approach is elaborated to obtain a more accurate and comprehensive quantification of risk in comparison to present formal risk evaluation practice. A consideration of relevant factors enables an explicit quantification of individual risk perception and evaluation. The model approach integrates the effective individual risk reff and a weighted mean of relevant perception-affecting factors (PAF). The relevant PAF cover voluntariness of risk-taking, individual reducibility of risk, knowledge and experience, endangerment, subjective damage rating and subjective recurrence frequency perception. The approach assigns an individual weight to each PAF to represent its impact magnitude. The quantification of these weights is target-group-dependent (e.g. experts, laypersons) and may be effected by psychometric methods. The novel approach is subject to a plausibility check using data from an expert workshop. A first model application is conducted by means of data from an empirical risk perception study in Western Germany to deduce PAF and weight quantification as well as to confirm and evaluate model applicability and flexibility. Main fields of application will be quantification of risk perception by individual persons in a formal and technical way, e.g. for risk communication purposes, illustrating the differing perspectives of experts and non-experts. For decision-making processes this model will have to be applied with caution, since it is by definition not designed to quantify risk acceptance or risk evaluation. The approach may well explain how risk perception differs, but not why it differs.
The formal model generates only "snap shots" and considers neither the socio-cultural nor the historical context of risk perception, since it is a highly individualistic and non-contextual approach.
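The weighted-mean structure described in this record can be sketched as follows. The scores, the uniform weights, and the way reff is combined with the PAF mean are all hypothetical, since the abstract does not give the exact formula:

```python
# Hypothetical PAF scores on a 0-to-1 scale; the paper's actual scales,
# weights, and combination rule are not given in the abstract.
paf_scores = {
    "voluntariness_of_risk_taking": 0.2,
    "individual_reducibility": 0.4,
    "knowledge_and_experience": 0.6,
    "endangerment": 0.7,
    "subjective_damage_rating": 0.5,
    "subjective_recurrence_frequency": 0.3,
}
weights = {name: 1.0 for name in paf_scores}  # target-group-dependent

def weighted_paf_mean(scores, weights):
    """Weighted mean of the perception-affecting factors."""
    total_weight = sum(weights.values())
    return sum(weights[n] * s for n, s in scores.items()) / total_weight

r_eff = 1e-4  # hypothetical effective individual risk
perceived_risk = r_eff * (1.0 + weighted_paf_mean(paf_scores, weights))
```

In practice the weights would come from psychometric elicitation per target group (experts vs. laypersons), which is the part of the method the sketch leaves out.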
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies this method for daily river flow rate forecast and uncertainty quantification for Zhujiachuan River using data collected from Qiaotoubao Gage Station and other 13 gage stations in Zhujiachuan watershed in China. The proposed method is also compared with the conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of daily flow rate, it performs over the conventional MLE method on uncertainty quantification, providing relatively narrower reliable interval than the MLE confidence interval and thus more precise estimation by using the related information from regional gage stations. The Bayesian MCMC method might be more favorable in the uncertainty analysis and risk management.
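A minimal Metropolis-Hastings sketch in the spirit of the method described above, using synthetic data, a normal likelihood with known variance, and a flat prior — the paper's hydrological likelihood and priors are not reproduced here:

```python
import math
import random

# Synthetic "daily flow rate" data (hypothetical, stands in for gage records).
rng = random.Random(0)
sigma = 2.0
data = [rng.gauss(10.0, sigma) for _ in range(200)]

def log_likelihood(mu):
    return sum(-0.5 * ((x - mu) / sigma) ** 2 for x in data)

def metropolis_hastings(n_iter=20_000, step=0.2, burn_in=5_000):
    """Random-walk Metropolis-Hastings over the mean flow rate."""
    mu = 0.0
    ll = log_likelihood(mu)
    samples = []
    for _ in range(n_iter):
        proposal = mu + rng.gauss(0.0, step)
        ll_prop = log_likelihood(proposal)
        # Flat prior: the acceptance ratio reduces to the likelihood ratio.
        if math.log(rng.random()) < ll_prop - ll:
            mu, ll = proposal, ll_prop
        samples.append(mu)
    return samples[burn_in:]

samples = sorted(metropolis_hastings())
lo = samples[int(0.025 * len(samples))]  # 95% credible interval, lower
hi = samples[int(0.975 * len(samples))]  # 95% credible interval, upper
```

The (lo, hi) quantiles of the posterior sample play the role of the "reliable interval" the abstract compares against the MLE confidence interval.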
Martins, Ayrton F; Frank, Carla da S; Altissimo, Joseline; de Oliveira, Júlia A; da Silva, Daiane S; Reichert, Jaqueline F; Souza, Darliana M
2017-08-24
Statins are classified as being amongst the most prescribed agents for treating hypercholesterolaemia and preventing vascular diseases. In this study, a rapid and effective liquid chromatography method, assisted by diode array detection, was designed and validated for the simultaneous quantification of atorvastatin (ATO) and simvastatin (SIM) in hospital effluent samples. The solid phase extraction (SPE) of the analytes was optimized regarding sorbent material and pH, and the dispersive liquid-liquid microextraction (DLLME), in terms of pH, ionic strength, type and volume of extractor/dispersor solvents. The performance of both extraction procedures was evaluated in terms of linearity, quantification limits, accuracy (recovery %), precision and matrix effects for each analyte. The methods proved to be linear in the concentration range considered; the quantification limits were 0.45 µg L⁻¹ for ATO and 0.75 µg L⁻¹ for SIM; the matrix effect was almost absent in both methods and the average recoveries remained between 81.5-90.0%; and the RSD values were <20%. The validated methods were applied to the quantification of the statins in real samples of hospital effluent; the concentrations ranged from 18.8 µg L⁻¹ to 35.3 µg L⁻¹ for ATO, and from 30.3 µg L⁻¹ to 38.5 µg L⁻¹ for SIM. Since the calculated risk quotient was ≤192, the occurrence of ATO and SIM in hospital effluent poses a potentially serious risk to human health and the aquatic ecosystem.
EPRI/NRC-RES fire human reliability analysis guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan
2010-03-01
During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities' which addressed fire risk for at power operations. NUREG/CR-6850 developed high level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire generated conditions, building upon existing human reliability analysis (HRA) methods.
This document provides a methodology and guidance for conducting a fire HRA. The process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. Three approaches to quantification are provided: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification; it is intended to provide less conservative HEPs than screening while requiring fewer resources than a detailed HRA. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.
Legionella detection by culture and qPCR: Comparing apples and oranges.
Whiley, Harriet; Taylor, Michael
2016-01-01
Legionella spp. are the causative agents of Legionnaires' disease and opportunistic pathogens of significant public health concern. Identification and quantification from environmental sources is crucial for identifying outbreak origins and providing sufficient information for risk assessment and disease prevention. Currently there are a range of methods for Legionella spp. quantification from environmental sources, but the two most widely used and accepted are culture and real-time polymerase chain reaction (qPCR). This paper provides a review of these two methods and outlines their advantages and limitations. Studies from the last 10 years which have concurrently used culture and qPCR to quantify Legionella spp. from environmental sources have been compiled. 26/28 studies detected Legionella at a higher rate using qPCR compared to culture, whilst only one study detected equivalent levels of Legionella spp. using both qPCR and culture. Aggregating the environmental samples from all 28 studies, 2856/3967 (72%) tested positive for the presence of Legionella spp. using qPCR and 1331/3967 (34%) using culture. The lack of correlation between methods highlights the need to develop an acceptable standardized method for quantification that is sufficient for risk assessment and management of this human pathogen.
Cryar, Adam; Pritchard, Caroline; Burkitt, William; Walker, Michael; O'Connor, Gavin; Burns, Duncan Thorburn; Quaglia, Milena
2013-01-01
Current routine food allergen quantification methods, which are based on immunochemistry, offer high sensitivity but can suffer from issues of specificity and significant variability of results. MS approaches have been developed, but currently lack metrological traceability. A feasibility study on the application of metrologically traceable MS-based reference procedures was undertaken. A proof of concept involving proteolytic digestion and isotope dilution MS for quantification of protein allergens in a food matrix was undertaken using lysozyme in wine as a model system. A concentration of lysozyme in wine of 0.95 +/- 0.03 microg/g was calculated based on the concentrations of two peptides, confirming that this type of analysis is viable at allergenically meaningful concentrations. The challenges associated with this promising method were explored; these included peptide stability, chemical modification, enzymatic digestion, and sample cleanup. The method is suitable for the production of allergen-in-food certified reference materials, which, together with the achieved understanding of the effects of sample preparation and of the matrix on the final results, will assist in addressing the bias of the techniques routinely used and improve measurement confidence. Confirmation of the feasibility of MS methods for absolute quantification of an allergenic protein in a food matrix with results traceable to the International System of Units is a step towards meaningful comparison of results for allergen proteins among laboratories. This approach will also underpin risk assessment and risk management of allergens in the food industry, and regulatory compliance of the use of thresholds or action levels when adopted.
Robust approaches to quantification of margin and uncertainty for sparse data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin
Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
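The tail-extrapolation risk characterized above can be illustrated with a small numerical sketch: fit a normal distribution to heavier-tailed (Laplace) data and compare its extrapolated tail probability with the true one. The distributions and threshold are illustrative choices, not the project's models:

```python
import math
import random
import statistics

rng = random.Random(1)

def sample_laplace(n, scale=1.0):
    """Inverse-CDF sampling from a Laplace(0, scale) distribution."""
    out = []
    for _ in range(n):
        u = rng.random() - 0.5
        sign = 1.0 if u >= 0 else -1.0
        out.append(-scale * sign * math.log(1.0 - 2.0 * abs(u)))
    return out

data = sample_laplace(10_000)
mu, sd = statistics.fmean(data), statistics.stdev(data)

threshold = 6.0
z = (threshold - mu) / sd
p_normal_tail = 0.5 * math.erfc(z / math.sqrt(2.0))  # fitted-normal estimate
p_true_tail = 0.5 * math.exp(-threshold)             # exact Laplace tail
# The fitted normal underestimates the true tail probability by roughly
# two orders of magnitude here: the extrapolation risk in miniature.
```

The fitted model matches the bulk of the data well, which is exactly why the tail error is easy to miss without the kind of validation the abstract argues for.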
Gyawali, P
2018-02-01
Raw and partially treated wastewater has been widely used to meet global water demand. The presence of viable helminth ova and larvae in wastewater raises significant public health concern, especially when it is used for agriculture and aquaculture. Depending on the prevalence of helminth infections in communities, up to 1.0 × 10³ ova/larvae can be present per litre of wastewater and per 4 g (dry weight) of sludge. Multi-barrier approaches including pathogen reduction, risk assessment, and exposure reduction have been suggested by health regulators to minimise the potential health risk. However, lacking a sensitive and specific method for the quantitative detection of viable helminth ova in wastewater, an accurate health risk assessment is difficult to achieve. As a result, helminth infections remain difficult to control in communities despite two decades of global effort (mass drug administration). Molecular methods can be more sensitive and specific than the currently adopted culture-based and vital-stain methods. Molecular methods, however, require more thorough investigation of their ability to accurately quantify viable helminth ova/larvae in wastewater and sludge samples. Understanding the different cell stages and corresponding gene copy numbers is pivotal for accurate quantification of helminth ova/larvae in wastewater samples. Identifying specific markers, including proteins, lipids, and metabolites, using a multiomics approach could enable cheap, rapid, sensitive, specific and point-of-care detection tools for helminth ova and larvae in wastewater.
Delgado-Goñi, Teresa; Campo, Sonia; Martín-Sitjar, Juana; Cabañas, Miquel E; San Segundo, Blanca; Arús, Carles
2013-08-01
In most plants, sucrose is the primary product of photosynthesis, the transport form of assimilated carbon, and also one of the main factors determining sweetness in fresh fruits. Traditional methods for sugar quantification (mainly sucrose, glucose and fructose) require obtaining crude plant extracts, which sometimes involves substantial sample manipulation, making the process time-consuming and increasing the risk of sample degradation. Here, we describe and validate a fast method to determine sugar content in intact plant tissue by using high-resolution magic angle spinning nuclear magnetic resonance spectroscopy (HR-MAS NMR). The HR-MAS NMR method was used for quantifying sucrose, glucose and fructose in mesocarp tissues from melon fruits (Cucumis melo var. reticulatus and Cucumis melo var. cantalupensis). The resulting sugar content varied among individual melons, ranging from 1.4 to 7.3 g of sucrose, 0.4 to 2.5 g of glucose, and 0.73 to 2.83 g of fructose (values per 100 g fw). These values were in agreement with those described in the literature for melon fruit tissue, and no significant differences were found when comparing them with those obtained using the traditional enzymatic procedure on melon tissue extracts. HR-MAS NMR offers a fast (usually <30 min) and sensitive method for sugar quantification in intact plant tissues; it requires a small amount of tissue (typically 50 mg fw) and avoids the interferences and risks associated with obtaining plant extracts. Furthermore, this method might also allow the quantification of additional metabolites detectable in the plant tissue NMR spectrum.
Quantification of free circulating tumor DNA as a diagnostic marker for breast cancer.
Catarino, Raquel; Ferreira, Maria M; Rodrigues, Helena; Coelho, Ana; Nogal, Ana; Sousa, Abreu; Medeiros, Rui
2008-08-01
To determine whether the amounts of circulating DNA could discriminate between breast cancer patients and healthy individuals by using real-time PCR quantification methodology. Our standard protocol for quantification of cell-free plasma DNA involved 175 consecutive patients with breast cancer and 80 healthy controls. We found increased levels of circulating DNA in breast cancer patients compared to control individuals (105.2 vs. 77.06 ng/mL, p < 0.001). We also found statistically significant differences in circulating DNA amounts in patients before and after breast surgery (105.2 vs. 59.0 ng/mL, p = 0.001). Increased plasma cell-free DNA concentration was a strong risk factor for breast cancer, conferring an increased risk for the presence of this disease (OR, 12.32; 95% CI, 2.09-52.28; p < 0.001). Quantification of circulating DNA by real-time PCR may be a good and simple tool for detection of breast cancer with a potential to clinical applicability together with other current methods used for monitoring the disease.
Reiter, Rolf; Wetzel, Martin; Hamesch, Karim; Strnad, Pavel; Asbach, Patrick; Haas, Matthias; Siegmund, Britta; Trautwein, Christian; Hamm, Bernd; Klatt, Dieter; Braun, Jürgen; Sack, Ingolf; Tzschätzsch, Heiko
2018-01-01
Although it has been known for decades that patients with alpha1-antitrypsin deficiency (AATD) have an increased risk of cirrhosis and hepatocellular carcinoma, limited data exist on non-invasive imaging-based methods for assessing liver fibrosis such as magnetic resonance elastography (MRE) and acoustic radiation force impulse (ARFI) quantification, and no data exist on 2D-shear wave elastography (2D-SWE). Therefore, the purpose of this study is to evaluate and compare the applicability of different elastography methods for the assessment of AATD-related liver fibrosis. Fifteen clinically asymptomatic AATD patients (11 homozygous PiZZ, 4 heterozygous PiMZ) and 16 matched healthy volunteers were examined using MRE and ARFI quantification. Additionally, patients were examined with 2D-SWE. A high correlation is evident for the shear wave speed (SWS) determined with different elastography methods in AATD patients: 2D-SWE/MRE, ARFI quantification/2D-SWE, and ARFI quantification/MRE (R = 0.8587, 0.7425, and 0.6914, respectively; P≤0.0089). Four AATD patients with pathologically increased SWS were consistently identified with all three methods-MRE, ARFI quantification, and 2D-SWE. The high correlation and consistent identification of patients with pathologically increased SWS using MRE, ARFI quantification, and 2D-SWE suggest that elastography has the potential to become a suitable imaging tool for the assessment of AATD-related liver fibrosis. These promising results provide motivation for further investigation of non-invasive assessment of AATD-related liver fibrosis using elastography.
Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R
2011-01-01
Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, also the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
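Combining fault-tree risk reduction with cost-effectiveness analysis, as in the approach above, can be sketched by ranking measures on cost per unit of annual risk reduction. All measure names and numbers below are hypothetical, not the paper's case study:

```python
# Hypothetical measures: (annual cost, residual annual probability of
# failing the water safety target after implementing the measure).
baseline_risk = 0.12

measures = {
    "extra monitoring": (40_000, 0.09),
    "new UV barrier": (150_000, 0.04),
    "backup raw-water source": (300_000, 0.02),
}

def cost_effectiveness(cost, residual_risk):
    """Cost per unit of annual risk reduction (lower is better)."""
    return cost / (baseline_risk - residual_risk)

ranked = sorted(measures, key=lambda name: cost_effectiveness(*measures[name]))
most_cost_effective = ranked[0]
```

In the paper's framework the residual risks would come from the probabilistic fault-tree model of the whole source-to-tap system rather than being fixed inputs, which is what guards against sub-optimising a single subsystem.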
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joe, Jeffrey C.; Boring, Ronald L.
Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating HEPs. This paper describes research that has the objective to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.
Gibby, Jacob T; Njeru, Dennis K; Cvetko, Steve T; Heiny, Eric L; Creer, Andrew R; Gibby, Wendell A
We correlate and evaluate the accuracy of accepted anthropometric methods of percent body fat (%BF) quantification, namely, hydrostatic weighing (HW) and air displacement plethysmography (ADP), to 2 automatic adipose tissue quantification methods using computed tomography (CT). Twenty volunteer subjects (14 men, 6 women) received head-to-toe CT scans. Hydrostatic weighing and ADP were obtained from 17 and 12 subjects, respectively. The CT data underwent conversion using 2 separate algorithms, namely, the Schneider method and the Beam method, to convert Hounsfield units to their respective tissue densities. The overall mass and %BF of both methods were compared with HW and ADP. When comparing ADP to CT data using the Schneider method and Beam method, correlations were r = 0.9806 and 0.9804, respectively. Paired t tests indicated there were no statistically significant biases. Additionally, observed average differences in %BF between ADP and the Schneider method and the Beam method were 0.38% and 0.77%, respectively. The %BF measured from ADP, the Schneider method, and the Beam method all had significantly higher mean differences when compared with HW (3.05%, 2.32%, and 1.94%, respectively). We have shown that total body mass correlates remarkably well with both the Schneider method and Beam method of mass quantification. Furthermore, %BF calculated with the Schneider method and Beam method CT algorithms correlates remarkably well with ADP. The application of these CT algorithms has utility in further research to accurately stratify risk factors associated with periorgan, visceral, and subcutaneous adipose tissue, and has the potential for significant clinical application.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravindra, M.K.; Banon, H.
1992-07-01
In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals: (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described.
Quantitative PCR for Genetic Markers of Human Fecal Pollution
Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for quantification of two recently described human-...
[Imaging of diabetic osteopathy].
Patsch, J; Pietschmann, P; Schueller-Weidekamm, C
2015-04-01
Diabetic bone diseases are more than just osteoporosis in patients with diabetes mellitus (DM): a relatively high bone mineral density is paired with a paradoxically high risk of fragility fractures. Diabetics exhibit low bone turnover, osteocyte dysfunction, relative hypoparathyroidism and an accumulation of advanced glycation end products in the bone matrix. Besides typical insufficiency fractures, diabetics show a high risk for peripheral fractures of the lower extremities (e.g. metatarsal fractures). The correct interdisciplinary assessment of fracture risk in patients with DM is therefore a clinical challenge. There are two state-of-the-art imaging methods for the quantification of fracture risk: dual-energy X-ray absorptiometry (DXA) and quantitative computed tomography (QCT). Radiography, multidetector computed tomography (MDCT) and magnetic resonance imaging (MRI) are suitable for the detection of insufficiency fractures. Novel research imaging techniques, such as high-resolution peripheral quantitative computed tomography (HR-pQCT), provide non-invasive insights into bone microarchitecture of the peripheral skeleton. Using MR spectroscopy, bone marrow composition can be studied. Both methods have been shown to be capable of discriminating between type 2 diabetic patients with and without prevalent fragility fractures and thus bear the potential of improving the current standard of care. Currently both methods remain limited to clinical research applications. DXA and HR-pQCT are valid tools for the quantification of bone mineral density and assessment of fracture risk in patients with DM, especially if interpreted in the context of clinical risk factors. Radiography, CT and MRI are suitable for the detection of insufficiency fractures.
Quantitative characterization of fatty liver disease using x-ray scattering
NASA Astrophysics Data System (ADS)
Elsharkawy, Wafaa B.; Elshemey, Wael M.
2013-11-01
Nonalcoholic fatty liver disease (NAFLD) is a dynamic condition in which fat abnormally accumulates within the hepatocytes. It is believed to be a marker of risk of later chronic liver diseases, such as liver cirrhosis and carcinoma. The fat content in liver biopsies determines its validity for liver transplantation. Transplantation of livers with severe NAFLD is associated with a high risk of primary non-function. Moreover, NAFLD is recognized as a clinically important feature that influences patient morbidity and mortality after hepatic resection. Unfortunately, there is a lack of a precise, reliable and reproducible method for quantification of NAFLD. This work suggests a method for the quantification of NAFLD. The method is based on the fact that fatty liver tissue would have a characteristic x-ray scattering profile with a relatively intense fat peak at a momentum transfer value of 1.1 nm⁻¹ compared to a soft tissue peak at 1.6 nm⁻¹. The fat content in normal and fatty liver is plotted against three profile characterization parameters (ratio of peak intensities, ratio of area under peaks and ratio of area under fat peak to total profile area) for measured and Monte Carlo simulated x-ray scattering profiles. Results show a high linear dependence (R² > 0.9) of the characterization parameters on the liver fat content with a reported high correlation coefficient (>0.9) between measured and simulated data. These results indicate that the current method probably offers reliable quantification of fatty liver disease.
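The three profile-characterization parameters described above can be sketched numerically. The scatter profile below is a synthetic two-Gaussian stand-in; only the peak positions (1.1 and 1.6 nm⁻¹) come from the text, and the peak amplitudes and widths are assumptions.

```python
import numpy as np

q = np.linspace(0.5, 2.5, 201)  # momentum-transfer grid, nm^-1 (step 0.01)
profile = (0.8 * np.exp(-((q - 1.1) / 0.15) ** 2)     # fat peak at 1.1 nm^-1
           + 1.0 * np.exp(-((q - 1.6) / 0.15) ** 2))  # soft-tissue peak at 1.6 nm^-1

fat_peak = profile[np.argmin(np.abs(q - 1.1))]
soft_peak = profile[np.argmin(np.abs(q - 1.6))]
split = q < 1.35                                      # midpoint between the two peaks

ratio_intensity = fat_peak / soft_peak                      # ratio of peak intensities
ratio_area = profile[split].sum() / profile[~split].sum()   # ratio of areas under peaks
ratio_total = profile[split].sum() / profile.sum()          # fat-peak area / total area
```

In the study, each such parameter would then be regressed against the measured liver fat content to obtain the reported linear dependence.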
NASA Astrophysics Data System (ADS)
Haining, Wang; Lei, Wang; Qian, Zhang; Zongqiang, Zheng; Hongyu, Zhou; Chuncheng, Gao
2018-03-01
To address the uncertainty inherent in the comprehensive evaluation of supervision risk in electricity transactions, this paper uses unidentified rational numbers to evaluate supervision risk, obtaining the possible results of the evaluation with their corresponding credibilities and realizing the quantification of risk indexes. The model yields the risk degree of the various indexes, which makes it easier for electricity transaction supervisors to identify transaction risk and determine the risk level, assisting decision-making and realizing effective supervision of the risk. The results of a case analysis verify the effectiveness of the model.
Quantitative PCR for genetic markers of human fecal pollution
Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for enumeration of two recently described hum...
Gao, He-Gang; Gong, Wen-Jie; Zhao, Yong-Gang
2015-01-01
Synthetic pigments are still used instead of natural pigments in many foods and their residues in food could be an important risk to human health. A simple and rapid analytical method combining a low-cost extraction protocol with ultra-fast liquid chromatography-tandem quadrupole mass spectrometry (UFLC-MS/MS) was developed for the simultaneous determination of seven synthetic pigments used in colored Chinese steamed buns. For the first time, ethanol/ammonia solution/water (7:2:1, v/v/v) was used as the extraction solution for the synthetic pigments in colored Chinese steamed buns. The results showed that this extraction solution was more effective than the citric acid solution used in the polyamide adsorption method. The limits of quantification for the seven synthetic pigments ranged from 0.15 to 0.50 μg/kg. The present method was successfully applied to samples of colored Chinese steamed buns for food-safety risk monitoring in Zhejiang Province, China. Sunset yellow pigment was found in six out of 300 colored Chinese steamed buns (0.50 to 32.6 μg/kg).
Chevolleau, S; Noguer-Meireles, M-H; Jouanin, I; Naud, N; Pierre, F; Gueraud, F; Debrauwer, L
2018-04-15
Red or processed meat rich diets have been shown to be associated with an elevated risk of colorectal cancer (CRC). One major hypothesis involves dietary heme iron, which induces lipid peroxidation. The quantification of the resulting reactive aldehydes (e.g. HNE and HHE) in the colon lumen is therefore of great concern since these compounds are known for their cytotoxic and genotoxic properties. A UHPLC-ESI-MS/MS method has been developed and validated for HNE and HHE quantification in rat faeces. Samples were derivatised using a brominated reagent (BBHA) in the presence of pre-synthesized deuterated internal standards (HNE-d11/HHE-d5), extracted by solid phase extraction, and then analysed by LC-positive ESI-MS/MS (MRM) on a TSQ Vantage mass spectrometer. The use of BBHA allowed the efficient stabilisation of the unstable and reactive hydroxy-alkenals HNE and HHE. The MRM method allowed selective detection of HNE and HHE on the basis of characteristic transitions monitored from both the 79 and 81 bromine isotopic peaks. This method was validated according to the European Medicines Agency (EMEA) guidelines, by determining selectivity, sensitivity, linearity, carry-over effect, recovery, matrix effect, repeatability, trueness and intermediate precision. The performance of the method enabled the quantification of HNE and HHE at concentrations of 0.10-0.15 μM in faecal water. Results are presented for the quantification of HNE and HHE in different faecal waters obtained from faeces of rats fed diets with various fatty acid compositions, corresponding to different pro-oxidative features. Copyright © 2018 Elsevier B.V. All rights reserved.
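Quantification against a deuterated internal standard, as used above, reduces to a peak-area-ratio calculation. This is a minimal sketch of that stable-isotope-dilution arithmetic; the peak areas and response factor are hypothetical, not values from the study.

```python
def analyte_conc(area_analyte, area_istd, conc_istd, response_factor=1.0):
    """Concentration from the analyte/internal-standard peak-area ratio.

    response_factor corrects for unequal detector response between the
    analyte and its labelled standard (1.0 assumes identical response).
    """
    return (area_analyte / area_istd) * conc_istd / response_factor

# Hypothetical HNE peak area vs. HNE-d11 area, with 0.10 uM internal standard:
c = analyte_conc(5.2e4, 4.0e4, conc_istd=0.10)
print(f"HNE in faecal water: {c:.2f} uM")
```

The result lands in the 0.10-0.15 μM range the method was validated for.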
In this study we have developed a novel method to estimate in vivo rates of metabolism in unanesthetized fish. This method provides a basis for evaluating the accuracy of in vitro-in vivo metabolism extrapolations. As such, this research will lead to improved risk assessments f...
A quantitative polymerase chain reaction (qPCR) method for the detection of entercocci fecal indicator bacteria has been shown to be generally applicable for the analysis of temperate fresh (Great Lakes) and marine coastal waters and for providing risk-based determinations of wat...
Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2014-01-01
Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with emphasis on food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
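The first step described above, identifying total fat "in the feature space of Hounsfield units", amounts to HU-window thresholding. A minimal sketch follows; the window of -190 to -30 HU is a commonly used adipose range, assumed here rather than taken from this study.

```python
import numpy as np

def fat_mask(ct_slice, lo=-190, hi=-30):
    """Boolean mask of voxels whose Hounsfield unit falls in the adipose range."""
    return (ct_slice >= lo) & (ct_slice <= hi)

# Hypothetical 3x3 patch of HU values (air, fat, soft tissue mixed):
slice_hu = np.array([[-500, -100, -60],
                     [  40, -150,  80],
                     [ -20, -250, -90]])
mask = fat_mask(slice_hu)
print(int(mask.sum()), "fat voxels")
```

In the full method, this mask would then be split into subcutaneous and visceral components, and food-residue voxels removed by the texture-based classifier.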
Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals
Crooks, Kevin R.; Burdett, Christopher L.; Theobald, David M.; King, Sarah R. B.; Rondinini, Carlo; Boitani, Luigi
2017-01-01
Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world’s terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world’s terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation. PMID:28673992
Quantification of habitat fragmentation reveals extinction risk in terrestrial mammals.
Crooks, Kevin R; Burdett, Christopher L; Theobald, David M; King, Sarah R B; Di Marco, Moreno; Rondinini, Carlo; Boitani, Luigi
2017-07-18
Although habitat fragmentation is often assumed to be a primary driver of extinction, global patterns of fragmentation and its relationship to extinction risk have not been consistently quantified for any major animal taxon. We developed high-resolution habitat fragmentation models and used phylogenetic comparative methods to quantify the effects of habitat fragmentation on the world's terrestrial mammals, including 4,018 species across 26 taxonomic Orders. Results demonstrate that species with more fragmentation are at greater risk of extinction, even after accounting for the effects of key macroecological predictors, such as body size and geographic range size. Species with higher fragmentation had smaller ranges and a lower proportion of high-suitability habitat within their range, and most high-suitability habitat occurred outside of protected areas, further elevating extinction risk. Our models provide a quantitative evaluation of extinction risk assessments for species, allow for identification of emerging threats in species not classified as threatened, and provide maps of global hotspots of fragmentation for the world's terrestrial mammals. Quantification of habitat fragmentation will help guide threat assessment and strategic priorities for global mammal conservation.
2015-04-15
...the random variable of interest is viewed in concert with a related random vector that helps to manage, predict, and mitigate the risk in the original variable. Residual risk can be exemplified as a quantification of the improved situation faced...
Esquinas, Pedro L; Uribe, Carlos F; Gonzalez, M; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna
2017-07-20
The main applications of 188 Re in radionuclide therapies include trans-arterial liver radioembolization and palliation of painful bone-metastases. In order to optimize 188 Re therapies, the accurate determination of radiation dose delivered to tumors and organs at risk is required. Single photon emission computed tomography (SPECT) can be used to perform such dosimetry calculations. However, the accuracy of dosimetry estimates strongly depends on the accuracy of activity quantification in 188 Re images. In this study, we performed a series of phantom experiments aiming to investigate the accuracy of activity quantification for 188 Re SPECT using high-energy and medium-energy collimators. Objects of different shapes and sizes were scanned in Air, non-radioactive water (Cold-water) and water with activity (Hot-water). The ordered subset expectation maximization algorithm with clinically available corrections (CT-based attenuation, triple-energy window (TEW) scatter, and resolution recovery) was used. For high activities, dead-time corrections were applied. The accuracy of activity quantification was evaluated using the ratio of the reconstructed activity in each object to this object's true activity. Each object's activity was determined with three segmentation methods: a 1% fixed threshold (for cold background), a 40% fixed threshold and a CT-based segmentation. Additionally, the activity recovered in the entire phantom, as well as the average activity concentration of the phantom background, were compared to their true values. Finally, Monte-Carlo simulations of a commercial γ-camera were performed to investigate the accuracy of the TEW method. Good quantification accuracy (errors <10%) was achieved for the entire phantom, the hot-background activity concentration and for objects in cold background segmented with a 1% threshold. However, the accuracy of activity quantification for objects segmented with 40% threshold or CT-based methods decreased (errors >15%), mostly due to partial-volume effects. The Monte-Carlo simulations confirmed that TEW scatter correction applied to 188 Re, although practical, yields only approximate estimates of the true scatter.
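The TEW scatter correction evaluated above estimates scatter under the photopeak by trapezoidal interpolation from two narrow flanking energy windows. A sketch of that standard formula follows; all window widths and counts are hypothetical.

```python
def tew_scatter(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Estimated scatter counts in the photopeak window (trapezoidal TEW rule)."""
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

c_peak = 10000.0                       # total counts in the photopeak window
scatter = tew_scatter(c_lower=800.0, c_upper=400.0,
                      w_lower=6.0, w_upper=6.0, w_peak=30.0)  # widths in keV
primary = c_peak - scatter             # scatter-corrected (primary) counts
print(scatter, primary)
```

As the abstract notes, for 188 Re this estimate is practical but only approximate.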
Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution
Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...
Validating the Use of Performance Risk Indices for System-Level Risk and Maturity Assessments
NASA Astrophysics Data System (ADS)
Holloman, Sherrica S.
With pressure on the U.S. Defense Acquisition System (DAS) to reduce cost overruns and schedule delays, system engineers' performance is only as good as their tools. Recent literature details a need for 1) objective, analytical risk quantification methodologies over traditional subjective qualitative methods such as expert judgment, and 2) mathematically rigorous system-level maturity assessments. The Mahafza, Componation, and Tippett (2005) Technology Performance Risk Index (TPRI) ties the assessment of technical performance to the quantification of risk of unmet performance; however, it is structured for component-level data as input. This study's aim is to establish a modified TPRI with system-level data as model input, and then validate the modified index with actual system-level data from the Department of Defense's (DoD) Major Defense Acquisition Programs (MDAPs). This work's contribution is the establishment and validation of the System-level Performance Risk Index (SPRI). With the introduction of the SPRI, system-level metrics are better aligned, allowing for better assessment, tradeoff and balance of time, performance and cost constraints. This will allow system engineers and program managers to ultimately make better-informed system-level technical decisions throughout the development phase.
The Occurrence of Veterinary Pharmaceuticals in the Environment: A Review
Kaczala, Fabio; Blum, Shlomo E.
2016-01-01
It is well known that there is widespread use of veterinary pharmaceuticals and consequent release into different ecosystems such as freshwater bodies and groundwater systems. Furthermore, the use of organic fertilizers produced from animal waste manure has also been responsible for the occurrence of veterinary pharmaceuticals in agricultural soils. This article is a review of different studies focused on the detection and quantification of such compounds in environmental compartments using different analytical techniques. Furthermore, this paper reports the main challenges regarding veterinary pharmaceuticals in terms of analytical methods, detection/quantification of parent compounds and metabolites, and risks/toxicity to human health and aquatic ecosystems. Based on the existing literature, it is clear that only limited data are available regarding veterinary compounds and there are still considerable gaps to be bridged in order to remediate existing problems and prevent future ones. In terms of analytical methods, there are still considerable challenges to overcome considering the large number of existing compounds and respective metabolites. A number of studies highlight the lack of attention given to the detection and quantification of transformation products and metabolites. Furthermore, more attention needs to be given to the toxic effects and potential risks that veterinary compounds pose to environmental and human health. To conclude, the more research focuses on these subjects in the near future, the more rapidly we will gain a better understanding of the behavior of these compounds, the real risks they pose to aquatic and terrestrial environments, and how to properly tackle them. PMID:28579931
Mull, Bonnie J.; Narayanan, Jothikumar; Hill, Vincent R.
2013-01-01
Primary amebic meningoencephalitis (PAM) is a rare and typically fatal infection caused by the thermophilic free-living ameba, Naegleria fowleri. In 2010, the first confirmed case of PAM acquired in Minnesota highlighted the need for improved detection and quantification methods in order to study the changing ecology of N. fowleri and to evaluate potential risk factors for increased exposure. An immunomagnetic separation (IMS) procedure and real-time PCR TaqMan assay were developed to recover and quantify N. fowleri in water and sediment samples. When one liter of lake water was seeded with N. fowleri strain CDC:V212, the method had an average recovery of 46% and detection limit of 14 amebas per liter of water. The method was then applied to sediment and water samples with unknown N. fowleri concentrations, resulting in positive direct detections by real-time PCR in 3 out of 16 samples and confirmation of N. fowleri culture in 6 of 16 samples. This study has resulted in a new method for detection and quantification of N. fowleri in water and sediment that should be a useful tool to facilitate studies of the physical, chemical, and biological factors associated with the presence and dynamics of N. fowleri in environmental systems. PMID:24228172
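Quantification in a real-time PCR assay like the one above is typically done against a standard curve, and method performance is reported as percent recovery of a known seed. A sketch of both calculations follows; the slope, intercept, and Cq value are hypothetical, chosen only so that the recovery echoes the ~46% reported above.

```python
# Hypothetical fitted standard curve: Cq = slope * log10(concentration) + intercept
slope, intercept = -3.32, 38.0

def quantity_from_cq(cq):
    """Back-calculate target concentration (e.g. amebas per litre) from a Cq value."""
    return 10 ** ((cq - intercept) / slope)

efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 corresponds to 100% amplification
measured = quantity_from_cq(32.48)        # amebas detected per litre (hypothetical Cq)
recovery = 100.0 * measured / 100.0       # percent recovery from a 100-ameba/L seed
print(f"efficiency = {efficiency:.2f}, recovery = {recovery:.0f}%")
```

A slope near -3.32 is the textbook value for a perfectly efficient PCR; real assays are validated to fall near it.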
Grandin, Flore; Picard-Hagen, Nicole; Gayrard, Véronique; Puel, Sylvie; Viguié, Catherine; Toutain, Pierre-Louis; Debrauwer, Laurent; Lacroix, Marlène Z
2017-12-01
Regulatory measures and public concerns regarding bisphenol A (BPA) have led to its replacement by structural analogues, such as Bisphenol S (BPS), in consumer products. At present, no toxicokinetic investigations have been conducted to assess the factors determining human internal exposure to BPS for subsequent risk assessment. Toxicokinetic studies require reliable analytical methods to measure the plasma concentrations of BPS and its main conjugated metabolite, BPS-glucuronide (BPS-G). An efficient on-line SPE-UPLC-MS/MS method for the simultaneous quantification of BPS and BPS-G in ovine plasma was therefore developed and validated in accordance with the European Medicines Agency guidelines for bioanalytical method validation. This method has a limit of quantification of 3ngmL -1 for BPS and 10ngmL -1 for BPS-G, an analytical capacity of 200 samples per day, and is particularly well suited to toxicokinetic studies. Use of this method in toxicokinetic studies in sheep showed that BPS, like BPA, is efficiently metabolized into its glucuronide form. However, the clearances and distributions of BPS and BPS-G were lower than those of the corresponding unconjugated and glucuroconjugated forms of BPA. Copyright © 2017 Elsevier B.V. All rights reserved.
Integrating human behaviour dynamics into flood disaster risk assessment
NASA Astrophysics Data System (ADS)
Aerts, J. C. J. H.; Botzen, W. J.; Clarke, K. C.; Cutter, S. L.; Hall, J. W.; Merz, B.; Michel-Kerjan, E.; Mysiak, J.; Surminski, S.; Kunreuther, H.
2018-03-01
The behaviour of individuals, businesses, and government entities before, during, and immediately after a disaster can dramatically affect the impact and recovery time. However, existing risk-assessment methods rarely include this critical factor. In this Perspective, we show why this is a concern, and demonstrate that although initial efforts have inevitably represented human behaviour in limited terms, innovations in flood-risk assessment that integrate societal behaviour and behavioural adaptation dynamics into such quantifications may lead to more accurate characterization of risks and improved assessment of the effectiveness of risk-management strategies and investments. Such multidisciplinary approaches can inform flood-risk management policy development.
Air Quality Cumulative Effects Assessment for U.S. Air Force Bases.
1998-01-29
forecasted activities. Consideration of multimedia effects and transmedia impacts is important, however, in CEA. Any quantification method developed...substantive areas, such as water quality, ecology, planning, archeology, and landscape architecture? 9. Are there public concerns due to the impact risks of...methods developed for CEA should consider multimedia effects and transmedia impacts. Portions of this research can be used, or modified, to address other
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brigantic, Robert T.; Betzsold, Nick J.; Bakker, Craig KR
In this presentation we overview a methodology for dynamic security risk quantification and optimal resource allocation of security assets for high profile venues. This methodology is especially applicable to venues that require security screening operations such as mass transit (e.g., train or airport terminals), critical infrastructure protection (e.g., government buildings), and large-scale public events (e.g., concerts or professional sports). The method starts by decomposing the three core components of risk -- threat, vulnerability, and consequence -- into their various subcomponents. For instance, vulnerability can be decomposed into availability, accessibility, organic security, and target hardness, and each of these can be evaluated against the potential threats of interest for the given venue. Once evaluated, these subcomponents are rolled back up to compute the specific value for the vulnerability core risk component. Likewise, the same is done for consequence and threat, and then risk is computed as the product of these three components. A key aspect of our methodology is dynamically quantifying risk. That is, we incorporate the ability to uniquely allow the subcomponents and core components, and in turn, risk, to be quantified as a continuous function of time throughout the day, week, month, or year as appropriate.
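The roll-up described above can be sketched directly: score subcomponents, combine each set into a core component, and take the product as a function of time. All scores, the mean as the roll-up rule, and the rush-hour threat profile are illustrative assumptions, not values from the presentation.

```python
def component(scores):
    """Roll subcomponent scores (0-1) up into one core risk component (simple mean)."""
    return sum(scores) / len(scores)

def risk_at(t):
    """Risk = threat x vulnerability x consequence at hour-of-day t."""
    # Hypothetical time dependence: threat rises during rush hours at a transit hub.
    threat = 0.3 + 0.4 * (7 <= t < 10 or 16 <= t < 19)
    vulnerability = component([0.6,   # availability
                               0.7,   # accessibility
                               0.4,   # organic security
                               0.5])  # target hardness
    consequence = component([0.8, 0.6])
    return threat * vulnerability * consequence

print(risk_at(8), risk_at(13))   # rush hour vs. mid-day
```

Real applications would use venue-specific roll-up weights rather than a plain mean, and finer-grained threat profiles.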
Pawar, Rajesh; Bromhal, Grant; Carroll, Susan; ...
2014-12-31
Risk assessment for geologic CO₂ storage, including quantification of risks, is an area of active investigation. The National Risk Assessment Partnership (NRAP) is a US Department of Energy (US-DOE) effort focused on developing a defensible, science-based methodology and platform for quantifying risk profiles at geologic CO₂ sequestration sites. NRAP has been developing a methodology that centers around development of an integrated assessment model (IAM) using a system modeling approach to quantify risks and risk profiles. The IAM has been used to calculate risk profiles with a few key potential impacts due to potential CO₂ and brine leakage. The simulation results are also used to determine long-term storage security relationships and compare the long-term storage effectiveness to the IPCC storage permanence goal. Additionally, we also demonstrate application of the IAM for uncertainty quantification in order to determine parameters to which the uncertainty in model results is most sensitive.
A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI
NASA Astrophysics Data System (ADS)
Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico
2016-03-01
Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values and the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
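The three BPE measures defined above are simple volume ratios. A minimal sketch follows; the voxel lists, the 20% enhancement threshold (one arbitrary point in the 1-100% range the study scanned), and the unit voxel volume are all hypothetical.

```python
def bpe_measures(rel_enhancement, voxel_vol, fgt_mask, breast_mask, thr=0.20):
    """Compute BPEabs, BPErf, BPErb from per-voxel relative enhancement.

    rel_enhancement: relative enhancement per voxel; fgt_mask/breast_mask:
    boolean membership lists; voxel_vol: volume of one voxel.
    """
    enhancing = [e >= thr and f for e, f in zip(rel_enhancement, fgt_mask)]
    bpe_abs = sum(enhancing) * voxel_vol       # volume of the enhancing tissue
    fgt_vol = sum(fgt_mask) * voxel_vol
    breast_vol = sum(breast_mask) * voxel_vol
    return bpe_abs, bpe_abs / fgt_vol, bpe_abs / breast_vol  # abs, rf, rb

# Toy 5-voxel example:
rel = [0.05, 0.30, 0.50, 0.10, 0.25]
fgt = [True, True, True, False, True]
breast = [True] * 5
bpe_abs, bpe_rf, bpe_rb = bpe_measures(rel, voxel_vol=1.0,
                                       fgt_mask=fgt, breast_mask=breast)
```

BPErf, the FGT-relative measure, is the one that correlated best with the radiologists' categories above.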
López, Laura B; Baroni, Andrea V; Rodríguez, Viviana G; Greco, Carola B; de Costa, Sara Macías; de Ferrer, Patricia Ronayne; Rodríguez de Pece, Silvia
2005-06-01
A methodology for the quantification of vitamin A in human milk was developed and validated. Vitamin A levels were assessed in 223 samples corresponding to the 5th, 6th and 7th postpartum months, obtained in the province of Santiago del Estero, Argentina. The samples (500 microL) were saponified with potassium hydroxide/ethanol, extracted with hexane, evaporated to dryness and reconstituted with methanol. A RP-C18 column, a mobile phase of methanol/water (91:9 v/v) and a fluorescence detector (lambda excitation 330 nm and lambda emission 470 nm) were used for the separation and quantification of vitamin A. The analytical parameters of linearity (r2: 0.9995), detection (0.010 microg/mL) and quantification (0.025 microg/mL) limits, precision of the method (relative standard deviation, RSD = 9.0% within a day and RSD = 8.9% among days) and accuracy (recovery = 83.8%) demonstrate that the developed method allows the quantification of vitamin A in an efficient way. The mean ± standard deviation (SD) values obtained for the analyzed samples were 0.60 ± 0.32, 0.65 ± 0.33 and 0.61 ± 0.26 microg/mL for the 5th, 6th and 7th postpartum months, respectively. There were no significant differences among the three months studied and the values found were similar to those in the literature. Considering the whole population under study, 19.3% showed vitamin A levels less than 0.40 microg/mL, which represents a risk to the children in this group since at least 0.50 microg/mL is necessary to meet the infant daily needs.
USDA-ARS's Scientific Manuscript database
Campylobacter jejuni (C. jejuni) is one of the most common causes of gastroenteritis in the world. Given the potential risks to human, animal and environmental health the development and optimization of methods to quantify this important pathogen in environmental samples is essential. Two of the mos...
Lamar, Melissa; Zhou, Xiaohong Joe; Charlton, Rebecca A.; Dean, Douglas; Little, Deborah; Deoni, Sean C
2013-01-01
Human brain imaging has seen many advances in the quantification of white matter in vivo. For example, these advances have revealed the association between white matter damage and vascular disease as well as their impact on risk for and development of dementia and depression in an aging population. Current neuroimaging methods to quantify white matter damage provide a foundation for understanding such age-related neuropathology; however, these methods are not as adept at determining the underlying microstructural abnormalities signaling at risk tissue or driving white matter damage in the aging brain. This review will begin with a brief overview of the use of diffusion tensor imaging (DTI) in understanding white matter alterations in aging before focusing in more detail on select advances in both diffusion-based methods and multi-component relaxometry techniques for imaging white matter microstructural integrity within myelin sheaths and the axons they encase. While DTI greatly extended the field of white matter interrogation, these more recent technological advances will add clarity to the underlying microstructural mechanisms that contribute to white matter damage. More specifically, the methods highlighted in this review may prove more sensitive (and specific) for determining the contribution of myelin versus axonal integrity to the aging of white matter in brain. PMID:24080382
NASA Astrophysics Data System (ADS)
Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B. E.
2013-03-01
Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very useful in quantifying disease severity, they require extensive clinical experience and carry a risk of subjectivity. We explore the opportunity to use in vivo near-infrared (NIR) spectra as an objective and noninvasive method for local disease severity assessment in 31 psoriasis patients in whom selected plaques were scored clinically. A partial least squares (PLS) regression model was used to analyze and predict the severity scores on the NIR spectra of psoriatic and uninvolved skin. The correlation between predicted and clinically assigned scores was R=0.94 (RMSE=0.96), suggesting that in vivo NIR provides accurate clinical quantification of psoriatic plaques. Hence, NIR may be a practical solution to clinical severity assessment of psoriasis, providing a continuous, linear, numerical value of severity.
Pralatnet, Sasithorn; Poapolathep, Saranya; Giorgi, Mario; Imsilp, Kanjana; Kumagai, Susumu; Poapolathep, Amnart
2016-07-01
One hundred wheat product samples (50 instant noodle samples and 50 bread samples) were collected from supermarkets in Bangkok, Thailand. Deoxynivalenol (DON) and aflatoxin B1 (AFB1) contamination in these products was analyzed using a validated liquid chromatography-tandem mass spectrometry method. The limit of quantification values of DON and AFB1 in the instant noodles and bread were 2 and 1 ng g(-1), respectively. The survey found that DON was quantifiable in 40% of collected samples, in 2% of noodles (0.089 μg g(-1)), and in 78% of breads (0.004 to 0.331 μg g(-1)). AFB1 was below the limit of quantification of the method in all of the tested samples. The results suggest that the risk of DON exposure via noodles and breads is very low in urban areas of Thailand. No risk can be attributable to AFB1 exposure in the same food matrices, but further studies with a larger sample size are needed to confirm these data.
Wang, Mo; Yang, Ruiyue; Dong, Jun; Zhang, Tianjiao; Wang, Siming; Zhou, Weiyan; Li, Hongxia; Zhao, Haijian; Zhang, Lijiao; Wang, Shu; Zhang, Chuanbao; Chen, Wenxiang
2016-01-15
Recent observations from metabonomic studies have consistently found that branched-chain amino acids (BCAAs), aromatic amino acids (AAAs), glutamine (Gln), glutamic acid (Glu), the Gln/Glu ratio, carnitine, and several species of acylcarnitines and lysophosphatidylcholines (LPCs) are possible risk factors for metabolic diseases such as diabetes mellitus (DM) and cardiovascular diseases (CVD). We describe here a simple and reliable method for simultaneous quantification of these metabolic risk factors by liquid chromatography tandem mass spectrometry (LC-MS/MS). Serum samples were extracted with isopropanol, and the extracted metabolites were separated by hydrophilic interaction liquid chromatography (HILIC) and detected by electrospray ionization (ESI) in positive ion mode with multiple reaction monitoring (MRM). All the metabolites were effectively separated within 5.5 min. Analytical recoveries were in the range of 92.8-106.9%, with an average of 100.6%. The intra-run and total imprecisions for the measurement of these metabolites were 1.2-3.8% and 1.5-7.4%, respectively. Serum concentrations of the metabolites were analyzed in 123 apparently healthy volunteers. Significant associations between the metabolites and traditional CVD risk factors were observed. The newly developed LC-MS/MS method is simple, precise, and accurate and can be used as an efficient tool in CVD research. Copyright © 2015 Elsevier B.V. All rights reserved.
Möller, L; Schuetzle, D; Autrup, H
1994-01-01
This paper presents key conclusions and future research needs from a Workshop on the Risk Assessment of Urban Air, Emissions, Exposure, Risk Identification, and Quantification, which was held in Stockholm during June 1992 by 41 participants from 13 countries. Research is recommended in the areas of identification and quantification of toxics in source emissions and ambient air, atmospheric transport and chemistry, exposure level assessment, the development of improved in vitro bioassays, biomarker development, the development of more accurate epidemiological methodologies, and risk quantification techniques. Studies are described that will be necessary to assess and reduce the level of uncertainties associated with each step of the risk assessment process. International collaborative research efforts between industry and government organizations are recommended as the most effective way to carry out this research. PMID:7529703
Computer-aided assessment of regional abdominal fat with food residue removal in CT.
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2013-11-01
Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with a focus on food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy, at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.
2017-02-02
Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... a consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ), which simplifies
Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tom Elicson; Bentley Harwood; Richard Yorg
2011-03-01
The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling. This was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario. Therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which will be discussed in the full paper.
Li, Yinghong; Zhou, Ping; Xu, Quanhua; Zhao, Huan; Shao, Qiaoyun
2018-02-08
A method was developed for the simultaneous determination of seven high-risk pesticides in royal jelly (tau-fluvalinate, triadimenol, coumaphos, haloxyfop, carbendazim, thiophanate-ethyl and thiophanate-methyl) by high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS). First, the royal jelly samples were extracted with acetonitrile under alkaline conditions. After dehydration with anhydrous sodium sulfate, the extracts were enriched and purified by solid-phase extraction (SPE) with Oasis HLB cartridges. Finally, the pesticides were detected by HPLC-MS/MS. The separation was carried out on a Venusil MP C18 column with gradient elution. Methanol (containing 0.1% (v/v) formic acid) and 0.5 mmol/L ammonium acetate aqueous solution (containing 0.1% (v/v) formic acid) were used as the mobile phases. Detection was achieved using electrospray ionization in positive ion (ESI+) mode with multiple reaction monitoring (MRM) for data collection. Quantification was carried out using the internal standard method. The results showed that the seven high-risk pesticides were linear in the range of 5-100 μg/kg, with linear correlation coefficients (r2) of 0.9921-0.9996. The limits of detection (LODs) and limits of quantification (LOQs) of the seven high-risk pesticides were 0.5-2.0 μg/kg and 1.0-5.0 μg/kg, respectively. The average recoveries at the three spiked levels were 80.5%-101.3%, and the relative standard deviations were 3.6%-9.4% (n = 3). This method is simple, effective and sensitive, and is suitable for the determination of these pesticide residues in royal jelly.
Establishment of a method for determination of arsenic species in seafood by LC-ICP-MS.
Zmozinski, Ariane V; Llorente-Mirandes, Toni; López-Sánchez, José F; da Silva, Márcia M
2015-04-15
An analytical method for determination of arsenic species (inorganic arsenic (iAs), methylarsonic acid (MA), dimethylarsinic acid (DMA), arsenobetaine (AB), trimethylarsine oxide (TMAO) and arsenocholine (AC)) in Brazilian and Spanish seafood samples is reported. This study was focused on extraction and quantification of inorganic arsenic (iAs), the most toxic form. Arsenic speciation was carried out via LC with both anionic and cationic exchange with ICP-MS detection (LC-ICP-MS). The detection limits (LODs), quantification limits (LOQs), precision and accuracy for arsenic species were established. The proposed method was evaluated using eight reference materials (RMs). Arsenobetaine was the main species found in all samples. The total and iAs concentration in 22 seafood samples and RMs ranged between 0.27-35.2 and 0.02-0.71 mg As kg(-1), respectively. Recoveries ranging from 100% to 106% for iAs, based on spikes, were achieved. The proposed method provides reliable iAs data for future risk assessment analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo
2017-10-01
The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.
2013-01-01
Influenza virus-like particle (VLP) vaccines are one of the most promising ways to respond to the threat of future influenza pandemics. VLPs are composed of viral antigens but lack nucleic acids, making them non-infectious, which limits the risk of recombination with wild-type strains. By taking advantage of advancements in cell culture technologies, the process from strain identification to manufacturing has the potential to be completed rapidly and easily at large scales. After closely reviewing the current research done on influenza VLPs, it is evident that the development of quantification methods has been consistently overlooked. VLP quantification at all stages of the production process has been left to rely on current influenza quantification methods (i.e. hemagglutination assay (HA), single radial immunodiffusion assay (SRID), NA enzymatic activity assays, Western blot, electron microscopy). These are analytical methods developed decades ago for influenza virions and final bulk influenza vaccines. Although these methods are time-consuming and cumbersome, they have been sufficient for the characterization of final purified material. Nevertheless, these analytical methods are impractical for in-line process monitoring, because VLP concentration in crude samples generally falls out of the range of detection for these methods. This consequently impedes the development of robust influenza-VLP production and purification processes. Thus, the development of functional process analytical techniques, applicable at every stage during production and compatible with different production platforms, is greatly needed to assess, optimize and exploit the full potential of novel manufacturing platforms. PMID:23642219
den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M
2017-03-01
Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although 1H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen; quantification was consistent with 1H NMR, but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.
Lamar, Melissa; Zhou, Xiaohong Joe; Charlton, Rebecca A; Dean, Douglas; Little, Deborah; Deoni, Sean C
2014-02-01
Human brain imaging has seen many advances in the quantification of white matter in vivo. For example, these advances have revealed the association between white matter damage and vascular disease as well as their impact on risk for and development of dementia and depression in an aging population. Current neuroimaging methods to quantify white matter damage provide a foundation for understanding such age-related neuropathology; however, these methods are not as adept at determining the underlying microstructural abnormalities signaling at risk tissue or driving white matter damage in the aging brain. This review will begin with a brief overview of the use of diffusion tensor imaging (DTI) in understanding white matter alterations in aging before focusing in more detail on select advances in both diffusion-based methods and multi-component relaxometry techniques for imaging white matter microstructural integrity within myelin sheaths and the axons they encase. Although DTI greatly extended the field of white matter interrogation, these more recent technological advances will add clarity to the underlying microstructural mechanisms that contribute to white matter damage. More specifically, the methods highlighted in this review may prove more sensitive (and specific) for determining the contribution of myelin versus axonal integrity to the aging of white matter in brain. Copyright © 2014 American Association for Geriatric Psychiatry. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Kim, Chang-Beom; Kim, Kwan-Soo; Song, Ki-Bong
2013-05-01
The importance of early Alzheimer's disease (AD) detection has been recognized for diagnosing people at high risk of AD. The existence of intra/extracellular beta-amyloid (Aβ) in brain neurons has been regarded as the most archetypal hallmark of AD. Existing computed-image-based neuroimaging tools have limitations in the accurate quantification of nanoscale Aβ peptides due to optical diffraction during imaging processes. Therefore, we propose a new method that is capable of evaluating a small amount of Aβ peptides by using a photo-sensitive field-effect transistor (p-FET) integrated with a magnetic force-based microbead collecting platform and a selenium (Se) layer (thickness ~700 nm) as an optical filter. This method demonstrates a facile approach to Aβ quantification using magnetic force and magnetic silica microparticles (diameter 0.2~0.3 μm). The microbead collecting platform mainly consists of the p-FET sensing array and magnets (diameter ~1 mm) placed beneath each sensing region of the p-FET. The platform enables the assembly of the Aβ antibody-conjugated microbeads, captures the Aβ peptides from samples, measures the photocurrents generated by the Q-dots tagged to the Aβ peptides, and consequently achieves effective Aβ quantification.
NASA Astrophysics Data System (ADS)
Dalezios, Nicolas R.; Blanta, Anna; Spyropoulos, Nicos
2013-04-01
Drought is considered one of the major environmental hazards, with significant impacts on agriculture, environment, economy and society. This paper addresses drought as a hazard within the risk management framework. Indeed, hazards may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Risk management consists of risk assessment and feedback on the adopted risk reduction measures, and risk assessment comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. In order to ensure sustainability in agricultural production, a better understanding of the natural disasters that impact agriculture, in particular droughts, is essential. Droughts may result in environmental degradation of an area, which is one of the factors contributing to the vulnerability of agriculture, because it directly magnifies the risk of natural disasters. This paper deals with drought risk identification, which involves hazard quantification, event monitoring including early warning systems, and statistical inference. For drought quantification, the Reconnaissance Drought Index (RDI) combined with the Vegetation Health Index (VHI) is employed. RDI is a new index based on hydrometeorological parameters, in particular precipitation and potential evapotranspiration, which has recently been modified to incorporate monthly satellite (NOAA/AVHRR) data for a period of 20 years (1981-2001). VHI is based on NDVI. The study area is Thessaly in central Greece, which is one of the major agricultural areas of the country and occasionally faces droughts. Drought monitoring is conducted with monthly remotely sensed RDI and VHI images, and several drought features are extracted, such as severity, duration, areal extent, and onset and end time. Drought early warning is developed using empirical relationships among the above-mentioned features.
In particular, two second-order polynomials are fitted relating severity and areal extent (number of pixels), one for low- and the other for high-severity droughts. The two fitted curves offer a monthly forecasting tool from the beginning of each hydrological year, with high-severity droughts occurring from October onwards, whereas low-severity droughts start in April. The results of this drought risk identification effort are considered quite satisfactory, offering a prognostic potential for drought. The adopted remote sensing data and methods have proven very effective in delineating spatial variability and features in drought quantification and monitoring.
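The RDI mentioned above is, in its initial form, the ratio of cumulative precipitation to cumulative potential evapotranspiration (PET); the standardized form compares log-ratios across years. A hedged sketch of that idea only (not the paper's code; the RDI has several published variants, and the values below are invented):

```python
import math
import statistics

def rdi_alpha(precip_mm, pet_mm):
    """Initial RDI value: cumulative precipitation over cumulative PET."""
    return sum(precip_mm) / sum(pet_mm)

def rdi_standardized(alphas):
    """Standardize ln(alpha) across years; negative values indicate drought."""
    ys = [math.log(a) for a in alphas]
    mu, sd = statistics.mean(ys), statistics.pstdev(ys)
    return [(y - mu) / sd for y in ys]

# Toy example: three years with a dry, a normal, and a wet water balance.
alphas = [0.8, 1.0, 1.25]
rdi_st = rdi_standardized(alphas)   # first year comes out negative (drought)
```

Severity-versus-areal-extent relationships, as fitted in the paper, would then be built on maps of such index values rather than on single station series.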
Cutler, Timothy D.; Wang, Chong; Hoff, Steven J.; Zimmerman, Jeffrey J.
2013-01-01
In aerobiology, dose-response studies are used to estimate the risk of infection to a susceptible host presented by exposure to a specific dose of an airborne pathogen. In the research setting, host- and pathogen-specific factors that affect the dose-response continuum can be accounted for by experimental design, but the requirement to precisely determine the dose of infectious pathogen to which the host was exposed is often challenging. By definition, quantification of viable airborne pathogens is based on the culture of micro-organisms, but some airborne pathogens are transmissible at concentrations below the threshold of quantification by culture. In this paper we present an approach to the calculation of exposure dose at microbiologically unquantifiable levels using an application of the “continuous-stirred tank reactor (CSTR) model” and the validation of this approach using rhodamine B dye as a surrogate for aerosolized microbial pathogens in a dynamic aerosol toroid (DAT). PMID:24082399
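The CSTR model referred to above treats the exposure chamber as perfectly mixed, so the concentration obeys dC/dt = (Q/V)(C_in - C), giving C(t) = C_in(1 - exp(-Qt/V)) from an initially clean chamber. A minimal sketch of how an exposure dose could be computed from that solution (parameter names and values are illustrative, not the authors' implementation):

```python
import math

def cstr_concentration(t_min, c_in, q_lpm, v_l):
    """Concentration at time t in a CSTR: C(t) = C_in * (1 - exp(-Q*t/V))."""
    k = q_lpm / v_l                      # air-exchange rate (1/min)
    return c_in * (1.0 - math.exp(-k * t_min))

def inhaled_dose(t_min, c_in, q_lpm, v_l, breath_lpm, steps=10000):
    """Dose = breathing rate * integral of C(t) dt (trapezoidal rule)."""
    dt = t_min / steps
    total = 0.0
    for i in range(steps):
        c0 = cstr_concentration(i * dt, c_in, q_lpm, v_l)
        c1 = cstr_concentration((i + 1) * dt, c_in, q_lpm, v_l)
        total += 0.5 * (c0 + c1) * dt
    return breath_lpm * total

# Toy run: 30 min exposure, inflow concentration 5 units/L, 10 L/min flow,
# 100 L chamber, 12 L/min breathing rate.
dose = inhaled_dose(30, 5.0, 10.0, 100.0, 12.0)
```

The same arithmetic works whether C_in is a culturable pathogen concentration or, as in the validation here, a measurable dye surrogate at sub-culturable levels.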
Dermatologic radiotherapy and thyroid cancer. Dose measurements and risk quantification.
Goldschmidt, H; Gorson, R O; Lassen, M
1983-05-01
Thyroid doses for various dermatologic radiation techniques were measured with thermoluminescent dosimeters and ionization rate meters in an Alderson-Rando anthropomorphic phantom. The effects of changes in radiation quality and of the use or nonuse of treatment cones and thyroid shields were evaluated in detail. The results indicate that the potential risk of radiogenic thyroid cancer is very small when proper radiation protection measures are used. The probability of radiogenic thyroid cancer developing and the potential mortality risk were assessed quantitatively for each measurement. The quantification of radiation risks allows comparisons with risks of other therapeutic modalities and the common hazards of daily life.
Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga
2014-05-01
Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.
On the complex quantification of risk: systems-based perspective on terrorism.
Haimes, Yacov Y
2011-08-01
This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.
Nebot, Carolina; Regal, Patricia; Miranda, Jose Manuel; Fente, Cristina; Cepeda, Alberto
2013-12-01
Sulfonamides are antimicrobial agents widely employed in animal production, and their residues in food can pose an important risk to human health. In the dairy industry, large quantities of milk are monitored daily for the presence of sulfonamides. A simple and low-cost extraction protocol followed by a liquid chromatography-tandem mass spectrometry method was developed for the simultaneous detection of nine sulfonamides in whole milk. The method was validated at the maximum residue limits established by European legislation. The limits of quantification obtained for most sulfonamides were between 12.5 and 25 μg kg(-1), detection capabilities ranged from 116 to 145 μg kg(-1), and recoveries, at 100 μg kg(-1), were greater than 89±12.5%. The method was employed to analyse 100 raw whole bovine milk samples collected from dairy farms in the northwest region of Spain. All of the samples were found to be compliant, but two were positive; one for sulfadiazine and the other for sulfamethoxypyridazine. Copyright © 2013 Elsevier Ltd. All rights reserved.
Dual Approach To Superquantile Estimation And Applications To Density Fitting
2016-06-01
incorporate additional constraints to improve the fidelity of density estimates in tail regions. We limit our investigation to data with heavy tails, where risk quantification is typically the most difficult. Demonstrations are provided in the form of samples of various heavy-tailed distributions. Subject terms: probability density estimation, epi-splines, optimization, risk quantification.
Hayashi, Hideki; Kita, Yutaro; Iihara, Hirotoshi; Yanase, Koumei; Ohno, Yasushi; Hirose, Chiemi; Yamada, Maya; Todoroki, Kenichiro; Kitaichi, Kiyoyuki; Minatoguchi, Shinya; Itoh, Yoshinori; Sugiyama, Tadashi
2016-07-01
A simultaneous, selective, sensitive and rapid liquid chromatography/tandem mass spectrometry method was developed and validated for the quantification of gefitinib, erlotinib and afatinib in 250 μL samples of human blood plasma. Diluted plasma samples were extracted using a liquid-phase extraction procedure with tert-butyl methyl ether. The three drugs were separated by high-performance liquid chromatography using a C18 column and an isocratic mobile phase running at a flow rate of 0.2 mL/min for 5 min. The drugs were detected using a tandem mass spectrometer with electrospray ionization, with imatinib as an internal standard. Calibration curves were generated over the linear concentration range of 0.05-100 nM in plasma, with a lower limit of quantification of 0.01 or 0.05 nM for all compounds. Finally, the validated method was applied to a clinical pharmacokinetic study in patients with non-small-cell lung cancer (NSCLC) following the oral administration of afatinib. These results indicate that this method is suitable for assessing the risks and benefits of chemotherapy in patients with NSCLC and is useful for therapeutic drug monitoring in NSCLC treatment. As far as we know, this is the first report of an LC-MS/MS method for the simultaneous quantification of plasma concentrations of NSCLC tyrosine kinase inhibitors, including afatinib. Copyright © 2015 John Wiley & Sons, Ltd.
Morita, M
2011-01-01
Global climate change is expected to affect future rainfall patterns. These changes should be taken into account when assessing future flooding risks. This study presents a method for quantifying the increase in flood risk caused by global climate change for use in urban flood risk management. Flood risk in this context is defined as the product of flood damage potential and the probability of its occurrence. The study uses a geographic information system-based flood damage prediction model to calculate the flood damage caused by design storms with different return periods. Estimating the monetary damages these storms produce and their return periods is a precursor to flood risk calculation. The design storms are developed from modified intensity-duration-frequency relationships generated by simulations of global climate change scenarios (e.g. CGCM2A2). The risk assessment method is applied to the Kanda River basin in Tokyo, Japan. The assessment provides insights not only into the flood risk cost increase due to global warming, but also into the impact that increase may have on flood control infrastructure planning.
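The risk definition above (damage potential multiplied by the probability of occurrence) can be sketched numerically: expected annual damage is approximated by integrating storm damage over annual exceedance probability, p = 1/T for a storm of return period T. The damage figures below are made up for illustration; the study's actual damages come from its GIS-based prediction model.

```python
def expected_annual_damage(return_periods, damages):
    """Approximate expected annual damage (flood risk cost) by
    trapezoidal integration of damage over annual exceedance
    probability p = 1/T for design storms of return period T."""
    # Sort by exceedance probability, most frequent storm first.
    pairs = sorted(((1.0 / t, d) for t, d in zip(return_periods, damages)),
                   reverse=True)
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(pairs, pairs[1:]):
        ead += 0.5 * (d1 + d2) * (p1 - p2)  # trapezoid between adjacent storms
    return ead

# Hypothetical damages (e.g. in million yen) for 5-, 10-, 50-, 100-year storms
ead = expected_annual_damage([5, 10, 50, 100], [100, 400, 1500, 2200])
```

Adding climate-modified design storms simply means recomputing the integral with the shifted damage-probability pairs and comparing the two expected annual damages.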
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Zhou; Adams, Rachel M; Chourey, Karuna
2012-01-01
A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
Statistical image quantification toward optimal scan fusion and change quantification
NASA Astrophysics Data System (ADS)
Potesil, Vaclav; Zhou, Xiang Sean
2007-03-01
Recent advances in imaging technology have brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error; and there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we will achieve a lower variance than naïve averaging. Simulated experiments are used to validate theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
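The optimal linear fusion result described above is the classic inverse-variance weighting: each scan is weighted by the inverse of its measurement error variance, and the fused variance is never larger than that of a naive average. A minimal sketch (the measurement values and variances are illustrative, not from the paper):

```python
def fuse(measurements, variances):
    """Minimum-variance linear fusion of independent measurements:
    weight each scan by the inverse of its error variance."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * m for w, m in zip(weights, measurements)) / total
    fused_variance = 1.0 / total
    return estimate, fused_variance

# Two scans of the same lesion dimension, one much noisier than the other
est, var_opt = fuse([10.0, 12.0], [1.0, 4.0])
var_naive = (1.0 + 4.0) / 4.0  # variance of the plain mean of the two scans
# var_opt (0.8) < var_naive (1.25): optimal fusion beats naive averaging
```

The same arithmetic explains the paper's weighting rule: a scan whose voxel anisotropy aligns with the lesion elongation measures that dimension with lower variance, so it earns a larger weight.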
Zahnd, Guillaume; Karanasos, Antonios; van Soest, Gijs; Regar, Evelyn; Niessen, Wiro; Gijsen, Frank; van Walsum, Theo
2015-09-01
Fibrous cap thickness is the most critical component of plaque stability. Therefore, in vivo quantification of cap thickness could yield valuable information for estimating the risk of plaque rupture. In the context of preoperative planning and perioperative decision making, intracoronary optical coherence tomography imaging can provide a very detailed characterization of the arterial wall structure. However, visual interpretation of the images is laborious, subject to variability, and therefore not always sufficiently reliable for immediate decision of treatment. A novel semiautomatic segmentation method to quantify coronary fibrous cap thickness in optical coherence tomography is introduced. To cope with the most challenging issue when estimating cap thickness (namely the diffuse appearance of the anatomical abluminal interface to be detected), the proposed method is based on a robust dynamic programming framework using a geometric prior. To determine the optimal parameter settings, a training phase was conducted on 10 patients. Validated on a dataset of 179 images from 21 patients, the present framework could successfully extract the fibrous cap contours. When assessing minimal cap thickness, segmentation results from the proposed method were in good agreement with the reference tracings performed by a medical expert (mean absolute error and standard deviation of 22 ± 18 μm) and were similar to inter-observer reproducibility (21 ± 19 μm, R = .74), while being significantly faster and fully reproducible. The proposed framework demonstrated promising performances and could potentially be used for online identification of high-risk plaques.
Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil
2012-01-01
Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of the FT-IR spectroscopy technique for the quantification of total lactones present in Inula racemosa and Andrographis paniculata, and its validation against a known spectrophotometric method. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed comparable results with a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification for isoalantolactone were 1 µg and 10 µg respectively; for andrographolide they were 1.5 µg and 15 µg respectively. Recoveries were over 98%, with good intra- and interday repeatability: RSD ≤ 2%. The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable for the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Pérez-de-Mora, Alfredo; Schulz, Stephan; Schloter, Michael
Hydrocarbons are major contaminants of soil ecosystems as a result of uncontrolled oil spills and waste disposal into the environment. Ecological risk assessment and remediation of affected sites are often constrained by a lack of suitable prognostic and diagnostic tools that provide information on the abiotic-biotic interactions occurring between contaminants and biological targets. Therefore, the identification and quantification of genes involved in the degradation of hydrocarbons may play a crucial role in evaluating the natural attenuation potential of contaminated sites and in the development of successful bioremediation strategies. Besides other gene clusters, the alk operon has been identified as a major player in alkane degradation in different soils. An oxygenase gene (alkB) codes for the initial step of the degradation of aliphatic alkanes under aerobic conditions. In this work, we present an MPN- and a real-time PCR method for the quantification of the bacterial gene alkB (coding for rubredoxin-dependent alkane monooxygenase) in environmental samples. Both approaches enable a rapid culture-independent screening of the alkB gene in the environment, which can be used to assess the intrinsic natural attenuation potential of a site or to follow the ongoing progress of bioremediation assays.
Bridging the divide between human and environmental nanotoxicology
NASA Astrophysics Data System (ADS)
Malysheva, Anzhela; Lombi, Enzo; Voelcker, Nicolas H.
2015-10-01
The need to assess the human and environmental risks of nanoscale materials has prompted the development of new metrological tools for their detection, quantification and characterization. Some of these methods have tremendous potential for use in various scenarios of nanotoxicology. However, in some cases, the limited dialogue between environmental scientists and human toxicologists has hampered the full exploitation of these resources. Here we review recent progress in the development of methods for nanomaterial analysis and discuss the use of these methods in environmental and human toxicology. We highlight the opportunities for collaboration between these two research areas.
Symonds, E M; Sinigalliano, C; Gidley, M; Ahmed, W; McQuaig-Ulrich, S M; Breitbart, M
2016-11-01
To identify faecal pollution along the southeastern Florida coast and determine the performance of a reverse transcription-quantitative polymerase chain reaction (RT-qPCR) method for pepper mild mottle virus (PMMoV). In 2014, bimonthly surface water samples were collected from inlets, exposed to runoff and septic seepage, and coastal sites, exposed to ocean outfalls. Analysis of culturable enterococci and a suite of microbial source tracking (MST) markers (BacHum, CowM2, DogBact, HF183, HPyV, PMMoV) revealed faecal pollution, primarily of human origin, at all sites. Since PMMoV was detected more frequently than other MST markers, the process limits of quantification (undiluted to 10⁻² dilution) and detection (10⁻² dilution) for the RT-qPCR method were determined by seeding untreated wastewater into the coastal waters. Simulated quantitative microbial risk assessment, employing human norovirus as a reference pathogen, calculated a 0.286 median risk of gastrointestinal illness associated with the PMMoV limit of detection. All sites met the U.S. EPA recreational water criteria, despite detection of domestic wastewater-associated MST markers. PMMoV correlated only with human-associated MST markers. This study demonstrated that PMMoV is an important domestic wastewater-associated marker that should be included in the MST toolbox; therefore, future studies should thoroughly investigate the health risks associated with its detection and quantification in environmental waters. © 2016 The Society for Applied Microbiology.
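The quantitative microbial risk assessment step above maps an ingested pathogen dose to an illness probability through a dose-response model. As a hedged illustration only, a generic exponential dose-response model is sketched below; the study's actual norovirus dose-response model and parameters are not reproduced here, and both `r` and `dose` are made-up numbers.

```python
import math

def exponential_dose_response(dose, r):
    """Generic exponential dose-response model: probability of
    infection from a mean ingested dose; r is pathogen-specific."""
    return 1.0 - math.exp(-r * dose)

# Illustrative only: neither value comes from the study.
risk = exponential_dose_response(dose=10.0, r=0.05)  # ≈ 0.393
```

In a full QMRA, this per-exposure probability would be combined with exposure volume, marker-to-pathogen ratios, and an illness-given-infection factor before comparison against recreational water criteria.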
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mays, S.E.; Poloski, J.P.; Sullivan, W.H.
1982-07-01
This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.
Recent application of quantification II in Japanese medical research.
Suzuki, T; Kudo, A
1979-01-01
Hayashi's Quantification II is a method of multivariate discriminant analysis for handling attribute data as predictor variables. It is very useful in the medical research field for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles presenting recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to clarifying how well the researchers were served by the findings the method provided. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587
Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.
2015-01-01
Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool suitable for testing this hypothesis for hematopoietic as well as stromal cells is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
Qualitative and quantitative analysis of monomers in polyesters for food contact materials.
Brenz, Fabrian; Linke, Susanne; Simat, Thomas
2017-02-01
Polyesters (PESs) are gaining more importance on the food contact material (FCM) market, and the variety of properties and applications is expected to be wide. In order to acquire the desired properties, manufacturers can combine several FCM-approved polyvalent carboxylic acids (PCAs) and polyols as monomers. However, information about the qualitative and quantitative composition of FCM articles is often limited. The method presented here describes the analysis of PESs with the identification and quantification of 25 PES monomers (10 PCAs, 15 polyols) by HPLC with diode array detection (HPLC-DAD) and GC-MS after alkaline hydrolysis. Accurate identification and quantification were demonstrated by the analysis of seven different FCM articles made of PESs. The results explained between 97.2% and 103.4% w/w of the polymer composition whilst showing equal molar amounts of PCAs and polyols. Quantification proved to be precise and sensitive, with coefficients of variation (CVs) below 6.0% for PES samples with monomer concentrations typically ranging from 0.02% to 75% w/w. The analysis of 15 PES samples for the FCM market revealed the presence of five different PCAs and 11 different polyols (main monomers, co-monomers, non-intentionally added substances (NIAS)), showing the wide variety of monomers in modern PESs. The presented method provides a useful tool for commercial, state and research laboratories as well as for producers and distributors facing the task of FCM risk assessment. It can be applied for the identification and quantification of migrating monomers and for the prediction of oligomer compositions from the identified monomers.
Imaging evaluation of non-alcoholic fatty liver disease: focused on quantification.
Lee, Dong Ho
2017-12-01
Non-alcoholic fatty liver disease (NAFLD) has emerged as a major health problem and is the most common cause of chronic liver disease in Western countries. Traditionally, liver biopsy has been the gold standard method for quantification of hepatic steatosis. However, its invasive nature, with the potential for complications, as well as measurement variability, are major problems. Thus, various imaging studies have been used for evaluation of hepatic steatosis. Ultrasonography provides fairly good accuracy for detecting moderate-to-severe hepatic steatosis, but limited accuracy for mild steatosis. Operator dependency and the subjective/qualitative nature of the examination are other major drawbacks of ultrasonography. Computed tomography can be considered an unsuitable imaging modality for evaluation of NAFLD due to the potential risk of radiation exposure and limited accuracy in detecting mild steatosis. Both magnetic resonance spectroscopy and magnetic resonance imaging using the chemical shift technique provide highly accurate and reproducible diagnostic performance for evaluating NAFLD and have therefore been used in many clinical trials as a non-invasive reference standard.
Imaging evaluation of non-alcoholic fatty liver disease: focused on quantification
2017-01-01
Non-alcoholic fatty liver disease (NAFLD) has emerged as a major health problem and is the most common cause of chronic liver disease in Western countries. Traditionally, liver biopsy has been the gold standard method for quantification of hepatic steatosis. However, its invasive nature, with the potential for complications, as well as measurement variability, are major problems. Thus, various imaging studies have been used for evaluation of hepatic steatosis. Ultrasonography provides fairly good accuracy for detecting moderate-to-severe hepatic steatosis, but limited accuracy for mild steatosis. Operator dependency and the subjective/qualitative nature of the examination are other major drawbacks of ultrasonography. Computed tomography can be considered an unsuitable imaging modality for evaluation of NAFLD due to the potential risk of radiation exposure and limited accuracy in detecting mild steatosis. Both magnetic resonance spectroscopy and magnetic resonance imaging using the chemical shift technique provide highly accurate and reproducible diagnostic performance for evaluating NAFLD and have therefore been used in many clinical trials as a non-invasive reference standard. PMID:28994271
Cut set-based risk and reliability analysis for arbitrarily interconnected networks
Wyss, Gregory D.
2000-01-01
Method for computing all-terminal reliability for arbitrarily interconnected networks such as the United States public switched telephone network. The method includes an efficient search algorithm to generate minimal cut sets for nonhierarchical networks directly from the network connectivity diagram. Efficiency of the search algorithm stems in part from its basis on only link failures. The method also includes a novel quantification scheme that likewise reduces computational effort associated with assessing network reliability based on traditional risk importance measures. Vast reductions in computational effort are realized since combinatorial expansion and subsequent Boolean reduction steps are eliminated through analysis of network segmentations using a technique of assuming node failures to occur on only one side of a break in the network, and repeating the technique for all minimal cut sets generated with the search algorithm. The method functions equally well for planar and non-planar networks.
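The cut-set idea above can be illustrated in miniature: for a small network, the minimal link cut sets (smallest sets of links whose joint failure disconnects the network) can be found by brute force, considering only link failures as the patent abstract describes. This sketch is illustrative only and does not reproduce the patent's efficient search algorithm or its quantification scheme.

```python
from itertools import combinations

def connected(nodes, links):
    """Check whether all nodes are mutually reachable over the links."""
    if not nodes:
        return True
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        for a, b in links:
            if a == n and b not in seen:
                stack.append(b)
            elif b == n and a not in seen:
                stack.append(a)
    return seen == set(nodes)

def minimal_cut_sets(nodes, links):
    """Brute-force minimal link cut sets: smallest link subsets whose
    removal (failure) disconnects the network."""
    cuts = []
    for size in range(1, len(links) + 1):
        for combo in combinations(links, size):
            remaining = [l for l in links if l not in combo]
            if not connected(nodes, remaining):
                # Keep only minimal sets: skip supersets of known cuts.
                if not any(set(c) <= set(combo) for c in cuts):
                    cuts.append(combo)
    return cuts

# Triangle network: any pair of link failures isolates one node
nodes = {"A", "B", "C"}
links = [("A", "B"), ("B", "C"), ("A", "C")]
cuts = minimal_cut_sets(nodes, links)  # three minimal cut sets, each of size 2
```

The exponential cost of this enumeration is exactly why the patented method generates minimal cut sets directly from the connectivity diagram instead of by combinatorial expansion.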
Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.
Mujika, Iñigo
2017-04-01
Training quantification is fundamental to evaluating an endurance athlete's responses to training loads, ensuring an adequate stress/recovery balance, and determining the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, rating of perceived exertion stands out because of its wide use. Concurrent assessments of the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
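Among the internal-load methods mentioned, the rating of perceived exertion is commonly multiplied by session duration (the session-RPE approach, often attributed to Foster) to yield one load number per session, which can then be summed over a training week. A minimal sketch with hypothetical sessions:

```python
def session_load(rpe, duration_min):
    """Session-RPE internal load: perceived exertion (0-10 scale)
    multiplied by session duration in minutes; arbitrary units."""
    return rpe * duration_min

# A hypothetical training week: (RPE, minutes) per session
week = [(4, 60), (7, 45), (3, 90), (8, 30)]
weekly_load = sum(session_load(r, t) for r, t in week)  # 1065 arbitrary units
```

Comparing such weekly internal-load totals against external-load measures (speed, power output) is one concrete way to monitor the stress/recovery balance the review discusses.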
Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).
Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd
2011-01-01
The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.
NASA Astrophysics Data System (ADS)
Kondoh, Takayuki; Yamamura, Tomohiro; Kitazaki, Satoshi; Kuge, Nobuyuki; Boer, Erwin Roeland
Longitudinal vehicle control and/or warning technologies that operate in accordance with drivers' subjective perception of risk need to be developed for driver-support systems, if such systems are to be used fully to achieve safer, more comfortable driving. In order to accomplish this goal, it is necessary to identify the visual cues utilized by drivers in their perception of risk when closing on the vehicle ahead in a car-following situation. It is also necessary to quantify the relation between the physical parameters defining the spatial relationship to the vehicle ahead and psychological metrics with regard to the risk perceived by the driver. This paper presents the results of an empirical study on quantification and formulation of drivers' subjective perception of risk based on experiments performed with a fixed-base driving simulator at the Nissan Research Center. Experiments were carried out to investigate the subjective perception of risk relative to the headway distance and closing velocity to the vehicle ahead using the magnitude estimation method. The experimental results showed that drivers' perception of risk was strongly affected by two variables: time headway, i.e., the distance to the lead vehicle divided by the following vehicle's velocity, and time to collision, i.e., the distance to the lead vehicle divided by relative velocity. It was also found that an equation for estimating drivers' perception of risk can be formulated as the sum of the inverse time headway and the inverse time to collision and that this expression can be applied to various approaching situations. Furthermore, the validity of this equation was examined based on real-world driver behavior data measured with an instrumented vehicle.
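The risk-perception equation described above, the sum of the inverse time headway and the inverse time to collision, can be sketched directly from the abstract's definitions. The weights `a` and `b` below are illustrative placeholders; the study estimates the actual coefficients empirically from the magnitude estimation data.

```python
def perceived_risk(distance_m, v_follow, v_rel, a=1.0, b=1.0):
    """Risk-feeling estimate as a weighted sum of inverse time headway
    and inverse time to collision. v_rel > 0 means the gap is closing;
    a non-closing gap contributes no time-to-collision term."""
    thw = distance_m / v_follow  # time headway, s
    ttc = float("inf") if v_rel <= 0 else distance_m / v_rel  # time to collision, s
    return a / thw + b / ttc

# 30 m behind a lead car while travelling at 20 m/s and closing at 5 m/s:
# THW = 1.5 s, TTC = 6 s
risk = perceived_risk(30.0, 20.0, 5.0)
```

With unit weights this gives 1/1.5 + 1/6 ≈ 0.83; when the gap is opening the expression reduces to the inverse time headway alone, matching the intuition that only proximity, not closing speed, then drives the risk feeling.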
Loukotková, Lucie; VonTungeln, Linda S; Vanlandingham, Michelle; da Costa, Gonçalo Gamboa
2018-01-01
According to the World Health Organization, the consumption of tobacco products is the single largest cause of preventable deaths in the world, exceeding the total aggregated number of deaths caused by diseases such as AIDS, tuberculosis, and malaria. An important element in the evaluation of the health risks associated with the consumption of tobacco products is the assessment of the internal exposure to the tobacco constituents responsible for their addictive (e.g. nicotine) and carcinogenic (e.g. N-nitrosamines such as NNN and NNK) properties. However, the assessment of the serum levels of these compounds is often challenging from an analytical standpoint, in particular when limited sample volumes are available and low detection limits are required. Currently available analytical methods often rely on complex multi-step sample preparation procedures, which are prone to low analyte recoveries and ex-vivo contamination due to the ubiquitous nature of these compounds as background contaminants. In order to circumvent these problems, we report a facile and highly sensitive method for the simultaneous quantification of nicotine, cotinine, NNN, and NNK in serum samples. The method relies on a simple "one pot" liquid-liquid extraction procedure and isotope dilution ultra-high pressure liquid chromatography (UPLC) hydrophilic interaction liquid chromatography (HILIC) coupled with tandem mass spectrometry. The method requires only 10 μL of serum and presents a limit of quantification of 0.02 nmol (3000 pg/mL) for nicotine, 0.6 pmol (100 pg/mL) for cotinine, 0.05 pmol (10 pg/mL) for NNK, and 0.06 pmol (10 pg/mL) for NNN, making it appropriate for pharmacokinetic evaluations. Published by Elsevier B.V.
Jackowetz, J N; Mira de Orduña, R
2013-08-15
Sulphur dioxide (SO2) is essential for the preservation of wines. The presence of SO2 binding compounds in musts and wines may limit sulphite efficacy leading to higher total SO2 additions, which may exceed SO2 limits permitted by law and pose health risks for sensitive individuals. An improved method for the quantification of significant wine SO2 binding compounds is presented that applies a novel sample treatment approach and rapid UHPLC separation. Glucose, galacturonic acid, alpha-ketoglutarate, pyruvate, acetoin and acetaldehyde were derivatised with 2,4-dinitrophenylhydrazine and separated using a solid core C18 phase by ultra high performance liquid chromatography. Addition of EDTA to samples prevented de novo acetaldehyde formation from ethanol oxidation. Optimised derivatisation duration enhanced reproducibility and allowed for glucose and galacturonic acid quantification. High glucose residues were found to interfere with the recovery of other SO2 binders, but practical SO2 concentrations and red wine pigments did not affect derivatisation efficiency. The calibration range, method accuracy, precision and limits of detection were found to be satisfactory for routine analysis of SO2 binders in wines. The current method represents a significant improvement in the comprehensive analysis of SO2 binding wine carbonyls. It allows for the quantification of major SO2 binders at practical analyte concentrations, and uses a simple sample treatment method that prevents treatment artifacts. Equipment utilisation could be reduced by rapid LC separation while maintaining analytical performance parameters. The improved method will be a valuable addition for the analysis of total SO2 binder pools in oenological samples. Published by Elsevier Ltd.
Evaluation of AgNORs in Oral Potentially Malignant Lesions.
Tomazelli, Karin Berria; Modolo, Filipe; Rivero, Elena Riet Correa
2015-01-01
Oral squamous cell carcinoma (OSCC) is usually preceded by detectable mucosal changes, such as leukoplakia and erythroplakia. Histologically, these lesions can range from hyperkeratosis and acanthosis to epithelial dysplasia and even OSCC. The aim of this study was to investigate proliferative activity, using quantification of AgNOR proteins, in low- and high-risk oral epithelial dysplasia, OSCC, and nondysplastic epithelium (inflammatory fibrous hyperplasia). The sample was divided into 4 groups: G1: 10 cases of inflammatory fibrous hyperplasia (IFH), G2: 11 cases of low-risk epithelial dysplasia (LD), G3: 10 cases of high-risk epithelial dysplasia (HD), and G4: 11 cases of OSCC. The quantitative analysis was performed using image processing software on photomicrographs at 1000x magnification. One-way ANOVA was used for comparison of the mean AgNOR counts between the study groups. The mean AgNOR count was significantly higher (P ≤ 0.01) in OSCC when compared to IFH and LD; however, it was not statistically different from HD. The mean count in LD was significantly lower than in HD and OSCC, with no difference relative to IFH. AgNOR quantification can be an important and inexpensive method to help determine the degree of epithelial dysplasia and, consequently, to analyze its potential for malignant transformation.
Uncertainty Quantification in High Throughput Screening ...
Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
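The bootstrap approach described above can be shown in miniature: resample the observed values with replacement, recompute the quantity of interest each time, and take percentiles of the resampled statistics as a confidence interval. The data and the statistic below are toy stand-ins for a fitted concentration-response parameter; the real ToxCast pipeline refits a full curve model per resample.

```python
import random

def bootstrap_ci(data, stat, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap confidence interval for an arbitrary
    statistic: resample with replacement, recompute, and take the
    alpha/2 and 1 - alpha/2 quantiles of the resampled values."""
    rng = random.Random(seed)
    boots = sorted(stat([rng.choice(data) for _ in data])
                   for _ in range(n_boot))
    lo = boots[int(n_boot * alpha / 2)]
    hi = boots[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Toy "responses" standing in for replicate estimates of a potency value
responses = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2, 1.05]
mean = lambda xs: sum(xs) / len(xs)
lo, hi = bootstrap_ci(responses, mean)  # 95% interval around the mean
```

Propagating such intervals through downstream ranking models is what turns a hard hit/no-hit cutoff into a probabilistic statement that risk assessors can weigh.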
Ray, Partha; Knowlton, Katharine F.; Shang, Chao; Xia, Kang
2014-01-01
Cephapirin, a cephalosporin antibiotic, is used by the majority of dairy farms in the US. Fecal and urinary excretion of cephapirin could introduce this compound into the environment when manure is land-applied as fertilizer, and may cause development of bacterial resistance to antibiotics critical for human health. The environmental loading of cephapirin by the livestock industry remains unassessed, largely due to a lack of appropriate analytical methods. Therefore, this study aimed to develop and validate a cephapirin quantification method to capture the temporal pattern of cephapirin excretion in dairy cows following intramammary infusion. The method includes an extraction with phosphate buffer and methanol, solid-phase extraction (SPE) clean-up, and quantification using ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS). The LOQ values of the developed method were 4.02 µg kg⁻¹ and 0.96 µg L⁻¹ for feces and urine, respectively. This robust method recovered >60% and >80% of cephapirin from spiked blank fecal and urine samples, respectively, with acceptable intra- and inter-day variation (<10%). Using this method, we detected trace amounts (µg kg⁻¹) of cephapirin in dairy cow feces, while cephapirin in urine was detected at very high concentrations (133 to 480 µg L⁻¹). Cephapirin was primarily excreted via urine, and its urinary excretion was influenced by day (P = 0.03). Peak excretion (2.69 mg) was on day 1 following intramammary infusion and decreased sharply thereafter (0.19, 0.19, 0.08, and 0.17 mg on days 2, 3, 4, and 5, respectively), reflecting a quadratic pattern of excretion (quadratic: P = 0.03). The described method for quantification of cephapirin in bovine feces and urine is sensitive, accurate, and robust, and allowed monitoring of the pattern of cephapirin excretion in dairy cows. These data will help develop manure segregation and treatment methods to minimize the risk of antibiotic loading to the environment from dairy farms.
PMID:25375097
NASA Astrophysics Data System (ADS)
Bode, Felix; Ferré, Ty; Zigelli, Niklas; Emmert, Martin; Nowak, Wolfgang
2018-03-01
Collaboration between academics and practitioners promotes knowledge transfer between research and industry, with both sides benefiting greatly. However, academic approaches are often not feasible given real-world limits on time, cost and data availability, especially for risk and uncertainty analyses. Although the need for uncertainty quantification and risk assessment are clear, there are few published studies examining how scientific methods can be used in practice. In this work, we introduce possible strategies for transferring and communicating academic approaches to real-world applications, countering the current disconnect between increasingly sophisticated academic methods and methods that work and are accepted in practice. We analyze a collaboration between academics and water suppliers in Germany who wanted to design optimal groundwater monitoring networks for drinking-water well catchments. Our key conclusions are: to prefer multiobjective over single-objective optimization; to replace Monte-Carlo analyses by scenario methods; and to replace data-hungry quantitative risk assessment by easy-to-communicate qualitative methods. For improved communication, it is critical to set up common glossaries of terms to avoid misunderstandings, use striking visualization to communicate key concepts, and jointly and continually revisit the project objectives. Ultimately, these approaches and recommendations are simple and utilitarian enough to be transferred directly to other practical water resource related problems.
Code of Federal Regulations, 2014 CFR
2014-01-01
... internal risk rating and segmentation system; risk parameter quantification system; data management and... advanced IRB systems, operational risk management processes, operational risk data and assessment systems... generated on an arm's-length basis between the seller and the obligor (intercompany accounts receivable and...
Quantifying construction and demolition waste: An analytical review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin
2014-09-15
Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C and D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
2014-01-01
Background Various computer-based methods exist for the detection and quantification of protein spots in two-dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two-dimensional Gaussian function curves for the extraction of data from two-dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our methods showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and could be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
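The core of the fitting approach, quantifying a spot from the parameters of a fitted two-dimensional Gaussian, can be sketched on a single synthetic spot. The original algorithm is in MATLAB and handles overlapping spots via compound fitting; this Python sketch, with invented pixel data, covers only the one-spot case.

```python
# Fit a 2-D Gaussian to one synthetic spot and compute its analytic volume.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, amp, x0, y0, sx, sy, offset):
    """Elliptical 2-D Gaussian plus a constant background, flattened."""
    x, y = xy
    return (amp * np.exp(-((x - x0)**2 / (2*sx**2) + (y - y0)**2 / (2*sy**2)))
            + offset).ravel()

# Synthetic spot on a 32x32 pixel patch with additive noise.
rng = np.random.default_rng(1)
x, y = np.meshgrid(np.arange(32), np.arange(32))
true = (200.0, 16.0, 15.0, 3.0, 4.0, 10.0)
img = gauss2d((x, y), *true).reshape(32, 32) + rng.normal(0, 2, (32, 32))

popt, _ = curve_fit(gauss2d, (x, y), img.ravel(),
                    p0=(150, 14, 14, 2, 2, 0))
amp, x0, y0, sx, sy, offset = popt
volume = 2 * np.pi * amp * abs(sx) * abs(sy)   # integral of the Gaussian
print(f"fitted centre ({x0:.1f}, {y0:.1f}), volume {volume:.0f}")
```

Quantifying via the fitted parameters rather than summed pixel intensities is what lets the method separate the contributions of overlapping spots.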
Wang, Qingqing; Zhang, Suhong; Guo, Lili; Busch, Christine M; Jian, Wenying; Weng, Naidong; Snyder, Nathaniel W; Rangiah, Kannan; Mesaros, Clementina; Blair, Ian A
2015-01-01
Background: Absolute quantification of protein biomarkers such as serum apolipoprotein A1 by both immunoassays and LC–MS can provide misleading results. Results: A recombinant ApoA-1 internal standard was prepared using stable isotope labeling by amino acids in cell culture with [¹³C₆,¹⁵N₂]-lysine and [¹³C₉,¹⁵N₁]-tyrosine in human cells. A stable isotope dilution LC–MS method for serum ApoA-1 was validated and levels were analyzed for 50 nonsmokers and 50 smokers. Conclusion: The concentration of ApoA-1 in nonsmokers was 169.4 mg/dl, with an 18.4% reduction to 138.2 mg/dl in smokers. The validated assay will have clinical utility for assessing effects of smoking cessation and therapeutic or dietary interventions in high-risk populations. PMID:26394123
NASA Astrophysics Data System (ADS)
Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier
2015-03-01
With advances in PET tracers for β-amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform, where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.
NASA Astrophysics Data System (ADS)
Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon
2015-10-01
An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, the size-exclusion chromatography method was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained a SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with an ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
RNA-Skim: a rapid method for RNA-Seq quantification at transcript level
Zhang, Zhaojun; Wang, Wei
2014-01-01
Motivation: The RNA-Seq technique has been demonstrated as a revolutionary means for exploring transcriptomes because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to the Microarray technique in gene expression studies, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignments of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers of the transcriptome in quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable to any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish.
It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
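The sig-mer idea can be illustrated with a toy example: k-mers that occur in exactly one transcript cluster identify reads belonging to that cluster. The sequences, cluster assignments, and value of k below are invented for illustration and are far smaller than anything RNA-Skim actually uses.

```python
# Toy sig-mer demonstration: find k-mers unique to one cluster, then score
# a read against each cluster's sig-mer set.
from collections import Counter

def kmers(seq, k=4):
    """All k-mers of a sequence, as a set."""
    return {seq[i:i+k] for i in range(len(seq) - k + 1)}

clusters = {
    "c1": ["ACGTACGTAA", "ACGTACGGTA"],   # similar transcripts
    "c2": ["TTGGCCTTGG", "TTGGCCATGG"],
}
# Union of k-mers per cluster, then keep only those found in a single
# cluster: these are the sig-mers.
cluster_kmers = {c: set().union(*(kmers(t) for t in ts))
                 for c, ts in clusters.items()}
occurrences = Counter(km for s in cluster_kmers.values() for km in s)
sig_mers = {c: {km for km in s if occurrences[km] == 1}
            for c, s in cluster_kmers.items()}

# Assign a read to the cluster whose sig-mers it shares most with.
read = "ACGTACGT"
scores = {c: len(kmers(read) & s) for c, s in sig_mers.items()}
print(max(scores, key=scores.get))  # c1
```

Because sig-mer sets are disjoint by construction, each cluster's abundance estimation can proceed independently, which is what allows the per-cluster parallelism described above.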
Measuring the Risk of Shortfalls in Air Force Capabilities
2004-03-01
quantifying risk and simplifying that quantification in a risk measure is to order different risks and, ultimately, to choose between them. The...the analytic goal of understanding and quantifying risk. The growth in information technology, and the amount of data collected on, for example
Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.
Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian
2017-04-01
Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. 
NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques revealed that the measured collagen content depends directly on the quantification method used as well as on section thickness. Besides electron microscopy-stereology, which was precise and sensitive, light microscopy-stereology and automated image analysis proved appropriate for collagen quantification. Moreover, consideration of collagen localization might be important in revealing minor fibrotic changes. Copyright © 2017 the American Physiological Society.
A phase quantification method based on EBSD data for a continuously cooled microalloyed steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, H.; Wynne, B.P.; Palmiere, E.J., E-mail: e.j
2017-01-15
Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain-averaged misorientation angles, aspect ratios, high-angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that this EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: • A phase quantification method based on EBSD data in the unit of grains was proposed. • The critical grain area above which GAM angles are valid parameters was obtained. • Grain size and grain boundary misorientation were used to identify acicular ferrite. • High cooling rates deteriorate the accuracy of this EBSD-based method.
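The grain-wise classification idea can be sketched as a set of threshold criteria over per-grain features. The feature names follow the abstract; the threshold values, grain data, and decision order below are all invented for illustration and are not the paper's actual criteria.

```python
# Hypothetical grain-wise phase classification from EBSD-derived features.
def classify_grain(gam_deg, aspect_ratio, hagb_fraction):
    """Assign a ferrite type from grain-averaged misorientation (GAM, deg),
    aspect ratio, and high-angle grain boundary (HAGB) fraction.
    All thresholds are illustrative placeholders."""
    if gam_deg < 0.6 and hagb_fraction > 0.5:
        return "polygonal/quasi-polygonal ferrite"
    if aspect_ratio > 2.5 and gam_deg >= 0.6:
        return "acicular ferrite"
    return "bainitic ferrite"

# (GAM, aspect ratio, HAGB fraction) for three toy grains.
grains = [(0.4, 1.2, 0.8), (1.1, 3.0, 0.3), (1.0, 1.5, 0.2)]
labels = [classify_grain(*g) for g in grains]
print(labels)
```

Classifying in the unit of grains, rather than per pixel, is what lets orientation statistics like GAM and HAGB fraction enter the criteria at all.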
Nitride, Chiara; Lee, Victoria; Baricevic-Jones, Ivona; Adel-Patient, Karine; Baumgartner, Sabine; Mills, E N Clare
2018-01-01
Allergen analysis is central to implementing and monitoring food allergen risk assessment and management processes by the food industry, but current methods for the determination of allergens in foods give highly variable results. The European Union-funded "Integrated Approaches to Food Allergen and Allergy Risk Management" (iFAAM) project has been working to address gaps in knowledge regarding food allergen management and analysis, including the development of novel MS and immuno-based allergen determination methods. Common allergenic food ingredients (peanut, hazelnut, walnut, cow's milk [Bos domesticus], and hen's egg [Gallus domesticus]) and common food matrixes (chocolate dessert and cookie) have been used for both clinical studies and analytical method development to ensure that the new methods are clinically relevant. Allergen molecules have been used as analytical targets and allergenic ingredients incurred into matrixes at levels close to reference doses that may trigger the use of precautionary allergen labeling. An interlaboratory method comparison has been undertaken for the determination of peanut in chocolate dessert using MS and immuno-based methods. The iFAAM approach has highlighted the need for methods to report test results in allergenic protein. This will allow food business operators to use them in risk assessments that are founded on clinical study data in which protein has been used as a measure of allergenic potency.
Arvia, Rosaria; Sollai, Mauro; Pierucci, Federica; Urso, Carmelo; Massi, Daniela; Zakrzewska, Krystyna
2017-08-01
Merkel cell polyomavirus (MCPyV) is associated with Merkel cell carcinoma, and a high viral load in the skin has been proposed as a risk factor for the occurrence of this tumour. MCPyV DNA has been detected, with lower frequency, in different skin cancers, but since the viral load was usually low, the real prevalence of viral DNA could be underestimated. The aim was to evaluate the performance of two assays (qPCR and ddPCR) for MCPyV detection and quantification in formalin-fixed paraffin-embedded (FFPE) tissue samples. Both assays were designed for simultaneous detection and quantification of both MCPyV and house-keeping DNA in clinical samples. The performance of MCPyV quantification was investigated using serial dilutions of cloned target DNA. We also evaluated the applicability of both tests for the analysis of 76 FFPE cutaneous biopsies. The two approaches were equivalent with regard to reproducibility and repeatability and showed a high degree of linearity in the dynamic range tested in the present study. Moreover, qPCR was able to quantify ≥10⁵ copies per reaction, while the upper limit of ddPCR was 10⁴ copies. There was no significant difference between the viral loads measured by the two methods. The detection limit of both tests was 0.15 copies per reaction; however, the number of positive samples obtained by ddPCR was higher than that obtained by qPCR (45% and 37%, respectively). ddPCR represents a better method for detection of MCPyV in FFPE biopsies, especially those containing low copy numbers of the viral genome. Copyright © 2017 Elsevier B.V. All rights reserved.
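For context, absolute quantification in ddPCR rests on Poisson statistics over partition counts: the fraction of positive droplets yields the mean copies per droplet independent of amplification efficiency. A minimal sketch, with invented droplet counts and a nominal droplet volume, is:

```python
# Poisson-based ddPCR concentration estimate (illustrative numbers).
import math

def ddpcr_copies(positive, total, droplet_volume_nl=0.85):
    """Estimate copies per microlitre from positive droplet counts."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nl -> ul

print(round(ddpcr_copies(4000, 15000), 1))
```

The logarithmic correction is why ddPCR stays accurate at low copy numbers while saturating at high ones, matching the upper limit reported above.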
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bley, D.C.; Cooper, S.E.; Forester, J.A.
ATHEANA, a second-generation Human Reliability Analysis (HRA) method, integrates advances in psychology with engineering, human factors, and Probabilistic Risk Analysis (PRA) disciplines to provide an HRA quantification process and PRA modeling interface that can accommodate and represent human performance in real nuclear power plant events. The method uses the characteristics of serious accidents, identified through retrospective analysis of serious operational events, to set priorities in a search process for significant human failure events, unsafe acts, and error-forcing context (unfavorable plant conditions combined with negative performance-shaping factors). ATHEANA has been tested in a demonstration project at an operating pressurized water reactor.
Development of economic consequence methodology for process risk analysis.
Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed
2015-04-01
A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
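The "quantification of losses" and "integration of losses" steps can be sketched with an inverted normal loss function; the targets, widths, and maximum-loss values below are invented, and the revised Taguchi variant is omitted for brevity.

```python
# Sketch: map process deviations to economic losses and aggregate them.
import math

def inverted_normal_loss(x, target, max_loss, width):
    """Loss rises from 0 at the target toward max_loss as x deviates."""
    return max_loss * (1.0 - math.exp(-(x - target)**2 / (2 * width**2)))

# Two monitored process variables with current deviations from target
# (values are hypothetical).
losses = [
    inverted_normal_loss(x=108.0, target=100.0, max_loss=5e5, width=10.0),
    inverted_normal_loss(x=2.4, target=2.0, max_loss=2e5, width=0.5),
]
total_loss = sum(losses)   # integration step: aggregate per-variable losses
print(f"total expected loss: ${total_loss:,.0f}")
```

Bounding each loss by a maximum is what distinguishes inverted-normal-type functions from the unbounded quadratic Taguchi loss, which can overstate losses for large deviations.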
Takeno, Shinya; Bamba, Takeshi; Nakazawa, Yoshihisa; Fukusaki, Eiichiro; Okazawa, Atsushi; Kobayashi, Akio
2008-04-01
Commercial development of trans-1,4-polyisoprene from Eucommia ulmoides Oliver (EU-rubber) requires specific knowledge on selection of high-rubber-content lines and establishment of agronomic cultivation methods for achieving maximum EU-rubber yield. The development can be facilitated by high-throughput and highly sensitive analytical techniques for EU-rubber extraction and quantification. In this paper, we described an efficient EU-rubber extraction method, and validated that the accuracy was equivalent to that of the conventional Soxhlet extraction method. We also described a highly sensitive quantification method for EU-rubber by Fourier transform infrared spectroscopy (FT-IR) and pyrolysis-gas chromatography/mass spectrometry (PyGC/MS). We successfully applied the extraction/quantification method for study of seasonal changes in EU-rubber content and molecular weight distribution.
MacDonald, Matthew L.; Ciccimaro, Eugene; Prakash, Amol; Banerjee, Anamika; Seeholzer, Steven H.; Blair, Ian A.; Hahn, Chang-Gyu
2012-01-01
Synaptic architecture and its adaptive changes require numerous molecular events that are both highly ordered and complex. A majority of neuropsychiatric illnesses are complex trait disorders, in which multiple etiologic factors converge at the synapse via many signaling pathways. Investigating the protein composition of synaptic microdomains from human patient brain tissues will yield valuable insights into the interactions of risk genes in many disorders. These types of studies in postmortem tissues have been limited by the lack of proper study paradigms. Thus, it is necessary not only to develop strategies to quantify protein and post-translational modifications at the synapse, but also to rigorously validate them for use in postmortem human brain tissues. In this study we describe the development of a liquid chromatography-selected reaction monitoring method, using a stable isotope-labeled neuronal proteome standard prepared from the brain tissue of a stable isotope-labeled mouse, for the multiplexed quantification of target synaptic proteins in mammalian samples. Additionally, we report the use of this method to validate a biochemical approach for the preparation of synaptic microdomain enrichments from human postmortem prefrontal cortex. Our data demonstrate that a targeted mass spectrometry approach with a true neuronal proteome standard facilitates accurate and precise quantification of over 100 synaptic proteins in mammalian samples, with the potential to quantify over 1000 proteins. Using this method, we found that protein enrichments in subcellular fractions prepared from human postmortem brain tissue were strikingly similar to those prepared from fresh mouse brain tissue. These findings demonstrate that biochemical fractionation methods paired with targeted proteomic strategies can be used in human brain tissues, with important implications for the study of neuropsychiatric disease. PMID:22942359
Joly-Tonetti, Nicolas; Wibawa, Judata I D; Bell, Mike; Tobin, Desmond
2016-07-01
Melanin is the predominant pigment responsible for skin colour; it is synthesized by melanocytes in the basal layer of the epidermis and then transferred to surrounding keratinocytes. Despite its optical properties, melanin is barely detectable in unstained sections of human epidermis. However, identification and localization of melanin are important for the study of skin pigmentation in health and disease. Current methods for the histologic quantification of melanin are suboptimal and are associated with significant risk of misinterpretation. The aim of this study was to reassess the existing literature and to develop a more effective histological method of melanin quantification in human skin. We confirm that Warthin-Starry (WS) stain provides a much more sensitive and more specific melanin detection method than the commonplace Fontana-Masson (FM) stain. For example, WS staining sensitivity allowed the visualization of melanin even in very pale Caucasian skin that was missed by FM or Von Kossa (VK) stains. From our reassessment of the histology-related literature, we conclude that so-called melanin dust is most likely an artifact of discoloration due to non-specific silver deposition in the stratum corneum. Unlike FM and VK, WS was not associated with this non-specific stratum corneum darkening, misinterpreted previously as 'degraded' melanin. Finally, WS melanin particle counts were largely similar to previously reported manual counts by transmission electron microscopy, in contrast to both FM and VK. Together these findings allow us to propose a new histology/ImageJ-informed method for the accurate and precise quantification of epidermal melanin in skin. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Decision Support Methods and Tools
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Alexandrov, Natalia M.; Brown, Sherilyn A.; Cerro, Jeffrey A.; Gumbert, Clyde R.; Sorokach, Michael R.; Burg, Cecile M.
2006-01-01
This paper is one of a set of papers, developed simultaneously and presented within a single conference session, that are intended to highlight systems analysis and design capabilities within the Systems Analysis and Concepts Directorate (SACD) of the National Aeronautics and Space Administration (NASA) Langley Research Center (LaRC). This paper focuses on the specific capabilities of uncertainty/risk analysis, quantification, propagation, decomposition, and management, robust/reliability design methods, and extensions of these capabilities into decision analysis methods within SACD. These disciplines are discussed together herein under the name of Decision Support Methods and Tools. Several examples are discussed which highlight the application of these methods within current or recent aerospace research at NASA LaRC. Where applicable, commercially available or government-developed software tools are also discussed.
Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.
Gerold, Chase T; Bakker, Eric; Henry, Charles S
2018-04-03
In this study, paper-based microfluidic devices (μPADs) capable of K⁺ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K⁺ in the presence of Na⁺, Li⁺, and Mg²⁺ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline-earth ion quantification utilizing μPADs. Colorimetric spot tests allowed for K⁺ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM sample K⁺. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K⁺ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out with the naked eye.
Raulf, M; Buters, J; Chapman, M; Cecchi, L; de Blay, F; Doekes, G; Eduard, W; Heederik, D; Jeebhay, M F; Kespohl, S; Krop, E; Moscato, G; Pala, G; Quirce, S; Sander, I; Schlünssen, V; Sigsgaard, T; Walusiak-Skorupa, J; Wiszniewska, M; Wouters, I M; Annesi-Maesano, I
2014-10-01
Exposure to high molecular weight sensitizers of biological origin is an important risk factor for the development of asthma and rhinitis. Most of the causal allergens have been defined based on their reactivity with IgE antibodies, and in many cases, the molecular structure and function of the allergens have been established. Significant information on allergen levels that cause sensitization and allergic symptoms for several major environmental and occupational allergens has been reported. Monitoring of high molecular weight allergens and allergen carrier particles is an important part of the management of allergic respiratory diseases and requires standardized allergen assessment methods for occupational and environmental (indoor and outdoor) allergen exposure. The aim of this EAACI task force was to review the essential points for monitoring environmental and occupational allergen exposure including sampling strategies and methods, processing of dust samples, allergen analysis, and quantification. The paper includes a summary of different methods for sampling and allergen quantification, as well as their pros and cons for various exposure settings. Recommendations are being made for different exposure scenarios. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis
2017-03-01
A new analytical method for the quantification of olive oil and palm oil in blends with other edible vegetable oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It should be highlighted that with the newly proposed analytical method the chromatographic analysis takes only eight minutes; the results showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
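The three validation metrics named above are straightforward to compute from predicted versus reference concentrations; a small sketch (the example numbers are placeholders, not the paper's data):

```python
import numpy as np

def validation_errors(y_true, y_pred):
    """Root mean square (RMSEV), mean absolute (MAEV), and median absolute
    (MdAEV) errors of validation between reference and predicted values."""
    e = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    rmsev = np.sqrt(np.mean(e ** 2))
    maev = np.mean(np.abs(e))
    mdaev = np.median(np.abs(e))
    return rmsev, maev, mdaev
```

MdAEV is the most robust of the three to a few badly predicted validation blends, which is why reporting all three gives a fuller picture of a PLS-R or SVR model.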
Yan, Neng; Zhu, Zhenli; He, Dong; Jin, Lanlan; Zheng, Hongtao; Hu, Shenghong
2016-01-01
The increasing use of metal-based nanoparticle products has raised concerns in particular for the aquatic environment, and thus the amount of such nanomaterials released from products should be determined to assess their environmental risks. In this study, a simple, rapid and sensitive method for the determination of the size and mass concentration of gold nanoparticles (AuNPs) in aqueous suspension was established by direct coupling of thin layer chromatography (TLC) with catalyzed luminol-H2O2 chemiluminescence (CL) detection. For this purpose, a moving stage was constructed to scan the chemiluminescence signal from TLC-separated AuNPs. The proposed TLC-CL method allows the quantification of differently sized AuNPs (13 nm, 41 nm and 100 nm) contained in a mixture. Various experimental parameters affecting the characterization of AuNPs, such as the concentration of H2O2, the concentration and pH of the luminol solution, and the size of the spectrometer aperture, were investigated. Under optimal conditions, the detection limits for AuNP size fractions of 13 nm, 41 nm and 100 nm were 38.4 μg L⁻¹, 35.9 μg L⁻¹ and 39.6 μg L⁻¹, with repeatabilities (RSD, n = 7) of 7.3%, 6.9% and 8.1%, respectively, for 10 mg L⁻¹ samples. The proposed method was successfully applied to the characterization of AuNP size and concentration in aqueous test samples. PMID:27080702
Wille, Klaas; Claessens, Michiel; Rappé, Karen; Monteyne, Els; Janssen, Colin R; De Brabander, Hubert F; Vanhaecke, Lynn
2011-12-23
The presence of both pharmaceuticals and pesticides in the aquatic environment has become a well-known environmental issue during the last decade. An increasing demand however still exists for sensitive and reliable monitoring tools for these rather polar contaminants in the marine environment. In recent years, the great potential of passive samplers or equilibrium-based sampling techniques for evaluating the fate of these contaminants has been shown in the literature. Therefore, we developed a new analytical method for the quantification of a large number of pharmaceuticals and pesticides in passive sampling devices. The analytical procedure consisted of extraction using 1:1 methanol/acetonitrile followed by detection with ultra-high performance liquid chromatography coupled to high-resolution, high-mass-accuracy Orbitrap mass spectrometry. Validation of the analytical method resulted in limits of quantification and recoveries ranging between 0.2 and 20 ng per sampler sheet and between 87.9 and 105.2%, respectively. Determination of the sampler-water partition coefficients of all compounds demonstrated that several pharmaceuticals and most pesticides exhibit a high affinity for the polydimethylsiloxane passive samplers. Finally, the developed analytical methods were used to measure the time-weighted average (TWA) concentrations of the targeted pollutants in passive samplers deployed at eight stations in the Belgian coastal zone. Propranolol, carbamazepine and seven pesticides were found to be very abundant in the passive samplers. These long-term and large-scale TWA concentrations will contribute to assessing the environmental and human health risk of these emerging pollutants. Copyright © 2011 Elsevier B.V. All rights reserved.
Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...
Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li
2010-07-01
The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expression is a result of true biological or methodical variation. MATERIALS & METHODS: Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods, using identical samples. Greater variations in protein concentration readings were observed over time, and in samples with higher concentrations, with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on western blot. We show for the first time that the methodical variations observed in these protein assay techniques can potentially translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider the methodical approach to protein quantification in techniques that report quantitative differences.
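Both colorimetric assays ultimately interpolate a sample's absorbance against a protein standard curve, and the curves differ between the two chemistries, which is one route by which method choice alters the reported concentrations. A hedged sketch of that interpolation step, with invented BSA calibration values:

```python
import numpy as np

# Hypothetical BSA standard curve (absorbance vs. concentration in µg/mL).
# The values are illustrative placeholders, not measurements from the study.
std_conc = np.array([0.0, 125.0, 250.0, 500.0, 1000.0])  # µg/mL
std_abs = np.array([0.05, 0.18, 0.31, 0.56, 1.04])

# Linear fit of the standard curve: absorbance = slope * conc + intercept
slope, intercept = np.polyfit(std_conc, std_abs, 1)

def protein_conc(absorbance):
    """Interpolate a sample concentration from the linear standard curve.
    Lowry and Bradford readouts both reduce to this step, but with
    different curve shapes, so identical samples can yield different
    concentration estimates depending on the assay chosen."""
    return (absorbance - intercept) / slope
```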
Yan, Xiaowen; Yang, Limin; Wang, Qiuquan
2013-07-01
Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
Learning and Information Approaches for Inference in Dynamic Data-Driven Geophysical Applications
NASA Astrophysics Data System (ADS)
Ravela, S.
2015-12-01
Many geophysical inference problems are characterized by non-linear processes, high-dimensional models and complex uncertainties. A dynamic coupling between models, estimation, and sampling is typically sought to efficiently characterize and reduce uncertainty. This process is, however, fraught with several difficulties, chief among them the ability to deal with model errors, the efficacy of uncertainty quantification, and data assimilation. In this presentation, we present three key ideas from learning and intelligent systems theory and apply them to two geophysical applications. The first idea is the use of Ensemble Learning to compensate for model error, the second is to develop tractable Information Theoretic Learning to deal with non-Gaussianity in inference, and the third is a Manifold Resampling technique for effective uncertainty quantification. We apply these methods first to the development of a cooperative autonomous observing system using sUAS for studying coherent structures. Second, we apply them to the problem of quantifying risk from hurricanes and storm surges in a changing climate. Results indicate that learning approaches can enable new effectiveness in cases where standard approaches to model reduction, uncertainty quantification and data assimilation fail.
Recommendations for standardized reporting of protein electrophoresis in Australia and New Zealand.
Tate, Jillian; Caldwell, Grahame; Daly, James; Gillis, David; Jenkins, Margaret; Jovanovich, Sue; Martin, Helen; Steele, Richard; Wienholt, Louise; Mollee, Peter
2012-05-01
Although protein electrophoresis of serum (SPEP) and urine (UPEP) specimens is a well-established laboratory technique, the reporting of results using this important method varies considerably between laboratories. The Australasian Association of Clinical Biochemists recognized a need to adopt a standardized approach to reporting SPEP and UPEP by clinical laboratories. A Working Party considered available data including published literature and clinical studies, together with expert opinion in order to establish optimal reporting practices. A position paper was produced, which was subsequently revised through a consensus process involving scientists and pathologists with expertise in the field throughout Australia and New Zealand. Recommendations for standardized reporting of protein electrophoresis have been produced. These cover analytical requirements: detection systems; serum protein and albumin quantification; fractionation into alpha-1, alpha-2, beta and gamma fractions; paraprotein quantification; urine Bence Jones protein quantification; paraprotein characterization; and laboratory performance, expertise and staffing. The recommendations also include general interpretive commenting and commenting for specimens with paraproteins and small bands together with illustrative examples of reports. Recommendations are provided for standardized reporting of protein electrophoresis in Australia and New Zealand. It is expected that such standardized reporting formats will reduce both variation between laboratories and the risk of misinterpretation of results.
Pycke, Benny F. G.; Chao, Tzu-Chiao; Herckes, Pierre; Westerhoff, Paul
2013-01-01
Owing to their exceptional properties and versatility, fullerenes are in widespread use for numerous applications. Increased production and use of fullerenes will inevitably result in accelerated environmental release. However, study of the occurrence, fate, and transport of fullerenes in the environment is complicated because a variety of surface modifications can occur as a result of either intentional functionalization or natural processes. To gain a better understanding of the effect and risk of fullerenes on environmental health, it is necessary to acquire reliable data on the parent compounds and their congeners. Whereas currently established quantification methods generally focus on analysis of unmodified fullerenes, we discuss in this review the occurrence and analysis of oxidized fullerene congeners (i.e., their corresponding epoxides and polyhydroxylated derivatives) in the environment and in biological specimens. We present possible strategies for detection and quantification of parent nanomaterials and their various derivatives. PMID:22644149
Using analogues to quantify geological uncertainty in stochastic reserve modelling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wells, B.; Brown, I.
1995-08-01
The petroleum industry seeks to minimize exploration risk by employing the best possible expertise, methods and tools. Is it possible to quantify the success of this process of risk reduction? Due to inherent uncertainty in predicting geological reality, and due to changing environments for hydrocarbon exploration, it is not enough simply to record the proportion of successful wells drilled; in various parts of the world it has been noted that pseudo-random drilling would apparently have been as successful as the actual drilling programme. How, then, should we judge the success of risk reduction? For many years the E&P industry has routinely used Monte Carlo modelling to generate a probability distribution for prospect reserves. One aspect of Monte Carlo modelling which has received insufficient attention, but which is essential for quantifying risk reduction, is the consistency and repeatability with which predictions can be made. Reducing the subjective element inherent in the specification of geological uncertainty allows better quantification of uncertainty in the prediction of reserves, in both exploration and appraisal. Building on work reported at the AAPG annual conventions in 1994 and 1995, the present paper incorporates analogue information with uncertainty modelling. Analogues provide a major step forward in the quantification of risk, but their significance is potentially greater still. The two principal contributors to uncertainty in field and prospect analysis are the hydrocarbon life-cycle and the geometry of the trap. These are usually treated separately. Combining them into a single model is a major contribution to the reduction of risk. This work is based in part on a joint project with Oryx Energy UK Ltd., and thanks are due in particular to Richard Benmore and Mike Cooper.
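Monte Carlo reserve modelling of the kind described samples each factor of the volumetric equation from a distribution and reports percentiles of the resulting product. A generic sketch; the distributions and parameters below are illustrative assumptions, not the paper's inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Each volumetric factor is sampled rather than fixed (all values invented):
grv = rng.lognormal(mean=np.log(5e8), sigma=0.3, size=n)  # gross rock volume, m^3
ntg = rng.triangular(0.40, 0.60, 0.80, size=n)            # net-to-gross ratio
phi = rng.triangular(0.10, 0.18, 0.25, size=n)            # porosity
sw = rng.triangular(0.20, 0.30, 0.45, size=n)             # water saturation
bo = 1.2                                                   # formation volume factor

# In-place volume: the product of all sampled factors
reserves = grv * ntg * phi * (1.0 - sw) / bo

# Industry-style exceedance percentiles of the resulting distribution
p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
```

Repeating this run with analogue-constrained input distributions, as the paper advocates, narrows the spread between P90 and P10 in a way that is consistent across analysts.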
Uncertainties in estimates of the risks of late effects from space radiation
NASA Astrophysics Data System (ADS)
Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.
2004-01-01
Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which causes estimates of the risk of late effects to be highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios including a deep space outpost and Mars missions of duration of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction using the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits.
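The linear-additivity approach described, Monte Carlo sampling from subjective uncertainty distributions of multiplicative factors, can be sketched generically as follows. The factor names, distribution families and parameters here are illustrative assumptions, not the study's actual inputs:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Each factor in the risk product carries its own subjective uncertainty
# distribution (all choices below are invented for illustration):
dose = rng.normal(1.0, 0.10, n)         # physical dose factor
quality = rng.lognormal(0.0, 0.4, n)    # radiation quality factor
risk_coeff = rng.lognormal(0.0, 0.3, n) # risk per unit dose
ddref = rng.uniform(1.0, 2.0, n)        # dose-rate effectiveness factor

# Overall risk as a product of sampled factors (relative units)
risk = dose * quality * risk_coeff / ddref

# Subjective 95% confidence interval on the projection
lo, hi = np.percentile(risk, [2.5, 97.5])
```

The width of the resulting interval, rather than the point estimate, is what the paper uses to judge whether shielding "optimization" produces a distinguishable risk reduction.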
Rauniyar, Navin
2015-01-01
The parallel reaction monitoring (PRM) assay has emerged as an alternative method of targeted quantification. The PRM assay is performed in a high resolution and high mass accuracy mode on a mass spectrometer. This review presents the features that make PRM a highly specific and selective method for targeted quantification using quadrupole-Orbitrap hybrid instruments. In addition, this review discusses the label-based and label-free methods of quantification that can be performed with the targeted approach. PMID:26633379
NASA Astrophysics Data System (ADS)
Restaino, Stephen M.; White, Ian M.
2017-03-01
Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNA concentrations commonly exist in relatively low concentrations, amplification methods (e.g. PCR) are therefore required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.
NASA Astrophysics Data System (ADS)
Denis, Vincent
2008-09-01
This paper presents a statistical method for determining the dimensions, tolerances and specifications of components for the Laser MegaJoule (LMJ). Numerous constraints inherent to a large facility require specific tolerances: the huge number of optical components; the interdependence of these components between the beams of the same bundle; angular multiplexing in the amplifier section; distinct operating modes between the alignment and firing phases; and the definition and use of alignment software in place of classic optimization. This method provides greater flexibility to determine the positioning and manufacturing specifications of the optical components. Given the enormous power of the Laser MegaJoule (over 18 kJ in the infrared and 9 kJ in the ultraviolet), one of the major risks is damage to the optical mounts and pollution of the installation by mechanical ablation. This method enables estimation of beam occultation probabilities and quantification of the risks for the facility. All the simulations were run using the ZEMAX-EE optical design software.
Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won
2015-01-01
Background: Maltol, a phenolic compound, is produced by the browning reaction during high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R2 = 1.00). The limit of detection value of maltol was 0.26 μg/mL, and the limit of quantification value was 0.79 μg/mL. The relative standard deviations (RSDs) of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD value of 0.21–1.65%. The developed method was applied successfully to quantify maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method was useful for the quantification of maltol in various ginseng products. PMID:26246746
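Detection and quantification limits of this kind are conventionally derived from the calibration slope and the standard deviation of the response. A sketch using the common ICH 3.3σ/S and 10σ/S formulas; the paper's exact derivation procedure may differ:

```python
def lod_loq(sigma_response, slope):
    """ICH-style limits of detection and quantification from a calibration
    slope and the standard deviation of the (blank or low-level) response.
    The 3.3 and 10 factors are the conventional choices; a specific study
    may estimate sigma and derive its limits differently."""
    lod = 3.3 * sigma_response / slope
    loq = 10.0 * sigma_response / slope
    return lod, loq
```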
Artifacts Quantification of Metal Implants in MRI
NASA Astrophysics Data System (ADS)
Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.
2017-11-01
The presence of materials with different magnetic properties, such as metal implants, causes distortion of the magnetic field locally, resulting in signal voids and pile ups, i.e. susceptibility artifacts in MRI. Quantitative and unbiased measurement of the artifact is prerequisite for optimization of acquisition parameters. In this study an image gradient based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. Then the artifact is quantified in terms of its extent by an automated cross entropy thresholding method as image area percentage. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 in case of titanium and stainless steel implants). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
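The gradient-then-threshold pipeline can be sketched compactly. Here a fixed threshold stands in for the automated cross-entropy thresholding step, so this illustrates the structure of the method rather than the exact algorithm:

```python
import numpy as np

def artifact_area_percent(image, threshold):
    """Quantify susceptibility-artifact extent as the percentage of image
    area whose gradient magnitude exceeds a threshold. Abrupt signal voids
    and pile-ups produce large gradients; a fixed threshold is used here in
    place of the automated cross-entropy thresholding described."""
    gy, gx = np.gradient(image.astype(float))  # per-axis finite differences
    gmag = np.hypot(gx, gy)                    # gradient magnitude
    return 100.0 * np.count_nonzero(gmag > threshold) / gmag.size
```

A uniform region scores 0%, while sharp implant-induced signal boundaries drive the percentage up, which is the sense in which the measure tracks artifact extent.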
Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations
Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.
2013-01-01
Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
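Rolling peptide-level MS1 measurements up to a protein-level estimate is exactly the post-processing step the paper evaluates; one common robust choice, shown purely as an illustration rather than as the paper's preferred method, is the median of log-ratios:

```python
import numpy as np

def protein_ratio(peptide_ratios):
    """Roll peptide-level MS1 abundance ratios up to one protein-level
    estimate. Taking the median in log space down-weights outlier peptides
    (e.g. poorly ionizing or interfered species); it is one of several
    aggregation choices such methods compare."""
    logs = np.log2(np.asarray(peptide_ratios, dtype=float))
    return float(2.0 ** np.median(logs))
```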
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation, while the system/physics codes model the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand the data, i.e., to recognize patterns in it. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
Donovan, Carl; Harwood, John; King, Stephanie; Booth, Cormac; Caneco, Bruno; Walker, Cameron
2016-01-01
There are many developments for offshore renewable energy around the United Kingdom whose installation typically produces large amounts of far-reaching noise, potentially disturbing many marine mammals. The potential to affect the favorable conservation status of many species means extensive environmental impact assessment requirements for the licensing of such installation activities. Quantification of such complex risk problems is difficult and much of the key information is not readily available. Expert elicitation methods can be employed in such pressing cases. We describe the methodology used in an expert elicitation study conducted in the United Kingdom for combining expert opinions based on statistical distributions and copula-like methods.
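As a simpler point of comparison with the copula-like pooling described, a linear opinion pool mixes samples from each expert's elicited distribution in proportion to a weight, ignoring between-expert dependence. Everything below (experts, distributions, weights) is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two hypothetical experts' elicited distributions for some disturbance
# probability, plus weights reflecting assessed expertise (all invented).
experts = [
    lambda size: rng.normal(0.20, 0.05, size),
    lambda size: rng.normal(0.35, 0.10, size),
]
weights = np.array([0.6, 0.4])

# Linear opinion pool: draw from each expert in proportion to its weight.
n = 10_000
counts = rng.multinomial(n, weights)
pooled = np.concatenate([draw(c) for draw, c in zip(experts, counts)])
```

Copula-based combination goes further by modelling the correlation between experts' judgments, which a simple pool like this cannot capture.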
Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo
2013-09-17
We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
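The 70% rule turns matrix suppression from a nuisance into a gate on quantification. A sketch of the ratio-calibration idea; the function shapes and numbers are illustrative, not taken from the paper:

```python
def matrix_suppression(matrix_ions_blank, matrix_ions_sample):
    """Fractional suppression of the matrix ion signal caused by the
    analyte (0 = no suppression, 1 = complete suppression)."""
    return 1.0 - matrix_ions_sample / matrix_ions_blank

def quantify(peptide_ions, matrix_ions, slope, suppression, limit=0.70):
    """Peptide-to-matrix ion abundance-ratio calibration, gated by the 70%
    rule: above the limit the anomalous suppression regime forbids reliable
    quantification, so no concentration is reported. Assumes a calibration
    of the form ratio = slope * concentration."""
    if suppression > limit:
        return None  # anomalous suppression: refuse to quantify
    return (peptide_ions / matrix_ions) / slope
```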
Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen
2017-01-01
Meat products often consist of meat from multiple animal species, and adulteration and mislabeling of food products can negatively affect consumers. Therefore, a cost-effective and reliable method for the identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.
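Droplet digital PCR quantification rests on Poisson statistics: from the fraction p of positive droplets, the mean copies per droplet is λ = −ln(1 − p). A sketch of that calculation (the droplet volume is a typical value for commercial ddPCR systems, assumed here for illustration):

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
    """Estimate target concentration (copies/µL) from droplet digital PCR
    counts via the Poisson correction. The ~0.85 nL droplet volume is a
    typical figure assumed for illustration; real runs use the calibrated
    volume of the instrument."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # nL -> µL
```

In a duplex assay like the one described, running this once per target (beef and pork probes) in the same tube yields the two concentrations whose ratio gives the species composition.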
de Kinkelder, R; van der Veen, R L P; Verbaak, F D; Faber, D J; van Leeuwen, T G; Berendschot, T T J M
2011-01-01
Purpose: Accurate assessment of the amount of macular pigment (MPOD) is necessary to investigate the role of carotenoids and their assumed protective functions. High repeatability and reliability are important to monitor patients in studies investigating the influence of diet and supplements on MPOD. We evaluated the Macuscope (Macuvision Europe Ltd., Lapworth, Solihull, UK), a recently introduced device for measuring MPOD using the technique of heterochromatic flicker photometry (HFP). We determined agreement with another HFP device (QuantifEye; MPS 9000 series: Tinsley Precision Instruments Ltd., Croydon, Essex, UK) and a fundus reflectance method. Methods: The right eyes of 23 healthy subjects (mean age 33.9±15.1 years) were measured. We determined agreement with QuantifEye and correlation with a fundus reflectance method. Repeatability of QuantifEye was assessed in 20 other healthy subjects (mean age 32.1±7.3 years). Repeatability was also compared with measurements by a fundus reflectance method in 10 subjects. Results: We found low agreement between test and retest measurements with Macuscope. The average difference and the limits of agreement were −0.041±0.32. We found high agreement between test and retest measurements of QuantifEye (−0.02±0.18) and the fundus reflectance method (−0.04±0.18). MPOD data obtained by Macuscope and QuantifEye showed poor agreement: −0.017±0.44. For Macuscope and the fundus reflectance method, the correlation coefficient was r=0.05 (P=0.83). A significant correlation of r=0.87 (P<0.001) was found between QuantifEye and the fundus reflectance method. Conclusions: Because repeatability of Macuscope measurements was low (ie, wide limits of agreement) and MPOD values correlated poorly with the fundus reflectance method, and agreed poorly with QuantifEye, the tested Macuscope protocol seems less suitable for studying MPOD. PMID:21057522
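The agreement figures quoted (a mean difference with limits of agreement) follow the standard Bland-Altman construction, which is easy to reproduce; a minimal sketch:

```python
import numpy as np

def limits_of_agreement(a, b):
    """Bland-Altman bias (mean paired difference) and 95% limits of
    agreement (bias ± 1.96 SD of the differences) between two methods'
    paired measurements of the same subjects."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, bias - half_width, bias + half_width
```

Wide limits, as reported for the Macuscope test-retest data, mean an individual measurement could plausibly differ from a repeat by more than the clinically relevant MPOD range.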
Fluorescent quantification of melanin.
Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur
2016-11-01
Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.
2015-01-01
Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788
Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin
2012-08-01
A facile proteomic quantification method, fluorescent labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, FLAQ is a chromatography-based quantification combined with MS for identification. Multidimensional liquid chromatography (MDLC) with high-accuracy laser-induced fluorescence (LIF) detection and a tandem MS system were employed for FLAQ. Several requirements must be met by a fluorescent label for MS identification: labeling completeness, minimal side-reactions, simple MS spectra, and no extra tandem MS fragmentations during structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was finally chosen to label proteins on all cysteine residues. The dye was compatible with trypsin digestion and MALDI MS identification. Quantitative labeling was achieved by optimizing the reaction conditions. A synthesized peptide and model proteins, BSA (35 cysteines) and OVA (five cysteines), were used to verify the completeness of labeling. Proteins were separated by MDLC and quantified from fluorescent intensities, followed by MS identification. High accuracy (RSD < 1.58%) and a wide linear range of quantification (1 to 10^5) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. A subset of proteins in the human liver proteome was quantified and demonstrated using FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Rathi, Bhasker; Siade, Adam J.; Donn, Michael J.; Helm, Lauren; Morris, Ryan; Davis, James A.; Berg, Michael; Prommer, Henning
2017-12-01
Coal seam gas production involves generation and management of large amounts of co-produced water. One of the most suitable methods of management is injection into deep aquifers. Field injection trials may be used to support the predictions of anticipated hydrological and geochemical impacts of injection. The present work employs reactive transport modeling (RTM) for a comprehensive analysis of data collected from a trial where arsenic mobilization was observed. Arsenic sorption behavior was studied through laboratory experiments, accompanied by the development of a surface complexation model (SCM). A field-scale RTM that incorporated the laboratory-derived SCM was used to simulate the data collected during the field injection trial and then to predict the long-term fate of arsenic. We propose a new practical procedure which integrates laboratory and field-scale models using a Monte Carlo type uncertainty analysis and alleviates a significant proportion of the computational effort required for predictive uncertainty quantification. The results illustrate that both arsenic desorption under alkaline conditions and pyrite oxidation have likely contributed to the arsenic mobilization that was observed during the field trial. The predictive simulations show that arsenic concentrations would likely remain very low if the potential for pyrite oxidation is minimized through complete deoxygenation of the injectant. The proposed modeling and predictive uncertainty quantification method can be implemented for a wide range of groundwater studies that investigate the risks of metal(loid) or radionuclide contamination.
Multi-residue method for the determination of antibiotics and some of their metabolites in seafood.
Serra-Compte, Albert; Álvarez-Muñoz, Diana; Rodríguez-Mozaz, Sara; Barceló, Damià
2017-06-01
The presence of antibiotics in seafood for human consumption may pose a risk for consumers. A methodology for the analysis of antibiotics in seafood based on QuEChERS (quick, easy, cheap, effective, rugged, and safe) extraction, followed by detection and quantification using liquid chromatography coupled to mass spectrometry, was developed. The analytical method was evaluated for the determination of 23 antibiotics (including parent compounds and some metabolites) in fish, mussels and clams. Recoveries ranged between 30% and 70% for most of the compounds, and method detection and quantification limits (MDLs and MQLs) were between 0.01 and 0.31 ng/g dry weight (dw) and 0.02-1.03 ng/g (dw), respectively. Real seafood samples were analysed using this method. Nine antibiotics were found at levels above their MDLs; however, none of them exceeded the maximum residue limits (MRLs) established by the authorities. Tetracycline was the most ubiquitous compound and also presented the highest concentration: 5.63 ng/g (dw) in fish from the Netherlands. In addition, an alternative technique based on microbial growth inhibition was explored as a semiquantitative detection method for antibiotics in seafood. This methodology could be applied as a fast screening technique for the detection of macrolides and β-lactams in seafood, but further research is needed for other antibiotic families. Copyright © 2016 Elsevier Ltd. All rights reserved.
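The abstract reports MDLs and MQLs without stating how they were derived. A common signal-to-noise convention (an assumption here, not necessarily the authors' procedure) places the detection limit at 3 times the standard deviation of blank replicates and the quantification limit at 10 times, converted to concentration via the calibration slope:

```python
def detection_limits(sd_blank, slope):
    """Common 3-sigma / 10-sigma convention: method detection limit (MDL)
    and method quantification limit (MQL) from the standard deviation of
    blank replicate signals and the calibration slope (signal per ng/g)."""
    mdl = 3.0 * sd_blank / slope
    mql = 10.0 * sd_blank / slope
    return mdl, mql

# Hypothetical blank noise and calibration slope
mdl, mql = detection_limits(sd_blank=0.03, slope=1.5)
```

Under this convention the MQL is always 10/3 of the MDL, which is roughly consistent with the MDL and MQL ranges quoted in the abstract.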
[Classical and molecular methods for identification and quantification of domestic moulds].
Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S
2017-12-01
To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods fall into three categories: surface sampling, which is easy to perform but non-quantitative; air sampling, which is easy to calibrate but provides only time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores; they are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while waiting for more standardised high-throughput sequencing methods to assess mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Mattarozzi, Monica; Suman, Michele; Cascio, Claudia; Calestani, Davide; Weigel, Stefan; Undas, Anna; Peters, Ruud
2017-01-01
Estimating consumer exposure to nanomaterials (NMs) in food products and predicting their toxicological properties are necessary steps in the assessment of the risks of this technology. To this end, analytical methods have to be available to detect, characterize and quantify NMs in food and materials related to food, e.g. food packaging and biological samples following metabolization of food. The challenge for the analytical sciences is that the characterization of NMs requires chemical as well as physical information. This article offers a comprehensive analysis of methods available for the detection and characterization of NMs in food and related products. Special attention was paid to the crucial role of sample preparation methods since these have been partially neglected in the scientific literature so far. The currently available instrumental methods are grouped as fractionation, counting and ensemble methods, and their advantages and limitations are discussed. We conclude that much progress has been made over the last 5 years but that many challenges still exist. Future perspectives and priority research needs are pointed out. Graphical abstract: two possible analytical strategies for the sizing and quantification of nanoparticles: asymmetric flow field-flow fractionation with multiple detectors (determines true size and a mass-based particle size distribution), and single-particle inductively coupled plasma mass spectrometry (determines a spherical-equivalent particle diameter and a number-based particle size distribution).
van Heerden, J; Ehlers, M M; Heim, A; Grabow, W O K
2005-01-01
Human adenoviruses (HAds), of which there are 51 serotypes, are associated with gastrointestinal, respiratory, urinary tract and eye infections. The importance of water in the transmission of HAds and the potential health risks constituted by HAds in these environments are widely recognized. Adenoviruses have not previously been quantified in river and treated drinking water samples. In this study, HAds in river water and treated drinking water sources in South Africa were detected, quantified and typed. Adenoviruses were recovered from the water samples using a glass wool adsorption-elution method followed by polyethylene glycol/NaCl precipitation for secondary concentration. The sensitivity and specificity of two nested PCR methods were compared for detection of HAds in the water samples. Over a 1-year period (June 2002 to July 2003), HAds were detected in 5.32% (10/188) of the treated drinking water samples and 22.22% (10/45) of the river water samples using the conventional nested PCR method. The HAds detected in the water samples were quantified using a real-time PCR method. The original treated drinking water and river water samples contained an estimated less than one copy of HAd DNA per litre. The hexon-PCR products used for typing HAds were directly sequenced or cloned into plasmids before sequencing. In treated drinking water samples, species D HAds predominated. In addition, adenovirus serotypes 2, 40 and 41 were each detected in three different treated drinking water samples. Most (70%) of the HAds detected in the river water samples analysed were enteric HAds (serotypes 40 and 41). One HAd serotype 2 and two species D HAds were detected in the river water. Adenoviruses detected in river and treated drinking water samples were successfully quantified and typed.
The detection of HAds in drinking water supplies treated and disinfected by internationally recommended methods, and which conform to quality limits for indicator bacteria, warrants an investigation of the risk of infection constituted by these viruses. The risk of infection may have implications for the management of drinking water quality. This study is unique as it is the first report on the quantification and typing of HAds in treated drinking water and river water. This baseline data is necessary for the meaningful assessment of the potential risk of infection constituted by these viruses.
Uncertainties in Projecting Risks of Late Effects from Space Radiation
NASA Astrophysics Data System (ADS)
Cucinotta, F.; Schimmerling, W.; Peterson, L.; Wilson, J.; Saganti, P.; Dicello, J.
The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, CNS risks, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which makes estimates of the risk of late effects highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including ISS, a lunar station, a deep space outpost, and Mars missions of 360, 660, and 1000 days' duration. The major results are the quantification of the uncertainties in current risk estimates, the identification of the primary factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction from the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in quantitative terms, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.
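The Monte-Carlo procedure described, sampling each factor of the risk product from a subjective uncertainty distribution and reading confidence limits off the resulting distribution, can be sketched as follows. The factor names, distribution choices, and numbers here are illustrative assumptions, not the authors' actual inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # number of Monte-Carlo samples

# Illustrative subjective uncertainty distributions for each factor in the
# multiplicative risk projection (each centered on a relative value of 1).
dose = rng.normal(1.0, 0.1, n)               # physical dosimetry factor
quality = rng.lognormal(0.0, 0.5, n)         # radiation quality factor
risk_coeff = rng.lognormal(0.0, 0.3, n)      # low-LET risk coefficient
transfer = rng.triangular(0.5, 1.0, 1.5, n)  # population transfer model

point_estimate = 0.03  # illustrative baseline risk for the mission scenario
risk = point_estimate * dose * quality * risk_coeff * transfer

# 95% subjective confidence interval on the projected risk
lo, hi = np.percentile(risk, [2.5, 97.5])
```

Because the factors multiply, the heavy-tailed (lognormal) components dominate the spread, which is why the abstract notes that reducing the largest single uncertainty (the GCR quality factor) matters more than optimizing shielding within the current uncertainty band.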
Nucleic acids-based tools for ballast water surveillance, monitoring, and research
NASA Astrophysics Data System (ADS)
Darling, John A.; Frederick, Raymond M.
2018-03-01
Understanding the risks of biological invasion posed by ballast water, whether in the context of compliance testing, routine monitoring, or basic research, is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to dramatically increase sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high throughput sequencing (HTS) could similarly impact broader efforts to catalogue biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport, and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development.
In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size classes), gaps in nucleic acids reference databases are still considerable, uncertainties in taxonomic assignment methods persist, and many applications have not yet matured sufficiently to offer standardized methods capable of meeting rigorous quality assurance standards. Nevertheless, the potential value of these tools, their growing utilization in biodiversity monitoring, and the rapid methodological advances over the past decade all suggest that they should be seriously considered for inclusion in the ballast water surveillance toolkit.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account system integration risks such as those attributable to manufacturing and assembly, which often dominate component-level risk. While the consequence of failure is often understood, using predicted values in a risk model to estimate the probability of occurrence may underestimate the actual risk. Managers and decision makers use the probability of occurrence to inform the determination of whether to accept the risk or require a design modification. The actual risk threshold for acceptance may not be fully understood in the absence of system-level test data or operational data. This paper establishes a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data, providing a set of guidelines that may be useful in arriving at a more realistic quantification of risk prior to acceptance by a program.
A simple method for the extraction and quantification of photopigments from Symbiodinium spp.
John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192).
We have developed a simple, mild extraction procedure using methanol which, when...
Tomaselli Muensterman, Elena; Tisdale, James E
2018-06-08
Prolongation of the heart rate-corrected QT (QTc) interval increases the risk for torsades de pointes (TdP), a potentially fatal arrhythmia. The likelihood of TdP is higher in patients with risk factors, which include female sex, older age, heart failure with reduced ejection fraction, hypokalemia, hypomagnesemia, concomitant administration of ≥ 2 QTc interval-prolonging medications, among others. Assessment and quantification of risk factors may facilitate prediction of patients at highest risk for developing QTc interval prolongation and TdP. Investigators have utilized the field of predictive analytics, which generates predictions using techniques including data mining, modeling, machine learning, and others, to develop methods of risk quantification and prediction of QTc interval prolongation. Predictive analytics have also been incorporated into clinical decision support (CDS) tools to alert clinicians regarding patients at increased risk of developing QTc interval prolongation. The objectives of this paper are to assess the effectiveness of predictive analytics for identification of patients at risk of drug-induced QTc interval prolongation, and to discuss the efficacy of incorporation of predictive analytics into CDS tools in clinical practice. A systematic review of English language articles (human subjects only) was performed, yielding 57 articles, with an additional 4 articles identified from other sources; a total of 10 articles were included in this review. Risk scores for QTc interval prolongation have been developed in various patient populations including those in cardiac intensive care units (ICUs) and in broader populations of hospitalized or health system patients. One group developed a risk score that includes information regarding genetic polymorphisms; this score significantly predicted TdP. 
Development of QTc interval prolongation risk prediction models and incorporation of these models into CDS tools reduces the risk of QTc interval prolongation in cardiac ICUs and identifies health-system patients at increased risk for mortality. The impact of these QTc interval prolongation predictive analytics on overall patient safety outcomes, such as TdP and sudden cardiac death relative to the cost of development and implementation, requires further study. This article is protected by copyright. All rights reserved.
Shah, Sumit J.; Yu, Kenneth H.; Sangar, Vineet; Parry, Samuel I.; Blair, Ian A.
2009-01-01
Spontaneous preterm birth (PTB) before 37 completed weeks of gestation resulting from preterm labor (PTL) is a leading contributor to perinatal morbidity and mortality. Early identification of at-risk women by reliable screening tests could alleviate this health issue; however, conventional methods such as obstetric history and clinical risk factors, uterine activity monitoring, biochemical markers, and cervical sonography for screening women at risk for PTB have proven unsuccessful in lowering the rate of PTB. Cervicovaginal fluid (CVF) might prove to be a useful, readily available biological fluid for identifying diagnostic PTB biomarkers. Human columnar epithelial endocervical-1 (End1) and vaginal (Vk2) cell secretomes were employed to generate a stable isotope labeled proteome (SILAP) standard to facilitate characterization and relative quantification of proteins present in CVF. The SILAP standard was prepared using stable isotope labeling by amino acids in cell culture (SILAC) of End1 and Vk2 through seven passages. The labeled secreted proteins from both cell lines were combined and characterized by liquid chromatography-tandem mass spectrometry (LC-MS/MS). 1211 proteins were identified in the End1-Vk2 SILAP standard, with 236 proteins being consistently identified in each of the replicates analyzed. Individual proteins were found to contain <0.5% of the endogenous unlabeled forms. Identified proteins were screened to provide a set of fifteen candidates that have either previously been identified as potential PTB biomarkers or could be linked mechanistically to PTB. Stable isotope dilution LC-multiple reaction monitoring (MRM/MS) assays were then developed for conducting relative quantification of the fifteen candidate biomarkers in human CVF samples from term and PTB cases.
Three proteins were significantly elevated in PTB cases (desmoplakin isoform 1, stratifin, and thrombospondin 1 precursor), providing a foundation for further validation in larger patient cohorts. PMID:19271751
Shah, Sumit J; Yu, Kenneth H; Sangar, Vineet; Parry, Samuel I; Blair, Ian A
2009-05-01
Spontaneous preterm birth (PTB) before 37 completed weeks of gestation resulting from preterm labor (PTL) is a leading contributor to perinatal morbidity and mortality. Early identification of at-risk women by reliable screening tests could alleviate this health issue; however, conventional methods such as obstetric history and clinical risk factors, uterine activity monitoring, biochemical markers, and cervical sonography for screening women at risk for PTB have proven unsuccessful in lowering the rate of PTB. Cervicovaginal fluid (CVF) might prove to be a useful, readily available biological fluid for identifying diagnostic PTB biomarkers. Human columnar epithelial endocervical-1 (End1) and vaginal (Vk2) cell secretomes were employed to generate a stable isotope labeled proteome (SILAP) standard to facilitate characterization and relative quantification of proteins present in CVF. The SILAP standard was prepared using stable isotope labeling by amino acids in cell culture (SILAC) of End1 and Vk2 through seven passages. The labeled secreted proteins from both cell lines were combined and characterized by liquid chromatography-tandem mass spectrometry (LC-MS/MS). In total, 1211 proteins were identified in the End1-Vk2 SILAP standard, with 236 proteins being consistently identified in each of the replicates analyzed. Individual proteins were found to contain <0.5% of the endogenous unlabeled forms. Identified proteins were screened to provide a set of 15 candidates that have either previously been identified as potential PTB biomarkers or could be linked mechanistically to PTB. Stable isotope dilution LC-multiple reaction monitoring (MRM/MS) assays were then developed for conducting relative quantification of the 15 candidate biomarkers in human CVF samples from term and PTB cases.
Three proteins were significantly elevated in PTB cases (desmoplakin isoform 1, stratifin, and thrombospondin 1 precursor), providing a foundation for further validation in larger patient cohorts.
A strategy to facilitate cleanup at the Mare Island Naval Station
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collins, J.; Albert, D.
1995-12-31
A strategy based on an early, realistic estimation of ecological risk was devised to facilitate cleanup of installation restoration units at the Mare Island Naval Station. The strategy uses the results of 100 years of soil-plant studies, which centered on maximizing the bioavailability of nutrients for crop growth. The screening strategy classifies sites according to whether they present (1) little or no ecological risk and require no further action, (2) an immediate and significant risk, or (3) an ecological risk that requires further quantification. The strategy assumes that the main focus of screening-level risk assessment is quantification of the potential for abiotic-to-biotic transfer (bioavailability) of contaminants, especially at lower trophic levels where exposure is likely to be at a maximum. Sediment screening criteria developed by the California Environmental Protection Agency are used as one regulatory endpoint for evaluating total chemical concentrations. A realistic estimation of risk is then determined by estimating the bioavailability of contaminants.
Uncertainties in Estimates of the Risks of Late Effects from Space Radiation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P.; Dicelli, J. F.
2002-01-01
The health risks faced by astronauts from space radiation include cancer, cataracts, hereditary effects, and non-cancer morbidity and mortality risks related to the diseases of old age. Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because of the limited radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions, which makes estimates of the risk of late effects highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Within the linear-additivity model, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain a maximum likelihood estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including ISS, a lunar station, a deep space outpost, and Mars missions of 360, 660, and 1000 days' duration. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction from the "optimization" of shielding materials or configurations. In contrast, the design of shielding optimization approaches for solar particle events and trapped protons can be made at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative objectives, i.e., the number of days in space without exceeding a given risk level within well-defined confidence limits.
Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H
2013-08-01
Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
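The joint assay described above can be sketched as two steps: Poisson-corrected quantification of target concentration from droplet counts, plus a size read-out from the reported correlation between droplet fluorescence and amplicon size. The linear calibration constants and droplet volume below are hypothetical placeholders, not values from the paper:

```python
import math

def ddpcr_concentration(n_total, n_positive, droplet_volume_ul=0.00085):
    """Poisson-corrected target concentration (copies/uL) from ddPCR droplet
    counts: the fraction of positive droplets underestimates the mean number
    of copies per droplet, which -ln(1 - p) corrects for."""
    p = n_positive / n_total
    copies_per_droplet = -math.log(1.0 - p)
    return copies_per_droplet / droplet_volume_ul

def fragment_size(mean_fluorescence, slope=-2.0, intercept=12000.0):
    """Hypothetical linear calibration: if longer amplicons yield lower
    positive-droplet fluorescence, size can be read off an inverse fit."""
    return (mean_fluorescence - intercept) / slope

conc = ddpcr_concentration(n_total=15000, n_positive=3000)
size = fragment_size(mean_fluorescence=11200.0)
```

With 20% positive droplets the Poisson correction matters little, but near saturation it is essential; the size calibration would be fit per assay from amplicons of known length.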
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantification and introduces a new treatment of shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with better dynamic range.
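freeQuant's exact algorithms are not given in the abstract, but the spectral-count baseline it builds on can be illustrated with the normalized spectral abundance factor (NSAF), a standard label-free measure that corrects counts for protein length; the protein counts and lengths below are made-up examples:

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor: each protein's spectral count is
    divided by its sequence length (longer proteins yield more peptides and
    thus more spectra), then normalized so all proteins sum to 1."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Hypothetical three-protein example: spectral counts and sequence lengths
abundances = nsaf([30, 10, 60], [300, 100, 200])
```

Extensions of this idea, such as the shared-peptide handling and ion-intensity weighting the abstract mentions, refine how counts are apportioned before this normalization step.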
Turner, Clare E; Russell, Bruce R; Gant, Nicholas
2015-11-01
Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages which employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6%±8.6). Results suggest that software packages that employ the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
Wille, Sarah M R; Di Fazio, Vincent; Ramírez-Fernandez, Maria del Mar; Kummer, Natalie; Samyn, Nele
2013-02-01
"Driving under the influence of drugs" (DUID) has a large impact on worldwide mortality risk; therefore, DUID legislation based on impairment or analytical limits has been adopted. Drug detection in oral fluid is of interest due to the ease of sampling during roadside controls. The prevalence of Δ9-tetrahydrocannabinol (THC) in seriously injured drivers ranges from 0.5% to 7.6% in Europe. For these reasons, the quantification of THC in oral fluid collected with 3 alternative on-site collectors is presented and discussed in this publication. An ultra-performance liquid chromatography-mass spectrometric quantification method for THC in oral fluid samples collected with the StatSure (Diagnostic Systems), Quantisal (Immunalysis), and Certus (Concateno) devices was validated according to international guidelines. Small sample volumes of 100-200 μL were extracted using hexane. Special attention was paid to factors such as matrix effects, THC adsorption onto the collector, and stability in the collection fluid. A relatively high-throughput analysis was developed and validated according to ISO 17025 requirements. Although the effects of the matrix on the quantification could be minimized using a deuterated internal standard, and stability was acceptable according to the validation data, adsorption of THC onto the collectors was a problem. For the StatSure device, THC was totally recovered from the collector pad after storage for 24 hours at room temperature or 7 days at 4°C. A loss of 15%-25% was observed for the Quantisal collector, whereas the recovery from the Certus device was irreproducible (relative standard deviation, 44%-85%) and low (29%-80%). In the roadside setting, a practical problem arose: only small volumes of oral fluid (eg, 300 μL) were collected. However, THC was easily detected, and concentrations ranged from 8 to 922 ng/mL in neat oral fluid.
A relatively high-throughput analysis (40 samples in 4 hours) adapted for routine DUID analysis was developed and validated for THC quantification in oral fluid samples collected from drivers under the influence of cannabis.
Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V
2015-12-01
Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Uncertainty quantification in flood risk assessment
NASA Astrophysics Data System (ADS)
Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto
2017-04-01
Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.
Quantification of mixed chimerism by real time PCR on whole blood-impregnated FTA cards.
Pezzoli, N; Silvy, M; Woronko, A; Le Treut, T; Lévy-Mozziconacci, A; Reviron, D; Gabert, J; Picard, C
2007-09-01
This study investigated the quantification of chimerism in sex-mismatched transplantations by quantitative real-time PCR (RQ-PCR), using FTA paper for blood sampling. First, we demonstrate that quantification of DNA from EDTA-blood deposited on FTA cards is accurate and reproducible. Second, we show that the fraction of recipient cells detected by RQ-PCR was concordant between the FTA method and the salting-out method, the reference DNA extraction method. Furthermore, the sensitivity of detection of recipient cells is similar with the two methods. Our results show that this innovative method can be used for mixed chimerism (MC) assessment by RQ-PCR.
Bioanalytical methods for determination of tamoxifen and its phase I metabolites: a review.
Teunissen, S F; Rosing, H; Schinkel, A H; Schellens, J H M; Beijnen, J H
2010-12-17
The selective estrogen receptor modulator tamoxifen is used in the treatment of early and advanced breast cancer and in selected cases for breast cancer prevention in high-risk subjects. The cytochrome P450 enzyme system and flavin-containing monooxygenase are responsible for the extensive metabolism of tamoxifen into several phase I metabolites that vary in toxicity and potencies towards estrogen receptor (ER) alpha and ER beta. An extensive overview of publications on the determination of tamoxifen and its phase I metabolites in biological samples is presented. In these publications techniques were used such as capillary electrophoresis, liquid, gas and thin layer chromatography coupled with various detection techniques (mass spectrometry, ultraviolet or fluorescence detection, liquid scintillation counting and nuclear magnetic resonance spectroscopy). A trend is seen towards the use of liquid chromatography coupled to mass spectrometry (LC-MS). State-of-the-art LC-MS equipment allowed for identification of unknown metabolites and quantification of known metabolites reaching lower limit of quantification levels in the sub pg mL(-1) range. Although tamoxifen is also metabolized into phase II metabolites, the number of publications reporting on phase II metabolism of tamoxifen is scarce. Therefore the focus of this review is on phase I metabolites of tamoxifen. We conclude that in the past decades tamoxifen metabolism has been studied extensively and numerous metabolites have been identified. Assays have been developed for both the identification and quantification of tamoxifen and its metabolites in an array of biological samples. This review can be used as a resource for method transfer and development of analytical methods used to support pharmacokinetic and pharmacodynamic studies of tamoxifen and its phase I metabolites. Copyright © 2010 Elsevier B.V. All rights reserved.
Biniarz, Piotr; Łukaszewicz, Marcin
2017-06-01
The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.
Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun
2015-01-21
Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.
Uncertainties in estimates of the risks of late effects from space radiation
NASA Technical Reports Server (NTRS)
Cucinotta, F. A.; Schimmerling, W.; Wilson, J. W.; Peterson, L. E.; Saganti, P. B.; Dicello, J. F.
2004-01-01
Methods used to project risks in low-Earth orbit are of questionable merit for exploration missions because radiobiology data and knowledge of galactic cosmic ray (GCR) heavy ions are limited, which makes estimates of the risk of late effects highly uncertain. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge. Using the linear-additivity model for radiation risks, we use Monte-Carlo sampling from subjective uncertainty distributions in each factor to obtain an estimate of the overall uncertainty in risk projections. The resulting methodology is applied to several human space exploration mission scenarios, including a deep space outpost and Mars missions with durations of 360, 660, and 1000 days. The major results are the quantification of the uncertainties in current risk estimates, the identification of factors that dominate risk projection uncertainties, and the development of a method to quantify candidate approaches to reduce uncertainties or mitigate risks. The large uncertainties in GCR risk projections lead to probability distributions of risk that mask any potential risk reduction from the "optimization" of shielding materials or configurations. In contrast, shielding optimization approaches for solar particle events and trapped protons can be designed at this time, and promising technologies can be shown to have merit using our approach. The methods used also make it possible to express risk management objectives in terms of quantitative metrics, e.g., the number of days in space without exceeding a given risk level within well-defined confidence limits. Published by Elsevier Ltd on behalf of COSPAR.
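The Monte Carlo propagation of factor-level uncertainty described in this abstract can be sketched as follows. This is a minimal illustration, not the paper's actual model: the factor names and lognormal spreads are hypothetical, standing in for the subjective uncertainty distributions the authors sample from.

```python
import random

def sample_risk(point_estimate, n=100_000, seed=1):
    """Propagate uncertainty through a multiplicative risk projection by
    Monte Carlo sampling of per-factor uncertainty distributions
    (illustrative lognormal spreads, not the paper's)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        # Each uncertain factor multiplies the nominal risk estimate.
        physics = rng.lognormvariate(0.0, 0.3)   # transport/dosimetry uncertainty
        biology = rng.lognormvariate(0.0, 0.5)   # radiobiology (quality factor) uncertainty
        transfer = rng.lognormvariate(0.0, 0.2)  # population-transfer uncertainty
        samples.append(point_estimate * physics * biology * transfer)
    samples.sort()
    median = samples[n // 2]
    ci95 = (samples[int(0.025 * n)], samples[int(0.975 * n)])
    return median, ci95

# Hypothetical 3% nominal lifetime risk propagated through the factors.
median, (lo, hi) = sample_risk(0.03)
```

The width of the resulting interval, rather than the point estimate alone, is what supports statements such as "days in space without exceeding a given risk level within well-defined confidence limits".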
Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier
2018-06-01
Since its first description, Western blot has been widely used in molecular laboratories. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. Quantification is a critical step of the Western blot workflow if accurate and reproducible results are to be obtained. Because of the technical knowledge required for densitometry analysis, together with limited resource availability, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (a Java-based image-processing and analysis package) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with ImageJ, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that can be used when resource availability is limited. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Mallah, Muhammad Ali; Sherazi, Syed Tufail Hussain; Bhanger, Muhammad Iqbal; Mahesar, Sarfaraz Ahmed; Bajeer, Muhammad Ashraf
2015-04-01
A transmission FTIR spectroscopic method was developed for direct, inexpensive and fast quantification of the paracetamol content in solid pharmaceutical formulations. In this method, the paracetamol content is analyzed directly, without solvent extraction. KBr pellets were formulated for the acquisition of FTIR spectra in transmission mode. Two chemometric models, simple Beer's law and partial least squares, were employed over the spectral region of 1800-1000 cm-1 for quantification of the paracetamol content, each with a regression coefficient (R2) of 0.999. The limits of detection and quantification using FTIR spectroscopy were 0.005 mg g-1 and 0.018 mg g-1, respectively. An interference study was also carried out to check the effect of the excipients; there was no significant interference from the sample matrix. These results clearly demonstrate the sensitivity of the transmission FTIR spectroscopic method for pharmaceutical analysis. The method is green in the sense that it does not require large volumes of hazardous solvents or long run times and avoids prior sample preparation.
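The simple Beer's-law model mentioned in this abstract amounts to a univariate linear calibration: absorbance at a chosen band is proportional to analyte concentration, so an unknown is quantified by inverting a fitted line. A minimal sketch (the concentration and absorbance values are hypothetical, not the paper's data):

```python
def fit_line(x, y):
    """Ordinary least squares for y = slope*x + intercept
    (Beer's law: absorbance proportional to concentration)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical calibration standards: concentration (mg/g) vs. absorbance.
conc = [0.1, 0.2, 0.4, 0.8, 1.6]
absorbance = [0.052, 0.101, 0.198, 0.401, 0.803]
slope, intercept = fit_line(conc, absorbance)

def predict_conc(a):
    """Invert the calibration line to quantify an unknown sample."""
    return (a - intercept) / slope
```

The partial-least-squares model plays the same role but regresses on the whole 1800-1000 cm-1 region instead of a single band.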
Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.
Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A
2017-04-01
Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites, but these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for human plasma and entailed a single sample-preparation method, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% at the lower limit of quantification and <14.3% at the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time-consuming than previously reported methods because it requires only a single, simple sample-preparation method followed by an LC-MS method with a short run time. This analytical method is therefore useful for both clinical and research purposes.
Green, Daniel M; Nolan, Vikki G; Goodman, Pamela J; Whitton, John A; Srivastava, DeoKumar; Leisenring, Wendy M; Neglia, Joseph P; Sklar, Charles A; Kaste, Sue C; Hudson, Melissa M; Diller, Lisa R; Stovall, Marilyn; Donaldson, Sarah S; Robison, Leslie L
2014-01-01
Estimation of the risk of adverse long-term outcomes such as second malignant neoplasms and infertility often requires reproducible quantification of exposures. The method for quantification should be easily utilized and valid across different study populations. The widely used Alkylating Agent Dose (AAD) score is derived from the drug dose distribution of the study population and thus cannot be used for comparisons across populations as each will have a unique distribution of drug doses. We compared the performance of the Cyclophosphamide Equivalent Dose (CED), a unit for quantifying alkylating agent exposure independent of study population, to the AAD. Comparisons included associations from three Childhood Cancer Survivor Study (CCSS) outcome analyses, receiver operator characteristic (ROC) curves and goodness of fit based on the Akaike's Information Criterion (AIC). The CED and AAD performed essentially identically in analyses of risk for pregnancy among the partners of male CCSS participants, risk for adverse dental outcomes among all CCSS participants and risk for premature menopause among female CCSS participants, based on similar associations, lack of statistically significant differences between the areas under the ROC curves and similar model fit values for the AIC between models including the two measures of exposure. The CED is easily calculated, facilitating its use for patient counseling. It is independent of the drug dose distribution of a particular patient population, a characteristic that will allow direct comparisons of outcomes among epidemiological cohorts. We recommend the use of the CED in future research assessing cumulative alkylating agent exposure. © 2013 Wiley Periodicals, Inc.
1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.
Dagnino, Denise; Schripsema, Jan
2005-08-01
A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars per gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
Morio, Florent; Corvec, Stéphane; Caroff, Nathalie; Le Gallou, Florence; Drugeon, Henri; Reynaud, Alain
2008-07-01
We developed a quantitative real-time PCR assay targeting the mip gene of Legionella pneumophila for a prospective study from September 2004 to April 2005. It was compared with a standard culture method (French guideline AFNOR T90-431) by analysing 120 water samples collected to monitor the risk related to Legionellae at Nantes hospital and to investigate a case of legionellosis acquired from the hospital environment. Samples from six distinct water distribution systems were analysed by DNA extraction, amplification and detection with specific primers and FRET probes. The detection limit was 100 genomic units of L. pneumophila per liter (GU/l), the positivity threshold about 600 GU/l and the quantification limit 800 GU/l. PCR results were divided into three groups: negative (n=63), positive but non-quantifiable (n=22) or positive and quantifiable (n=35). PCR showed higher sensitivity than culture, although four culture-positive samples appeared negative by PCR (a PCR inhibitor was detected in two of them). Although no correlation was observed between the two methods and real-time PCR cannot substitute for the reference method, it represents an interesting complement. Its sensitivity, reproducibility and rapidity appear particularly interesting in epidemic contexts, in order to identify the source of contamination or to evaluate critical points of contamination in water distribution systems.
Árnadóttir, Í.; Gíslason, M. K.; Carraro, U.
2016-01-01
Muscle degeneration has been consistently identified as an independent risk factor for high mortality in both aging populations and individuals suffering from neuromuscular pathology or injury. While there is much extant literature on its quantification and correlation to comorbidities, a quantitative gold standard for analyses in this regard remains undefined. Herein, we hypothesize that rigorously quantifying entire radiodensitometric distributions elicits more muscle quality information than the average values reported in extant methods. This study reports the development and utility of a nonlinear trimodal regression analysis method applied to radiodensitometric distributions of upper leg muscles from CT scans of a healthy young adult, a healthy elderly subject, and a spinal cord injury patient. The method was then employed with a total hip arthroplasty (THA) cohort to assess pre- and postsurgical differences in their healthy and operative legs. Results from the initial representative models elicited high degrees of correlation to HU distributions, and regression parameters highlighted physiologically evident differences between subjects. Furthermore, results from the THA cohort echoed physiological justification and indicated significant improvements in muscle quality in both legs following surgery. Altogether, these results highlight the utility of novel parameters from entire HU distributions that could provide insight into the optimal quantification of muscle degeneration. PMID:28115982
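A trimodal regression of a radiodensitometric distribution, as described in this abstract, can be sketched as fitting a sum of three Gaussians to the HU histogram. The sketch below uses NumPy and SciPy (assumed available) on synthetic data with hypothetical mode locations, not the study's scans or its exact model form:

```python
import numpy as np
from scipy.optimize import curve_fit

def trimodal(x, a1, m1, s1, a2, m2, s2, a3, m3, s3):
    """Sum of three Gaussians, e.g. fat, connective-tissue and muscle
    modes of a radiodensitometric (HU) distribution."""
    g = lambda a, m, s: a * np.exp(-((x - m) ** 2) / (2 * s ** 2))
    return g(a1, m1, s1) + g(a2, m2, s2) + g(a3, m3, s3)

# Synthetic histogram with modes near -80, 0 and 50 HU (hypothetical values).
x = np.linspace(-200, 200, 401)
true_params = (0.8, -80, 20, 0.3, 0, 15, 1.0, 50, 12)
rng = np.random.default_rng(0)
y = trimodal(x, *true_params) + rng.normal(0, 0.01, x.size)

# Nonlinear least-squares fit from a rough initial guess.
p0 = (1, -70, 25, 0.5, 5, 20, 1, 40, 15)
popt, _ = curve_fit(trimodal, x, y, p0=p0)
```

The fitted amplitudes, means and widths (`popt`) are the kind of distribution-level parameters the study proposes as muscle-quality descriptors, in place of a single mean HU value.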
Shi, Rui; Yan, Lihong; Xu, Tongguang; Liu, Dongye; Zhu, Yongfa; Zhou, Jun
2015-01-02
Polycyclic aromatic hydrocarbons (PAHs) are considered a source of carcinogenicity in mainstream cigarette smoke (MSS). Accurate quantification of these components is necessary for assessing public health risk. In our study, a solid-phase extraction (SPE) method using graphene oxide (GO)-bound silica as the adsorbent was developed for the purification of 14 PAHs in MSS. During the SPE process, most matrix interferences of MSS were adsorbed on the SPE column. FTIR spectra demonstrated that these interferences were adsorbed on GO mainly through OH and CO groups. The concentrations of PAHs in the MSS extract were determined by gas chromatography-mass spectrometry (GC-MS). The limit of detection (LOD) and limit of quantification (LOQ) of the developed method for the 14 PAHs ranged from 0.05 to 0.36 ng/cig and 0.17 to 1.19 ng/cig, respectively. The accuracy of the measurement of the 14 PAHs was from 73 to 116%. The relative standard deviations of intra- and inter-day analyses were less than 7.8% and 13.9%, respectively. Moreover, the developed method was successfully applied to the analysis of real cigarettes, including the 1R5F reference cigarette and 12 top-selling commercial cigarettes in China. Copyright © 2014 Elsevier B.V. All rights reserved.
Nakazawa, Hiroyuki; Iwasaki, Yusuke; Ito, Rie
2014-01-01
Our modern society has created a large number of chemicals that are used for the production of everyday commodities including toys, food packaging, cosmetic products, and building materials. We enjoy a comfortable and convenient lifestyle with access to these items. In addition, in specialized areas, such as experimental science and various medical fields, laboratory equipment and devices that are manufactured using a wide range of chemical substances are also extensively employed. The association between human exposure to trace hazardous chemicals and an increased incidence of endocrine disease has been recognized. The evaluation of human exposure to such endocrine-disrupting chemicals is therefore imperative, and the determination of exposure levels requires the analysis of human biological materials, such as blood and urine. To obtain as much information as possible from limited sample sizes, highly sensitive and reliable analytical methods are required for exposure assessments. The present review focuses on effective analytical methods for the quantification of bisphenol A (BPA), alkylphenols (APs), phthalate esters (PEs), and perfluorinated chemicals (PFCs), which are chemicals used in the production of everyday commodities. Using data obtained from liquid chromatography/mass spectrometry (LC/MS) and LC/MS/MS analyses, assessments of the risks to humans are also presented, based on the estimated levels of exposure to PFCs.
Code of Federal Regulations, 2014 CFR
2014-01-01
... quantification system; data management and maintenance system; and control, oversight, and validation system for...-supervised institution's advanced IRB systems, operational risk management processes, operational risk data...-length basis between the seller and the obligor (intercompany accounts receivable and receivables subject...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... DEPARTMENT OF AGRICULTURE [Docket Number: USDA-2013-0003] Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From Agriculture and Forestry Practices AGENCY: Office of the... of Agriculture (USDA) has prepared a report containing methods for quantifying entity-scale...
[DNA quantification of blood samples pre-treated with pyramidon].
Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan
2014-06-01
To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated with EDTA, and blood stains were made on filter paper. The experimental samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given DNA extraction method, the amount of sample DNA decreased gradually with time after pre-treatment with pyramidon; for a given storage time, DNA quantities differed significantly between the extraction methods. Complete 16-locus STR profiles were obtained for 90.56% of the samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for DNA extraction and STR profiling.
Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS
Chitranshi, Priyanka; da Costa, Gonçalo Gamboa
2016-01-01
We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
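The single-point standard addition used for quantification here can be sketched as follows. This is a minimal illustration with hypothetical peak areas, assuming a linear detector response through the origin and negligible dilution by the spike:

```python
def standard_addition(signal_sample, signal_spiked, conc_added):
    """Single-point standard addition: the analyte concentration C satisfies
        signal_sample / C = signal_spiked / (C + conc_added)
    under a linear response through the origin, giving
        C = conc_added * signal_sample / (signal_spiked - signal_sample)."""
    if signal_spiked <= signal_sample:
        raise ValueError("spiked signal must exceed the unspiked signal")
    return conc_added * signal_sample / (signal_spiked - signal_sample)

# Hypothetical selected-ion-recording peak areas for a soft-drink sample
# and the same sample spiked with 10 ug/mL BVO.
c_bvo = standard_addition(signal_sample=1.50e5, signal_spiked=2.50e5,
                          conc_added=10.0)
```

Because the unknown and the spiked aliquot share the same matrix, this approach compensates for matrix effects without requiring extraction or an external calibration curve, which is what makes the "dilute and shoot" workflow possible.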
USDA-ARS?s Scientific Manuscript database
High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...
Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?
Ershadi, Saba; Shayanfar, Ali
2018-03-22
The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters used to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium-sensitized analysis methods were calculated by different methods, and the results were compared with the sensitivity parameter [lower limit of quantification (LLOQ)] of the U.S. Food and Drug Administration guidelines. The details of the calibration curves and the standard deviations of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. The LOD and LOQ values calculated by the various methods differ considerably from each other and from the LLOQ; these differences should be taken into account when evaluating the sensitivity of spectroscopic methods.
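One of the common calculation approaches compared in studies like this is the ICH-style calibration-curve estimate, LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the standard deviation of the blank responses and S is the calibration slope. A minimal sketch with hypothetical blank readings and slope:

```python
import statistics

def lod_loq(blank_signals, slope):
    """ICH-style detection and quantification limits from the standard
    deviation of blank responses (sigma) and the calibration slope (S):
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    sigma = statistics.stdev(blank_signals)
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical luminescence readings of ten blank samples, and a
# calibration slope in signal units per (ng/mL).
blanks = [10.2, 9.8, 10.5, 10.1, 9.9, 10.3, 10.0, 9.7, 10.4, 10.1]
lod, loq = lod_loq(blanks, slope=50.0)
```

By contrast, an LLOQ in the FDA sense is the lowest calibration level at which accuracy and precision criteria are actually met, which is why the two kinds of parameter can disagree substantially.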
New approach for the quantification of processed animal proteins in feed using light microscopy.
Veys, P; Baeten, V
2010-07-01
A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.
Rodrigues, É O; Morais, F F C; Morais, N A O S; Conci, L S; Neto, L V; Conci, A
2016-01-01
The deposits of fat surrounding the heart are correlated with several health risk factors such as atherosclerosis, carotid stiffness, coronary artery calcification, atrial fibrillation, and many others. These deposits vary independently of obesity, which reinforces the case for their direct segmentation and quantification. However, manual segmentation of these fats has not been widely deployed in clinical practice because of the human workload required and the consequent high cost of physicians and technicians. In this work, we propose a unified method for autonomous segmentation and quantification of two types of cardiac fat. The segmented fats, termed epicardial and mediastinal, are separated from each other by the pericardium. Much effort was devoted to achieving minimal user intervention. The proposed methodology mainly comprises registration and classification algorithms to perform the desired segmentation. We compare the performance of several classification algorithms on this task, including neural networks, probabilistic models, and decision tree algorithms. Experimental results show that the mean accuracy for both epicardial and mediastinal fats is 98.5% (99.5% if the features are normalized), with a mean true positive rate of 98.0%. On average, the Dice similarity index was 97.6%. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Staley, Christopher; Gordon, Katrina V.; Schoen, Mary E.
2012-01-01
Before new, rapid quantitative PCR (qPCR) methods for assessment of recreational water quality and microbial source tracking (MST) can be useful in a regulatory context, an understanding of the ability of the method to detect a DNA target (marker) when the contaminant source has been diluted in environmental waters is needed. This study determined the limits of detection and quantification of the human-associated Bacteroides sp. (HF183) and human polyomavirus (HPyV) qPCR methods for sewage diluted in buffer and in five ambient, Florida water types (estuarine, marine, tannic, lake, and river). HF183 was quantifiable in sewage diluted up to 10−6 in 500-ml ambient-water samples, but HPyVs were not quantifiable in dilutions of >10−4. Specificity, which was assessed using fecal composites from dogs, birds, and cattle, was 100% for HPyVs and 81% for HF183. Quantitative microbial risk assessment (QMRA) estimated the possible norovirus levels in sewage and the human health risk at various sewage dilutions. When juxtaposed with the MST marker detection limits, the QMRA analysis revealed that HF183 was detectable when the modeled risk of gastrointestinal (GI) illness was at or below the benchmark of 10 illnesses per 1,000 exposures, but the HPyV method was generally not sensitive enough to detect potential health risks at the 0.01 threshold for frequency of illness. The tradeoff between sensitivity and specificity in the MST methods indicates that HF183 data should be interpreted judiciously, preferably in conjunction with a more host-specific marker, and that better methods of concentrating HPyVs from environmental waters are needed if this method is to be useful in a watershed management or monitoring context. PMID:22885746
Taylor, Jonathan Christopher; Fenner, John Wesley
2017-11-29
Semi-quantification methods are well established in the clinic for assisted reporting of (123)I-ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. The machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; and (3) striatal binding ratios (SBRs) from the putamen and caudate. The semi-quantification methods were based on SBRs from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) the minimum of age-matched controls; (2) the mean minus 1/1.5/2 standard deviations of age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); and (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classifying local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 for local data and between 0.95 and 0.97 for PPMI data.
Classification performance was lower for the local database than the research database for both semi-quantitative and machine learning algorithms. However, for both databases, the machine learning methods generated equal or higher mean accuracies (with lower variance) than any of the semi-quantification approaches. The gain in performance from using machine learning algorithms as compared to semi-quantification was relatively small and may be insufficient, when considered in isolation, to offer significant advantages in the clinical context.
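One of the semi-quantification rules compared in this study, flagging a scan as abnormal when its striatal binding ratio falls below the mean minus k standard deviations of age-matched controls, can be sketched as follows. The control SBR values and the choice k = 2 are hypothetical:

```python
def normal_limit(control_sbrs, k):
    """Lower normal limit: mean - k * sample standard deviation."""
    n = len(control_sbrs)
    mean = sum(control_sbrs) / n
    sd = (sum((v - mean) ** 2 for v in control_sbrs) / (n - 1)) ** 0.5
    return mean - k * sd

def classify(sbr, limit):
    """Binary semi-quantitative call against the normal limit."""
    return "abnormal" if sbr < limit else "normal"

controls = [2.8, 3.1, 2.9, 3.0, 3.2, 2.7, 3.1, 2.9]  # hypothetical putamen SBRs
limit = normal_limit(controls, k=2)
print(limit, classify(1.6, limit), classify(3.0, limit))
```

Varying k (1, 1.5, or 2 in the paper) trades sensitivity against specificity, which is why the study also evaluates thresholds chosen from the ROC curve.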
Oliveira, Samara Sant'Anna; Sorgine, Marcos Henrique Ferreira; Bianco, Kayo; Pinto, Leonardo Henriques; Barreto, Camila; Albano, Rodolpho Mattos; Cardoso, Alexander Machado; Clementino, Maysa Mandetta
2016-12-01
The identification of fecal pollution in aquatic ecosystems is one of the requirements for assessing possible risks to human health. In this report, physicochemical parameters, Escherichia coli enumeration, and Methanobrevibacter smithii nifH gene quantification were measured at 13 marine water sites on the coastal beaches of Rio de Janeiro, Brazil. The pH, turbidity, dissolved oxygen, temperature, and conductivity, measured with mobile equipment, revealed varying levels due to the specific conditions of the beaches. The bioindicator enumerations were performed by the defined-substrate method and by conventional and real-time PCR. Six marine beach sites (46%) with E. coli levels in compliance with Brazilian water quality guidelines (<2500 MPN/100 mL) showed nifH gene counts between 5.7 × 10⁹ and 9.5 × 10¹¹ copies·L⁻¹, revealing poor correlation between the two approaches. To our knowledge, this is the first qPCR survey using the nifH gene as a biomarker of human-specific sources of sewage pollution in marine waters in Brazil. In addition, our data suggest that the alternative indicator nifH gene could be used, in combination with other markers, in source tracking studies to measure the quality of marine ecosystems, thereby contributing to improved microbial risk assessment.
Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J
2016-08-01
Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data, and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residual values. The PHA content in these cultures ranged from low to medium (0-44 wt% of dried biomass). However, for the analysis of the combined mixed and pure culture biomass, with PHA concentrations ranging from low to high (0-93% of dried biomass), the PLS method was the most efficient. This paper reports, for the first time, the use of a single calibration model, constructed with a combination of mixed and pure cultures covering a wide PHA range, for predicting PHA content in biomass. Currently, no universal method exists for processing FTIR data for polyhydroxyalkanoate (PHA) quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification over a medium range in pure culture; in our study, the calibration curve encompassed both mixed and pure culture biomass containing a broader range of PHA. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.
Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke
2012-01-01
This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton decoupled ¹³C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal, transmitted through an optical fiber and inductively coupled into a transmit/receive coil, represents a reliable reference standard for in vivo ¹H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton decoupled in vivo ¹³C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards in human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months, including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing differences of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acids between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification were 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.
Semi-automatic segmentation of myocardium at risk in T2-weighted cardiovascular magnetic resonance.
Sjögren, Jane; Ubachs, Joey F A; Engblom, Henrik; Carlsson, Marcus; Arheden, Håkan; Heiberg, Einar
2012-01-31
T2-weighted cardiovascular magnetic resonance (CMR) has been shown to be a promising technique for determination of ischemic myocardium, referred to as myocardium at risk (MaR), after an acute coronary event. Quantification of MaR in T2-weighted CMR has been proposed to be performed by manual delineation or by the threshold methods of two standard deviations from remote (2SD), full width at half maximum intensity (FWHM), or Otsu. However, manual delineation is subjective, and threshold methods have inherent limitations related to threshold definition and a lack of a priori information about cardiac anatomy and physiology. Therefore, the aim of this study was to develop an automatic segmentation algorithm for quantification of MaR using anatomical a priori information. Forty-seven patients with first-time acute ST-elevation myocardial infarction underwent T2-weighted CMR within 1 week after admission. Endocardial and epicardial borders of the left ventricle, as well as the hyperenhanced MaR regions, were manually delineated by experienced observers and used as the reference method. A new automatic segmentation algorithm, called Segment MaR, defines the MaR region as the continuous region most likely to be MaR, by estimating the intensities of normal myocardium and MaR with an expectation maximization algorithm and restricting the MaR region by an a priori model of the maximal extent for the user-defined culprit artery. The segmentation by Segment MaR was compared against interobserver variability of manual delineation and the threshold methods of 2SD, FWHM and Otsu. MaR was 32.9 ± 10.9% of left ventricular mass (LVM) when assessed by the reference observer and 31.0 ± 8.8% of LVM when assessed by Segment MaR.
The bias and correlation were -1.9 ± 6.4% of LVM, R = 0.81 (p < 0.001) for Segment MaR; -2.3 ± 4.9%, R = 0.91 (p < 0.001) for interobserver variability of manual delineation; -7.7 ± 11.4%, R = 0.38 (p = 0.008) for 2SD; -21.0 ± 9.9%, R = 0.41 (p = 0.004) for FWHM; and 5.3 ± 9.6%, R = 0.47 (p < 0.001) for Otsu. There is good agreement between automatic Segment MaR and manually assessed MaR in T2-weighted CMR. Thus, the proposed algorithm seems to be a promising, objective method for standardized MaR quantification in T2-weighted CMR.
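The expectation maximization step at the core of this kind of algorithm, estimating the intensity distributions of normal myocardium versus MaR, can be illustrated with a minimal, generic 1D two-component Gaussian mixture fit. This is a sketch on synthetic data, not the Segment MaR implementation, which adds anatomical constraints on top:

```python
import math, random

def em_two_gaussians(xs, iters=100):
    """Fit a two-component 1D Gaussian mixture by expectation maximization."""
    lo, hi = min(xs), max(xs)
    # crude initialization from the data spread
    mu = [lo + 0.25 * (hi - lo), lo + 0.75 * (hi - lo)]
    sigma = [(hi - lo) / 4.0, (hi - lo) / 4.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [pi[k] / (sigma[k] * math.sqrt(2.0 * math.pi)) *
                 math.exp(-0.5 * ((x - mu[k]) / sigma[k]) ** 2)
                 for k in (0, 1)]
            s = p[0] + p[1]
            resp.append((p[0] / s, p[1] / s))
        # M-step: re-estimate weights, means, and standard deviations
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, xs)) / nk
            sigma[k] = max(math.sqrt(var), 1e-6)
    return mu, sigma, pi

# Synthetic "intensities": a darker and a brighter population
random.seed(0)
xs = [random.gauss(1.0, 0.1) for _ in range(200)] + \
     [random.gauss(2.0, 0.15) for _ in range(200)]
mu, sigma, pi = em_two_gaussians(xs)
print(sorted(mu))
```

After fitting, a voxel would be assigned to the "MaR" component when that component's responsibility exceeds 0.5, before any anatomical restriction is applied.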
Alves, L P S; Almeida, A T; Cruz, L M; Pedrosa, F O; de Souza, E M; Chubatsu, L S; Müller-Santos, M; Valdameri, G
2017-01-16
The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R² = 0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots.
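The calibration check behind the reported R² = 0.99 amounts to correlating the fluorescence readout against the GC reference. A minimal sketch, with hypothetical paired values standing in for real measurements:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical pairs: Nile red fluorescence vs. GC-measured PHB content
fluorescence = [120, 260, 410, 530, 660]   # arbitrary units
phb_gc = [5.1, 10.2, 15.8, 20.5, 25.9]     # % of cell dry weight
r = pearson_r(fluorescence, phb_gc)
print(r ** 2)
```

A high R² on such pairs is what justifies using the fast fluorescence readout as a proxy for the slower methanolysis/GC assay.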
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist
NASA Astrophysics Data System (ADS)
Tummala, Sudhakar; Dam, Erik B.
2010-03-01
Fully automatic imaging biomarkers may allow quantification of patho-physiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method used on tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.
Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.
Hawkins, Steve F C; Guest, Paul C
2018-01-01
The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader, Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
A conceptually and computationally simple method for the definition, display, quantification, and comparison of the shapes of three-dimensional mathematical molecular models is presented. Molecular or solvent-accessible volume and surface area can also be calculated. Algorithms, ...
Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi
2008-07-23
A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of Roundup Ready soybean (RRS) was developed. This system builds on a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor for the detection and quantification of the genetically modified soy RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), each carrying a 21 bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to adjust the copy number of targets precisely and stably. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism content.
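Using a plasmid as a copy-number reference, as this abstract describes, rests on converting a measured DNA mass into molecule counts. A back-of-the-envelope sketch, using the common approximation of 660 g/mol per double-stranded base pair; the plasmid size and mass below are hypothetical, not values from the paper:

```python
AVOGADRO = 6.022e23          # molecules per mole
DALTONS_PER_BP = 660.0       # approximate molar mass of one dsDNA base pair

def plasmid_copies(mass_ng, plasmid_bp):
    """Estimate plasmid copy number from DNA mass and plasmid length."""
    mass_g = mass_ng * 1e-9
    moles = mass_g / (plasmid_bp * DALTONS_PER_BP)
    return moles * AVOGADRO

# Hypothetical example: 0.01 ng of a 3 kb competitor plasmid
copies = plasmid_copies(mass_ng=0.01, plasmid_bp=3000)
print(copies)
```

Serial dilutions of such a stock give the precisely known target copy numbers that the QC-PCR competitor requires.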
Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen
2011-11-09
A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on the analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.
Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki
2016-10-01
Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Hatfield, Glen S.; Hark, Frank; Stott, James
2016-01-01
Launch vehicle reliability analysis is largely dependent upon using predicted failure rates from data sources such as MIL-HDBK-217F. Reliability prediction methodologies based on component data do not take into account risks attributable to manufacturing, assembly, and process controls. These sources often dominate component-level reliability or the risk of failure probability. While the consequences of failure are often understood in assessing risk, using predicted values in a risk model to estimate the probability of occurrence will likely underestimate the risk. Managers and decision makers often use the probability of occurrence in determining whether to accept the risk or require a design modification. Due to the absence of system-level test and operational data inherent in aerospace applications, the actual risk threshold for acceptance may not be appropriately characterized for decision-making purposes. This paper will establish a method and approach to identify the pitfalls and precautions of accepting risk based solely upon predicted failure data. This approach will provide a set of guidelines that may be useful to arrive at a more realistic quantification of risk prior to acceptance by a program.
Jan, Ishrat; Dar, Alamgir A; Mubashir, Sofi; Alam Wani, Ashraf; Mukhtar, Malik; Sofi, Khurshid A; Dar, Irshad H; Sofi, Javid A
2018-05-01
A residue investigation was carried out to examine the persistence, dissipation behavior, half-life, and risk assessment of ethion on green pea fruit by spraying ethion at the fruiting stage, followed by another application at a 10 day interval. The samples were extracted using a quick, easy, low-cost, effective, rugged, and safe method, and the residues of ethion were analyzed by gas chromatography with electron capture detection. Here we report a novel, accurate, and cost-effective gas chromatography method for the determination of average deposits of ethion on green pea. The initial deposits were found to be 4.65 mg/kg following the application of the insecticide. Residues of ethion fell below the detection limit of 0.10 mg/kg after 25 days at the recommended dosage. The half-life of ethion was found to be 4.62 days. The risk assessment indicates that green peas will be safe for consumption from the 25th day after application. The developed method is simple, sensitive, selective, and repeatable, and can be extended to ethion-based standardization of herbal formulations containing green pea and to use in pesticide industries. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
In situ label-free quantification of human pluripotent stem cells with electrochemical potential.
Yea, Cheol-Heon; Jeong, Ho-Chang; Moon, Sung-Hwan; Lee, Mi-Ok; Kim, Kyeong-Jun; Choi, Jeong-Woo; Cha, Hyuk-Jin
2016-01-01
Conventional methods for quantification of undifferentiated pluripotent stem cells such as fluorescence-activated cell sorting and real-time PCR analysis have technical limitations in terms of their sensitivity and recyclability. Herein, we designed a real-time in situ label-free monitoring system on the basis of a specific electrochemical signature of human pluripotent stem cells in vitro. The intensity of the signal of hPSCs highly corresponded to the cell number and remained consistent in a mixed population with differentiated cells. The electrical charge used for monitoring did not markedly affect the proliferation rate or molecular characteristics of differentiated human aortic smooth muscle cells. After YM155 treatment to ablate undifferentiated hPSCs, their specific signal was significantly reduced. This suggests that detection of the specific electrochemical signature of hPSCs would be a valid approach to monitor potential contamination of undifferentiated hPSCs, which can assess the risk of teratoma formation efficiently and economically. Copyright © 2015 Elsevier Ltd. All rights reserved.
Corneal markers of diabetic neuropathy.
Pritchard, Nicola; Edwards, Katie; Shahidi, Ayda M; Sampson, Geoff P; Russell, Anthony W; Malik, Rayaz A; Efron, Nathan
2011-01-01
Diabetic neuropathy is a significant clinical problem that currently has no effective therapy, and in advanced cases, leads to foot ulceration and lower limb amputation. The accurate detection, characterization and quantification of this condition are important in order to define at-risk patients, anticipate deterioration, monitor progression, and assess new therapies. This review evaluates novel corneal methods of assessing diabetic neuropathy. Two new noninvasive corneal markers have emerged, and in cross-sectional studies have demonstrated their ability to stratify the severity of this disease. Corneal confocal microscopy allows quantification of corneal nerve parameters and noncontact corneal esthesiometry, the functional correlate of corneal structure, assesses the sensitivity of the cornea. Both these techniques are quick to perform, produce little or no discomfort for the patient, and are suitable for clinical settings. Each has advantages and disadvantages over traditional techniques for assessing diabetic neuropathy. Application of these new corneal markers for longitudinal evaluation of diabetic neuropathy has the potential to reduce dependence on more invasive, costly, and time-consuming assessments, such as skin biopsy.
Pre-Analytical Conditions in Non-Invasive Prenatal Testing of Cell-Free Fetal RHD
Rieneck, Klaus; Krog, Grethe Risum; Nielsen, Leif Kofoed; Tabor, Ann; Dziegiel, Morten Hanefeld
2013-01-01
Background: Non-invasive prenatal testing of cell-free fetal DNA (cffDNA) in maternal plasma can predict the fetal RhD type in D negative pregnant women. In Denmark, routine antenatal screening for the fetal RhD gene (RHD) directs the administration of antenatal anti-D prophylaxis only to women who carry an RhD positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based on data from routine antenatal RHD screening. Methods: Blood samples were drawn at gestational age 25 weeks. DNA extracted from 1 mL of plasma was analyzed for fetal RHD using a duplex method for exon 7/10. We investigated the effect of blood sample transportation time (n = 110) and ambient outdoor temperatures (n = 1539) on the levels of cffDNA and total DNA. We compared two different quantification methods, the delta Ct method and a universal standard curve. PCR pipetting was compared on two systems (n = 104). Results: The cffDNA level was unaffected by blood sample transportation for up to 9 days and by ambient outdoor temperatures ranging from -10°C to 28°C during transport. The universal standard curve was applicable for cffDNA quantification. Identical levels of cffDNA were observed using the two automated PCR pipetting systems. We detected a mean of 100 fetal DNA copies/mL at a median gestational age of 25 weeks (range 10–39, n = 1317). Conclusion: The setup for real-time PCR-based, non-invasive prenatal testing of cffDNA in the Capital Region of Denmark is very robust. Our findings regarding the transportation of blood samples demonstrate the high stability of cffDNA. The applicability of a universal standard curve facilitates easy cffDNA quantification. PMID:24204719
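The standard-curve quantification compared in this study exploits the fact that the qPCR cycle threshold (Ct) is linear in log10 of the input copy number: fit Ct = slope·log10(copies) + intercept on a dilution series, then invert for unknowns. The dilution-series numbers below are invented for illustration (a slope near -3.32 corresponds to ~100% PCR efficiency):

```python
import math

def fit_standard_curve(log10_copies, cts):
    """Least-squares fit of Ct against log10(copy number)."""
    n = len(cts)
    mx = sum(log10_copies) / n
    my = sum(cts) / n
    sxx = sum((x - mx) ** 2 for x in log10_copies)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
    slope = sxy / sxx
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate copy number from a Ct."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series: 10^2 .. 10^6 copies per reaction
logs = [2, 3, 4, 5, 6]
cts = [34.4, 31.1, 27.8, 24.4, 21.1]
slope, intercept = fit_standard_curve(logs, cts)
unknown = copies_from_ct(28.0, slope, intercept)
print(slope, unknown)
```

A "universal" standard curve simply reuses one such fitted slope/intercept across runs instead of rebuilding the curve on every plate.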
Standardless quantification by parameter optimization in electron probe microanalysis
NASA Astrophysics Data System (ADS)
Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.
2012-11-01
A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested in a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the method proposed is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively.
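The core of the approach above is minimizing the quadratic difference between an experimental spectrum and an analytical model of it. A minimal sketch of that idea (a one-parameter toy, not POEMA's full optimization: the peak position and width are assumed known, and only the amplitude is fit):

```python
import math

def gaussian(x, center, width):
    # Unit-amplitude Gaussian peak profile.
    return math.exp(-0.5 * ((x - center) / width) ** 2)

def fit_peak_amplitude(xs, ys, center, width):
    # Minimizing sum_i (y_i - A * g(x_i))^2 over the amplitude A has the
    # closed-form least-squares solution A = sum(y*g) / sum(g*g).
    g = [gaussian(x, center, width) for x in xs]
    return sum(y * gi for y, gi in zip(ys, g)) / sum(gi * gi for gi in g)

# Synthetic spectrum: a single peak of known amplitude 5 on a fine grid.
xs = [i * 0.1 for i in range(100)]
ys = [5.0 * gaussian(x, 5.0, 0.8) for x in xs]
amplitude = fit_peak_amplitude(xs, ys, 5.0, 0.8)
```

In the full problem every peak parameter (and the concentrations derived from peak areas) enters the objective, so the minimization is iterative rather than closed-form, but the quadratic objective is the same.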
Lamb Wave Damage Quantification Using GA-Based LS-SVM.
Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong
2017-06-12
Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
Lamb Wave Damage Quantification Using GA-Based LS-SVM
Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong
2017-01-01
Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003
Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.
Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P
2012-08-01
The solid state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with 1:1 w/w mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R(2) = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R(2) = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
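The LOD/LOQ figures reported above follow from the calibration line's residual scatter. A hedged sketch of the standard ICH-style estimate (the data below are illustrative, not the IND-SAC calibration): LOD = 3.3·σ/slope and LOQ = 10·σ/slope, where σ is the residual standard deviation of the linear fit.

```python
def linear_fit(xs, ys):
    # Least-squares line y = slope * x + intercept.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    return slope, my - slope * mx

def lod_loq(xs, ys):
    # ICH-style detection/quantification limits from residual scatter.
    slope, intercept = linear_fit(xs, ys)
    residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sigma = (sum(r * r for r in residuals) / (len(xs) - 2)) ** 0.5
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Illustrative calibration: % w/w cocrystal vs. diffraction peak intensity.
xs = [0.0, 25.0, 50.0, 75.0, 100.0]
ys = [0.1, 50.4, 99.8, 150.3, 199.9]   # assumed intensities, slope near 2
lod, loq = lod_loq(xs, ys)
```

By construction LOQ/LOD = 10/3.3 ≈ 3.0, which matches the roughly threefold gap between the 1.23% and 3.74% (w/w) values reported above.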
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.
[Detection of recombinant-DNA in foods from stacked genetically modified plants].
Sorokina, E Iu; Chernyshova, O N
2012-01-01
A quantitative real-time multiplex polymerase chain reaction method was applied to the detection and quantification of MON863 and MON810 in the stacked genetically modified maize MON810xMON863. The limit of detection was approximately 0.1%. The accuracy of the quantification, measured as bias from the accepted value, and the relative repeatability standard deviation, which measures the intra-laboratory variability, were within 25% at each GM level. Method verification demonstrated that the MON863 and MON810 methods can be equally applied for quantification of the respective events in stacked MON810xMON863.
Source separation on hyperspectral cube applied to dermatology
NASA Astrophysics Data System (ADS)
Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.
2010-03-01
This paper proposes a method of quantification of the components underlying the human skin that are supposed to be responsible for the effective reflectance spectrum of the skin over the visible wavelength. The method is based on independent component analysis assuming that the epidermal melanin and the dermal haemoglobin absorbance spectra are independent of each other. The method extracts the source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is fixed using a polynomial fit and the quantifications associated with it are reestimated. The results produce feasible quantifications of each source component in the examined skin patch.
Busatto, Zenaís; da Silva, Agnaldo Fernando Baldo; de Freitas, Osvaldo; Paschoal, Jonas Augusto Rizzato
2017-04-01
This paper describes the development of analytical methods for the quantification of albendazole (ABZ) in fish feed, and of ABZ and its main known metabolites (albendazole sulfoxide, albendazole sulfone and albendazole aminosulfone) in fish fillet, employing LC-MS/MS. In order to assess the reliability of the analytical methods, evaluation was undertaken as recommended by the related guides proposed by the Brazilian Ministry of Agriculture for analytical method validation. The calibration curve for ABZ quantification in feed showed adequate linearity (r > 0.99), precision (CV < 1.03%) and trueness ranging from 99% to 101%. The method for ABZ residues in fish fillet, involving the QuEChERS technique for sample extraction, had adequate linearity (r > 0.99) for all analytes, precision (CV < 13%) and trueness around 100%, with CCα < 122 ng g-1 and CCβ < 145 ng g-1. In addition, aiming to avoid the risk of ABZ leaching from feed into the aquatic environment during fish medication via the oral route, a promising procedure for drug incorporation in the feed, involving coating feed pellets with ethyl cellulose polymer containing ABZ, was also evaluated. The medicated feed had good homogeneity (CV < 3%) and a lower release of ABZ (< 0.2%) from feed to water when the medicated feed stayed in the water for up to 15 min.
NASA Astrophysics Data System (ADS)
Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun
2015-01-01
Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N,N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group, triazine ester, are cost-effective because of their synthetic simplicity, and have increased throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods are validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel
2018-05-01
Quantification of selenated amino-acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino-acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of selenium/sulfur substitution rate in Met. Moreover the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, low limit of quantification and wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.
Evaluation of work posture and quantification of fatigue by Rapid Entire Body Assessment (REBA)
NASA Astrophysics Data System (ADS)
Rizkya, I.; Syahputri, K.; Sari, R. M.; Anizar; Siregar, I.
2018-02-01
Work-related musculoskeletal disorders (MSDs), poor body postures, and low back injuries are the most common problems occurring in many industries, including small-medium industries. This study presents an assessment and evaluation of the ergonomic postures of a material handling worker. The evaluation was carried out using REBA (Rapid Entire Body Assessment), a technique to quantify the fatigue experienced by a worker while manually lifting loads. Fatigue due to abnormal work postures leads to pain complaints among workers. The REBA method was used to assess the working postures of the existing process through a procedural analysis of the body postures involved. This study shows that the body parts at highest risk are the back, neck, and upper arms, with a REBA score of 9, so action should be taken as soon as possible. Control actions were implemented for the high-risk processes, and substantial risk reduction was achieved.
Light Water Reactor Sustainability Program FY13 Status Update for EPRI - RISMC Collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Curtis
2013-09-01
The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management, with the aim to improve the economics and reliability, and sustain the safety, of current NPPs. Goals of the RISMC Pathway are twofold: (1) develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies; (2) create an advanced "RISMC toolkit" that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory (INL) is collaborating with the Electric Power Research Institute (EPRI) in order to focus on applications of interest to the U.S. nuclear power industry. This report documents the collaboration activities performed between INL and EPRI during FY2013.
Pyschik, Marcelina; Klein-Hitpaß, Marcel; Girod, Sabrina; Winter, Martin; Nowak, Sascha
2017-02-01
In this study, an optimized method using capillary electrophoresis (CE) with a direct contactless conductivity detector (C4D) is presented for a new application field: the quantification of fluoride in commonly used lithium ion battery (LIB) electrolytes based on LiPF6 in organic carbonate solvents, and in ionic liquids (ILs) after contact with Li metal. The method development for finding the right buffer and suitable CE conditions for the quantification of fluoride was investigated. The results for the concentration of fluoride in different LIB electrolyte samples were compared to the results from an ion-selective electrode (ISE). The relative standard deviations (RSDs) and recovery rates for fluoride were obtained with very high accuracy for both methods, and the fluoride concentrations measured in the LIB electrolytes were in very good agreement between them. In addition, the limit of detection (LOD) and limit of quantification (LOQ) values were determined for the CE method. The CE method was also applied to the quantification of fluoride in ILs. In the fresh IL sample, the concentration of fluoride was under the LOD; in a sample of the IL mixed with Li metal, it was possible to quantify the fluoride concentration. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
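The Poisson analysis described above rests on the single-hit assumption: if template copies are distributed randomly across replicate reactions, the fraction of negative reactions estimates e^(-λ), so λ = -ln(f_negative) copies per reaction. A minimal sketch of that estimate (the 42-replicate layout mirrors the assay; the counts below are illustrative):

```python
import math

def poisson_copies_per_reaction(n_total, n_negative):
    # Single-hit Poisson estimate from binomial positive/negative PCR calls:
    # P(negative) = exp(-lambda)  =>  lambda = -ln(n_negative / n_total).
    if n_negative == 0:
        raise ValueError("all reactions positive: above the quantifiable range")
    if n_negative == n_total:
        return 0.0
    return -math.log(n_negative / n_total)

# Illustrative run: 21 of 42 replicate Alu-PCR reactions scored negative.
lam = poisson_copies_per_reaction(42, 21)   # mean copies per reaction
```

With half the wells negative this gives λ = ln 2 ≈ 0.69 copies per reaction, with no standard dilution curve required; confidence intervals follow from the binomial uncertainty on the negative-well count.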
Uncertainty Quantification in Alchemical Free Energy Methods.
Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V
2018-06-12
Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation, an ensemble of independent MD simulations, which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J
2011-04-05
A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for the development were calculated starting from the HPLC conditions of a validated method. These start conditions were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by adapting to a gradient method. The obtained method was fully validated based on its measurement uncertainty (accuracy profile) and robustness tests. A UHPLC method was obtained for the identification and quantification of folic acid in pharmaceutical preparations, which will cut analysis times and solvent consumption. Copyright © 2010 Elsevier B.V. All rights reserved.
Alves, L.P.S.; Almeida, A.T.; Cruz, L.M.; Pedrosa, F.O.; de Souza, E.M.; Chubatsu, L.S.; Müller-Santos, M.; Valdameri, G.
2017-01-01
The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2=0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots. PMID:28099582
Rakesh Minocha; P. Thangavel; Om Parkash Dhankher; Stephanie Long
2008-01-01
The HPLC method presented here for the quantification of metal-binding thiols is considerably shorter than most previously published methods. It is a sensitive and highly reproducible method that separates monobromobimane tagged monothiols (cysteine, glutathione, γ-glutamylcysteine) along with polythiols (PC2, PC3...
Roger, B; Fernandez, X; Jeannot, V; Chahboun, J
2010-01-01
The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones, which are gradually formed by oxidative degradation of iridals during rhizome ageing. The objective was to develop an alternative method allowing irone quantification in iris rhizomes using HS-SPME-GC. The development of the method using HS-SPME-GC was achieved using the results obtained from a conventional method, i.e. a solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed method using HS-SPME-GC is as accurate and reproducible as the conventional one using SLE. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can thus be achieved using HS-SPME-GC, and the method can be used for the quality control of iris rhizomes. It offers the advantage of combining extraction and analysis with an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.
Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L
2017-02-01
The aim of this manuscript was to study the application of a new method of protein quantification in Candida antarctica lipase B commercial solutions. Error sources associated to the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. Magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS). Later, CALB was adsorbed on the modified support. The proposed novel protein quantification method included the determination of sulfur (from protein in CALB solution) by means of Atomic Emission by Inductive Coupling Plasma (AE-ICP). Four different protocols were applied combining AE-ICP and classical Bradford assays, besides Carbon, Hydrogen and Nitrogen (CHN) analysis. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as standard ranged from 400 to 1200% when protein in CALB solution was quantified. These errors were calculated considering as "true protein content values" the results of the amount of immobilized protein obtained with the improved method. The optimum quantification procedure involved the combination of Bradford method, ICP and CHN analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
Moroni, Francesco; Magnoni, Marco; Vergani, Vittoria; Ammirati, Enrico; Camici, Paolo G
2018-01-01
Plaque border irregularity is a known imaging characteristic of vulnerable plaques, but its evaluation heavily relies on subjective assessment and operator expertise. The aim of the present work is to propose a novel fractal-analysis based method for the quantification of atherosclerotic plaque border irregularity and to assess its relation with cardiovascular risk factors. Forty-two asymptomatic subjects with carotid stenosis underwent ultrasound evaluation and assessment of cardiovascular risk factors. Total, low-density lipoprotein (LDL) and high-density lipoprotein (HDL) plasma cholesterol and triglyceride concentrations were measured for each subject. Fractal analysis was performed in all the carotid segments affected by atherosclerosis, i.e. 147 segments. The resulting fractal dimension (FD) is a measure of the irregularity of the plaque profile on the long-axis view of the plaque. FD in the severest stenosis (main plaque FD, mFD) was 1.136 ± 0.039. Average FD per patient (global FD, gFD) was 1.145 ± 0.039. FD was independent of other plaque characteristics. mFD significantly correlated with plasma HDL (r = -0.367, p = 0.02) and the triglycerides-to-HDL ratio (r = 0.480, p = 0.002). Fractal analysis is a novel, readily available, reproducible and inexpensive technique for the quantitative measurement of plaque irregularity. The correlation between low HDL levels and plaque FD suggests a role for HDL in the acquisition of morphologic features of plaque instability. Further studies are needed to validate the prognostic value of fractal analysis in carotid plaque evaluation.
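Box counting is the usual way to turn border irregularity into a fractal dimension. A hedged sketch of that generic estimator (the abstract does not specify the authors' exact algorithm, and the profile below is synthetic): count the boxes of size s occupied by the digitized plaque profile, then fit the slope of log N(s) against log(1/s).

```python
import math

def box_count_dimension(points, scales):
    # Box-counting estimate for a set of 2-D points: the slope of
    # log N(s) vs log(1/s), where N(s) is the number of occupied boxes.
    log_inv_s, log_counts = [], []
    for s in scales:
        occupied = {(int(x // s), int(y // s)) for x, y in points}
        log_inv_s.append(math.log(1.0 / s))
        log_counts.append(math.log(len(occupied)))
    n = len(scales)
    mx = sum(log_inv_s) / n
    my = sum(log_counts) / n
    sxx = sum((x - mx) ** 2 for x in log_inv_s)
    sxy = sum((x - mx) * (y - my) for x, y in zip(log_inv_s, log_counts))
    return sxy / sxx

# Sanity check on a smooth (non-fractal) profile: a straight segment.
segment = [(i / 1000.0, i / 1000.0) for i in range(1000)]
fd = box_count_dimension(segment, [0.5, 0.25, 0.125, 0.0625])
```

A smooth border gives FD near 1.0, and increasing irregularity pushes FD toward 2.0; the reported mFD of 1.136 ± 0.039 sits on that scale.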
USDA-ARS?s Scientific Manuscript database
Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...
The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...
21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...
Amoah, Isaac Dennis; Singh, Gulshan; Stenström, Thor Axel; Reddy, Poovendhree
2017-05-01
It is estimated that over a billion people are infected with soil-transmitted helminths (STHs) globally, with the majority occurring in tropical and subtropical regions of the world. The roundworm (Ascaris lumbricoides), whipworm (Trichuris trichiura), and hookworms (Ancylostoma duodenale and Necator americanus) are the main species infecting people. These infections are mostly gained through exposure to faecally contaminated water, soil or contaminated food, with an increase in the risk of infections due to wastewater and sludge reuse in agriculture. Different methods have been developed for the detection and quantification of STH eggs in environmental samples. However, there is a lack of a universally accepted technique, which creates a challenge for comparative assessments of helminth egg concentrations both in different sample matrices as well as between locations. This review presents a comparison of reported methodologies for the detection of STH eggs, an assessment of the relative performance of available detection methods and a discussion of new emerging techniques that could be applied for detection and quantification. It is based on a literature search using PubMed and Science Direct considering all geographical locations. Original research articles were selected based on their methodology and results sections. Methods reported in these articles were grouped into conventional, molecular and emerging techniques, and the main steps in each method were then compared and discussed. The inclusion of a dissociation step aimed at detaching helminth eggs from particulate matter was found to improve the recovery of eggs. Additionally, the selection and application of flotation solutions that take into account the relative densities of the eggs of different STH species also results in higher egg recovery. Generally, the use of conventional methods was shown to be laborious, time-consuming and prone to human error.
The alternate use of nucleic acid-based techniques has improved the sensitivity of detection and made species-specific identification possible. However, these nucleic acid-based methods are expensive and less suitable in regions with limited resources and skills. The loop-mediated isothermal amplification method shows promise for application in these settings due to its simplicity and use of basic equipment. In addition, the development of imaging software for the detection and quantification of STHs shows promise to further reduce the human error associated with the analysis of environmental samples. It may be concluded that there is a need to comparatively assess the performance of different methods to determine their applicability in different settings as well as for use with different sample matrices (wastewater, sludge, compost, soil, vegetables etc.). Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ciriello, V.; Lauriola, I.; Bonvicini, S.; Cozzani, V.; Di Federico, V.; Tartakovsky, Daniel M.
2017-11-01
Ubiquitous hydrogeological uncertainty undermines the veracity of quantitative predictions of soil and groundwater contamination due to accidental hydrocarbon spills from onshore pipelines. Such predictions, therefore, must be accompanied by quantification of predictive uncertainty, especially when they are used for environmental risk assessment. We quantify the impact of parametric uncertainty on quantitative forecasting of temporal evolution of two key risk indices, volumes of unsaturated and saturated soil contaminated by a surface spill of light nonaqueous-phase liquids. This is accomplished by treating the relevant uncertain parameters as random variables and deploying two alternative probabilistic models to estimate their effect on predictive uncertainty. A physics-based model is solved with a stochastic collocation method and is supplemented by a global sensitivity analysis. A second model represents the quantities of interest as polynomials of random inputs and has a virtually negligible computational cost, which enables one to explore any number of risk-related contamination scenarios. For a typical oil-spill scenario, our method can be used to identify key flow and transport parameters affecting the risk indices, to elucidate texture-dependent behavior of different soils, and to evaluate, with a degree of confidence specified by the decision-maker, the extent of contamination and the correspondent remediation costs.
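The second model described above, representing the quantities of interest as polynomials of the random inputs, can be sketched as follows (a one-input toy with an assumed closed-form "expensive" model, not the authors' multiphase flow simulator): fit a low-order polynomial surrogate on a few collocation samples, then propagate input uncertainty through the surrogate at negligible cost.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical physics model: contaminated volume as a function of one
# uncertain parameter (e.g. log-permeability); expensive in reality.
def expensive_model(k):
    return 2.0 + 1.5 * k + 0.5 * k ** 2

# Step 1: a few collocation samples of the uncertain input.
k_train = np.linspace(-2.0, 2.0, 7)
y_train = expensive_model(k_train)

# Step 2: fit a degree-2 polynomial surrogate (cheap to evaluate).
coeffs = np.polyfit(k_train, y_train, deg=2)
surrogate = np.poly1d(coeffs)

# Step 3: propagate input uncertainty through the surrogate by Monte Carlo.
k_mc = rng.standard_normal(100_000)
risk_mean = surrogate(k_mc).mean()
```

Because the surrogate is a plain polynomial, any number of contamination scenarios (different input distributions, different percentiles of the risk index) can be explored without re-running the physics model, which is what makes the "virtually negligible computational cost" claim above plausible.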
Guo, Teng; Li, Xueshuang; Li, Jianquan; Peng, Zhen; Xu, Li; Dong, Junguo; Cheng, Ping; Zhou, Zhen
2018-03-01
Harmful organic by-products, produced during the removal of volatile organic compounds (VOCs) from the air by treatment with non-thermal plasma (NTP), hinder the practical applications of NTP. An on-line quantification and risk assessment method for the organic by-products produced by the NTP removal of toluene from the air has been developed. Formaldehyde, methanol, ketene, acetaldehyde, formic acid, acetone, acetic acid, benzene, benzaldehyde, and benzoic acid were determined to be the main organic by-products by proton transfer reaction mass spectrometry (PTR-MS), a powerful technique for real-time and on-line measurements of trace levels of VOCs, and a health-related index (HRI) was introduced to assess the health risk of these organic by-products. The discharge power (P) is a key factor affecting the formation of the organic by-products and their HRI values. Higher P leads to a higher removal efficiency (η) and lower HRI. However, higher P also means higher cost and greater production of discharge by-products, such as NOx and O3, which are also very dangerous to the environment and human health. In practical applications P, HRI, and η must be balanced, and sometimes the risks posed by the organic by-products are even greater than those of the removed compounds. Our mechanistic study reveals that acetone is a crucial intermediate for the removal of toluene by NTP, and we found that toluene molecules first fragment into acetone molecules, followed by other by-products. These observations will guide the study of the mechanism of aromatic molecule dissociation in plasma. Copyright © 2017 Elsevier Ltd. All rights reserved.
Krämer, Nadine; Löfström, Charlotta; Vigre, Håkan; Hoorfar, Jeffrey; Bunge, Cornelia; Malorny, Burkhard
2011-03-01
Salmonella is a major zoonotic pathogen which causes outbreaks and sporadic cases of gastroenteritis in humans worldwide. The primary sources for Salmonella are food-producing animals such as pigs and poultry. For risk assessment and hazard analysis and critical control point (HACCP) concepts, it is essential to produce large amounts of quantitative data, which is currently not achievable with the standard culture-based methods for enumeration of Salmonella. This study presents the development of a novel strategy to enumerate low numbers of Salmonella in cork borer samples taken from pig carcasses as a first concept and proof of principle for a new sensitive and rapid quantification method based on combined enrichment and real-time PCR. The novelty of the approach is in the short pre-enrichment step, during which growth of most bacteria is in the log phase. The method consists of an 8 h pre-enrichment of the cork borer sample diluted 1:10 in non-selective buffered peptone water, followed by DNA extraction, and Salmonella detection and quantification by real-time PCR. The limit of quantification was 1.4 colony forming units (CFU)/20 cm(2) (approximately 10 g) of artificially contaminated sample, with a 95% confidence interval of ± 0.7 log CFU/sample. The precision was similar to that of the standard reference most probable number (MPN) method. A screening of 200 potentially naturally contaminated cork borer samples obtained over seven weeks in a slaughterhouse resulted in 25 Salmonella-positive samples. The analysis of salmonellae within these samples showed that the PCR method had a higher sensitivity for samples with a low contamination level (<6.7 CFU/sample), where 15 of the samples that were negative with the MPN method were detected with the PCR method and 5 were found to be negative by both methods. For the samples with a higher contamination level (6.7-310 CFU/sample), the results obtained with the PCR and MPN methods were in good agreement.
The quantitative real-time PCR method can easily be applied to other food and environmental matrices by adaptation of the pre-enrichment time and media. Copyright © 2010 Elsevier B.V. All rights reserved.
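Real-time PCR quantification of this kind typically rests on a standard curve relating the threshold cycle (Ct) to the log of the starting quantity. The sketch below fits such a curve by least squares and inverts it for an unknown sample; the calibration Ct values, the sample Ct, and the resulting CFU estimate are all invented for illustration.

```python
# Hypothetical calibration data: Ct values measured for known spike levels.
log_cfu = [1.0, 2.0, 3.0, 4.0, 5.0]            # log10(CFU) of the standards
ct      = [33.1, 29.8, 26.4, 23.0, 19.7]       # ~ -3.3 cycles per decade

# Ordinary least-squares fit of Ct = slope * log10(CFU) + intercept.
n = len(log_cfu)
mx = sum(log_cfu) / n
my = sum(ct) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(log_cfu, ct))
         / sum((x - mx) ** 2 for x in log_cfu))
intercept = my - slope * mx

# Amplification efficiency from the slope (1.0 means perfect doubling).
efficiency = 10 ** (-1 / slope) - 1

def cfu_from_ct(c):
    # Invert the standard curve to estimate the starting quantity.
    return 10 ** ((c - intercept) / slope)

sample_ct = 27.5  # invented measurement for an unknown sample
print(f"slope = {slope:.2f}, efficiency = {efficiency:.0%}, "
      f"estimate = {cfu_from_ct(sample_ct):.0f} CFU")
```

A slope near -3.32 corresponds to ~100% amplification efficiency, which is one of the routine sanity checks applied before trusting quantitative PCR results.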
Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.
2013-01-01
A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be assigned to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm is illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios
2011-01-01
Background: Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods: The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions: The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance: The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808
Lao, Yexing; Yang, Cuiping; Zou, Wei; Gan, Manquan; Chen, Ping; Su, Weiwei
2012-05-01
The cryptand Kryptofix 2.2.2 is used extensively as a phase-transfer reagent in the preparation of [18F]fluoride-labelled radiopharmaceuticals. However, it has considerable acute toxicity. The aim of this study was to develop and validate a method for rapid (within 1 min), specific and sensitive quantification of Kryptofix 2.2.2 at trace levels. Chromatographic separations were carried out by rapid-resolution liquid chromatography (Agilent ZORBAX SB-C18 rapid-resolution column, 2.1 × 30 mm, 3.5 μm). Tandem mass spectra were acquired using a triple quadrupole mass spectrometer equipped with an electrospray ionization interface. Quantitative mass spectrometric analysis was conducted in positive ion mode and multiple reaction monitoring mode for the m/z 377.3 → 114.1 transition for Kryptofix 2.2.2. The external standard method was used for quantification. The method met the precision and efficiency requirements for PET radiopharmaceuticals, providing satisfactory results for specificity, matrix effect, stability, linearity (0.5-100 ng/ml, r² = 0.9975), precision (coefficient of variation < 5%), accuracy (relative error < ± 3%), sensitivity (lower limit of quantification = 0.5 ng) and detection time (< 1 min). Fluorodeoxyglucose (n = 6) was analysed, and the Kryptofix 2.2.2 content was found to be well below the maximum permissible levels approved by the US Food and Drug Administration. The developed method has a short analysis time (< 1 min) and high sensitivity (lower limit of quantification = 0.5 ng/ml) and can be successfully applied to rapid quantification of Kryptofix 2.2.2 at trace levels in fluorodeoxyglucose. This method could also be applied to other [18F]fluorine-labelled radiopharmaceuticals that use Kryptofix 2.2.2 as a phase-transfer reagent.
Satellite Re-entry Modeling and Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Horsley, M.
2012-09-01
LEO trajectory modeling is a fundamental aerospace capability and has applications in many areas of aerospace, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activities, the potentially unknown shape, material construction and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad-hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are literally spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology is largely useless until just a few hours before re-entry. This paper discusses the development of a set of High Performance Computing (HPC)-based capabilities to support near real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting. 
Uncertainty quantification results from the recent uncontrolled re-entry of the Phobos-Grunt satellite will be presented and discussed. This work performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
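The claim that small timing uncertainties spread impact points over the entire globe can be checked with a back-of-the-envelope Monte Carlo. The nominal time to re-entry, the 1-sigma fraction, and the direct mapping of density uncertainty onto timing uncertainty below are rough assumptions for illustration only, not the paper's HPC model.

```python
import random

random.seed(1)

V_ORBIT_KM_S = 7.8               # typical LEO orbital speed
NOMINAL_REENTRY_S = 3 * 86_400   # assumed nominal time to re-entry: 3 days
EARTH_CIRCUMFERENCE_KM = 40_075.0

# Crude assumption: a ~17% (1-sigma) drag/density uncertainty maps into a
# comparable relative spread in the re-entry time.
times = [random.gauss(NOMINAL_REENTRY_S, 0.17 * NOMINAL_REENTRY_S)
         for _ in range(20_000)]

# Along-track position error implied by the timing error alone.
spreads_km = sorted(abs(t - NOMINAL_REENTRY_S) * V_ORBIT_KM_S for t in times)
median_km = spreads_km[len(spreads_km) // 2]
print(f"median along-track error: {median_km:.0f} km "
      f"({median_km / EARTH_CIRCUMFERENCE_KM:.1f} Earth circumferences)")
```

Even the median timing error corresponds to many Earth circumferences of along-track distance, which is why single-point re-entry predictions made days in advance carry essentially no location information.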
Probabilistic risk analysis of building contamination.
Bolster, D T; Tartakovsky, D M
2008-10-01
We present a general framework for probabilistic risk assessment (PRA) of building contamination. PRA provides a powerful tool for the rigorous quantification of risk in contamination of building spaces. A typical PRA starts by identifying relevant components of a system (e.g. ventilation system components, potential sources of contaminants, remediation methods) and proceeds by using available information and statistical inference to estimate the probabilities of their failure. These probabilities are then combined by means of fault-tree analyses to yield probabilistic estimates of the risk of system failure (e.g. building contamination). A sensitivity study of PRAs can identify features and potential problems that need to be addressed with the most urgency. Often PRAs are amenable to approximations, which can significantly simplify the approach. All these features of PRA are presented in this paper via a simple illustrative example, which can be built upon in further studies. The tool presented here can be used to design and maintain adequate ventilation systems to minimize exposure of occupants to contaminants.
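The fault-tree combination of component failure probabilities described above reduces, for independent events, to AND/OR gates. A minimal sketch with invented component probabilities (the numbers and the tree layout are illustrative, not taken from the paper):

```python
def p_or(*ps):
    # OR gate: the subsystem fails if any independent component fails.
    prod = 1.0
    for p in ps:
        prod *= (1.0 - p)
    return 1.0 - prod

def p_and(*ps):
    # AND gate: the subsystem fails only if all independent components fail.
    prod = 1.0
    for p in ps:
        prod *= p
    return prod

# Assumed annual failure probabilities (illustrative only).
p_filter = 0.02      # HEPA filter degraded
p_fan = 0.01         # supply fan failure
p_source = 0.05      # contaminant source present indoors
p_detection = 0.10   # detection system misses the release

# Top event: contamination requires a source AND a failure of either
# the ventilation subsystem or the detection system.
p_ventilation = p_or(p_filter, p_fan)
p_contamination = p_and(p_source, p_or(p_ventilation, p_detection))
print(f"P(building contamination) = {p_contamination:.4f}")
```

A sensitivity study of the kind the abstract mentions amounts to perturbing each leaf probability and observing which one moves the top-event probability most.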
Reverté, Laia; Soliño, Lucía; Carnicer, Olga; Diogène, Jorge; Campàs, Mònica
2014-01-01
The emergence of marine toxins in water and seafood may have a considerable impact on public health. Although the tendency in Europe is to consolidate, when possible, official reference methods based on instrumental analysis, the development of alternative or complementary methods providing functional or toxicological information may provide advantages in terms of risk identification, but also low cost, simplicity, ease of use and high-throughput analysis. This article gives an overview of the immunoassays, cell-based assays, receptor-binding assays and biosensors that have been developed for the screening and quantification of emerging marine toxins: palytoxins, ciguatoxins, cyclic imines and tetrodotoxins. Their advantages and limitations are discussed, as well as their possible integration in research and monitoring programs. PMID:25431968
Jeon, Soyoung; Paciorek, Christopher J.; Wehner, Michael F.
2016-02-16
Extreme event attribution characterizes how anthropogenic climate change may have influenced the probability and magnitude of selected individual extreme weather and climate events. Attribution statements often involve quantification of the fraction of attributable risk (FAR) or the risk ratio (RR) and associated confidence intervals. Many such analyses use climate model output to characterize extreme event behavior with and without anthropogenic influence. However, such climate models may have biases in their representation of extreme events. To account for discrepancies in the probabilities of extreme events between observational datasets and model datasets, we demonstrate an appropriate rescaling of the model output based on the quantiles of the datasets to estimate an adjusted risk ratio. Our methodology accounts for various components of uncertainty in estimation of the risk ratio. In particular, we present an approach to construct a one-sided confidence interval on the lower bound of the risk ratio when the estimated risk ratio is infinity. We demonstrate the methodology using the summer 2011 central US heatwave and output from the Community Earth System Model. In this example, we find that the lower bound of the risk ratio is relatively insensitive to the magnitude and probability of the actual event.
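A minimal sketch of risk-ratio estimation with a bootstrap one-sided lower bound, using invented exceedance counts rather than the CESM output analyzed in the paper. It also shows why RR is conventionally set to infinity when no natural-forcing exceedances occur, and why a lower quantile remains well defined in that case.

```python
import random

random.seed(0)

# Hypothetical counts of model years exceeding the observed heatwave threshold.
n_actual, x_actual = 400, 40    # with anthropogenic forcing: 40 exceedances
n_natural, x_natural = 400, 8   # natural-only forcing: 8 exceedances

def risk_ratio(xa, na, xn, nn):
    p1, p0 = xa / na, xn / nn
    return float("inf") if p0 == 0 else p1 / p0

rr_hat = risk_ratio(x_actual, n_actual, x_natural, n_natural)
far_hat = 1.0 - 1.0 / rr_hat  # fraction of attributable risk

# Percentile bootstrap for a one-sided lower bound on RR. Resamples with
# zero natural exceedances give RR = inf, which is harmless because we
# only read off a lower quantile.
boots = []
for _ in range(5000):
    xa = sum(random.random() < x_actual / n_actual for _ in range(n_actual))
    xn = sum(random.random() < x_natural / n_natural for _ in range(n_natural))
    boots.append(risk_ratio(xa, n_actual, xn, n_natural))
boots.sort()
rr_lower = boots[int(0.05 * len(boots))]  # 95% one-sided lower bound
print(f"RR = {rr_hat:.1f}, FAR = {far_hat:.2f}, RR > {rr_lower:.1f} (95% one-sided)")
```

The paper's quantile-based rescaling of model output would be applied before forming these counts; the bootstrap here stands in for its fuller uncertainty accounting.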
Risk Quantification of Systems Engineering Documents Improves Probability of DOD Project Success
2009-09-01
...comprehensive risk model for DoD milestone review documentation, as well as recommended changes to the Capability Maturity Model Integration (CMMI) Project Planning and Risk Management process areas. Keywords: Milestone Documentation, Project Planning, Rational Frame, Political Frame, CMMI Project Planning Process Area, CMMI Risk Management Process Area. The intent is to...
De Palma, Rodney; Sörensson, Peder; Verouhis, Dinos; Pernow, John; Saleh, Nawzad
2017-07-27
Clinical outcome following acute myocardial infarction is predicted by final infarct size evaluated in relation to left ventricular myocardium at risk (MaR). Contrast-enhanced steady-state free precession (CE-SSFP) cardiovascular magnetic resonance imaging (CMR) is not widely used for assessing MaR. Evidence of its utility compared to traditional assessment methods and as a surrogate for clinical outcome is needed. This was a retrospective analysis within a study evaluating post-conditioning during ST elevation myocardial infarction (STEMI) treated with coronary intervention (n = 78). CE-SSFP post-infarction was compared with angiographic jeopardy methods. Differences and variability between CMR and angiographic methods were evaluated using Bland-Altman analyses. Clinical outcomes were compared to MaR and extent of infarction. MaR assessed by CE-SSFP showed correlations with the BARI and APPROACH scores of 0.83 (p < 0.0001) and 0.84 (p < 0.0001), respectively. Bias between CE-SSFP and BARI was 1.1% (agreement limits -11.4 to +9.1). Bias between CE-SSFP and APPROACH was 1.2% (agreement limits -13 to +10.5). Inter-observer variability was 0.56 ± 2.9 for the BARI score, 0.42 ± 2.1 for the APPROACH score, and -1.4 ± 3.1% for CE-SSFP. Intra-observer variability was 0.15 ± 1.85 for the BARI score, 0.19 ± 1.6 for the APPROACH score, and -0.58 ± 2.9% for CE-SSFP. Quantification of MaR with CE-SSFP imaging following STEMI shows high correlation and low bias compared with angiographic scoring and supports its use as a reliable and practical method to determine myocardial salvage in this patient population. Clinical trial registration information for the parent clinical trial: Karolinska Clinical Trial Registration (2008), unique identifier CT20080014. Registered 4 January 2008.
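The Bland-Altman analysis used above computes the mean difference (bias) between paired measurements and 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with invented paired MaR values (percent of left ventricle), not the study's data:

```python
import statistics

# Hypothetical paired MaR estimates (% of LV) from CMR and an angiographic score.
cmr   = [32.0, 41.5, 25.0, 38.2, 29.8, 45.1, 33.6, 27.4]
angio = [33.5, 40.0, 27.2, 39.9, 31.0, 43.8, 35.1, 29.0]

# Bland-Altman: work with the differences between the paired methods.
diffs = [a - b for a, b in zip(cmr, angio)]
bias = statistics.mean(diffs)
sd = statistics.stdev(diffs)                   # sample SD of the differences
lo, hi = bias - 1.96 * sd, bias + 1.96 * sd    # 95% limits of agreement
print(f"bias = {bias:.2f}%, limits of agreement: {lo:.2f}% to {hi:.2f}%")
```

Unlike a correlation coefficient, which can be high even when one method reads systematically higher, the bias and limits of agreement directly quantify how far the two methods can be expected to disagree for an individual patient.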
2007-03-01
...likelihood), which made DA a very effective technique in quantifying risk. Clemen and Reilly (2001) defined a specific process in DA shown on... quantifying risk as a function of time. Four experiments were conducted with different size fire crews and the time to complete each scenario was...
Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification
Ilies, Maria; Iuga, Cristina Adela; Loghin, Felicia; Dhople, Vishnu Mukund; Hammer, Elke
2017-01-01
Background and aims: Proteome-based biomarker studies are targeting proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for the clinical practice. Methods: Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results: Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single run analysis. The dynamic range covered was 10^5. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions: Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in the clinical practice. PMID:29151793
Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.
2014-01-01
Quantitative mass spectrometry has become central to the field of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation has been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. These findings suggest that evaluation of the absolute ion abundance for any given transition is necessary in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3 year period, suggesting that these values should be assessed as close as possible to the time at which data is collected for quantification. PMID:25154770
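An ion abundance ratio check of the kind discussed above reduces to comparing each transition's relative abundance (RA) against a pure reference spectrum under some tolerance. The sketch below uses invented intensities and an arbitrary illustrative threshold; the study's point is precisely that such thresholds should not be chosen arbitrarily and should be re-assessed close to data collection.

```python
# Hypothetical product-ion intensities for one peptide's SRM transitions:
# a pure reference standard vs. a measurement of the analyte in matrix.
reference = {"y7": 100_000, "y6": 62_000, "y5": 30_000}
measured  = {"y7":  48_000, "y6": 29_500, "y5": 40_000}

def relative_abundances(transitions):
    # RA of each product ion with respect to the total of all product ions.
    total = sum(transitions.values())
    return {ion: v / total for ion, v in transitions.items()}

RA_TOLERANCE = 0.10  # absolute RA deviation allowed (arbitrary, for illustration)

ref_ra = relative_abundances(reference)
meas_ra = relative_abundances(measured)
flagged = {ion for ion in ref_ra
           if abs(ref_ra[ion] - meas_ra[ion]) > RA_TOLERANCE}
print("possibly contaminated transitions:", flagged or "none")
```

Note that an interference inflating one transition (here y5) depresses the relative abundance of the others, so more than one transition can end up flagged; this is why the study argues the absolute ion abundance must also be examined before applying RA thresholds.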
2014-04-01
Barrier methods for critical exponent problems in geometric analysis and mathematical physics, J. Erway and M. Holst, submitted for publication... TR-14-33: A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems. Approved for public release, distribution is unlimited. April 2014. HDTRA1-09-1-0036. Donald Estep and Michael...
Simple, Fast, and Sensitive Method for Quantification of Tellurite in Culture Media
Molina, Roberto C.; Burra, Radhika; Pérez-Donoso, José M.; Elías, Alex O.; Muñoz, Claudia; Montes, Rebecca A.; Chasteen, Thomas G.; Vásquez, Claudio C.
2010-01-01
A fast, simple, and reliable chemical method for tellurite quantification is described. The procedure is based on the NaBH4-mediated reduction of TeO3(2-) followed by the spectrophotometric determination of elemental tellurium in solution. The method is highly reproducible, is stable at different pH values, and exhibits linearity over a broad range of tellurite concentrations. PMID:20525868
Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon
2018-03-01
Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system identifies the renal tissue structures through knowledge-based rules employing colour space transformations and structural feature extraction from the images. In particular, renal glomerulus identification is based on a multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnosis. A 40-image ground truth dataset has been manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists demonstrated an average error of 9 percentage points in quantification results between the automated system and the pathologists' visual evaluation. Experiments investigating the variability in pathologists, involving samples from 70 kidney patients, also proved the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The accuracy of the proposed quantification system has been validated with the ground truth dataset and compared against the pathologists' quantification results. It has been shown that the correlation between different pathologists' estimation of interstitial fibrosis area has significantly improved, demonstrating the effectiveness of the quantification system as a diagnostic aide. Copyright © 2017 Elsevier B.V. All rights reserved.
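Once segmentation is done, the elimination-based quantification described above reduces to an area fraction over the labelled biopsy mask. A minimal sketch on a toy label mask (the mask, its size, and the label convention are invented for illustration):

```python
# Toy 8x8 label mask for a biopsy image: 0 = background (outside the biopsy),
# 1 = identified non-fibrosis structures (glomeruli, tubules, vessels),
# 2 = remaining biopsy area attributed to interstitial fibrosis.
mask = [
    [0, 0, 1, 1, 1, 1, 0, 0],
    [0, 1, 1, 2, 2, 1, 1, 0],
    [1, 1, 2, 2, 2, 2, 1, 1],
    [1, 2, 2, 1, 1, 2, 2, 1],
    [1, 2, 2, 1, 1, 2, 2, 1],
    [1, 1, 2, 2, 2, 2, 1, 1],
    [0, 1, 1, 2, 2, 1, 1, 0],
    [0, 0, 1, 1, 1, 1, 0, 0],
]

flat = [px for row in mask for px in row]
biopsy_area = sum(px != 0 for px in flat)    # total tissue area in pixels
fibrosis_area = sum(px == 2 for px in flat)  # residue after eliminating structures
fibrosis_pct = 100.0 * fibrosis_area / biopsy_area
print(f"interstitial fibrosis: {fibrosis_pct:.1f}% of biopsy area")
```

The hard part of the published system is producing the labels (colour transformations, texture features, SVM); the final percentage is this simple ratio.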
Hüffer, Thorsten; Praetorius, Antonia; Wagner, Stephan; von der Kammer, Frank; Hofmann, Thilo
2017-03-07
Microplastics (MPs) have been identified as contaminants of emerging concern in aquatic environments, and research into their behavior and fate has been increasing sharply in recent years. Nevertheless, significant gaps remain in our understanding of several crucial aspects of MP exposure and risk assessment, including the quantification of emissions, dominant fate processes, types of analytical tools required for characterization and monitoring, and adequate laboratory protocols for analysis and hazard testing. This Feature aims at identifying transferable knowledge and experience from engineered nanoparticle (ENP) exposure assessment. This is achieved by comparing ENPs and MPs based on their similarities as particulate contaminants, while critically discussing specific differences. We also highlight the most pressing research priorities to support an efficient development of tools and methods for MPs environmental risk assessment.
Naveen, P.; Lingaraju, H. B.; Prasad, K. Shyam
2017-01-01
Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used as traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as a mobile phase with a flow rate of 1.5 ml/min. The separation was done at 26°C using a Kinetex XB-C18 column as stationary phase and the detection wavelength at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. In the linearity test, a correlation coefficient greater than 0.999 indicated good fitting of the curve and good linearity. The intra- and inter-day precision showed < 1% relative standard deviation of peak area, indicating high reliability and reproducibility of the method. The recovery values at three different spiking levels (50%, 100%, and 150%) were 100.47, 100.89, and 100.99, respectively, and the low standard deviation (< 1%) shows the high accuracy of the method. The results remained unaffected by small variations in the analytical parameters, which shows the robustness of the method. Liquid chromatography–mass spectrometry analysis confirmed the presence of mangiferin with an M/Z value of 421. The assay developed by the HPLC method is simple, rapid, and reliable for the determination of mangiferin from M. indica. SUMMARY The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica. 
The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. This study proved that the developed HPLC assay is simple, rapid, and reliable for the quantification of mangiferin from M. indica. Abbreviations used: M. indica: Mangifera indica; RP-HPLC: reversed-phase high-performance liquid chromatography; M/Z: mass-to-charge ratio; ICH: International Conference on Harmonisation; % RSD: percentage relative standard deviation; ppm: parts per million; LOD: limit of detection; LOQ: limit of quantification. PMID:28539748
Naveen, P; Lingaraju, H B; Prasad, K Shyam
2017-01-01
Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica , is used as traditional medicine for the treatment of numerous diseases. The present study was aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica . RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid: acetonitrile (87:13) as a mobile phase with a flow rate of 1.5 ml/min. The separation was done at 26°C using a Kinetex XB-C18 column as stationary phase and the detection wavelength at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness by the International Conference on Harmonisation guidelines. In linearity, the excellent correlation coefficient more than 0.999 indicated good fitting of the curve and also good linearity. The intra- and inter-day precision showed < 1% of relative standard deviation of peak area indicated high reliability and reproducibility of the method. The recovery values at three different levels (50%, 100%, and 150%) of spiked samples were found to be 100.47, 100.89, and 100.99, respectively, and low standard deviation value < 1% shows high accuracy of the method. In robustness, the results remain unaffected by small variation in the analytical parameters, which shows the robustness of the method. Liquid chromatography-mass spectrometry analysis confirmed the presence of mangiferin with M/Z value of 421. The assay developed by HPLC method is a simple, rapid, and reliable for the determination of mangiferin from M. indica . The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica . 
NASA Astrophysics Data System (ADS)
Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.
2017-11-01
This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
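The core of this approach, an SDE that is ergodic for the Bayesian posterior, can be illustrated with a small sketch. This is not the authors' implementation: the paper discretizes with the implicit Euler method, while the toy below uses the simpler explicit Euler-Maruyama scheme, and the target posterior is a made-up Gaussian.

```python
import numpy as np

# Illustrative sketch: sample a posterior pi by simulating the overdamped
# Langevin SDE  dX = grad log pi(X) dt + sqrt(2) dW, which is ergodic for pi.
def langevin_samples(grad_log_pi, x0, dt=1e-2, n_steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    x = float(x0)
    out = np.empty(n_steps)
    for i in range(n_steps):
        # Explicit Euler-Maruyama step (the paper uses implicit Euler instead).
        x = x + dt * grad_log_pi(x) + np.sqrt(2.0 * dt) * rng.standard_normal()
        out[i] = x
    return out

# Toy posterior: N(mu=1, sigma^2=0.25), so grad log pi(x) = -(x - 1) / 0.25.
samples = langevin_samples(lambda x: -(x - 1.0) / 0.25, x0=0.0)
burn = samples[50_000:]  # discard burn-in; burn.mean() ~ 1.0, burn.var() ~ 0.25
```

With a small enough time step and a burn-in period, the time average of the trajectory approximates expectations under the posterior; the implicit Euler scheme used in the paper trades extra per-step cost for better stability.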
2015-01-07
...measures of risk, in which the random variable of interest is viewed in concert with an auxiliary random vector that helps to manage, predict, and mitigate the risk in the original variable. Residual risk can be exemplified as a quantification of the improved...
Quantification of fungicides in snow-melt runoff from turf: A comparison of four extraction methods
USDA-ARS?s Scientific Manuscript database
A variety of pesticides are used to control diverse stressors to turf. These pesticides have a wide range in physical and chemical properties. The objective of this project was to develop an extraction and analysis method for quantification of chlorothalonil and PCNB (pentachloronitrobenzene), two p...
The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...
Chai, Liuying; Zhang, Jianwei; Zhang, Lili; Chen, Tongsheng
2015-03-01
Spectral measurement of fluorescence resonance energy transfer (FRET), spFRET, is a widely used FRET quantification method in living cells today. We set up a spectrometer-microscope platform that consists of a miniature fiber optic spectrometer and a widefield fluorescence microscope for the spectral measurement of absolute FRET efficiency (E) and acceptor-to-donor concentration ratio (R(C)) in single living cells. The microscope was used for guiding cells and the spectra were simultaneously detected by the miniature fiber optic spectrometer. Moreover, our platform has independent excitation and emission controllers, so different excitations can share the same emission channel. In addition, we developed a modified spectral FRET quantification method (mlux-FRET) for the multiple donors and multiple acceptors FRET construct (mD∼nA) sample, and we also developed a spectra-based 2-channel acceptor-sensitized FRET quantification method (spE-FRET). We implemented these modified FRET quantification methods on our platform to measure the absolute E and R(C) values of tandem constructs with different acceptor/donor stoichiometries in single living Huh-7 cells.
Vaudano, Enrico; Costantini, Antonella; Garcia-Moruno, Emilia
2016-10-03
The availability of genetically modified (GM) yeasts for winemaking and, in particular, transgenic strains based on the integration of genetic constructs deriving from other organisms into the genome of Saccharomyces cerevisiae, has been a reality for several years. Despite this, their use is only authorized in a few countries and limited to two strains: ML01, able to convert malic acid into lactic acid during alcoholic fermentation, and ECMo01, suitable for reducing the risk of carbamate production. In this work we propose a quali-quantitative culture-independent method for the detection of GM yeast ML01 in commercial preparations of ADY (Active Dry Yeast), consisting of efficient extraction of DNA and qPCR (quantitative PCR) analysis based on an event-specific assay targeting MLC (malolactic cassette) and a taxon-specific S. cerevisiae assay detecting the MRP2 gene. The ADY DNA extraction methodology has been shown to provide DNA of good purity suitable for subsequent qPCR. The MLC and MRP2 qPCR assays showed characteristics of specificity, dynamic range, limit of quantification (LOQ), limit of detection (LOD), precision and trueness that were fully compliant with international reference guidelines. The method has been shown to reliably detect 0.005% (mass/mass) of GM ML01 S. cerevisiae in commercial preparations of ADY. Copyright © 2016 Elsevier B.V. All rights reserved.
Zhao, Pengfei; Lei, Shuo; Xing, Mingming; Xiong, Shihang; Guo, Xingjie
2018-03-01
A robust and sensitive method was developed for the enantiomeric analysis of six chiral pesticides (metalaxyl, epoxiconazole, myclobutanil, hexaconazole, napropamide, and isocarbophos) in aquatic environmental samples. The optimized chromatographic separation of all 12 enantiomers was performed on a Chiralcel OD-RH column with a mobile phase consisting of 0.1% aqueous formic acid and acetonitrile under reversed-phase conditions, followed by analysis using liquid chromatography with tandem mass spectrometry. The twelve enantiomers were detected in multiple reaction monitoring mode. Solid-phase extraction and dispersive liquid-liquid microextraction were employed in this study, and response surface methodology was applied to assist in the dispersive liquid-liquid microextraction optimization. Under the optimum conditions, recoveries of the pesticide enantiomers varied from 83.0 to 103.2% at two spiked levels with relative standard deviations less than 11.5%. The concentration factors were up to 1000 times. Method detection and quantification limits varied from 0.11 to 0.48 ng/L and from 0.46 to 1.49 ng/L, respectively. Finally, this method was used for the determination of the enantiomer composition of the six pesticides in environmental aqueous matrices, which will help in understanding the behavior of individual enantiomers and in making accurate risk assessments for ecosystems. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The assessment and management of third party risk around a major airport.
Ale, B J; Piers, M
2000-01-07
Schiphol, the main airport of the Netherlands, is growing rapidly. The aircraft movements, also growing in number, place a considerable environmental burden on the surrounding population, notably, noise and odour nuisance and risks. In the process of deciding on how to extend the capacity of the airport to accommodate the anticipated twofold growth in the number of movements with respect to 1990, environmental problems form a major concern. The concern about risks for the surrounding population was enhanced after the crash on 4 October 1992, in which a Boeing 747 cargo carrier bored into a block of flats in a suburb of Amsterdam near Schiphol. In this accident, the four crew members were killed, together with 39 inhabitants of the flats/apartment building. These risks were studied as part of the Environmental Impact Assessment (EIA). To make these studies useful for decision making necessitated a major improvement in the available techniques for risk quantification. The results of the quantitative analyses, using several different methods, have all indicated that the activities of Schiphol pose a considerable risk compared to other major industrial activities in the Netherlands. This paper describes the development of the methodology from 1990 in the light of the policy context in which it took place. Use of the methods in the decision-making process is illustrated by describing the current status of this process.
Liu, Ruijuan; Wang, Mengmeng; Ding, Li
2014-10-01
Menadione (VK3), an essential fat-soluble naphthoquinone, plays very important physiological and pathological roles, but its detection and quantification are challenging. Herein, a new method was developed for the quantification of VK3 in human plasma by liquid chromatography-tandem mass spectrometry (LC-MS/MS) after derivatization with 3-mercaptopropionic acid via a Michael addition reaction. The derivative was identified from the mass spectra, and the derivatization conditions were optimized by considering different parameters. The method demonstrated high sensitivity, with a low limit of quantification of 0.03 ng mL(-1) for VK3, about 33-fold better than that for direct analysis of the underivatized compound. The method also had good precision and reproducibility. It was applied in the determination of basal VK3 in human plasma and a clinical pharmacokinetic study of menadiol sodium diphosphate. Furthermore, this is the first reported method for the quantification of VK3 using LC-MS/MS, and it will provide an important strategy for further research on VK3 and menadione analogs. Copyright © 2014 Elsevier B.V. All rights reserved.
HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.
Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil
2017-04-01
Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.
Tamura, Masayoshi; Takahashi, Ayumi; Uyama, Atsuo; Mochizuki, Naoki
2012-01-01
An analytical method using two solid phase extractions and ultra-high-performance liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS) was developed for the identification and quantification of 14 mycotoxins (patulin, deoxynivalenol, aflatoxins B1, B2, G1, G2, M1, T-2 toxin, HT-2 toxin, zearalenone, fumonisins B1, B2, B3, and ochratoxin A) in domestic and imported wines. Mycotoxins were purified with an Oasis HLB cartridge, followed by a MultiSepTM #229 Ochra. As a result, sufficient removal of the pigments and highly polar matrices from the red wines was achieved. UHPLC conditions were optimized, and 14 mycotoxins were separated in a total of 13 min. Determinations performed using this method produced high correlation coefficients for the 14 mycotoxins (R > 0.990) and recovery rates ranging from 76 to 105% with good repeatability (relative standard deviation RSD < 12%). Twenty-seven samples of domestic and imported wines were analyzed using this method. Although ochratoxin A (OTA) and fumonisins (FMs) were detected in several samples, the FM levels were less than limits of quantification (LOQs) (1 μg/L), and even the largest of the OTA levels was below the EU regulatory level (2 μg/L). These results suggest that the health risk posed to consumers from the wines available in Japan is relatively low. PMID:22822458
Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina
2006-01-01
Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. 
Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results; therefore it was chosen as the primary criterion by which to evaluate the quality and performance of the different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, since it is impossible to define a matrix with fixed characteristics in the area of food and feed testing, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
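The standard-curve logic underlying such real-time PCR quantification can be sketched as follows. The dilution series and Cq values below are hypothetical, not data from the study; the efficiency formula E = 10**(-1/slope) - 1 is the standard one for qPCR calibration, where E = 1.0 corresponds to perfect doubling per cycle.

```python
import numpy as np

# Fit a qPCR standard curve Cq = slope * log10(copies) + intercept and derive
# the amplification efficiency from the slope.
def fit_standard_curve(log10_copies, cq):
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    # Invert the standard curve to estimate the copy number in a sample.
    return 10.0 ** ((cq - intercept) / slope)

# Hypothetical dilution series: 10^5 .. 10^1 copies, ideal slope of -3.32.
logs = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
cqs = np.array([20.0, 23.32, 26.64, 29.96, 33.28])
slope, intercept, eff = fit_standard_curve(logs, cqs)

# GMO content as the ratio of event-specific to taxon-specific copy numbers.
event = copies_from_cq(30.5, slope, intercept)
taxon = copies_from_cq(24.0, slope, intercept)
gmo_percent = 100.0 * event / taxon
```

Comparing the slopes (efficiencies) fitted on the sample and on the reference material is exactly the check the authors advocate: quantification is only exact when the two are similar.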
NASA Astrophysics Data System (ADS)
Fleming, K. M.; Zschau, J.; Gasparini, P.; Modaressi, H.; Matrix Consortium
2011-12-01
Scientists, engineers, civil protection and disaster managers typically treat natural hazards and risks individually. This leads to the situation where the frequent causal relationships between different hazards and risks, e.g., earthquakes and volcanoes, or floods and landslides, are ignored. Such an oversight may potentially lead to inefficient mitigation planning. As part of their efforts to confront this issue, the European Union, under its FP7 program, is supporting the New Multi-HAzard and MulTi-RIsK Assessment MethodS for Europe, or MATRIX, project. The focus of MATRIX is on natural hazards, in particular earthquakes, landslides, volcanoes, wildfires, storms, and fluvial and coastal flooding. MATRIX will endeavour to develop methods and tools to tackle multi-type natural hazards and risks within a common framework, focusing on methodologies that are suited to the European context. The work will involve an assessment of current single-type hazard and risk assessment methodologies, including a comparison and quantification of uncertainties and harmonization of single-type methods, examining the consequences of cascade effects within a multi-hazard environment, time-dependent vulnerability, decision making and support for multi-hazard mitigation and adaptation, and a series of test cases. Three test sites are being used to assess the methods developed within the project (Naples, Cologne, and the French West Indies), as well as a "virtual city" based on a comprehensive IT platform that will allow scenarios not represented by the test cases to be examined. In addition, a comprehensive dissemination program, involving national platforms for disaster management as well as various outreach activities, will be undertaken. The MATRIX consortium consists of ten research institutions (nine European and one Canadian), an end-user (i.e., one of the European national platforms for disaster reduction) and a partner from industry.
Mehta, Nehal N; Torigian, Drew A; Gelfand, Joel M; Saboury, Babak; Alavi, Abass
2012-05-02
Conventional non-invasive imaging modalities of atherosclerosis such as coronary artery calcium (CAC) and carotid intimal medial thickness (C-IMT) provide information about the burden of disease. However, despite multiple validation studies of CAC and C-IMT, these modalities do not accurately assess plaque characteristics, and the composition and inflammatory state of the plaque determine its stability and, therefore, the risk of clinical events. [(18)F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron-emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity, an important source of cellular inflammation in vessel walls. More recently, we and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries. FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors and is also highly associated with overall burden of atherosclerosis. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short-term (12-week) statin therapy as well as longer-term therapeutic lifestyle changes (16 months).
The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool in order to calculate a target-to-background ratio (TBR), which is calculated by dividing the arterial SUV by the venous blood pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has a high sensitivity for detection of vascular inflammation, and also has high inter- and intra-reader reliability. Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion, as we have shown in several prior publications.
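The TBR computation described above is arithmetically simple and can be sketched directly from the definition in the text; the SUV measurements below are hypothetical, not study data.

```python
# TBR = mean arterial SUV / mean venous blood-pool SUV, per the definition
# above (illustrative values only).
def target_to_background_ratio(arterial_suvs, blood_pool_suvs):
    arterial = sum(arterial_suvs) / len(arterial_suvs)
    background = sum(blood_pool_suvs) / len(blood_pool_suvs)
    if background <= 0:
        raise ValueError("blood-pool SUV must be positive")
    return arterial / background

# Hypothetical SUVs from several slices of an artery and the venous pool.
tbr = target_to_background_ratio([2.1, 2.4, 1.8], [1.2, 1.0, 1.1])
```

Dividing by the blood-pool SUV normalizes away circulating (non-plaque) tracer, which is why TBR rather than raw SUV is the reported phenotype.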
Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin
2017-09-01
Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe.
There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-02-18
In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.
Accurate proteome-wide protein quantification from high-resolution 15N mass spectra
2011-01-01
In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234
Jiang, Tingting; Dai, Yongmei; Miao, Miao; Zhang, Yue; Song, Chenglin; Wang, Zhixu
2015-07-01
To evaluate the usefulness and efficiency of a novel dietary assessment method among urban pregnant women, sixty-one pregnant women were recruited from the ward and provided with a meal accurately weighed before cooking. The meal was photographed from three different angles before and after eating. The subjects were also interviewed for 24 h dietary recall by the investigators. Food weighing, image quantification and 24 h dietary recall were conducted by investigators from three different groups, and the results were kept isolated from each other. Food consumption was analyzed on the basis of classification and total summation, and nutrient intake from the meal was calculated for each subject. The data obtained from the dietary recall and the image quantification were compared with the actual values, and correlation and regression analyses were carried out between the weighing method and image quantification as well as dietary recall. A total of twenty-three kinds of food, including rice, vegetables, fish, meats and soy bean curd, were included in the experimental meal. Compared with data from 24 h dietary recall (r = 0.413, P < 0.05), food weights estimated by image quantification (r = 0.778, P < 0.05, n = 308) correlated better with the weighed data and showed a more concentrated linear distribution. The mean absolute difference between image quantification and the weighing method across all foods was 77.23 ± 56.02 (P < 0.05, n = 61), much smaller than the difference (172.77 ± 115.18) between 24 h recall and the weighing method. Values of almost all nutrients, including energy, protein, fat, carbohydrate, vitamin A, vitamin C, calcium, iron and zinc, calculated from the image-quantification food weights were closer to the weighed data than those from 24 h dietary recall (P < 0.01).
Bland-Altman analysis showed that the majority of the measurements of nutrient intake were scattered along the mean-difference line and close to the line of equality (difference = 0). The plots show fairly good agreement between estimated and actual food consumption, indicating that the differences (including the outliers) were random, consistent over different levels of mean food amount, and free of systematic bias. In addition, the questionnaire showed that fifty-six pregnant women considered the image quantification less time-consuming and burdensome than 24 h recall, and fifty-eight of them would like to use image quantification to monitor their dietary status. This novel instant-photography (image quantification) method for dietary assessment is more effective than conventional 24 h dietary recall and can obtain food intake values close to weighed data.
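The Bland-Altman analysis used above reduces to computing the mean difference (bias) and the limits of agreement, bias +/- 1.96 SD of the paired differences. A minimal sketch on hypothetical paired food-weight estimates in grams, not the study's data:

```python
import statistics

# Bland-Altman agreement statistics for paired measurements of the same
# quantity by two methods (e.g. image-estimated vs. weighed food amounts).
def bland_altman(estimated, actual):
    diffs = [e - a for e, a in zip(estimated, actual)]
    bias = statistics.mean(diffs)          # mean difference
    sd = statistics.stdev(diffs)           # SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd  # bias, lower/upper LoA

# Hypothetical paired values (grams).
est = [102.0, 95.0, 150.0, 48.0, 210.0]
act = [100.0, 90.0, 155.0, 50.0, 200.0]
bias, lo, hi = bland_altman(est, act)
```

A bias near zero with the limits of agreement straddling zero, and no trend of the differences against the pair means, is what the abstract describes as "no systematic bias".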
NASA Astrophysics Data System (ADS)
Rodríguez-Escales, Paula; Canelles, Arnau; Sanchez-Vila, Xavier; Folch, Albert; Kurtzman, Daniel; Rossetto, Rudy; Fernández-Escalante, Enrique; Lobo-Ferreira, João-Paulo; Sapiano, Manuel; San-Sebastián, Jon; Schüth, Christoph
2018-06-01
Managed aquifer recharge (MAR) can be affected by many risks related to different technical and non-technical aspects of recharge, such as water availability, water quality, legislation and social issues. Many other works have acknowledged risks of this nature theoretically; however, their definition and quantification have not been developed. In this study, risk definition and quantification were performed by means of fault trees and probabilistic risk assessment (PRA). We defined a fault tree with 65 basic events applicable to the operation phase and then applied this methodology to six different managed aquifer recharge sites located in the Mediterranean Basin (Portugal, Spain, Italy, Malta, and Israel). The probabilities of the basic events were defined by expert criteria, based on the knowledge of the managers of the different facilities. We conclude that at all sites, experts perceived the non-technical aspects to be as important as, or even more important than, the technical aspects. Regarding the risk results, the total risk at three of the six sites was equal to or above 0.90, meaning that these MAR facilities have a risk of failure equal to or higher than 90% over a period of 2-6 years. The other three sites presented lower risks (75, 29, and 18% for Malta, Menashe, and Serchio, respectively).
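The fault-tree arithmetic behind such a PRA can be sketched in a few lines. Assuming independent basic events (a simplification; the study's 65-event tree and its gate structure are not reproduced here), an AND gate multiplies probabilities and an OR gate combines them as 1 - prod(1 - p_i). The events and probabilities below are hypothetical:

```python
from functools import reduce

# Probability of an AND gate firing: all independent basic events occur.
def p_and(*probs):
    return reduce(lambda acc, p: acc * p, probs)

# Probability of an OR gate firing: at least one independent event occurs.
def p_or(*probs):
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical top event "MAR operation fails" =
#   (clogging OR source-water shortage) OR (permit revoked AND no fallback site)
p_top = p_or(p_or(0.3, 0.2), p_and(0.1, 0.5))
```

In a real PRA the basic-event probabilities (here elicited from the facility managers) are propagated bottom-up through the gates in exactly this way to obtain the top-event failure probability.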
Quantification of taurine in energy drinks using ¹H NMR.
Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike
2014-05-01
The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed a distinct, well-separated taurine signal in ¹H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R² > 0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed by both ¹H NMR and LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and was at worst 10.4%. Given the close agreement with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR measurement is a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.
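The external-calibration step described above can be sketched numerically; a minimal illustration with hypothetical concentrations and signal integrals, not the paper's NMR data:

```python
# Sketch of quantification via an external calibration line: fit a
# least-squares line through calibration standards, then invert it for
# an unknown sample. All numbers are illustrative.
def fit_line(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

standards_conc = [0.0, 1.0, 2.0, 4.0]   # hypothetical standards (mM)
standards_sig  = [0.1, 0.6, 1.1, 2.1]   # perfectly linear for clarity
slope, intercept = fit_line(standards_conc, standards_sig)
unknown_conc = (1.6 - intercept) / slope  # invert for an unknown signal
print(round(slope, 3), round(intercept, 3), round(unknown_conc, 3))  # -> 0.5 0.1 3.0
```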
Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie
2012-07-06
Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.
Malhat, Farag; Boulangé, Julien; Abdelraheem, Ehab; Abd Allah, Osama; Abd El-Hamid, Rania; Abd El-Salam, Shokr
2017-08-15
A simple and rapid gas chromatography with flame photometric detection (GC-FPD) method was developed to determine residue levels and investigate the dissipation pattern and safe use of fenitrothion in tomatoes. A modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) procedure, using ethyl acetate-based extraction followed by dispersive solid-phase extraction (d-SPE) with primary-secondary amine (PSA) and graphitized carbon black (GCB) for clean-up, was applied prior to GC-FPD analysis. The method showed satisfactory linearity, recovery and precision. The limits of detection (LOD) and quantification (LOQ) were 0.005 and 0.01 mg/kg, respectively. The residue levels of fenitrothion were best described by first-order kinetics, with a half-life of 2.2 days in tomatoes. The potential health risks posed by fenitrothion were not significant, based on supervised residue trial data. The current findings could provide guidance for the safe and reasonable use of fenitrothion in tomatoes and prevent health problems for consumers. Copyright © 2017 Elsevier Ltd. All rights reserved.
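The first-order kinetics mentioned above imply C(t) = C0 * exp(-k*t) with half-life t_half = ln(2)/k; a minimal sketch using the reported 2.2-day half-life (the initial concentration is illustrative):

```python
# First-order dissipation sketch: C(t) = C0 * exp(-k*t), half-life
# t_half = ln(2) / k. The rate constant is back-calculated from the
# 2.2-day half-life reported above; concentrations are illustrative.
import math

def half_life(k):
    return math.log(2) / k

def residue(c0, k, t):
    return c0 * math.exp(-k * t)

k = math.log(2) / 2.2                  # rate constant for a 2.2-day half-life
print(round(half_life(k), 2))          # -> 2.2
print(round(residue(1.0, k, 4.4), 3))  # two half-lives -> 0.25
```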
Is qPCR a Reliable Indicator of Cyanotoxin Risk in Freshwater?
Pacheco, Ana Beatriz F.; Guedes, Iame A.; Azevedo, Sandra M.F.O.
2016-01-01
The wide distribution of cyanobacteria in aquatic environments leads to the risk of water contamination by cyanotoxins, which generate environmental and public health issues. Measurements of cell densities or pigment contents allow both the early detection of cellular growth and bloom monitoring, but these methods are not sufficiently accurate to predict actual cyanobacterial risk. To quantify cyanotoxins, analytical methods are considered the gold standards, but they are laborious, expensive, time-consuming and available in a limited number of laboratories. In cyanobacterial species with toxic potential, cyanotoxin production is restricted to some strains, and blooms can contain varying proportions of both toxic and non-toxic cells, which are morphologically indistinguishable. The sequencing of cyanobacterial genomes led to the description of gene clusters responsible for cyanotoxin production, which paved the way for the use of these genes as targets for PCR and then quantitative PCR (qPCR). Thus, the quantification of cyanotoxin genes appeared as a new method for estimating the potential toxicity of blooms. This raises a question concerning whether qPCR-based methods would be a reliable indicator of toxin concentration in the environment. Here, we review studies that report the parallel detection of microcystin genes and microcystin concentrations in natural populations and also a smaller number of studies dedicated to cylindrospermopsin and saxitoxin. We discuss the possible issues associated with the contradictory findings reported to date, present methodological limitations and consider the use of qPCR as an indicator of cyanotoxin risk. PMID:27338471
NASA Astrophysics Data System (ADS)
Holland, Katharina; van Gils, Carla H.; Wanders, Johanna OP; Mann, Ritse M.; Karssemeijer, Nico
2016-03-01
The sensitivity of mammograms is low for women with dense breasts, since cancers may be masked by dense tissue. In this study, we investigated methods to identify women with density patterns associated with a high masking risk. Risk measures are derived from volumetric breast density maps. We used the last negative screening mammograms of 93 women who subsequently presented with an interval cancer (IC), and, as controls, 930 randomly selected normal screening exams from women without cancer. Volumetric breast density maps were computed from the mammograms, which provide the dense tissue thickness at each location. These were used to compute absolute and percentage glandular tissue volume. We modeled the masking risk for each pixel location using the absolute and percentage dense tissue thickness and we investigated the effect of taking the cancer location probability distribution (CLPD) into account. For each method, we selected cases with the highest masking measure (by thresholding) and computed the fraction of ICs as a function of the fraction of controls selected. The latter can be interpreted as the negative supplemental screening rate (NSSR). Between the models, when incorporating CLPD, no significant differences were found. In general, the methods performed better when CLPD was included. At higher NSSRs some of the investigated masking measures had a significantly higher performance than volumetric breast density. These measures may therefore serve as an alternative to identify women with a high risk for a masked cancer.
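The threshold-based evaluation described above (fraction of interval cancers captured at a given negative supplemental screening rate) can be sketched with synthetic masking scores; the values below are illustrative, not the study's data:

```python
# Select the cases with the highest masking measure by thresholding, and
# report the fraction of interval cancers (ICs) captured when a given
# fraction of controls is selected (the NSSR). Scores are synthetic.
def ic_fraction_at_nssr(ic_scores, control_scores, nssr):
    # choose a threshold so that `nssr` of controls fall at or above it
    n_sel = max(1, int(round(nssr * len(control_scores))))
    thresh = sorted(control_scores, reverse=True)[n_sel - 1]
    captured = sum(1 for s in ic_scores if s >= thresh)
    return captured / len(ic_scores)

ics      = [0.9, 0.8, 0.7, 0.4]
controls = [0.6, 0.5, 0.5, 0.3, 0.2, 0.2, 0.1, 0.1, 0.05, 0.0]
print(ic_fraction_at_nssr(ics, controls, 0.2))  # -> 0.75
```

Sweeping the NSSR from 0 to 1 traces out the curve the study uses to compare masking measures.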
Evaluating the combined adverse effects of multiple stressors upon human health is an imperative component of cumulative risk assessment (CRA). In addition to chemical stressors, other non-chemical factors are also considered. For example, smoking will elevate the risks of havi...
Assessing the chemical contamination dynamics in a mixed land use stream system.
Sonne, Anne Th; McKnight, Ursula S; Rønde, Vinni; Bjerg, Poul L
2017-11-15
Traditionally, the monitoring of streams for chemical and ecological status has been limited to surface water concentrations, where the dominant focus has been on general water quality and the risk of eutrophication. Mixed land use stream systems, comprising urban areas and agricultural production, are challenging to assess, with multiple chemical stressors impacting stream corridors. New approaches are urgently needed for identifying relevant sources, pathways and potential impacts for implementation of suitable source management and remedial measures. We developed a method for risk assessment of chemical stressors in these systems and applied the approach to a 16-km groundwater-fed stream corridor (Grindsted, Denmark). Three methods were combined: (i) in-stream contaminant mass discharge for source quantification, (ii) Toxic Units and (iii) environmental standards. An evaluation of the chemical quality of all three stream compartments - stream water, hyporheic zone, streambed sediment - made it possible to link chemical stressors to their respective sources and obtain new knowledge about source composition and origin. Moreover, toxic unit estimation and comparison to environmental standards revealed that the stream water quality was substantially impaired by both geogenic and diffuse anthropogenic sources of metals along the entire corridor, while the streambed was less impacted. Quantification of the contaminant mass discharge originating from a former pharmaceutical factory revealed that several hundred kilograms of chlorinated ethenes and pharmaceutical compounds discharge into the stream every year. The strongly reduced redox conditions in the plume result in high concentrations of dissolved iron and additionally release arsenic, generating the complex contaminant mixture found in the narrow discharge zone. The fingerprint of the plume was observed in the stream several km downgradient, while nutrients, inorganics and pesticides played a minor role for the stream health.
The results emphasize that future investigations should include multiple compounds and stream compartments, and highlight the need for holistic approaches when assessing risk in these dynamic systems. Copyright © 2017 Elsevier Ltd. All rights reserved.
Dong, Tao; Yu, Liang; Gao, Difeng; Yu, Xiaochen; Miao, Chao; Zheng, Yubin; Lian, Jieni; Li, Tingting; Chen, Shulin
2015-12-01
Accurate determination of fatty acid contents is routinely required in microalgal and yeast biofuel studies. A method of rapid in situ fatty acid methyl ester (FAME) derivatization directly from wet fresh microalgal and yeast biomass was developed in this study. This method does not require prior solvent extraction or dehydration. FAMEs were prepared with a sequential alkaline hydrolysis (15 min at 85 °C) and acidic esterification (15 min at 85 °C) process. The resulting FAMEs were extracted into n-hexane and analyzed using gas chromatography. The effects of each processing parameter (temperature, reaction time, and water content) upon the lipids quantification in the alkaline hydrolysis step were evaluated with a full factorial design. This method could tolerate water content up to 20% (v/v) in total reaction volume, which equaled up to 1.2 mL of water in biomass slurry (with 0.05-25 mg of fatty acid). There were no significant differences in FAME quantification (p>0.05) between the standard AOAC 991.39 method and the proposed wet in situ FAME preparation method. This fatty acid quantification method is applicable to fresh wet biomass of a wide range of microalgae and yeast species.
Comprehensive Evaluation and Implementation of Improvement Actions in Butcher Shops
Leotta, Gerardo A.; Brusa, Victoria; Galli, Lucía; Adriani, Cristian; Linares, Luciano; Etcheverría, Analía; Sanz, Marcelo; Sucari, Adriana; Peral García, Pilar; Signorini, Marcelo
2016-01-01
Foodborne pathogens can cause acute and chronic diseases and produce a wide range of symptoms. Since the consumption of ground beef is a risk factor for infections with some bacterial pathogens, we performed a comprehensive evaluation of butcher shops, implemented improvement actions for both butcher shops and consumers, and verified the impact of those actions implemented. A comprehensive evaluation was made and risk was quantified on a 1–100 scale as high-risk (1–40), moderate-risk (41–70) or low-risk (71–100). A total of 172 raw ground beef and 672 environmental samples were collected from 86 butcher shops during the evaluation (2010–2011) and verification (2013) stages of the study. Ground beef samples were analyzed for mesophilic aerobic organisms, Escherichia coli and coagulase-positive Staphylococcus aureus enumeration. Salmonella spp., E. coli O157:H7, non-O157 Shiga toxin-producing E. coli (STEC), and Listeria monocytogenes were detected and isolated from all samples. Risk quantification resulted in 43 (50.0%) high-risk, 34 (39.5%) moderate-risk, and nine (10.5%) low-risk butcher shops. Training sessions for 498 handlers and 4,506 consumers were held. Re-evaluation by risk quantification and microbiological analyses resulted in 19 (22.1%) high-risk, 42 (48.8%) moderate-risk and 25 (29.1%) low-risk butcher shops. The count of indicator microorganisms decreased with respect to the 2010–2011 period. After the implementation of improvement actions, the presence of L. monocytogenes, E. coli O157:H7 and stx genes in ground beef decreased. Salmonella spp. was isolated from 10 (11.6%) ground beef samples, without detecting statistically significant differences between both study periods (evaluation and verification). The percentage of pathogens in environmental samples was reduced in the verification period (Salmonella spp., 1.5%; L. monocytogenes, 10.7%; E. coli O157:H7, 0.6%; non-O157 STEC, 6.8%). 
Risk quantification was useful to identify those relevant facts in butcher shops. The reduction of contamination in ground beef and the environment was possible after training handlers based on the problems identified in their own butcher shops. Our results confirm the feasibility of implementing a comprehensive risk management program in butcher shops, and the importance of information campaigns targeting consumers. Further collaborative efforts would be necessary to improve foodstuffs safety at retail level and at home. PMID:27618439
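The 1-100 risk scale described above maps scores directly to the three categories; a minimal sketch of that classification (thresholds taken from the text):

```python
# Map a 1-100 butcher-shop evaluation score to the risk categories
# defined above: 1-40 high-risk, 41-70 moderate-risk, 71-100 low-risk.
def risk_category(score):
    if not 1 <= score <= 100:
        raise ValueError("score must be in 1-100")
    if score <= 40:
        return "high-risk"
    if score <= 70:
        return "moderate-risk"
    return "low-risk"

print(risk_category(35), risk_category(55), risk_category(85))
```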
[Progress in stable isotope labeled quantitative proteomics methods].
Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui
2013-06-01
Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods; the latter have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, supporting rapid advances in biological research. In this work, we discuss the progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.
Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument; the determined Cf was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases and RSDr values were less than 30% and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.
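The conversion-factor step described above (a measured copy-number ratio divided by Cf to give a weight ratio) can be sketched as follows; the copy numbers are illustrative, not trial data:

```python
# Sketch of the Cf-based conversion: the copy-number ratio of the
# event-specific target to a taxon-specific reference gene is divided by
# the experimentally determined conversion factor to yield a GMO weight
# percentage. Copy numbers below are illustrative.
CF = 1.24  # conversion factor reported for the real-time PCR instrument

def gmo_percent(event_copies, reference_copies, cf=CF):
    return (event_copies / reference_copies) / cf * 100.0

print(round(gmo_percent(62.0, 10000.0), 2))  # -> 0.5
```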
Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2013-01-01
A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not MON89788. We therefore designed a screening method combining the quantification of P35S with the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMO.
Risk identification of agricultural drought for sustainable agroecosystems
NASA Astrophysics Data System (ADS)
Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.; Tarquis, A. M.
2014-04-01
Drought is considered one of the major natural hazards, with significant impact on agriculture, the environment, society and the economy. Droughts affect the sustainability of agriculture and may result in environmental degradation of a region, which is one of the factors contributing to the vulnerability of agriculture. This paper addresses agrometeorological, or agricultural, drought within the risk management framework. Risk management consists of risk assessment, as well as feedback on the adopted risk reduction measures. Risk assessment comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. This paper deals with risk identification of agricultural drought, which involves drought quantification and monitoring, as well as statistical inference. For the quantitative assessment of agricultural drought and the computation of spatiotemporal features, one of the most reliable and widely used indices is applied, namely the Vegetation Health Index (VHI). The computation of VHI is based on satellite data of temperature and the Normalized Difference Vegetation Index (NDVI). The spatiotemporal features of drought extracted from VHI are: areal extent, onset and end time, duration and severity. In this paper, a 20-year (1981-2001) time series of NOAA/AVHRR satellite data is used, from which monthly VHI images are extracted. The application is implemented in Thessaly, the major agricultural drought-prone region of Greece, characterized by vulnerable agriculture. The results show that agricultural drought appears every year during the warm season in the region. The severity of drought increases from mild to extreme throughout the warm season, with peaks appearing in the summer. Similarly, the areal extent of drought also increases during the warm season, whereas the number of extreme drought pixels is much smaller than that of mild to moderate drought throughout the warm season.
Finally, the areas with diachronic drought persistence can be located. Drought early warning is developed using empirical functional relationships of severity and areal extent. In particular, two second-order polynomials are fitted, one for low and the other for high severity drought classes, respectively. The two fitted curves offer a forecasting tool on a monthly basis from May to October. The results of this drought risk identification effort are considered quite satisfactory offering a prognostic potential. The adopted remote sensing data and methods have proven very effective in delineating spatial variability and features in drought quantification and monitoring.
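The VHI used above is commonly computed (after Kogan) from a Vegetation Condition Index (from NDVI) and a Temperature Condition Index (from brightness temperature); a minimal sketch with illustrative values and an assumed equal weighting a = 0.5:

```python
# Common VHI formulation: VCI scales NDVI between its multi-year min/max,
# TCI scales temperature inversely, and VHI is their weighted mean.
# All input values and the weight a = 0.5 are illustrative.
def vci(ndvi, ndvi_min, ndvi_max):
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(t, t_min, t_max):
    return 100.0 * (t_max - t) / (t_max - t_min)

def vhi(ndvi, ndvi_min, ndvi_max, t, t_min, t_max, a=0.5):
    return a * vci(ndvi, ndvi_min, ndvi_max) + (1 - a) * tci(t, t_min, t_max)

# Low NDVI and high temperature for the month -> low VHI (drought stress)
print(round(vhi(0.3, 0.2, 0.6, 308.0, 290.0, 310.0), 1))  # -> 17.5
```

Thresholding VHI per pixel and per month is what yields the severity classes and areal-extent statistics discussed above.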
Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.
Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev
2015-05-06
RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based RNA quantification assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL, as well as RNA-specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with original concentration as low as 55.6 pg/μL compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
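The spike-in scheme described above can be sketched arithmetically; a minimal illustration (readings in pg/μL; the dilution factor and all values are hypothetical, not the paper's data):

```python
# Sketch of the spike-in scheme: a known RNA quantity raises the assay
# reading into its measurable range, and the sample's contribution is the
# increase over the spike-in-only baseline. The original concentration is
# recovered by scaling for the dilution into the assay tube. All numbers
# are hypothetical.
def sample_conc(reading, baseline, dilution_factor):
    return (reading - baseline) * dilution_factor

# e.g. 1 uL of sample diluted into a 200 uL assay volume -> 200x factor
print(sample_conc(reading=30.0, baseline=25.0, dilution_factor=200.0))  # -> 1000.0
```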
Merrifield, R C; Stephan, C; Lead, J R
2018-02-20
Quantifying metal and nanoparticle (NP) biouptake and distribution on an individual-cell basis has previously been impossible with available techniques, which provide qualitative data that are laborious to acquire and prone to artifacts. Quantifying metal and metal-NP uptake and loss processes in environmental organisms will lead to mechanistic understanding of biouptake and improved understanding of the potential hazards and risks of metals and NPs. In this work, we present a new technique, single-cell inductively coupled plasma mass spectrometry (SC-ICP-MS), which allows quantification of metal concentrations on an individual-cell basis down to the attogram (ag) per cell level. We present data validating the novel method, along with the mass of metal per cell. Finally, we use SC-ICP-MS, with ancillary cell counting methods, to quantify the biouptake, strong sorption and distribution of both dissolved Au and Au NPs in a freshwater alga (Cryptomonas ovata). The data suggest differences between dissolved and NP uptake and loss. In the case of NPs, there was a dose- and time-dependent uptake, but with individual cellular variation; at the highest realistic exposure conditions used in this study, up to 40-50% of cells contained NPs, while 50-60% of cells did not.
Gandola, Emanuele; Antonioli, Manuela; Traficante, Alessio; Franceschini, Simone; Scardi, Michele; Congestri, Roberta
2016-05-01
Toxigenic cyanobacteria are one of the main health risks associated with water resources worldwide, as their toxins can affect humans and fauna exposed via drinking water, aquaculture and recreation. Microscopy monitoring of cyanobacteria in water bodies and mass-culture systems is a routine operation for cell abundance and growth estimation. Here we present ACQUA (Automated Cyanobacterial Quantification Algorithm), a new fully automated image analysis method designed for filamentous genera in bright-field microscopy. A pre-processing algorithm has been developed to highlight filaments of interest against background signals due to other phytoplankton and dust. A spline-fitting algorithm has been designed to recombine interrupted and crossing filaments in order to perform accurate morphometric analysis and to extract the surface pattern information of highlighted objects. In addition, 17 specific pattern indicators have been developed and used as input data for a machine-learning algorithm dedicated to discriminating among five widespread toxic or potentially toxic filamentous genera in freshwater: Aphanizomenon, Cylindrospermopsis, Dolichospermum, Limnothrix and Planktothrix. The method was validated using freshwater samples from three Italian volcanic lakes, comparing automated vs. manual results. ACQUA proved to be a fast and accurate tool to rapidly assess freshwater quality and to characterize cyanobacterial assemblages in aquatic environments. Copyright © 2016 Elsevier B.V. All rights reserved.
Semi-automatic knee cartilage segmentation
NASA Astrophysics Data System (ADS)
Dam, Erik B.; Folkesson, Jenny; Pettersen, Paola C.; Christiansen, Claus
2006-03-01
Osteoarthritis (OA) is a very common age-related cause of pain and reduced range of motion. A central effect of OA is wear-down of the articular cartilage that otherwise ensures smooth joint motion. Quantification of the cartilage breakdown is central in monitoring disease progression, and therefore cartilage segmentation is required. Recent advances allow automatic cartilage segmentation with high accuracy in most cases. However, the automatic methods still fail in some problematic cases. For clinical studies, even if a few failing cases are averaged out in the overall results, this reduces the mean accuracy and precision and thereby necessitates larger/longer studies. Since the severe OA cases are often the most problematic for the automatic methods, there is even a risk that the quantification will introduce a bias in the results. Therefore, interactive inspection and correction of these problematic cases is desirable. For diagnosis of individuals, this is even more crucial since the diagnosis will otherwise simply fail. We introduce and evaluate a semi-automatic cartilage segmentation method combining an automatic pre-segmentation with an interactive step that allows inspection and correction. The automatic step consists of voxel classification based on supervised learning. The interactive step combines a watershed transformation of the original scan with the posterior probability map from the classification step at sub-voxel precision. We evaluate the method for the task of segmenting the tibial cartilage sheet from low-field magnetic resonance imaging (MRI) of knees. The evaluation shows that the combined method allows accurate and highly reproducible correction of the segmentation of even the worst cases in approximately ten minutes of interaction.
Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios
2011-01-19
Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food.
Pocock, Tessa; Król, Marianna; Huner, Norman P A
2004-01-01
Chlorophylls and carotenoids are functionally important pigment molecules in photosynthetic organisms. Methods for the determination of chlorophylls a and b, beta-carotene, neoxanthin, and the pigments that are involved in photoprotective cycles such as the xanthophylls are discussed. These cycles involve the reversible de-epoxidation of violaxanthin into antheraxanthin and zeaxanthin, as well as the reversible de-epoxidation of lutein-5,6-epoxide into lutein. This chapter describes pigment extraction procedures from higher plants and green algae. Methods for the determination and quantification using high-performance liquid chromatography (HPLC) are described, as well as methods for the separation and purification of pigments for use as standards using thin-layer chromatography (TLC). In addition, several spectrophotometric methods for the quantification of chlorophylls a and b are described.
Metering error quantification under voltage and current waveform distortion
NASA Astrophysics Data System (ADS)
Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran
2017-09-01
With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because such errors compromise metering accuracy and fairness, the combined error of energy metering is an important subject of study. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a method for quantifying metering mode error under waveform distortion is proposed. Based on metering and time-division multiplier principles, a method for quantifying metering accuracy error is also proposed. By analyzing the mode error and the accuracy error together, a comprehensive error analysis method is presented that is suitable for renewable energy and nonlinear loads. The proposed method has been validated by simulation.
A simple and fast method for extraction and quantification of cryptophyte phycoerythrin.
Thoisen, Christina; Hansen, Benni Winding; Nielsen, Søren Laurentius
2017-01-01
The microalgal pigment phycoerythrin (PE) is of commercial interest as a natural colorant in food and cosmetics, as well as a fluoroprobe for laboratory analysis. Several methods for extraction and quantification of PE are available, but they typically involve various extraction buffers, repetitive freeze-thaw cycles and liquid nitrogen, making the extraction procedures complicated. A simple method for extraction of PE from cryptophytes is described using standard laboratory materials and equipment. The cryptophyte cells on the filters were disrupted at -80 °C, and phosphate buffer was added for extraction at 4 °C, followed by absorbance measurement. The cryptophyte Rhodomonas salina was used as a model organism. • Simple method for extraction and quantification of phycoerythrin from cryptophytes. • Minimal usage of equipment and chemicals, and low labor costs. • Applicable for industrial and biological purposes.
Pre-analytical conditions in non-invasive prenatal testing of cell-free fetal RHD.
Clausen, Frederik Banch; Jakobsen, Tanja Roien; Rieneck, Klaus; Krog, Grethe Risum; Nielsen, Leif Kofoed; Tabor, Ann; Dziegiel, Morten Hanefeld
2013-01-01
Non-invasive prenatal testing of cell-free fetal DNA (cffDNA) in maternal plasma can predict the fetal RhD type in D negative pregnant women. In Denmark, routine antenatal screening for the fetal RhD gene (RHD) directs the administration of antenatal anti-D prophylaxis only to women who carry an RhD positive fetus. Prophylaxis reduces the risk of immunization that may lead to hemolytic disease of the fetus and the newborn. The reliability of predicting the fetal RhD type depends on pre-analytical factors and assay sensitivity. We evaluated the testing setup in the Capital Region of Denmark, based on data from routine antenatal RHD screening. Blood samples were drawn at gestational age 25 weeks. DNA extracted from 1 mL of plasma was analyzed for fetal RHD using a duplex method for exon 7/10. We investigated the effect of blood sample transportation time (n = 110) and ambient outdoor temperatures (n = 1539) on the levels of cffDNA and total DNA. We compared two different quantification methods, the delta Ct method and a universal standard curve. PCR pipetting was compared on two systems (n = 104). The cffDNA level was unaffected by blood sample transportation for up to 9 days and by ambient outdoor temperatures ranging from -10 °C to 28 °C during transport. The universal standard curve was applicable for cffDNA quantification. Identical levels of cffDNA were observed using the two automated PCR pipetting systems. We detected a mean of 100 fetal DNA copies/mL at a median gestational age of 25 weeks (range 10-39, n = 1317). The setup for real-time PCR-based, non-invasive prenatal testing of cffDNA in the Capital Region of Denmark is very robust. Our findings regarding the transportation of blood samples demonstrate the high stability of cffDNA. The applicability of a universal standard curve facilitates easy cffDNA quantification.
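The universal standard curve evaluated above maps a measured qPCR Ct value to a copy number via the linear relation Ct = slope·log10(copies) + intercept. A minimal sketch of that back-calculation, with hypothetical slope, intercept, and plasma volume (not values from the study):

```python
def copies_per_ml(ct, slope=-3.32, intercept=38.0, plasma_ml=1.0):
    """Convert a qPCR Ct value to DNA copies per mL of plasma using a
    linear standard curve: Ct = slope * log10(copies) + intercept.
    A slope of -3.32 corresponds to 100% PCR efficiency."""
    copies_per_reaction = 10 ** ((ct - intercept) / slope)
    return copies_per_reaction / plasma_ml

# A lower Ct implies exponentially more template:
assert copies_per_ml(34.68) > copies_per_ml(38.0)
```

With these assumed parameters, each 3.32-cycle decrease in Ct corresponds to a tenfold increase in template, which is why a single universal curve can serve many runs once efficiency is stable.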
The two major sources of arsenic exposure used in an arsenic risk assessment are water and diet. The extraction, separation and quantification of individual arsenic species from dietary sources is considered an area of uncertainty within the arsenic risk assessment. The uncertain...
Pre-planting risk assessment models for Stagonospora nodorum blotch in winter wheat
USDA-ARS?s Scientific Manuscript database
Stagonospora nodorum blotch (SNB) caused by Parastagonospora nodorum, is a major disease of wheat. Pre-planting factors such as previous crop, tillage, host genotype, disease history, and location of a field affect disease intensity. However, the risk of SNB due to these factors has not been quantif...
Witte, Anna Kristina; Fister, Susanne; Mester, Patrick; Schoder, Dagmar; Rossmanith, Peter
2016-11-01
Fast and reliable pathogen detection is an important issue for human health. Since conventional microbiological methods are rather slow, there is growing interest in detection and quantification using molecular methods. The droplet digital polymerase chain reaction (ddPCR) is a relatively new PCR method for absolute and accurate quantification without external standards. Using the Listeria monocytogenes specific prfA assay, we focused on the questions of whether the assay was directly transferable to ddPCR and whether ddPCR was suitable for samples derived from heterogeneous matrices, such as foodstuffs that often include inhibitors and a non-target bacterial background flora. Although the prfA assay showed suboptimal cluster formation, use of ddPCR for quantification of L. monocytogenes from pure bacterial cultures, artificially contaminated cheese, and naturally contaminated foodstuff was satisfactory over a relatively broad dynamic range. Moreover, results demonstrated the outstanding detection limit of one copy. However, while poorer DNA quality, such as that resulting from longer storage, can impair ddPCR, an internal amplification control (IAC) for prfA, integrated into the genome of L. monocytogenes ΔprfA, showed even slightly better quantification over a broader dynamic range. Graphical Abstract Evaluating the absolute quantification potential of ddPCR targeting Listeria monocytogenes prfA.
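The absolute quantification that ddPCR performs without external standards rests on Poisson statistics: from the fraction p of positive droplets, the mean copies per droplet is λ = −ln(1 − p). A sketch of that calculation (the droplet volume is an assumed typical value, not one stated in the abstract):

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_nl=0.85):
    """Absolute target concentration (copies/uL) from droplet counts,
    using the Poisson correction lambda = -ln(1 - p) to account for
    droplets that received more than one template copy."""
    p = positive / total
    lam = -math.log(1.0 - p)          # mean copies per droplet
    return lam / (droplet_nl * 1e-3)  # convert nL droplet volume to uL

# 2,000 positive droplets out of 15,000 partitions:
# p ~ 0.133, lambda ~ 0.143, roughly 168 copies/uL
```

Because λ depends only on the positive fraction and the partition volume, no standard curve is needed; this is the sense in which ddPCR quantification is "absolute".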
Gaubert, Alexandra; Jeudy, Jérémy; Rougemont, Blandine; Bordes, Claire; Lemoine, Jérôme; Casabianca, Hervé; Salvador, Arnaud
2016-07-01
In a stricter legislative context, greener detergent formulations are being developed. As a result, synthetic surfactants are frequently replaced by bio-sourced surfactants and/or used at lower concentrations in combination with enzymes. In this paper, a LC-MS/MS method was developed for the identification and quantification of enzymes in laundry detergents. Prior to the LC-MS/MS analyses, a specific sample preparation protocol was developed owing to matrix complexity (high surfactant percentages). Then, for each enzyme family mainly used in detergent formulations (protease, amylase, cellulase, and lipase), specific peptides were identified on a high resolution platform. A LC-MS/MS method was then developed in selected reaction monitoring (SRM) MS mode for the light and corresponding heavy peptides. The method was linear on the peptide concentration ranges 25-1000 ng/mL for protease, lipase, and cellulase; 50-1000 ng/mL for amylase; and 5-1000 ng/mL for cellulase in both water and laundry detergent matrices. The application of the developed analytical strategy to real commercial laundry detergents enabled enzyme identification and absolute quantification. For the first time, identification and absolute quantification of enzymes in laundry detergent was realized by LC-MS/MS in a single run. Graphical Abstract Identification and quantification of enzymes by LC-MS/MS.
2011-01-01
Background Campylobacter spp., especially Campylobacter jejuni (C. jejuni) and Campylobacter coli (C. coli), are recognized as the leading human foodborne pathogens in developed countries. Livestock animals carrying Campylobacter pose an important risk for human contamination. Pigs are known to be frequently colonized with Campylobacter, especially C. coli, and to excrete high numbers of this pathogen in their faeces. Molecular tools, notably real-time PCR, provide an effective, rapid, and sensitive alternative to culture-based methods for the detection of C. coli and C. jejuni in various substrates. In order to serve as a diagnostic tool supporting Campylobacter epidemiology, we developed a quantitative real-time PCR method for species-specific detection and quantification of C. coli and C. jejuni directly in faecal, feed, and environmental samples. Results With a sensitivity of 10 genome copies and a linear range of seven to eight orders of magnitude, the C. coli and C. jejuni real-time PCR assays allowed a precise quantification of purified DNA from C. coli and C. jejuni. The assays were highly specific and showed a 6-log-linear dynamic range of quantification with a quantitative detection limit of approximately 2.5 × 10² CFU/g of faeces, 1.3 × 10² CFU/g of feed, and 1.0 × 10³ CFU/m² for the environmental samples. Compared to the results obtained by culture, both C. coli and C. jejuni real-time PCR assays exhibited a specificity of 96.2% with a kappa of 0.94 and 0.89 respectively. For faecal samples of experimentally infected pigs, the coefficients of correlation between the C. coli or C. jejuni real-time PCR assay and culture enumeration were R² = 0.90 and R² = 0.93 respectively. Conclusion The C. coli and C. jejuni real-time quantitative PCR assays developed in this study provide a method capable of directly detecting and quantifying C. coli and C. jejuni in faeces, feed, and environmental samples. 
These assays represent a new diagnostic tool for studying the epidemiology of Campylobacter by, for instance, investigating the carriage and excretion of C. coli and C. jejuni by pigs from conventional herds. PMID:21600037
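The kappa values reported above measure chance-corrected agreement between the PCR assays and culture. Cohen's kappa is computed from a 2×2 contingency table as κ = (pₒ − pₑ)/(1 − pₑ); a sketch with hypothetical counts (not the study's raw data):

```python
def cohens_kappa(tp, fp, fn, tn):
    """Cohen's kappa for agreement between two binary classification
    methods (e.g., real-time PCR vs. culture), computed from a 2x2
    contingency table of sample counts."""
    n = tp + fp + fn + tn
    po = (tp + tn) / n                              # observed agreement
    p_yes = ((tp + fp) / n) * ((tp + fn) / n)       # chance both say "positive"
    p_no = ((fn + tn) / n) * ((fp + tn) / n)        # chance both say "negative"
    pe = p_yes + p_no                               # total chance agreement
    return (po - pe) / (1 - pe)

# Perfect agreement gives kappa = 1:
assert cohens_kappa(50, 0, 0, 50) == 1.0
```

A kappa near 0.9, as reported for these assays, indicates agreement far beyond what overlapping prevalences alone would produce.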
Paul B. Alaback; Duncan C. Lutes
1997-01-01
Methods for the quantification of coarse woody debris volume and the description of spatial patterning were studied in the Tenderfoot Creek Experimental Forest, Montana. The line transect method was found to be an accurate, unbiased estimator of down debris volume (> 10cm diameter) on 1/4 hectare fixed-area plots, when perpendicular lines were used. The Fischer...
Evaluating life-safety risk of fieldwork at New Zealand's active volcanoes
NASA Astrophysics Data System (ADS)
Deligne, Natalia; Jolly, Gill; Taig, Tony; Webb, Terry
2014-05-01
Volcano observatories monitor active or potentially active volcanoes. Although the number and scope of remote monitoring instruments and methods continues to grow, in-person field data collection is still required for comprehensive monitoring. Fieldwork anywhere, and especially in mountainous areas, contains an element of risk. However, on volcanoes with signs of unrest, there is an additional risk of volcanic activity escalating while on site, with potentially lethal consequences. As an employer, a volcano observatory is morally and sometimes legally obligated to take reasonable measures to ensure staff safety and to minimise occupational risk. Here we present how GNS Science evaluates life-safety risk for volcanologists engaged in fieldwork on New Zealand volcanoes with signs of volcanic unrest. Our method includes several key elements: (1) an expert elicitation of how likely an eruption is within a given time frame, (2) quantification, based on historical data where possible, of the likelihood of exposure to near-vent processes, ballistics, or surge at various distances from the vent, given a small, moderate, or large eruption, and (3) an estimate of the fatality rate given exposure to these volcanic hazards. The final product quantifies hourly fatality risk at various distances from a volcanic vent; various thresholds of risk (for example, zones with more than 10⁻⁵ hourly fatality risk) trigger different levels of required approval to undertake work. Although an element of risk will always be present when conducting fieldwork on potentially active volcanoes, this is a first step towards providing objective guidance for go/no go decisions for volcanic monitoring.
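The three elements described combine multiplicatively: hourly fatality risk is the sum, over eruption-size scenarios, of P(eruption this hour) × P(exposure at the site | eruption) × P(fatality | exposure). A minimal sketch with purely illustrative numbers (not GNS Science values):

```python
def hourly_fatality_risk(scenarios):
    """Combine eruption likelihood, exposure probability, and lethality
    into an hourly fatality risk, summed over eruption-size scenarios.
    Each scenario is a tuple: (P(eruption this hour),
    P(hit at work site | eruption), P(death | hit))."""
    return sum(p_erupt * p_hit * p_death
               for p_erupt, p_hit, p_death in scenarios)

# Illustrative numbers only:
risk = hourly_fatality_risk([
    (1e-4, 0.30, 0.9),   # small eruption, site near the vent
    (1e-5, 0.80, 1.0),   # moderate eruption
])
APPROVAL_THRESHOLD = 1e-5   # e.g. extra sign-off required above this level
print(risk > APPROVAL_THRESHOLD)
```

Summing over scenarios keeps the risk measure additive in eruption size, so a zone map can be rebuilt quickly whenever the expert-elicited eruption probability changes.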
A systematic review of waterborne disease burden methodologies from developed countries.
Murphy, H M; Pintar, K D M; McBean, E A; Thomas, M K
2014-12-01
The true incidence of endemic acute gastrointestinal illness (AGI) attributable to drinking water in Canada is unknown. Using a systematic review framework, the literature was evaluated to identify methods used to attribute AGI to drinking water. Several strategies have been suggested or applied to quantify AGI attributable to drinking water at a national level. These vary from simple point estimates, to quantitative microbial risk assessment, to Monte Carlo simulations, which rely on assumptions and epidemiological data from the literature. Using two methods proposed by researchers in the USA, this paper compares the current approaches and key assumptions. Knowledge gaps are identified to inform future waterborne disease attribution estimates. To improve future estimates, there is a need for robust epidemiological studies that quantify the health risks associated with small, private water systems, groundwater systems and the influence of distribution system intrusions on risk. Quantification of the occurrence of enteric pathogens in water supplies, particularly for groundwater, is needed. In addition, there are unanswered questions regarding the susceptibility of vulnerable sub-populations to these pathogens and the influence of extreme weather events (precipitation) on AGI-related health risks. National centralized data to quantify the proportions of the population served by different water sources, by treatment level, source water quality, and the condition of the distribution system infrastructure, are needed.
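One attribution strategy the review compares is Monte Carlo simulation: sample each uncertain input (incidence, water-attributable fraction) from a distribution and propagate to a case estimate. A minimal sketch with invented uncertainty ranges, for illustration only:

```python
import random

def simulate_agi_cases(population, n_iter=10_000, seed=1):
    """Monte Carlo estimate of annual AGI cases attributable to drinking
    water: cases = population * incidence * attributable fraction, with
    both inputs drawn from illustrative uncertainty ranges."""
    random.seed(seed)
    draws = []
    for _ in range(n_iter):
        incidence = random.uniform(0.1, 0.3)        # AGI episodes/person-year
        attributable = random.uniform(0.05, 0.15)   # fraction due to water
        draws.append(population * incidence * attributable)
    draws.sort()
    return draws[len(draws) // 2]   # median of the simulated distribution

# For a population of 1,000,000 the median lands near
# population * 0.2 * 0.1 = 20,000 cases/year for these symmetric ranges.
```

Reporting the full distribution of draws, rather than a single point estimate, is precisely what distinguishes this approach from the simple point-estimate methods mentioned above.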
Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md.; Singh Rawat, Ajay Kumar; Srivastava, Sharad
2017-01-01
Background: Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to high colchicine(s) alkaloids. Objective: This study aimed to develop an easy, cheap, precise, and accurate high-performance thin-layer chromatographic (HPTLC) validated method for simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. and to identify its elite chemotype(s) from Sikkim Himalayas (India). Methods: The HPTLC chromatographic method was developed using mobile phase of chloroform: acetone: diethyl amine (5:4:1) at λmax of 350 nm. Results: Five germplasms were collected from targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that content of colchicine (Rf: 0.72) and gloriosine (Rf: 0.61) varies from 0.035%–0.150% to 0.006%–0.032% (dry wt. basis). Linearity of method was obtained in the concentration range of 100–400 ng/spot of marker(s), exhibiting regression coefficient of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recovery of 97.79 ± 3.86 and 100.023% ± 0.01%, respectively. Limit of detection and limit of quantification were analyzed, respectively, as 6.245, 18.926 and 8.024, 24.316 (ng). Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotype of both the markers. Conclusion: The developed method is validated in terms of accuracy, recovery, and precision studies as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant to explore the chemotypic variability in metabolite content for commercial and medicinal purposes. PMID:29142436
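The ICH validation framework referenced above defines the detection and quantification limits from the calibration data as LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response and S the calibration slope; note that the reported LOQ/LOD ratios (18.926/6.245 and 24.316/8.024) both equal 10/3.3, consistent with these formulas. A sketch with hypothetical σ and S:

```python
def ich_lod_loq(sigma, slope):
    """ICH Q2-style limits of detection and quantification from the
    standard deviation of the response (sigma) and the calibration
    slope (S): LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# Hypothetical calibration statistics (ng response units):
lod, loq = ich_lod_loq(sigma=1.9, slope=1.0)
# The LOQ/LOD ratio is fixed at 10/3.3 by construction.
```

Because both limits scale with σ/S, tightening calibration precision lowers them proportionally, which is why replicate standards are run at the low end of the range.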
Bihan, Kevin; Sauzay, Chloé; Goldwirt, Lauriane; Charbonnier-Beaupel, Fanny; Hulot, Jean-Sebastien; Funck-Brentano, Christian; Zahr, Noël
2015-02-01
Vemurafenib (Zelboraf) is a new tyrosine kinase inhibitor that selectively targets the activated BRAF V600E gene and is indicated for the treatment of advanced BRAF mutation-positive melanoma. We developed a simple method for vemurafenib quantification using liquid chromatography-tandem mass spectrometry. A stability study of vemurafenib in human plasma was also performed. (13)C(6)-vemurafenib was used as the internal standard. A single-step protein precipitation was used for plasma sample preparation. Chromatography was performed on an Acquity UPLC system (Waters) with chromatographic separation by the use of an Acquity UPLC BEH C18 column (2.1 × 50 mm, 1.7-μm particle size; Waters). Quantification was performed using the monitoring of multiple reactions of the following transitions: m/z 488.2 → 381.0 for vemurafenib and m/z 494.2 → 387.0 for the internal standard. This method was linear over the range from 1.0 to 100.0 mcg/mL. The lower limit of quantification was 0.1 mcg/mL for vemurafenib in plasma. Vemurafenib remained stable for 1 month at all levels tested, whether stored at room temperature (20 °C), at +4 °C, or at -20 °C. This method was used successfully to perform a plasma pharmacokinetic study of vemurafenib in a patient after oral administration at steady state. This liquid chromatography-tandem mass spectrometry method for vemurafenib quantification in human plasma is simple, rapid, specific, sensitive, accurate, precise, and reliable.
Bliem, Rupert; Schauer, Sonja; Plicka, Helga; Obwaller, Adelheid; Sommer, Regina; Steinrigl, Adolf; Alam, Munirul; Reischer, Georg H.; Farnleitner, Andreas H.
2015-01-01
Vibrio cholerae is a severe human pathogen and a frequent member of aquatic ecosystems. Quantification of V. cholerae in environmental water samples is therefore fundamental for ecological studies and health risk assessment. Beside time-consuming cultivation techniques, quantitative PCR (qPCR) has the potential to provide reliable quantitative data and offers the opportunity to quantify multiple targets simultaneously. A novel triplex qPCR strategy was developed in order to simultaneously quantify toxigenic and nontoxigenic V. cholerae in environmental water samples. To obtain quality-controlled PCR results, an internal amplification control was included. The qPCR assay was specific, highly sensitive, and quantitative across the tested 5-log dynamic range down to a method detection limit of 5 copies per reaction. Repeatability and reproducibility were high for all three tested target genes. For environmental application, global DNA recovery (GR) rates were assessed for drinking water, river water, and water from different lakes. GR rates ranged from 1.6% to 76.4% and were dependent on the environmental background. Uncorrected and GR-corrected V. cholerae abundances were determined in two lakes with extremely high turbidity. Uncorrected abundances ranged from 4.6 × 10² to 2.3 × 10⁴ cell equivalents liter⁻¹, whereas GR-corrected abundances ranged from 4.7 × 10³ to 1.6 × 10⁶ cell equivalents liter⁻¹. GR-corrected qPCR results were in good agreement with an independent cell-based direct detection method but were up to 1.6 log higher than cultivation-based abundances. We recommend the newly developed triplex qPCR strategy as a powerful tool to simultaneously quantify toxigenic and nontoxigenic V. cholerae in various aquatic environments for ecological studies as well as for risk assessment programs. PMID:25724966
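The GR correction described above is a simple rescaling: the measured abundance is divided by the global DNA recovery rate for that water type. A sketch, using the order of magnitude of the abstract's reported values:

```python
def gr_corrected(uncorrected, gr_rate):
    """Correct a qPCR abundance estimate for incomplete DNA recovery:
    corrected = uncorrected / global DNA recovery (GR) rate,
    where gr_rate is a fraction in (0, 1]."""
    if not 0 < gr_rate <= 1:
        raise ValueError("GR rate must be a fraction in (0, 1]")
    return uncorrected / gr_rate

# e.g. a measured 4.6e2 cell equivalents/L at ~9.8% recovery rises to
# roughly 4.7e3 after correction, matching the reported span:
print(gr_corrected(4.6e2, 0.098))
```

Because GR rates varied from 1.6% to 76.4% across water types, the correction can shift abundances by well over an order of magnitude, which is why it must be assessed per environmental background.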
Epidemiological and clinical correlates of malaria-helminth co-infections in southern Ethiopia
2013-01-01
Background In many areas of the world, including Ethiopia, malaria and helminths are co-endemic; therefore, co-infections are common. However, little is known about how concurrent infections affect the epidemiology and/or pathogenesis of each other. Therefore, this study was conducted to assess the effects of intestinal helminth infections on the epidemiology and clinical patterns of malaria in southern Ethiopia, where both infections are prevalent. Methods A cross-sectional study was conducted in 2006 at Wondo Genet Health Center and Bussa Clinic, southern Ethiopia. Consecutive blood film positive malaria patients (N=230) and malaria negative asymptomatic individuals (N=233) were recruited. Malaria parasite detection and quantification were performed using Giemsa-stained thick and thin blood films, respectively. Helminths were detected using direct microscopy and formol-ether concentration techniques. Coarse quantification of helminth ova was made using the Kato Katz method. Results The overall magnitude of intestinal parasitic infection was high irrespective of malaria infection (67% among malaria positive patients versus 53.1% among malaria non-infected asymptomatic individuals). Trichuris trichiura infection was associated with increased malaria prevalence, while increased worm burden of helminths, as expressed by egg intensity, was associated with increased malaria parasitaemia, which could be a potential factor for the development of severe malarial infection over the course of the disease. The majority (77%) of the subjects had multiple helminth infections. T. trichiura, Ascaris lumbricoides, Schistosoma mansoni, and hookworm infestation accounted for 64.5%, 57.7%, 28.4%, and 12.2% of the infections, respectively. Conclusions Populations in malaria-endemic areas of southern Ethiopia are multi-parasitized with up to four helminths. 
Mass deworming may be a simple practical approach in endemic areas in reducing the risk of severe malarial attack particularly for those at high risk of both infections. PMID:23822192
Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer
NASA Astrophysics Data System (ADS)
Schulte, Horst
2016-09-01
A new method for quantifying uncertain models for robust wind turbine control using sliding-mode techniques is presented, with the objective of improving active load mitigation. The approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term that establishes and maintains motion on a so-called sliding surface. The injection signal is evaluated directly to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four degree-of-freedom model of the NREL 5MW reference turbine containing uncertainties.
NASA Astrophysics Data System (ADS)
Buongiorno, J.; Lloyd, K. G.; Shumaker, A.; Schippers, A.; Webster, G.; Weightman, A.; Turner, S.
2015-12-01
Nearly 75% of the Earth's surface is covered by marine sediment that is home to an estimated 2.9 × 10²⁹ microbial cells. A substantial impediment to understanding the abundance and distribution of cells within marine sediment is the lack of a consistent and reliable method for their taxon-specific quantification. Catalyzed reporter fluorescent in situ hybridization (CARD-FISH) provides taxon-specific enumeration, but this process requires passing a large enzyme through cell membranes, decreasing its precision relative to general cell counts using a small DNA stain. In 2015, Yamaguchi et al. developed FISH hybridization chain reaction (FISH-HCR) as an in situ whole cell detection method for environmental microorganisms. FISH-HCR amplifies the fluorescent signal, as does CARD-FISH, but it allows for milder cell permeation methods that might prevent yield loss. To compare FISH-HCR to CARD-FISH, we examined bacteria and archaea cell counts within two sediment cores, Lille Belt (~78 meters deep) and Landsort Deep (90 meters deep), which were retrieved from the Baltic Sea Basin during IODP Expedition 347. Preliminary analysis shows that CARD-FISH counts are below the quantification limit for most depths across both cores. By contrast, quantification of cells was possible with FISH-HCR at all examined depths. When quantification with CARD-FISH was above the limit of detection, counts with FISH-HCR were up to 11-fold higher for Bacteria and 3-fold higher for Archaea from the same sediment sample. Further, FISH-HCR counts closely follow the trends of shipboard counts, indicating that FISH-HCR may better reflect the cellular abundance within marine sediment than other quantification methods, including qPCR. Using FISH-HCR, we found that archaeal cell counts were on average greater than bacterial cell counts, but within the same order of magnitude.
Eriksen, Jane N; Madsen, Pia L; Dragsted, Lars O; Arrigoni, Eva
2017-02-01
An improved UHPLC-DAD-based method was developed and validated for quantification of major carotenoids present in spinach, serum, chylomicrons, and feces. Separation was achieved with gradient elution within 12.5 min for six dietary carotenoids and the internal standard, echinenone. The proposed method provides, for all standard components, resolution > 1.1, linearity covering the target range (R > 0.99), LOQ < 0.035 mg/L, and intraday and interday RSDs < 2 and 10%, respectively. Suitability of the method was tested on biological matrices. Method precision (RSD%) for carotenoid quantification in serum, chylomicrons, and feces was below 10% for intra- and interday analysis, except for lycopene. Method accuracy was consistent with mean recoveries ranging from 78.8 to 96.9% and from 57.2 to 96.9% for all carotenoids, except for lycopene, in serum and feces, respectively. Additionally, an interlaboratory validation study on spinach at two institutions showed no significant differences in lutein or β-carotene content, when evaluated on four occasions.
Nahar, Limon Khatun; Cordero, Rosa Elena; Nutt, David; Lingford-Hughes, Anne; Turton, Samuel; Durant, Claire; Wilson, Sue; Paterson, Sue
2016-01-01
Abstract A highly sensitive and fully validated method was developed for the quantification of baclofen in human plasma. After adjusting the pH of the plasma samples using a phosphate buffer solution (pH 4), baclofen was purified using mixed mode (C8/cation exchange) solid-phase extraction (SPE) cartridges. Endogenous water-soluble compounds and lipids were removed from the cartridges before the samples were eluted and concentrated. The samples were analyzed using triple-quadrupole liquid chromatography–tandem mass spectrometry (LC–MS-MS) with triggered dynamic multiple reaction monitoring mode for simultaneous quantification and confirmation. The assay was linear from 25 to 1,000 ng/mL (r² > 0.999; n = 6). Intraday (n = 6) and interday (n = 15) imprecisions (% relative standard deviation) were <5%, and the average recovery was 30%. The limit of detection of the method was 5 ng/mL, and the limit of quantification was 25 ng/mL. Plasma samples from healthy male volunteers (n = 9, median age: 22) given two single oral doses of baclofen (10 and 60 mg) on nonconsecutive days were analyzed to demonstrate method applicability. PMID:26538544
Identifying and quantifying secondhand smoke in multiunit homes with tobacco smoke odor complaints
NASA Astrophysics Data System (ADS)
Dacunto, Philip J.; Cheng, Kai-Chung; Acevedo-Bolton, Viviana; Klepeis, Neil E.; Repace, James L.; Ott, Wayne R.; Hildemann, Lynn M.
2013-06-01
Accurate identification and quantification of the secondhand tobacco smoke (SHS) that drifts between multiunit homes (MUHs) is essential for assessing resident exposure and health risk. We collected 24 gaseous and particle measurements over 6-9 day monitoring periods in five nonsmoking MUHs with reported SHS intrusion problems. Nicotine tracer sampling showed evidence of SHS intrusion in all five homes during the monitoring period; logistic regression and chemical mass balance (CMB) analysis enabled identification and quantification of some of the precise periods of SHS entry. Logistic regression models identified SHS in eight periods when residents complained of SHS odor, and CMB provided estimates of SHS magnitude in six of these eight periods. Both approaches properly identified or apportioned all six cooking periods used as no-SHS controls. Finally, both approaches enabled identification and/or apportionment of suspected SHS in five additional periods when residents did not report smelling smoke. The time resolution of this methodology goes beyond sampling methods involving single tracers (such as nicotine), enabling the precise identification of the magnitude and duration of SHS intrusion, which is essential for accurate assessment of human exposure.
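Chemical mass balance, as used above, apportions a measured tracer mixture into source contributions by least squares: measured_i ≈ Σ_j profile_ij · s_j. A pure-Python two-source sketch via the normal equations, with hypothetical tracer profiles (not the study's values):

```python
def cmb_two_source(profiles, measured):
    """Chemical mass balance for two sources: least-squares solve of
    measured[i] = profiles[i][0]*s1 + profiles[i][1]*s2 via the 2x2
    normal equations (no external libraries)."""
    a11 = sum(p[0] * p[0] for p in profiles)
    a12 = sum(p[0] * p[1] for p in profiles)
    a22 = sum(p[1] * p[1] for p in profiles)
    b1 = sum(p[0] * m for p, m in zip(profiles, measured))
    b2 = sum(p[1] * m for p, m in zip(profiles, measured))
    det = a11 * a22 - a12 * a12
    return ((a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det)

# Hypothetical tracer fractions in SHS vs. cooking aerosol:
profiles = [(0.60, 0.20), (0.30, 0.50), (0.10, 0.30)]
measured = [7.0, 5.5, 2.5]   # synthetic ambient mixture (ug/m3)
print(cmb_two_source(profiles, measured))   # ~ (10.0, 5.0)
```

Having more tracers than sources (here three vs. two) is what lets CMB both apportion the SHS contribution and flag periods where no source combination fits, complementing the logistic-regression detection step.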
Arashida, Naoko; Nishimoto, Rumi; Harada, Masashi; Shimbo, Kazutaka; Yamada, Naoyuki
2017-02-15
Amino acids and their related metabolites play important roles in various physiological processes and have consequently become biomarkers for diseases. However, accurate quantification methods have only been established for major compounds, such as amino acids and a limited number of target metabolites. We previously reported a highly sensitive high-throughput method for the simultaneous quantification of amines using 3-aminopyridyl-N-succinimidyl carbamate as a derivatization reagent combined with liquid chromatography-tandem mass spectrometry (LC-MS/MS). Herein, we report the successful development of a practical and accurate LC-MS/MS method to analyze low concentrations of 40 physiological amines in 19 min. Thirty-five of these amines showed good linearity, limits of quantification, accuracy, precision, and recovery characteristics in plasma, with scheduled selected reaction monitoring acquisitions. Plasma samples from 10 healthy volunteers were evaluated using our newly developed method. The results revealed that 27 amines were detected in one of the samples, and that 24 of these compounds could be quantified. Notably, this new method successfully quantified metabolites with high accuracy across three orders of magnitude, with lowest and highest averaged concentrations of 31.7 nM (for spermine) and 18.3 μM (for α-aminobutyric acid), respectively. Copyright © 2016 Elsevier B.V. All rights reserved.
Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana
2017-10-25
In this paper we show the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics: piperacillin-tazobactam, meropenem, linezolid and ceftazidime in 10 μL human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of impact of hematocrit (HCT) on accuracy, reproducibility, recovery and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37 °C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise and reproducible. In contrast to DBS, it allows an accurate quantification without any HCT influence. It has been applied to samples derived from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and consequently therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.
GMO quantification: valuable experience and insights for the future.
Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana
2014-10-01
Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing for quantitative purposes, although accurate quantification of GMO content using this technology, especially in mixed samples, remains a challenge for the future. New approaches are also needed for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
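The standard-curve qPCR quantification and the amplification-efficiency check that the abstract flags as an uncertainty source can be sketched in a few lines. All numbers and helper names below are illustrative, not taken from the abstract: Ct values from a serial dilution are regressed on log10 copy number, the slope yields the efficiency, and unknowns are back-calculated from the fitted line.

```python
import numpy as np

def fit_standard_curve(log10_copies, ct_values):
    """Fit Ct = slope * log10(copies) + intercept.
    Returns slope, intercept, and amplification efficiency
    (efficiency = 10**(-1/slope) - 1; 1.0 means 100%)."""
    slope, intercept = np.polyfit(log10_copies, ct_values, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    """Back-calculate copy number of an unknown from its Ct."""
    return 10.0 ** ((ct - intercept) / slope)

# Idealized 10-fold dilution series (slope ~ -3.32 <=> ~100% efficiency)
log10_copies = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
ct = np.array([15.00, 18.32, 21.64, 24.96, 28.28])
slope, intercept, eff = fit_standard_curve(log10_copies, ct)
```

A real assay would fit the curve on certified reference material and check the efficiency against acceptance criteria before quantifying samples.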
Bostijn, N; Hellings, M; Van Der Veen, M; Vervaet, C; De Beer, T
2018-07-12
Ultraviolet (UV) spectroscopy was evaluated as an innovative Process Analytical Technology (PAT) tool for the in-line and real-time quantitative determination of low-dosed active pharmaceutical ingredients (APIs) in a semi-solid (gel) and a liquid (suspension) pharmaceutical formulation during their batch production process. The performance of this new PAT tool (i.e., UV spectroscopy) was compared with an already more established PAT method based on Raman spectroscopy. In-line UV measurements were carried out with an immersion probe, while for the Raman measurements a non-contact PhAT probe was used. For both studied formulations, an in-line API quantification model was developed and validated per spectroscopic technique. The known API concentrations (Y) were correlated with the corresponding in-line collected preprocessed spectra (X) through a Partial Least Squares (PLS) regression. Each developed quantification method was validated by calculating the accuracy profile on the basis of the validation experiments. Furthermore, the measurement uncertainty was determined based on the data generated for the determination of the accuracy profiles. From the accuracy profile of the UV- and Raman-based quantification method for the gel, it was concluded that at the target API concentration of 2% (w/w), 95 out of 100 future routine measurements given by the Raman method will not deviate more than 10% (relative error) from the true API concentration, whereas for the UV method the acceptance limits of 10% were exceeded. For the liquid formulation, the Raman method was not able to quantify the API in the low-dosed suspension (0.09% (w/w) API). In contrast, the in-line UV method was able to adequately quantify the API in the suspension. This study demonstrated that UV spectroscopy can be adopted as a novel in-line PAT technique for low-dose quantification purposes in pharmaceutical processes.
Importantly, neither of the two spectroscopic techniques was superior to the other for both formulations: the Raman method was more accurate in quantifying the API in the gel (2% (w/w) API), while the UV method performed better for API quantification in the suspension (0.09% (w/w) API). Copyright © 2018 Elsevier B.V. All rights reserved.
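The calibration step described above, regressing known API concentrations Y on in-line spectra X by PLS, can be sketched as follows. This is a minimal PLS1 (NIPALS) implementation run on synthetic two-component spectra; the band shapes, concentration ranges, and noise level are invented for illustration and are not the authors' data.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 via NIPALS; returns the regression vector plus the
    column means needed to center new data at prediction time."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xk, yk = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk
        w /= np.linalg.norm(w)          # weight vector
        t = Xk @ w                      # scores
        p = Xk.T @ t / (t @ t)          # X loadings
        q = (yk @ t) / (t @ t)          # y loading
        Xk, yk = Xk - np.outer(t, p), yk - q * t   # deflate
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)  # final regression vector
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean

# Synthetic gel-like data: API band + dominant excipient band + noise
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 200)
api_band = np.exp(-((grid - 0.3) / 0.05) ** 2)
excipient_band = np.exp(-((grid - 0.7) / 0.10) ** 2)
conc = rng.uniform(1.6, 2.4, 40)        # API % (w/w), around a 2% target
excip = rng.uniform(90.0, 100.0, 40)
X = np.outer(conc, api_band) + np.outer(excip, excipient_band)
X += rng.normal(0.0, 1e-3, X.shape)

B, xm, ym = pls1_fit(X, conc, n_components=2)
pred = pls1_predict(X, B, xm, ym)
rel_err = np.abs(pred - conc) / conc    # accuracy-profile style relative error
```

An accuracy-profile validation would then check that the relative error of future measurements stays inside the ±10% acceptance limits discussed in the abstract.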
Lautié, Emmanuelle; Rasse, Catherine; Rozet, Eric; Mourgues, Claire; Vanhelleputte, Jean-Paul; Quetin-Leclercq, Joëlle
2013-02-01
The aim of this study was to find if fast microwave-assisted extraction could be an alternative to the conventional Soxhlet extraction for the quantification of rotenone in yam bean seeds by SPE and HPLC-UV. For this purpose, an experimental design was used to determine the optimal conditions of the microwave extraction. Then the values of the quantification on three accessions from two different species of yam bean seeds were compared using the two different kinds of extraction. A microwave extraction of 11 min at 55°C using methanol/dichloromethane (50:50) allowed rotenone extraction either equivalently or more efficiently than the 8-h-Soxhlet extraction method and was less sensitive to moisture content. The selectivity, precision, trueness, accuracy, and limit of quantification of the method with microwave extraction were also demonstrated. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.
2010-01-01
Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin-fiber quantification. Patients with all Philadelphia-negative MPN subtypes had higher trabecular volume than controls (p ≤ 0.0015). Results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis highly correlated with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification for osteosclerosis and fibrosis. PMID:20122729
Hydrodynamic modelling and global datasets: Flow connectivity and SRTM data, a Bangkok case study.
NASA Astrophysics Data System (ADS)
Trigg, M. A.; Bates, P. B.; Michaelides, K.
2012-04-01
The rise in the global interconnected manufacturing supply chains requires an understanding and consistent quantification of flood risk at a global scale. Flood risk is often better quantified (or at least more precisely defined) in regions where there has been an investment in comprehensive topographical data collection such as LiDAR coupled with detailed hydrodynamic modelling. Yet in regions where these data and modelling are unavailable, the implications of flooding and the knock-on effects for global industries can be dramatic, as evidenced by the recent floods in Bangkok, Thailand. There is a growing momentum in terms of global modelling initiatives to address this lack of a consistent understanding of flood risk and they will rely heavily on the application of available global datasets relevant to hydrodynamic modelling, such as Shuttle Radar Topography Mission (SRTM) data and its derivatives. These global datasets bring opportunities to apply consistent methodologies on an automated basis in all regions, while the use of coarser scale datasets also brings many challenges such as sub-grid process representation and downscaled hydrology data from global climate models. There are significant opportunities for hydrological science in helping define new, realistic and physically based methodologies that can be applied globally as well as the possibility of gaining new insights into flood risk through analysis of the many large datasets that will be derived from this work. We use Bangkok as a case study to explore some of the issues related to using these available global datasets for hydrodynamic modelling, with particular focus on using SRTM data to represent topography. Research has shown that flow connectivity on the floodplain is an important component in the dynamics of flood flows on to and off the floodplain, and indeed within different areas of the floodplain.
A lack of representation of flow connectivity, often due to data resolution limitations, means that important subgrid processes are missing from hydrodynamic models leading to poor model predictive capabilities. Specifically here, the issue of flow connectivity during flood events is explored using geostatistical techniques to quantify the change of flow connectivity on floodplains due to grid rescaling methods. We also test whether this method of assessing connectivity can be used as new tool in the quantification of flood risk that moves beyond the simple flood extent approach, encapsulating threshold changes and data limitations.
Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.
Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio
2018-01-01
Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification which competes with classical immunoaffinity techniques. It provides a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics and particularly for non-model species since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop a MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database underrepresented species Eriobotrya japonica Lindl.
Watkins, Preston S; Castellon, Benjamin T; Tseng, Chiyen; Wright, Moncie V; Matson, Cole W; Cobb, George P
2018-04-13
A consistent analytical method incorporating sulfuric acid (H2SO4) digestion and ICP-MS quantification has been developed for TiO2 quantification in biotic and abiotic environmentally relevant matrices. Sample digestion in H2SO4 at 110°C provided consistent results without using hydrofluoric acid or microwave digestion. Analysis of seven replicate samples for four matrices on each of 3 days produced Ti recoveries of 97% ± 2.5%, 91% ± 4.0%, 94% ± 1.8%, and 73% ± 2.6% (mean ± standard deviation) from water, fish tissue, periphyton, and sediment, respectively. The method demonstrated consistent performance in analysis of water collected over a 1-month period.
Model Uncertainty Quantification Methods In Data Assimilation
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
Allevi, Pietro; Femia, Eti Alessandra; Costa, Maria Letizia; Cazzola, Roberta; Anastasia, Mario
2008-11-28
The present report describes a method for the quantification of N-acetyl- and N-glycolylneuraminic acids without any derivatization, using their (13)C(3)-isotopologues as internal standards and a C(18) reversed-phase column modified by decylboronic acid, which allows for the first time a complete chromatographic separation between the two analytes. The method is based on high-performance liquid chromatography coupled with electrospray ion-trap mass spectrometry. The limit of quantification of the method is 0.1 mg/L (2.0 ng on column) for both analytes. The calibration curves are linear for both sialic acids over the range of 0.1-80 mg/L (2.0-1600 ng on column) with a correlation coefficient greater than 0.997. The proposed method was applied to the quantitative determination of sialic acids released from fetuin as a model of glycoproteins.
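The linear calibration the abstract reports (0.1-80 mg/L, correlation coefficient > 0.997) is the generic workhorse of such methods and can be sketched as follows; the response values below are invented for illustration and are not the authors' data.

```python
import numpy as np

# Hypothetical calibration standards (mg/L) and instrument response (peak area)
conc = np.array([0.1, 1.0, 5.0, 10.0, 20.0, 40.0, 80.0])
area = np.array([0.52, 5.1, 24.9, 50.3, 99.8, 200.1, 399.0])

# Least-squares calibration line and its correlation coefficient
slope, intercept = np.polyfit(conc, area, 1)
r = np.corrcoef(conc, area)[0, 1]

def quantify(sample_area):
    """Back-calculate concentration (mg/L) from the calibration line."""
    return (sample_area - intercept) / slope
```

In practice the lowest standard that still meets precision and accuracy criteria defines the limit of quantification, and r (or r²) is checked against the acceptance threshold before the curve is used.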
Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.
Becerra, Valentina; Odermatt, Jürgen
2013-02-01
This article analyses the interferences in the quantification of traces of bisphenol S in paper by applying the direct analytical method "analytical pyrolysis gas chromatography mass spectrometry" (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are simultaneously analysed with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples, which include bisphenol S derivative compounds. As free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of tetramethylammonium hydroxide (TMAH) under pyrolytic conditions. In order to avoid this formation of free bisphenol S, trimethylsulphonium hydroxide (TMSH) is introduced instead. Different parameters are optimised in the development of the quantification method with TMSH. The quantification method based on TMSH thermochemolysis has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.
Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer's Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
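A toy illustration of the point made above, using entirely invented band shapes and concentrations rather than the authors' spectra: when the acid distorts the analyte's band, a single-wavelength Beer's-law calibration confounds the two effects, but regressing concentration on many wavelengths at once absorbs the acid-dependent variation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, n_wl = 30, 12
grid = np.linspace(0.0, 1.0, n_wl)
pu_band = np.exp(-((grid - 0.4) / 0.1) ** 2)    # hypothetical Pu(IV) band
acid_band = np.exp(-((grid - 0.6) / 0.1) ** 2)  # acid-dependent distortion

pu = rng.uniform(0.01, 0.10, n)     # analyte concentration (illustrative units)
acid = rng.uniform(0.5, 8.0, n)     # nitric acid concentration, mol/L
# Absorbance: analyte band plus an acid-dependent perturbation plus noise
A = np.outer(pu, pu_band) + 0.02 * np.outer(pu * acid, acid_band)
A += rng.normal(0.0, 1e-4, A.shape)

# Multivariate calibration: regress concentration on the full spectrum
X = np.hstack([A, np.ones((n, 1))])
coef, *_ = np.linalg.lstsq(X, pu, rcond=None)
pred = X @ coef
```

The regression finds a wavelength combination insensitive to the acid-driven feature, so the analyte is quantified without knowing the acid concentration; chemometric tools such as PLS do the same with built-in protection against overfitting.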
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana
2015-08-18
Presence of genetically modified organisms (GMO) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labeling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve-based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite widespread use of initial PCR-based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection rather than quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays for quantification of all 12 EU-authorized GM maize lines (as of 1 April 2015) were developed. Because of the high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. Thus, it was concluded that the multiplex ddPCR assays could be applied for routine quantification of the 12 EU-authorized GM maize lines. In case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.
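The transgene/endogene ratio underlying the ddPCR assays above comes from Poisson-corrected droplet counts; a minimal sketch with invented droplet counts (not the authors' data):

```python
import math

def ddpcr_copies_per_droplet(positive, total):
    """Poisson correction for end-point digital PCR: with fraction p of
    positive droplets, mean copies per droplet is lambda = -ln(1 - p)."""
    p = positive / total
    return -math.log(1.0 - p)

def gmo_ratio(transgene_pos, endogene_pos, total_droplets):
    """GMO content expressed as the transgene/endogene copy ratio."""
    return (ddpcr_copies_per_droplet(transgene_pos, total_droplets)
            / ddpcr_copies_per_droplet(endogene_pos, total_droplets))

# e.g. 180 transgene-positive and 18,000 endogene-positive droplets of 20,000
ratio = gmo_ratio(180, 18000, 20000)   # roughly 0.4% GMO, below the 0.9% threshold
```

The correction matters because a droplet holding two transgene copies still reads as one positive; at high positive fractions the raw count badly underestimates the copy number.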
Modeling Dental Health Care Workers' Risk of Occupational Infection from Bloodborne Pathogens.
ERIC Educational Resources Information Center
Capilouto, Eli; And Others
1990-01-01
The brief paper offers a model which permits quantification of dental health care workers' risk of occupationally acquiring infection from bloodborne pathogens such as human immunodeficiency virus and hepatitis B virus. The model incorporates five parameters, such as the probability that any individual patient is infected and the number of patients…
Griffin, Michael J
2015-01-01
At work or in leisure activities, many people are exposed to vibration or mechanical shocks associated with risks of injury or disease. This paper identifies information that can be used to decide whether there may be a risk from exposure to hand-transmitted vibration or whole-body vibration and shock, and suggests actions that can control the risks. The complex and time-varying nature of human exposures to vibration and shock, the complexity of the different disorders and uncertainty as to the mechanisms of injury and the factors influencing injury have prevented the definition of dose-response relationships well proven by scientific study. It is necessary to wave a flag indicating when there is a need to control risks from exposure to vibration and shock while scientific enquiry provides understanding needed to weave a better flag. It is concluded that quantifying exposure severity is often neither necessary nor sufficient to either identify risks or implement measures that control the risks. The identification of risks associated with exposure to vibration and mechanical shock cannot, and need not, rely solely on the quantification of exposure severity. Qualitative methods can provide a sufficient indication of the need for control measures, which should not be restricted to reducing standardised measures of exposure severity.
Barry, Samantha J; Pham, Tran N; Borman, Phil J; Edwards, Andrew J; Watson, Simon A
2012-01-27
The DMAIC (Define, Measure, Analyse, Improve and Control) framework and associated statistical tools have been applied to both identify and reduce variability observed in a quantitative (19)F solid-state NMR (SSNMR) analytical method. The method had been developed to quantify levels of an additional polymorph (Form 3) in batches of an active pharmaceutical ingredient (API), where Form 1 is the predominant polymorph. In order to validate analyses of the polymorphic form, a single batch of API was used as a standard each time the method was used. The level of Form 3 in this standard was observed to gradually increase over time, the effect not being immediately apparent due to method variability. In order to determine the cause of this unexpected increase and to reduce method variability, a risk-based statistical investigation was performed to identify potential factors which could be responsible for these effects. Factors identified by the risk assessment were investigated using a series of designed experiments to gain a greater understanding of the method. The increase of the level of Form 3 in the standard was primarily found to correlate with the number of repeat analyses, an effect not previously reported in SSNMR literature. Differences in data processing (phasing and linewidth) were found to be responsible for the variability in the method. After implementing corrective actions the variability was reduced such that the level of Form 3 was within an acceptable range of ±1% ww(-1) in fresh samples of API. Copyright © 2011. Published by Elsevier B.V.
García-Jiménez, Sara; Erazo-Mijares, Miguel; Toledano-Jaimes, Cairo D; Monroy-Noyola, Antonio; Bilbao-Marcos, Fernando; Sánchez-Alemán, Miguel A; Déciga-Campos, Myrna
2016-01-01
The present study used analytic techniques to quantify biomarkers that are useful for detecting early ethanol consumption in a college population. A group of 117 students newly admitted to the Universidad Autónoma del Estado de Morelos was analyzed. The enzymatic determination of aspartate aminotransferase, alanine aminotransferase, and gamma-glutamyltransferase as metabolic markers of ethanol, as well as carbohydrate-deficient transferrin (CDT) detected by high-performance liquid chromatography (up to 1.8% of CDT), allowed us to identify that 6% of the college population presented a potential risk of alcohol consumption. The use of this biochemical-analytical method, together with the psychological drug-use and risk-factor instrument established by the Universidad Autónoma del Estado de Morelos, permits us to identify students whose substance abuse puts their terminal efficiency and academic performance at risk. Timely detection on admission to college allows such student consumers to be monitored and supported.
Sun, Caixia; Cang, Tao; Wang, Zhiwei; Wang, Xinquan; Yu, Ruixian; Wang, Qiang; Zhao, Xueping
2015-05-01
The health risk to humans of pesticide application on minor crops, such as strawberry, requires quantification. Here, the dissipation and residual levels of three fungicides (pyraclostrobin, myclobutanil, and difenoconazole) were studied for strawberry under greenhouse conditions using high-performance liquid chromatography (HPLC)-tandem mass spectrometry after Quick, Easy, Cheap, Effective, Rugged, and Safe extraction. This method was validated using blank samples, with all mean recoveries of the three fungicides exceeding 80%. The residues of all three fungicides dissipated following first-order kinetics. The half-lives of pyraclostrobin, myclobutanil, and difenoconazole were 1.69, 3.30, and 3.65 days following a single application and 1.73, 5.78, and 6.30 days following two applications, respectively. Dietary risk was assessed by comparing the estimated daily intake of the three fungicides against the acceptable daily intake. The results indicate that the potential health risk of the three fungicides in strawberry was not significant when following good agricultural practices (GAP) under greenhouse conditions.
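The half-lives above follow from the first-order kinetics the abstract cites: ln C is linear in time with slope -k, and t½ = ln 2 / k. A sketch with invented residue data (chosen only to give a half-life near the reported ~1.7 days, not the authors' measurements):

```python
import numpy as np

# Hypothetical residue decline (mg/kg) over days after application
days = np.array([0.0, 1.0, 2.0, 4.0, 7.0, 10.0])
residue = np.array([1.00, 0.66, 0.44, 0.19, 0.056, 0.017])

# First-order kinetics: ln C = ln C0 - k*t, so fit a line to ln(residue)
k = -np.polyfit(days, np.log(residue), 1)[0]   # dissipation rate constant, 1/day
half_life = np.log(2.0) / k                     # days
```

Linearity of ln C versus t is also the usual diagnostic that the dissipation really is first-order before quoting a half-life.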
Comparisons of Wilks’ and Monte Carlo Methods in Response to the 10CFR50.46(c) Proposed Rulemaking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Hongbin; Szilard, Ronaldo; Zou, Ling
The Nuclear Regulatory Commission (NRC) is proposing a new rulemaking on emergency core cooling system/loss-of-coolant accident (LOCA) performance analysis. In the proposed rulemaking, designated as 10CFR50.46(c), the US NRC put forward an equivalent cladding oxidation criterion as a function of cladding pre-transient hydrogen content. The proposed rulemaking imposes more restrictive and burnup-dependent cladding embrittlement criteria; consequently, nearly all the fuel rods in a reactor core need to be analyzed under LOCA conditions to demonstrate compliance with the safety limits. New analysis methods are required to provide a thorough characterization of the reactor core in order to identify the locations of the limiting rods as well as to quantify the safety margins under LOCA conditions. With the new analysis method presented in this work, the limiting transient case and the limiting rods can be easily identified to quantify the safety margins in response to the proposed new rulemaking. In this work, the best-estimate plus uncertainty (BEPU) analysis capability for large break LOCA with the new cladding embrittlement criteria using the RELAP5-3D code is established and demonstrated with a reduced set of uncertainty parameters. Both the direct Monte Carlo method and the Wilks' nonparametric statistical method can be used to perform uncertainty quantification. Wilks' method has become the de facto industry standard to perform uncertainty quantification in BEPU LOCA analyses. Despite its widespread adoption by the industry, the use of small sample sizes to infer statements of compliance with the existing 10CFR50.46 rule has been a major cause of unrealized operational margin in today's BEPU methods. Moreover, the debate on the proper interpretation of the Wilks' theorem in the context of safety analyses is not fully resolved yet, even more than two decades after its introduction in the frame of safety analyses in the nuclear industry.
This represents both a regulatory and application risk in rolling out new methods. With the 10CFR50.46(c) proposed rulemaking, the deficiencies of the Wilks' approach are further exacerbated. The direct Monte Carlo approach offers a robust alternative to perform uncertainty quantification within the context of BEPU analyses. In this work, the Monte Carlo method is compared with the Wilks' method in response to the NRC 10CFR50.46(c) proposed rulemaking.
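The small sample sizes mentioned above come straight from Wilks' formula: for first-order statistics, the largest of n code runs bounds the γ-quantile with confidence β once 1 - γⁿ ≥ β. A minimal sketch (first-order case only; `wilks_n` is a name invented here):

```python
def wilks_n(coverage, confidence):
    """Smallest n such that the maximum of n independent runs bounds the
    'coverage' quantile with probability at least 'confidence'
    (first-order Wilks criterion: 1 - coverage**n >= confidence)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

n_9595 = wilks_n(0.95, 0.95)   # the classic 95/95 criterion
```

This yields the familiar 59 runs for 95/95, versus the thousands of runs a direct Monte Carlo quantile estimate typically uses; the tiny sample is exactly why Wilks-based statements carry so much conservatism, and hence unrealized margin.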
Neutron-Encoded Protein Quantification by Peptide Carbamylation
NASA Astrophysics Data System (ADS)
Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.
2014-01-01
We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.
Cools, Katherine; Terry, Leon A
2012-07-15
Glucosinolates are β-thioglycosides which are found naturally in Cruciferae including the genus Brassica. When enzymatically hydrolysed, glucosinolates yield isothiocyanates and give a pungent taste. Both glucosinolates and isothiocyanates have been linked with anticancer activity as well as antifungal and antibacterial properties, and therefore the quantification of these compounds is scientifically important. A wide range of literature exists on glucosinolates; however, the extraction and quantification procedures differ greatly, resulting in discrepancies between studies. The aim of this study was therefore to compare the most popular extraction procedures to identify the most efficacious method and whether each extraction can also be used for the quantification of total isothiocyanates. Four extraction techniques were compared for the quantification of sinigrin from mustard cv. Centennial (Brassica juncea L.) seed; boiling water, boiling 50% (v/v) aqueous acetonitrile, boiling 100% methanol and 70% (v/v) aqueous methanol at 70 °C. Prior to injection into the HPLC, the extracts obtained with solvents (acetonitrile or methanol) were freeze-dried and resuspended in water. To identify whether the same extract could be used to measure total isothiocyanates, a dichloromethane extraction was carried out on the sinigrin extracts. For the quantification of sinigrin alone, boiling 50% (v/v) acetonitrile was found to be the most efficacious extraction solvent of the four tested, yielding 15% more sinigrin than the water extraction. However, the removal of the acetonitrile by freeze-drying had a negative impact on the isothiocyanate content. Quantification of both sinigrin and total isothiocyanates was possible when the sinigrin was extracted using boiling water. Two columns were compared for the quantification of sinigrin, revealing the Zorbax Eclipse to be the best column using this particular method. Copyright © 2012 Elsevier B.V. All rights reserved.
2016-04-01
Quantification of VX Nerve Agent in Various Food Matrices by Solid-Phase Extraction Ultra-Performance Liquid Chromatography… The mixed-mode cation exchange (MCX) sorbent and Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) methods were used for…
NASA Astrophysics Data System (ADS)
Cooke, R.; Frisch, B.; Saleem, A.
1991-08-01
The topic of risk to human life is addressed from different viewpoints. The question is raised whether risk assessment is good only as a tool for ranking risk sources or whether it actually yields sensible numbers for estimating the risk of events. Various measures for the quantification of risk (e.g., deaths per million, activity specific hourly mortality rate) are given and their applications are discussed. The implications of the different uses for risk numbers are explained. Known risks of several activities are used as a baseline for the discussion of the selection criteria for an achievable safety goal for manned space flight.
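The two risk measures named above differ only in their denominators, population size versus hours of exposure; a minimal sketch with invented numbers (not from the paper):

```python
def deaths_per_million(deaths, exposed_population):
    """Population-based risk measure: fatalities per million people exposed."""
    return deaths / exposed_population * 1_000_000

def hourly_mortality_rate(deaths, person_hours_of_activity):
    """Activity-specific risk measure: fatalities per person-hour of activity."""
    return deaths / person_hours_of_activity

# Illustrative: 50 deaths among 10 million participants who together
# spent 500 million hours on the activity
dpm = deaths_per_million(50, 10_000_000)          # 5 deaths per million people
hmr = hourly_mortality_rate(50, 500_000_000)      # 1e-7 deaths per hour
```

The choice of denominator matters: an activity can look safe per capita yet risky per hour (or vice versa), which is exactly why the paper asks whether such numbers rank risks meaningfully or merely quantify them.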
Targeted methods for quantitative analysis of protein glycosylation
Goldman, Radoslav; Sanda, Miloslav
2018-01-01
Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218
NASA Astrophysics Data System (ADS)
Cioca, Ionel-Lucian; Moraru, Roland Iosif
2012-10-01
In order to meet statutory requirements concerning the workers health and safety, it is necessary for mine managers within Valea Jiului coal basin in Romania to address the potential for underground fires and explosions and their impact on the workforce and the mine ventilation systems. Highlighting the need for a unified and systematic approach of the specific risks, the authors are developing a general framework for fire/explosion risk assessment in gassy mines, based on the quantification of the likelihood of occurrence and gravity of the consequences of such undesired events and employing Root-Cause analysis method. It is emphasized that even a small fire should be regarded as being a major hazard from the point of view of explosion initiation, should a combustible atmosphere arise. The developed methodology, for the assessment of underground fire and explosion risks, is based on the known underground explosion hazards, fire engineering principles and fire test criteria for potentially combustible materials employed in mines.
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2014-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162.
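The conversion-factor arithmetic described in this abstract can be sketched as follows. This is an illustrative calculation only (the function name and the copy-number inputs are hypothetical), assuming the common definition GMO % = (GM-event copies / taxon-specific reference copies) / Cf × 100:

```python
def gmo_percent(gm_copies, reference_copies, cf):
    """Estimate GMO content (%) from event-specific and taxon-specific
    qPCR copy numbers using an instrument-specific conversion factor Cf
    (the study reports Cf = 0.697 for the ABI7900 and 0.635 for the
    ABI7500)."""
    return (gm_copies / reference_copies) / cf * 100.0

# e.g. 697 MIR162 copies against 100,000 reference copies on the ABI7900
content = gmo_percent(697, 100_000, 0.697)
```

With these example inputs, the copy-number ratio 0.00697 divided by Cf = 0.697 gives a GMO content of 1%.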
Image-guided spatial localization of heterogeneous compartments for magnetic resonance
An, Li; Shen, Jun
2015-01-01
Purpose: Image-guided SPectral Localization Achieved by Sensitivity Heterogeneity (SPLASH) allows rapid measurement of signals from irregularly shaped anatomical compartments without using phase encoding gradients. Here, the authors propose a novel method to address the issue of heterogeneous signal distribution within the localized compartments. Methods: Each compartment was subdivided into multiple subcompartments, and their spectra were solved for by Tikhonov regularization to enforce smoothness within each compartment. The spectrum of a given compartment was generated by combining the spectra of its subcompartments. The proposed method was first tested using Monte Carlo simulations and then applied to reconstructing in vivo spectra from irregularly shaped ischemic stroke and normal tissue compartments. Results: Monte Carlo simulations demonstrate that the proposed regularized SPLASH method significantly reduces localization and metabolite quantification errors. In vivo results show that the intracompartment regularization results in ∼40% reduction of error in metabolite quantification. Conclusions: The proposed method significantly reduces localization and metabolite quantification errors caused by intracompartment heterogeneous signal distribution. PMID:26328977
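The regularized inversion behind this kind of reconstruction can be illustrated with a generic Tikhonov solve. This is a minimal sketch of the standard closed form, not the authors' implementation; `A`, `b`, `L` and the regularization weight are placeholders:

```python
import numpy as np

def tikhonov(A, b, L, lam):
    """Tikhonov-regularized least squares: minimize
    ||A x - b||^2 + lam * ||L x||^2 via the normal equations
    (A^T A + lam L^T L) x = A^T b.  L is often the identity or a
    difference (smoothness) operator."""
    return np.linalg.solve(A.T @ A + lam * L.T @ L, A.T @ b)
```

Choosing `L` as a difference operator penalizes jumps between neighboring unknowns, which is the smoothness-enforcing role regularization plays in the abstract above.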
Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.
Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei
2017-09-01
Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
2013-01-01
Background T2-weighted cardiovascular magnetic resonance (CMR) is clinically-useful for imaging the ischemic area-at-risk and amount of salvageable myocardium in patients with acute myocardial infarction (MI). However, to date, quantification of oedema is user-defined and potentially subjective. Methods We describe a highly automatic framework for quantifying myocardial oedema from bright blood T2-weighted CMR in patients with acute MI. Our approach retains user input (i.e. clinical judgment) to confirm the presence of oedema on an image which is then subjected to an automatic analysis. The new method was tested on 25 consecutive acute MI patients who had a CMR within 48 hours of hospital admission. Left ventricular wall boundaries were delineated automatically by variational level set methods followed by automatic detection of myocardial oedema by fitting a Rayleigh-Gaussian mixture statistical model. These data were compared with results from manual segmentation of the left ventricular wall and oedema, the current standard approach. Results The mean perpendicular distances between automatically detected left ventricular boundaries and corresponding manual delineated boundaries were in the range of 1-2 mm. Dice similarity coefficients for agreement (0=no agreement, 1=perfect agreement) between manual delineation and automatic segmentation of the left ventricular wall boundaries and oedema regions were 0.86 and 0.74, respectively. Conclusion Compared to standard manual approaches, the new highly automatic method for estimating myocardial oedema is accurate and straightforward. It has potential as a generic software tool for physicians to use in clinical practice. PMID:23548176
Extraction and quantification of adenosine triphosphate in mammalian tissues and cells.
Chida, Junji; Kido, Hiroshi
2014-01-01
Adenosine 5'-triphosphate (ATP) is the "energy currency" of organisms and plays central roles in bioenergetics, whereby its level is used to evaluate cell viability, proliferation, death, and energy transmission. In this chapter, we describe an improved and efficient method for extraction of ATP from tissues and cells using phenol-based reagents. The chaotropic extraction reagents reported so far co-precipitate ATP with insoluble proteins during extraction and with salts during neutralization. In comparison, the phenol-based reagents extract ATP well without the risks of co-precipitation. The extracted ATP can be quantified by the luciferase assay or high-performance liquid chromatography.
Development and evaluation of a technique for in vivo monitoring of 60Co in human liver
NASA Astrophysics Data System (ADS)
Gomes, GH; Silva, MC; Mello, JQ; Dantas, ALA; Dantas, BM
2018-03-01
60Co is an artificial radioactive metal produced by activation of iron with neutrons. It decays by beta particles and gamma radiation and represents a risk of internal exposure of workers involved in the maintenance of nuclear power reactors. Intakes can be quantified through in vivo monitoring. This work describes the development of a technique for the quantification of 60Co in human liver. The sensitivity of the method is evaluated based on the minimum detectable effective doses. The results allow to state that the technique is suitable either for monitoring of occupational exposures or evaluation of accidental intakes.
Monjure, C. J.; Tatum, C. D.; Panganiban, A. T.; Arainga, M.; Traina-Dorge, V.; Marx, P. A.; Didier, E. S.
2014-01-01
Introduction Quantification of plasma viral load (PVL) is used to monitor disease progression in SIV-infected macaques. This study was aimed at optimizing of performance characteristics of the quantitative PCR (qPCR) PVL assay. Methods The PVL quantification procedure was optimized by inclusion of an exogenous control Hepatitis C Virus armored RNA (aRNA), a plasma concentration step, extended digestion with proteinase K, and a second RNA elution step. Efficiency of viral RNA (vRNA) extraction was compared using several commercial vRNA extraction kits. Various parameters of qPCR targeting the gag region of SIVmac239, SIVsmE660 and the LTR region of SIVagmSAB were also optimized. Results Modifications of the SIV PVL qPCR procedure increased vRNA recovery, reduced inhibition and improved analytical sensitivity. The PVL values determined by this SIV PVL qPCR correlated with quantification results of SIV-RNA in the same samples using the “industry standard” method of branched-DNA (bDNA) signal amplification. Conclusions Quantification of SIV genomic RNA in plasma of rhesus macaques using this optimized SIV PVL qPCR is equivalent to the bDNA signal amplification method, less costly and more versatile. Use of heterologous aRNA as an internal control is useful for optimizing performance characteristics of PVL qPCRs. PMID:24266615
Lowe, Ross H.; Karschner, Erin L.; Schwilke, Eugene W.; Barnes, Allan J.; Huestis, Marilyn A.
2009-01-01
A two-dimensional (2D) gas chromatography/electron impact-mass spectrometry (GC/EI-MS) method for simultaneous quantification of Δ9-tetrahydrocannabinol (THC), 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC), and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH) in human plasma was developed and validated. The method employs 2D capillary GC and cryofocusing for enhanced resolution and sensitivity. THC, 11-OH-THC, and THCCOOH were extracted by precipitation with acetonitrile followed by solid-phase extraction. GC separation of trimethylsilyl derivatives of analytes was accomplished with two capillary columns in series coupled via a pneumatic Deans switch system. Detection and quantification were accomplished with a bench-top single quadrupole mass spectrometer operated in electron impact-selected ion monitoring mode. Limits of quantification (LOQ) were 0.125, 0.25 and 0.125 ng/mL for THC, 11-OH-THC, and THCCOOH, respectively. Accuracy ranged from 86.0 to 113.0% for all analytes. Intra- and inter-assay precision, as percent relative standard deviation, was less than 14.1% for THC, 11-OH-THC, and THCCOOH. The method was successfully applied to quantification of THC and its 11-OH-THC and THCCOOH metabolites in plasma specimens following controlled administration of THC. PMID:17640656
Quantification of DNA using the luminescent oxygen channeling assay.
Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S
2000-09-01
Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI(TM)) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
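The linear relationship between threshold cycle and log target number that the real-time format exploits is the usual qPCR standard curve. A sketch of fitting it by ordinary least squares (function name and data are illustrative, not from the paper):

```python
import math

def fit_standard_curve(copies, ct):
    """Fit Ct = m*log10(N) + b by ordinary least squares and return
    (slope, intercept, efficiency), where efficiency = 10**(-1/m) - 1
    (~1.0 for perfect doubling per cycle, i.e. slope ≈ -3.32)."""
    x = [math.log10(n) for n in copies]
    n = len(x)
    mx, my = sum(x) / n, sum(ct) / n
    m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, ct)) / \
        sum((xi - mx) ** 2 for xi in x)
    b = my - m * mx
    return m, b, 10 ** (-1 / m) - 1
```

An unknown sample's copy number is then recovered by inverting the fitted line: N = 10**((Ct - b) / m).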
Dapic, Irena; Kobetic, Renata; Brkljacic, Lidija; Kezic, Sanja; Jakasa, Ivone
2018-02-01
The free fatty acids (FFAs) are one of the major components of the lipids in the stratum corneum (SC), the uppermost layer of the skin. Relative composition of FFAs has been proposed as a biomarker of the skin barrier status in patients with atopic dermatitis (AD). Here, we developed an LC-ESI-MS/MS method for simultaneous quantification of a range of FFAs with long and very long chain length in the SC collected by adhesive tape (D-Squame). The method, based on derivatization with 2-bromo-1-methylpyridinium iodide and 3-carbinol-1-methylpyridinium iodide, allowed highly sensitive detection and quantification of FFAs using multiple reaction monitoring. For the quantification, we applied a surrogate analyte approach and internal standardization using isotope labeled derivatives of FFAs. Adhesive tapes showed the presence of several FFAs, which are also present in the SC, a problem encountered in previous studies. Therefore, the levels of FFAs in the SC were corrected using C12:0, which was present on the adhesive tape, but not detected in the SC. The method was applied to SC samples from patients with atopic dermatitis and healthy subjects. Quantification using multiple reaction monitoring allowed sufficient sensitivity to analyze FFAs of chain lengths C16-C28 in the SC collected on only one tape strip. Copyright © 2017 John Wiley & Sons, Ltd.
Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard
2016-01-01
Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high priority fragment ions these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e. the SWATH Gold Standard), indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the area of the Top3-5 NOFIs produces similar coefficients of variation as compared to the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80 against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
A method to characterize the roughness of 2-D line features: recrystallization boundaries.
Sun, J; Zhang, Y B; Dahl, A B; Conradsen, K; Juul Jensen, D
2017-03-01
A method is presented, which allows quantification of the roughness of nonplanar boundaries of objects for which the neutral plane is not known. The method provides quantitative descriptions of both the local and global characteristics. How the method can be used to estimate the sizes of rough features and local curvatures is also presented. The potential of the method is illustrated by quantification of the roughness of two recrystallization boundaries in a pure Al specimen characterized by scanning electron microscopy. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
Quantifying construction and demolition waste: an analytical review.
Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen
2014-09-01
Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier
2012-02-23
Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW=872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs such as the small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide with high yield. Quantification was achieved by LC-MS in positive full-scan mode on a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a calibration range from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to allow analysis of samples at higher concentrations of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and its IS was demonstrated under all conditions the samples would experience during a real analysis. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.
Zhu, Haitao; Nie, Binbin; Liu, Hua; Guo, Hua; Demachi, Kazuyuki; Sekino, Masaki; Shan, Baoci
2016-05-01
Phase map cross-correlation detection and quantification can highlight signal at superparamagnetic iron oxide nanoparticles and distinguish them from other hypointensities. The method quantifies susceptibility change by performing least squares analysis between a theoretically generated magnetic field template and an experimentally scanned phase image. Because characteristic phase recognition requires the removal of phase wrap and phase background, the additional steps of phase unwrapping and filtering increase the chance of computing error and enlarge the inconsistency among algorithms. To solve this problem, a phase gradient cross-correlation and quantification method was developed that recognizes the characteristic phase gradient pattern instead of the phase image, because the phase gradient operation inherently includes unwrapping and filtering functions. However, few studies have mentioned the detectable limit of currently used phase gradient calculation algorithms. This limit may lead to an underestimation of large magnetic susceptibility changes caused by high-concentration iron accumulation. In this study, mathematical derivation establishes the maximum detectable phase gradient calculated by the differential chain algorithm in both the spatial and Fourier domains. To break through this limit, a modified quantification method is proposed that uses unwrapped forward differentiation for phase gradient generation. The method enlarges the detectable range of phase gradient measurement and avoids the underestimation of magnetic susceptibility. Simulation and phantom experiments were used to quantitatively compare the different methods. For in vivo application, MRI scanning was performed on nude mice implanted with iron-labeled human cancer cells. The results validate the limit of detectable phase gradient and the consequent susceptibility underestimation, and demonstrate the advantage of unwrapped forward differentiation over differential chain algorithms for susceptibility quantification at high-concentration iron accumulation. Copyright © 2015 Elsevier Inc. All rights reserved.
Wang, Tao; Liu, Tingting; Wang, Zejian; Tian, Xiwei; Yang, Yi; Guo, Meijin; Chu, Ju; Zhuang, Yingping
2016-05-01
Rapid and real-time lipid determination can provide valuable information for process regulation and optimization in algal lipid mass production. In this study, a rapid, accurate and precise quantification method for in vivo cellular lipids of Chlorella protothecoides using low-field nuclear magnetic resonance (LF-NMR) was developed. LF-NMR was extremely sensitive to the algal lipids, with limits of detection (LOD) of 0.0026 g and 0.32 g/L in dry lipid samples and algal broth, respectively, and limits of quantification (LOQ) of 0.0093 g and 1.18 g/L. Moreover, the LF-NMR signal was specifically proportional to the cellular lipids of C. protothecoides; superior regression curves were thus obtained over a wide detection range, from 0.02 to 0.42 g for dry lipids and from 1.12 to 8.97 g/L of lipid concentration for in vivo lipid quantification, all with R² higher than 0.99, irrespective of variations in lipid content and fatty acid profile. The accuracy of this novel method was further verified by comparing its lipid quantification results to those obtained by GC-MS, and the relative standard deviation (RSD) of the LF-NMR results was smaller than 2%, demonstrating the precision of the method. Finally, this method was successfully used for on-line lipid monitoring during algal lipid fermentation processes, enabling better understanding of the lipid accumulation mechanism and dynamic bioprocess control. Copyright © 2016 Elsevier B.V. All rights reserved.
Elmer-Dixon, Margaret M; Bowler, Bruce E
2018-05-19
A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy sample. Mie scattering data from absorbance measurements are used as input into a Matlab program to calculate the total vesicle concentration and the concentrations of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.
Mota, Maria Fernanda S; Souza, Marcella F; Bon, Elba P S; Rodrigues, Marcoaurelio A; Freitas, Suely Pereira
2018-05-24
The use of colorimetric methods for protein quantification in microalgae is hindered by their elevated amounts of membrane-embedded intracellular proteins. In this work, the protein content of three species of microalgae was determined by the Lowry method after the cells were dried, ball-milled, and treated with the detergent sodium dodecyl sulfate (SDS). Results demonstrated that the association of milling and SDS treatment resulted in a 3- to 7-fold increase in protein quantification. Milling promoted microalgal disaggregation and cell wall disruption enabling access of the SDS detergent to the microalgal intracellular membrane proteins and their efficient solubilization and quantification. © 2018 Phycological Society of America.
Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md; Singh Rawat, Ajay Kumar; Srivastava, Sharad
2017-10-01
Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to its high content of colchicine alkaloids. This study aimed to develop an easy, cheap, precise, and accurate high-performance thin-layer chromatographic (HPTLC) validated method for simultaneous quantification of the bioactive alkaloids colchicine and gloriosine in G. superba L. and to identify its elite chemotype(s) from the Sikkim Himalayas (India). The HPTLC method was developed using a mobile phase of chloroform:acetone:diethylamine (5:4:1) at a λmax of 350 nm. Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the contents of colchicine (Rf: 0.72) and gloriosine (Rf: 0.61) vary from 0.035% to 0.150% and from 0.006% to 0.032% (dry wt. basis), respectively. Linearity of the method was obtained in the concentration range of 100-400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recoveries of 97.79% ± 3.86% and 100.023% ± 0.01%, respectively. The limits of detection and quantification were determined as 6.245 and 18.926 ng for colchicine and 8.024 and 24.316 ng for gloriosine, respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. The developed method is validated in terms of accuracy, recovery, and precision as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant for exploring chemotypic variability in metabolite content for commercial and medicinal purposes.
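Limit-of-detection and limit-of-quantification figures of the kind reported above are conventionally estimated from the calibration curve; a minimal sketch, assuming the common ICH-style 3.3·σ/S and 10·σ/S formulas (which the abstract itself does not spell out):

```python
def lod_loq(sigma, slope):
    """ICH-style estimates from a calibration curve:
    LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is the
    standard deviation of the response and S the calibration slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope
```

These formulas fix the ratio LOQ/LOD at 10/3.3 ≈ 3.03, which is consistent with the reported pairs (18.926/6.245 and 24.316/8.024).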
Neiens, Patrick; De Simone, Angela; Ramershoven, Anna; Höfner, Georg; Allmendinger, Lars; Wanner, Klaus T
2018-03-03
MS Binding Assays represent a label-free alternative to radioligand binding assays. In this study, we present an LC-ESI-MS/MS method for the quantification of (R,R)-4-(2-benzhydryloxyethyl)-1-(4-fluorobenzyl)piperidin-3-ol [(R,R)-D-84, (R,R)-1], (S,S)-reboxetine [(S,S)-2], and (S)-citalopram [(S)-3] employed as highly selective nonlabeled reporter ligands in MS Binding Assays addressing the dopamine [DAT, (R,R)-D-84], norepinephrine [NET, (S,S)-reboxetine] and serotonin transporter [SERT, (S)-citalopram], respectively. The developed LC-ESI-MS/MS method uses a pentafluorphenyl stationary phase in combination with a mobile phase composed of acetonitrile and ammonium formate buffer for chromatography and a triple quadrupole mass spectrometer in the multiple reaction monitoring mode for mass spectrometric detection. Quantification is based on deuterated derivatives of all three analytes serving as internal standards. The established LC-ESI-MS/MS method enables fast, robust, selective and highly sensitive quantification of all three reporter ligands in a single chromatographic run. The method was validated according to the Center for Drug Evaluation and Research (CDER) guideline for bioanalytical method validation regarding selectivity, accuracy, precision, calibration curve and sensitivity. Finally, filtration-based MS Binding Assays were performed for all three monoamine transporters based on this LC-ESI-MS/MS quantification method as read out. The affinities determined in saturation experiments for (R,R)-D-84 toward hDAT, for (S,S)-reboxetine toward hNET, and for (S)-citalopram toward hSERT, respectively, were in good accordance with results from literature, clearly demonstrating that the established MS Binding Assays have the potential to be an efficient alternative to radioligand binding assays widely used for this purpose so far. Copyright © 2018 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
He, Jingjing; Wang, Dengjiang; Zhang, Weifang
2015-03-01
This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufactures and under different loading conditions.
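Probability-of-detection curves of the kind constructed above are commonly modeled as log-logistic functions of flaw size. A hedged sketch follows; the functional form and parameters are generic assumptions for illustration, not the paper's fitted model:

```python
import math

def pod(a, mu, sigma):
    """Log-logistic POD curve: probability of detecting a crack of
    size a, where mu is the log of the size detected 50% of the time
    and sigma controls the steepness of the transition."""
    return 1.0 / (1.0 + math.exp(-(math.log(a) - mu) / sigma))
```

At a = exp(mu) the curve passes through 0.5; reliability targets such as the size detected with 90% probability are read off the fitted curve, usually with confidence bounds.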
Understanding Pre-Quantitative Risk in Projects
NASA Technical Reports Server (NTRS)
Cooper, Lynne P.
2011-01-01
Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.
Yang, Meng; Qian, Xin; Zhang, Yuchao; Sheng, Jinbao; Shen, Dengle; Ge, Yi
2011-01-01
Approximately 30,000 dams in China are aging and are considered to be high-level risks. Developing a framework for analyzing spatial multicriteria flood risk is crucial to ranking management scenarios for these dams, especially in densely populated areas. Based on the theories of spatial multicriteria decision analysis, this report generalizes a framework consisting of scenario definition, problem structuring, criteria construction, spatial quantification of criteria, criteria weighting, decision rules, sensitivity analyses, and scenario appraisal. The framework is presented in detail by using a case study to rank dam rehabilitation, decommissioning and existing-condition scenarios. The results show that there was a serious inundation, and that a dam rehabilitation scenario could reduce the multicriteria flood risk by 0.25 in the most affected areas; this indicates a mean risk decrease of less than 23%. Although increased risk (<0.20) was found for some residential and commercial buildings, if the dam were to be decommissioned, the mean risk would not be greater than the current existing risk, indicating that the dam rehabilitation scenario had a higher rank for decreasing the flood risk than the decommissioning scenario, but that dam rehabilitation alone might be of little help in abating flood risk. With adjustments and improvement to the specific methods (according to the circumstances and available data) this framework may be applied to other sites. PMID:21655125
Smith, Jim T
2007-01-01
Background Following a nuclear incident, the communication and perception of radiation risk becomes a (perhaps the) major public health issue. In response to such incidents it is therefore crucial to communicate radiation health risks in the context of other more common environmental and lifestyle risk factors. This study compares the risk of mortality from past radiation exposures (to people who survived the Hiroshima and Nagasaki atomic bombs and those exposed after the Chernobyl accident) with risks arising from air pollution, obesity and passive and active smoking. Methods A comparative assessment of mortality risks from ionising radiation was carried out by estimating radiation risks for realistic exposure scenarios and assessing those risks in comparison with risks from air pollution, obesity and passive and active smoking. Results The mortality risk to populations exposed to radiation from the Chernobyl accident may be no higher than that for other more common risk factors such as air pollution or passive smoking. Radiation exposures experienced by the most exposed group of survivors of Hiroshima and Nagasaki led to an average loss of life expectancy significantly lower than that caused by severe obesity or active smoking. Conclusion Population-averaged risks from exposures following major radiation incidents are clearly significant, but may be no greater than those from other much more common environmental and lifestyle factors. This comparative analysis, whilst highlighting inevitable uncertainties in risk quantification and comparison, helps place the potential consequences of radiation exposures in the context of other public health risks. PMID:17407581
Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip
2007-08-01
The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on assessment of biofilm production by staphylococci, gained both by direct experience as well as by analysis of methods for assaying biofilm production. The obtained results should simplify quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.
On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification
Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.
2014-01-01
Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantifications in phantom and ex vivo acquisitions. PMID:24123362
NASA Astrophysics Data System (ADS)
Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène
2015-11-01
Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole-metabolite quantification approach, the quantification performance of correlation spectroscopy is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable computation of the lower bound on error variance - generally known as the Cramér-Rao bounds (CRBs), a standard of precision - for the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of two per unit of acquisition time compared to JPRESS. A rapid analysis could suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
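The Cramér-Rao machinery invoked above is generic: for a signal model with additive Gaussian noise, the bound follows directly from the Fisher information matrix. A minimal sketch (the mono-exponential decay model, parameter values and noise level are illustrative only, not those of the 2D MRS study):

```python
import numpy as np

def cramer_rao_bounds(jacobian, noise_sigma):
    """Lower bounds on estimator standard deviations from the Fisher matrix.

    jacobian: (n_points, n_params) derivatives of the model signal with
    respect to each parameter, evaluated at the true parameter values.
    noise_sigma: standard deviation of additive Gaussian noise.
    """
    # Fisher information for i.i.d. Gaussian noise: F = J^T J / sigma^2
    fisher = jacobian.T @ jacobian / noise_sigma**2
    # CRBs are the diagonal of the inverse Fisher matrix
    return np.sqrt(np.diag(np.linalg.inv(fisher)))

# Toy example: mono-exponential decay  s(t) = a * exp(-t / T2)
t = np.linspace(0.01, 0.3, 64)
a, T2 = 1.0, 0.08
J = np.column_stack([
    np.exp(-t / T2),                      # ds/da
    a * t / T2**2 * np.exp(-t / T2),      # ds/dT2
])
bounds = cramer_rao_bounds(J, noise_sigma=0.01)
print(bounds)  # minimum achievable std. dev. for (a, T2)
```

Comparing such bounds across sequences (e.g. normalized by concentration) is the kind of relative-CRB analysis the abstract refers to.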
Ott, Stephan J; Musfeldt, Meike; Ullmann, Uwe; Hampe, Jochen; Schreiber, Stefan
2004-06-01
The composition of the human intestinal flora is important for the health status of the host. The global composition and the presence of specific pathogens are relevant to the effects of the flora. Therefore, accurate quantification of all major bacterial populations of the enteric flora is needed. A TaqMan real-time PCR-based method for the quantification of 20 dominant bacterial species and groups of the intestinal flora has been established on the basis of 16S ribosomal DNA taxonomy. A PCR with conserved primers was used for all reactions. In each real-time PCR, a universal probe for quantification of total bacteria and a specific probe for the species in question were included. PCR with conserved primers and the universal probe for total bacteria allowed relative and absolute quantification. Minor groove binder probes increased the sensitivity of the assays 10- to 100-fold. The method was evaluated by cross-reaction experiments and quantification of bacteria in complex clinical samples from healthy patients. A sensitivity of 10¹ to 10³ bacterial cells per sample was achieved. No significant cross-reaction was observed. The real-time PCR assays presented may facilitate understanding of the intestinal bacterial flora through a normalized global estimation of the major contributing species.
Sánchez-Guijo, Alberto; Oji, Vinzenz; Hartmann, Michaela F.; Traupe, Heiko; Wudy, Stefan A.
2015-01-01
Steroids are primarily present in human fluids in their sulfated forms. Profiling of these compounds is important from both diagnostic and physiological points of view. Here, we present a novel method for the quantification of 11 intact steroid sulfates in human serum by LC-MS/MS. The compounds analyzed in our method, some of which are quantified for the first time in blood, include cholesterol sulfate, pregnenolone sulfate, 17-hydroxy-pregnenolone sulfate, 16-α-hydroxy-dehydroepiandrosterone sulfate, dehydroepiandrosterone sulfate, androstenediol sulfate, androsterone sulfate, epiandrosterone sulfate, testosterone sulfate, epitestosterone sulfate, and dihydrotestosterone sulfate. The assay was conceived to quantify sulfated steroids in a broad range of concentrations, requiring only 300 μl of serum. The method has been validated and its performance was studied at three quality controls, selected for each compound according to its physiological concentration. The assay showed good linearity (R2 > 0.99) and recovery for all the compounds, with limits of quantification ranging between 1 and 80 ng/ml. Averaged intra-day and between-day precisions (coefficient of variation) and accuracies (relative errors) were below 10%. The method has been successfully applied to study the sulfated steroidome in diseases such as steroid sulfatase deficiency, proving its diagnostic value. This is, to our best knowledge, the most comprehensive method available for the quantification of sulfated steroids in human blood. PMID:26239050
Leveraging transcript quantification for fast computation of alternative splicing profiles.
Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo
2015-09-01
Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA accuracy is comparable and sometimes superior to standard methods using simulated as well as real RNA-sequencing data compared with experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
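The relative inclusion values that tools like SUPPA derive from transcript quantification can be illustrated with the standard percent-spliced-in (PSI) definition: the summed abundance of transcripts containing the inclusion form of an event, divided by the summed abundance of all transcripts defining the event. This sketch is a generic illustration, not SUPPA's actual implementation, and the transcript IDs are hypothetical:

```python
def psi(tpm, inclusion_ids, event_ids):
    """Percent-spliced-in for one event from transcript abundances (TPM).

    tpm: dict mapping transcript ID -> TPM value.
    inclusion_ids: transcripts containing the inclusion form of the event.
    event_ids: all transcripts defining the event (inclusion + skipping).
    """
    inc = sum(tpm[t] for t in inclusion_ids)
    tot = sum(tpm[t] for t in event_ids)
    return inc / tot if tot > 0 else float("nan")

# Hypothetical two-isoform event: one inclusion, one skipping transcript
tpm = {"tx_inc": 30.0, "tx_skip": 10.0}
print(psi(tpm, ["tx_inc"], ["tx_inc", "tx_skip"]))  # 0.75
```

Because the expensive step (transcript quantification) is done once, PSI values for many events reduce to cheap sums and ratios, which is the source of the speed advantage described above.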
miR-MaGiC improves quantification accuracy for small RNA-seq.
Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina
2018-05-15
Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more optimal quantification to a more meaningful unit (i.e., miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I
2017-01-20
There are numerous applications of quantitative PCR in both diagnostics and basic research. As in many other techniques, the basis of quantification is that comparisons are made between different (unknown and known, or reference) specimens of the same entity. When the aim is to compare real quantities of different species in samples, their separate, precise absolute quantification cannot be avoided. We have established a simple and reliable method for this purpose (the Ct shift method), which combines the absolute and the relative approach. It requires a plasmid standard containing both sequences of the amplicons to be compared (e.g. the target of interest and the endogenous control). This standard can serve as a reference sample with equal template copies for both targets. Using the ΔΔCt formula, we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied to transposon gene copy measurements, as well as to comparison of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of the results, even without the need for real reference samples, can contribute to the universality of the method and the comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
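The ΔΔCt arithmetic underlying such a scheme can be sketched in a few lines. Here the equimolar plasmid standard supplies the calibrating ΔCt between the two assays; the amplification efficiency of 2 (perfect doubling per cycle) and the Ct values are illustrative assumptions, not data from the study:

```python
def ct_shift_ratio(ct_target_sample, ct_control_sample,
                   ct_target_plasmid, ct_control_plasmid,
                   efficiency=2.0):
    """Copy ratio of target to control in a sample, calibrated against a
    plasmid standard carrying equal copies of both amplicons.

    The plasmid ΔCt absorbs assay-to-assay differences, so the remaining
    shift reflects the true template ratio in the unknown sample.
    """
    ddct = ((ct_target_sample - ct_control_sample)
            - (ct_target_plasmid - ct_control_plasmid))
    return efficiency ** (-ddct)

# If the target crosses threshold 2 cycles later than expected from the
# equimolar plasmid, it is present at 1/4 the copy number of the control.
print(ct_shift_ratio(26.0, 24.0, 20.0, 20.0))  # 0.25
```

With real assays the per-target efficiencies would also have to be determined and substituted for the ideal value of 2.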
Methods to Detect Nitric Oxide and its Metabolites in Biological Samples
Bryan, Nathan S.; Grisham, Matthew B.
2007-01-01
Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes including regulation of blood pressure, immune response and neural communication. Therefore its accurate detection and quantification are critical to understanding health and disease. Due to the extremely short physiological half-life of this gaseous free radical, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole-body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated that allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The methods described in this review are not an exhaustive or comprehensive discussion of all methods available for the detection of NO, but rather a description of the most commonly used and practical methods that allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129
MRI-based methods for quantification of the cerebral metabolic rate of oxygen
Rodgers, Zachary B; Detre, John A
2016-01-01
The brain depends almost entirely on oxidative metabolism to meet its significant energy requirements. As such, the cerebral metabolic rate of oxygen (CMRO2) represents a key measure of brain function. Quantification of CMRO2 has helped elucidate brain functional physiology and holds potential as a clinical tool for evaluating neurological disorders including stroke, brain tumors, Alzheimer’s disease, and obstructive sleep apnea. In recent years, a variety of magnetic resonance imaging (MRI)-based CMRO2 quantification methods have emerged. Unlike positron emission tomography – the current “gold standard” for measurement and mapping of CMRO2 – MRI is non-invasive, relatively inexpensive, and ubiquitously available in modern medical centers. All MRI-based CMRO2 methods are based on modeling the effect of paramagnetic deoxyhemoglobin on the magnetic resonance signal. The various methods can be classified in terms of the MRI contrast mechanism used to quantify CMRO2: T2*, T2′, T2, or magnetic susceptibility. This review article provides an overview of MRI-based CMRO2 quantification techniques. After a brief historical discussion motivating the need for improved CMRO2 methodology, current state-of-the-art MRI-based methods are critically appraised in terms of their respective tradeoffs between spatial resolution, temporal resolution, and robustness, all of critical importance given the spatially heterogeneous and temporally dynamic nature of brain energy requirements. PMID:27089912
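The quantity being measured rests on the Fick principle: CMRO2 equals cerebral blood flow times arterial oxygen content times the oxygen extraction fraction. A small worked example with typical physiological values (the numbers are illustrative, not taken from any particular MRI method in the review, and dissolved plasma oxygen is neglected):

```python
def cmro2(cbf_ml_per_100g_min, hgb_g_dl, sao2, svo2):
    """CMRO2 via the Fick principle, in umol O2 / 100 g / min.

    Arterial O2 content is approximated as 1.34 mL O2 per g hemoglobin
    times arterial saturation; 1 mL O2 ~ 44.6 umol at standard conditions.
    """
    cao2 = 1.34 * hgb_g_dl / 100.0 * sao2   # mL O2 per mL blood
    oef = (sao2 - svo2) / sao2              # oxygen extraction fraction
    return cbf_ml_per_100g_min * cao2 * oef * 44.6

# Typical values: CBF 50 mL/100g/min, Hgb 14 g/dL, SaO2 0.98, SvO2 0.60
print(round(cmro2(50.0, 14.0, 0.98, 0.60), 1))  # roughly 160 umol/100g/min
```

The MRI methods surveyed differ mainly in how they estimate the venous saturation term from deoxyhemoglobin-sensitive contrast; the Fick combination step itself is common to all of them.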
Jensen, Jacob S; Egebo, Max; Meyer, Anne S
2008-05-28
Fast tannin measurement is receiving increased interest, as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interference from the spectral responses of other wine components. Four different variable selection tools were investigated for identifying the most important spectral regions that would allow quantification of tannins from the spectra using partial least-squares (PLS) regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm⁻¹), which were therefore concluded to be particularly important for tannin quantification. The spectral regions identified by the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
Barricklow, Jason; Ryder, Tim F; Furlong, Michael T
2009-08-01
During LC-MS/MS quantification of a small molecule in human urine samples from a clinical study, an unexpected peak was observed to nearly co-elute with the analyte of interest in many study samples. Improved chromatographic resolution revealed the presence of at least 3 non-analyte peaks, which were identified as cysteine metabolites and N-acetyl (mercapturic acid) derivatives thereof. These metabolites produced artifact responses in the parent compound MRM channel due to decomposition in the ionization source of the mass spectrometer. Quantitative comparison of the analyte concentrations in study samples using the original chromatographic method and the improved chromatographic separation method demonstrated that the original method substantially over-estimated the analyte concentration in many cases. The substitution of electrospray ionization (ESI) for atmospheric pressure chemical ionization (APCI) nearly eliminated the source instability of these metabolites, which would have mitigated their interference in the quantification of the analyte, even without chromatographic separation. These results 1) demonstrate the potential for thiol metabolite interferences during the quantification of small molecules in pharmacokinetic samples, and 2) underscore the need to carefully evaluate LC-MS/MS methods for molecules that can undergo metabolism to thiol adducts to ensure that they are not susceptible to such interferences during quantification.
Quantifying errors without random sampling.
Phillips, Carl V; LaPole, Luwanna M
2003-06-12
All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
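The Monte Carlo approach described can be sketched in a few lines: represent each non-sampled input as a distribution encoding its systematic uncertainty, propagate through the calculation, and report percentiles. The input distributions below are invented for illustration and are not the actual foodborne-illness inputs used by the authors:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical calculation: incidence = reported cases * underreporting factor.
# Each input carries non-sampling uncertainty expressed as a distribution.
cases = rng.normal(loc=14_000, scale=1_000, size=n)          # surveillance count
factor = rng.triangular(left=20, mode=38, right=60, size=n)  # expert judgment
incidence = cases * factor

lo, mid, hi = np.percentile(incidence, [2.5, 50, 97.5])
print(f"median {mid:,.0f}, 95% uncertainty interval {lo:,.0f} to {hi:,.0f}")
```

Reporting the resulting interval, rather than a single point estimate, is exactly the more honest representation of study results the abstract argues for.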
Nahar, Limon Khatun; Cordero, Rosa Elena; Nutt, David; Lingford-Hughes, Anne; Turton, Samuel; Durant, Claire; Wilson, Sue; Paterson, Sue
2016-03-01
A highly sensitive and fully validated method was developed for the quantification of baclofen in human plasma. After adjusting the pH of the plasma samples using a phosphate buffer solution (pH 4), baclofen was purified using mixed mode (C8/cation exchange) solid-phase extraction (SPE) cartridges. Endogenous water-soluble compounds and lipids were removed from the cartridges before the samples were eluted and concentrated. The samples were analyzed using triple-quadrupole liquid chromatography-tandem mass spectrometry (LC-MS-MS) with triggered dynamic multiple reaction monitoring mode for simultaneous quantification and confirmation. The assay was linear from 25 to 1,000 ng/mL (r(2) > 0.999; n = 6). Intraday (n = 6) and interday (n = 15) imprecisions (% relative standard deviation) were <5%, and the average recovery was 30%. The limit of detection of the method was 5 ng/mL, and the limit of quantification was 25 ng/mL. Plasma samples from healthy male volunteers (n = 9, median age: 22) given two single oral doses of baclofen (10 and 60 mg) on nonconsecutive days were analyzed to demonstrate method applicability. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products.
2014-01-01
Background To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. Results To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5′-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. Conclusions The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products. PMID:24884946
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
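The global sensitivity analysis step typically rests on variance-based (Sobol) indices, which attribute fractions of output variance to individual inputs. A minimal pick-freeze estimator on a toy model is sketched below; this is a generic illustration, unrelated to the actual scramjet solver or its 24-dimensional parameter space:

```python
import numpy as np

def first_order_sobol(model, n_params, n=20_000, seed=0):
    """Saltelli-style pick-freeze estimate of first-order Sobol indices.

    model: vectorized function mapping an (n, n_params) array of inputs
    in [0, 1] to an (n,) array of outputs.
    """
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, n_params))
    B = rng.uniform(size=(n, n_params))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    indices = []
    for i in range(n_params):
        AB_i = A.copy()
        AB_i[:, i] = B[:, i]          # "freeze" all inputs except the i-th
        indices.append(np.mean(fB * (model(AB_i) - fA)) / var)
    return np.array(indices)

# Toy model: output variance dominated by the first input
model = lambda X: 5 * X[:, 0] + 1 * X[:, 1] + 0.1 * X[:, 2]
s = first_order_sobol(model, 3)
print(s.round(2))  # first index near 0.96, the others near zero
```

Inputs with negligible indices can be fixed at nominal values, which is the stochastic-dimension reduction the abstract refers to.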
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François
2015-08-18
The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Nowadays, analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to direct measurement and protein characterization ability. We developed here an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of the three potential biological warfare agents, ricin, staphylococcal enterotoxin B, and epsilon toxin, in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early in the sample. Lower limits of quantification were determined at or close to 1 ng·mL⁻¹. The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability with no evidence of toxin degradation in milk in a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...
2018-02-09
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Methods of quantitative risk assessment: The case of the propellant supply system
NASA Astrophysics Data System (ADS)
Merz, H. A.; Bienz, A.
1984-08-01
As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered the replacement of the manual supply of propellant hoppers by a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level, where an accidental ignition would no longer lead to a detonation, and this would drastically limit the effects on persons. A quantitative risk assessment of the present and the planned supply system demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively by other safety measures at considerably lower costs. Based on this practical example, the advantages of a strictly quantitative risk assessment for the safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.
Porra, Luke; Swan, Hans; Ho, Chien
2015-08-01
Introduction: Acoustic Radiation Force Impulse (ARFI) Quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of this present study predicted that an increase in applied transducer force would result in an increase in SWVs measured. Methods: ARFI quantification within the left lobe of the liver was performed within a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Significant pairwise comparisons using Bonferroni correction for multiple comparisons showed that with an increase in applied transducer force, there was a decrease in SWVs. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.
Xu, Zhenzhen; Li, Jianzhong; Chen, Ailiang; Ma, Xin; Yang, Shuming
2018-05-03
Retrospectivity (the ability to search raw data retrospectively for a previously unknown compound) is very valuable for food safety and risk assessment when facing newly emerging drugs. Screening based on accurate mass and retention time alone may lead to false positive and false negative results, so a new retrospective, reliable platform is desirable. Different concentration levels of standards with and without matrix were analyzed using ion mobility (IM)-quadrupole-time-of-flight (Q-TOF) to collect retrospective accurate mass, retention time, drift time, and tandem MS evidence for identification in a single experiment. The isomer separation ability of IM and the four-dimensional (4D) feature abundance quantification abilities were evaluated for veterinary drugs for the first time. The sensitivity of the IM-Q-TOF workflow was clearly higher than that of the traditional database searching algorithm [find by formula (FbF) function] for Q-TOF. In addition, the IM-Q-TOF workflow contained most of the results from FbF and removed the false positive results. Some isomers were separated by IM, and the 4D feature abundance quantitation removed interference with similar accurate mass and showed good linearity. A new retrospective, multi-evidence platform was built for veterinary drug screening in a single experiment. The sensitivity was significantly improved and the data can be used for quantification. The platform showed its potential for use in food safety and risk assessment. This article is protected by copyright. All rights reserved.
Kikuchi, Hiroyuki; Tsutsumi, Tomoaki; Matsuda, Rieko
2012-01-01
A method for the quantification of histamine in fish and fish products using tandem solid-phase extraction and fluorescence derivatization with fluorescamine was previously developed. In this study, we improved this analytical method to develop an official test method for quantification of histamine in fish and fish products, and performed a single laboratory study to validate it. Recovery tests of histamine from fillet (Thunnus obesus), and two fish products (fish sauce and salted and dried whole big-eye sardine) that were spiked at the level of 25 and 50 µg/g for T. obesus, and 50 and 100 µg/g for the two fish products, were carried out. The recoveries of histamine from the three samples tested were 88.8-99.6% with good repeatability (1.3-2.1%) and reproducibility (2.1-4.7%). Therefore, this method is acceptable for the quantification of histamine in fish and fish products. Moreover, surveillance of histamine content in food on the market was conducted using this method, and high levels of histamine were detected in some fish products.
Hidau, Mahendra Kumar; Kolluru, Srikanth; Palakurthi, Srinath
2018-02-01
A sensitive and selective RP-HPLC method has been developed and validated for the quantification of a highly potent poly(ADP-ribose) polymerase inhibitor, talazoparib (TZP), in rat plasma. Chromatographic separation was performed with an isocratic elution method. Absorbance of TZP was measured with a UV detector (SPD-20A UV-vis) at a λmax of 227 nm. Protein precipitation was used to extract the drug from plasma samples using methanol-acetonitrile (65:35) as the precipitating solvent. The method proved to be sensitive and reproducible over a 100-2000 ng/mL linearity range with a lower limit of quantification (LLOQ) of 100 ng/mL. TZP recovery was found to be >85%. Following analytical method development and validation, the method was successfully employed to determine the plasma protein binding of TZP. TZP has a high level of protein binding in rat plasma (95.76 ± 0.38%) as determined by the dialysis method. Copyright © 2017 John Wiley & Sons, Ltd.
Begou, O; Kontou, A; Raikos, N; Sarafidis, K; Roilides, E; Papadoyannis, I N; Gika, H G
2017-03-15
The development and validation of an ultra-high pressure liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was performed with the aim of quantifying plasma teicoplanin concentrations in neonates. Pharmacokinetic data on teicoplanin in the neonatal population are very limited; therefore, a sensitive and reliable method for the determination of all isoforms of teicoplanin in a low sample volume is of real importance. Teicoplanin main components were extracted by a simple acetonitrile precipitation step and analysed on a C18 chromatographic column by a triple quadrupole MS with electrospray ionization. The method provides quantitative data over a linear range of 25-6400 ng/mL with an LOD of 8.5 ng/mL and an LOQ of 25 ng/mL for total teicoplanin. The method was applied to plasma samples from neonates to support pharmacokinetic data and proved to be a reliable and fast method for the quantification of teicoplanin concentration levels in the plasma of infants during therapy in the Intensive Care Unit. Copyright © 2016 Elsevier B.V. All rights reserved.
siMS Score: Simple Method for Quantifying Metabolic Syndrome.
Soldatovic, Ivan; Vukovic, Rade; Culafic, Djordje; Gajic, Milan; Dimitrijevic-Sreckovic, Vesna
2016-01-01
To evaluate the siMS score and siMS risk score, novel continuous metabolic syndrome scores, as methods for quantification of metabolic status and risk. The siMS score was calculated using the formula: siMS score = 2*Waist/Height + Gly/5.6 + Tg/1.7 + TAsystolic/130 - HDL/1.02 or 1.28 (for male or female subjects, respectively). The siMS risk score was calculated using the formula: siMS risk score = siMS score * age/45 or 50 (for male or female subjects, respectively) * family history of cardio/cerebro-vascular events (event = 1.2, no event = 1). A sample of 528 obese and non-obese participants was used to validate the siMS score and siMS risk score. Scores calculated as the sum of z-scores (each component of metabolic syndrome regressed with age and gender) and the sum of scores derived from principal component analysis (PCA) were used for evaluation of the siMS score. Variants were made by replacing glucose with HOMA in the calculations. The Framingham score was used for evaluation of the siMS risk score. Correlation of the siMS score with the sum of z-scores and the weighted sum of PCA factors was high (r = 0.866 and r = 0.822, respectively). Correlation between the siMS risk score and the log-transformed Framingham score was medium to high for age groups 18+, 30+ and 35+ (0.835, 0.707 and 0.667, respectively). The siMS score and siMS risk score showed high correlation with more complex scores. The demonstrated accuracy, together with superior simplicity and the ability to evaluate and follow up individual patients, makes the siMS and siMS risk scores very convenient for use in clinical practice and research.
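The two published formulas translate directly into code; a minimal sketch, assuming the conventional units of the components (waist and height in the same units; glucose, triglycerides and HDL in mmol/L; systolic pressure in mmHg):

```python
def sims_score(waist, height, glucose, triglycerides, systolic_bp, hdl, male):
    """siMS score = 2*Waist/Height + Gly/5.6 + Tg/1.7
       + TAsystolic/130 - HDL/1.02 (male) or 1.28 (female)."""
    hdl_ref = 1.02 if male else 1.28
    return (2 * waist / height + glucose / 5.6 + triglycerides / 1.7
            + systolic_bp / 130 - hdl / hdl_ref)

def sims_risk_score(sims, age, male, family_history_event):
    """siMS risk score = siMS * age/45 (male) or age/50 (female)
       * 1.2 if family history of cardio/cerebro-vascular events, else 1."""
    age_ref = 45 if male else 50
    history = 1.2 if family_history_event else 1.0
    return sims * (age / age_ref) * history
```

For a male subject at all the reference values (waist/height = 0.5, glucose 5.6, triglycerides 1.7, systolic 130, HDL 1.02), each term contributes 1 except the HDL term, which subtracts 1, giving a score of 3.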
Liu, K H; Chan, Y L; Chan, J C N; Chan, W B; Kong, M O; Poon, M Y
2005-09-01
Magnetic Resonance Imaging (MRI) is a well-accepted non-invasive method in the quantification of visceral adipose tissue. However, a standard method of measurement has not yet been universally agreed. The objectives of the present study were 2-fold, firstly, to identify the imaging plane in the Chinese population which gives the best correlation with total visceral adipose tissue volume and cardiovascular risk factors; and secondly to compare the correlations between single-slice and multiple-slice approach with cardiovascular risk factors. Thirty-seven Chinese subjects with no known medical history underwent MRI examination for quantifying total visceral adipose tissue volume. The visceral adipose tissue area at five axial imaging levels within abdomen and pelvis were determined. All subjects had blood pressure measured and fasting blood taken for analysis of cardiovascular risk factors. Framingham risk score for each subject was calculated. The imaging plane at the level of 'lower costal margin' (LCM) in both men and women had the highest correlation with total visceral adipose tissue volume (r = 0.97 and 0.99 respectively). The visceral adipose tissue area at specific imaging levels showed higher correlations with various cardiovascular risk factors and Framingham risk score than total visceral adipose tissue volume. The visceral adipose tissue area at 'umbilicus' (UMB) level in men (r = 0.88) and LCM level in women (r = 0.70) showed the best correlation with Framingham risk score. The imaging plane at the level of LCM is preferred for reflecting total visceral adipose tissue volume in Chinese subjects. For investigating the association of cardiovascular risk with visceral adipose tissue in MRI-obesity research, the single-slice approach is superior to the multiple-slice approach, with the level of UMB in men and LCM in women as the preferred imaging planes.
Zhao, Ming; Huang, Run; Peng, Leilei
2012-11-19
Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity in the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium.
Zhao, Ming; Huang, Run; Peng, Leilei
2012-01-01
Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity in the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535
Benami, Maya; Busgang, Allison; Gillor, Osnat; Gross, Amit
2016-08-15
Greywater (GW) reuse can alleviate water stress by lowering freshwater consumption. However, GW contains pathogens that may compromise public health. During the GW-treatment process, bioaerosols can be produced and may be hazardous to human health if inhaled, ingested, or brought into contact with skin. Using air-particle monitoring, a BioSampler®, and settle plates, we sampled bioaerosols emitted from recirculating vertical flow constructed wetlands (RVFCW), a domestic GW-treatment system. An array of pathogens and indicators was monitored using settle plates and by culturing the BioSampler® liquid. Further enumeration of viable pathogens in the BioSampler® liquid utilized a newer method combining the benefits of enrichment with molecular detection (MPN-qPCR). Additionally, quantitative microbial risk assessment (QMRA) was applied to assess risks of infection from a representative skin pathogen, Staphylococcus aureus. According to the settle-plate technique, low amounts (0-9.7×10(4) CFU m(-2) h(-1)) of heterotrophic bacteria, Staphylococcus spp., Pseudomonas spp., Klebsiella pneumoniae, Enterococcus spp., and Escherichia coli were found to aerosolize up to 1 m away from the GW systems. At the 5 m distance, amounts of these bacteria were not statistically different (p>0.05) from background concentrations tested over 50 m away from the systems. Using the BioSampler®, no bacteria were detected before enrichment of the GW-aerosols. However, after enrichment, using an MPN-qPCR technique, viable indicators and pathogens were occasionally detected. Consequently, the QMRA results were below the critical disability-adjusted life year (DALY) safety limits, a measure of overall disease burden, for S. aureus under the tested exposure scenarios. Our study suggests that health risks from aerosolizing pathogens near RVFCW GW-treatment systems are likely low. This study also emphasizes the growing need for standardization of bioaerosol-evaluation techniques to provide more accurate quantification of small amounts of viable, aerosolized bacterial pathogens. Copyright © 2016 Elsevier B.V. All rights reserved.
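QMRA typically combines an exposure estimate with a dose-response model; the exponential model below is a generic illustration of that step, with the dose-response parameter r and the exposure dose as hypothetical placeholders, not values from this study:

```python
import math

def infection_probability(dose_cfu, r):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose_cfu)

def annual_risk(per_event_risk, events_per_year):
    """Risk of at least one infection over repeated independent exposures."""
    return 1.0 - (1.0 - per_event_risk) ** events_per_year

# Placeholder exposure: 10 CFU inhaled per event, r = 1e-4
p = infection_probability(dose_cfu=10.0, r=1e-4)
```

A full assessment would then weight such probabilities by illness severity and duration to obtain the DALY burden compared against the safety limit.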
Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir
2018-05-01
In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR), by overcoming problems related to the PCR inhibition and the requirement of certified reference materials to be used as a calibrant. In theory, validated qPCR methods can be easily transferred to the dPCR platform. However, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods have been verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification." (2011) Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). Digital PCR methods performed equally or better than the qPCR methods. Optimized ddPCR methods confirm their suitability for GMO determination in food and feed.
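As background to the ddPCR quantification discussed above, droplet dPCR converts the fraction of positive droplets into an absolute concentration with a Poisson correction; a minimal sketch, where the ~0.85 nL droplet volume is an assumed, instrument-dependent value rather than a figure from this study:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_ul=0.85e-3):
    """Poisson estimate: with a fraction p of positive droplets,
    mean copies per droplet is -ln(1 - p), so concentration is
    -ln(1 - p) / droplet_volume (copies per µL)."""
    p = positive / total
    return -math.log(1.0 - p) / droplet_volume_ul
```

This correction accounts for droplets that received more than one target molecule, which is why dPCR needs no external calibrant, one of the advantages over qPCR noted in the abstract.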
Rashed-Ul Islam, S M; Jahan, Munira; Tabassum, Shahina
2015-01-01
Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected in 81% of samples by the one-step PCR method, with a median HBV DNA viral load (VL) of 7.50 × 10(3) IU/ml. In contrast, 72% of samples were detected by the two-step PCR system, with a median HBV DNA of 3.71 × 10(3) IU/ml. The one-step method showed strong linear correlation with the two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement on a Bland-Altman plot, with a mean difference of 0.61 log10 IU/ml and limits of agreement of -1.82 to 3.03 log10 IU/ml. The intra-assay and interassay coefficients of variation (CV%) of plasma samples (4-7 log10 IU/ml) for the one-step PCR method ranged from 0.33 to 0.59 and 0.28 to 0.48 respectively, thus demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource-limited setting. Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15.
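The Bland-Altman statistics quoted above (mean difference and limits of agreement, mean ± 1.96 SD of the paired differences) are straightforward to compute; a minimal sketch on illustrative log10 viral-load pairs, not the study's measurements:

```python
import math

def bland_altman(a, b):
    """Mean difference and 95% limits of agreement between two
    paired measurement series (e.g. log10 viral loads)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

one_step = [4.0, 5.0, 6.0, 7.0]   # hypothetical log10 IU/ml
two_step = [3.5, 4.8, 6.1, 6.6]
mean_diff, loa_low, loa_high = bland_altman(one_step, two_step)
```

Limits of agreement that span zero, as in the study, indicate no systematic bias large enough to separate the two methods.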
Jahan, Munira; Tabassum, Shahina
2015-01-01
Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected in 81% of samples by the one-step PCR method, with a median HBV DNA viral load (VL) of 7.50 × 10(3) IU/ml. In contrast, 72% of samples were detected by the two-step PCR system, with a median HBV DNA of 3.71 × 10(3) IU/ml. The one-step method showed strong linear correlation with the two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement on a Bland-Altman plot, with a mean difference of 0.61 log10 IU/ml and limits of agreement of -1.82 to 3.03 log10 IU/ml. The intra-assay and interassay coefficients of variation (CV%) of plasma samples (4-7 log10 IU/ml) for the one-step PCR method ranged from 0.33 to 0.59 and 0.28 to 0.48 respectively, thus demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource-limited setting. How to cite this article Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15. PMID:29201678
Seo, K H; Valentin-Bon, I E; Brackett, R E
2006-03-01
Salmonellosis caused by Salmonella Enteritidis (SE) is a significant cause of foodborne illness in the United States. Consumption of undercooked eggs and egg-containing products has been the primary risk factor for the disease. The importance of bacterial enumeration techniques has been strongly emphasized by quantitative risk analyses of SE in shell eggs. Traditional enumeration methods mainly depend on slow and tedious most-probable-number (MPN) methods. Therefore, specific, sensitive, and rapid methods for SE quantitation are needed to collect sufficient data for risk assessment and food safety policy development. We previously developed a real-time quantitative PCR assay for the direct detection and enumeration of SE and, in this study, applied it to naturally contaminated ice cream samples with and without enrichment. The detection limit of the real-time PCR assay was determined with artificially inoculated ice cream. When applied to the direct detection and quantification of SE in ice cream, the real-time PCR assay was as sensitive as the conventional plate count method in frequency of detection. However, populations of SE derived from real-time quantitative PCR were approximately 1 log higher than the MPN and CFU values obtained by conventional culture methods. The detection and enumeration of SE in naturally contaminated ice cream can be completed in 3 h by this real-time PCR method, whereas the cultural enrichment method requires 5 to 7 days. A commercial immunoassay for the specific detection of SE was also included in the study. The real-time PCR assay proved to be a valuable tool that may be useful to the food industry in monitoring its processes to improve product quality and safety.
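For context, real-time quantitative PCR enumerates targets by converting a threshold cycle (Ct) into a copy number through a log-linear standard curve; the sketch below uses generic, assumed calibration values (slope -3.32, intercept 38), not parameters of this assay:

```python
def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Standard-curve quantification: Ct = intercept + slope*log10(copies),
    inverted to copies = 10 ** ((Ct - intercept) / slope)."""
    return 10 ** ((ct - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency E = 10**(-1/slope) - 1;
    a perfect doubling per cycle gives slope ≈ -3.32 and E ≈ 1.0."""
    return 10 ** (-1.0 / slope) - 1.0
```

Each 3.32-cycle decrease in Ct corresponds to a tenfold increase in starting copies, which is why this approach can enumerate SE in hours rather than the days an MPN series requires.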
Optimization and determination of polycyclic aromatic hydrocarbons in biochar-based fertilizers.
Chen, Ping; Zhou, Hui; Gan, Jay; Sun, Mingxing; Shang, Guofeng; Liu, Liang; Shen, Guoqing
2015-03-01
The agronomic benefit of biochar has attracted widespread attention to biochar-based fertilizers. However, the inevitable presence of polycyclic aromatic hydrocarbons in biochar is a matter of concern because of the health and ecological risks of these compounds. The strong adsorption of polycyclic aromatic hydrocarbons to biochar complicates their analysis and extraction from biochar-based fertilizers. In this study, we optimized and validated a method for determining the 16 priority polycyclic aromatic hydrocarbons in biochar-based fertilizers. Results showed that accelerated solvent extraction exhibited high extraction efficiency. Based on a Box-Behnken design with a triplicate central point, accelerated solvent extraction was used under the following optimal operational conditions: extraction temperature of 78°C, extraction time of 17 min, and two static cycles. The optimized method was validated by assessing the linearity of analysis, limit of detection, limit of quantification, recovery, and application to real samples. The results showed that the 16 polycyclic aromatic hydrocarbons exhibited good linearity, with a correlation coefficient of 0.996. The limits of detection varied between 0.001 (phenanthrene) and 0.021 mg/g (benzo[ghi]perylene), and the limits of quantification varied between 0.004 (phenanthrene) and 0.069 mg/g (benzo[ghi]perylene). The relative recoveries of the 16 polycyclic aromatic hydrocarbons were 70.26-102.99%. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
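A common way to derive such limits of detection and quantification is the ICH calibration-curve approach, LOD = 3.3·s/S and LOQ = 10·s/S, with s the residual standard deviation and S the slope; a minimal sketch under that assumption (the abstract does not state which estimator the authors used):

```python
def fit_line(x, y):
    """Least-squares line y = intercept + slope*x, plus the residual
    standard deviation (n - 2 degrees of freedom)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    intercept = my - slope * mx
    resid_sd = (sum((yi - (intercept + slope * xi)) ** 2
                    for xi, yi in zip(x, y)) / (n - 2)) ** 0.5
    return slope, intercept, resid_sd

def lod_loq(slope, resid_sd):
    """ICH-style limits from calibration residuals."""
    return 3.3 * resid_sd / slope, 10 * resid_sd / slope
```

Because both limits scale with the scatter of the calibration points, the analyte-to-analyte spread reported above (phenanthrene lowest, benzo[ghi]perylene highest) reflects differing calibration quality per compound.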
Zhang, Hu; Wang, Xinquan; Qian, Mingrong; Wang, Xiangyun; Xu, Hao; Xu, Mingfei; Wang, Qiang
2011-11-23
A simple and sensitive enantioselective method for the determination of fenbuconazole and myclobutanil in strawberry was developed by high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS). Fenbuconazole and myclobutanil residues in strawberry were extracted with acetonitrile containing 1% acetic acid, and an aliquot was cleaned up with PSA (primary and secondary amine) and C(18) sorbent. The direct resolution of fenbuconazole and myclobutanil enantiomers was performed on a cellulose tris (3,5-dimethylphenylcarbamate) column using acetonitrile-0.1% formic acid solution (60:40, v/v) as the mobile phase. Quantification was achieved using matrix-matched standard calibration curves, and the limits of quantification for fenbuconazole and myclobutanil enantiomers in strawberry were both 2 μg/kg. The method was successfully utilized to investigate the probable enantioselective degradation of fenbuconazole and myclobutanil in strawberry. The results showed that the degradation of the fenbuconazole and myclobutanil enantiomers in strawberry followed pseudofirst-order kinetics (R(2) > 0.97). The results from this study revealed that the degradation of fenbuconazole in strawberry was not enantioselective, while the degradation of myclobutanil was enantioselective, and the (+)-myclobutanil showed a faster degradation than (-)-myclobutanil in strawberry, resulting in the relative enrichment of (-)-myclobutanil in residue. The results could provide a reference to fully evaluate the risks of these two fungicides.
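The pseudo-first-order model invoked above, C(t) = C0·e^(-kt), is typically fitted by regressing ln C on time; a minimal sketch with illustrative numbers, not the study's residue data:

```python
import math

def first_order_k(times, concentrations):
    """Rate constant k from the slope of ln(C) versus t
    (pseudo-first-order kinetics)."""
    logs = [math.log(c) for c in concentrations]
    n = len(times)
    mt, ml = sum(times) / n, sum(logs) / n
    slope = (sum((t - mt) * (l - ml) for t, l in zip(times, logs))
             / sum((t - mt) ** 2 for t in times))
    return -slope

def half_life(k):
    """Time for the residue to fall to half its level: t½ = ln 2 / k."""
    return math.log(2) / k
```

Fitting the two enantiomers separately and comparing the resulting k values is how enantioselective degradation, such as the faster loss of (+)-myclobutanil, shows up quantitatively.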
Continuous Grading of Early Fibrosis in NAFLD Using Label-Free Imaging: A Proof-of-Concept Study
Pirhonen, Juho; Arola, Johanna; Sädevirta, Sanja; Luukkonen, Panu; Karppinen, Sanna-Maria; Pihlajaniemi, Taina; Isomäki, Antti; Hukkanen, Mika
2016-01-01
Background and Aims Early detection of fibrosis is important in identifying individuals at risk for advanced liver disease in non-alcoholic fatty liver disease (NAFLD). We tested whether second-harmonic generation (SHG) and coherent anti-Stokes Raman scattering (CARS) microscopy, detecting fibrillar collagen and fat in a label-free manner, might allow automated and sensitive quantification of early fibrosis in NAFLD. Methods We analyzed 32 surgical biopsies from patients covering histological fibrosis stages 0–4, using multimodal label-free microscopy. Native samples were visualized by SHG and CARS imaging for detecting fibrillar collagen and fat. Furthermore, we developed a method for quantitative assessment of early fibrosis using automated analysis of SHG signals. Results We found that the SHG mean signal intensity correlated well with fibrosis stage and the mean CARS signal intensity with liver fat. Little overlap in SHG signal intensities between fibrosis stages 0 and 1 was observed. A specific fibrillar SHG signal was detected in the liver parenchyma outside portal areas in all samples histologically classified as having no fibrosis. This signal correlated with immunohistochemical location of fibrillar collagens I and III. Conclusions This study demonstrates that label-free SHG imaging detects fibrillar collagen deposition in NAFLD more sensitively than routine histological staging and enables observer-independent quantification of early fibrosis in NAFLD with continuous grading. PMID:26808140
Sample preparation and EFTEM of Meat Samples for Nanoparticle Analysis in Food
NASA Astrophysics Data System (ADS)
Lari, L.; Dudkiewicz, A.
2014-06-01
Nanoparticles are used in industry for personal care products and the preparation of food. In the latter application, their functions include the prevention of microbial growth and the improvement of the food's nutritional value and sensory quality. EU regulations require a risk assessment of the nanoparticles used in foods and food contact materials before the products can reach the market. However, the limited availability of validated analytical methodologies for detection and characterisation of nanoparticles in food hampers appropriate risk assessment. As part of research on the evaluation of methods for screening and quantification of Ag nanoparticles in meat, we have tested a new TEM sample preparation alternative to resin embedding and cryo-sectioning. Energy-filtered TEM analysis was applied to evaluate the thickness and uniformity of thin meat layers acquired at increasing sample input, demonstrating that the protocols used ensured good stability under the electron beam, reliable sample concentration, and reproducibility.
Organochlorine Pesticides in Gonad, Brain, and Blood of Mice in Two Agricultural Areas of Sinaloa.
Perez-Gonzalez, Ernestina; Osuna-Martinez, Ulises-Giovanni; Herrera-Moreno, Maria-Nancy; Rodriguez-Meza, Guadalupe-Durga; Gonzalez-Ocampo, Hector-A; Bucio-Pacheco, Marcos
2017-04-01
The adverse effect of pesticides on non-target wildlife and human health is a primary concern worldwide, but in Mexico we do not know which wildlife species are at the greatest risk. The aim of this study was to determine organochlorine pesticides in mice from two agricultural fields in Sinaloa, Culiacan and Guasave. Extraction, analysis, and quantification procedures followed the modified EPA 8081b method. In three mouse tissues (gonad, brain, and blood), γBHC and decachlorobiphenyl were observed with a frequency higher than 50%, and endosulfan sulfate with 43%. Wildlife living in agricultural areas is at great risk due to: (1) the diversity of chemicals used to control pests such as mice, and (2) the variety of organochlorine pesticides in direct or indirect contact with non-target organisms, affecting the health of animals and humans (toxic effects and accumulation).
Round robin test on quantification of amyloid-β 1-42 in cerebrospinal fluid by mass spectrometry.
Pannee, Josef; Gobom, Johan; Shaw, Leslie M; Korecka, Magdalena; Chambers, Erin E; Lame, Mary; Jenkins, Rand; Mylott, William; Carrillo, Maria C; Zegers, Ingrid; Zetterberg, Henrik; Blennow, Kaj; Portelius, Erik
2016-01-01
Cerebrospinal fluid (CSF) amyloid-β 1-42 (Aβ42) is an important biomarker for Alzheimer's disease, both in diagnostics and to monitor disease-modifying therapies. However, there is a great need for standardization of methods used for quantification. To overcome problems associated with immunoassays, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has emerged as a critical orthogonal alternative. We compared results for CSF Aβ42 quantification in a round robin study performed in four laboratories using similar sample preparation methods and LC-MS instrumentation. The LC-MS results showed excellent correlation between laboratories (r² > 0.98), high analytical precision, and good correlation with enzyme-linked immunosorbent assay (r² > 0.85). The use of a common reference sample further decreased interlaboratory variation. Our results indicate that LC-MS is suitable for absolute quantification of Aβ42 in CSF and highlight the importance of developing a certified reference material. Copyright © 2016 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
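The interlaboratory agreement reported above is a squared Pearson correlation between paired measurements of the same samples. A minimal sketch of that calculation, using hypothetical Aβ42 values (not data from the study):

```python
def r_squared(x, y):
    """Squared Pearson correlation between two sets of paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical Abeta42 concentrations (pg/mL) reported by two labs
# for the same five CSF pools
lab1 = [450.0, 610.0, 320.0, 780.0, 505.0]
lab2 = [460.0, 600.0, 335.0, 790.0, 498.0]
print(r_squared(lab1, lab2))
```

Values close to 1 indicate that the two laboratories rank and scale the samples consistently, even if their absolute calibrations differ.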
Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S
2016-03-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
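Benchmarks of this kind typically summarize, per species in the hybrid sample, how far the measured protein ratios deviate from the known spiked ratio (accuracy) and how much they scatter (precision). A simplified sketch of such metrics, under the assumption that accuracy is taken as the median log2-ratio deviation and precision as the log2-ratio spread (LFQbench's exact definitions may differ):

```python
import math
import statistics

def ratio_metrics(measured_ratios, expected_ratio):
    """Per-species summary for a defined-composition benchmark:
    accuracy  = median deviation of log2 ratios from the expected log2 ratio,
    precision = sample standard deviation of the log2 ratios."""
    logs = [math.log2(r) for r in measured_ratios]
    accuracy = statistics.median(logs) - math.log2(expected_ratio)
    precision = statistics.stdev(logs)
    return accuracy, precision

# Hypothetical protein-level A:B ratios for a species spiked at 2:1
acc, prec = ratio_metrics([1.9, 2.1, 2.0, 2.2, 1.8], 2.0)
print(acc, prec)
```

An accuracy near zero means the tool recovers the designed fold change on average; a small precision value means individual proteins cluster tightly around it.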
Chottier, Claire; Chatain, Vincent; Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick
2014-01-01
Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)), as it has become a component of energy policy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and endanger their conversion into electricity by power plants, so a high level of purification is required. Assessing treatment efficiency is still difficult. No consensus has been reached on a standardized sampling and quantification of VOSiCs in gases because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption, and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). To this end, this paper presents and discusses the optimization of a patented method consisting of VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES.
Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.
Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew
2018-02-01
Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The results obtained suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
Phylogenetic Quantification of Intra-tumour Heterogeneity
Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian
2014-01-01
Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184
Quan, Phenix-Lan; Sauzade, Martin
2018-01-01
Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
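The Poisson principle described above has a simple closed form: if a fraction p of partitions is PCR-positive, the mean number of target copies per partition is λ = -ln(1 - p), and dividing by the partition volume gives the concentration without any calibration curve. A minimal sketch with made-up partition counts and volume (not values from the review):

```python
import math

def dpcr_concentration(positive, total, partition_volume_ul):
    """Estimate target concentration (copies/uL) from digital PCR counts.

    Random distribution of molecules across partitions implies the fraction
    of negative partitions is exp(-lambda), so lambda = -ln(1 - p) is the
    mean number of copies per partition.
    """
    p = positive / total
    lam = -math.log(1.0 - p)          # mean copies per partition
    return lam / partition_volume_ul  # copies per microliter

# Illustrative numbers: 5,000 of 10,000 partitions positive, 0.00085 uL each
conc = dpcr_concentration(positive=5000, total=10000, partition_volume_ul=0.00085)
print(conc)
```

Note that simply counting positives (without the Poisson correction) would underestimate the concentration, because a positive partition may contain more than one target molecule.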
NASA Astrophysics Data System (ADS)
Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie
2014-12-01
To address the multicollinearity issue and the unequal contributions of vascular parameters in the quantification of angiogenesis, we developed a quantitative evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemia model mice. Taking vascular volume as the ground-truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of the vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junction, connectivity density, segment number and vascular length, indicating that these are the key vascular parameters for the quantification of angiogenesis. The proposed quantitative evaluation method was compared with both ABTs applied directly to the nine vascular parameters and Pearson correlation, and the results were consistent. In contrast to ABTs applied directly to the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all the key vascular parameters are assembled into the sparse PCs with the highest relative importance.
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and a cluster analysis method. We quantify the observed qualitative audit event sequences via quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. Simulation experiments show that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
Methods for detection of GMOs in food and feed.
Marmiroli, Nelson; Maestri, Elena; Gullì, Mariolina; Malcevschi, Alessio; Peano, Clelia; Bordoni, Roberta; De Bellis, Gianluca
2008-10-01
This paper reviews aspects relevant to detection and quantification of genetically modified (GM) material within the feed/food chain. The GM crop regulatory framework at the international level is evaluated with reference to traceability and labelling. Current analytical methods for the detection, identification, and quantification of transgenic DNA in food and feed are reviewed. These methods include quantitative real-time PCR, multiplex PCR, and multiplex real-time PCR. Particular attention is paid to methods able to identify multiple GM events in a single reaction and to the development of microdevices and microsensors, though they have not been fully validated for application.
Kamal, Abid; Khan, Washim; Ahmad, Sayeed; Ahmad, F. J.; Saleem, Kishwar
2015-01-01
Objective: The present study was designed to develop simple, accurate and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) and high-performance thin-layer chromatography (HPTLC) methods for the quantification of khellin present in the seeds of Ammi visnaga. Materials and Methods: RP-HPLC analysis was performed on a C18 column with methanol:water (75:25, v/v) as the mobile phase. The HPTLC method involved densitometric evaluation of khellin after resolving it on a silica gel plate using ethyl acetate:toluene:formic acid (5.5:4.0:0.5, v/v/v) as the mobile phase. Results: The developed HPLC and HPTLC methods were validated for precision (interday, intraday and intersystem), robustness, accuracy, limit of detection and limit of quantification. The relationship between the concentration of standard solutions and the peak response was linear in both methods, over the concentration range 10–80 μg/mL in HPLC and 25–1,000 ng/spot in HPTLC for khellin. The % relative standard deviation values for method precision were found to be 0.63–1.97% in HPLC and 0.62–2.05% in HPTLC for khellin. Accuracy of the method was checked by recovery studies conducted at three different concentration levels, and the average percentage recovery was found to be 100.53% in HPLC and 100.08% in HPTLC for khellin. Conclusions: The developed HPLC and HPTLC methods for the quantification of khellin were found to be simple, precise, specific, sensitive and accurate, and can be used for routine analysis and quality control of A. visnaga and several formulations containing it as an ingredient. PMID:26681890
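The precision and accuracy figures quoted above (% RSD and % recovery) are standard method-validation statistics. A minimal sketch of both calculations, using hypothetical replicate responses and spike amounts (not data from the study):

```python
import statistics

def percent_rsd(values):
    """Relative standard deviation (%): the usual precision metric
    for replicate injections."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def percent_recovery(measured, unspiked_amount, amount_added):
    """Recovery (%) at one spiking level: (found - base) / added * 100."""
    return 100.0 * (measured - unspiked_amount) / amount_added

# Hypothetical replicate khellin responses and a spike-recovery check
print(percent_rsd([102.0, 100.5, 101.2, 99.8, 100.9]))
print(percent_recovery(measured=75.3, unspiked_amount=25.1, amount_added=50.0))
```

Recoveries near 100% at each spiking level indicate the extraction and chromatography do not systematically lose or inflate the analyte.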
PCR technology for screening and quantification of genetically modified organisms (GMOs).
Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G
2003-04-01
Although PCR technology has obvious limitations, the potentially high degree of sensitivity and specificity explains why it has been the first choice of most analytical laboratories interested in detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of target analyte (e.g. protein or DNA) frequently challenges the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable methods for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of the PCR method. The target sequence is normally a part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wildtype organisms, they may be present in more than one GMO, and their copy number may also vary from one GMO to another. They may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries including those in the European Union. The success of the labelling schemes is dependent upon the efficiency with which GM-derived material can be detected. We will present an overview of currently available PCR methods for screening and quantification of GM-derived DNA, and discuss their applicability and limitations. In addition, we will discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.
Risk identification of agricultural drought for sustainable Agroecosystems
NASA Astrophysics Data System (ADS)
Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.; Tarquis, A. M.
2014-09-01
Drought is considered one of the major natural hazards, with a significant impact on agriculture, environment, society and economy. Droughts affect the sustainability of agriculture and may result in environmental degradation of a region, which is one of the factors contributing to the vulnerability of agriculture. This paper addresses agrometeorological or agricultural drought within the risk management framework. Risk management consists of risk assessment, as well as feedback on the adopted risk reduction measures. Risk assessment itself comprises three distinct steps, namely risk identification, risk estimation and risk evaluation. This paper deals with risk identification of agricultural drought, which involves drought quantification and monitoring, as well as statistical inference. For the quantitative assessment of agricultural drought and the computation of spatiotemporal features, one of the most reliable and widely used indices is applied, namely the vegetation health index (VHI). The computation of VHI is based on satellite data of temperature and the normalized difference vegetation index (NDVI). The spatiotemporal features of drought extracted from VHI are areal extent, onset and end time, duration and severity. In this paper, a 20-year (1981-2001) time series of National Oceanic and Atmospheric Administration/advanced very high resolution radiometer (NOAA/AVHRR) satellite data is used, from which monthly images of VHI are extracted. The application is implemented in Thessaly, the major agricultural drought-prone region of Greece, characterized by vulnerable agriculture. The results show that agricultural drought appears every year during the warm season in the region. The severity of drought increases from mild to extreme throughout the warm season, with peaks appearing in the summer.
Similarly, the areal extent of drought is also increasing during the warm season, whereas the number of extreme drought pixels is much less than those of mild to moderate drought throughout the warm season. Finally, the areas with diachronic drought persistence can be located. Drought early warning is developed using empirical functional relationships of severity and areal extent. In particular, two second-order polynomials are fitted, one for low and the other for high severity drought classes, respectively. The two fitted curves offer a forecasting tool on a monthly basis from May to October. The results of this drought risk identification effort are considered quite satisfactory offering a prognostic potential. The adopted remote-sensing data and methods have proven very effective in delineating spatial variability and features in drought quantification and monitoring.
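The VHI used above is conventionally built from two normalized indices: the Vegetation Condition Index (VCI), which scales NDVI between its multi-year extremes, and the Temperature Condition Index (TCI), which does the same for brightness temperature with the sign reversed. A sketch of this standard formulation (Kogan-style, with the usual equal weighting; the exact weights and drought-class thresholds used in the study are not given in the abstract):

```python
def vci(ndvi, ndvi_min, ndvi_max):
    """Vegetation Condition Index (%), from multi-year NDVI extremes per pixel."""
    return 100.0 * (ndvi - ndvi_min) / (ndvi_max - ndvi_min)

def tci(bt, bt_min, bt_max):
    """Temperature Condition Index (%); higher brightness temperature lowers it."""
    return 100.0 * (bt_max - bt) / (bt_max - bt_min)

def vhi(vci_val, tci_val, alpha=0.5):
    """Vegetation Health Index as a weighted sum; alpha = 0.5 is the common choice."""
    return alpha * vci_val + (1.0 - alpha) * tci_val

# Hypothetical pixel: NDVI 0.30 within [0.10, 0.70]; BT 305 K within [280, 315] K
v = vci(0.30, 0.10, 0.70)
t = tci(305.0, 280.0, 315.0)
print(vhi(v, t))
```

Low VHI values flag vegetation stress; mapping them monthly yields the areal extent, onset, duration and severity features discussed in the paper.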
2016-02-01
…Flight Mass Spectrometry: Quantification of Free GB from Various Food Matrices (ECBC-TR-1351)
Bae, Sue Y.; Winemiller, Mark D. (Research and Technology Directorate)
The report describes the development of a solid-phase extraction method, using a normal-phase silica gel column, for the quantification of free isopropyl methylphosphonofluoridate (sarin, GB) in various food matrices.
Li, Kefeng; Xia, Yonghong; Ma, Guolin; Zhao, Yanna; Pidatala, Venkataramana R
2017-03-29
Allura red is a widely used synthetic food dye. In this study, we developed and validated an LC-MS/MS method for the quantification of allura red in three popular takeaway Chinese dishes (braised pork, soy sauce chicken, sweet and sour pork) and in human urine samples. High levels of allura red, ranging from 2.85 to 8.38 mg/g wet weight, were detected in the surveyed Chinese dishes. Among 113 participants who frequently consumed the surveyed Chinese dishes (more than once a week over the past 2 years), the median urinary allura red level was 22.29 nM/mM creatinine (95% CI = 19.48-25.03). Risk assessment using Cox proportional hazards models showed that a 10-fold increase in urinary allura red was positively associated with high blood pressure (odds ratio 1.75; 95% CI = 0.78-3.96). Our findings provide new insights into the potential risk of hypertension from long-term allura red overconsumption.
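The "odds ratio per 10-fold increase" reported above is the usual way an effect estimated on a log10-transformed exposure is read out: a regression coefficient β on log10(concentration) translates to a ratio exp(β) per one log10 unit, i.e., per 10-fold increase. A minimal sketch of that conversion (illustrative only; the abstract does not give the underlying coefficient):

```python
import math

def ratio_per_tenfold(beta_log10):
    """Effect ratio (odds or hazard ratio) for a one-unit increase in
    log10(exposure), i.e. a 10-fold increase in exposure: exp(beta)."""
    return math.exp(beta_log10)

# A coefficient of ln(1.75) on log10(urinary concentration) reproduces
# the reported ratio of 1.75 per 10-fold increase
print(ratio_per_tenfold(math.log(1.75)))
```

Note that the confidence interval reported in the abstract (0.78-3.96) spans 1, so the association is not statistically significant at the conventional level.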
Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D
2014-02-01
Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, small-volume samples (e.g., 1 L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR but might have existed based on Bayesian inference. Our integrated approach, which quantifies uncertainty, provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. © 2013 Elsevier B.V. All rights reserved.
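The abstract's point that pathogens "might have existed" despite a qPCR non-detect can be illustrated with a simple Bayes-rule sketch. This is not the authors' actual model (which is not specified in the abstract); the prior and assay sensitivity below are made-up numbers for illustration:

```python
def prob_present_given_nondetect(prior, sensitivity, specificity=1.0):
    """Posterior probability that the pathogen is present after a negative qPCR.

    Bayes' rule: P(present | neg) = P(neg | present) P(present) / P(neg),
    with P(neg | present) = 1 - sensitivity and P(neg | absent) = specificity.
    """
    p_neg = (1.0 - sensitivity) * prior + specificity * (1.0 - prior)
    return (1.0 - sensitivity) * prior / p_neg

# Made-up inputs: 40% prior occurrence; assay detects 90% of true positives
print(prob_present_given_nondetect(prior=0.4, sensitivity=0.9))
```

Even with a fairly sensitive assay, the posterior probability of presence after a non-detect stays above zero whenever the prior and the false-negative rate are non-zero, which is the intuition behind the paper's Bayesian treatment of non-detects.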
Souza, Darliana M; Reichert, Jaqueline F; Martins, Ayrton F
2018-06-01
Currently, there is an increasing use of anti-cancer drugs, and hence their occurrence in the environment must be properly managed, in particular in light of their high degree of toxicity. In this study, analytical methods using HPLC-FLD, assisted by microextraction and solid-phase extraction, were developed and validated for the determination of doxorubicin, daunorubicin, epirubicin and irinotecan in hospital effluent. The validation results show determination coefficients (r²) higher than 0.99 and recovery values between 74% and 105%, with an intraday precision of <15%. The limit of quantification was 1.0 μg L⁻¹ and there were almost no matrix effects. The proposed methods were employed for the determination of the named chemotherapeutics in effluent samples of the University Hospital of Santa Maria, Brazil, where they were quantified in the range of ≥LOQ to 6.22 μg L⁻¹. A preliminary ecotoxicological risk assessment showed values that were potentially very harmful, and thus the treatment of the hospital effluents requires special attention. Copyright © 2018 Elsevier Ltd. All rights reserved.
Oberbach, Andreas; Schlichting, Nadine; Neuhaus, Jochen; Kullnick, Yvonne; Lehmann, Stefanie; Heinrich, Marco; Dietrich, Arne; Mohr, Friedrich Wilhelm; von Bergen, Martin; Baumann, Sven
2014-12-05
Multiple reaction monitoring (MRM)-based mass spectrometric quantification of peptides, and of their corresponding proteins, has been successfully applied for biomarker validation in serum. The option of multiplexing offers the chance to analyze various proteins in parallel, which is especially important in obesity research, where biomarkers that reflect multiple comorbidities and allow monitoring of therapy outcomes are required. Besides the suitability of established MRM assays for serum protein quantification, MRM is also feasible for analysis of the tissues secreting the markers of interest. Surprisingly, studies comparing MRM data sets with established methods are rare, and therefore the biological and clinical value of most analytes remains questionable. An MRM method using nano-UPLC-MS/MS was established for the quantification of obesity-related surrogate markers for several comorbidities in serum, plasma, and visceral and subcutaneous adipose tissue. Proteotypic peptides for complement C3, adiponectin, angiotensinogen, and plasma retinol binding protein (RBP4) were quantified using isotope dilution analysis and compared to the standard ELISA method. MRM method variabilities were mainly below 10%. The comparison with other MS-based approaches showed a good correlation. However, large differences in absolute quantification were obtained for complement C3 and adiponectin compared to ELISA, while less marked differences were observed for angiotensinogen and RBP4. The verification of MRM in obesity was performed first to discriminate lean and obese phenotypes, and second to monitor excessive weight loss after gastric bypass surgery over a seven-month follow-up. The presented MRM assay was able to discriminate the obese phenotype from the lean and to monitor weight-loss-related changes in surrogate markers. However, inclusion of additional biomarkers was necessary to interpret the MRM data on the obesity phenotype properly.
In summary, the development of disease-related MRMs should include a step of matching the MRM data with clinically approved standard methods and defining reference values in well-sized representative age, gender, and disease-matched cohorts.
Muratovic, Aida Zuberovic; Hagström, Thomas; Rosén, Johan; Granelli, Kristina; Hellenäs, Karl-Erik
2015-09-11
A method that uses mass spectrometry (MS) for identification and quantification of the protein toxins staphylococcal enterotoxins A and B (SEA and SEB) in milk and shrimp is described. The analysis was performed using a tryptic peptide from each of the toxins as the target analyte, together with the corresponding ¹³C-labeled synthetic internal standard peptide. The performance of the method was evaluated by analyzing spiked samples in the quantification range 2.5-30 ng/g (R² = 0.92-0.99). The limit of quantification (LOQ) in milk and the limit of detection (LOD) in shrimp were 2.5 ng/g for both SEA and SEB. The in-house reproducibility (RSD) was 8-30% and 5-41% at different concentrations for milk and shrimp, respectively. The method was compared to the ELISA method used at the EU-RL (France) for milk samples spiked with SEA at low levels, in the quantification range of 2.5 to 5 ng/g. The comparison showed good coherence between the two methods: 2.9 (MS)/1.8 (ELISA) and 3.6 (MS)/3.8 (ELISA) ng/g. The major advantage of the developed method is that it allows direct confirmation of molecular identity and quantitative analysis of SEA and SEB at low-nanogram levels using a label- and antibody-free approach. Therefore, this method is an important step in the development of alternatives to the immunoassay tests currently used for staphylococcal enterotoxin analysis.
Collagen Quantification in Tissue Specimens.
Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I
2017-01-01
Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.
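Hydroxyproline-based collagen quantification, mentioned above as the most widely used approach, rests on the fact that hydroxyproline makes up a roughly fixed fraction (~13-14%) of collagen mass, so measured hydroxyproline is multiplied by a conversion factor. A minimal sketch; the factor 7.46 is one commonly cited value and is an assumption here, since the factor is tissue- and species-dependent (values such as 7.14 or 8.0 also appear in the literature):

```python
def collagen_from_hydroxyproline(hyp_ug, conversion_factor=7.46):
    """Estimate total collagen (ug) from measured hydroxyproline (ug).

    Assumes hydroxyproline is a fixed fraction of collagen mass; the
    conversion factor must be chosen for the tissue under study.
    """
    return hyp_ug * conversion_factor

# Hypothetical measurement: 12 ug hydroxyproline in a tissue digest
print(collagen_from_hydroxyproline(12.0))
```

This conversion is one reason the hydroxyproline assay, though laborious, is considered more specific than dye-binding assays such as Sircol, which the abstract notes can overestimate collagen through binding to non-collagenous proteins.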
A quantitative witness for Greenberger-Horne-Zeilinger entanglement.
Eltschka, Christopher; Siewert, Jens
2012-01-01
Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
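The two-qubit result the abstract refers to, Wootters' concurrence, has a standard closed form (stated here as background, not taken from the abstract):

```latex
C(\rho) = \max\{0,\; \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
\qquad
\tilde{\rho} = (\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y),
```

where the \(\lambda_i\) are the square roots of the eigenvalues of \(\rho\tilde{\rho}\) in decreasing order and \(\rho^{*}\) is the complex conjugate of \(\rho\) in the computational basis. The witness-based procedure of the paper reduces to this formula in the two-qubit case.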
Deng, Yong; Luo, Zhaoyang; Jiang, Xu; Xie, Wenhao; Luo, Qingming
2015-07-01
We propose a method based on a decoupled fluorescence Monte Carlo model for constructing fluorescence Jacobians to enable accurate quantification of fluorescence targets within turbid media. The effectiveness of the proposed method is validated using two cylindrical phantoms enclosing fluorescent targets within homogeneous and heterogeneous background media. The results demonstrate that our method can recover relative concentrations of the fluorescent targets with higher accuracy than the perturbation fluorescence Monte Carlo method. This suggests that our method is suitable for quantitative fluorescence diffuse optical tomography, especially for in vivo imaging of fluorophore targets for diagnosis of different diseases and abnormalities.
Arellano, Cécile; Allal, Ben; Goubaa, Anwar; Roché, Henri; Chatelut, Etienne
2014-11-01
A selective and accurate analytical method is needed to quantify tamoxifen and its phase I metabolites in a prospective clinical protocol, for evaluation of the pharmacokinetic parameters of tamoxifen and its metabolites in the adjuvant treatment of breast cancer. The selectivity of the analytical method is a fundamental criterion, as it must allow quantification of the main active (Z)-isomer metabolites separately from their (Z)'-isomers. A UPLC-MS/MS method was developed and validated for the quantification of (Z)-tamoxifen, (Z)-endoxifen, (E)-endoxifen, Z'-endoxifen, (Z)'-endoxifen, (Z)-4-hydroxytamoxifen, (Z)-4'-hydroxytamoxifen, N-desmethyltamoxifen, and tamoxifen-N-oxide. The validation range was set between 0.5 ng/mL and 125 ng/mL for the 4-hydroxytamoxifen and endoxifen isomers, and between 12.5 ng/mL and 300 ng/mL for tamoxifen, N-desmethyltamoxifen and tamoxifen-N-oxide. The method was applied to patient plasma samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Multiple products monitoring as a robust approach for peptide quantification.
Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee
2009-07-01
Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because the determination of quantifiable product ions using either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous ion-selection process. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarity between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves the sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
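The MpM score above combines product-ion intensities with a query-to-reference spectrum similarity; the abstract does not give the formula, so as a generic illustration the widely used spectral cosine (normalized dot product) can be sketched as follows (peak lists, m/z grid and tolerance are all hypothetical):

```python
import numpy as np

def spectrum_vector(peaks, mz_grid, tol=0.5):
    """Bin a peak list [(m/z, intensity), ...] onto a shared m/z grid."""
    v = np.zeros(len(mz_grid))
    for mz, inten in peaks:
        idx = int(np.argmin(np.abs(mz_grid - mz)))
        if abs(mz_grid[idx] - mz) <= tol:
            v[idx] += inten
    return v

def similarity(query, reference):
    """Spectral cosine: normalized dot product of two binned spectra."""
    return float(query @ reference /
                 (np.linalg.norm(query) * np.linalg.norm(reference)))

# hypothetical query/reference peak lists for one target peptide
grid = np.arange(100.0, 500.0, 0.5)
reference = spectrum_vector([(175.1, 100.0), (262.6, 45.0)], grid)
query = spectrum_vector([(175.1, 90.0), (262.6, 42.0)], grid)
```

A matching query scores close to 1, while a spectrum with no shared peaks scores 0, which is the behaviour a reference-similarity term needs.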
Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F
2017-11-01
Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target-capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs, in contrast to free/active QCs, were inappropriate for determining optimal method conditions. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.
Jia, Xin; Fontaine, Benjamin M.; Strobel, Fred; Weinert, Emily E.
2014-01-01
A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) using LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems, such as matrix effects from complex biological samples, are addressed and have been optimized. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMPs levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways. PMID:25513747
Quantification of cardiolipin by liquid chromatography-electrospray ionization mass spectrometry.
Garrett, Teresa A; Kordestani, Reza; Raetz, Christian R H
2007-01-01
Cardiolipin (CL), a tetra-acylated glycerophospholipid composed of two phosphatidyl moieties linked by a bridging glycerol, plays an important role in mitochondrial function in eukaryotic cells. Alterations to the content and acylation state of CL cause mitochondrial dysfunction and may be associated with pathologies such as ischemia, hypothyroidism, aging, and heart failure. The structure of CL is very complex because of microheterogeneity among its four acyl chains. Here we have developed a method for the quantification of CL molecular species by liquid chromatography-electrospray ionization mass spectrometry. We quantify the [M-2H](2-) ion of a CL of a given molecular formula and identify the CLs by their total number of carbons and unsaturations in the acyl chains. This method, developed using mouse macrophage RAW 264.7 tumor cells, is broadly applicable to other cell lines, tissues, bacteria and yeast. Furthermore, this method could be used for the quantification of lyso-CLs and bis-lyso-CLs.
Jia, Xin; Fontaine, Benjamin M; Strobel, Fred; Weinert, Emily E
2014-12-12
A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) using LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems, such as matrix effects from complex biological samples, are addressed and have been optimized. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMPs levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways.
Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico
2017-10-13
The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis (CE) method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine protein analysis, which achieved the separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizably distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on this ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum detectable amount of fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.
Prospective study of one million deaths in India: rationale, design, and validation results.
Jha, Prabhat; Gajalakshmi, Vendhan; Gupta, Prakash C; Kumar, Rajesh; Mony, Prem; Dhingra, Neeraj; Peto, Richard
2006-02-01
Over 75% of the annual estimated 9.5 million deaths in India occur in the home, and the large majority of these do not have a certified cause. India and other developing countries urgently need reliable quantification of the causes of death. They also need better epidemiological evidence about the relevance of physical (such as blood pressure and obesity), behavioral (such as smoking, alcohol, HIV-1 risk taking, and immunization history), and biological (such as blood lipids and gene polymorphisms) measurements to the development of disease in individuals or disease rates in populations. We report here on the rationale, design, and implementation of the world's largest prospective study of the causes and correlates of mortality. We will monitor nearly 14 million people in 2.4 million nationally representative Indian households (6.3 million people in 1.1 million households in the 1998-2003 sample frame and 7.6 million people in 1.3 million households in the 2004-2014 sample frame) for vital status and, if dead, the causes of death through a well-validated verbal autopsy (VA) instrument. About 300,000 deaths from 1998-2003 and some 700,000 deaths from 2004-2014 are expected; of these about 850,000 will be coded by two physicians to provide causes of death by gender, age, socioeconomic status, and geographical region. Pilot studies will evaluate the addition of physical and biological measurements, specifically dried blood spots. Preliminary results from over 35,000 deaths suggest that VA can ascertain the leading causes of death, reduce the misclassification of causes, and derive the probable underlying cause of death when it has not been reported. VA yields broad classification of the underlying causes in about 90% of deaths before age 70. In old age, however, the proportion of classifiable deaths is lower. By tracking underlying demographic denominators, the study permits quantification of absolute mortality rates. 
Household case-control, proportional mortality, and nested case-control methods permit quantification of risk factors. This study will reliably document not only the underlying cause of child and adult deaths but also key risk factors (behavioral, physical, environmental, and eventually, genetic). It offers a globally replicable model for reliably estimating cause-specific mortality using VA and strengthens India's flagship mortality monitoring system. Despite the misclassification that is still expected, the new cause-of-death data will be substantially better than that available previously.
Hynstova, Veronika; Sterbova, Dagmar; Klejdus, Borivoj; Hedbavny, Josef; Huska, Dalibor; Adam, Vojtech
2018-01-30
In this study, 14 commercial products (dietary supplements) containing the alga Chlorella vulgaris and the cyanobacterium Spirulina platensis, originating from China and Japan, were analysed. A UV-vis spectrophotometric method was applied for rapid determination of chlorophylls, carotenoids and pheophytins (degradation products of chlorophylls). High-Performance Thin-Layer Chromatography (HPTLC) was used for effective separation of these compounds, and Atomic Absorption Spectrometry for determination of heavy metals as indicators of environmental pollution. Based on the results of the UV-vis spectrophotometric determination of photosynthetic pigments (chlorophylls and carotenoids), it was confirmed that Chlorella vulgaris contains more of all these pigments than the cyanobacterium Spirulina platensis. The compound with the fastest mobility identified in Chlorella vulgaris and Spirulina platensis using the HPTLC method was β-carotene. Spectral analysis and the standard calibration curve method were used for identification and quantification of the separated substances on the Thin-Layer Chromatographic plate. Quantification of copper (Cu²⁺, at 324.7 nm) and zinc (Zn²⁺, at 213.9 nm) was performed using Flame Atomic Absorption Spectrometry with air-acetylene flame atomization; cadmium (Cd²⁺, at 228.8 nm), nickel (Ni²⁺, at 232.0 nm) and lead (Pb²⁺, at 283.3 nm) were quantified by Electrothermal Graphite Furnace Atomic Absorption Spectrometry, and mercury (Hg²⁺, at 254 nm) by Cold Vapour Atomic Absorption Spectrometry. Copyright © 2017 Elsevier B.V. All rights reserved.
Qian, Yiyun; Zhu, Zhenhua; Duan, Jin-Ao; Guo, Sheng; Shang, Erxin; Tao, Jinhua; Su, Shulan; Guo, Jianming
2017-01-15
A highly sensitive method using ultra-high-pressure liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap-MS) has been developed and validated for the simultaneous identification and quantification of ginkgolic acids and semi-quantification of their metabolites in rat plasma. For the five selected ginkgolic acids, the method showed good linearity (r>0.9991), good intra- and inter-day precision (RSD<15%), and good accuracy (RE, from -10.33% to 4.92%). Extraction recoveries, matrix effects and stabilities for rat plasma samples were within the required limits. The validated method was successfully applied to investigate the pharmacokinetics of the five ginkgolic acids in rat plasma after oral administration in three dosage groups (900 mg/kg, 300 mg/kg and 100 mg/kg). Meanwhile, six metabolites of GA (15:1) and GA (17:1) were identified by comparison of MS data with reported values. Validation results in terms of linear ranges, precision and stability were established for semi-quantification of the metabolites. Curves of the relative changes of these metabolites during the metabolic process were constructed by plotting the peak-area ratios of the metabolites to salicylic acid (internal standard, IS). Double peaks were observed in all three dose groups. Both the type of metabolite and the dosage resulted in different Tmax values. Copyright © 2016 Elsevier B.V. All rights reserved.
Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc
2014-02-07
We present here a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex-matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by means of hollow-fiber liquid-phase microextraction (HF-LPME) using 1-octanol as the acceptor phase, which provides an enrichment (preconcentration) factor of 800. In addition, we have developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly (13)C1-labeled compounds. To this end, the minimally labeled (13)C1-4-(3,6-dimethyl-3-heptyl)-phenol and (13)C1-t-octylphenol isomers were synthesized; they coelute with the natural compounds, which compensates for the matrix effect. Quantification was carried out using isotope pattern deconvolution (IPD), which yields the concentration of both compounds without the need to build any calibration graph, reducing the total analysis time. The combination of these extraction and determination techniques allowed us to validate, for the first time, an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng mL(-1) and recoveries within 97-109%. Owing to the low cost and short analysis time of HF-LPME, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.
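Isotope pattern deconvolution as used above recovers molar fractions from a measured isotope cluster by least squares; a minimal sketch with made-up three-channel patterns for a natural analyte and a singly 13C-labelled spike (all values hypothetical, not from the paper):

```python
import numpy as np

# columns: theoretical isotope patterns (M, M+1, M+2 abundances) of the
# natural compound and of the 13C1-labelled analogue; values hypothetical
patterns = np.array([
    [0.80, 0.05],
    [0.17, 0.78],
    [0.03, 0.17],
])
measured = np.array([0.425, 0.475, 0.100])   # observed mixture pattern

# least-squares molar fractions of natural vs labelled compound; no
# calibration graph is needed, only the known spike amount to scale them
fractions, *_ = np.linalg.lstsq(patterns, measured, rcond=None)
```

With these invented numbers the solver returns fractions of about 0.5/0.5, i.e. a 1:1 natural-to-spike mixture.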
Tutorial examples for uncertainty quantification methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Bord, Sarah
2015-08-01
This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
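As a flavor of the window heat-transfer tutorial mentioned above (all parameter values hypothetical, not from the report), forward uncertainty propagation by plain Monte Carlo sampling can be sketched as:

```python
import numpy as np

# one-dimensional conduction through a single glass pane: q = k * A * dT / L
rng = np.random.default_rng(42)
n = 100_000
k = rng.normal(0.96, 0.05, n)     # thermal conductivity of glass, W/(m K)
dT = rng.normal(15.0, 2.0, n)     # indoor/outdoor temperature difference, K
A, L = 1.5, 0.006                 # pane area (m^2) and thickness (m)

q = k * A * dT / L                # heat flow through the pane, W
q_mean, q_std = float(q.mean()), float(q.std())
```

The sample mean and standard deviation of `q` summarize how the input uncertainties propagate to the output quantity of interest.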
Analysis of using the tongue deviation angle as a warning sign of a stroke
2012-01-01
Background The symptom of tongue deviation is observed in a stroke or transient ischemic attack. Nevertheless, there is much room for the interpretation of the tongue deviation test. The crucial factor is the lack of an effective quantification method of tongue deviation. If we can quantify the features of the tongue deviation and scientifically verify the relationship between the deviation angle and a stroke, the information provided by the tongue will be helpful in recognizing a warning of a stroke. Methods In this study, a quantification method of the tongue deviation angle was proposed for the first time to characterize stroke patients. We captured the tongue images of stroke patients (15 males and 10 females, ranging between 55 and 82 years of age); transient ischemic attack (TIA) patients (16 males and 9 females, ranging between 53 and 79 years of age); and normal subjects (14 males and 11 females, ranging between 52 and 80 years of age) to analyze whether the method is effective. In addition, we used the receiver operating characteristic curve (ROC) for the sensitivity analysis, and determined the threshold value of the tongue deviation angle for the warning sign of a stroke. Results The means and standard deviations of the tongue deviation angles of the stroke, TIA, and normal groups were: 6.9 ± 3.1, 4.9 ± 2.1 and 1.4 ± 0.8 degrees, respectively. Analyzed by the unpaired Student’s t-test, the p-value between the stroke group and the TIA group was 0.015 (>0.01), indicating no significant difference in the tongue deviation angle. The p-values between the stroke group and the normal group, as well as between the TIA group and the normal group were both less than 0.01. These results show the significant differences in the tongue deviation angle between the patient groups (stroke and TIA patients) and the normal group. 
These results also imply that the tongue deviation angle can effectively distinguish the patient group (stroke and TIA patients) from the normal group. With respect to visual examination, 40% and 32% of stroke patients, 24% and 16% of TIA patients, and 4% and 0% of normal subjects were found to have tongue deviations when examined by physicians “A” and “B”, respectively. This inter-observer variation shows the necessity of a quantification method in a clinical setting. In the receiver operating characteristic (ROC) analysis, the area under the curve (AUC = 0.96) indicates good discrimination. A tongue deviation angle greater than the optimum threshold value (3.2°) predicts a risk of stroke. Conclusions In summary, we developed an effective quantification method to characterize the tongue deviation angle, and we confirmed the feasibility of recognizing the tongue deviation angle as an early warning sign of an impending stroke. PMID:22908956
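The ROC analysis described above (AUC = 0.96, optimal cut-off 3.2°) can be reproduced in outline on synthetic angles drawn from the reported group means and standard deviations; a sketch, not the authors' code:

```python
import numpy as np

def auc_and_cutoff(scores, labels):
    """Trapezoidal ROC AUC and the Youden-optimal cut-off (max TPR - FPR).
    labels: 1 = patient, 0 = normal; higher score = more pathological."""
    ts = np.concatenate([np.unique(scores), [scores.max() + 1.0]])
    tpr = np.array([np.mean(scores[labels == 1] >= t) for t in ts])
    fpr = np.array([np.mean(scores[labels == 0] >= t) for t in ts])
    order = np.argsort(fpr)
    x, y = fpr[order], tpr[order]
    auc = float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))
    return auc, float(ts[np.argmax(tpr - fpr)])

# synthetic deviation angles from the reported means/SDs (degrees)
rng = np.random.default_rng(1)
angles = np.concatenate([rng.normal(6.9, 3.1, 50),   # stroke-like group
                         rng.normal(1.4, 0.8, 50)])  # normal-like group
labels = np.concatenate([np.ones(50, int), np.zeros(50, int)])
auc, cutoff = auc_and_cutoff(angles, labels)
```

With groups this well separated the synthetic AUC lands near the reported value, and the Youden cut-off falls between the two group means.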
Prado, Marta; Boix, Ana; von Holst, Christoph
2012-07-01
The development of DNA-based methods for the identification and quantification of fish in food and feed samples is frequently focused on a specific fish species and/or on the detection of mitochondrial DNA of fish origin. However, a quantitative method for the most common fish species used by the food and feed industry is needed for official control purposes, and such a method should rely on the use of a single-copy nuclear DNA target owing to its more stable copy number in different tissues. In this article, we report on the development of a real-time PCR method based on the use of a nuclear gene as a target for the simultaneous detection of fish DNA from different species and on the evaluation of its quantification potential. The method was tested in 22 different fish species, including those most commonly used by the food and feed industry, and in negative control samples, which included 15 animal species and nine feed ingredients. The results show that the method reported here complies with the requirements concerning specificity and with the criteria required for real-time PCR methods with high sensitivity.
Arrighi, Chiara; Rossi, Lauro; Trasforini, Eva; Rudari, Roberto; Ferraris, Luca; Brugioni, Marcello; Franceschini, Serena; Castelli, Fabio
2018-02-01
Flood risk mitigation usually requires a significant investment of public resources, and cost-effectiveness should be ensured. The assessment of the benefits of hydraulic works requires the quantification of (i) flood risk in the absence of measures, (ii) risk in the presence of mitigation works, and (iii) the investments needed to achieve an acceptable residual risk. In this work a building scale is adopted to estimate direct tangible flood losses to several building classes (e.g. residential, industrial, commercial, etc.) and their respective contents, exploiting various sources of public open data in a GIS environment. The impact simulations for assigned flood hazard scenarios are computed through the RASOR platform, which allows for an extensive characterization of the properties and their vulnerability through libraries of stage-damage curves. Recovery and replacement costs are estimated based on insurance data, market values and socio-economic proxies. The methodology is applied to the case study of Florence (Italy), where a system of retention basins upstream of the city is under construction to reduce flood risk. Current flood risk in the study area (70 km²) is about 170 million euros per year, without accounting for people, infrastructure, cultural heritage and vehicles at risk. The monetary investment in the retention basins is paid off in about 5 years. However, the results show that although the hydraulic works are cost-effective, a significant residual risk has to be managed, and achieving the desired level of acceptable risk would require about 1 billion euros of investment. Copyright © 2017 Elsevier Ltd. All rights reserved.
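Risk figures of this kind come from integrating scenario damages over their annual exceedance probabilities; a schematic expected-annual-damage and payback computation with invented damage tables (not the study's data):

```python
import numpy as np

def trapezoid(y, x):
    """Trapezoidal integral of y over x (x ascending)."""
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))

# hypothetical damages (million EUR) per return period, with/without basins
T = np.array([10.0, 30.0, 100.0, 200.0, 500.0])
damage_without = np.array([20.0, 120.0, 500.0, 900.0, 1500.0])
damage_with = np.array([2.0, 30.0, 200.0, 450.0, 900.0])

p = 1.0 / T[::-1]                       # ascending annual exceedance probability
ead_without = trapezoid(damage_without[::-1], p)   # expected annual damage
ead_with = trapezoid(damage_with[::-1], p)

benefit = ead_without - ead_with        # avoided expected annual damage
payback_years = 250.0 / benefit         # hypothetical 250 million EUR investment
```

The payback time is simply the investment divided by the avoided expected annual damage; the curve should also be extended toward very frequent and very rare events in a real assessment.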
Multi-fidelity methods for uncertainty quantification in transport problems
NASA Astrophysics Data System (ADS)
Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.
2016-12-01
We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, a re-scaled Multi Level Monte Carlo (rMLMC) method. The rMLMC method is based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss the advantages of each approach.
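The multilevel idea above — many cheap low-resolution runs plus a few paired high-resolution corrections — can be illustrated on a toy quantity of interest (the integrand, resolutions and sample sizes are invented for the sketch):

```python
import numpy as np

def q_of_k(k, n):
    """QoI Q(k) = integral of sin(k*x) over [0, 1] via an n-point trapezoid
    rule; n plays the role of the simulation resolution."""
    x = np.linspace(0.0, 1.0, n)
    y = np.sin(k * x)
    return float(np.sum((x[1:] - x[:-1]) * (y[1:] + y[:-1]) / 2.0))

def two_level_estimate(n_coarse, n_corr, rng):
    """E[Q_fine] ~ mean of cheap coarse runs + mean of paired corrections."""
    k0 = rng.uniform(0.5, 1.5, n_coarse)
    level0 = np.mean([q_of_k(k, 9) for k in k0])                  # many low-res runs
    k1 = rng.uniform(0.5, 1.5, n_corr)
    corr = np.mean([q_of_k(k, 257) - q_of_k(k, 9) for k in k1])   # few paired runs
    return level0 + corr

rng = np.random.default_rng(7)
estimate = two_level_estimate(20_000, 200, rng)
```

Because the fine-minus-coarse correction has small variance, only a few expensive runs are needed, which is the source of the cost reduction in MLMC-type methods.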
NASA Astrophysics Data System (ADS)
Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.
2017-03-01
The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. Accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlations with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and enhanced conventional toxicology research by providing a direct correlation between GBN uptake at the single-cell level and cell viability status.
Cheng, Dongwan; Zheng, Li; Hou, Junjie; Wang, Jifeng; Xue, Peng; Yang, Fuquan; Xu, Tao
2015-01-01
The absolute quantification of target proteins in proteomics involves stable isotope dilution coupled with multiple reaction monitoring mass spectrometry (SID-MRM-MS). The successful preparation of stable isotope-labeled internal standard peptides is an important prerequisite for SID-MRM absolute quantification methods. Dimethyl labeling has been widely used in relative quantitative proteomics; it is fast, simple, reliable, cost-effective, and applicable to any protein sample, making it an ideal candidate method for the preparation of stable isotope-labeled internal standards. MRM mass spectrometry offers high sensitivity, specificity and throughput, and can quantify multiple proteins simultaneously, including low-abundance proteins in precious samples such as pancreatic islets. In this study, a new method for the absolute quantification of three proteases involved in insulin maturation, namely PC1/3, PC2 and CPE, was developed by coupling a stable isotope dimethyl labeling strategy for internal standard peptide preparation with SID-MRM-MS quantitative technology. This method offers a new and effective approach for a deeper understanding of the functional status of pancreatic β cells and the pathogenesis of diabetes.
Quantification of alginate by aggregation induced by calcium ions and fluorescent polycations.
Zheng, Hewen; Korendovych, Ivan V; Luk, Yan-Yeung
2016-01-01
For quantification of polysaccharides, including heparins and alginates, the commonly used carbazole assay involves hydrolysis of the polysaccharide to form a mixture of UV-active dye conjugate products. Here, we describe two efficient detection and quantification methods that make use of the negative charges of the alginate polymer and do not involve degradation of the targeted polysaccharide. The first method utilizes calcium ions to induce the formation of hydrogel-like aggregates with the alginate polymer; the aggregates can be quantified readily by staining with a crystal violet dye. This method does not require purification of alginate from the culture medium and can measure the large amount of alginate produced by a mucoid Pseudomonas aeruginosa culture. The second method employs polycations bearing a tethered fluorescent dye to form suspension aggregates with the alginate polyanion. Encasing the fluorescent dye in the aggregates provides an increased scattering intensity, with a sensitivity comparable to that of the conventional carbazole assay. Both approaches provide efficient methods for monitoring alginate production by mucoid P. aeruginosa. Copyright © 2015 Elsevier Inc. All rights reserved.
Lorenz, Dominic; Erasmy, Nicole; Akil, Youssef; Saake, Bodo
2016-04-20
A new method for the chemical characterization of xylans is presented to overcome the difficulties in the quantification of 4-O-methyl-α-D-glucuronic acid (meGlcA). To this end, the hydrolysis behavior of xylans from beech and birch wood was investigated to obtain the optimum conditions for hydrolysis using sulfuric acid. Owing to varying linkage strengths and degradation, no general method for complete hydrolysis can be designed. Therefore, partial hydrolysis was applied, yielding monosaccharides and small meGlcA-containing oligosaccharides. For a new HPAEC-UV/VIS method, these samples were reductively aminated with 2-aminobenzoic acid. By quantification of the monosaccharides and oligosaccharides, as well as comparison with borate-HPAEC and (13)C NMR spectroscopy, we show that meGlcA concentrations are significantly underestimated by conventional methods: the detected concentrations are 85.4% (beech) and 76.3% (birch) higher with the new procedure. Furthermore, the quantified concentrations of xylose were 9.3% (beech) and 6.5% (birch) higher when the unhydrolyzed oligosaccharides were taken into account as well. Copyright © 2015 Elsevier Ltd. All rights reserved.
Srivastava, Nishi; Srivastava, Amit; Srivastava, Sharad; Rawat, Ajay Kumar Singh; Khan, Abdul Rahman
2016-03-01
A rapid, sensitive, selective and robust quantitative densitometric high-performance thin-layer chromatographic method was developed and validated for the separation and quantification of syringic acid (SYA) and kaempferol (KML) in the hydrolyzed extracts of Bergenia ciliata and Bergenia stracheyi. The separation was performed on silica gel 60F254 high-performance thin-layer chromatography plates using toluene:ethyl acetate:formic acid (5:4:1, v/v/v) as the mobile phase. The quantification of SYA and KML was carried out using a densitometric reflection/absorption mode at 290 nm. Dense spots of SYA and KML appeared on the developed plate at retention factor values of 0.61 ± 0.02 and 0.70 ± 0.01, respectively. Precise and accurate quantification was performed using linear regression analysis by plotting peak area vs concentration over 100-600 ng/band (correlation coefficient r = 0.997, regression coefficient R(2) = 0.996) for SYA and 100-600 ng/band (correlation coefficient r = 0.995, regression coefficient R(2) = 0.991) for KML. The developed method was validated in terms of accuracy, recovery and inter- and intraday precision as per International Conference on Harmonisation guidelines. The limits of detection of SYA and KML were 91.63 and 142.26 ng, and the limits of quantification 277.67 and 431.09 ng, respectively. The statistical data analysis showed that the method is reproducible and selective for the estimation of SYA and KML in extracts of B. ciliata and B. stracheyi. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
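Calibration, LOD and LOQ figures of this kind follow the standard linear-regression recipe (LOD = 3.3σ/S, LOQ = 10σ/S in the ICH Q2 convention); a sketch on a made-up peak-area calibration (not the paper's data):

```python
import numpy as np

# hypothetical calibration: amount applied (ng/band) vs densitometric peak area
amount = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])
area = np.array([1210.0, 2380.0, 3640.0, 4790.0, 6050.0, 7180.0])

slope, intercept = np.polyfit(amount, area, 1)
resid = area - (slope * amount + intercept)
sigma = np.std(resid, ddof=2)       # residual SD of the regression (n - 2 dof)

lod = 3.3 * sigma / slope           # limit of detection, ng/band
loq = 10.0 * sigma / slope          # limit of quantification, ng/band
r = np.corrcoef(amount, area)[0, 1]
```

The slope S is the method's sensitivity, so a steeper calibration line and tighter residuals both push the detection limits down.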
Clais, S; Boulet, G; Van Kerckhoven, M; Lanckacker, E; Delputte, P; Maes, L; Cos, P
2015-01-01
The viable plate count (VPC) is considered as the reference method for bacterial enumeration in periodontal microbiology but shows some important limitations for anaerobic bacteria. As anaerobes such as Porphyromonas gingivalis are difficult to culture, VPC becomes time-consuming and less sensitive. Hence, efficient normalization of experimental data to bacterial cell count requires alternative rapid and reliable quantification methods. This study compared the performance of VPC with that of turbidity measurement and real-time PCR (qPCR) in an experimental context using highly concentrated bacterial suspensions. Our TaqMan-based qPCR assay for P. gingivalis 16S rRNA proved to be sensitive and specific. Turbidity measurements offer a fast method to assess P. gingivalis growth, but suffer from high variability and a limited dynamic range. VPC was very time-consuming and less repeatable than qPCR. Our study concludes that qPCR provides the most rapid and precise approach for P. gingivalis quantification. Although our data were gathered in a specific research context, we believe that our conclusions on the inferior performance of VPC and turbidity measurements in comparison to qPCR can be extended to other research and clinical settings and even to other difficult-to-culture micro-organisms. Various clinical and research settings require fast and reliable quantification of bacterial suspensions. The viable plate count method (VPC) is generally seen as 'the gold standard' for bacterial enumeration. However, VPC-based quantification of anaerobes such as Porphyromonas gingivalis is time-consuming due to their stringent growth requirements and shows poor repeatability. Comparison of VPC, turbidity measurement and TaqMan-based qPCR demonstrated that qPCR possesses important advantages regarding speed, accuracy and repeatability. © 2014 The Society for Applied Microbiology.
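Absolute qPCR quantification of the kind compared above typically runs through a standard curve of Cq versus log10 copy number; an illustrative sketch with invented calibration values (not data from this study):

```python
import numpy as np

# hypothetical standard curve: quantification cycle (Cq) of serial dilutions
log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cq = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0   # 1.0 corresponds to 100 % per cycle

def copies_from_cq(c):
    """Read an unknown sample's copy number off the standard curve."""
    return 10.0 ** ((c - intercept) / slope)
```

A slope near -3.32 corresponds to perfect doubling each cycle; deviations flag inhibition or a poorly optimized assay.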
NASA Astrophysics Data System (ADS)
van der Vat, Marnix; Schasfoort, Femke; van Rhee, Gigi; Wienhoven, Manfred; Polman, Nico; Delsman, Joost; van den Hoek, Paul; ter Maat, Judith; Mens, Marjolein
2016-04-01
It is widely acknowledged that drought management should move from a crisis to a risk-based approach. A risk-based approach to managing water resources requires a sound drought risk analysis, quantifying the probability and impacts of water shortage due to droughts. Impacts of droughts are for example crop yield losses, hydropower production losses, and water shortage for municipal and industrial use. Many studies analyse the balance between supply and demand, but there is little experience in translating this into economic metrics that can be used in a decision-making process on investments to reduce drought risk. We will present a drought risk analysis method for the Netherlands, with a focus on the underlying economic method to quantify the welfare effects of water shortage for different water users. Both the risk-based approach as well as the economic valuation of water shortage for various water users was explored in a study for the Dutch Government. First, an historic analysis of the effects of droughts on revenues and prices in agriculture as well as on shipping and nature was carried out. Second, a drought risk analysis method was developed that combines drought hazard and drought impact analysis in a probabilistic way for various sectors. This consists of a stepwise approach, from water availability through water shortage to economic impact, for a range of drought events with a certain return period. Finally, a local case study was conducted to test the applicability of the drought risk analysis method. Through the study, experience was gained in integrating hydrological and economic analyses, which is a prerequisite for drought risk analysis. Results indicate that the risk analysis method is promising and applicable for various sectors. However, it was also found that quantification of economic impacts from droughts is time-consuming, because location- and sector-specific data is needed, which is not always readily available.
Furthermore, for some sectors hydrological data was lacking to make a reliable estimate of drought return periods. By 2021, the Netherlands Government aims to agree on the water supply service levels, which should describe water availability and quality that can be delivered with a certain return period. The Netherlands' Ministry of Infrastructure and the Environment, representatives of the regional water boards and Rijkswaterstaat (operating the main water system) as well as several consultants and research institutes are important stakeholders for further development of the method, evaluation of cases and the development of a quantitative risk-informed decision-making tool.
Ermacora, Alessia; Hrnčiřík, Karel
2014-01-01
Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.
Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie
2013-09-06
Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in conventional MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.
Smith, Lucy; Becker, Ingrid; Schroeder, Wolfgang; Hoelscher, Arnulf H; Haneder, Stefan; Maintz, David; Spiro, Judith Eva
2018-01-01
Purpose Anastomotic leakage is a major surgical complication following esophagectomy and gastric pull-up. Specific risk factors such as celiac trunk (TC) stenosis and high calcification score of the aorta have been identified, but no data are available on their relative prognostic values. This retrospective study aimed to compare and evaluate calcification score versus stenosis quantification with regards to prognostic impact on anastomotic leakage. Patients and methods Preoperative contrast-enhanced computed tomography scans of 164 consecutive patients with primary esophageal cancer were evaluated by two radiologists to apply a calcification score (0–3 scale) assessing the aorta, the celiac axis and the right and left postceliac arteries. Concurrently, the presence and degree of stenosis of TC and superior mesenteric artery were recorded for stenosis quantification. Results Anastomotic leakage was noted in 14/164 patients and 12/14 showed stenosis of TC (n=11). The presence of TC stenosis was found to have a significant impact on anastomotic healing (p=0.004). The odds ratio for the prediction of anastomotic leakage by the degree of stenosis was 1.04 (95% CI, 1.02–1.07). Ten of 14 patients had aortic calcification scores of 1 or 2, but calcification scores of the aorta, the celiac axis and the right and left postceliac arteries did not correlate with the corresponding TC stenosis values and showed no influence on patient outcome as defined by the occurrence of anastomotic insufficiency (p=0.565, 0.855, 0.518 and 1.000, respectively). Inter-reader reliability of computed tomography analysis and absolute agreement on calcium scoring was mostly over 90%. No significant differences in preoperative comorbidities and patient characteristics were found between those with and without anastomotic leakage. 
Conclusion Measurement of TC stenosis in preoperative contrast-enhanced computed tomography scans proved to be more reliable than calcification scores in predicting anastomotic leakage and should, therefore, be used in the risk assessment of patients undergoing esophagectomy and gastric pull-up. PMID:29713180
MO-A-16A-01: QA Procedures and Metrics: In Search of QA Usability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sathiaseelan, V; Thomadsen, B
Radiation therapy has undergone considerable changes in the past two decades with a surge of new technology and treatment delivery methods. The complexity of radiation therapy treatments has increased and there has been increased awareness and publicity about the associated risks. In response, there has been a proliferation of guidelines for medical physicists to adopt to ensure that treatments are delivered safely. Task Group recommendations are copious, and clinical physicists' hours are longer, stretched to various degrees between site planning and management, IT support, physics QA, and treatment planning responsibilities. Radiation oncology has many quality control practices in place to ensure the delivery of high-quality, safe treatments. Incident reporting systems have been developed to collect statistics about near miss events at many radiation oncology centers. However, tools are lacking to assess the impact of these various control measures. A recent effort to address this shortcoming is the work of Ford et al (2012), who published a methodology enumerating quality control quantification for measuring the effectiveness of safety barriers. Over 4000 near-miss incidents reported from 2 academic radiation oncology clinics were analyzed using quality control quantification, and a profile of the most effective quality control measures (metrics) was identified. There is a critical need to identify a QA metric to help the busy clinical physicists to focus their limited time and resources most effectively in order to minimize or eliminate errors in the radiation treatment delivery processes.
In this symposium the usefulness of workflows and QA metrics to assure safe and high quality patient care will be explored. Two presentations will be given: Quality Metrics and Risk Management with High Risk Radiation Oncology Procedures; Strategies and metrics for quality management in the TG-100 Era. Learning Objectives: Provide an overview of, and the need for, QA usability metrics: different cultures/practices affecting the effectiveness of methods and metrics. Show examples of quality assurance workflows, such as statistical process control, that monitor the treatment planning and delivery process to identify errors. Learn to identify and prioritize risks and QA procedures in radiation oncology. Try to answer the questions: Can a quality assurance program aided by quality assurance metrics help minimize errors and ensure safe treatment delivery? Should such metrics be institution-specific?
Lundh, Torbjörn; Suh, Ga-Young; DiGiacomo, Phillip; Cheng, Christopher
2018-03-03
Vascular morphology characterization is useful for disease diagnosis, risk stratification, treatment planning, and prediction of treatment durability. To quantify the dynamic surface geometry of tubular-shaped anatomic structures, we propose a simple, rigorous Lagrangian cylindrical coordinate system to monitor well-defined surface points. Specifically, the proposed system enables quantification of surface curvature and cross-sectional eccentricity. Using idealized software phantom examples, we validate the method's ability to accurately quantify longitudinal and circumferential surface curvature, as well as eccentricity and orientation of eccentricity. We then apply the method to several medical imaging data sets of human vascular structures to exemplify the utility of this coordinate system for analyzing morphology and dynamic geometric changes in blood vessels throughout the body. Graphical abstract: Pointwise longitudinal curvature of a thoracic aortic endograft surface for systole and diastole, with their absolute difference.
Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments
USDA-ARS?s Scientific Manuscript database
Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...
In this study, a new analytical technique was developed for the identification and quantification of multi-functional compounds containing simultaneously at least one hydroxyl or one carboxylic group, or both. This technique is based on derivatizing first the carboxylic group(s) ...
Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students
ERIC Educational Resources Information Center
Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.
2014-01-01
A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…
Identification and Quantification Soil Redoximorphic Features by Digital Image Processing
USDA-ARS?s Scientific Manuscript database
Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...
USDA-ARS?s Scientific Manuscript database
Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive and time-consuming. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. The spe...
Quantification of micro stickies
Mahendra Doshi; Jeffrey Dyer; Salman Aziz; Kristine Jackson; Said M. Abubakr
1997-01-01
The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...
USDA-ARS?s Scientific Manuscript database
The pathogen causing corky root on lettuce, Sphingobium suberifaciens, is recalcitrant to standard epidemiological methods. Primers were selected from 16S rDNA sequences useful for the specific detection and quantification of S. suberifaciens. Conventional (PCR) and quantitative (qPCR) PCR protocols...
DOT National Transportation Integrated Search
2016-09-13
The Hampton Roads Climate Impact Quantification Initiative (HRCIQI) is a multi-part study sponsored by the U.S. Department of Transportation (DOT) Climate Change Center with the goals that include developing a cost tool that provides methods for volu...
Sánchez-Guijo, Alberto; Oji, Vinzenz; Hartmann, Michaela F; Traupe, Heiko; Wudy, Stefan A
2015-09-01
Steroids are primarily present in human fluids in their sulfated forms. Profiling of these compounds is important from both diagnostic and physiological points of view. Here, we present a novel method for the quantification of 11 intact steroid sulfates in human serum by LC-MS/MS. The compounds analyzed in our method, some of which are quantified for the first time in blood, include cholesterol sulfate, pregnenolone sulfate, 17-hydroxy-pregnenolone sulfate, 16-α-hydroxy-dehydroepiandrosterone sulfate, dehydroepiandrosterone sulfate, androstenediol sulfate, androsterone sulfate, epiandrosterone sulfate, testosterone sulfate, epitestosterone sulfate, and dihydrotestosterone sulfate. The assay was conceived to quantify sulfated steroids in a broad range of concentrations, requiring only 300 μl of serum. The method has been validated and its performance was studied at three quality controls, selected for each compound according to its physiological concentration. The assay showed good linearity (R(2) > 0.99) and recovery for all the compounds, with limits of quantification ranging between 1 and 80 ng/ml. Averaged intra-day and between-day precisions (coefficient of variation) and accuracies (relative errors) were below 10%. The method has been successfully applied to study the sulfated steroidome in diseases such as steroid sulfatase deficiency, proving its diagnostic value. This is, to our best knowledge, the most comprehensive method available for the quantification of sulfated steroids in human blood. Copyright © 2015 by the American Society for Biochemistry and Molecular Biology, Inc.
Ramírez, Juan Carlos; Cura, Carolina Inés; Moreira, Otacilio da Cruz; Lages-Silva, Eliane; Juiz, Natalia; Velázquez, Elsa; Ramírez, Juan David; Alberti, Anahí; Pavia, Paula; Flores-Chávez, María Delmans; Muñoz-Calderón, Arturo; Pérez-Morales, Deyanira; Santalla, José; Guedes, Paulo Marcos da Matta; Peneau, Julie; Marcet, Paula; Padilla, Carlos; Cruz-Robles, David; Valencia, Edward; Crisante, Gladys Elena; Greif, Gonzalo; Zulantay, Inés; Costales, Jaime Alfredo; Alvarez-Martínez, Miriam; Martínez, Norma Edith; Villarroel, Rodrigo; Villarroel, Sandro; Sánchez, Zunilda; Bisio, Margarita; Parrado, Rudy; Galvão, Lúcia Maria da Cunha; da Câmara, Antonia Cláudia Jácome; Espinoza, Bertha; de Noya, Belkisyole Alarcón; Puerta, Concepción; Riarte, Adelina; Diosque, Patricio; Sosa-Estani, Sergio; Guhl, Felipe; Ribeiro, Isabela; Aznar, Christine; Britto, Constança; Yadón, Zaida Estela; Schijman, Alejandro G.
2015-01-01
An international study was performed by 26 experienced PCR laboratories from 14 countries to assess the performance of duplex quantitative real-time PCR (qPCR) strategies on the basis of TaqMan probes for detection and quantification of parasitic loads in peripheral blood samples from Chagas disease patients. Two methods were studied: Satellite DNA (SatDNA) qPCR and kinetoplastid DNA (kDNA) qPCR. Both methods included an internal amplification control. Reportable range, analytical sensitivity, limits of detection and quantification, and precision were estimated according to international guidelines. In addition, inclusivity and exclusivity were estimated with DNA from stocks representing the different Trypanosoma cruzi discrete typing units and Trypanosoma rangeli and Leishmania spp. Both methods were challenged against 156 blood samples provided by the participant laboratories, including samples from acute and chronic patients with varied clinical findings, infected by oral route or vectorial transmission. kDNA qPCR showed better analytical sensitivity than SatDNA qPCR with limits of detection of 0.23 and 0.70 parasite equivalents/mL, respectively. Analyses of clinical samples revealed a high concordance in terms of sensitivity and parasitic loads determined by both SatDNA and kDNA qPCRs. This effort is a major step toward international validation of qPCR methods for the quantification of T. cruzi DNA in human blood samples, aiming to provide an accurate surrogate biomarker for diagnosis and treatment monitoring for patients with Chagas disease. PMID:26320872
Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi
2018-01-01
Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two kinds of independent techniques, i.e. ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed and fluorophore-labeled DNAzyme, we develop an integrated and facile method, in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to the advantage of two-in-one, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in only 100 cells. Moreover, the method is nondestructive. Cells after analysis could retain their physiological activity and could be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive method over some traditional techniques for the analysis of TMPs, which offers potential application as a toolbox in the future.
Gasperotti, Mattia; Masuero, Domenico; Guella, Graziano; Mattivi, Fulvio; Vrhovsek, Urska
2014-10-01
An increasing number of studies have concerned the profiling of polyphenol microbial metabolites, especially in urine or plasma, but only a few have regarded their accurate quantification. This study reports on a new ultra-performance liquid chromatography tandem mass spectrometry method with electrospray ionisation (UHPLC-ESI-MS/MS) using a simple clean-up step with solid phase extraction (SPE) and validation on different biological matrices. The method was tested with spiked samples of liver, heart, kidneys, brain, blood and urine. The purification procedure, after the evaluation of three different cartridges, makes it possible to obtain cleaner samples and better quantification of putative trace metabolites, especially related to dietary studies, with concentrations below ng/g in tissue and for urine and blood, starting from ng/ml. Limits of detection and linear range were also assessed using mixed polyphenol metabolite standards. Short chromatographic separation was carried out for 23 target compounds related to the polyphenol microbial metabolism, coupled with a triple quadrupole mass spectrometer for their accurate quantification. By analysing different spiked biological samples we were able to test metabolite detection in the matrix and validate the overall recovery of the method, from purification to quantification. The method developed can be successfully applied and is suitable for high-throughput targeted metabolomics analysis related to nutritional intervention, or the study of the metabolic mechanism in response to a polyphenol-rich diet. Copyright © 2014 Elsevier B.V. All rights reserved.
A New Method for Assessing How Sensitivity and Specificity of Linkage Studies Affects Estimation
Moore, Cecilia L.; Amin, Janaki; Gidding, Heather F.; Law, Matthew G.
2014-01-01
Background While the importance of record linkage is widely recognised, few studies have attempted to quantify how linkage errors may have impacted on their own findings and outcomes. Even where authors of linkage studies have attempted to estimate sensitivity and specificity based on subjects with known status, the effects of false negatives and positives on event rates and estimates of effect are not often described. Methods We present quantification of the effect of sensitivity and specificity of the linkage process on event rates and incidence, as well as the resultant effect on relative risks. Formulae to estimate the true number of events and estimated relative risk adjusted for given linkage sensitivity and specificity are then derived and applied to data from a prisoner mortality study. The implications of false positive and false negative matches are also discussed. Discussion Comparisons of the effect of sensitivity and specificity on incidence and relative risks indicate that it is more important for linkages to be highly specific than sensitive, particularly if true incidence rates are low. We would recommend that, where possible, some quantitative estimates of the sensitivity and specificity of the linkage process be performed, allowing the effect of these quantities on observed results to be assessed. PMID:25068293
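The abstract's formulae are not reproduced in the record, but a standard misclassification back-calculation built from the same ingredients (observed linked events = sens·T + (1 − spec)·(N − T), solved for the true count T) can be sketched as follows; this is an illustrative derivation under that assumed model, not necessarily the authors' exact formula:

```python
def true_events(observed, n, sensitivity, specificity):
    """Back-calculate the true number of events T from an observed (linked)
    count, assuming observed = sens*T + (1 - spec)*(N - T).

    Identifiability requires sensitivity + specificity > 1.
    """
    denom = sensitivity + specificity - 1.0
    if denom <= 0:
        raise ValueError("sensitivity + specificity must exceed 1")
    return (observed - (1.0 - specificity) * n) / denom

# Perfect linkage recovers the observed count unchanged
print(true_events(120, 10000, 1.0, 1.0))  # -> 120.0
# Even a 0.1% specificity loss adds ~10 false matches when N = 10,000,
# which is large relative to 120 true events: specificity dominates
print(true_events(120, 10000, 0.95, 0.999))
```

The second call illustrates the abstract's conclusion: with rare events, a small shortfall in specificity perturbs the estimated count far more than the same shortfall in sensitivity.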
Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay
Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming
2011-01-01
Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has a significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
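As background to the standard-curve shifts discussed above: in absolute qPCR quantification, unknowns are read off a Ct vs. log10(copies) standard curve, and amplification efficiency follows from its slope. A minimal sketch with hypothetical Ct values (a conformation-induced curve shift changes the intercept, and hence every back-calculated copy number):

```python
import numpy as np

# Hypothetical standard curve: Ct values for a 10-fold dilution series
log10_copies = np.array([7, 6, 5, 4, 3], dtype=float)
ct = np.array([14.1, 17.5, 20.8, 24.2, 27.6])  # illustrative only

slope, intercept = np.polyfit(log10_copies, ct, 1)
# Efficiency 1.0 means perfect doubling per cycle (slope of about -3.32)
efficiency = 10 ** (-1.0 / slope) - 1.0

def copies_from_ct(sample_ct):
    """Read an unknown off the standard curve: log10(copies) = (Ct - b)/m."""
    return 10 ** ((sample_ct - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"Ct 22.0 -> {copies_from_ct(22.0):.3g} copies")
```

Because the back-calculation exponentiates the curve parameters, even a sub-cycle shift of the intercept between supercoiled and linearized standards translates into a multiplicative bias in the reported copy number, consistent with the over-estimation the study describes.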
PET Quantification of the Norepinephrine Transporter in Human Brain with (S,S)-18F-FMeNER-D2.
Moriguchi, Sho; Kimura, Yasuyuki; Ichise, Masanori; Arakawa, Ryosuke; Takano, Harumasa; Seki, Chie; Ikoma, Yoko; Takahata, Keisuke; Nagashima, Tomohisa; Yamada, Makiko; Mimura, Masaru; Suhara, Tetsuya
2017-07-01
Norepinephrine transporter (NET) in the brain plays important roles in human cognition and the pathophysiology of psychiatric disorders. Two radioligands, (S,S)-11C-MRB and (S,S)-18F-FMeNER-D2, have been used for imaging NETs in the thalamus and midbrain (including locus coeruleus) using PET in humans. However, NET density in the equally important cerebral cortex has not been well quantified because of unfavorable kinetics with (S,S)-11C-MRB and defluorination with (S,S)-18F-FMeNER-D2, which can complicate NET quantification in the cerebral cortex adjacent to the skull containing defluorinated 18F radioactivity. In this study, we have established analysis methods for quantification of NET density in the brain including the cerebral cortex using (S,S)-18F-FMeNER-D2 PET. Methods: We analyzed our previous (S,S)-18F-FMeNER-D2 PET data of 10 healthy volunteers dynamically acquired for 240 min with arterial blood sampling. The effects of defluorination on the NET quantification in the superficial cerebral cortex were evaluated by establishing the time stability of NET density estimations with an arterial input 2-tissue-compartment model, which guided the less-invasive reference tissue model and area under the time-activity curve methods to accurately quantify NET density in all brain regions including the cerebral cortex. Results: Defluorination of (S,S)-18F-FMeNER-D2 became prominent toward the latter half of the 240-min scan. Total distribution volumes in the superficial cerebral cortex increased with scan durations beyond 120 min. We verified that 90-min dynamic scans provided a sufficient amount of data for quantification of NET density unaffected by defluorination. Reference tissue model binding potential values from the 90-min scan data and area under the time-activity curve ratios of 70- to 90-min data allowed for the accurate quantification of NET density in the cerebral cortex.
Conclusion: We have established methods for quantification of NET densities in the brain, including the cerebral cortex, unaffected by defluorination using (S,S)-18F-FMeNER-D2. These results suggest that we can accurately quantify NET density with a 90-min (S,S)-18F-FMeNER-D2 scan in broad brain areas. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
Jin, Chan; Guan, Jibin; Zhang, Dong; Li, Bing; Liu, Hongzhuo; He, Zhonggui
2017-10-01
We present a technique to rapidly determine taxanes in blood samples by supercritical fluid chromatography coupled with mass spectrometry. The aim of this study was to develop a supercritical fluid chromatography with mass spectrometry method for the analysis of paclitaxel, cabazitaxel, and docetaxel in whole-blood samples of rats. Liquid-dry matrix spot extraction was selected as the sample preparation procedure. Supercritical fluid chromatography separation of paclitaxel, cabazitaxel, docetaxel, and glyburide (internal standard) was accomplished within 3 min by using a gradient mobile phase consisting of methanol as the compensation solvent and carbon dioxide at a flow rate of 1.0 mL/min. The method was validated regarding specificity, the lower limit of quantification, repeatability and reproducibility of quantification, extraction recovery, and matrix effects. The lower limit of quantification was found to be 10 ng/mL since it exhibited acceptable precision and accuracy at the corresponding level. All interday accuracies and precisions were within the accepted criteria of ±15% of the nominal value and within ±20% at the lower limit of quantification, implying that the method was reliable and reproducible. In conclusion, this method is a promising tool to support and improve preclinical or clinical pharmacokinetic studies with the taxane anticancer drugs. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sjödin, Marcus O D; Wetterhall, Magnus; Kultima, Kim; Artemenko, Konstantin
2013-06-01
The analytical performance of three different strategies, iTRAQ (isobaric tag for relative and absolute quantification), dimethyl labeling (DML) and label free (LF), for relative protein quantification using shotgun proteomics has been evaluated. The methods were explored using samples containing (i) bovine proteins in known ratios and (ii) bovine proteins in known ratios spiked into Escherichia coli. The latter case mimics the actual conditions in a typical biological sample, with a few differentially expressed proteins and a bulk of proteins with unchanged ratios. Additionally, the evaluation was performed on both QStar and LTQ-FTICR mass spectrometers. LF LTQ-FTICR was found to have the highest proteome coverage, while the highest accuracy based on the artificially regulated proteins was found for DML LTQ-FTICR (54%). A varying linearity (k: 0.55-1.16, r²: 0.61-0.96) was shown for all methods within selected dynamic ranges. All methods were found to consistently underestimate bovine protein ratios when matrix proteins were added. However, LF LTQ-FTICR was more tolerant toward the compression effect. A single peptide was demonstrated to be sufficient for reliable quantification using iTRAQ. A ranking system utilizing several parameters important for quantitative proteomics demonstrated that the overall performance of the five different methods was: DML LTQ-FTICR > iTRAQ QStar > LF LTQ-FTICR > DML QStar > LF QStar. Copyright © 2013 Elsevier B.V. All rights reserved.
Pedersen, S N; Lindholst, C
1999-12-09
Extraction methods were developed for quantification of the xenoestrogens 4-tert.-octylphenol (tOP) and bisphenol A (BPA) in water and in liver and muscle tissue from the rainbow trout (Oncorhynchus mykiss). The extraction of tOP and BPA from tissue samples was carried out using microwave-assisted solvent extraction (MASE) followed by solid-phase extraction (SPE). Water samples were extracted using only SPE. For the quantification of tOP and BPA, liquid chromatography mass spectrometry (LC-MS) equipped with an atmospheric pressure chemical ionisation interface (APCI) was applied. The combined methods for tissue extraction allow the use of small sample amounts of liver or muscle (typically 1 g), low volumes of solvent (20 ml), and short extraction times (25 min). Limits of quantification of tOP in tissue samples were found to be approximately 10 ng/g in muscle and 50 ng/g in liver (both based on 1 g of fresh tissue). The corresponding values for BPA were approximately 50 ng/g in both muscle and liver tissue. In water, the limit of quantification for tOP and BPA was approximately 0.1 microg/l (based on 100 ml sample size).
Quantification of sterol lipids in plants by quadrupole time-of-flight mass spectrometry
Wewer, Vera; Dombrink, Isabel; vom Dorp, Katharina; Dörmann, Peter
2011-01-01
Glycerolipids, sphingolipids, and sterol lipids constitute the major lipid classes in plants. Sterol lipids are composed of free and conjugated sterols, i.e., sterol esters, sterol glycosides, and acylated sterol glycosides. Sterol lipids play crucial roles during adaptation to abiotic stresses and plant-pathogen interactions. Presently, no comprehensive method for sterol lipid quantification in plants is available. We used nanospray ionization quadrupole time-of-flight mass spectrometry (Q-TOF MS) to resolve and identify the molecular species of all four sterol lipid classes from Arabidopsis thaliana. Free sterols were derivatized with chlorobetainyl chloride. Sterol esters, sterol glycosides, and acylated sterol glycosides were ionized as ammonium adducts. Quantification of molecular species was achieved in the positive mode after fragmentation in the presence of internal standards. The amounts of sterol lipids quantified by Q-TOF MS/MS were validated by comparison with results obtained with TLC/GC. Quantification of sterol lipids from leaves and roots of phosphate-deprived A. thaliana plants revealed changes in the amounts and molecular species composition. The Q-TOF method is far more sensitive than GC or HPLC. Therefore, Q-TOF MS/MS provides a comprehensive strategy for sterol lipid quantification that can be adapted to other tandem mass spectrometers. PMID:21382968
NASA Astrophysics Data System (ADS)
Bau, Haim; Liu, Changchun; Killawala, Chitvan; Sadik, Mohamed; Mauk, Michael
2014-11-01
Real-time amplification and quantification of specific nucleic acid sequences plays a major role in many medical and biotechnological applications. In the case of infectious diseases, quantification of the pathogen load in patient specimens is critical to assessing disease progression, effectiveness of drug therapy, and emergence of drug resistance. Typically, nucleic acid quantification requires sophisticated and expensive instruments, such as real-time PCR machines, which are not appropriate for on-site use and for low-resource settings. We describe a simple, low-cost, reaction-diffusion based method for end-point quantification of target nucleic acids undergoing enzymatic amplification. The number of target molecules is inferred from the position of the reaction-diffusion front, analogous to reading temperature in a mercury thermometer. We model the process with the Fisher-Kolmogoroff-Petrovskii-Piscounoff (FKPP) equation and compare theoretical predictions with experimental observations. The proposed method is suitable for nucleic acid quantification at the point of care, compatible with multiplexing and high-throughput processing, and can function instrument-free. C.L. was supported by NIH/NIAID K25AI099160; M.S. was supported by the Pennsylvania Ben Franklin Technology Development Authority; C.K. and H.B. were funded, in part, by NIH/NIAID 1R41AI104418-01A1.
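The front-position readout described in this abstract rests on the FKPP property that a pulled front advances at speed 2*sqrt(D*r). A minimal numerical sketch of that behavior (all parameters below are illustrative placeholders, not the authors' device or assay values) is:

```python
import numpy as np

# 1-D FKPP sketch: u_t = D u_xx + r u (1 - u). The front advances at roughly
# 2*sqrt(D*r), which is what makes front position usable as a readout.
# D, r, grid, and initial condition are arbitrary illustration values.
D, r = 1.0, 1.0
dx, dt = 0.5, 0.05
n = 400
x = np.arange(n) * dx
u = np.zeros(n)
u[:20] = 1.0  # initial bolus of amplified product at the inlet

def front_position(u, x, level=0.5):
    """Position where u crosses `level`, by linear interpolation."""
    i = np.argmax(u < level)  # first grid point below the level
    return x[i - 1] + dx * (u[i - 1] - level) / (u[i - 1] - u[i])

positions = []
for step in range(1200):  # integrate to t = 60
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    lap[0] = lap[-1] = 0.0  # crude no-flux ends
    u = u + dt * (D * lap + r * u * (1 - u))
    if step + 1 in (600, 1000):  # t = 30 and t = 50
        positions.append(front_position(u, x))

speed = (positions[1] - positions[0]) / 20.0
print(round(speed, 2))  # close to the FKPP value 2*sqrt(D*r) = 2
```

With the front speed known, an end-point front position maps back to the reaction start time, and hence to the initial target copy number.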
Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.
Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P
2016-07-01
Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid challenges with obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the use of the Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied for analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m⁻³, respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
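The standard-addition idea used here to sidestep "zero" air can be sketched in a few lines: spike known amounts into the already-polluted matrix, fit a line, and read the slope as the calibration factor and the x-intercept as the background level. All numbers below are made-up illustration values, not data from this study:

```python
import numpy as np

# Hypothetical standard-addition series for one analyte (e.g. benzene):
# detector response vs. amount added to the polluted lab-air matrix.
added = np.array([0.0, 20.0, 40.0, 80.0])               # ug/m3 added
peak_area = np.array([1500.0, 2480.0, 3540.0, 5490.0])  # GC-MS responses

slope, intercept = np.polyfit(added, peak_area, 1)
background = intercept / slope  # concentration already present in the matrix
print(round(slope, 1), round(background, 1))
```

The fitted slope can then be applied as an external-standard slope factor to field samples, which is the core of the calibration shortcut described above.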
Shinozuka, Hiroshi; Forster, John W
2016-01-01
Background. Multiplexed sequencing is commonly performed on massively parallel short-read sequencing platforms such as Illumina, and the efficiency of library normalisation can affect the quality of the output dataset. Although several library normalisation approaches have been established, none are ideal for highly multiplexed sequencing due to issues of cost and/or processing time. Methods. An inexpensive and high-throughput library quantification method has been developed, based on an adaptation of the melting curve assay. Sequencing libraries were subjected to the assay using the Bio-Rad Laboratories CFX Connect™ Real-Time PCR Detection System. The library quantity was calculated through summation of the reduction of relative fluorescence units between 86 and 95 °C. Results. PCR-enriched sequencing libraries are suitable for this quantification without pre-purification of DNA. Short DNA molecules, which ideally should be eliminated from the library for subsequent processing, were differentiated from the target DNA in a mixture on the basis of differences in melting temperature. Quantification results for long sequences targeted using the melting curve assay were correlated with those from existing methods (R² > 0.77), and with those observed from MiSeq sequencing (R² = 0.82). Discussion. The results of multiplexed sequencing suggested that the normalisation performance of the described method is equivalent to that of another recently reported high-throughput bead-based method, BeNUS. However, costs for the melting curve assay are considerably lower and processing times shorter than those of other existing methods, suggesting greater suitability for highly multiplexed sequencing applications.
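The quantity metric described above (summing the drop in relative fluorescence units between 86 and 95 °C) can be sketched on synthetic data. The logistic melt-curve shape, melting temperature, and amounts below are made-up illustration values, not assay data:

```python
import numpy as np

def melt_quantity(temps, rfu, lo=86.0, hi=95.0):
    """Total RFU reduction between lo and hi (only decreases are counted)."""
    drops = 0.0
    for t0, t1, f0, f1 in zip(temps, temps[1:], rfu, rfu[1:]):
        if lo <= t0 and t1 <= hi and f1 < f0:
            drops += f0 - f1
    return drops

temps = np.arange(80.0, 96.0, 0.5)

def synthetic_curve(amount, tm=90.0, width=1.5):
    # library melts around tm; plateau height scales with library amount
    return amount / (1.0 + np.exp((temps - tm) / width))

low, high = synthetic_curve(100.0), synthetic_curve(300.0)
print(melt_quantity(temps, low) < melt_quantity(temps, high))  # True
```

Because the fluorescence signal scales with the amount of double-stranded library, the summed RFU drop scales with it too, which is what makes the metric usable for normalisation.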
Vadas, P A; Good, L W; Moore, P A; Widman, N
2009-01-01
Nonpoint-source pollution of fresh waters by P is a concern because it contributes to accelerated eutrophication. Given the state of the science concerning agricultural P transport, a simple tool to quantify annual, field-scale P loss is a realistic goal. We developed new methods to predict annual dissolved P loss in runoff from surface-applied manures and fertilizers and validated the methods with data from 21 published field studies. We incorporated these manure and fertilizer P runoff loss methods into an annual, field-scale P loss quantification tool that estimates dissolved and particulate P loss in runoff from soil, manure, fertilizer, and eroded sediment. We validated the P loss tool using independent data from 28 studies that monitored P loss in runoff from a variety of agricultural land uses for at least 1 yr. Results demonstrated (i) that our new methods to estimate P loss from surface manure and fertilizer are an improvement over methods used in existing Indexes, and (ii) that it was possible to reliably quantify annual dissolved, sediment, and total P loss in runoff using relatively simple methods and readily available inputs. Thus, a P loss quantification tool that does not require greater degrees of complexity or input data than existing P Indexes could accurately predict P loss across a variety of management and fertilization practices, soil types, climates, and geographic locations. However, estimates of runoff and erosion are still needed that are accurate to a level appropriate for the intended use of the quantification tool.
Youn, S Y; Jeong, O M; Choi, B K; Jung, S C; Kang, M S
2017-02-01
Raw chicken products are major causes of human foodborne salmonellosis worldwide. In particular, there is a significant risk of human exposure to Salmonella originating from the chicken slaughtering process. Controlling the contamination of chicken carcasses by Salmonella has been a considerable challenge in chicken-slaughtering facilities and involves routine microbiological monitoring using reliable detection methods. Simple and rapid detection methods, particularly those capable of determining cell viability, will significantly facilitate routine monitoring of Salmonella. Here, we report an invA-based loop-mediated isothermal amplification method coupled with a simple propidium monoazide treatment (PMA-LAMP) for simple and rapid detection and quantification of viable Salmonella in rinse water of chicken carcasses. In this study, PMA-LAMP consistently gave negative results for isopropanol-killed Salmonella at concentrations up to 8.0 × 10⁶ CFU/reaction. The detection limit of PMA-LAMP was 8.0 × 10¹ CFU/reaction with viable Salmonella in both pure culture and rinse water of chicken carcasses, 10-fold lower than that of a conventional polymerase chain reaction coupled with PMA (PMA-PCR) targeting invA. There was a high correlation (R² = 0.99 to 0.976) between LAMP time threshold (TT) values and viable Salmonella over a quantification range of 1.0 × 10³ to 1.0 × 10⁸ CFU/mL in pure culture and rinse water of chicken carcasses. The PMA-LAMP assay took less than 2 h to detect Salmonella contamination in test samples. Therefore, this simple and rapid method will be a very useful tool to detect live Salmonella contamination of chicken carcasses without pre-enrichment at the slaughterhouse, where sanitizing treatments are commonly used. © 2016 Poultry Science Association Inc.
Zhao, Pengfei; Zhao, Jing; Lei, Shuo; Guo, Xingjie; Zhao, Longshan
2018-08-01
A rapid and sensitive multi-residue method was developed for the simultaneous quantification of eight chiral pesticides (diniconazole, metalaxyl, paclobutrazol, epoxiconazole, myclobutanil, hexaconazole, napropamide and isocarbophos) at enantiomeric levels in environmental soils and sediments using chiral liquid chromatography-tandem mass spectrometry based on a combined pretreatment of matrix solid-phase dispersion and dispersive liquid-liquid microextraction (MSPD-DLLME). Under optimized conditions, 0.1 g of solid sample was dispersed with 0.4 g of C18-bonded silica sorbent, and 3 mL of methanol was used for eluting the analytes. The collected eluant was dried and then further purified by DLLME with 550 μL of dichloromethane and 960 μL of acetonitrile as extraction and disperser solvent, respectively. The established method was validated and found to be linear, precise, and accurate over the concentration range of 2-500 ng g⁻¹ for epoxiconazole, paclobutrazol and metalaxyl and 4-500 ng g⁻¹ for isocarbophos, hexaconazole, myclobutanil, diniconazole and napropamide. Recoveries of the sixteen enantiomers varied from 87.0 to 104.1% and the relative standard deviations (RSD) were less than 10.1%. Method detection and quantification limits (MDLs and MQLs) varied from 0.22 to 1.54 ng g⁻¹ and from 0.91 to 4.00 ng g⁻¹, respectively. Finally, the method was successfully applied to analyze the enantiomeric composition of the eight chiral pesticides in environmental solid matrices, which will help better understand the behavior of individual enantiomers and enable accurate risk assessment of the ecosystem. Copyright © 2018 Elsevier Ltd. All rights reserved.
Krystek, Petra; Bäuerlein, Patrick S; Kooij, Pascal J F
2015-03-15
For pharmaceutical applications, the use of inorganic engineered nanoparticles is of growing interest, with silver (Ag) and gold (Au) being the most relevant elements. A few methods were developed recently, but their validation and application testing were quite limited. Therefore, a routinely suitable multi-element method for the identification of nanoparticles of different sizes below 100 nm and elemental composition was developed by applying asymmetric flow field-flow fractionation (AF4) - inductively coupled plasma mass spectrometry (ICPMS). A complete validation model for the quantification of releasable, pharmaceutically relevant inorganic nanoparticles based on Ag and Au is presented for the most relevant aqueous matrices of tap water and domestic waste water. The samples originated from locations in the Netherlands, and it is of great interest to study the unwanted presence of Ag and Au as nanoparticle residues due to possible health and environmental risks. During method development, instability effects were observed for 60 nm and 70 nm Ag ENPs with different capping agents. These effects were studied more closely in relation to matrix effects. Besides the methodological aspects, the obtained analytical results and relevant performance characteristics (e.g., measuring range, limit of detection, repeatability, reproducibility, trueness, and expanded uncertainty of measurement) are determined and discussed. For the chosen aqueous matrices, the results of the performance characteristics are significantly better for Au ENPs than for Ag ENPs; e.g., repeatability and reproducibility are below 10% for all Au ENPs, whereas repeatability reached 27% for larger Ag ENPs. The method is a promising tool for the simultaneous determination of releasable, pharmaceutically relevant inorganic nanoparticles. Copyright © 2014 Elsevier B.V. All rights reserved.
Spainhour, John Christian G; Janech, Michael G; Schwacke, John H; Velez, Juan Carlos Q; Ramakrishnan, Viswanathan
2014-01-01
Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry coupled with stable isotope standards (SIS) has been used to quantify native peptides. This peptide quantification by MALDI-TOF approach has difficulties quantifying samples containing peptides with ion currents in overlapping spectra. In these overlapping spectra the currents sum together, which modifies the peak heights and makes normal SIS estimation problematic. An approach using Gaussian mixtures based on known physical constants to model the isotopic cluster of a known compound is proposed here. The characteristics of this approach are examined for single and overlapping compounds. The approach is compared to two commonly used SIS quantification methods for single compounds, namely the peak intensity method and the Riemann sum area under the curve (AUC) method. For studying the characteristics of the Gaussian mixture method, Angiotensin II, Angiotensin-2-10, and Angiotensin-1-9 and their associated SIS peptides were used. The findings suggest that the Gaussian mixture method has characteristics similar to the two comparison methods for estimating the quantity of isolated isotopic clusters of single compounds. All three methods were tested using MALDI-TOF mass spectra collected for peptides of the renin-angiotensin system. The Gaussian mixture method accurately estimated the native-to-labeled ratio of several isolated angiotensin peptides (5.2% error in ratio estimation), with estimation errors similar to those calculated using the peak intensity and Riemann sum AUC methods (5.9% and 7.7%, respectively). For overlapping angiotensin peptides (where the other two methods are not applicable), the estimation error of the Gaussian mixture was 6.8%, which is within the acceptable range.
In summary, for single compounds the Gaussian mixture method is equivalent or marginally superior compared to the existing methods of peptide quantification and is capable of quantifying overlapping (convolved) peptides within the acceptable margin of error.
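The core of the mixture idea above can be sketched as follows: model each isotopic cluster as a sum of Gaussians at known positions and widths, then recover the native:labeled amplitude ratio of two overlapping clusters by linear least squares. The masses, widths, and abundances below are illustrative placeholders, not real angiotensin isotope spacings:

```python
import numpy as np

def gaussian(mz, center, sigma):
    return np.exp(-0.5 * ((mz - center) / sigma) ** 2)

mz = np.linspace(1040.0, 1060.0, 2000)
sigma = 0.08
native_centers = [1046.5, 1047.5, 1048.5]   # isotopic cluster, 1 Da spacing
labeled_centers = [1049.5, 1050.5, 1051.5]  # SIS cluster overlapping the tail
iso_weights = [1.0, 0.55, 0.2]              # rough relative isotopic abundances

def cluster(centers, amplitude):
    return amplitude * sum(w * gaussian(mz, c, sigma)
                           for w, c in zip(iso_weights, centers))

# Synthetic overlapping spectrum with a known 3:6 native-to-labeled ratio.
true_native, true_labeled = 3.0, 6.0
spectrum = cluster(native_centers, true_native) + cluster(labeled_centers, true_labeled)

# Design matrix: one unit-amplitude template per compound; solve for amplitudes.
A = np.column_stack([cluster(native_centers, 1.0), cluster(labeled_centers, 1.0)])
amps, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
print(round(amps[0] / amps[1], 3))  # recovers the 3:6 = 0.5 ratio
```

Because the templates are fixed by known physics (positions and widths), the fit stays linear even when the clusters overlap, which is exactly where peak-intensity and Riemann-sum AUC methods break down.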
Accuracy Profiles for Analyzing Residual Solvents in Textiles by GC-MS.
Bao, Qibei; Fu, Kejie; Ren, Qingqing; Zhong, Yingying; Qian, Dan
2017-10-01
Excess residual solvents (RSs) in clothes or other textiles could be toxic and pose risks to both humans and the environment. N,N-Dimethylformamide (DMF), N,N-dimethylacetamide (DMAc) and 1-methyl-2-pyrrolidinone (NMP) are important chemicals frequently used as solvents in the textile industry. Several organizations have proposed limiting DMF, DMAc and NMP in textiles, but an appropriate detection method has not been proposed. A sensitive GC-MS method for the quantification of DMF, DMAc and NMP in textiles was developed. After extraction with ethyl acetate, these RSs were separated on a DB-5MS capillary column. The oven temperature was increased from 50°C (held for 0.5 min) at 10°C/min to 120°C (held for 1 min). The method was fully validated according to the accuracy profile procedure, which is based on β-expectation tolerance intervals for the total measurement bias. Linearity was observed in the range of 0.5-10 mg/L for the solvents with limit of quantification values of 4.2, 3.5 and 2.5 mg/kg for DMF, DMAc and NMP, respectively. The repeatability and intermediate precision were <5.34% and 7.95% for DMF, 5.37% and 9.68% for DMAc, and 2.68% and 5.85% for NMP. The recoveries of DMF, DMAc and NMP were 91.2-106.3%, 89.5-97.7% and 85.6-101.3%, respectively. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Educational Challenges in Toxicology.
ERIC Educational Resources Information Center
Dixon, Robert L.
1984-01-01
Issues and topics related to educational challenges in toxicology at all levels are discussed. They include public awareness and understanding, general approach to toxicology, quality structure-activity relationships, epidemiological studies, quantification of risk, and the types of toxicants studied. (JN)
An Analysis of Department of Energy Cost Proposal Process and Effectiveness
2011-10-11
...processes to mitigate and manage risk, rather than derive upfront assessment and quantification of proposal risk (DoE, 2008a). The proposal... 2. GM 2 - Enhance the Federal Contract and Project Management Workforce: Substantially Complete. 3. GM 3 - Improve Project Risk Assessment... Keywords: proposal, contract proposal evaluation, risk, cost analysis.
Nitric Oxide Analyzer Quantification of Plant S-Nitrosothiols.
Hussain, Adil; Yun, Byung-Wook; Loake, Gary J
2018-01-01
Nitric oxide (NO) is a small diatomic molecule that regulates multiple physiological processes in animals, plants, and microorganisms. In animals, it is involved in vasodilation and neurotransmission and is present in exhaled breath. In plants, it regulates both plant immune function and numerous developmental programs. The high reactivity and short half-life of NO and the cross-reactivity of its various derivatives make its quantification difficult. Different methods based on colorimetric, fluorometric, and chemiluminescent detection of NO and its derivatives are available, but all of them have significant limitations. Here we describe a method for the chemiluminescence-based quantification of NO in plants using ozone-chemiluminescence technology. This approach provides a sensitive, robust, and flexible means of determining the levels of NO and its signaling products, protein S-nitrosothiols.
NASA Astrophysics Data System (ADS)
Swinburne, Thomas D.; Perez, Danny
2018-05-01
A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.
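The residence-time estimate at the heart of this abstract can be illustrated with a toy rate matrix: with off-diagonal entries holding transition rates between known states and the diagonal holding the negative total outflow (including escape to unexplored states), the expected residence time vector t solves -Q t = 1. The rates below are arbitrary illustration values, not the paper's defect kinetics:

```python
import numpy as np

# Two known states exchanging at rates 2.0 and 3.0, each with a small rate of
# escaping the explored state space (illustrative numbers only).
k_escape = np.array([0.10, 0.05])
Q = np.array([[-(2.0 + k_escape[0]), 2.0],
              [3.0, -(3.0 + k_escape[1])]])

# Expected residence time in the known state space, per starting state:
t = np.linalg.solve(-Q, np.ones(2))
print(np.round(t, 2))
```

Sanity check on the output: exchange (2.0, 3.0) is much faster than escape, so both times should be near 1 over the stationary-averaged escape rate, 1 / (0.6*0.10 + 0.4*0.05) = 12.5, and that is roughly what the solve returns.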
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.
Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of: Sensitivity: Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs? Uncertainty: What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? (Quantification of margins and uncertainty, QMU.) Optimization: What parameter values yield the best performing design or operating condition, given constraints? Calibration: What models and/or parameters best match experimental data? In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
Reliability and safety, and the risk of construction damage in mining areas
NASA Astrophysics Data System (ADS)
Skrzypczak, Izabela; Kogut, Janusz P.; Kokoszka, Wanda; Oleniacz, Grzegorz
2018-04-01
This article concerns the reliability and safety of building structures in mining areas, with a particular emphasis on quantitative risk analysis of buildings. The issues of threat assessment and risk estimation in the design of facilities in mining exploitation areas are presented, indicating the difficulties and ambiguities associated with their quantification and quantitative analysis. The article presents a concept for quantitative risk assessment of the impact of mining exploitation, in accordance with ISO 13824 [1]. The risk analysis is illustrated through the example of a building located within an area affected by mining exploitation.
Durtschi, Jacob D; Stevenson, Jeffery; Hymas, Weston; Voelkerding, Karl V
2007-02-01
Real-time PCR data analysis for quantification has been the subject of many studies aimed at the identification of new and improved quantification methods. Several analysis methods have been proposed as superior alternatives to the common variations of the threshold crossing method. Notably, sigmoidal and exponential curve fit methods have been proposed. However, these studies have primarily analyzed real-time PCR with intercalating dyes such as SYBR Green. Clinical real-time PCR assays, in contrast, often employ fluorescent probes whose real-time amplification fluorescence curves differ from those of intercalating dyes. In the current study, we compared four analysis methods related to recent literature: two versions of the threshold crossing method, a second derivative maximum method, and a sigmoidal curve fit method. These methods were applied to a clinically relevant real-time human herpes virus type 6 (HHV6) PCR assay that used a minor groove binding (MGB) Eclipse hybridization probe as well as an Epstein-Barr virus (EBV) PCR assay that used an MGB Pleiades hybridization probe. We found that the crossing threshold method yielded more precise results when analyzing the HHV6 assay, which was characterized by lower signal/noise and less developed amplification curve plateaus. In contrast, the EBV assay, characterized by greater signal/noise and amplification curves with plateau regions similar to those observed with intercalating dyes, gave results with statistically similar precision by all four analysis methods.
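Two of the analysis methods compared in this abstract can be sketched side by side on a synthetic sigmoidal amplification curve: threshold crossing (with linear interpolation between cycles) and the second-derivative maximum. The logistic parameters below are illustrative, not values from the HHV6 or EBV assays:

```python
import numpy as np

# Synthetic real-time PCR amplification curve (logistic; illustrative values).
cycles = np.arange(1, 46, dtype=float)
f_max, c_half, slope = 1000.0, 28.0, 1.8
fluor = f_max / (1.0 + np.exp(-(cycles - c_half) / slope))

# Method 1: threshold crossing, refined by linear interpolation between the
# cycles bracketing the threshold.
thr = 100.0
i = np.argmax(fluor > thr)
ct_threshold = cycles[i - 1] + (thr - fluor[i - 1]) / (fluor[i] - fluor[i - 1])

# Method 2: second-derivative maximum, the cycle of peak curvature.
d2 = np.gradient(np.gradient(fluor, cycles), cycles)
ct_sdm = cycles[np.argmax(d2)]

print(round(ct_threshold, 1), ct_sdm)
```

The second-derivative maximum needs no user-chosen threshold, which is one reason it is attractive for curves with poorly developed plateaus; the threshold method, by contrast, depends on where the threshold is placed relative to the noise floor.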
Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q.; Ducote, Justin L.; Su, Min-Ying; Molloi, Sabee
2013-01-01
Purpose: Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. Methods: T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left–right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. Finally, the breast densities measured with the three methods were compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of the bias field. Results: The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left–right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left–right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, the Pearson's r increased from 0.86 to 0.92 with the bias field correction.
Conclusions: The investigated CLIC method significantly increased the precision and accuracy of breast density quantification using breast MRI images by effectively correcting the bias field. It is expected that a fully automated computerized algorithm for breast density quantification may have great potential in clinical MRI applications. PMID:24320536
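The FCM step used in both pipelines above alternates membership and centroid updates. A minimal sketch on toy one-dimensional intensities (no bias-field model, and the intensity values are made up for illustration) is:

```python
import numpy as np

def fcm(x, c=2, m=2.0, iters=50):
    """Minimal fuzzy c-means on a 1-D intensity vector x."""
    # deterministic initialization from data percentiles
    centers = np.percentile(x, np.linspace(10, 90, c))
    for _ in range(iters):
        # distances of every sample to every center (n x c), guarded from 0
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12
        # membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
        u = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)), axis=2)
        # center update: membership-weighted means
        centers = (u ** m).T @ x / (u ** m).sum(axis=0)
    return centers, u

# two well-separated "tissue" intensity groups (fat-like vs. gland-like)
x = np.concatenate([np.full(50, 10.0), np.full(50, 100.0)])
centers, u = fcm(x)
print(np.sort(np.round(centers, 1)))  # two centers near 10 and 100
```

A bias field multiplies these intensities by a smoothly varying factor, which drags samples of the same tissue toward different centers; that is the failure mode the CLIC correction is designed to remove before (or during) clustering.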
Jakubowska, Natalia; Beldì, Giorgia; Peychès Bach, Aurélie; Simoneau, Catherine
2014-01-01
This paper presents the outcome of the development, optimisation and validation at European Union level of an analytical method for using poly(2,6-diphenyl phenylene oxide, PPPO), which is stipulated in Regulation (EU) No. 10/2011 as food simulant E, for testing specific migration from plastics into dry foodstuffs. Two methods for fortifying, respectively, PPPO and a low-density polyethylene (LDPE) film with surrogate substances that are relevant to food contact were developed. A protocol for cleaning the PPPO and an efficient analytical method were developed for the quantification of butylhydroxytoluene (BHT), benzophenone (BP), diisobutylphthalate (DiBP), bis(2-ethylhexyl) adipate (DEHA) and 1,2-cyclohexanedicarboxylic acid, diisononyl ester (DINCH) from PPPO. A protocol for a migration test from plastics using small migration cells was also developed. The method was validated by an inter-laboratory comparison (ILC) with 16 national reference laboratories for food contact materials in the European Union. This allowed, for the first time, data to be obtained on the precision and laboratory performance of both migration and quantification. The results showed that the validation ILC was successful, even when taking into account the complexity of the exercise. The method performance was 7-9% repeatability standard deviation (rSD) for most substances (regardless of concentration), with 12% rSD for the high level of BHT and for DiBP at very low levels. The reproducibility standard deviation results for the 16 European Union laboratories were in the range of 20-30% for the quantification from PPPO (for the three levels of concentrations of the five substances) and 15-40% for migration experiments from the fortified plastic at 60°C for 10 days and subsequent quantification.
Considering the lack of data previously available in the literature, this work has demonstrated that the validation of a method is possible both for migration from a film and for quantification into a corresponding simulant for specific migration.
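The repeatability and reproducibility standard deviations reported in such inter-laboratory comparisons are conventionally obtained from a one-way analysis of variance over the laboratories. The sketch below follows the ISO 5725-2 style decomposition under a simplifying balanced-design assumption (equal replicates per laboratory); the function name and data layout are illustrative, not the study's actual protocol.

```python
import numpy as np

def precision_estimates(results):
    """Estimate relative repeatability and reproducibility standard
    deviations (%) from ILC data, where `results` is a list of
    per-laboratory replicate measurements (balanced design assumed)."""
    labs = [np.asarray(r, dtype=float) for r in results]
    n = np.mean([len(r) for r in labs])          # replicates per laboratory
    lab_means = np.array([r.mean() for r in labs])
    grand_mean = np.concatenate(labs).mean()
    # Within-laboratory (repeatability) variance: pooled replicate variance
    s_r2 = np.mean([r.var(ddof=1) for r in labs])
    # Between-laboratory variance component (truncated at zero)
    s_L2 = max(lab_means.var(ddof=1) - s_r2 / n, 0.0)
    s_r = np.sqrt(s_r2)                          # repeatability SD
    s_R = np.sqrt(s_r2 + s_L2)                   # reproducibility SD
    return 100 * s_r / grand_mean, 100 * s_R / grand_mean
```

Reproducibility always includes the within-laboratory scatter, which is why the reported reproducibility ranges (20-40%) exceed the repeatability figures (7-12%).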
Fiorini, Dennis; Boarelli, Maria Chiara; Gabbianelli, Rosita; Ballini, Roberto; Pacetti, Deborah
2016-09-01
This study sought to develop and validate a quantitative method to analyze short-chain free fatty acids (SCFAs) in rat feces by solid-phase microextraction and gas chromatography (SPME-GC), using a mixture of ammonium sulfate and sodium dihydrogen phosphate as the salting-out agent. Conditioning and extraction time, linearity, limits of detection and quantification, repeatability, and recovery were evaluated. The proposed method allows quantification with improved sensitivity compared with other SPME-GC methods. It has been applied to rat fecal samples, quantifying acetic, propionic, isobutyric, butyric, isopentanoic, pentanoic, and hexanoic acids.
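Limits of detection and quantification of this kind are commonly derived from the residual scatter of a calibration line. The abstract does not state which criterion the authors applied, so the sketch below uses the widely cited ICH-style formulas LOD = 3.3·s/slope and LOQ = 10·s/slope as one plausible approach (function name assumed).

```python
import numpy as np

def lod_loq(concentrations, responses):
    """Limits of detection and quantification from a linear calibration,
    using LOD = 3.3*s/slope and LOQ = 10*s/slope, where s is the
    residual standard deviation of the regression."""
    x = np.asarray(concentrations, dtype=float)
    y = np.asarray(responses, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)       # least-squares line
    residuals = y - (slope * x + intercept)
    s = residuals.std(ddof=2)                    # n - 2 degrees of freedom
    return 3.3 * s / slope, 10.0 * s / slope
```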
Automated quantification of myocardial perfusion SPECT using simplified normal limits.
Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido
2005-01-01
To simplify the development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new quantification method (TPD) was compared with our previously developed quantification system and with visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 ± 0.02) was higher than by visual scoring (0.83 ± 0.03) (P = .039) or standard quantification (0.82 ± 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 ± 0.02) than for standard quantification (0.85 ± 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85% for visual scoring; and 80% and 73% for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.
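A TPD-style score compares each polar-map sample with its normal-population statistics and accumulates the severity of hypoperfusion over the map. The following is only a minimal sketch of that idea: the z-score threshold, the normalisation, and the function names are assumptions for illustration, not the published algorithm.

```python
import numpy as np

def total_perfusion_deficit(polar_map, normal_mean, normal_sd, z_thresh=3.0):
    """Illustrative TPD-like score: per-sample standardised deviation
    below the normal mean, clipped at a threshold, summed and scaled
    to a percentage of the polar map."""
    z = (normal_mean - polar_map) / normal_sd    # positive where counts are low
    deficit = np.clip(z - z_thresh, 0, None)     # severity beyond threshold
    return 100.0 * deficit.sum() / (deficit.size * z_thresh)
```

A scan matching the normal database scores zero; scores grow with both the extent and the depth of the perfusion defect, which is what lets a single number replace separate extent/severity measures.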
Sieslack, Anne K; Dziallas, Peter; Nolte, Ingo; Wefstaedt, Patrick; Hungerbühler, Stephan O
2014-10-12
Right ventricular (RV) volume and function are important diagnostic and prognostic factors in dogs with primary or secondary right-sided heart failure. The complex shape of the right ventricle and its retrosternal position make quantification of its volume difficult; for that reason, only few studies deal with the determination of RV volume parameters. In human medicine, cardiac magnetic resonance imaging (CMRI) is considered the reference technique for RV volumetric measurement (Nat Rev Cardiol 7(10):551-563, 2010), but cardiac computed tomography (CCT) and three-dimensional echocardiography (3DE) are other non-invasive methods feasible for RV volume quantification. The purpose of this study was to compare 3DE and CCT with CMRI, the gold standard for RV volumetric quantification. 3DE showed significantly lower, and CCT significantly higher, right ventricular volumes than CMRI. Both techniques correlated very well (R > 0.8) with CMRI for the volumetric parameters end-diastolic volume (EDV) and end-systolic volume (ESV). Ejection fraction (EF) and stroke volume (SV) did not differ between CCT and CMRI, whereas 3DE showed a significantly higher EF and a significantly lower SV than CMRI. The 3DE values showed excellent intra-observer variability (<3%) and still acceptable inter-observer variability (<13%). CCT provides accurate images of the right ventricle with results comparable to the reference method CMRI; however, it overestimates the RV volumes and is therefore not an interchangeable method, with the additional disadvantage of requiring general anaesthesia. 3DE underestimated the RV volumes, which could be explained by its lower image resolution. The excellent correlation between the methods nevertheless indicates a close relationship between 3DE and CMRI, although the values are not directly comparable. 3DE is thus a promising technique for RV volumetric quantification, but further studies in awake dogs and in dogs with heart disease are necessary to evaluate its usefulness in veterinary cardiology.
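The volumetric indices compared across the three modalities are linked by simple identities: SV = EDV - ESV and EF = SV/EDV. The helper below is only a worked illustration of those definitions; the numbers are made up, not the study's data.

```python
def rv_function(edv_ml, esv_ml):
    """Stroke volume (mL) and ejection fraction (%) from end-diastolic
    and end-systolic volumes: SV = EDV - ESV, EF = 100 * SV / EDV."""
    sv = edv_ml - esv_ml
    ef = 100.0 * sv / edv_ml
    return sv, ef
```

Because EF is a ratio, a modality that underestimates both volumes can still report a higher EF if ESV is underestimated proportionally more than EDV, which is consistent with the 3DE pattern of lower volumes but higher EF reported above.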
Liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide better overall quantification accuracy and reproducibility than other LC-MS/MS techniques. However, large-scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capacity of traditional iTRAQ-based quantification.
Rodigast, M.; Mutzel, A.; Iinuma, Y.; Haferkorn, S.; Herrmann, H.
2015-01-01
Carbonyl compounds are ubiquitous in the atmosphere; they are either emitted primarily from anthropogenic and biogenic sources or produced secondarily from the oxidation of volatile organic compounds (VOCs). Despite a number of studies on the quantification of carbonyl compounds, a comprehensive description of optimised methods for atmospherically relevant carbonyl compounds is scarce. A method was therefore systematically characterised and improved to quantify such compounds. With the present method, quantification can be carried out for any carbonyl compound sampled in the aqueous phase, regardless of its source. The optimisation was conducted for seven atmospherically relevant carbonyl compounds: acrolein, benzaldehyde, glyoxal, methyl glyoxal, methacrolein, methyl vinyl ketone and 2,3-butanedione. O-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride (PFBHA) was used as the derivatisation reagent, and the oximes formed were detected by gas chromatography/mass spectrometry (GC/MS). The main advantage of the improved method is its low detection limit, in the range of 0.01 to 0.17 μmol L-1 depending on the carbonyl compound. The best results were obtained with extraction in dichloromethane for 30 min, followed by derivatisation with 0.43 mg mL-1 PFBHA for 24 h at a pH value of 3. The optimised method was evaluated by the OH radical-initiated oxidation of 3-methylbutanone in the aqueous phase; methyl glyoxal and 2,3-butanedione were identified as oxidation products, with yields of 2% for methyl glyoxal and 14% for 2,3-butanedione.
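The product yields quoted for methyl glyoxal and 2,3-butanedione follow from the usual molar-yield definition: moles of product formed per mole of precursor consumed, expressed as a percentage. A minimal illustration (function name assumed):

```python
def molar_yield(product_formed_umol, precursor_consumed_umol):
    """Molar product yield (%) of an oxidation experiment:
    moles of product formed per mole of precursor consumed."""
    return 100.0 * product_formed_umol / precursor_consumed_umol
```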
Bradshaw, Elizabeth J; Hume, Patria A
2012-09-01
Targeted injury prevention strategies, based on biomechanical analyses, have the potential to help reduce the incidence and severity of gymnastics injuries. This review outlines the potential benefits of biomechanics research to contribute to injury prevention strategies for women's artistic gymnastics by identification of mechanisms of injury and quantification of the effects of injury risk factors. One hundred and twenty-three articles were retained for review after searching electronic databases using key words, including 'gymnastic', 'biomech*', and 'inj*', and delimiting by language and relevance to the paper aim. Impact load can be measured biomechanically by the use of instrumented equipment (e.g. beatboard), instrumentation on the gymnast (accelerometers), or by landings on force plates. We need further information on injury mechanisms and risk factors in gymnastics and practical methods of monitoring training loads. We have not yet shown, beyond a theoretical approach, how biomechanical analysis of gymnastics can help reduce injury risk through injury prevention interventions. Given the high magnitude of impact load, both acute and accumulative, coaches should monitor impact loads per training session, taking into consideration training quality and quantity such as the control of rotation and the height from which the landings are executed.