Analysis of the literature on climate impacts in Germany: an overall picture with gaps
NASA Astrophysics Data System (ADS)
Fleischhauer, Mark; Greiving, Stefan; Lindner, Christian; Lückenkötter, Johannes; Schauser, Inke
This chapter presents the results of a comprehensive literature review of climate impacts relevant to Germany, carried out within the project "Netzwerk Vulnerabilität" (Vulnerability Network). It shows that an aggregated overall picture of climate impacts for Germany, as a basis for impact assessments and adaptation planning, cannot yet be drawn, because a wide range of approaches to assessing vulnerabilities or climate change impacts exists and the currently available climate impact and vulnerability studies are highly heterogeneous. As a first step, the chapter therefore provides a synopsis of the approaches available so far.
Mahdavi, Pardis
2010-11-01
This paper focuses on the perceived racialisation and resultant spatialisation of commercial sex in Dubai. In recent years, the sex industry in Dubai has grown to include women from the Middle East, Eastern Europe, East Asia and Africa. With the increase in sex workers of different nationalities has come a form of localised racism that is embedded in structures and desires seen within specific locations. The physical spatialisation of sex work hinges on perceived race and produces distinct income generating potential for women engaged in the sex industry in Dubai. The social and physical topography of Dubai is important in marginalising or privileging these various groups of sex workers, which correlates race, space and place with rights and assistance. I begin with a description of the multidirectional flows of causality between race, space, place and demand. I then discuss how these various groups are inversely spatialised within the discourse on assistance, protection and rights. The findings presented here are based on ethnographic research conducted with transnational migrants in the UAE in 2004, 2008 and 2009.
Wade, Nicholas J
2011-01-01
Pictorial images are icons as well as eye-cons: they provide distillations of objects or ideas into simpler shapes. They create the impression of representing that which cannot be presented. Even at the level of the photograph, the links between icon and object are tenuous. The dimensions of depth and motion are missing from icons, and these alone introduce all manner of potential ambiguities. The history of art can be considered as exploring the missing link between icon and object. Eye-cons can also be illusions—tricks of vision so that what is seen does not necessarily correspond to what is physically presented. Pictorial images can be spatialised or stylised; spatialised images generally share some of the projective characteristics of the object represented. Written words are also icons, but they do not resemble the objects they represent—they are stylised or conventional. Icons as stylised words and spatialised images were set in delightful opposition by René Magritte in a series of pipe paintings, and this theme is here alluded to. Most of visual science is now concerned with icons—two-dimensional displays on computer monitors. Is vision now the science of eye-cons? PMID:23145240
Radionavigation systems : a capabilities investment strategy.
DOT National Transportation Integrated Search
2004-01-01
The Final Report of the President's Commission on Critical Infrastructure Protection concluded that Global Positioning System (GPS) services and applications are susceptible to various types of interference, and that the effects of these vulnerabilit...
Human Performance in Continuous Operations: Volume 1. Human Performance Guidelines
1979-12-01
seemingly capable of enduring unusual degrees of stress and fatigue when necessary. When a soldier is driven by the motivation to survive, by exemplary... Vulnerabilities to degradation: Artillery... Armor Task Vulnerability: the pattern of vulnerabilities for tank platoon personnel (Figure 8.2...
Exploring Potential ADS-B Vulnerabilities in the FAA's NextGen Air Transportation System
2011-06-01
an online database of aircraft and the camera is used to filter the data, based on field of view, for display on the phone's screen. The developers... December 28, 2009. [Online]. abcnews.go.com [5] P. Dempsey and L. Gesell, Air Transportation: Foundations for the 21st Century. Arizona: Coast Aire
Making Educational Spaces through Boundary Work: Territorialisation and "Boundarying"
ERIC Educational Resources Information Center
Seddon, Terri
2014-01-01
Globalising processes are shifting the established nation-building project of twentieth-century national education systems. This historic axis between education and territorialised state power is being re-spatialised and remade as a globally networked, lifelong learning educational order. Political sociology of education theorises these de- and…
ERIC Educational Resources Information Center
Ryan, Mary; Bourke, Terri
2018-01-01
Pre-service teacher educators, both nationally and internationally, must negotiate a plethora of expectations including using Professional Standards to enhance teacher quality. In Australia, the recent Teacher Education Ministerial Advisory Group (TEMAG) report highlighted weak application of Standards in Initial Teacher Education (ITE). However,…
Spaces of Possibility in Pre-Service Teacher Education
ERIC Educational Resources Information Center
Ryan, Mary
2011-01-01
Pre-service teacher education is a spatialised enterprise. It operates across a number of spaces that may or may not be linked ideologically and/or physically. These spaces can include daily practices, locations, infrastructure, relationships and representations of power and ideology. The interrelationships between and within these (sometimes…
After the Marketplace: Evidence, Social Science and Educational Research
ERIC Educational Resources Information Center
Luke, Allan
2003-01-01
This paper is an essay on the state of Australian education that frames new directions for educational research. It outlines three challenges faced by Australian educators: highly spatialised poverty with particularly strong mediating effects on primary school education; the need for intellectual and critical depth in pedagogy, with a focus in the…
ERIC Educational Resources Information Center
Schulz, Samantha
2017-01-01
Distinct from rurality, the Australian desert has long functioned as a signifier of remoteness in the dominant imagination; a product of spatialised binary relations between "progressive" (white) mainstream or idealised white countryside, and disordered/dangerous Aboriginal periphery. Remoteness constitutes a complex racial dynamic that…
Resilience and risk management in the urban project
NASA Astrophysics Data System (ADS)
Veyron, Jessica; Becue, Vincent; El Meouche, Rani; Barroca, Bruno
2017-04-01
Our study focuses on the use of the resilience concept in an urban environment, coupled with risk-management concerns. Urban resilience is open to interpretation depending on the field of expertise and the context of use, and it brings out underlying concepts of time and space. The urban environment, for its part, is complex because of the multitude of elements that make it up in a dense area, in human, structural, material and functional terms. The polysemy of resilience resurfaces within risk management and is confronted with the operational expectations it arouses in this field. Between absorption, recovery, adaptability and effectiveness, the level of requirement of an urban area can vary widely from one system to another and from one risk to another. The capability of urban systems to integrate the scope of outstanding risks could be reinforced through the use of spatialisation tools. Increasingly widespread access to this knowledge makes it easier to change spatial and temporal scales in the urban-project approach, along with the sharing of citizen and territorial actions and innovations. Our work positions itself between resilience and urban planning. Based on a literature review and on feedback from urban planning confronted with risks, we intend to study how urban planning is carried out for vulnerable districts or buildings. Our aim is to characterise urban resilience in the risk field through the setting up of a spatialisation tool.
2014-04-01
of the working group covered: • infrared (IR) modelling of military vessels; • on-board surveillance and tracking systems... systems for monitoring and managing the IR signature of vessels; and • the specific problems of infrared modelling in the environments... The most tangible [outcome] was the planning, execution and analysis of the SQUIRREL trial. Measurements
SOCIAL STABILITY AND HIV RISK BEHAVIOR: EVALUATING THE ROLE OF ACCUMULATED VULNERABILITY
German, Danielle; Latkin, Carl A.
2011-01-01
This study evaluated a cumulative and syndemic relationship among commonly co-occurring vulnerabilities (homelessness, incarceration, low income, residential transition) in association with HIV-related risk behaviors among 635 low-income women in Baltimore. Analysis included descriptive statistics, logistic regression, latent class analysis and latent class regression. Both methods of assessing multidimensional instability showed significant associations with risk indicators. Risk of multiple partners, sex exchange and drug use decreased significantly with each additional domain. Higher stability class membership (77%) was associated with decreased likelihood of multiple partners, exchange partners, recent drug use and recent STI. Multidimensional social vulnerabilities were cumulatively and synergistically linked to HIV risk behavior. Independent instability measures may miss important contextual determinants of risk. Social stability offers a useful framework for understanding the synergy of social vulnerabilities that shape sexual risk behavior. Social policies and programs aiming to enhance housing and overall social stability are likely to be beneficial for HIV prevention. PMID:21259043
The influence of air-conditioning on street temperatures in the city of Paris
NASA Astrophysics Data System (ADS)
de Munck, C. S.; Pigeon, G.; Masson, V.; Marchadier, C.; Meunier, F.; Tréméac, B.; Merchat, M.
2010-12-01
A consequence of urban heat islands in summer is the increased use of air-conditioning during extreme heat events: air-conditioning systems, while cooling the inside of buildings, release waste heat (as latent and sensible heat) into the lower urban atmosphere, potentially increasing street air temperatures where the heat is released. This may locally raise street air temperatures further, increasing the cooling demand while lowering the efficiency of air-conditioning units. A coupled model consisting of a meso-scale meteorological model (MESO-NH) and an urban energy balance model (TEB), implemented with an air-conditioning module, was used in combination with real spatialised datasets to understand and quantify potential temperature increases due to air-conditioning heat releases in the city of Paris. First, the types of air-conditioning systems currently co-existing in the city (underground chilled-water network, wet cooling towers and individual air-conditioning units) were simulated to study the effects of latent and sensible heat releases on street temperatures. Then, two scenarios were tested to characterise the impacts of likely future trends in air-conditioning equipment: a first scenario in which current heat releases were converted to sensible heat, and a second based on 2030s projections of air-conditioning equipment at the scale of the city. All scenarios showed an increase in street temperature which, as expected, was greater at night than during the day. For the first two scenarios, the increase was localised at or near the sources of air-conditioner heat releases, while the 2030s scenario affected wider zones of the city. The amplitude of the temperature increase ranged from 0.25°C to 1°C for the current state of air-conditioning, from 0.25°C to 2°C for the sensible-heat-only scenario, and from 0.25°C to 2°C for the 2030s scenario, with local impacts of up to 3°C. Overall, these results demonstrate the extent to which the use of air-conditioning can increase street temperatures in the city of Paris, and the importance of a spatialised approach.
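The box-model intuition behind such heat-release impacts can be sketched numerically. The toy calculation below (not from the study; MESO-NH/TEB resolve the full physics) treats a street canyon as a ventilated box and asks how much a given sensible waste-heat flux warms it. All parameter values are invented for illustration.

```python
# Toy box model of street-canyon warming from air-conditioning waste heat.
# All parameters are illustrative assumptions, not values from the study.

RHO_AIR = 1.2    # air density, kg m^-3
CP_AIR = 1005.0  # specific heat of air, J kg^-1 K^-1

def canyon_warming(q_waste_w, canyon_volume_m3, air_changes_per_hour):
    """Steady-state temperature increase of a ventilated street canyon
    receiving a sensible waste-heat flux q_waste_w (W)."""
    mass_flow = RHO_AIR * canyon_volume_m3 * air_changes_per_hour / 3600.0  # kg s^-1
    return q_waste_w / (mass_flow * CP_AIR)  # K

# Example: 500 kW released into a 100 m x 20 m x 20 m canyon with 10 air changes/h
dt = canyon_warming(5e5, 100 * 20 * 20, 10.0)
print(f"steady-state warming: {dt:.2f} K")
```

Weaker ventilation (fewer air changes per hour) raises the steady-state warming proportionally, which is one intuition for why night-time impacts are larger.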
NASA Astrophysics Data System (ADS)
Chartin, Caroline; Krüger, Inken; Goidts, Esther; Carnol, Monique; van Wesemael, Bas
2017-04-01
The quantification and spatialisation of reliable SOC stocks (Mg C ha⁻¹), total-stock (Tg C) baselines and their associated uncertainties are fundamental to detecting gains or losses in SOC and to locating sensitive areas with low SOC levels. Here we aim to quantify and spatialise SOC stocks at the regional scale (southern Belgium) based on data from a sampling scheme that was neither design-based nor model-based. To this end, we developed a computation procedure based on Digital Soil Mapping techniques and stochastic (Monte Carlo) simulations, allowing the estimation of multiple (here 10,000) independent spatialised datasets. The computation of the prediction uncertainty accounts for the errors associated with both (i) the estimation of the SOC stock at the pixel-related area scale and (ii) the parameters of the spatial model. Based on these 10,000 realisations, median SOC stocks and 90% prediction intervals were computed for each pixel, as well as total SOC stocks and their 90% prediction intervals for selected sub-areas and for the entire study area. A Generalised Additive Model (GAM) explaining 69.3% of the SOC stock variance was calibrated and then validated (R² = 0.64). The model overestimated low SOC stocks (below 50 Mg C ha⁻¹) and underestimated high SOC stocks (especially those above 100 Mg C ha⁻¹). A positive gradient of SOC stocks runs from the northwest to the centre of Wallonia, with a slight decrease in the southernmost part, correlating with the evolution of precipitation and temperature (along with elevation) and dominant land use. At the catchment scale, higher SOC stocks were predicted in valley bottoms, especially for poorly drained soils under grassland. Mean predicted SOC stocks for cropland and grassland in Wallonia were 26.58 Tg C (SD 1.52) and 43.30 Tg C (SD 2.93), respectively. The procedure developed here allowed us to predict realistic spatial patterns of SOC stocks across the agricultural lands of southern Belgium and to produce reliable statistics of total SOC stocks for each of the 20 combinations of land use and agricultural region in Wallonia. It appears useful for producing soil maps as policy tools for sustainable management at regional and national scales, and for computing statistics that comply with the specific requirements of reporting activities.
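A stochastic procedure of this kind can be sketched in a few lines. The example below is a simplified stand-in for the study's workflow: it draws many realisations of hypothetical per-pixel SOC predictions (the real errors come from a calibrated GAM and a spatial model, here replaced by assumed Gaussian errors) and derives per-pixel medians, 90% prediction intervals, and the interval on the total stock.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-pixel SOC predictions (Mg C ha^-1) and standard errors;
# in the study these come from a calibrated GAM, here they are invented.
pred = np.array([45.0, 60.0, 85.0, 110.0])
se = np.array([5.0, 6.0, 9.0, 12.0])
area_ha = np.array([100.0, 100.0, 100.0, 100.0])  # pixel-related areas

n_sim = 10_000  # number of stochastic realisations, as in the paper
sims = rng.normal(pred, se, size=(n_sim, pred.size))  # one SOC map per row

# Per-pixel median and 90% prediction interval (5th to 95th percentile)
median = np.median(sims, axis=0)
lo, hi = np.percentile(sims, [5, 95], axis=0)

# Total stock (Tg C) per realisation, then its 90% interval
totals_tg = (sims * area_ha).sum(axis=1) / 1e6  # Mg -> Tg
t_lo, t_hi = np.percentile(totals_tg, [5, 95])
print(median, (t_lo, t_hi))
```

Because every realisation is a complete map, any aggregate (sub-area totals, land-use totals) inherits a prediction interval for free, which is the appeal of the simulation approach.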
After the clinic? Researching sexual health technology in context.
Davis, Mark
2015-01-01
There is great interest in what testing, pharmaceutical, information and social media technology can do for sexual health. Much programmatic and research activity is focused on assessing how these technologies can be used to best effect. Less obvious are analyses that place technology into historical, political and real-world settings. Developing an 'in-context' analysis of sexual health technology, this paper draws on interviews with leading community advocates, researchers and clinicians in Australia, Canada and the UK and looks across examples, including social media, rapid HIV testing, pre-Exposure Prophylaxis for HIV and polymerase chain reaction Chlamydia testing. The analysis is framed by studies of techno-society and the dialectics of sex-affirmative advocacy with biomedical authority and attends to: the rationalistic and affective dimensions of the imaginary associated with technology; the role of technology in the re-spatialisation and re-temporalisation of the sexual health clinic; and the re-invention of technology in its real-world contexts. This in-context approach is important for: the effective implementation of new technology; strengthening the social science contribution to the field; and enriching social theory in general on life in techno-societies.
Early stress experiences and disease vulnerability
Entringer, Sonja; Buss, Claudia; Heim, Christine
2016-01-01
Background: The steadily growing research field of the "early programming of health and disease" investigates the extent to which an individual's vulnerability to the development of a wide range of disorders across the lifespan is already shaped during early development. Aims: This review explains the concept of early programming of disease vulnerability and summarises findings on the consequences of early-childhood traumatisation and prenatal stress exposure. It also discusses biological mechanisms that mediate the increased disease risk after stress experienced early in life. The possibility of transgenerational transmission of early-childhood experiences to the next generation, and the mechanisms underlying this transmission, are also presented. Conclusion: The evidence linking early-life stress to the development of mental and physical disorders across the lifespan is growing steadily. The mechanisms are currently being investigated further, down to the molecular-biological and epigenetic level. This opens up entirely new perspectives that could considerably improve the precision of clinical diagnostics and the success of interventions. At present, however, there is still a considerable lack of translation between these research findings and their application in clinical care. PMID:27604117
Risk analysis based on hazards interactions
NASA Astrophysics Data System (ADS)
Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost
2017-04-01
Despite an increasing need for open, transparent and credible multi-hazard risk assessment methods, models and tools, the availability of comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, serves as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).
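The underestimation caused by neglecting hazard interactions can be illustrated with a deliberately simple example. The sketch below (all numbers invented, not RASOR output) compares a static flood-loss estimate with one in which subsidence has effectively deepened the flooding.

```python
# Illustrative comparison of single-hazard vs interaction-aware flood risk
# for a subsiding coastal area. All numbers are invented for illustration.

def flood_depth(surge_m, ground_subsidence_m):
    # Subsidence effectively increases water depth relative to the ground.
    return max(0.0, surge_m + ground_subsidence_m)

def damage_fraction(depth_m):
    # Simple saturating depth-damage curve (assumed shape).
    return min(1.0, 0.4 * depth_m)

surge = 1.0           # design storm surge, m
subsidence = 0.5      # cumulative subsidence since the last assessment, m
exposure = 2_000_000  # exposed asset value

loss_static = damage_fraction(flood_depth(surge, 0.0)) * exposure
loss_coupled = damage_fraction(flood_depth(surge, subsidence)) * exposure
print(loss_static, loss_coupled)  # ignoring subsidence underestimates the loss
```

Even this two-line interaction (one hazard modifying the intensity of another) changes the loss estimate substantially, which is the core argument for analysing hazards jointly rather than in isolation.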
Soil properties, soil functions and soil security
NASA Astrophysics Data System (ADS)
Poggio, Laura; Gimona, Alessandro
2017-04-01
Soil plays a crucial role in ecosystem functioning, such as food production and the capture and storage of water, carbon and nutrients, and in the realisation of a number of UN Sustainable Development Goals. In this work we present an approach to spatially and jointly assess the multiple contributions of soil to the delivery of ecosystem services within multiple land-use systems. We focussed on modelling the impact of soil on sediment retention, carbon storage, the storing and filtering of nutrients, habitat for soil organisms and water regulation, taking into account examples of land-use and climate scenarios. Simplified models were used for the single components. Spatialised Bayesian Belief Networks were used for the joint assessment and mapping of soil contributions to multiple land uses and ecosystem services. We integrated continuous 3D soil information derived from digital soil mapping approaches covering the whole of mainland Scotland, excluding the Northern Isles. Uncertainty was accounted for and propagated across the whole process. The Scottish test case highlights the differences in roles between mineral and organic soils and provides an example of an integrated study assessing the contributions of soil. The results show the importance of a multi-functional analysis of the contribution of soils to ecosystem service delivery and the UN SDGs.
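A spatialised Bayesian Belief Network can be pictured as an ordinary discrete BBN evaluated at every pixel of the input maps. The minimal sketch below uses an invented two-parent conditional probability table; it illustrates the idea, not the networks used in the study.

```python
# Minimal per-pixel Bayesian Belief Network sketch: two parents (soil class,
# land use) drive one service node (carbon storage class). The conditional
# probability table below is invented for illustration.

CPT_HIGH_C = {  # P(carbon storage = high | soil, land_use)
    ("organic", "woodland"): 0.9,
    ("organic", "arable"):   0.6,
    ("mineral", "woodland"): 0.5,
    ("mineral", "arable"):   0.2,
}

def p_high_carbon(p_soil_organic, p_woodland):
    """Marginalise the CPT over uncertain soil and land-use inputs."""
    p = 0.0
    for soil, p_s in [("organic", p_soil_organic), ("mineral", 1 - p_soil_organic)]:
        for lu, p_l in [("woodland", p_woodland), ("arable", 1 - p_woodland)]:
            p += p_s * p_l * CPT_HIGH_C[(soil, lu)]
    return p

# "Spatialised": the same network is evaluated at every pixel of the inputs.
soil_map = [0.8, 0.1]     # per-pixel P(soil = organic)
landuse_map = [0.7, 0.3]  # per-pixel P(land use = woodland)
carbon_map = [p_high_carbon(s, l) for s, l in zip(soil_map, landuse_map)]
print(carbon_map)
```

Because the inputs are probabilities rather than hard classes, uncertainty in the soil and land-use maps propagates directly into the output map, which is one reason BBNs suit this kind of joint assessment.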
Dole, Marjorie; Hoen, Michel; Meunier, Fanny
2012-06-01
Developmental dyslexia is associated with impaired speech-in-noise perception. The goal of the present research was to further characterize this deficit in dyslexic adults. In order to specify the mechanisms and processing strategies used by adults with dyslexia during speech-in-noise perception, we explored the influence of background type, presenting single target words against backgrounds made of cocktail-party sounds, modulated speech-derived noise or stationary noise. We also evaluated the effect of three listening configurations differing in the amount of spatial processing required: in a monaural condition, signal and noise were presented to the same ear; in a dichotic situation, target and concurrent sound were presented to two different ears; and in a spatialised configuration, target and competing signals were presented as if they originated from slightly differing positions in the auditory scene. Our results confirm the presence of a speech-in-noise perception deficit in dyslexic adults, in particular when the competing signal is also speech and when both signals are presented to the same ear, an observation potentially relating to phonological accounts of dyslexia. However, dyslexic adults demonstrated greater spatial release from masking than normal-reading controls when the background was speech, suggesting that they are well able to rely on denoising strategies based on spatial auditory scene analysis. Copyright © 2012 Elsevier Ltd. All rights reserved.
Assessment of spatial distribution of fallout radionuclides through geostatistics concept.
Mabit, L; Bernard, C
2007-01-01
After introducing the geostatistics concept and its utility in environmental science, especially in fallout radionuclide (FRN) spatialisation, a case study of cesium-137 (¹³⁷Cs) redistribution at the field scale using geostatistics is presented. On a Canadian agricultural field, geostatistics coupled with a Geographic Information System (GIS) was used to test three interpolation techniques [Ordinary Kriging (OK) and Inverse Distance Weighting with powers one (IDW1) and two (IDW2)] to create a ¹³⁷Cs map and establish a radioisotope budget. Following the optimization of variographic parameters, an experimental semivariogram was developed to determine the spatial dependence of ¹³⁷Cs. It was fitted with a spherical isotropic model with a range of 30 m and a very small nugget effect. This ¹³⁷Cs semivariogram showed good autocorrelation (R² = 0.91) and was well structured ('nugget-to-sill' ratio of 4%). It also revealed that the sampling strategy was adequate to capture the spatial correlation of ¹³⁷Cs. The spatial redistribution of ¹³⁷Cs was estimated by Ordinary Kriging and IDW to produce contour maps. A radioisotope budget was established for the 2.16 ha agricultural field under investigation. It was estimated that around 2 × 10⁷ Bq of ¹³⁷Cs (around 30% of the total initial fallout) was missing, exported by physical processes (runoff and erosion) from the area under investigation. The cross-validation analysis showed that, for spatially structured data, OK is a better interpolation method than IDW1 or IDW2 for the assessment of potential radioactive contamination and/or pollution.
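The IDW estimators compared in the study are simple enough to sketch directly. The snippet below implements inverse distance weighting for a handful of invented ¹³⁷Cs sample points; raising the power from one to two gives more influence to the nearest samples, which is the only difference between IDW1 and IDW2.

```python
import math

# Inverse Distance Weighting (IDW) sketch for point measurements of a 137Cs
# inventory (Bq m^-2). Sample coordinates and values are invented.

samples = [((0.0, 0.0), 2400.0), ((30.0, 0.0), 1800.0), ((0.0, 30.0), 2100.0)]

def idw(x, y, power=2):
    num = den = 0.0
    for (sx, sy), v in samples:
        d = math.hypot(x - sx, y - sy)
        if d == 0:
            return v  # exact hit on a sample point
        w = 1.0 / d**power
        num += w * v
        den += w
    return num / den

est_p1 = idw(10.0, 10.0, power=1)  # IDW1
est_p2 = idw(10.0, 10.0, power=2)  # IDW2
print(est_p1, est_p2)
```

Unlike kriging, IDW ignores the spatial-dependence structure captured by the semivariogram (range, nugget, sill), which is why cross-validation favoured OK for these spatially structured data.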
NASA Astrophysics Data System (ADS)
Cerretelli, Stefania; Poggio, Laura; Gimona, Alessandro; Peressotti, Alessandro; Black, Helaina
2017-04-01
Land and soil degradation are widespread, especially in dry and developing countries such as Ethiopia. Land degradation leads to the degradation of ecosystem services (ESS), because it causes the depletion and loss of several soil functions. Ethiopia's farmland faces intense degradation due to deforestation, agricultural land expansion, land overexploitation and overgrazing. In this study we modelled the impact of physical factors on ESS degradation, in particular soil erodibility, carbon storage and nutrient retention, in the Ethiopian Great Rift Valley, northwest of Hawassa. We used models of sediment retention/loss and nutrient retention/loss (from the InVEST software suite) and of carbon storage. To run the models we coupled local soil data (such as soil organic carbon and soil texture) with remote sensing data as input in the parametrization phase, e.g. to derive a land-use map and to calculate above-ground and below-ground carbon, the evapotranspiration coefficient and the capacity of vegetation to retain nutrients. We then used spatialised Bayesian Belief Networks (sBBNs) to predict ecosystem service degradation on the basis of the results of the three mechanistic models. The results show (i) the importance of mapping ESS degradation while taking into consideration spatial heterogeneity and the cross-correlations between impacts, (ii) the fundamental role of remote sensing data in monitoring and modelling in remote, data-poor areas and (iii) the important role of spatial BBNs in providing spatially explicit measures of risk and uncertainty. This approach could help decision makers identify priority areas for intervention in order to reduce land and ecosystem service degradation.
Remediation of spatial processing disorder (SPD).
Graydon, Kelley; Van Dun, Bram; Tomlin, Dani; Dowell, Richard; Rance, Gary
2018-05-01
To determine the efficacy of deficit-specific remediation for spatial processing disorder (SPD), quantify the effects of remediation on functional listening, and determine whether remediation is maintained. Participants had SPD diagnosed using the Listening in Spatialised Noise-Sentences (LiSN-S) test. The LiSN & Learn software was provided as auditory training. Post-training, repeat LiSN-S testing was conducted. Questionnaires pre- and post-training acted as subjective measures of remediation. A late-outcome assessment established the long-term effects of remediation. Sixteen children aged between 6;3 [years;months] and 10;0 completed between 20 and 146 training games. Post-training LiSN-S performance improved in measures containing spatial cues (p ≤ 0.001): by 2.0 SDs (3.6 dB) for DV90, 1.8 SDs (3.2 dB) for SV90, 1.4 SDs (2.9 dB) for spatial advantage and 1.6 SDs (3.3 dB) for total advantage. Improvement was also found in the DV0 condition (1.4 dB, or 0.5 SDs). Post-training changes were not significant for the talker advantage measure (1.0 dB, or 0.4 SDs) or the SV0 condition (0.3 dB, or 0.1 SDs). The late-outcome assessment demonstrated that improvement was maintained, and subjective improvement post-remediation was observed using the parent questionnaire. Children with SPD showed an improved ability to utilise spatial cues following deficit-specific remediation, with the parent questionnaire sensitive to the remediation. The effects of the remediation also appear to be sustained.
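The derived scores behind these results are simple differences of speech reception thresholds (SRTs). The sketch below illustrates the arithmetic with invented SRTs and an assumed normative SD; it mirrors the structure of a LiSN-S "advantage" measure, not the actual test data.

```python
# Sketch of a LiSN-S-style derived score. The "spatial advantage" is the
# benefit in speech reception threshold (SRT) when distracters move from
# 0 degrees to +/-90 degrees. All SRTs and the normative SD are invented.

NORM_SD_DB = 1.8  # assumed normative standard deviation for the derived score

def spatial_advantage(srt_colocated_db, srt_separated_db):
    # SRTs are negative; a larger positive difference = more benefit from cues
    return srt_colocated_db - srt_separated_db

pre = spatial_advantage(-8.0, -14.0)   # 6.0 dB advantage before training
post = spatial_advantage(-8.2, -17.1)  # 8.9 dB advantage after training
improvement_db = post - pre
improvement_sd = improvement_db / NORM_SD_DB
print(improvement_db, improvement_sd)
```

Expressing the change in normative SDs, as the abstract does, makes improvements on differently scaled measures comparable.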
Benmansour, M; Mabit, L; Nouira, A; Moussadek, R; Bouksirate, H; Duchemin, M; Benkdad, A
2013-01-01
In Morocco, land degradation, mainly caused by soil erosion, is one of the most serious agro-environmental threats. However, only limited data are available on the actual magnitude of soil erosion. The study site was an agricultural field located in Marchouch (6°42' W, 33°47' N), 68 km south-east of Rabat. This work demonstrates the potential of the combined use of ¹³⁷Cs and ²¹⁰Pb(ex) as radioisotopic soil tracers to estimate mid- and long-term erosion and deposition rates in Mediterranean agricultural areas. The net soil erosion rates obtained were comparable: 14.3 t ha⁻¹ yr⁻¹ and 12.1 t ha⁻¹ yr⁻¹ for ¹³⁷Cs and ²¹⁰Pb(ex) respectively, resulting in a similar sediment delivery ratio of about 92%. Soil redistribution patterns of the study field were established using a simple spatialisation approach. The resulting maps generated from both radionuclides were similar, indicating that soil erosion processes have not changed significantly over the last 100 years. For the previous 10-year period, the prediction model RUSLE 2 provided results of the same order of magnitude. Based on the ¹³⁷Cs dataset, the contribution of tillage erosion was evaluated with Mass Balance Model 3 and compared to the result obtained with Mass Balance Model 2. The findings highlight that water erosion is the leading process in this Moroccan cultivated field; tillage erosion under the experimental conditions is the main translocation process within the site, without a significant impact on the net erosion. Copyright © 2012 Elsevier Ltd. All rights reserved.
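The sediment-budget arithmetic behind a net erosion rate and a sediment delivery ratio (SDR) can be sketched as follows, using invented per-zone redistribution rates rather than the study's ¹³⁷Cs or ²¹⁰Pb(ex) estimates.

```python
# Sediment budget sketch: net erosion rate and sediment delivery ratio (SDR)
# from per-zone tracer-derived redistribution rates. Values are illustrative.

# (area_ha, rate_t_ha_yr): negative = erosion, positive = deposition
zones = [(8.0, -16.0), (1.5, -10.0), (0.5, 20.0)]

gross_erosion = sum(a * -r for a, r in zones if r < 0)  # t yr^-1 mobilised
deposition = sum(a * r for a, r in zones if r > 0)      # t yr^-1 re-deposited
export = gross_erosion - deposition                     # sediment leaving field
total_area = sum(a for a, _ in zones)

net_rate = export / total_area  # t ha^-1 yr^-1, the figure tracers report
sdr = export / gross_erosion    # fraction of mobilised soil actually exported
print(net_rate, sdr)
```

A high SDR (here about 93%, comparable in spirit to the study's ~92%) means almost all soil mobilised within the field leaves it rather than settling in depositional zones.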
Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.
Liu, Jason Yingjie
2014-11-01
The effectiveness of a direct quantification assay is essential to the adoption of a combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimised the stochastic effects experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio kit is more reliable than the Quantifiler® Duo kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
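The way a quantification result feeds the downstream STR reaction can be sketched as a simple input-volume calculation. The function below is a hypothetical illustration; the target amount and volume cap are assumptions, not kit specifications.

```python
# Sketch: using a direct-quantification result to choose the input volume
# for a downstream STR amplification. Target amount and cap are assumed.

def str_input_volume(conc_ng_per_ul, target_ng, max_ul):
    """Volume of direct lysate to add so the STR reaction receives roughly
    target_ng of DNA, capped at the maximum volume the reaction accepts."""
    if conc_ng_per_ul <= 0:
        return max_ul  # below quantifiable level: add the maximum volume
    return min(target_ng / conc_ng_per_ul, max_ul)

# A touch swab quantified at 0.002 ng/uL (2 pg/uL), aiming for 0.5 ng:
vol_touch = str_input_volume(0.002, target_ng=0.5, max_ul=15.0)
# A blood swab quantified at 0.1 ng/uL:
vol_blood = str_input_volume(0.1, target_ng=0.5, max_ul=15.0)
print(vol_touch, vol_blood)
```

For very low-level samples the cap dominates, which is why the reliability of the quantification step matters most near the bottom of the dynamic range.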
Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana
2018-01-01
The standard-curve-based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification-efficiency bias observed with qPCR, allows more accurate estimation at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) non-discriminating, with multiplex quantification of targets as a group (12 GM maize lines), and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union-authorised GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for the quantification of any other DNA target.
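Absolute quantification in ddPCR rests on Poisson statistics: the target concentration follows from the fraction of negative partitions, with no standard curve. The sketch below applies this standard correction; the droplet counts are invented, and the ~0.85 nL droplet volume is a commonly quoted nominal value assumed here, not a figure from the protocols.

```python
import math

# ddPCR absolute quantification sketch: recover target concentration from
# the fraction of negative droplets via the Poisson correction.

def ddpcr_copies_per_ul(n_positive, n_total, droplet_nl=0.85):
    lam = -math.log((n_total - n_positive) / n_total)  # mean copies/droplet
    return lam / (droplet_nl * 1e-3)  # copies per microlitre of reaction

# Invented counts: GM-event assay vs endogene assay in the same reaction.
gm = ddpcr_copies_per_ul(2500, 15000)
endo = ddpcr_copies_per_ul(9000, 15000)
print(f"GM: {gm:.0f} cp/uL, endogene: {endo:.0f} cp/uL, ratio {gm/endo:.2%}")
```

The GM content is usually reported as the ratio of event copies to endogene copies, so both targets being read from the same partitions is what makes the multiplex format attractive.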
den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M
2017-03-01
Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates depends critically on the availability of authentic references. Although 1H NMR is currently the method of choice for quantification of biosynthetically formed metabolites, its intrinsically low sensitivity can be a limiting factor in the quantification of GSH-conjugates, which are generally formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantifying GSH-conjugates. The novel method was used to quantify the concentrations of the GSH-conjugates of diclofenac, clozapine and acetaminophen; quantification was consistent with 1H NMR, but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.
Yan, Xiaowen; Yang, Limin; Wang, Qiuquan
2013-07-01
Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
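The core of ICPMS-based absolute protein quantification is the stoichiometric conversion from a measured element amount to a protein amount. A minimal sketch, with hypothetical numbers:

```python
def protein_amount_nmol(element_nmol, atoms_per_protein):
    """Absolute protein amount from an ICPMS element measurement, given a
    definite element-to-protein stoichiometry (e.g. sulfur atoms contributed
    by the protein's cysteine and methionine residues, or phosphorus atoms
    from known phosphorylation sites, or atoms of an attached element tag)."""
    if atoms_per_protein <= 0:
        raise ValueError("stoichiometry must be a positive atom count")
    return element_nmol / atoms_per_protein

# Hypothetical protein carrying 8 sulfur atoms, measured at 4.0 nmol S:
protein_nmol = protein_amount_nmol(4.0, 8)
```

The arithmetic is trivial; the analytical work lies in establishing that the stoichiometry really is definite, which is why the labeling and tagging chemistries are the focus of the article.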
Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin
2015-01-01
Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. 
Self-quantification in personal health maintenance appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals).
2017-02-02
Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods … a consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ), which simplifies …
HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.
2015-05-01
This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We also describe benchmark data sets for evaluating uncertainty quantification, together with an approach for using our benchmark generator to produce them.
Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon
2018-03-01
Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator to the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions of the tissue structures. The regions in the biopsy representing the interstitial fibrosis are deduced through the elimination of non-interstitial fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground truth image dataset has been manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists have demonstrated a good correlation in quantification result between the automated system and the pathologists' visual evaluation. Experiments investigating the variability in pathologists also proved the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. 
The system identifies the renal tissue structures through knowledge-based rules employing colour space transformations and structural features extraction from the images. In particular, the renal glomerulus identification is based on a multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial fibrosis structures from the biopsy area. The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnosis. A 40-image ground truth dataset has been manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists have demonstrated an average error of 9 percentage points in quantification result between the automated system and the pathologists' visual evaluation. Experiments investigating the variability in pathologists involving samples from 70 kidney patients also proved the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The accuracy of the proposed quantification system has been validated with the ground truth dataset and compared against the pathologists' quantification results. It has been shown that the correlation between different pathologists' estimation of interstitial fibrosis area has significantly improved, demonstrating the effectiveness of the quantification system as a diagnostic aide. Copyright © 2017 Elsevier B.V. All rights reserved.
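The elimination step described above reduces to boolean mask arithmetic once the biopsy region and the non-fibrosis structures have been segmented. A minimal sketch with toy masks (the segmentation itself, the hard part, is assumed done):

```python
import numpy as np

def fibrosis_percentage(biopsy_mask, non_fibrosis_mask):
    """Interstitial fibrosis as a percentage of biopsy area: everything inside
    the biopsy that is not a recognised non-fibrosis structure (glomeruli,
    tubules, vessels) counts as fibrosis."""
    fibrosis = biopsy_mask & ~non_fibrosis_mask
    return 100.0 * fibrosis.sum() / biopsy_mask.sum()

biopsy = np.ones((8, 8), dtype=bool)           # toy biopsy region
structures = np.zeros((8, 8), dtype=bool)
structures[:2, :] = True                       # 16 of 64 pixels are e.g. tubules
pct = fibrosis_percentage(biopsy, structures)  # -> 75.0
```

Expressing the result as a percentage of biopsy area, rather than an absolute pixel count, is what makes scores comparable across biopsies of different sizes.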
Automated quantification of myocardial perfusion SPECT using simplified normal limits.
Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido
2005-01-01
To simplify development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 +/- 0.02) was higher than by visual scoring (0.83 +/- 0.03) (P = .039) or standard quantification (0.82 +/- 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 +/- 0.02) than for standard quantification (0.85 +/- 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.
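The idea of comparing a polar map against pixel-wise normal limits can be sketched as follows. The z-score threshold and severity scaling below are illustrative assumptions, not the constants used in the published software:

```python
import numpy as np

def total_perfusion_deficit(polar_map, normal_mean, normal_sd,
                            z_thresh=3.0, z_max=7.0):
    """Illustrative TPD: a polar-map pixel contributes when it falls more than
    z_thresh standard deviations below the sex-matched normal mean; its
    severity is scaled into [0, 1] and averaged over the map, reported as a
    percentage. Combines defect extent and severity in one number."""
    z = (normal_mean - polar_map) / normal_sd      # SDs below normal
    severity = np.clip(z / z_max, 0.0, 1.0)
    return 100.0 * np.where(z > z_thresh, severity, 0.0).mean()

normal_mean = np.full((10, 10), 100.0)             # toy normal database
normal_sd = np.full((10, 10), 5.0)
study = normal_mean.copy()
study[:2, :] = 60.0                                # severe defect in 20% of map
tpd = total_perfusion_deficit(study, normal_mean, normal_sd)
```

Because the normal limits are just per-pixel means and standard deviations from low-likelihood scans, no visual scoring of abnormal studies is required to build them, which is the simplification the abstract describes.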
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
Study of complex proteome brings forward higher request for the quantification method using mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which integrated quantification with functional analysis effectively. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral count, protein sequence length, shared peptides, and ion intensity. It adopts spectral count for quantitative analysis and builds a new method for shared peptides to accurately evaluate abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports the large-scale functional annotations for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve accuracy of quantification with better dynamic range.
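The length-normalised spectral-count step at the core of such tools can be sketched as the classic NSAF calculation below. This is a generic illustration, not freeQuant's full algorithm, which additionally redistributes shared peptides and folds in MS/MS total ion count for low-abundance proteins:

```python
def nsaf(spectral_counts, lengths):
    """Normalised spectral abundance factor: spectral count divided by protein
    sequence length (longer proteins yield more peptides, hence more spectra),
    then renormalised to sum to 1 so abundances are comparable across runs."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

# Hypothetical two-protein example: P2 has twice the spectra but four times
# the length, so its normalised abundance is half of P1's.
abundances = nsaf({"P1": 10, "P2": 20}, {"P1": 100, "P2": 400})
```

The renormalisation to a proteome-wide sum of 1 is what makes label-free runs comparable to one another despite differing total spectral yields.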
Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems
2017-11-27
ARL-TR-8218, November 2017. US Army Research Laboratory. Survey of Existing Uncertainty Quantification Capabilities for Army-Relevant Problems, by James J Ramsey.
2015-06-04
… control, vibration and noise control, health monitoring, and energy harvesting. However, these advantages come at the cost of rate-dependent hysteresis … configuration used for energy harvesting. Uncertainty quantification is pursued in two steps: (i) determination of densities … Crews and R.C. Smith, "Quantification of parameter and model uncertainty for shape memory alloy bending actuators," Journal of Intelligent Material …
Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).
Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd
2011-01-01
The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.
Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo
2013-09-17
We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and becomes noticeable when matrix suppression exceeds 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification rather than a nuisance: a peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression remains below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high-concentration analytes has also been developed.
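The 70% rule turns suppression into a simple acceptance gate around the ratio-based calibration. A minimal sketch, with a hypothetical pre-fitted linear calibration passed in as a callable:

```python
def matrix_suppression(matrix_signal_with_analyte, matrix_signal_blank):
    """Fraction of the matrix ion signal suppressed by the analyte."""
    return 1.0 - matrix_signal_with_analyte / matrix_signal_blank

def quantify_peptide(peptide_signal, matrix_signal, matrix_signal_blank,
                     calibration, max_suppression=0.70):
    """Quantify from the peptide-to-matrix ion abundance ratio, rejecting the
    measurement when suppression exceeds the 70% guideline (the anomalous
    regime where reliable quantification is forbidden)."""
    s = matrix_suppression(matrix_signal, matrix_signal_blank)
    if s > max_suppression:
        raise ValueError(f"matrix suppression {s:.0%} exceeds guideline")
    return calibration(peptide_signal / matrix_signal)

# Hypothetical calibration fitted beforehand: concentration = 2.0 * ratio.
conc = quantify_peptide(500.0, 1000.0, 2000.0, lambda r: 2.0 * r)
```

Using the peptide-to-matrix ratio rather than the raw peptide signal is what absorbs the "normal" part of suppression into the calibration curve itself.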
Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen
2017-01-01
Meat products often consist of meat from multiple animal species, and adulteration and mislabeling of meat products can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.
Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C
2006-01-01
Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
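After the LC-MS step, relative quantification reduces to forming light/heavy intensity ratios for each isotope pair and rolling peptide ratios up to the protein level. A minimal sketch with hypothetical intensities; the median roll-up is a common convention, not necessarily the one used in the ISIL workflow:

```python
from statistics import median

def protein_ratio(pairs):
    """Protein-level relative abundance (condition 1 vs condition 2) from the
    light/heavy intensity pairs of its peptides; the median peptide ratio is
    robust to the occasional outlier peptide."""
    if not pairs:
        raise ValueError("no isotope pairs observed for this protein")
    return median(light / heavy for light, heavy in pairs)

# Three hypothetical peptide pairs from one protein:
ratio = protein_ratio([(2.0e6, 1.0e6), (4.2e6, 2.0e6), (3.0e6, 1.0e6)])
```

A ratio near 1 means no change between the two gel lanes; the gel-level visual comparison serves as the coarse first pass, and these ratios provide the accurate second pass.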
Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.
Gerold, Chase T; Bakker, Eric; Henry, Charles S
2018-04-03
In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.
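Distance-based readout still needs a calibration relating colour-development distance to concentration. The sketch below assumes a log-linear response, a common form for such devices but not a claim about this specific μPAD, and fits it by ordinary least squares on hypothetical calibration points:

```python
import math

def fit_log_calibration(concentrations_mM, distances_mm):
    """Least-squares fit of distance = a + b * log10(concentration)."""
    xs = [math.log10(c) for c in concentrations_mM]
    n = len(xs)
    mx, my = sum(xs) / n, sum(distances_mm) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, distances_mm))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

def concentration_mM(distance_mm, a, b):
    """Invert the calibration: read concentration off a measured distance."""
    return 10.0 ** ((distance_mm - a) / b)

# Hypothetical calibration series (0.1, 1.0, 10.0 mM -> 5, 10, 15 mm):
a, b = fit_log_calibration([0.1, 1.0, 10.0], [5.0, 10.0, 15.0])
```

In practice the fit would be done once per device batch; the end user only compares the measured distance against a printed ruler, which is what makes the readout instrument-free.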
Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay
Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming
2011-01-01
Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
NASA Astrophysics Data System (ADS)
Sylla, Daouda
Defined as a process that reduces the production potential of soils or the usefulness of natural resources, soil degradation is a major environmental problem: it affects over 41 % of the world's land area, and over 80 % of the people affected by this phenomenon live in developing countries. The general objective of the present project is the characterisation of different types of land use and land cover and the detection of their spatio-temporal changes from radar data (ERS-1, RADARSAT-1 and ENVISAT) for spatio-temporal modeling of environmental vulnerability to soil degradation in a semi-arid area. Because of the high sensitivity of the radar signal to the observing conditions of the sensor and the target, the radar images were first partitioned by angular configuration (23° and [33°-35°-47°]) and by environmental condition (wet and dry). A good characterisation and a good temporal evolution of the four types of land use and land cover of interest are obtained, with different levels of contrast depending on the incidence angles and environmental conditions. In addition to the pixel-based approaches used for change detection (image differencing, principal component analysis), a monitoring of land cover from an object-oriented approach focused on two types of land cover is developed. The method allows a detailed mapping of bare-soil occurrences as a function of environmental conditions. Finally, using different sources of information, environmental vulnerability to soil degradation is modeled in the south-west of Niger with the probabilistic fusion rule of Dempster-Shafer. The resulting decision maps reach overall accuracies of 93 % and 91 %, with Kappa values of 86 % and 84 %, for dry and wet conditions respectively. They are also used to produce a global map of the environmental vulnerability to soil degradation in this semi-arid area. 
Key-words: Environmental vulnerability to soil degradation; data fusion; radar images; land use changes; semi-arid environment; South-west of Niger.
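The Dempster-Shafer fusion used above combines evidence from multiple sources while tracking how much mass they place on conflicting hypotheses. A minimal sketch of Dempster's rule of combination, with hypothetical mass assignments for a single pixel (the hypothesis names are illustrative, not the thesis's actual classes):

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments whose
    focal elements are frozensets of hypotheses. Mass landing on empty
    intersections (the conflict K) is renormalised away."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    if conflict >= 1.0:
        raise ValueError("sources are in total conflict; fusion undefined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two evidence sources assessing one pixel (hypothetical masses):
D = frozenset({"degraded"})
DS = frozenset({"degraded", "stable"})   # ignorance: either could hold
fused = dempster_combine({D: 0.6, DS: 0.4}, {D: 0.5, DS: 0.5})
```

The ability to assign mass to the set {degraded, stable} rather than forcing a choice is what lets the framework represent ignorance explicitly, an advantage over plain Bayesian fusion when sources are only partially informative.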
Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.
2010-01-01
Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin-fiber quantification. Patients with all Philadelphia-negative MPN subtypes had higher trabecular volume than controls (p ≤ 0.0015). Results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis highly correlated with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification for osteosclerosis and fibrosis. PMID:20122729
Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification
Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.
Dagnino, Denise; Schripsema, Jan
2005-08-01
A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins priced in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
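Quantitative 1H NMR against an internal standard rests on the fact that signal integral is proportional to the number of protons, regardless of the compound. A minimal sketch of that arithmetic with hypothetical integrals; the 1H/9H proton counts are illustrative, not the actual anatoxin-a assignment:

```python
def qnmr_concentration_mM(integral_analyte, protons_analyte,
                          integral_standard, protons_standard,
                          conc_standard_mM):
    """Concentration by quantitative 1H NMR relative to an internal standard:
        C_a = (I_a / I_s) * (N_s / N_a) * C_s
    where I is the integrated signal area and N the number of protons giving
    rise to that signal. No compound-specific response factor is needed."""
    return (integral_analyte / integral_standard) \
        * (protons_standard / protons_analyte) * conc_standard_mM

# Hypothetical: analyte signal from 1 proton, standard singlet from 9 protons:
conc = qnmr_concentration_mM(2.0, 1, 1.0, 9, 1.0)
```

This independence from response factors is why qNMR can certify a calibration standard for GC analysis even when only micrograms of the toxin exist.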
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Zhou; Adams, Rachel M; Chourey, Karuna
2012-01-01
A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially in quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification, with isobaric chemical labeling surpassing metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
Processing and domain selection: Quantificational variability effects
Harris, Jesse A.; Clifton, Charles; Frazier, Lyn
2014-01-01
Three studies investigated how readers interpret sentences with variable quantificational domains, e.g., The army was mostly in the capital, where mostly may quantify over individuals or parts (Most of the army was in the capital) or over times (The army was in the capital most of the time). It is proposed that a general conceptual economy principle, No Extra Times (Majewski 2006, in preparation), discourages the postulation of potentially unnecessary times, and thus favors the interpretation quantifying over parts. Disambiguating an ambiguously quantified sentence to a quantification over times interpretation was rated as less natural than disambiguating it to a quantification over parts interpretation (Experiment 1). In an interpretation questionnaire, sentences with similar quantificational variability were constructed so that both interpretations of the sentence would require postulating multiple times; this resulted in the elimination of the preference for a quantification over parts interpretation, suggesting the parts preference observed in Experiment 1 is not reducible to a lexical bias of the adverb mostly (Experiment 2). An eye movement recording study showed that, in the absence of prior evidence for multiple times, readers exhibit greater difficulty when reading material that forces a quantification over times interpretation than when reading material that allows a quantification over parts interpretation (Experiment 3). These experiments contribute to understanding readers’ default assumptions about the temporal properties of sentences, which is essential for understanding the selection of a domain for adverbial quantifiers and, more generally, for understanding how situational constraints influence sentence processing. PMID:25328262
USDA-ARS?s Scientific Manuscript database
High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...
Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina
2006-01-01
Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. 
Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results; it was therefore chosen as the primary criterion by which to evaluate quality and performance across the different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, since it is shown that in the area of food and feed testing it is impossible to strictly define a matrix with fixed specificities, quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
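The dependence of GMO quantification on the standard curve and on PCR efficiency can be sketched as follows. The slope, intercept, and Ct values are hypothetical; in practice the curve is fitted on certified reference material:

```python
def quantity_from_ct(ct, slope, intercept):
    """Estimate starting copy number from a Ct value using a standard curve
    of the form Ct = slope * log10(N0) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def efficiency_from_slope(slope):
    """Amplification efficiency implied by the standard-curve slope.
    A perfectly efficient reaction (E = 1.0) gives a slope of about -3.32."""
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical standard curve fitted on certified reference material.
slope, intercept = -3.32, 38.0

transgene_ct, reference_gene_ct = 30.2, 25.1
transgene_copies = quantity_from_ct(transgene_ct, slope, intercept)
reference_copies = quantity_from_ct(reference_gene_ct, slope, intercept)

# GMO content is commonly expressed as the transgene/reference-gene ratio.
gmo_percent = 100.0 * transgene_copies / reference_copies
```

If the sample's true efficiency differs from the standard's (for example due to matrix inhibitors), the same Ct maps to a different copy number, which is precisely why the study treats PCR efficiency as the primary quality criterion.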
43 CFR 11.71 - Quantification phase-service reduction quantification.
Code of Federal Regulations, 2011 CFR
2011-10-01
...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...
43 CFR 11.71 - Quantification phase-service reduction quantification.
Code of Federal Regulations, 2010 CFR
2010-10-01
...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...
Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias
2007-01-10
The performances of the GC-MS determination of suspected allergens in fragrance concentrates have been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion or once every 144 analyses for three ions in common.
Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin
2012-08-01
A facile proteomic quantification method, fluorescent labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, the FLAQ method is a chromatography-based quantification in combination with MS for identification. Multidimensional liquid chromatography (MDLC) with laser-induced fluorescence (LIF) detection with high accuracy and a tandem MS system were employed for FLAQ. Several requirements should be met for fluorescent labeling in MS identification: labeling completeness, minimum side-reactions, simple MS spectra, and no extra tandem MS fragmentations for structure elucidations. A fluorescent dye, 5-iodoacetamidofluorescein, was finally chosen to label proteins on all cysteine residues. The fluorescent dye was compatible with the process of trypsin digestion and MALDI MS identification. Quantitative labeling was achieved with optimization of reaction conditions. A synthesized peptide and model proteins, BSA (35 cysteines) and OVA (five cysteines), were used for verifying the completeness of labeling. Proteins were separated through MDLC and quantified based on fluorescent intensities, followed by MS identification. High accuracy (RSD < 1.58%) and wide linearity of quantification (1-10^5) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. Parts of the proteins in the human liver proteome were quantified and demonstrated using FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R
2014-12-05
The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.
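PSEA-Quant's own statistic additionally models abundance reproducibility and annotation bias, but the underlying idea of protein set enrichment can be illustrated with a classic one-sided hypergeometric test; the counts below are invented:

```python
from math import comb

def hypergeom_enrichment_p(total, annotated, selected, overlap):
    """One-sided hypergeometric p-value: probability of seeing at least
    `overlap` annotated proteins among `selected` abundant ones, given
    that `annotated` of `total` quantified proteins carry the GO term."""
    p = 0.0
    for k in range(overlap, min(annotated, selected) + 1):
        p += comb(annotated, k) * comb(total - annotated, selected - k) / comb(total, selected)
    return p

# Hypothetical numbers: 5000 quantified proteins, 100 carry the term,
# 200 are reproducibly abundant, and 15 of those 200 carry the term.
p_value = hypergeom_enrichment_p(5000, 100, 200, 15)
```

An overlap of 15 against an expected 4 gives a very small p-value, flagging the term as enriched among the abundant, reproducibly quantified proteins.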
NASA Astrophysics Data System (ADS)
Restaino, Stephen M.; White, Ian M.
2017-03-01
Surface-enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multi-analyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development, but most often assay protocols are redesigned around the use of SERS as a quantification method, ultimately complicating existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work discusses the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNAs commonly exist at relatively low concentrations, amplification methods (e.g., PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that allows SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.
NASA Astrophysics Data System (ADS)
Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier
2015-03-01
With advances in PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.
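The mean absolute percentage error used above as the agreement metric is straightforward to compute. A minimal sketch with hypothetical paired values (not data from the study):

```python
def mean_absolute_percentage_error(reference, estimate):
    """MAPE between reference values (e.g., MR-based quantification) and
    estimates (e.g., PET-only quantification), in percent."""
    errors = [abs(e - r) / abs(r) for r, e in zip(reference, estimate)]
    return 100.0 * sum(errors) / len(errors)

# Hypothetical SUVR-like values for a handful of scans.
mr_based = [1.10, 1.45, 2.03, 1.22, 1.87]
pet_only = [1.12, 1.41, 2.10, 1.25, 1.80]
mape = mean_absolute_percentage_error(mr_based, pet_only)
```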
Rapid quantification and sex determination of forensic evidence materials.
Andréasson, Hanna; Allen, Marie
2003-11-01
DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations
Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.
2013-01-01
Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
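Post-processing peptide-level quantification to support protein-level inference can be sketched as a simple roll-up. The peptide sequences and intensities below are illustrative, and the paper evaluates more elaborate schemes than these two:

```python
from statistics import median

def protein_abundance(peptide_intensities, method="median"):
    """Roll peptide-level MS1 intensities up to one protein-level value.
    `peptide_intensities` maps peptide sequence -> intensity.  Summing and
    taking the median are two common post-processing choices."""
    values = list(peptide_intensities.values())
    if method == "sum":
        return sum(values)
    return median(values)

# Hypothetical MS1 intensities for three peptides of one protein.
peptides = {"LVNELTEFAK": 2.1e7, "YLYEIAR": 1.6e7, "AEFVEVTK": 3.3e7}
protein_median = protein_abundance(peptides)             # robust to outlier peptides
protein_sum = protein_abundance(peptides, method="sum")  # rewards peptide coverage
```

The choice between such schemes is exactly the question the paper examines by checking which recovers a known dilution series most faithfully.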
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Tujin; Qian, Weijun
2013-02-01
Highly sensitive technologies for multiplexed quantification of a large number of candidate proteins will play an increasingly important role in clinical biomarker discovery, systems biology, and general biomedical research. Herein we introduce the new PRISM-SRM technology, a highly sensitive multiplexed quantification technology capable of simultaneously quantifying many low-abundance proteins without the need for affinity reagents. The versatility of antibody-free PRISM-SRM extends to quantifying various types of targets, including protein isoforms, protein modifications, and metabolites, thus offering new competition with immunoassays.
Quantification of uncertainties for application in detonation simulation
NASA Astrophysics Data System (ADS)
Zheng, Miao; Ma, Zhibo
2016-06-01
Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying the uncertainty, the most important tasks are to analyze how the uncertainties arise and develop, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.
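The general idea of pushing input uncertainties through a simulation can be sketched as a toy Monte Carlo propagation. The model and parameters below are stand-ins for illustration, not a detonation code or the paper's framework:

```python
import random

def propagate_uncertainty(model, param_means, param_sds, n=20000, seed=1):
    """Toy Monte Carlo uncertainty propagation: sample uncertain inputs,
    push them through the simulation model, and summarise the spread of
    the output as a mean and standard deviation."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        params = [rng.gauss(m, s) for m, s in zip(param_means, param_sds)]
        outputs.append(model(params))
    mu = sum(outputs) / n
    var = sum((o - mu) ** 2 for o in outputs) / (n - 1)
    return mu, var ** 0.5

# Stand-in model: output depends linearly on two uncertain inputs.
toy_model = lambda p: 2.0 * p[0] + 0.5 * p[1]
mu, sigma = propagate_uncertainty(toy_model, [1.0, 4.0], [0.1, 0.2])
```

For this linear stand-in the analytic answer is mean 4.0 and standard deviation sqrt((2*0.1)^2 + (0.5*0.2)^2), which the sampled estimate approaches as n grows.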
Adrait, Annie; Lebert, Dorothée; Trauchessec, Mathieu; Dupuis, Alain; Louwagie, Mathilde; Masselon, Christophe; Jaquinod, Michel; Chevalier, Benoît; Vandenesch, François; Garin, Jérôme; Bruley, Christophe; Brun, Virginie
2012-06-06
Enterotoxin A (SEA) is a staphylococcal virulence factor which is suspected to worsen septic shock prognosis. However, the presence of SEA in the blood of sepsis patients has never been demonstrated. We have developed a mass spectrometry-based assay for the targeted and absolute quantification of SEA in serum. To enhance sensitivity and specificity, we combined an immunoaffinity-based sample preparation with mass spectrometry analysis in the selected reaction monitoring (SRM) mode. Absolute quantification of SEA was performed using the PSAQ™ method (Protein Standard Absolute Quantification), which uses a full-length isotope-labeled SEA as internal standard. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were estimated at 352 pg/mL and 1057 pg/mL, respectively. SEA recovery after immunocapture was determined to be 7.8 ± 1.4%. Therefore, we assumed that less than 1 femtomole of each SEA proteotypic peptide was injected on the liquid chromatography column before SRM analysis. From a 6-point titration experiment, quantification accuracy was determined to be 77% and precision at LLOQ was lower than 5%. With this sensitive PSAQ-SRM assay, we expect to contribute to deciphering the pathophysiological role of SEA in severe sepsis. This article is part of a Special Issue entitled: Proteomics: The clinical link. Copyright © 2011 Elsevier B.V. All rights reserved.
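Detection and quantification limits like the LLOD and LLOQ above are commonly estimated from blank noise and the calibration slope via the 3.3σ and 10σ conventions. The sketch below uses that convention with invented numbers; it is not necessarily the estimation procedure used in this study:

```python
def lod_loq_from_calibration(blank_sd, slope):
    """Common 3.3*sigma / 10*sigma estimates of the limits of detection
    and quantification, from the standard deviation of the blank signal
    and the calibration-curve slope (signal per concentration unit)."""
    lod = 3.3 * blank_sd / slope
    loq = 10.0 * blank_sd / slope
    return lod, loq

# Hypothetical calibration: blank noise and sensitivity chosen for illustration.
lod, loq = lod_loq_from_calibration(blank_sd=5.0, slope=0.02)
```

Under this convention the LOQ is always 10/3.3 times the LOD, which is why the two limits are usually reported as a pair.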
ERIC Educational Resources Information Center
Johnson, Heather Lynn
2015-01-01
Contributing to a growing body of research addressing secondary students' quantitative and covariational reasoning, the multiple case study reported in this article investigated secondary students' quantification of ratio and rate. This article reports results from a study investigating students' quantification of rate and ratio as…
Russell, Jason D.; Scalf, Mark; Book, Adam J.; Ladror, Daniel T.; Vierstra, Richard D.; Smith, Lloyd M.; Coon, Joshua J.
2013-01-01
Quantification of gas-phase intact protein ions by mass spectrometry (MS) is impeded by highly variable ionization, ion transmission, and ion detection efficiencies. Therefore, quantification of proteins using MS-associated techniques is almost exclusively done after proteolysis, where peptides serve as proxies for estimating protein abundance. Advances in instrumentation, protein separations, and informatics have made large-scale sequencing of intact proteins using top-down proteomics accessible to the proteomics community; yet quantification of proteins using a top-down workflow has largely been unaddressed. Here we describe a label-free approach to determine the abundance of intact proteins separated by nanoflow liquid chromatography prior to MS analysis by using solution-phase measurements of ultraviolet light-induced intrinsic fluorescence (UV-IF). UV-IF is measured directly at the electrospray interface just prior to the capillary exit, where proteins containing at least one tryptophan residue are readily detected. UV-IF quantification was demonstrated using commercially available protein standards and provided more accurate and precise protein quantification than MS ion current. We evaluated the parallel use of UV-IF and top-down tandem MS for quantification and identification of protein subunits and associated proteins from an affinity-purified 26S proteasome sample from Arabidopsis thaliana. We identified 26 unique proteins and quantified 13 tryptophan-containing species. Our analyses discovered previously unidentified N-terminal processing of the β6 (PBF1) and β7 (PBG1) subunits; such processing of PBG1 may generate a heretofore unknown additional protease active site upon cleavage. In addition, our approach permitted the unambiguous identification and quantification of both isoforms of the proteasome-associated protein DSS1. PMID:23536786
NASA Astrophysics Data System (ADS)
Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun
2015-01-01
Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) on a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N,N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (a triazine ester), are cost-effective because of their synthetic simplicity, and increase throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods were validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
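The one-run, four-point standard curve enabled by the label channels amounts to a small linear regression: four channels carry the standard at known amounts and the analyte signal is read off the fitted line. The channel amounts and signals below are hypothetical:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical 4-point curve: four label channels carry the standard at
# known amounts (fmol); a further channel carries the unknown sample.
standard_amounts = [1.0, 5.0, 25.0, 125.0]
standard_signals = [0.9, 5.3, 24.1, 126.0]   # reporter peak areas
a, b = linear_fit(standard_amounts, standard_signals)

sample_signal = 60.0
sample_amount = (sample_signal - b) / a      # analyte amount from the same run
```

Because curve and sample are measured in a single LC-MS run, run-to-run variability cancels, which is the throughput and precision argument made above.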
Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil
2012-01-01
Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of the FT-IR spectroscopy technique for the quantification of total lactones present in Inula racemosa and Andrographis paniculata. To validate the FT-IR spectroscopy method for quantification of total lactones in I. racemosa and A. paniculata. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed comparable results with a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification for isoallantolactone were 1 µg and 10 µg respectively; for andrographolide they were 1.5 µg and 15 µg respectively. Recoveries were over 98%, with good intra- and interday repeatability: RSD ≤ 2%. The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable for the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.
A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.
Rutledge, Robert G
2011-03-02
Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
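The core LRE idea, that per-cycle efficiency declines linearly with fluorescence so that the intercept of a regression over the central region of the profile recovers the maximal efficiency, can be sketched on synthetic data. This is a simplified reading of LRE, not the published derivation, and the profile parameters are invented:

```python
def lre_sketch(fluorescence, start, end):
    """Toy LRE-style analysis: compute per-cycle efficiency
    E_c = F_c/F_{c-1} - 1 over a central window, regress E_c against F_c,
    take the intercept as Emax, and back-calculate target quantity as
    F0 = F_c / (1 + Emax)**c.  A simplified sketch of the approach."""
    pairs = []
    for c in range(start, end + 1):
        e_c = fluorescence[c] / fluorescence[c - 1] - 1.0
        pairs.append((fluorescence[c], e_c))
    n = len(pairs)
    mf = sum(f for f, _ in pairs) / n
    me = sum(e for _, e in pairs) / n
    slope = sum((f - mf) * (e - me) for f, e in pairs) / \
            sum((f - mf) ** 2 for f, _ in pairs)
    e_max = me - slope * mf                      # intercept at F = 0
    f0 = fluorescence[start] / (1.0 + e_max) ** start
    return e_max, f0

# Synthetic logistic-like amplification profile from 10 starting targets.
f_max, e0, f0_true = 1.0e6, 0.95, 10.0
profile = [f0_true]
for _ in range(40):
    f = profile[-1]
    profile.append(f + f * e0 * (1 - f / f_max))

e_max, f0_est = lre_sketch(profile, 12, 18)
```

No standard curve is involved: both the efficiency and the starting quantity are read from the shape of the single amplification profile, which is the advantage the record above describes.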
Cools, Katherine; Terry, Leon A
2012-07-15
Glucosinolates are β-thioglycosides which are found naturally in Cruciferae including the genus Brassica. When enzymatically hydrolysed, glucosinolates yield isothiocyanates and give a pungent taste. Both glucosinolates and isothiocyanates have been linked with anticancer activity as well as antifungal and antibacterial properties and therefore the quantification of these compounds is scientifically important. A wide range of literature exists on glucosinolates, however the extraction and quantification procedures differ greatly resulting in discrepancies between studies. The aim of this study was therefore to compare the most popular extraction procedures to identify the most efficacious method and whether each extraction can also be used for the quantification of total isothiocyanates. Four extraction techniques were compared for the quantification of sinigrin from mustard cv. Centennial (Brassica juncea L.) seed; boiling water, boiling 50% (v/v) aqueous acetonitrile, boiling 100% methanol and 70% (v/v) aqueous methanol at 70 °C. Prior to injection into the HPLC, the extractions which involved solvents (acetonitrile or methanol) were freeze-dried and resuspended in water. To identify whether the same extract could be used to measure total isothiocyanates, a dichloromethane extraction was carried out on the sinigrin extracts. For the quantification of sinigrin alone, boiling 50% (v/v) acetonitrile was found to be the most efficacious extraction solvent of the four tested yielding 15% more sinigrin than the water extraction. However, the removal of the acetonitrile by freeze-drying had a negative impact on the isothiocyanate content. Quantification of both sinigrin and total isothiocyanates was possible when the sinigrin was extracted using boiling water. Two columns were compared for the quantification of sinigrin revealing the Zorbax Eclipse to be the best column using this particular method. Copyright © 2012 Elsevier B.V. All rights reserved.
A Java Program for LRE-Based Real-Time qPCR that Enables Large-Scale Absolute Quantification
Rutledge, Robert G.
2011-01-01
Background Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Findings Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. Conclusions The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples. PMID:21407812
Kim, Jong-Seo; Fillmore, Thomas L; Liu, Tao; Robinson, Errol; Hossain, Mahmud; Champion, Boyd L; Moore, Ronald J; Camp, David G; Smith, Richard D; Qian, Wei-Jun
2011-12-01
Selected reaction monitoring (SRM)-MS is an emerging technology for high throughput targeted protein quantification and verification in biomarker discovery studies; however, the cost associated with the application of stable isotope-labeled synthetic peptides as internal standards can be prohibitive for screening a large number of candidate proteins as often required in the preverification phase of discovery studies. Herein we present a proof of concept study using an (18)O-labeled proteome reference as global internal standards (GIS) for SRM-based relative quantification. The (18)O-labeled proteome reference (or GIS) can be readily prepared and contains a heavy isotope ((18)O)-labeled internal standard for every possible tryptic peptide. Our results showed that the percentage of heavy isotope ((18)O) incorporation applying an improved protocol was >99.5% for most peptides investigated. The accuracy, reproducibility, and linear dynamic range of quantification were further assessed based on known ratios of standard proteins spiked into the labeled mouse plasma reference. Reliable quantification was observed with high reproducibility (i.e. coefficient of variance <10%) for analyte concentrations that were set at 100-fold higher or lower than those of the GIS based on the light ((16)O)/heavy ((18)O) peak area ratios. The utility of (18)O-labeled GIS was further illustrated by accurate relative quantification of 45 major human plasma proteins. Moreover, quantification of the concentrations of C-reactive protein and prostate-specific antigen was illustrated by coupling the GIS with standard additions of purified protein standards. Collectively, our results demonstrated that the use of (18)O-labeled proteome reference as GIS provides a convenient, low cost, and effective strategy for relative quantification of a large number of candidate proteins in biological or clinical samples using SRM.
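The quantification arithmetic behind the GIS approach is simple: because every sample is spiked with the same 18O-labeled proteome reference, dividing one sample's light/heavy (16O/18O) peak-area ratio by another's cancels the common internal standard. A minimal sketch with hypothetical peak areas:

```python
def relative_abundance(light_a, heavy_a, light_b, heavy_b):
    """Relative abundance of a peptide between samples A and B, each spiked
    with the same 18O-labeled proteome reference (GIS).

    The heavy (18O) areas come from the shared internal standard, so the
    ratio of ratios cancels it and leaves the A-to-B fold change.
    """
    ratio_a = light_a / heavy_a    # 16O/18O ratio in sample A
    ratio_b = light_b / heavy_b    # 16O/18O ratio in sample B
    return ratio_a / ratio_b
```

For example, a peptide with areas (200, 100) in sample A and (50, 100) in sample B is four-fold more abundant in A.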
Recent application of quantification II in Japanese medical research.
Suzuki, T; Kudo, A
1979-01-01
Hayashi's Quantification II is a method of multivariate discriminant analysis that handles attribute data as predictor variables. It is very useful in medical research for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on multiple attribute data. In Japan, the method is so well known that most computer program packages include the Hayashi Quantification, yet it seems to remain unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles showing recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to how far the findings provided by the method satisfied the researchers. At the same time, some recommendations are made about terminology and program packages, and a brief discussion of the background of the quantification methods is given, with special reference to the Behaviormetric Society of Japan. PMID:540587
Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist
NASA Astrophysics Data System (ADS)
Tummala, Sudhakar; Dam, Erik B.
2010-03-01
Fully automatic imaging biomarkers may allow quantification of patho-physiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method used on tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.
Relative quantification of biomarkers using mixed-isotope labeling coupled with MS
Chapman, Heidi M; Schutt, Katherine L; Dieter, Emily M; Lamos, Shane M
2013-01-01
The identification and quantification of important biomarkers is a critical first step in the elucidation of biological systems. Biomarkers take many forms as cellular responses to stimuli and can be manifested during transcription, translation, and/or metabolic processing. Increasingly, researchers have relied upon mixed-isotope labeling (MIL) coupled with MS to perform relative quantification of biomarkers between two or more biological samples. MIL effectively tags biomarkers of interest for ease of identification and quantification within the mass spectrometer by using isotopic labels that introduce a heavy and light form of the tag. In addition to MIL coupled with MS, a number of other approaches have been used to quantify biomarkers including protein gel staining, enzymatic labeling, metabolic labeling, and several label-free approaches that generate quantitative data from the MS signal response. This review focuses on MIL techniques coupled with MS for the quantification of protein and small-molecule biomarkers. PMID:23157360
pyQms enables universal and accurate quantification of mass spectrometry data.
Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian
2017-10-01
Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.
Hawkins, Steve F C; Guest, Paul C
2018-01-01
The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of a NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are available currently and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study such as medical disorders resulting from nutritional programming disturbances.
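The qPCR quantification step rests on the standard-curve relation Cq = slope·log10(concentration) + intercept, fitted from dilution standards, with the amplification efficiency implied by the slope. A minimal sketch of that calculation with hypothetical values (the protocol's actual calibrators and cycling conditions are not reproduced here):

```python
import math

def fit_standard_curve(conc, cq):
    """Least-squares fit of Cq = slope*log10(conc) + intercept from standards."""
    xs = [math.log10(c) for c in conc]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cq) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cq)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def quantify(cq_unknown, slope, intercept):
    """Library concentration from its Cq via the fitted curve (inverse of the line)."""
    return 10 ** ((cq_unknown - intercept) / slope)

def efficiency(slope):
    """Amplification efficiency implied by the slope (1.0 = 100%)."""
    return 10 ** (-1.0 / slope) - 1.0
```

A slope near -3.32 corresponds to ~100% efficiency, the usual sanity check before trusting the curve.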
Recent advances in stable isotope labeling based techniques for proteome relative quantification.
Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui
2014-10-24
The large-scale relative quantification of all proteins expressed in biological samples under different states is of great importance for discovering proteins with important biological functions, as well as for screening disease-related biomarkers and drug targets. Therefore, the accurate quantification of proteins at the proteome level has become one of the key issues in protein science. Herein, recent advances in stable isotope labeling based techniques for proteome relative quantification are reviewed, covering metabolic labeling, chemical labeling and enzyme-catalyzed labeling. Future research directions in this field are also discussed. Copyright © 2014 Elsevier B.V. All rights reserved.
Meyer, Golo M J; Weber, Armin A; Maurer, Hans H
2014-05-01
Diagnosis and prognosis of poisonings should be confirmed by comprehensive screening and reliable quantification of xenobiotics, for example by gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-mass spectrometry (LC-MS). The turnaround time should be short enough to have an impact on clinical decisions. In emergency toxicology, quantification using full-scan acquisition is preferable because this allows screening and quantification of expected and unexpected drugs in one run. Therefore, a multi-analyte full-scan GC-MS approach was developed and validated with liquid-liquid extraction and one-point calibration for quantification of 40 drugs relevant to emergency toxicology. Validation showed that 36 drugs could be determined quickly, accurately, and reliably in the range of upper therapeutic to toxic concentrations. Daily one-point calibration with calibrators stored for up to four weeks reduced workload and turnaround time to less than 1 h. In summary, the multi-analyte approach with simple liquid-liquid extraction, GC-MS identification, and quantification over fast one-point calibration could successfully be applied to proficiency tests and real case samples. Copyright © 2013 John Wiley & Sons, Ltd.
Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.
Mujika, Iñigo
2017-04-01
Training quantification is basic to evaluate an endurance athlete's responses to training loads, ensure adequate stress/recovery balance, and determine the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, rating of perceived exertion stands out because of its wide use. Concurrent assessments of the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
Single Color Multiplexed ddPCR Copy Number Measurements and Single Nucleotide Variant Genotyping.
Wood-Bouwens, Christina M; Ji, Hanlee P
2018-01-01
Droplet digital PCR (ddPCR) allows for accurate quantification of genetic events such as copy number variation and single nucleotide variants. Probe-based assays represent the current "gold-standard" for detection and quantification of these genetic events. Here, we introduce a cost-effective single color ddPCR assay that allows for single genome resolution quantification of copy number and single nucleotide variation.
RNA-Skim: a rapid method for RNA-Seq quantification at transcript level
Zhang, Zhaojun; Wang, Wei
2014-01-01
Motivation: The RNA-Seq technique has been demonstrated as a revolutionary means for exploring the transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to the Microarray technique in gene expression studies, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignments of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has recently been proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
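The sig-mer criterion itself is simple to state: a k-mer qualifies as a sig-mer if it occurs in the transcripts of exactly one cluster. A toy sketch of that uniqueness test (the real RNA-Skim additionally builds the clusters by sequence similarity and selects a subset of sig-mers for counting, which is not shown here):

```python
from collections import defaultdict

def find_sig_mers(clusters, k=5):
    """Toy sig-mer discovery: k-mers occurring in exactly one transcript cluster.

    clusters: dict mapping cluster id -> list of transcript sequences.
    Returns a dict mapping cluster id -> set of its sig-mers.
    """
    kmer_owners = defaultdict(set)
    for cid, transcripts in clusters.items():
        for seq in transcripts:
            for i in range(len(seq) - k + 1):
                kmer_owners[seq[i:i + k]].add(cid)
    sig = defaultdict(set)
    for kmer, owners in kmer_owners.items():
        if len(owners) == 1:          # unique to one cluster -> sig-mer
            sig[next(iter(owners))].add(kmer)
    return dict(sig)
```

Because sig-mers are confined to one cluster, counting them lets each cluster be quantified independently, which is what makes the parallel decomposition possible.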
UV-Vis as quantification tool for solubilized lignin following a single-shot steam process.
Lee, Roland A; Bédard, Charles; Berberi, Véronique; Beauchet, Romain; Lavoie, Jean-Michel
2013-09-01
In this short communication, UV/Vis was used as an analytical tool for the quantification of lignin concentrations in aqueous media. A significant correlation was determined between absorbance and concentration of lignin in solution. For this study, lignin was produced from different types of biomass (willow, aspen, softwood, canary grass and hemp) using steam processes. Quantification was performed at 212, 225, 237, 270, 280 and 287 nm. UV-Vis quantification of lignin was found suitable for different types of biomass, making this a time-saving analytical method that could find use as a Process Analytical Technology (PAT) tool in biorefineries utilizing steam processes or comparable approaches. Copyright © 2013 Elsevier Ltd. All rights reserved.
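The underlying quantification is a single-wavelength Beer-Lambert calibration: fit absorbance against known lignin concentrations, then invert the line for unknowns. A minimal sketch with hypothetical standard values (the paper's actual calibration data are not reproduced here):

```python
import numpy as np

def lignin_concentration(a_unknown, std_conc_gl, std_abs):
    """Lignin concentration (g/L) from UV absorbance at one wavelength
    (e.g. 280 nm), assuming Beer-Lambert linearity over the standards.

    std_conc_gl, std_abs: calibration standards; a_unknown: sample absorbance.
    """
    # Fit A = slope * c + intercept over the calibration standards
    slope, intercept = np.polyfit(std_conc_gl, std_abs, 1)
    # Invert the calibration line for the unknown sample
    return (a_unknown - intercept) / slope
```

Samples outside the linear range would need dilution before this inversion is valid, which is the usual caveat for single-wavelength UV quantification.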
Cues, quantification, and agreement in language comprehension.
Tanner, Darren; Bulkes, Nyssa Z
2015-12-01
We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.
Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J
2015-04-15
Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore, methods for quantification of undeclared species are needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable, because a fivefold inter-tissue variation in mtDNA content per cell results in either an under- (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing a limit of quantification (LOQ) and detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
AVQS: attack route-based vulnerability quantification scheme for smart grid.
Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik
2014-01-01
A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this network connectivity, a smart grid system is exposed to potential security threats. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that network connectivity must be considered for more optimized vulnerability quantification.
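The abstract does not reproduce the exact scoring functions, so the following is only a hypothetical illustration of the general idea: combining per-device vulnerability scores along an attack route, weighted by how well secured the links traversed are. Every constant and formula below is an assumption for illustration, not the AVQS definition:

```python
def route_vulnerability(device_scores, link_weights):
    """Hypothetical attack-route score (illustrative only, not AVQS itself).

    device_scores: CVSS-like scores (0-10) for each device on the route.
    link_weights: one value in [0, 1] per hop; 1.0 = unprotected link,
                  smaller = better end-to-end security on that hop.
    Returns a length-normalized score; higher = more vulnerable route.
    """
    assert len(link_weights) == len(device_scores) - 1
    score = device_scores[0]
    for dev, link in zip(device_scores[1:], link_weights):
        # A weakly secured link lets more of the next device's
        # vulnerability contribute to the route score.
        score += dev * link
    return score / len(device_scores)
```

The point such a route-level score captures, and per-host CVSS alone does not, is that the same device is more dangerous when reached over poorly secured links.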
A quantification model for the structure of clay materials.
Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian
2016-07-04
In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are discussed. The authors consider that the purpose of quantifying clay structure is to determine parameters that can quantitatively characterize the impact of clay structure on macro-mechanical behaviour. Based on system theory and the law of energy conservation, a quantification model for the structure characteristics of clay materials is established, three quantitative parameters (deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these quantitative parameters accurately reflect the influence of clay structure on deformation behaviour and strength behaviour, as well as the relative magnitude of the structural influence on each. These quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influence of clay on its mechanical behaviour.
Novel techniques for quantification of correlation between primary liquid jet breakup and downstream spray characteristics
2016-05-08
[AFRL-AFOSR-JP-TR-2016-0084; the report text was garbled in extraction. Recoverable fragments: the introduction notes that several liquid-fuelled combustion systems, such as liquid propellant rocket engines and gas turbines, motivate the work; reporting period ending 17 Apr 2016.]
Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman
2015-03-01
To correlate the amount and types of pain medications prescribed to CRPS patients, using the Medication Quantification Scale, with patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four countries: 27 from the United States, 20 from Germany, 18 from the Netherlands, and 24 from Israel. The main outcome measures used were the Medication Quantification Scale III and the numerical analog pain scale. There was no statistically significant correlation between the Medication Quantification Scale and the visual analog scale at any site except for a moderate positive correlation at the German sites. The Medication Quantification Scale mean differences between the United States and Germany, the Netherlands, and Israel were 9.793 (P < 0.002), 10.389 (P < 0.001), and 4.984 (P = 0.303), respectively. There appears to be only a weak correlation between the amount of pain medication prescribed and patients' reported subjective pain intensity within this limited patient population. The Medication Quantification Scale is a viable tool for the analysis of pharmaceutical treatment of CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Cudalbu, C.; Mlynárik, V.; Xin, L.; Gruetter, Rolf
2009-10-01
Reliable quantification of the macromolecule signals in short echo-time 1H MRS spectra is particularly important at high magnetic fields for an accurate quantification of metabolite concentrations (the neurochemical profile) due to effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two approaches of quantification, which take the contribution of macromolecules into account in the quantification step. 1H spectra were acquired on a 14.1 T/26 cm horizontal scanner on five rats using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters and either the spectrum of macromolecules measured in vivo, using an inversion recovery technique, or baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules, but in accordance with previous studies using Subtract-QUEST could not reproduce completely all features of the in vivo spectrum of macromolecules at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.
NASA Astrophysics Data System (ADS)
Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon
2015-10-01
An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, the size-exclusion chromatography method was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained a SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with an ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
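The element-based conversion itself is simple stoichiometry: subtract impurity-bound sulfur (from SEC-ICP-MS) from total sulfur, divide by the number of sulfur atoms per hGH molecule, and multiply by the protein's molar mass. A sketch with assumed values; in particular, 7 sulfur atoms per molecule (3 Met + 4 Cys) and a ~22.1 kDa molar mass are illustrative assumptions, not figures taken from the paper:

```python
S_MOLAR = 32.06        # molar mass of sulfur, g/mol
HGH_MOLAR = 22125.0    # approximate molar mass of hGH, g/mol (assumption)
S_PER_HGH = 7          # sulfur atoms per hGH molecule, 3 Met + 4 Cys (assumption)

def hgh_mass_fraction(total_s_ug_per_g, impurity_s_ug_per_g):
    """hGH mass fraction (ug/g of sample) from ID-ICP-MS sulfur data.

    Subtracts sulfur attributable to S-containing impurities from the
    total, then converts moles of protein-bound sulfur to hGH mass.
    """
    s_from_hgh = total_s_ug_per_g - impurity_s_ug_per_g   # ug S per g sample
    mol_s_per_g = s_from_hgh * 1e-6 / S_MOLAR             # mol S per g sample
    mol_hgh_per_g = mol_s_per_g / S_PER_HGH               # mol hGH per g sample
    return mol_hgh_per_g * HGH_MOLAR * 1e6                # ug hGH per g sample
```

The impurity subtraction is the step that makes the result a certification value rather than just a total-sulfur proxy.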
GMO quantification: valuable experience and insights for the future.
Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana
2014-10-01
Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
NASA Astrophysics Data System (ADS)
Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.
2018-01-01
Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.
Porra, Luke; Swan, Hans; Ho, Chien
2015-08-01
Introduction: Acoustic Radiation Force Impulse (ARFI) Quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of this present study predicted that an increase in applied transducer force would result in an increase in SWVs measured. Methods: ARFI quantification within the left lobe of the liver was performed within a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Significant pairwise comparisons using Bonferroni correction for multiple comparisons showed that with an increase in applied transducer force, there was a decrease in SWVs. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
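The cited work uses partial least squares regression trained on calibration spectra; as a simpler stand-in, the same idea of deconvolving co-eluting proteins from a single UV spectrum can be sketched with classical least-squares unmixing, assuming additive Beer-Lambert contributions and known unit-concentration spectra for each protein (both assumptions, and a deliberate simplification of the PLS approach):

```python
import numpy as np

def unmix_concentrations(pure_spectra, mixture_spectrum):
    """Classical least-squares spectral unmixing.

    pure_spectra: (n_wavelengths, n_proteins) matrix, one column per protein's
                  absorbance spectrum at unit concentration.
    mixture_spectrum: (n_wavelengths,) measured absorbance of the eluate.
    Returns the estimated concentration of each protein, assuming the mixture
    spectrum is a linear (Beer-Lambert) combination of the pure spectra.
    """
    conc, *_ = np.linalg.lstsq(pure_spectra, mixture_spectrum, rcond=None)
    return conc
```

Applied to each detector scan during elution, such a deconvolution yields per-protein elution profiles inline, which is what enables purity-based real-time pooling decisions.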
Gallego, Sandra F; Højlund, Kurt; Ejsing, Christer S
2018-01-01
Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MS ALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.
Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne
2010-01-01
Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.
Lowering the quantification limit of the QubitTM RNA HS assay using RNA spike-in.
Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev
2015-05-06
RNA quantification is a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even the much more sensitive fluorescence-based RNA quantification assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL, as well as RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with original concentration as low as 55.6 pg/μL compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
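The spike-in arithmetic is a baseline subtraction: the sample's contribution is the assay reading minus the spike-in-only reading. A minimal sketch of this calculation (illustrative readings, not the paper's data):

```python
def rna_conc_with_spike_in(sample_reading, baseline_reading, dilution_factor=1.0):
    """Estimate sample RNA concentration (pg/uL in the assay tube) as the
    reading increase over the RNA spike-in baseline, optionally scaled
    back to the original sample by a dilution factor."""
    increase = sample_reading - baseline_reading
    if increase < 0:
        raise ValueError("Sample reading below spike-in baseline")
    return increase * dilution_factor

# Spike-in alone reads 25 pg/uL (the assay's minimum reading);
# spike-in plus sample reads 31 pg/uL.
conc = rna_conc_with_spike_in(31.0, 25.0)  # 6 pg/uL contributed by the sample
```

The paper's reported 5 pg/μL limit corresponds to the smallest reading increase that remains accurate and precise over the baseline.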
Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.
Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian
2017-04-01
Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. 
NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques revealed a distinct relation of measured collagen and utilized quantification method as well as section thickness. Besides electron microscopy-stereology, which was precise and sensitive, light microscopy-stereology and automated image analysis proved to be appropriate for collagen quantification. Moreover, consideration of collagen localization might be important in revealing minor fibrotic changes. Copyright © 2017 the American Physiological Society.
Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.
de Wit, C; Fautz, C; Xu, Y
2000-09-01
Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic uses. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in production cell culture and demonstration of sufficient elimination of such retrovirus-like particles by the downstream purification process are required for product market registration worldwide. EM, with a detection limit of 1×10⁶ particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA is equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'→3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to the EM analysis. In addition, it offers significant advantages over the EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost.
Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell-based production system. Copyright 2000 The International Association for Biologicals.
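The conversion the abstract describes is direct: each particle carries two pRNA copies, so the particle count is half the measured copy number. A one-line sketch:

```python
def particles_per_ml(prna_copies_per_ml):
    """Each retrovirus-like particle carries two copies of genomic pRNA,
    so particle concentration = pRNA copy concentration / 2."""
    return prna_copies_per_ml / 2.0

# 1.2e3 pRNA copies/ml corresponds to 600 particles/ml,
# the sensitivity figure quoted for the TaqMan assay.
sensitivity = particles_per_ml(1.2e3)
```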
Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki
2016-10-01
Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marques da Silva, A; Fischer, A
2015-06-15
Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV), performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization, in each scanner, are those which achieve EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy's effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that yielded the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17mm). Conclusion: The harmonization strategy of the SUV quantification implemented with these devices was effective in reducing the variability of small structures quantification, minimizing the inter-scanner and inter-institution differences in quantification.
However, it is essential that, in addition to the harmonization of quantification, the standardization of the methodology of patient preparation must be maintained, in order to minimize the SUV variability due to biological factors. Financial support by CAPES.
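The parameter selection the abstract describes amounts to picking, per scanner, the reconstruction whose recovery coefficients have the lowest RMSE against the harmonization targets. A sketch with synthetic RC values (the target RCs below are placeholders, not the actual EARL specifications):

```python
import math

# Illustrative target recovery coefficients, one per phantom sphere size.
target_rc = [0.31, 0.59, 0.73, 0.83, 0.91, 0.95]

# Measured RCs for two candidate reconstruction parameter sets (synthetic numbers).
candidates = {
    "FBP 5mm":       [0.30, 0.57, 0.75, 0.84, 0.90, 0.96],
    "OSEM 4i8s 6mm": [0.40, 0.70, 0.85, 0.95, 1.02, 1.05],
}

def rmse(measured, target):
    """Root mean square error between measured and target recovery coefficients."""
    return math.sqrt(sum((m - t) ** 2 for m, t in zip(measured, target)) / len(target))

# The most appropriate reconstruction parameters minimize the RMSE.
best = min(candidates, key=lambda name: rmse(candidates[name], target_rc))
```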
2010-01-01
throughout the entire 3D volume which made quantification of the different tissues in the breast possible. The peaks representing glandular and fat in...coefficients. Keywords: tissue quantification, absolute attenuation coefficient, scatter correction, computed tomography, tomography... tissue types. 1-4 Accurate measurements of the quantification and differentiation of numerous tissues can be useful to identify disease from
Installation Restoration Program. Confirmation/Quantification Stage 1. Phase 2
1985-03-07
INSTALLATION RESTORATION PROGRAM PHASE II - CONFIRMATION/QUANTIFICATION STAGE 1, KIRTLAND AFB, NEW MEXICO 87117. PREPARED BY SCIENCE...APPLICATIONS INTERNATIONAL CORPORATION, 505 MARQUETTE NW, SUITE 1200, ALBUQUERQUE, NEW MEXICO 87102. MARCH 1985 FINAL REPORT FROM FEB 1983 TO MAR 1985...QUANTIFICATION STAGE 1 FINAL REPORT FOR KIRTLAND AFB, KIRTLAND AFB, NEW MEXICO 87117. HEADQUARTERS MILITARY AIRLIFT COMMAND, COMMAND SURGEON'S OFFICE (HQ MAC
2016-04-01
QUANTIFICATION OF VX NERVE AGENT IN VARIOUS FOOD MATRICES BY SOLID-PHASE EXTRACTION ULTRA-PERFORMANCE...TITLE AND SUBTITLE Quantification of VX Nerve Agent in Various Food Matrices by Solid-Phase Extraction Ultra-Performance Liquid Chromatography...food matrices. The mixed-mode cation exchange (MCX) sorbent and Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) methods were used for
Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.
Rehbein, Peter; Schwalbe, Harald
2015-06-01
Quantitative analysis of electrophoresis gels is an important part in molecular cloning, as well as in protein expression and purification. Parallel quantifications in yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable for single samples and with special features for protein expression screens. As major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.
Fluorescent quantification of melanin.
Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur
2016-11-01
Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
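Whether by absorbance or fluorescence, quantification of this kind ultimately reads unknowns off a standard curve of known melanin concentrations. A minimal sketch of that step, using synthetic, perfectly linear calibration data for illustration:

```python
import numpy as np

# Synthetic standard curve: signal intensity vs. known melanin concentration (ug/mL).
std_conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
std_signal = np.array([2.0, 52.0, 102.0, 202.0, 402.0])  # linear for illustration

# Fit a first-degree polynomial (linear calibration).
slope, intercept = np.polyfit(std_conc, std_signal, 1)

def melanin_conc(signal):
    """Back-calculate an unknown's concentration from the linear standard curve."""
    return (signal - intercept) / slope
```

With real fluorescence data the curve would be fit over the assay's validated linear range, and unknowns outside that range diluted and re-measured.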
Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.
2015-01-01
Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788
Turner, Clare E; Russell, Bruce R; Gant, Nicholas
2015-11-01
Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9% ± 10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6% ± 8.6). Results suggest that software packages that employ the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
2014-01-01
Background Various computer-based methods exist for the detection and quantification of protein spots in two dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two dimensional Gaussian function curves for the extraction of data from two dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our methods showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and could be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
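The key link between the fitted-function and area-based volumes is that the integral of a 2D Gaussian has a closed form: 2π·A·σx·σy for amplitude A and axis standard deviations σx, σy. A sketch comparing that analytic spot volume with a pixel-sum volume on a synthetic spot (all parameters illustrative):

```python
import math
import numpy as np

def gaussian2d(x, y, amp, x0, y0, sx, sy):
    """Elliptical 2D Gaussian spot model."""
    return amp * np.exp(-(((x - x0) ** 2) / (2 * sx ** 2)
                          + ((y - y0) ** 2) / (2 * sy ** 2)))

def spot_volume(amp, sx, sy):
    """Integrated intensity of a 2D Gaussian spot, in closed form."""
    return 2 * math.pi * amp * sx * sy

# Synthetic spot on a grid large enough to contain it.
y, x = np.mgrid[0:200, 0:200]
img = gaussian2d(x, y, amp=100.0, x0=100.0, y0=100.0, sx=6.0, sy=4.0)

numeric = img.sum()                      # pixel-sum volume, as area-based methods compute it
analytic = spot_volume(100.0, 6.0, 4.0)  # volume from fitted parameters
```

For isolated spots the two agree closely; the fitted-function volume becomes advantageous precisely when spots overlap and pixel areas cannot be cleanly assigned.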
Reiter, Rolf; Wetzel, Martin; Hamesch, Karim; Strnad, Pavel; Asbach, Patrick; Haas, Matthias; Siegmund, Britta; Trautwein, Christian; Hamm, Bernd; Klatt, Dieter; Braun, Jürgen; Sack, Ingolf; Tzschätzsch, Heiko
2018-01-01
Although it has been known for decades that patients with alpha1-antitrypsin deficiency (AATD) have an increased risk of cirrhosis and hepatocellular carcinoma, limited data exist on non-invasive imaging-based methods for assessing liver fibrosis such as magnetic resonance elastography (MRE) and acoustic radiation force impulse (ARFI) quantification, and no data exist on 2D-shear wave elastography (2D-SWE). Therefore, the purpose of this study is to evaluate and compare the applicability of different elastography methods for the assessment of AATD-related liver fibrosis. Fifteen clinically asymptomatic AATD patients (11 homozygous PiZZ, 4 heterozygous PiMZ) and 16 matched healthy volunteers were examined using MRE and ARFI quantification. Additionally, patients were examined with 2D-SWE. A high correlation is evident for the shear wave speed (SWS) determined with different elastography methods in AATD patients: 2D-SWE/MRE, ARFI quantification/2D-SWE, and ARFI quantification/MRE (R = 0.8587, 0.7425, and 0.6914, respectively; P≤0.0089). Four AATD patients with pathologically increased SWS were consistently identified with all three methods-MRE, ARFI quantification, and 2D-SWE. The high correlation and consistent identification of patients with pathologically increased SWS using MRE, ARFI quantification, and 2D-SWE suggest that elastography has the potential to become a suitable imaging tool for the assessment of AATD-related liver fibrosis. These promising results provide motivation for further investigation of non-invasive assessment of AATD-related liver fibrosis using elastography.
van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A
2017-10-17
Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, majorly based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13 C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12 C and 13 C lignin were isolated from nonlabeled and uniformly 13 C labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC), nuclear magnetic resonance (NMR), and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13 C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12 C lignin analogue and was shown to be extremely accurate (>99.9%, R 2 > 0.999) and precise (RSD < 1.5%). Third, 13 C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin.
A phase quantification method based on EBSD data for a continuously cooled microalloyed steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, H.; Wynne, B.P.; Palmiere, E.J.
2017-01-15
Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite on grain averaged misorientation angles, aspect ratios, high angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that this EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD-based method.
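The grain-level identification criteria can be sketched as a rule-based classifier over the EBSD-derived parameters the abstract names. All thresholds below are illustrative stand-ins, not the published values:

```python
def classify_grain(gam_deg, aspect_ratio, hagb_fraction, area_um2, min_area_um2=5.0):
    """Toy grain classifier in the spirit of the paper's criteria:
    grain-averaged misorientation (GAM, degrees), aspect ratio,
    high-angle grain boundary (HAGB) fraction, and grain area.
    Thresholds are illustrative, not the published values."""
    if area_um2 < min_area_um2:
        return "unclassified"  # GAM is unreliable below a critical grain area
    if gam_deg < 0.6 and aspect_ratio < 2.0:
        return "polygonal/quasi-polygonal ferrite"
    if gam_deg < 1.2 and hagb_fraction > 0.5:
        return "acicular ferrite"
    return "bainitic ferrite"

# Phase fractions then follow from classifying every grain, weighted by area.
labels = [classify_grain(0.4, 1.5, 0.7, 20.0), classify_grain(2.0, 3.0, 0.2, 20.0)]
```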
2012-03-19
Three Extremity Armor Systems: Determination of Physiological, Biomechanical, and Physical Performance Effects and Quantification of Body Area Coverage (report NATICK/TR-12/014; contract MIPR #M9545006MPR6CC7).
Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip
2007-08-01
The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on assessment of biofilm production by staphylococci, gained both by direct experience as well as by analysis of methods for assaying biofilm production. The obtained results should simplify quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.
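A common convention in this protocol family sets the cut-off OD as the mean of the negative controls plus three standard deviations, with weak/moderate/strong categories at 2x and 4x that cut-off. A minimal sketch, treating those boundaries as one widely used convention rather than the only one:

```python
import statistics

def cutoff_od(negative_controls):
    """Cut-off OD: mean of negative-control wells plus three standard deviations."""
    return statistics.mean(negative_controls) + 3 * statistics.stdev(negative_controls)

def classify_biofilm(od, odc):
    """Categorize a well's optical density against the cut-off ODc."""
    if od <= odc:
        return "non-producer"
    if od <= 2 * odc:
        return "weak"
    if od <= 4 * odc:
        return "moderate"
    return "strong"
```

In practice each strain is tested in several wells and the mean OD is classified, which is what makes results comparable among laboratories.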
On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification
Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.
2014-01-01
Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantifications in phantom and ex vivo acquisitions. PMID:24123362
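The temperature dependence described here can be made concrete: water's proton resonance frequency shifts with temperature (a commonly cited coefficient is about -0.01 ppm/°C) while triglyceride resonances are nearly temperature-independent, so the apparent fat-water offset changes with temperature. A sketch assuming that coefficient and a nominal 3.4 ppm fat-water shift at 37 °C (both values are approximations, not taken from this paper):

```python
# Illustrative: apparent fat-water frequency offset vs. temperature at 1.5 T.
GAMMA_MHZ_PER_T = 42.577  # proton gyromagnetic ratio / 2*pi

def fat_water_shift_hz(temp_c, b0_t=1.5, shift_ppm_37c=3.4, alpha_ppm_per_c=-0.01):
    """Fat-water shift in Hz; water moves with temperature, fat does not."""
    larmor_mhz = GAMMA_MHZ_PER_T * b0_t
    shift_ppm = shift_ppm_37c - alpha_ppm_per_c * (temp_c - 37.0)
    return shift_ppm * larmor_mhz  # ppm * MHz = Hz
```

Cooling from 37 °C to 0 °C shrinks the assumed offset by about 0.37 ppm (~24 Hz at 1.5 T), which is why a fixed spectral model of fat miscalibrates in cold phantoms and ex vivo tissue.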
NASA Astrophysics Data System (ADS)
Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène
2015-11-01
Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole-metabolite quantification approach, the quantification performance of correlation spectroscopy is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable computation of the lower bound for error variance, generally known as the Cramér-Rao bounds (CRBs), a standard of precision, on the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared to JPRESS. A quick analysis might suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
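For the Gaussian-noise case, the Cramér-Rao bound computation underlying this comparison is the inverse of the Fisher information matrix built from the model Jacobian. A generic sketch, not the paper's 2D MRS signal model:

```python
import numpy as np

def cramer_rao_bounds(jacobian, noise_sigma):
    """Lower bounds on parameter standard deviations for a model with i.i.d.
    Gaussian noise: sqrt of the diagonal of the inverse Fisher matrix."""
    fisher = jacobian.T @ jacobian / noise_sigma**2
    return np.sqrt(np.diag(np.linalg.inv(fisher)))
```

For example, for a linear model y = a + b*t sampled at t = (-1, 0, 1) with unit noise, the Fisher matrix is diag(3, 2), so the bounds are 1/sqrt(3) and 1/sqrt(2).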
Automated lobar quantification of emphysema in patients with severe COPD.
Revel, Marie-Pierre; Faivre, Jean-Baptiste; Remy-Jardin, Martine; Deken, Valérie; Duhamel, Alain; Marquette, Charles-Hugo; Tacelli, Nunzia; Bakai, Anne-Marie; Remy, Jacques
2008-12-01
Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using a dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p > 0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for the left upper lobe (ICC = 0.94), left lower lobe (ICC = 0.98), and right lower lobe (ICC = 0.80). The agreement was good for the right upper lobe (ICC = 0.68) and moderate for the middle lobe (ICC = 0.53). The Bland and Altman plots confirmed these results. Good agreement was observed between the software-assessed and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring.
Ott, Stephan J; Musfeldt, Meike; Ullmann, Uwe; Hampe, Jochen; Schreiber, Stefan
2004-06-01
The composition of the human intestinal flora is important for the health status of the host. The global composition and the presence of specific pathogens are relevant to the effects of the flora. Therefore, accurate quantification of all major bacterial populations of the enteric flora is needed. A TaqMan real-time PCR-based method for the quantification of 20 dominant bacterial species and groups of the intestinal flora has been established on the basis of 16S ribosomal DNA taxonomy. A PCR with conserved primers was used for all reactions. In each real-time PCR, a universal probe for quantification of total bacteria and a specific probe for the species in question were included. PCR with conserved primers and the universal probe for total bacteria allowed relative and absolute quantification. Minor groove binder probes increased the sensitivity of the assays 10- to 100-fold. The method was evaluated by cross-reaction experiments and quantification of bacteria in complex clinical samples from healthy patients. A sensitivity of 10(1) to 10(3) bacterial cells per sample was achieved. No significant cross-reaction was observed. The real-time PCR assays presented may facilitate understanding of the intestinal bacterial flora through a normalized global estimation of the major contributing species.
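With a universal probe for total bacteria and a species-specific probe run on the same conserved-primer PCR, a relative abundance follows directly from the difference in threshold cycles. A minimal sketch assuming ideal doubling-per-cycle efficiency for both reactions:

```python
def relative_abundance(ct_specific, ct_universal, efficiency=2.0):
    """Fraction of the target group relative to total bacteria, from the
    threshold cycles of the specific and universal probes (assumes equal
    amplification efficiency for both reactions)."""
    return efficiency ** (ct_universal - ct_specific)
```

A species crossing threshold five cycles after the universal probe therefore makes up about 1/32 of the total population under these assumptions; real assays calibrate efficiency per target.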
Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis
2017-03-01
A new analytical method for the quantification of olive oil and palm oil in blends with other edible vegetable oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated using several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). Notably, the chromatographic analysis takes only eight minutes, and the results obtained demonstrate the potential of this method for quantifying mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
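The three validation metrics named here are straightforward to compute from predicted versus reference concentrations; a minimal sketch:

```python
import statistics

def rmsev(y_true, y_pred):
    """Root Mean Square Error of Validation."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

def maev(y_true, y_pred):
    """Mean Absolute Error of Validation."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mdaev(y_true, y_pred):
    """Median Absolute Error of Validation -- robust to a few large misses."""
    return statistics.median(abs(t - p) for t, p in zip(y_true, y_pred))
```

Reporting all three together is informative because RMSEV is dominated by outliers while MdAEV ignores them.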
Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.
Chahrour, Osama; Malone, John
2017-01-01
Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS detectable elements), that form a covalent chemical bond with the protein thus creating analogs which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright © Bentham Science Publishers.
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
Colour thresholding and objective quantification in bioimaging
NASA Technical Reports Server (NTRS)
Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.
1992-01-01
Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
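At its simplest, the colour thresholding described reduces to testing each pixel's hue against a window in HSV space, then reporting the positive fraction as the objective quantity. A minimal sketch; the window limits and saturation/value floors below are illustrative, not calibrated values:

```python
def in_hue_range(h, s, v, h_lo, h_hi, s_min=0.2, v_min=0.2):
    """True if an HSV pixel (all channels in [0, 1]) falls in the stain's hue window."""
    if s < s_min or v < v_min:
        return False  # too grey or too dark to assign a hue reliably
    if h_lo <= h_hi:
        return h_lo <= h <= h_hi
    return h >= h_lo or h <= h_hi  # wrap-around window (e.g. reds near h = 0)

def fraction_positive(pixels, h_lo, h_hi):
    """Fraction of (h, s, v) pixels classified as stain-positive."""
    hits = sum(in_hue_range(h, s, v, h_lo, h_hi) for h, s, v in pixels)
    return hits / len(pixels)
```

RGB pixels can be converted with the standard library's `colorsys.rgb_to_hsv` before thresholding; separating hue from intensity is exactly what lets closely related stains be distinguished where monochrome densitometry fails.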
AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid
Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik
2014-01-01
A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this connectivity, a smart grid system is exposed to potential security threats through its network. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that network connectivity must be considered for more optimized vulnerability quantification. PMID:25152923
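The route-based idea can be sketched generically: score each hop on an attack route, combine the per-hop scores into a route score, and rank routes by it. The combination rule below (product of exploitability-style scores in [0, 1]) is an illustrative assumption, not the paper's AVQS formulas:

```python
# Hypothetical sketch of attack-route scoring; hop scores could come from
# CVSS base scores rescaled to [0, 1]. Not the actual AVQS computation.
def route_vulnerability(hop_scores):
    """Probability-style combination: a route succeeds only if every hop is exploitable."""
    p = 1.0
    for s in hop_scores:
        p *= s
    return p

def worst_route(routes):
    """Pick the attack route with the highest combined vulnerability."""
    return max(routes, key=route_vulnerability)
```

Ranking routes this way is what lets network connectivity, and not just per-host scores, drive the prioritization.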
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parreiras Nogueira, Liebert; Barroso, Regina Cely; Pereira de Almeida, Andre
2012-05-17
This work evaluates histomorphometric quantification by synchrotron radiation computed microtomography in human and rat bone specimens. The human specimens are classified as normal or pathological (affected by injury), and the rat specimens as irradiated (simulating radiotherapy procedures) or non-irradiated. Images were obtained on the SYRMEP beamline at the Elettra Synchrotron Laboratory in Trieste, Italy. The system generated tomographic images with 14 μm resolution. The quantification of bone structures was performed directly on the 3D rendered images using in-house software. The resolution obtained was excellent, facilitating quantification of bone microstructures.
Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.
Huang, Chia-Chia; Pan, Tzu-Ming
2005-05-18
The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
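Copy-number figures like the one quoted (a DNA mass expressed as genome copies) follow from Avogadro's number and an average base-pair mass of about 660 g/mol. A minimal sketch; the genome size used in the test is illustrative, not the soybean value:

```python
AVOGADRO = 6.022e23  # molecules per mole

def genome_copies(dna_mass_ng, genome_size_bp, bp_mass_g_per_mol=660.0):
    """Haploid genome copy number in a DNA sample of known mass."""
    grams = dna_mass_ng * 1e-9
    return grams * AVOGADRO / (genome_size_bp * bp_mass_g_per_mol)
```

This conversion is what turns a detection limit stated in nanograms into one stated in copies, the unit that determines whether low GMO percentages are quantifiable at all.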
Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V
2015-12-01
Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy
2004-01-01
In Europe, a growing interest in reliable techniques for the quantification of genetically modified component(s) of food matrices is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy-to-use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single-copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement.
In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system, because a second, independent real-time PCR-based measurement must be performed. Moreover, for some reference genes insufficient information on copy number within and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limit of detection and the limit of quantitation were determined to be <0.1%.
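The repeatability and reproducibility figures (RSDr, RSDR) are relative standard deviations, i.e. the standard deviation expressed as a percentage of the mean; a minimal sketch:

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (%), as used for RSDr/RSDR in method validation."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)
```

RSDr is computed from replicates within a laboratory, RSDR from results pooled across laboratories, which is why the latter is systematically larger.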
Recurrence quantification as potential bio-markers for diagnosis of pre-cancer
NASA Astrophysics Data System (ADS)
Mukhopadhyay, Sabyasachi; Pratiher, Sawon; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.
2017-03-01
In this paper, spectroscopy signals are analyzed using recurrence plots (RP), and recurrence quantification analysis (RQA) parameters are extracted from the RP in order to classify tissues into normal and different precancerous grades. Three RQA parameters are quantified to extract the important features in the spectroscopy data. These features are then fed to different classifiers. Simulation results validate the efficacy of recurrence quantification as a potential bio-marker for the diagnosis of pre-cancer.
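Two of the standard RQA parameters, recurrence rate and determinism, can be computed directly from a thresholded recurrence matrix. A minimal sketch for a 1D signal using an absolute-difference distance (the paper does not specify which three parameters it uses, so these are representative choices):

```python
def recurrence_matrix(signal, eps):
    """Binary recurrence matrix: 1 where two samples lie within eps of each other."""
    n = len(signal)
    return [[1 if abs(signal[i] - signal[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(rm):
    """Fraction of recurrent points in the whole matrix."""
    n = len(rm)
    return sum(map(sum, rm)) / (n * n)

def determinism(rm, lmin=2):
    """Fraction of off-diagonal recurrent points lying on diagonal lines of
    length >= lmin (the matrix is symmetric, so upper diagonals suffice)."""
    n = len(rm)
    in_lines = recurrent = 0
    for k in range(1, n):
        run = 0
        for v in [rm[i][i + k] for i in range(n - k)] + [0]:  # sentinel 0 flushes runs
            if v:
                run += 1
            else:
                if run:
                    recurrent += run
                    if run >= lmin:
                        in_lines += run
                run = 0
    return in_lines / recurrent if recurrent else 0.0
```

A strictly periodic signal gives determinism 1.0; noisy or chaotic signals score lower, which is what makes these quantities usable as classifier features.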
Rauniyar, Navin
2015-01-01
The parallel reaction monitoring (PRM) assay has emerged as an alternative method of targeted quantification. The PRM assay is performed in a high resolution and high mass accuracy mode on a mass spectrometer. This review presents the features that make PRM a highly specific and selective method for targeted quantification using quadrupole-Orbitrap hybrid instruments. In addition, this review discusses the label-based and label-free methods of quantification that can be performed with the targeted approach. PMID:26633379
2016-02-01
[Time-of-]Flight Mass Spectrometry: Quantification of Free GB from Various Food Matrices (ECBC-TR-1351). Sue Y. Bae, Mark D. Winemiller, Research and Technology Directorate. The report concerns the quantification of isopropyl methylphosphonofluoridate (sarin, GB) in various food matrices, including the development of a solid-phase extraction method using a normal-phase silica gel column.
Current position of high-resolution MS for drug quantification in clinical & forensic toxicology.
Meyer, Markus R; Helfer, Andreas G; Maurer, Hans H
2014-08-01
This paper reviews high-resolution MS approaches published from January 2011 until March 2014 for the quantification of drugs (of abuse) and/or their metabolites in biosamples using LC-MS with time-of-flight or Orbitrap™ mass analyzers. Corresponding approaches are discussed including sample preparation and mass spectral settings. The advantages and limitations of high-resolution MS for drug quantification, as well as the demand for a certain resolution or a specific mass accuracy are also explored.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Tujin; Gao, Yuqian; Gaffrey, Matthew J.
2014-12-17
Mass spectrometry-based targeted quantification is a promising technology for site-specific quantification of posttranslational modifications (PTMs). However, a major constraint of most targeted MS approaches is the limited sensitivity for quantifying low-abundance PTMs, requiring the use of affinity reagents to enrich specific PTMs. Herein, we demonstrate the direct site-specific quantification of ERK phosphorylation isoforms (pT, pY, pTpY) and their relative stoichiometries using a highly sensitive targeted MS approach termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM). PRISM provides effective enrichment of target peptides within a given fraction from a complex biological matrix with minimal sample losses, followed by selected reaction monitoring (SRM) quantification. The PRISM-SRM approach enabled direct quantification of ERK phosphorylation in human mammary epithelial cells (HMEC) from as little as 25 µg of tryptic peptides from whole cell lysates. Compared to immobilized metal-ion affinity chromatography, PRISM provided a >10-fold improvement in signal intensities, presumably due to the better peptide recovery of PRISM when handling small samples. This approach was applied to quantify ERK phosphorylation dynamics in HMEC treated with different doses of EGF at both peak activation (10 min) and steady state (2 h). At 10 min, maximal ERK activation was observed at the 0.3 ng/mL dose, whereas the maximal steady-state level of ERK activation at 2 h was at the 3 ng/mL dose, corresponding to 1200 and 9000 occupied receptors, respectively. At 10 min, the maximally activated pTpY isoform represented ~40% of total ERK, falling to less than 10% at 2 h. The time course and dose-response profiles of the individual phosphorylated ERK isoforms indicated that the singly phosphorylated pT-ERK never increases significantly, while the increase of pY-ERK paralleled that of pTpY-ERK.
These data support a processive, rather than distributive, model of ERK phosphorylation. The PRISM-SRM quantification of protein phosphorylation illustrates the potential for simultaneous quantification of multiple PTMs.
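The isoform stoichiometries reported (e.g. pTpY at ~40% of total ERK at peak activation) are fractions of the summed isoform amounts; a minimal sketch with illustrative numbers:

```python
def isoform_stoichiometry(amounts):
    """Fractional occupancy of each (phospho)isoform relative to total protein."""
    total = sum(amounts.values())
    return {iso: amt / total for iso, amt in amounts.items()}
```

Because all isoforms are measured in the same run, the fractions are internally consistent by construction, which is the advantage of direct site-specific quantification over separate assays.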
WATSFAR: numerical simulation of soil WATer and Solute fluxes using a FAst and Robust method
NASA Astrophysics Data System (ADS)
Crevoisier, David; Voltz, Marc
2013-04-01
To simulate the evolution of hydro- and agro-systems, numerous spatialised models are based on a multi-local approach, and data-assimilation techniques to improve simulation accuracy are now used in many application fields. The latest acquisition techniques provide a large amount of experimental data, which increases the efficiency of parameter estimation and inverse modelling approaches. In turn, simulations are often run on large temporal and spatial domains, which requires a large number of model runs. Despite the regular increase in computing capacity, the development of fast and robust methods describing the evolution of saturated-unsaturated soil water and solute fluxes therefore remains a challenge. Ross (2003, Agron J; 95:1352-1361) proposed a method, solving the 1D Richards and convection-diffusion equations, that fulfils these requirements. The method is based on a non-iterative approach, which reduces the risk of numerical divergence and allows the use of coarser spatial and temporal discretisations while assuring satisfying accuracy of the results. Crevoisier et al. (2009, Adv Wat Res; 32:936-947) proposed some technical improvements and validated this method on a wider range of agro-pedo-climatic situations. In this poster, we present the simulation code WATSFAR, which generalises the Ross method to other mathematical representations of the soil water retention curve (i.e. the standard and modified van Genuchten models) and includes a dual-permeability context (preferential fluxes) for both water and solute transfers. The situations tested are those known to be the least favourable for standard numerical methods: fine-textured and extremely dry soils, intense rainfall and solute fluxes, soils near saturation, etc. The results of WATSFAR have been compared with the standard finite element model Hydrus.
The analysis of these comparisons highlights two main advantages of WATSFAR: (i) robustness: even on fine-textured soils or with high water and solute fluxes, where Hydrus simulations may fail to converge, no numerical problem appears; and (ii) accuracy of the simulations even for coarse spatial discretisations, which Hydrus can only match with fine discretisations.
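The standard van Genuchten retention curve that WATSFAR supports relates matric head to volumetric water content; a minimal sketch (the parameter values used in the test are illustrative, not from a specific soil):

```python
def van_genuchten_theta(h, theta_r, theta_s, alpha, n):
    """Volumetric water content theta(h) for the standard van Genuchten curve.

    h is the matric head (negative when unsaturated), theta_r/theta_s the
    residual/saturated contents, alpha and n the shape parameters (m = 1 - 1/n).
    """
    if h >= 0:
        return theta_s  # saturated
    m = 1.0 - 1.0 / n
    se = (1.0 + (alpha * abs(h)) ** n) ** (-m)  # effective saturation
    return theta_r + (theta_s - theta_r) * se
```

The steep nonlinearity of this curve in dry, fine-textured soils is precisely what makes standard iterative solvers struggle, and what the non-iterative Ross scheme is designed to handle.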
NASA Astrophysics Data System (ADS)
Beckers, Joost; Buckman, Lora; Bachmann, Daniel; Visser, Martijn; Tollenaar, Daniel; Vatvani, Deepak; Kramer, Nienke; Goorden, Neeltje
2015-04-01
Decision making in disaster management requires fast access to reliable and relevant information. We believe that online information and services will become increasingly important in disaster management. Within the EU FP7 project RASOR (Rapid Risk Assessment and Spatialisation of Risk), an online platform is being developed for rapid multi-hazard risk analyses to support disaster management anywhere in the world. The platform will provide access to a plethora of GIS data that are relevant to risk assessment. It will also enable the user to run numerical flood models to simulate historical and newly defined flooding scenarios. The results of these models are maps of flood extent, flood depths and flow velocities. The RASOR platform will enable the overlay of historical event flood maps with observations and Earth Observation (EO) imagery to fill in gaps and assess the accuracy of the flood models. New flooding scenarios can be defined by the user and simulated to investigate the potential impact of future floods. A series of flood models have been developed within RASOR for selected case study areas around the globe that are subject to very different flood hazards:
• The city of Bandung in Indonesia, which is prone to fluvial flooding induced by heavy rainfall. The flood hazard is exacerbated by land subsidence.
• The port of Cilacap on the south coast of Java, subject to tsunami hazard from submarine earthquakes in the Sunda trench.
• The area south of the city of Rotterdam in the Netherlands, prone to coastal and/or riverine flooding.
• The island of Santorini in Greece, which is subject to tsunamis induced by landslides.
Flood models have been developed for each of these case studies using mostly EO data, augmented by local data where necessary. Particular use was made of the new TanDEM-X (TerraSAR-X add-on for Digital Elevation Measurement) product from the German Aerospace Center (DLR) and EADS Astrium.
The presentation will describe the flood models and the flooding scenarios that can be defined by the RASOR end user to support risk management in each area. Ongoing work for three more case studies (Haiti, Po valley in Italy and Jakarta, Indonesia) will also be discussed.
The NASA Langley Multidisciplinary Uncertainty Quantification Challenge
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2014-01-01
This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.
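Of the five aspects listed, uncertainty propagation is the easiest to illustrate: push samples of the uncertain inputs through the model and summarize the output distribution. A minimal Monte Carlo sketch; the challenge problem itself uses richer (mixed aleatory/epistemic) uncertainty models than plain sampling:

```python
import random

def propagate(model, param_dists, n=10000, seed=0):
    """Monte Carlo uncertainty propagation: sample each input distribution,
    evaluate the model, and return the output mean and standard deviation."""
    rng = random.Random(seed)
    outputs = [model(*[d(rng) for d in param_dists]) for _ in range(n)]
    mean = sum(outputs) / n
    var = sum((y - mean) ** 2 for y in outputs) / (n - 1)
    return mean, var ** 0.5

# Usage: each input distribution is a callable taking the RNG, e.g. a standard
# normal via (lambda r: r.gauss(0.0, 1.0)).
```

For the sum of two independent standard normals the output standard deviation should approach sqrt(2), a quick self-check of any propagation code.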
Takeno, Shinya; Bamba, Takeshi; Nakazawa, Yoshihisa; Fukusaki, Eiichiro; Okazawa, Atsushi; Kobayashi, Akio
2008-04-01
Commercial development of trans-1,4-polyisoprene from Eucommia ulmoides Oliver (EU-rubber) requires specific knowledge on selection of high-rubber-content lines and establishment of agronomic cultivation methods for achieving maximum EU-rubber yield. The development can be facilitated by high-throughput and highly sensitive analytical techniques for EU-rubber extraction and quantification. In this paper, we described an efficient EU-rubber extraction method, and validated that the accuracy was equivalent to that of the conventional Soxhlet extraction method. We also described a highly sensitive quantification method for EU-rubber by Fourier transform infrared spectroscopy (FT-IR) and pyrolysis-gas chromatography/mass spectrometry (PyGC/MS). We successfully applied the extraction/quantification method for study of seasonal changes in EU-rubber content and molecular weight distribution.
Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.
De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik
2018-01-01
Proton-induced (p,α) reactions are one type of nuclear reaction analysis (NRA) especially suitable for light-element quantification. For the lithium quantification presented in this work, protons accelerated to an energy of about 850 keV were used to induce the 7Li(p,α)4He reaction in standard reference and geological samples such as tourmaline and other Li minerals. It is shown that this technique allows lithium concentrations to be measured down to below one ppm. The possibility of relating the lithium content to the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, particle-induced X-ray emission (PIXE) was utilized as a complementary IBA technique for simultaneous mapping of elements heavier than sodium.
Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie
2013-09-06
Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefiting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
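The arithmetic behind combining an absolute total with per-channel relative ratios can be sketched as follows. This is a minimal illustration of the idea, assuming the total amount has already been obtained from the MRM trace against the mTRAQ Δ0/Δ8 references; the intensities are invented.

```python
# Sketch: distribute an MRM-derived total peptide amount across four
# samples in proportion to their 4-plex iTRAQ reporter-ion intensities.
def absolute_amounts(total_amount, itraq_intensities):
    """total_amount: absolute amount summed over the four samples.
    itraq_intensities: reporter-ion intensities of the four channels."""
    total_signal = sum(itraq_intensities)
    return [total_amount * i / total_signal for i in itraq_intensities]

# Invented example: 8 fmol total, channels in a 1:1:2:4 ratio
amounts = absolute_amounts(8.0, [1.0, 1.0, 2.0, 4.0])
# -> [1.0, 1.0, 2.0, 4.0] fmol per sample
```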
Application of the NUREG/CR-6850 EPRI/NRC Fire PRA Methodology to a DOE Facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tom Elicson; Bentley Harwood; Richard Yorg
2011-03-01
The application of the NUREG/CR-6850 EPRI/NRC fire PRA methodology to a DOE facility presented several challenges. This paper documents the process and discusses several insights gained during development of the fire PRA. A brief review of the tasks performed is provided, with particular focus on the following: • Tasks 5 and 14: Fire-induced risk model and fire risk quantification. A key lesson learned was to begin model development and quantification as early as possible in the project, using screening values and simplified modeling if necessary. • Tasks 3 and 9: Fire PRA cable selection and detailed circuit failure analysis. In retrospect, it would have been beneficial to perform the model development and quantification in two phases, with detailed circuit analysis applied during phase 2. This would have allowed for development of a robust model and quantification earlier in the project and would have provided insights into where to focus the detailed circuit analysis efforts. • Tasks 8 and 11: Scoping fire modeling and detailed fire modeling. More focus should be placed on detailed fire modeling and less on scoping fire modeling; this was the approach taken for the fire PRA. • Task 14: Fire risk quantification. Typically, multiple safe shutdown (SSD) components fail during a given fire scenario; therefore, dependent failure analysis is critical to obtaining a meaningful fire risk quantification. Dependent failure analysis for the fire PRA presented several challenges, which are discussed in the full paper.
Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun
2015-01-21
Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.
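The linear calibration curves mentioned above can be sketched as an ordinary least-squares fit of the analyte-to-internal-standard peak-area ratio against spiked concentration. The concentrations, area ratios, and back-calculated unknown below are invented for illustration only.

```python
# Sketch of a linear calibration curve of the kind used in label-based
# and label-free MS quantification workflows (e.g., the Skyline-based
# MSE method above). All numbers are hypothetical.
def fit_line(x, y):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

conc = [0.0, 1.0, 2.0, 4.0]        # spiked peptide concentration (nM)
area_ratio = [0.1, 0.6, 1.1, 2.1]  # analyte / internal-standard peak area
slope, intercept = fit_line(conc, area_ratio)

# Back-calculate an unknown sample from its measured area ratio
unknown = (1.6 - intercept) / slope  # -> 3.0 nM
```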
Klauser, A S; Franz, M; Bellmann Weiler, R; Gruber, J; Hartig, F; Mur, E; Wick, M C; Jaschke, W
2011-12-01
To compare joint inflammation assessment using subjective grading of power Doppler ultrasonography (PDUS) and contrast-enhanced ultrasonography (CEUS) versus computer-aided objective CEUS quantification. 37 joints of 28 patients with arthritis of different etiologies underwent B-mode ultrasonography, PDUS, and CEUS using a second-generation contrast agent. Synovial thickness, extent of vascularized pannus and intensity of vascularization were included in a 4-point PDUS and CEUS grading system. Subjective CEUS and PDUS scores were compared to computer-aided objective CEUS quantification using Qontrast® software for the calculation of the signal intensity (SI) and the ratio of SI for contrast enhancement. The interobserver agreement for subjective scoring was good to excellent (κ = 0.8 - 1.0; P < 0.0001). Computer-aided objective CEUS quantification correlated statistically significantly with subjective CEUS (P < 0.001) and PDUS grading (P < 0.05). The Qontrast® SI ratio correlated with subjective CEUS (P < 0.02) and PDUS grading (P < 0.03). Clinical activity did not correlate with vascularity or synovial thickening (P = n.s.), and no correlation between synovial thickening and vascularity extent could be found with either PDUS or CEUS (P = n.s.). Both subjective CEUS grading and objective CEUS quantification are valuable for assessing joint vascularity in arthritis, and computer-aided CEUS quantification may be a suitable objective tool for therapy follow-up in arthritis. © Georg Thieme Verlag KG Stuttgart · New York.
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.
NASA Astrophysics Data System (ADS)
Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.
2017-03-01
The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided an unprecedented sensitivity toward GBNs and enhances conventional toxicology research by providing a direct correlation between uptake of GBNs at the single-cell level and cell viability status.
Quantifying construction and demolition waste: An analytical review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin
2014-09-15
Highlights: • Prevailing C&D waste quantification methodologies are identified and compared. • No single methodology can fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C&D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies are identified and potential future research directions are suggested.
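The waste generation rate method named in the review can be sketched in a few lines: estimated waste is simply a floor area multiplied by an empirical per-area generation rate. The rates and areas below are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch of the waste generation rate method: estimated
# waste = gross floor area x empirical per-area rate. The rates here
# are assumed illustrative values, not figures from the review.
RATES_KG_PER_M2 = {
    "new_build": 30.0,      # construction waste per m2 built
    "demolition": 1300.0,   # demolition waste per m2 demolished
}

def estimate_waste_tonnes(activity, gross_floor_area_m2):
    """Return the estimated waste mass in tonnes for a project."""
    return RATES_KG_PER_M2[activity] * gross_floor_area_m2 / 1000.0

waste = estimate_waste_tonnes("demolition", 500.0)  # -> 650.0 tonnes
```

Real applications would differentiate rates by structure type, material and region, which is part of why no single methodology fits all scenarios.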
Liu, Ruolin; Dickerson, Julie
2017-11-01
We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data sets. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. On a real data set, the transcript expression estimated by Strawberry showed the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
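The EM idea behind abundance estimation can be sketched in miniature: reads compatible with several transcripts are fractionally assigned in proportion to the current abundance estimates, and the abundances are re-normalized, iterating to convergence. This is a deliberately simplified sketch of the general technique, not Strawberry's actual model (which works on splicing-graph nodes and corrects for bias).

```python
# Simplified EM for transcript abundance: each read is represented only
# by the set of transcripts it is compatible with.
def em_abundance(compat, n_iter=200):
    """compat: list of sets of transcript ids each read maps to."""
    tx = sorted({t for s in compat for t in s})
    theta = {t: 1.0 / len(tx) for t in tx}     # uniform start
    for _ in range(n_iter):
        counts = {t: 0.0 for t in tx}
        for s in compat:                       # E-step: fractional assignment
            z = sum(theta[t] for t in s)
            for t in s:
                counts[t] += theta[t] / z
        total = sum(counts.values())           # M-step: renormalize
        theta = {t: c / total for t, c in counts.items()}
    return theta

# Invented toy data: 2 reads unique to t1, 1 to t2, 1 ambiguous
reads = [{"t1"}, {"t1"}, {"t1", "t2"}, {"t2"}]
theta = em_abundance(reads)   # converges to t1 = 2/3, t2 = 1/3
```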
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Quantification of Cannabinoid Content in Cannabis
NASA Astrophysics Data System (ADS)
Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.
2015-09-01
Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
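The band-selection step described above (correlation analysis followed by stepwise regression) can be sketched in its simplest form: compute the correlation between reflectance at each band and measured THC content, and pick the band with the largest absolute correlation. The reflectance and THC values below are invented for illustration.

```python
# Sketch of single-band selection by Pearson correlation with THC content.
def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

def best_band(bands, reflectance_by_band, thc):
    """reflectance_by_band: {band_nm: [reflectance per sample]}."""
    return max(bands,
               key=lambda b: abs(pearson(reflectance_by_band[b], thc)))

thc = [1.0, 2.0, 3.0, 4.0]                       # invented THC contents (%)
refl = {660: [0.30, 0.28, 0.33, 0.29],           # weakly related band
        695: [0.40, 0.35, 0.30, 0.25]}           # strongly related band
band = best_band([660, 695], refl, thc)          # -> 695
```

The paper's stepwise multivariate regression extends this idea by adding bands one at a time while they significantly improve the fit.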
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
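The global sensitivity analysis step can be sketched with a minimal pick-freeze Monte Carlo estimator for first-order Sobol indices. The toy model, sample size, and uniform input distributions below are assumptions for illustration; they are unrelated to the HIFiRE combustor model or the estimators actually used in the paper.

```python
import random

# Minimal pick-freeze estimator for first-order Sobol sensitivity indices
# S_i = Cov(f(A), f(AB_i)) / Var(f), where AB_i takes column i from A and
# the rest from B. Inputs are assumed i.i.d. uniform on [0, 1].
def sobol_first_order(f, dim, n=100_000, seed=0):
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [f(x) for x in A]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    S = []
    for i in range(dim):
        ABi = [b[:i] + [a[i]] + b[i + 1:] for a, b in zip(A, B)]
        fABi = [f(x) for x in ABi]
        cov = sum(ya * yb for ya, yb in zip(fA, fABi)) / n - mean ** 2
        S.append(cov / var)
    return S

# Toy model where x1 dominates: analytically S1 = 16/17, S2 = 1/17
S = sobol_first_order(lambda x: 4.0 * x[0] + x[1], dim=2)
```

Inputs with small indices are candidates for fixing at nominal values, which is how sensitivity analysis reduces the stochastic dimension before the expensive UQ propagation.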
Carbon Nanotubes Released from an Epoxy-Based Nanocomposite: Quantification and Particle Toxicity.
Schlagenhauf, Lukas; Buerki-Thurnherr, Tina; Kuo, Yu-Ying; Wichser, Adrian; Nüesch, Frank; Wick, Peter; Wang, Jing
2015-09-01
Studies combining both the quantification of free nanoparticle release and the toxicological investigations of the released particles from actual nanoproducts in a real-life exposure scenario are urgently needed, yet very rare. Here, a new measurement method was established to quantify the amount of free-standing and protruding multiwalled carbon nanotubes (MWCNTs) in the respirable fraction of particles abraded from a MWCNT-epoxy nanocomposite. The quantification approach involves the prelabeling of MWCNTs with lead ions, nanocomposite production, abrasion and collection of the inhalable particle fraction, and quantification of free-standing and protruding MWCNTs by measuring the concentration of released lead ions. In vitro toxicity studies for genotoxicity, reactive oxygen species formation, and cell viability were performed using A549 human alveolar epithelial cells and THP-1 monocyte-derived macrophages. The quantification experiment revealed that in the respirable fraction of the abraded particles, approximately 4000 ppm of the MWCNTs were released as exposed MWCNTs (which could contact lung cells upon inhalation) and approximately 40 ppm as free-standing MWCNTs in the worst-case scenario. The release of exposed MWCNTs was lower for nanocomposites containing agglomerated MWCNTs. The toxicity tests revealed that the abraded particles did not induce any acute cytotoxic effects.
Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François
2015-08-18
The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Nowadays, analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to direct measurement and protein characterization ability. We developed here an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of the three potential biological warfare agents, ricin, staphylococcal enterotoxin B, and epsilon toxin, in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early in the sample. Lower limits of quantification were determined at or close to 1 ng·mL(-1). The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability with no evidence of toxin degradation in milk in a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.
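Quantification against a stable isotope-labeled standard spiked into the sample reduces to a peak-area ratio. This is a generic isotope-dilution sketch consistent with the strategy described above; the areas, spike amount, and sample volume are invented.

```python
# Sketch of isotope-dilution quantification: the endogenous ("light")
# toxin concentration follows from the light/heavy peak-area ratio and
# the known amount of isotope-labeled ("heavy") standard spiked in.
def toxin_conc_ng_per_ml(light_area, heavy_area, spiked_ng, sample_ml):
    return (light_area / heavy_area) * spiked_ng / sample_ml

# Invented example: light signal is half the heavy signal,
# 10 ng of labeled standard spiked into 1 mL of matrix
conc = toxin_conc_ng_per_ml(light_area=3.0e5, heavy_area=6.0e5,
                            spiked_ng=10.0, sample_ml=1.0)  # -> 5.0 ng/mL
```

Spiking the standard early in the workflow, as the authors do, means losses during sample preparation affect both forms equally and cancel in the ratio.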
Tozaki, Mitsuhiro; Isobe, Sachiko; Sakamoto, Masaaki
2012-10-01
We evaluated the diagnostic performance of elastography and tissue quantification using acoustic radiation force impulse (ARFI) technology for differential diagnosis of breast masses. There were 161 mass lesions. First, the correspondence of lesions on ARFI elastographic images with those on B-mode images was evaluated: no findings on ARFI images (pattern 1), lesions that were bright inside (pattern 2), lesions that contained both bright and dark areas (pattern 3), and lesions that were dark inside (pattern 4). In addition, pattern 4 was subdivided into 4a (dark area same as B-mode lesion) and 4b (dark area larger than lesion). Next, shear wave velocity (SWV) was measured using virtual touch tissue quantification. There were 13 pattern 1 lesions and five pattern 2 lesions; all of these lesions were benign, whereas all pattern 4b lesions (n = 43) were malignant. When the value of 3.59 m/s was chosen as the cutoff value, the combination of elastography and tissue quantification showed 91 % (83-91) sensitivity, 93 % (65-70) specificity, and 92 % (148-161) accuracy. The combination of elastography and tissue quantification is thought to be a promising ultrasound technique for differential diagnosis of breast-mass lesions.
auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.
2010-01-01
Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283
Advanced Technologies and Methodology for Automated Ultrasonic Testing Systems Quantification
DOT National Transportation Integrated Search
2011-04-29
For automated ultrasonic testing (AUT) detection and sizing accuracy, this program developed a methodology for quantification of AUT systems, advancing and quantifying AUT systems' image-capture capabilities, and quantifying the performance of multiple AUT...
Liquid chromatography tandem-mass spectrometry (LC-MS/MS)- based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide overall better quantification accuracy and reproducibility over other LC-MS/MS techniques. However, large scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capability of traditional iTRAQ-based quantification.
Absolute quantification by droplet digital PCR versus analog real-time PCR
Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh
2014-01-01
Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
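Absolute quantification in ddPCR rests on a standard Poisson correction: from the fraction of positive droplets one recovers the mean number of copies per droplet, then scales by droplet volume. The droplet counts and volume below are illustrative values, not data from the study.

```python
import math

# Poisson correction for droplet digital PCR: with copies distributed
# randomly among droplets, the fraction of negative droplets is
# exp(-lambda), so lambda = -ln(1 - p) copies per droplet.
def copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
    """Absolute target concentration in copies per microliter.
    droplet_volume_nl is an assumed illustrative droplet volume."""
    p = n_positive / n_total
    copies_per_droplet = -math.log(1.0 - p)
    return copies_per_droplet / (droplet_volume_nl * 1e-3)  # nl -> ul

# Invented example: 5,000 positive droplets out of 20,000
c = copies_per_ul(5000, 20000)   # about 338 copies/uL
```

Because the readout is an end-point count rather than an amplification curve, no standard curve is needed, which is what underlies the precision gains reported above.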
Recurrence quantification analysis of electrically evoked surface EMG signal.
Liu, Chunling; Wang, Xu
2005-01-01
The recurrence plot is a useful tool in time-series analysis, in particular for detecting unstable periodic orbits embedded in a chaotic dynamical system. This paper introduces the structure of the recurrence plot and how it is constructed, and then defines its quantification. One possible application of the recurrence quantification analysis (RQA) strategy, the analysis of electrically stimulated evoked surface EMG, is presented. The results show that the percent determinism increases with stimulation intensity.
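The two quantities involved can be sketched directly: a recurrence matrix is built by thresholding pairwise distances, and percent determinism is the fraction of recurrent points lying on diagonal lines of some minimum length. This minimal sketch omits the time-delay embedding and the usual exclusion of the main diagonal that a full RQA implementation would include.

```python
# Minimal recurrence quantification sketch (no embedding, for brevity).
def recurrence_matrix(x, eps):
    """R[i][j] = 1 if |x_i - x_j| <= eps, else 0."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) <= eps else 0 for j in range(n)]
            for i in range(n)]

def percent_determinism(R, lmin=2):
    """Percentage of recurrent points on diagonal lines of length >= lmin."""
    n = len(R)
    recurrent = diag_points = 0
    for k in range(-(n - 1), n):          # scan every diagonal
        line = [R[i][i - k] for i in range(max(k, 0), min(n, n + k))]
        recurrent += sum(line)
        run = 0
        for v in line + [0]:              # sentinel flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    return 100.0 * diag_points / recurrent if recurrent else 0.0

# A strictly periodic signal is fully deterministic
x = [0, 1, 0, 1, 0, 1, 0, 1]
det = percent_determinism(recurrence_matrix(x, eps=0.1))  # -> 100.0
```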
de Kinkelder, R; van der Veen, R L P; Verbaak, F D; Faber, D J; van Leeuwen, T G; Berendschot, T T J M
2011-01-01
Purpose Accurate assessment of the amount of macular pigment (MPOD) is necessary to investigate the role of carotenoids and their assumed protective functions. High repeatability and reliability are important to monitor patients in studies investigating the influence of diet and supplements on MPOD. We evaluated the Macuscope (Macuvision Europe Ltd., Lapworth, Solihull, UK), a recently introduced device for measuring MPOD using the technique of heterochromatic flicker photometry (HFP). We determined agreement with another HFP device (QuantifEye; MPS 9000 series: Tinsley Precision Instruments Ltd., Croydon, Essex, UK) and a fundus reflectance method. Methods The right eyes of 23 healthy subjects (mean age 33.9±15.1 years) were measured. We determined agreement with QuantifEye and correlation with a fundus reflectance method. Repeatability of QuantifEye was assessed in 20 other healthy subjects (mean age 32.1±7.3 years). Repeatability was also compared with measurements by a fundus reflectance method in 10 subjects. Results We found low agreement between test and retest measurements with Macuscope. The average difference and the limits of agreement were −0.041±0.32. We found high agreement between test and retest measurements of QuantifEye (−0.02±0.18) and the fundus reflectance method (−0.04±0.18). MPOD data obtained by Macuscope and QuantifEye showed poor agreement: −0.017±0.44. For Macuscope and the fundus reflectance method, the correlation coefficient was r=0.05 (P=0.83). A significant correlation of r=0.87 (P<0.001) was found between QuantifEye and the fundus reflectance method. Conclusions Because repeatability of Macuscope measurements was low (ie, wide limits of agreement) and MPOD values correlated poorly with the fundus reflectance method, and agreed poorly with QuantifEye, the tested Macuscope protocol seems less suitable for studying MPOD. PMID:21057522
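The agreement statistics reported above (a mean difference with ± limits) are Bland-Altman limits of agreement: the mean of the test-retest differences plus or minus 1.96 standard deviations. The sketch below shows the computation on invented MPOD readings; the values are not from the study.

```python
# Bland-Altman style test-retest agreement: bias and 1.96-SD limits.
def limits_of_agreement(test, retest):
    d = [a - b for a, b in zip(test, retest)]
    n = len(d)
    mean = sum(d) / n
    sd = (sum((x - mean) ** 2 for x in d) / (n - 1)) ** 0.5
    return mean, mean - 1.96 * sd, mean + 1.96 * sd

# Invented example MPOD readings from two sessions
mpod_a = [0.40, 0.55, 0.30, 0.62, 0.48]
mpod_b = [0.42, 0.50, 0.33, 0.60, 0.47]
bias, lower, upper = limits_of_agreement(mpod_a, mpod_b)
```

Wide limits of agreement relative to the measured quantity, as found for the Macuscope, indicate poor repeatability even when the mean bias is small.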
Quantification of the benefits of access management for Kentucky : final report.
DOT National Transportation Integrated Search
2006-07-01
This report describes the benefits quantification performed for the proposed access management plan for Kentucky. This study evaluates the capacity, safety and economic impacts associated with access management programs. The proposed Kentucky access ...
Lautié, Emmanuelle; Rasse, Catherine; Rozet, Eric; Mourgues, Claire; Vanhelleputte, Jean-Paul; Quetin-Leclercq, Joëlle
2013-02-01
The aim of this study was to determine whether fast microwave-assisted extraction could be an alternative to conventional Soxhlet extraction for the quantification of rotenone in yam bean seeds by SPE and HPLC-UV. For this purpose, an experimental design was used to determine the optimal conditions for the microwave extraction. The quantification values for three accessions from two different species of yam bean seeds were then compared using the two kinds of extraction. A microwave extraction of 11 min at 55°C using methanol/dichloromethane (50:50) extracted rotenone as efficiently as, or more efficiently than, the 8-h Soxhlet extraction method and was less sensitive to moisture content. The selectivity, precision, trueness, accuracy, and limit of quantification of the method with microwave extraction were also demonstrated. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantification of HCV RNA in Clinical Specimens by Branched DNA (bDNA) Technology.
Wilber, J C; Urdea, M S
1999-01-01
The diagnosis and monitoring of hepatitis C virus (HCV) infection have been aided by the development of HCV RNA quantification assays. A direct measure of viral load, HCV RNA quantification provides information on viral kinetics and unique insight into the disease process. Branched DNA (bDNA) signal amplification technology provides a novel approach for the direct quantification of HCV RNA in patient specimens. The bDNA assay measures HCV RNA at physiological levels by boosting the reporter signal rather than by replicating target sequences as the means of detection, and thus avoids the errors inherent in the extraction and amplification of target sequences. Inherently quantitative and nonradioactive, the bDNA assay is amenable to routine use in a clinical research setting and has been used by several groups to explore the natural history, pathogenesis, and treatment of HCV infection.
Quantification of brain lipids by FTIR spectroscopy and partial least squares regression
NASA Astrophysics Data System (ADS)
Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph
2009-01-01
Brain tissue is characterized by high lipid content. Its content decreases and the lipid composition changes during transformation from normal brain tissue to tumors. Therefore, the analysis of brain lipids might complement the existing diagnostic tools to determine the tumor type and tumor grade. Objective of this work is to extract lipids from gray matter and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of lipid extracts that were predicted by the PLS regression of IR spectra was compared with lipid quantification by thin layer chromatography.
Dutta, Sibasish; Saikia, Gunjan Prasad; Sarma, Dhruva Jyoti; Gupta, Kuldeep; Das, Priyanka; Nath, Pabitra
2017-05-01
This paper demonstrates the use of a smartphone as a detection platform for the colorimetric quantification of biological macromolecules. Using the V-channel of the HSV color space, the quantification of BSA protein, catalase enzyme, and carbohydrate (using D-glucose) has been successfully investigated. A custom-designed Android application has been developed for estimating the total concentration of biological macromolecules. The results have been compared with those of a standard spectrophotometer, which is generally used for colorimetric quantification in laboratory settings by measuring absorbance at a specific wavelength. The results obtained with the designed sensor are similar to the spectrophotometer data. The designed sensor is low-cost and robust, and we envision that it could promote diverse fields of bio-analytical investigation. (Graphical abstract: schematic illustration of the smartphone sensing mechanism for colorimetric analysis of biomolecular samples.) © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Microbial quantification in activated sludge: the hits and misses.
Hall, S J; Keller, J; Blackall, L L
2003-01-01
Since the implementation of the activated sludge process for treating wastewater, there has been a reliance on chemical and physical parameters to monitor the system. However, in biological nutrient removal (BNR) processes, the microorganisms responsible for some of the transformations should be used to monitor the processes with the overall goal to achieve better treatment performance. The development of in situ identification and rapid quantification techniques for key microorganisms involved in BNR are required to achieve this goal. This study explored the quantification of Nitrospira, a key organism in the oxidation of nitrite to nitrate in BNR. Two molecular genetic microbial quantification techniques were evaluated: real-time polymerase chain reaction (PCR) and fluorescence in situ hybridisation (FISH) followed by digital image analysis. A correlation between the Nitrospira quantitative data and the nitrate production rate, determined in batch tests, was attempted. The disadvantages and advantages of both methods will be discussed.
Detection and Quantification of Cannabinoids in Extracts of Cannabis sativa Roots Using LC-MS/MS.
Gul, Waseem; Gul, Shahbaz W; Chandra, Suman; Lata, Hemant; Ibrahim, Elsayed A; ElSohly, Mahmoud A
2018-03-01
A liquid chromatography-tandem mass spectrometry single-laboratory validation was performed for the detection and quantification of the 10 major cannabinoids of cannabis, namely, (-)-trans-Δ9-tetrahydrocannabinol, cannabidiol, cannabigerol, cannabichromene, tetrahydrocannabivarin, cannabinol, (-)-trans-Δ8-tetrahydrocannabinol, cannabidiolic acid, cannabigerolic acid, and Δ9-tetrahydrocannabinolic acid-A, in the root extract of Cannabis sativa. Acetonitrile:methanol (80:20, v/v) was used for extraction; d3-cannabidiol and d3-tetrahydrocannabinol were used as the internal standards. All 10 cannabinoids showed a good regression relationship with r² > 0.99. The validated method is simple, sensitive, and reproducible and is therefore suitable for the detection and quantification of these cannabinoids in extracts of cannabis roots. To our knowledge, this is the first report of the quantification of cannabinoids in cannabis roots. Georg Thieme Verlag KG Stuttgart · New York.
Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H
2013-08-01
Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
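The joint quantification and sizing described above rests on a (roughly linear) relationship the authors observed between droplet fluorescence and amplicon size; once calibrated, such a fit can be inverted to size an unknown library. A hypothetical sketch (calibration points invented for illustration, not the paper's data):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b
    (here: droplet fluorescence vs. amplicon size)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Invented calibration: amplicon sizes (bp) vs. mean droplet fluorescence (a.u.)
sizes = [150, 300, 450, 600]
fluor = [1.2, 2.1, 3.0, 3.9]
a, b = fit_line(sizes, fluor)

def estimate_size(f):
    """Invert the calibration to estimate fragment size of an unknown library."""
    return (f - b) / a
```

The same ddPCR run that yields this fluorescence reading also yields the absolute template count, which is what lets one assay replace both qPCR quantification and electrophoretic sizing.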
Quantification of trace metals in water using complexation and filter concentration.
Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel
2010-06-15
Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow for quantification in the ppm range. However, a proper pre-concentration of the colored complex on paper filter lowers the quantification limit to the low ppb range. In this study, several pre-concentration techniques have been examined and compared: filtering the already complexed mixture, complexation on filter, and dipping of dye-covered filter in solution. The best quantification has been based on the ratio of filter reflectance at a certain wavelength to that at zero metal concentration. The studied complex formations (Ni ions with TAN and Cd ions with PAN) involve production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN has been investigated and optimum timing could be found. Kinetic optimization in regard to some interferences has also been suggested.
Demeke, Tigst; Eng, Monika
2018-05-01
Droplet digital PCR (ddPCR) has been used for absolute quantification of genetically engineered (GE) events. Absolute quantification of GE events by duplex ddPCR requires appropriate primers and probes for the target and reference gene sequences in order to accurately determine the amount of GE material. Single-copy reference genes are generally preferred for absolute quantification of GE events by ddPCR, but no study has compared reference genes for absolute quantification of GE canola events by ddPCR. The suitability of four endogenous reference sequences (HMG-I/Y, FatA(A), CruA and Ccf) for absolute quantification of GE canola events by ddPCR was investigated, as was the effect of DNA extraction methods and DNA quality on the assessment of reference gene copy numbers. ddPCR results were affected by the use of single- vs. two-copy reference genes. The single-copy reference gene FatA(A) was found to be stable and suitable for absolute quantification of GE canola events by ddPCR. For the copy numbers measured, the HMG-I/Y reference gene was less consistent than the FatA(A) reference gene. The expected ddPCR values were underestimated when CruA and Ccf (two-copy endogenous Cruciferin sequences) were used, because of their higher copy number. It is therefore important to apply an adjustment when two-copy reference genes are used for ddPCR in order to obtain accurate results. Real-time quantitative PCR results, in contrast, were not affected by the use of single- vs. two-copy reference genes.
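The adjustment described above for two-copy reference genes amounts to dividing the measured reference copies by the per-genome copy number before forming the GE ratio; without it, the denominator is doubled and the GE content is underestimated by half. A minimal sketch (copy counts invented for illustration):

```python
def ge_percentage(transgene_copies, reference_copies, reference_copies_per_genome=1):
    """Percent GE material as transgene copies per genome equivalent.
    A two-copy reference gene must be divided by its per-genome copy
    number, otherwise the result is underestimated."""
    genome_equivalents = reference_copies / reference_copies_per_genome
    return 100.0 * transgene_copies / genome_equivalents

# Invented ddPCR counts for the same sample measured against two references
single = ge_percentage(50, 1000, reference_copies_per_genome=1)  # single-copy gene
double = ge_percentage(50, 2000, reference_copies_per_genome=2)  # two-copy gene
# With the adjustment both give the same GE percentage
```

Omitting `reference_copies_per_genome=2` in the second call would halve the reported value, which is exactly the underestimation the authors observed for CruA and Ccf.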
Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L
2010-09-15
The Decision Peptide-Driven (DPD) tool is a software application that assists the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling; and (4) matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI) analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with parallel losses across different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides to be used for quantification from a week to just minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with parallel losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and explained with the quantification of the protein carbonic anhydrase. Copyright (c) 2010 Elsevier B.V. All rights reserved.
Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.
Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S
2018-06-01
Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.
Self-digitization microfluidic chip for absolute quantification of mRNA in single cells.
Thompson, Alison M; Gansen, Alexander; Paguirigan, Amy L; Kreutz, Jason E; Radich, Jerald P; Chiu, Daniel T
2014-12-16
Quantification of mRNA in single cells provides direct insight into how intercellular heterogeneity plays a role in disease progression and outcomes. Quantitative polymerase chain reaction (qPCR), the current gold standard for evaluating gene expression, is insufficient for providing absolute measurement of single-cell mRNA transcript abundance. Challenges include difficulties in handling small sample volumes and the high variability in measurements. Microfluidic digital PCR provides far better sensitivity for minute quantities of genetic material, but the typical format of this assay does not allow counting of the absolute number of mRNA transcripts in samples taken from single cells. Furthermore, a large fraction of the sample is often lost during sample handling in microfluidic digital PCR. Here, we report the absolute quantification of single-cell mRNA transcripts by digital, one-step reverse transcription PCR in a simple microfluidic array device called the self-digitization (SD) chip. By performing the reverse transcription step in digitized volumes, we find that the assay exhibits a linear signal across a wide range of total RNA concentrations and agrees well with standard curve qPCR. The SD chip is found to digitize a high percentage (86.7%) of the sample for single-cell experiments. Moreover, quantification of transferrin receptor mRNA in single cells agrees well with single-molecule fluorescence in situ hybridization experiments. The SD platform for absolute quantification of single-cell mRNA can be optimized for other genes and may be useful as an independent control method for the validation of mRNA quantification techniques.
Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification
ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE
2017-01-01
Background and aims: Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry-based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice.
Methods: Six EDTA plasma samples were analyzed after tryptic digestion using a high-throughput data-independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked into each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation).
Results: Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10^5. 86% were classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reported in the literature.
Conclusions: Nano-LC Q-TOF UDMSE proteomic analysis can be used for simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793
Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke
2012-01-01
This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton decoupled (13)C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal transmitted through an optical fiber and inductively coupled into a transmit/receive coil represents a reliable reference standard for in vivo (1)H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton decoupled in vivo (13)C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification are 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.
Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.
2014-01-01
Quantitative mass spectrometry has become central to the field of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation has been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. These findings suggest that evaluation of the absolute ion abundance for any given transition is necessary in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3 year period, suggesting that these values should be assessed as close as possible to the time at which data is collected for quantification. PMID:25154770
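The ion abundance ratio (RA) discussed above is each product ion's fraction of the summed product-ion signal; contamination is flagged when an observed RA deviates from the clean reference pattern by more than a threshold. A minimal sketch (intensities and the 0.10 tolerance are invented for illustration, not the study's criteria):

```python
def relative_abundances(intensities):
    """RA of each SRM product ion = its intensity / total product-ion intensity."""
    total = sum(intensities)
    return [i / total for i in intensities]

def flag_contamination(observed, reference_ra, tolerance=0.10):
    """Flag transitions whose RA deviates from the reference pattern
    by more than `tolerance` (illustrative threshold, not the paper's)."""
    return [abs(o - r) > tolerance
            for o, r in zip(relative_abundances(observed), reference_ra)]

ref_pattern = [0.5, 0.3, 0.2]   # expected RA pattern from a clean standard
flags = flag_contamination([420, 250, 330], ref_pattern)
# only the third transition deviates beyond the tolerance
```

The study's point is that such a fixed tolerance only behaves consistently above a minimum absolute ion abundance, and that reference RA patterns drift over time, so they should be re-measured close to the quantification experiment.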
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana
2015-08-18
Presence of genetically modified organisms (GMO) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labeling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite widespread use of initial PCR based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection but not quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays were developed for quantification of all 12 EU-authorized GM maize lines (as of April 1, 2015). Because of the high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. It was therefore concluded that the multiplex ddPCR assays could be applied for routine quantification of the 12 EU-authorized GM maize lines. In the case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.
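The transgene/endogene ratio underlying such assays comes from Poisson-corrected droplet counts: if a fraction p of droplets is positive, the mean target copies per droplet is λ = −ln(1 − p). A generic sketch of this calculation (droplet counts invented, not from the paper):

```python
import math

def copies_per_droplet(positive, total):
    """Poisson-corrected mean target copies per droplet: lambda = -ln(1 - p),
    where p is the fraction of positive droplets."""
    p = positive / total
    return -math.log(1.0 - p)

def gmo_ratio(transgene_positive, endogene_positive, total_droplets):
    """GMO concentration as the transgene/endogene copy ratio
    (invented droplet counts for illustration)."""
    return (copies_per_droplet(transgene_positive, total_droplets)
            / copies_per_droplet(endogene_positive, total_droplets))

percent = 100.0 * gmo_ratio(transgene_positive=190,
                            endogene_positive=12000,
                            total_droplets=20000)
# compare `percent` against the EU 0.9% labeling threshold
```

The Poisson correction matters because at high target concentrations many droplets contain more than one copy, so the raw positive fraction alone would understate the copy number.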
Statistical image quantification toward optimal scan fusion and change quantification
NASA Astrophysics Data System (ADS)
Potesil, Vaclav; Zhou, Xiang Sean
2007-03-01
Recent advances in imaging technology have brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical error. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error, and that there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at the feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we achieve a lower variance than naïve averaging. Simulated experiments are used to validate the theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
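The claim that optimal linear fusion beats naïve averaging is the standard inverse-variance weighting result: combining unbiased measurements with weights proportional to 1/σ² minimizes the variance of the fused estimate. A minimal sketch (measurement values and variances invented for illustration):

```python
def fuse(measurements, variances):
    """Inverse-variance weighted fusion of unbiased measurements.
    The fused variance, 1 / sum(1/var), is never larger than that
    of the plain average."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * m for w, m in zip(weights, measurements)) / total
    return estimate, 1.0 / total

# Two scans measuring the same lesion diameter (mm); the second is noisier,
# e.g. because its voxel anisotropy is misaligned with the lesion elongation
est, fused_var = fuse([10.2, 10.8], [0.04, 0.16])
naive_var = (0.04 + 0.16) / 4   # variance of the unweighted average
```

Here the fused estimate leans toward the more precise scan, mirroring the paper's conclusion that scans whose anisotropy aligns with lesion elongation deserve higher weight.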
Biniarz, Piotr; Łukaszewicz, Marcin
2017-06-01
The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.
Quantification of pericardial effusions by echocardiography and computed tomography.
Leibowitz, David; Perlman, Gidon; Planer, David; Gilon, Dan; Berman, Philip; Bogot, Naama
2011-01-15
Echocardiography is a well-accepted tool for the diagnosis and quantification of pericardial effusion (PEff). Given the increasing use of computed tomographic (CT) scanning, more PEffs are being initially diagnosed by computed tomography, yet no study has compared quantification of PEff by computed tomography and echocardiography. The objective of this study was to assess the accuracy of quantification of PEff by 2-dimensional echocardiography and computed tomography compared to the amount of pericardial fluid drained at pericardiocentesis. We retrospectively reviewed an institutional database to identify patients who underwent chest computed tomography and echocardiography before percutaneous pericardiocentesis with documentation of the amount of fluid withdrawn. Digital 2-dimensional echocardiographic and CT images were retrieved, and quantification of PEff volume was performed by applying the formula for the volume of a prolate ellipse, π × 4/3 × maximal long-axis dimension/2 × maximal transverse dimension/2 × maximal anteroposterior dimension/2, to the pericardial sac and to the heart. Nineteen patients meeting study qualifications were entered into the study. The amount of PEff drained was 200 to 1,700 ml (mean 674 ± 340). Echocardiographically calculated pericardial effusion volume correlated relatively well with drained PEff volume (r = 0.73, p <0.001, mean difference -41 ± 225 ml). There was only moderate correlation between CT volume quantification and the actual volume drained (r = 0.4, p = 0.004, mean difference 158 ± 379 ml). In conclusion, echocardiography appears to be a more accurate imaging technique than computed tomography for the quantitative assessment of nonloculated PEffs and should continue to be the primary imaging modality in these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
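The prolate-ellipse formula above simplifies to π/6 times the product of the three maximal dimensions; applied to the pericardial sac and to the heart, the difference of the two volumes approximates the effusion volume. A sketch with invented dimensions (cm, so the result is in cm³ ≈ ml; not patient data):

```python
import math

def ellipsoid_volume(long_axis, transverse, anteroposterior):
    """Prolate-ellipse volume: pi * 4/3 * (L/2) * (W/2) * (AP/2) = pi/6 * L*W*AP."""
    return math.pi / 6.0 * long_axis * transverse * anteroposterior

def effusion_volume(sac_dims, heart_dims):
    """Pericardial effusion volume = pericardial-sac volume minus heart volume."""
    return ellipsoid_volume(*sac_dims) - ellipsoid_volume(*heart_dims)

# Invented maximal dimensions (cm) for illustration
volume_ml = effusion_volume(sac_dims=(14, 12, 11), heart_dims=(12, 9, 8))
```

The same subtraction is applied to both echocardiographic and CT measurements in the study; only the accuracy of the measured dimensions differs between modalities.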
Chen, Gengbo; Walmsley, Scott; Cheung, Gemmy C M; Chen, Liyan; Cheng, Ching-Yu; Beuerman, Roger W; Wong, Tien Yin; Zhou, Lei; Choi, Hyungwon
2017-05-02
Data independent acquisition-mass spectrometry (DIA-MS) coupled with liquid chromatography is a promising approach for rapid, automatic sampling of MS/MS data in untargeted metabolomics. However, wide isolation windows in DIA-MS generate MS/MS spectra containing a mixed population of fragment ions together with their precursor ions. This precursor-fragment ion map in a comprehensive MS/MS spectral library is crucial for relative quantification of fragment ions uniquely representative of each precursor ion. However, existing reference libraries are not sufficient for this purpose since the fragmentation patterns of small molecules can vary in different instrument setups. Here we developed a bioinformatics workflow called MetaboDIA to build customized MS/MS spectral libraries using a user's own data dependent acquisition (DDA) data and to perform MS/MS-based quantification with DIA data, thus complementing conventional MS1-based quantification. MetaboDIA also allows users to build a spectral library directly from DIA data in studies of a large sample size. Using a marine algae data set, we show that quantification of fragment ions extracted with a customized MS/MS library can provide as reliable quantitative data as the direct quantification of precursor ions based on MS1 data. To test its applicability in complex samples, we applied MetaboDIA to a clinical serum metabolomics data set, where we built a DDA-based spectral library containing consensus spectra for 1829 compounds. We performed fragment ion quantification using DIA data using this library, yielding sensitive differential expression analysis.
Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won
2015-01-01
Background: Maltol, a phenolic compound, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed.
Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. HPLC separation was performed on a C18 column.
Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R² = 1.00). The limit of detection of maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD of 0.21–1.65%. The developed method was applied successfully to quantify maltol in three ginseng products manufactured by different methods.
Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method is useful for the quantification of maltol in various ginseng products. PMID:26246746
Schmidt, Carla; Grønborg, Mads; Deckert, Jochen; Bessonov, Sergey; Conrad, Thomas; Lührmann, Reinhard; Urlaub, Henning
2014-01-01
The spliceosome undergoes major changes in protein and RNA composition during pre-mRNA splicing. Knowing the proteins—and their respective quantities—at each spliceosomal assembly stage is critical for understanding the molecular mechanisms and regulation of splicing. Here, we applied three independent mass spectrometry (MS)–based approaches, (1) metabolic labeling by SILAC, (2) chemical labeling by iTRAQ, and (3) label-free spectral counting, to quantify the protein composition of the human spliceosomal precatalytic B and catalytic C complexes. In total we were able to quantify 157 proteins by at least two of the three approaches. Our quantification shows that only a very small subset of spliceosomal proteins (the U5 and U2 Sm proteins, a subset of U5 snRNP-specific proteins, and the U2 snRNP-specific proteins U2A′ and U2B′′) remains unaltered upon transition from the B to the C complex. The MS-based quantification approaches classify the majority of proteins as dynamically associated specifically with the B or the C complex. In terms of experimental procedure and methodology, we show that metabolically labeled spliceosomes are functionally active in terms of their assembly and splicing kinetics and can be utilized for quantitative studies. Moreover, we obtain consistent quantification results from all three methods, including the relatively straightforward and inexpensive label-free spectral count technique. PMID:24448447
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Weixuan; Lian, Jianming; Engel, Dave
2017-07-27
This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.
Taylor, Jonathan Christopher; Fenner, John Wesley
2017-11-29
Semi-quantification methods are well established in the clinic for assisted reporting of (I123) Ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. The machine learning algorithms were support vector machine classifiers with three different feature sets: (1) voxel intensities; (2) principal components of image voxel intensities; (3) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) minimum of age-matched controls; (2) mean minus 1/1.5/2 standard deviations of age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 for local data and between 0.95 and 0.97 for PPMI data.
Classification performance was lower for the local database than the research database for both semi-quantitative and machine learning algorithms. However, for both databases, the machine learning methods generated equal or higher mean accuracies (with lower variance) than any of the semi-quantification approaches. The gain in performance from using machine learning algorithms as compared to semi-quantification was relatively small and may be insufficient, when considered in isolation, to offer significant advantages in the clinical context.
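One of the semi-quantification rules described above, a lower normal limit set at the control mean minus k standard deviations, can be sketched as follows; the SBR values are hypothetical illustrations, not PPMI data:

```python
import numpy as np

def sbr_normal_limit(control_sbr, k=2.0):
    """Lower normal limit for the striatal binding ratio (SBR):
    mean - k*SD of age-matched controls (one of the four
    threshold rules compared in the study)."""
    return control_sbr.mean() - k * control_sbr.std(ddof=1)

def classify(sbr, limit):
    # an SBR below the normal limit is flagged as abnormal (Parkinsonian)
    return "abnormal" if sbr < limit else "normal"

# hypothetical putamen SBRs from age-matched controls
controls = np.array([2.9, 3.1, 2.7, 3.3, 3.0, 2.8])
limit = sbr_normal_limit(controls, k=1.5)
```

Varying k (1, 1.5, 2) trades sensitivity against specificity, which is why the study evaluated several thresholds per rule.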
Evaluation of a mass-balance approach to determine consumptive water use in northeastern Illinois
Mills, Patrick C.; Duncker, James J.; Over, Thomas M.; Domanski, Marian; Engel, Frank
2014-01-01
Under ideal conditions, accurate quantification of consumptive use at the sewershed scale by the described mass-balance approach might be possible. Under most prevailing conditions, quantification likely would be more costly and time-consuming than in the present study, given the freely contributed technical support of the host community and the relatively appropriate conditions of the study area. Essential to quantification of consumptive use are a fully cooperative community, storm and sanitary sewers that are separate, and newer sewer infrastructure and (or) a robust program for limiting infiltration, exfiltration, and inflow.
NASA Technical Reports Server (NTRS)
Traversi, M.; Barbarek, L. A. C.
1979-01-01
A handy reference for JPL minimum requirements and guidelines is presented, as well as information on the use of the fundamental information source represented by the Nationwide Personal Transportation Survey. Data on U.S. demographic statistics and highway speeds are included, along with methodology for evaluating normal parameters, synthesizing daily distance distributions, and projecting car ownership distributions. The synthesis of tentative mission quantification results, intermediate mission quantification results, and mission quantification parameters is considered, and 1985 in-place fleet fuel-economy data are included.
Accurate proteome-wide protein quantification from high-resolution 15N mass spectra
2011-01-01
In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234
Witte, Anna Kristina; Fister, Susanne; Mester, Patrick; Schoder, Dagmar; Rossmanith, Peter
2016-11-01
Fast and reliable pathogen detection is an important issue for human health. Since conventional microbiological methods are rather slow, there is growing interest in detection and quantification using molecular methods. Droplet digital polymerase chain reaction (ddPCR) is a relatively new PCR method for absolute and accurate quantification without external standards. Using the Listeria monocytogenes-specific prfA assay, we focused on whether the assay was directly transferable to ddPCR and whether ddPCR was suitable for samples derived from heterogeneous matrices, such as foodstuffs that often include inhibitors and a non-target bacterial background flora. Although the prfA assay showed suboptimal cluster formation, ddPCR quantification of L. monocytogenes from pure bacterial cultures, artificially contaminated cheese, and naturally contaminated foodstuffs was satisfactory over a relatively broad dynamic range, and the results demonstrated an outstanding detection limit of one copy. However, while poorer DNA quality, such as that resulting from longer storage, can impair ddPCR, an internal amplification control (IAC) for prfA, integrated into the genome of L. monocytogenes ΔprfA, showed even slightly better quantification over a broader dynamic range.
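ddPCR's standard-free absolute quantification rests on Poisson statistics over the droplet partitions: the fraction of positive droplets is corrected for droplets that received more than one template copy. A minimal sketch, assuming a nominal droplet volume of 0.85 nL (an assumed value for illustration, not one reported in this study):

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Absolute quantification in digital PCR: the positive-partition
    fraction p is Poisson-corrected, giving lambda = -ln(1 - p) mean
    copies per droplet, then scaled by droplet volume."""
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

# e.g. 4000 positive droplets out of 15000 accepted droplets
conc = ddpcr_copies_per_ul(4000, 15000)
```

Because of the Poisson correction, the estimate exceeds the naive positive fraction divided by volume, and the correction grows as droplets approach saturation.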
Pedersen, S N; Lindholst, C
1999-12-09
Extraction methods were developed for quantification of the xenoestrogens 4-tert.-octylphenol (tOP) and bisphenol A (BPA) in water and in liver and muscle tissue from the rainbow trout (Oncorhynchus mykiss). The extraction of tOP and BPA from tissue samples was carried out using microwave-assisted solvent extraction (MASE) followed by solid-phase extraction (SPE). Water samples were extracted using only SPE. For the quantification of tOP and BPA, liquid chromatography mass spectrometry (LC-MS) equipped with an atmospheric pressure chemical ionisation interface (APCI) was applied. The combined methods for tissue extraction allow the use of small sample amounts of liver or muscle (typically 1 g), low volumes of solvent (20 ml), and short extraction times (25 min). Limits of quantification of tOP in tissue samples were found to be approximately 10 ng/g in muscle and 50 ng/g in liver (both based on 1 g of fresh tissue). The corresponding values for BPA were approximately 50 ng/g in both muscle and liver tissue. In water, the limit of quantification for tOP and BPA was approximately 0.1 microg/l (based on 100 ml sample size).
Evaluation of digital PCR for absolute RNA quantification.
Sanders, Rebecca; Mason, Deborah J; Foy, Carole A; Huggett, Jim F
2013-01-01
Gene expression measurements detailing mRNA quantities are widely employed in molecular biology and are increasingly important in diagnostic fields. Reverse transcription (RT), necessary for generating complementary DNA, can be both inefficient and imprecise, yet remains a quintessential step in qPCR-based RNA analysis. This study developed a Transcriptomic Calibration Material and assessed the RT reaction using digital (d)PCR for RNA measurement. While many studies characterise dPCR capabilities for DNA quantification, less work has been performed investigating similar parameters using RT-dPCR for RNA analysis. RT-dPCR measurement using three one-step RT-qPCR kits was evaluated in single and multiplex formats when measuring endogenous and synthetic RNAs. The best-performing kit was compared to UV quantification, and its sensitivity and technical reproducibility were investigated. Our results demonstrate that RT-dPCR measurements were assay and kit dependent and differed significantly from UV quantification. Different values were reported by different kits for each target, despite evaluation of identical samples using the same instrument. RT-dPCR did not display the strong inter-assay agreement previously described for DNA analysis. This study demonstrates that, as with DNA measurement, RT-dPCR is capable of accurate quantification of low-copy RNA targets, but the results are both kit and target dependent, supporting the need for calibration controls.
Normal Databases for the Relative Quantification of Myocardial Perfusion
Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.
2016-01-01
Purpose of review: Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated quantitative perfusion measures. Recent findings: New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary: Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354
Quantification of sterol lipids in plants by quadrupole time-of-flight mass spectrometry
Wewer, Vera; Dombrink, Isabel; vom Dorp, Katharina; Dörmann, Peter
2011-01-01
Glycerolipids, sphingolipids, and sterol lipids constitute the major lipid classes in plants. Sterol lipids are composed of free and conjugated sterols, i.e., sterol esters, sterol glycosides, and acylated sterol glycosides. Sterol lipids play crucial roles during adaption to abiotic stresses and plant-pathogen interactions. Presently, no comprehensive method for sterol lipid quantification in plants is available. We used nanospray ionization quadrupole-time-of-flight mass spectrometry (Q-TOF MS) to resolve and identify the molecular species of all four sterol lipid classes from Arabidopsis thaliana. Free sterols were derivatized with chlorobetainyl chloride. Sterol esters, sterol glycosides, and acylated sterol glycosides were ionized as ammonium adducts. Quantification of molecular species was achieved in the positive mode after fragmentation in the presence of internal standards. The amounts of sterol lipids quantified by Q-TOF MS/MS were validated by comparison with results obtained with TLC/GC. Quantification of sterol lipids from leaves and roots of phosphate-deprived A. thaliana plants revealed changes in the amounts and molecular species composition. The Q-TOF method is far more sensitive than GC or HPLC. Therefore, Q-TOF MS/MS provides a comprehensive strategy for sterol lipid quantification that can be adapted to other tandem mass spectrometers. PMID:21382968
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nie, Song; Shi, Tujin; Fillmore, Thomas L.
Mass spectrometry-based targeted proteomics (e.g., selected reaction monitoring, SRM) is emerging as an attractive alternative to immunoassays for protein quantification. Recently we have made significant progress in SRM sensitivity for enabling quantification of low ng/mL to sub-ng/mL level proteins in nondepleted human blood plasma/serum without affinity enrichment. However, precise quantification of extremely low abundant but biologically important proteins (e.g., ≤100 pg/mL in blood plasma/serum) using targeted proteomics approaches still remains challenging. To address this need, we have developed an antibody-independent Deep-Dive SRM (DD-SRM) approach that capitalizes on multidimensional high-resolution reversed-phase liquid chromatography (LC) separation for target peptide enrichment combined with precise selection of target peptide fractions of interest, significantly improving SRM sensitivity by ~5 orders of magnitude when compared to conventional LC-SRM. Application of DD-SRM to human serum and tissue has been demonstrated to enable precise quantification of endogenous proteins at ~10 pg/mL level in nondepleted serum and at <10 copies per cell level in tissue. Thus, DD-SRM holds great promise for precisely measuring extremely low abundance proteins or protein modifications, especially when high-quality antibody is not available.
Gaubert, Alexandra; Jeudy, Jérémy; Rougemont, Blandine; Bordes, Claire; Lemoine, Jérôme; Casabianca, Hervé; Salvador, Arnaud
2016-07-01
In a stricter legislative context, greener detergent formulations are being developed. In this way, synthetic surfactants are frequently replaced by bio-sourced surfactants and/or used at lower concentrations in combination with enzymes. In this paper, an LC-MS/MS method was developed for the identification and quantification of enzymes in laundry detergents. Prior to the LC-MS/MS analyses, a specific sample preparation protocol was developed due to matrix complexity (high surfactant percentages). Then, for each enzyme family mainly used in detergent formulations (protease, amylase, cellulase, and lipase), specific peptides were identified on a high-resolution platform. An LC-MS/MS method was then developed in selected reaction monitoring (SRM) MS mode for the light and corresponding heavy peptides. The method was linear on the peptide concentration ranges 25-1000 ng/mL for protease, lipase, and cellulase; 50-1000 ng/mL for amylase; and 5-1000 ng/mL for cellulase in both water and laundry detergent matrices. The application of the developed analytical strategy to real commercial laundry detergents enabled enzyme identification and absolute quantification. For the first time, identification and absolute quantification of enzymes in laundry detergent were realized by LC-MS/MS in a single run.
Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.
2012-03-01
Here, performance assessment of complex systems is ideally accomplished through system-level testing, but because such tests are expensive, they are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex but very practical problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystems, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatoric uncertainty is included in the present uncertainty quantification. An example is included showing application of the techniques to uncertainty quantification of measures of response of a real, complex aerospace system.
NASA Astrophysics Data System (ADS)
Bau, Haim; Liu, Changchun; Killawala, Chitvan; Sadik, Mohamed; Mauk, Michael
2014-11-01
Real-time amplification and quantification of specific nucleic acid sequences plays a major role in many medical and biotechnological applications. In the case of infectious diseases, quantification of the pathogen load in patient specimens is critical to assessing disease progression, effectiveness of drug therapy, and emergence of drug resistance. Typically, nucleic acid quantification requires sophisticated and expensive instruments, such as real-time PCR machines, which are not appropriate for on-site use and for low-resource settings. We describe a simple, low-cost, reaction-diffusion based method for end-point quantification of target nucleic acids undergoing enzymatic amplification. The number of target molecules is inferred from the position of the reaction-diffusion front, analogous to reading temperature in a mercury thermometer. We model the process with the Fisher-Kolmogoroff-Petrovskii-Piscounoff (FKPP) equation and compare theoretical predictions with experimental observations. The proposed method is suitable for nucleic acid quantification at the point of care, is compatible with multiplexing and high-throughput processing, and can function instrument-free. C.L. was supported by NIH/NIAID K25AI099160; M.S. was supported by the Pennsylvania Ben Franklin Technology Development Authority; C.K. and H.B. were funded, in part, by NIH/NIAID 1R41AI104418-01A1.
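The front-position readout can be illustrated with a minimal 1-D FKPP simulation (illustrative parameters, not the authors' experimental values); after an initial transient, the front advances at roughly the classical pulled-front speed 2*sqrt(D*r):

```python
import numpy as np

# FKPP equation u_t = D u_xx + r u (1 - u), explicit finite differences,
# seeded at the left edge with "amplified product".
D, r = 1.0, 1.0
L, nx = 200.0, 2000
dx = L / nx
dt = 0.004                     # satisfies the stability bound dt < dx**2 / (2 D)
steps = 12500                  # total simulated time T = 50

x = np.linspace(0, L, nx)
u = np.zeros(nx)
u[:10] = 1.0                   # initial bolus at the left edge

for _ in range(steps):
    lap = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    u[1:-1] += dt * (D * lap + r * u[1:-1] * (1 - u[1:-1]))

# the "thermometer" reading: position where u crosses 1/2
front = x[np.argmin(np.abs(u - 0.5))]
speed = 2 * np.sqrt(D * r)     # asymptotic pulled-front speed
```

In the assay, more initial target copies shift the front farther for a fixed read time, which is what converts a distance reading into a copy-number estimate.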
Digital PCR as a tool to measure HIV persistence.
Rutsaert, Sofie; Bosman, Kobus; Trypsteen, Wim; Nijhuis, Monique; Vandekerckhove, Linos
2018-01-30
Although antiretroviral therapy is able to suppress HIV replication in infected patients, the virus persists and rebounds when treatment is stopped. In order to find a cure that can eradicate the latent reservoir, one must be able to quantify the persisting virus. Traditionally, HIV persistence studies have used real-time PCR (qPCR) to measure the viral reservoir represented by HIV DNA and RNA. More recently, digital PCR has gained popularity as a novel approach to nucleic acid quantification, as it allows absolute target quantification. Various commercial platforms implementing the principle of digital PCR are now available, of which Bio-Rad's QX200 ddPCR is currently the most used in HIV research. Quantification of HIV by digital PCR is proving to be a valuable improvement over qPCR, as it is argued to be more robust to mismatches between the primer-probe set and heterogeneous HIV and obviates the need for a standard curve, both of which are known to complicate reliable quantification. However, currently available digital PCR platforms occasionally struggle with unexplained false-positive partitions, and reliable segregation between positive and negative droplets remains disputed. Future developments and advancements of digital PCR technology promise to aid in the accurate quantification and characterization of the persistent HIV reservoir.
Meng, Yanan; Liu, Xin; Wang, Shu; Zhang, Dabing; Yang, Litao
2012-01-11
To enforce labeling regulations for genetically modified organisms (GMOs), the application of DNA plasmids as calibrants is becoming essential for the practical quantification of GMOs. This study reports the construction of the plasmid pTC1507 for a quantification assay of genetically modified (GM) maize TC1507 and an international collaborative ring trial validating its applicability as a plasmid calibrant. pTC1507 includes one event-specific sequence of TC1507 maize and one unique sequence of the maize endogenous gene zSSIIb. A total of eight GMO detection laboratories worldwide were invited to join the validation process, and test results were returned from all eight participants. Statistical analysis of the returned results showed that real-time PCR assays using pTC1507 as calibrant in both GM event-specific and endogenous gene quantifications had high PCR efficiency (ranging from 0.80 to 1.15) and good linearity (ranging from 0.9921 to 0.9998). In a quantification assay of five blind samples, the bias between the test values and the true values ranged from 2.6 to 24.9%. All results indicated that the developed pTC1507 plasmid is applicable for the quantitative analysis of TC1507 maize and can be used as a suitable substitute for dried-powder certified reference materials (CRMs).
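PCR efficiencies like the 0.80 to 1.15 range reported above follow from the slope of the calibrant standard curve via E = 10^(-1/slope) - 1; a sketch with hypothetical Cq values for a plasmid dilution series:

```python
import numpy as np

def pcr_efficiency(log10_copies, cq):
    """Amplification efficiency from a standard curve: fit Cq against
    log10(copy number); E = 10**(-1/slope) - 1, so a slope of about
    -3.32 corresponds to E = 1.0 (100% efficiency)."""
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    r2 = np.corrcoef(log10_copies, cq)[0, 1] ** 2
    return 10 ** (-1.0 / slope) - 1.0, r2

# hypothetical ten-fold dilution series of a plasmid calibrant
logs = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
cq = np.array([33.1, 29.8, 26.5, 23.2, 19.9])
eff, r2 = pcr_efficiency(logs, cq)
```

The r2 value plays the role of the linearity figure quoted in the ring-trial results.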
Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.
Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro
2016-09-02
Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards.
Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong
2016-05-17
The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, and these have had a long and significant impact on studies of biomolecules over the past decades. For instance, immunoassays were developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be ascribed almost exclusively to the use of radioactive isotopes, which led to the development of nonradioactive tagging strategy-based assays such as the enzyme-linked immunosorbent assay, fluorescent immunoassays, and chemiluminescent and electrochemiluminescent immunoassays. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and inability to provide absolute quantification of biomolecules. Recalling this sequence of tagging strategies, an obvious question arises: why not use stable isotopes from the start? A reasonable explanation is the lack, at that time, of reliable means for accurate and precise quantification of stable isotopes. The situation has now changed greatly, since several atomic mass spectrometric techniques for measuring metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is ideal for determining metal stable isotope-tagged biomolecules, owing to its high sensitivity, wide linear dynamic range, and, more importantly, its multiplex and absolute quantification capability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification.
An exciting research highlight in this area is the development and application of the mass cytometer, which fully exploits the multiplexing potential of metal stable isotope tagging. It has realized the simultaneous detection of dozens of parameters in single cells and accurate immunophenotyping of cell populations, through modeling of intracellular signaling networks and unambiguous discrimination of the function and connections of cell subsets. Metal stable isotope tagging has great potential in research related to hematopoiesis, immunology, stem cells, cancer, and drug screening, and it has opened a post-fluorescence era of cytometry. Herein, we review the development of biomolecule quantification using metal stable isotope tagging. In particular, the power of multiplex and absolute quantification is demonstrated. We address the advantages, applicable situations, and limitations of metal stable isotope tagging strategies and propose suggestions for future developments. The transfer from enzymatic or fluorescent tagging to metal stable isotope tagging may occur in many aspects of biological and clinical practice in the near future, just as the revolution from radioactive isotope tagging to fluorescent tagging happened in the past.
Jiang, Tingting; Dai, Yongmei; Miao, Miao; Zhang, Yue; Song, Chenglin; Wang, Zhixu
2015-07-01
To evaluate the usefulness and efficiency of a novel dietary assessment method among urban pregnant women. Sixty-one pregnant women were recruited from the ward and provided with a meal accurately weighed before cooking. The meal was photographed from three different angles before and after eating. The subjects were also interviewed for 24 h dietary recall by the investigators. Food weighing, image quantification, and 24 h dietary recall were conducted by investigators from three different groups, and the results were kept isolated from each other. Food consumption was analyzed on the basis of classification and total summation. Nutrient intake from the meal was calculated for each subject. The data obtained from the dietary recall and the image quantification were compared with the actual values. Correlation and regression analyses were carried out between the weighed values and those from image quantification as well as dietary recall. A total of twenty-three kinds of food, including rice, vegetables, fish, meats, and soy bean curd, were included in the experimental meal. Compared with data from 24 h dietary recall (r = 0.413, P < 0.05), food weights estimated by image quantification (r = 0.778, P < 0.05, n = 308) were more strongly correlated with the weighed data and showed a more concentrated linear distribution. The mean absolute difference between image quantification and the weight method across all foods was 77.23 ± 56.02 (P < 0.05, n = 61), much smaller than the difference between 24 h recall and the weight method (172.77 ± 115.18). Values of almost all nutrients, including energy, protein, fat, carbohydrate, vitamin A, vitamin C, calcium, iron, and zinc, calculated from the image-quantified food weights were closer to the weighed data than those from 24 h dietary recall (P < 0.01).
The results of the Bland-Altman analysis showed that the majority of the measurements of nutrient intake were scattered along the mean difference line and close to the line of equality (difference = 0). The plots show fairly good agreement between estimated and actual food consumption, indicating that the differences (including the outliers) were random and did not exhibit any systematic bias, being consistent over different levels of mean food amount. On the other hand, the questionnaire showed that fifty-six pregnant women considered the image quantification less time-consuming and burdensome than 24 h recall, and fifty-eight of them would like to use image quantification to monitor their dietary status. The novel method of instant photography (image quantification) for dietary assessment is more effective than conventional 24 h dietary recall and can obtain food intake values close to weighed data.
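The Bland-Altman analysis described above reduces to the mean difference (bias) and its 95% limits of agreement; a minimal sketch with hypothetical food-weight pairs (not the study's measurements):

```python
import numpy as np

def bland_altman(estimated, actual):
    """Bland-Altman agreement statistics: mean bias and the 95%
    limits of agreement (bias +/- 1.96 * SD of the differences)."""
    diff = np.asarray(estimated, dtype=float) - np.asarray(actual, dtype=float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# hypothetical food weights (g): image-based estimate vs. weighed value
est    = [102, 48, 150, 75, 201, 33]
actual = [100, 50, 145, 80, 210, 30]
bias, lo, hi = bland_altman(est, actual)
```

A method shows "good agreement" in this framework when the bias is near zero and the limits of agreement are narrow enough to be nutritionally unimportant.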
Detection and Quantification of Human Fecal Pollution with Real-Time PCR
Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described ...
The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)
2003-01-01
We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.
Ramiro-H, Manuel; Cruz-A, J Enrique; García-Gómez, Francisco
2017-01-01
The evaluation and quantification of scientific activity is an extremely complex task. Assessing research results is often difficult, and various indicators have been developed for the quantification and evaluation of scientific publications, as well as of the development and success of researcher training programs.
Literacy and Language Education: The Quantification of Learning
ERIC Educational Resources Information Center
Gibb, Tara
2015-01-01
This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.
Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.
2013-01-01
A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
Damond, F; Benard, A; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise
2011-10-01
Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHIEV2E study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log10 copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log10 copies/ml and 3.7 log10 copies/ml, respectively. For HIV-2 group B, a high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed.
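The variance comparison in this study rests on the F test: the ratio of two sample variances, referred to an F distribution with the corresponding degrees of freedom. A minimal sketch with hypothetical log10 copies/ml values (not the study's measurements):

```python
import statistics

def f_ratio(sample_a, sample_b):
    """F statistic (larger sample variance over smaller) and degrees of freedom."""
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    if va >= vb:
        return va / vb, len(sample_a) - 1, len(sample_b) - 1
    return vb / va, len(sample_b) - 1, len(sample_a) - 1

# Hypothetical quantifications of one blinded sample across six labs
with_in_house = [2.4, 2.9, 2.5, 3.1, 2.2, 3.0]    # lab-specific standards
with_common   = [2.6, 2.8, 2.7, 2.75, 2.65, 2.7]  # shared common standard
F, df1, df2 = f_ratio(with_in_house, with_common)
```

A large F on (df1, df2) degrees of freedom would then be referred to the F distribution for a p-value (e.g. via scipy.stats.f.sf); a significant result here would mirror the study's finding that the common standard tightens interlaboratory spread.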
Towards high-resolution 4D flow MRI in the human aorta using kt-GRAPPA and B1+ shimming at 7T.
Schmitter, Sebastian; Schnell, Susanne; Uğurbil, Kâmil; Markl, Michael; Van de Moortele, Pierre-François
2016-08-01
To evaluate the feasibility of aortic 4D flow magnetic resonance imaging (MRI) at 7T with improved spatial resolution using kt-GRAPPA acceleration while restricting acquisition time, and to address radiofrequency (RF) excitation heterogeneities with B1+ shimming. 4D flow MRI data were obtained in the aorta of eight subjects using a 16-channel transmit/receive coil array at 7T. Flow quantification and acquisition time were compared for a kt-GRAPPA accelerated (R = 5) and a standard GRAPPA (R = 2) accelerated protocol. The impact of different dynamic B1+ shimming strategies on flow quantification was investigated. Two kt-GRAPPA accelerated protocols with 1.2 × 1.2 × 1.2 mm3 and 1.8 × 1.8 × 2.4 mm3 spatial resolution were compared. Using kt-GRAPPA, we achieved a 4.3-fold reduction in net acquisition time, resulting in scan times of about 10 minutes. No significant effect on flow quantification was observed compared to standard GRAPPA with R = 2. Optimizing the B1+ fields for the aorta significantly affected flow quantification (P < 0.05), while specific B1+ settings were required for respiration navigators. The high-resolution protocol yielded similar flow quantification but allowed the depiction of branching vessels. 7T in combination with B1+ shimming allows for high-resolution 4D flow MRI acquisitions in the human aorta, while kt-GRAPPA limits total scan times without affecting flow quantification. J. Magn. Reson. Imaging 2016;44:486-499. © 2016 Wiley Periodicals, Inc.
Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2014-01-01
Rationale and Objectives Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Materials and Methods Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
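The Dice similarity scores used above to validate the segmentations are plain set overlap: twice the intersection over the sum of the two mask sizes. A toy sketch with hypothetical pixel-index masks, not the paper's delineations:

```python
def dice_score(mask_a, mask_b):
    """Dice similarity 2|A ∩ B| / (|A| + |B|) for binary masks given as pixel sets."""
    a, b = set(mask_a), set(mask_b)
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical automated vs. semimanual fat delineations (pixel coordinates)
auto   = {(r, c) for r in range(10) for c in range(10)}     # 100 pixels
manual = {(r, c) for r in range(1, 11) for c in range(10)}  # shifted by one row
score = dice_score(auto, manual)  # overlapping region is 90 pixels
```

A score of 1.0 means perfect agreement; values in the high 0.8s to low 0.9s, as reported above, indicate close but not identical delineations.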
NASA Astrophysics Data System (ADS)
Schwabe, O.; Shehab, E.; Erkoyuncu, J.
2015-08-01
The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, drawing on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defence, the ESA, and various commercial companies. The metrics are categorized by their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). The insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various families of uncertainty quantification metrics, with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem.
An innovative uncertainty quantification framework, consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model, is put forward to address this research gap as the basis for future work in this field.
Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios
2011-01-19
Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food.
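The abstract reports limits of detection and quantification (0.085% and 0.35%) without describing the calculation. A common convention (e.g. ICH-style guidance for analytical methods) derives such limits from the calibration curve as 3.3 and 10 times the blank-response noise divided by the slope; whether this study used that convention is an assumption, and the numbers below are purely hypothetical:

```python
def detection_limits(sigma_blank, slope):
    """ICH-style limits from a calibration curve with slope S and
    standard deviation sigma of the blank response:
    LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Hypothetical calibration: blank noise 0.02, slope 0.8 signal units per % SNP frequency
lod, loq = detection_limits(0.02, 0.8)
```

The LOQ is always higher than the LOD because quantification demands tighter precision than mere detection, matching the 0.085% vs. 0.35% ordering reported above.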
Lao, Yexing; Yang, Cuiping; Zou, Wei; Gan, Manquan; Chen, Ping; Su, Weiwei
2012-05-01
The cryptand Kryptofix 2.2.2 is used extensively as a phase-transfer reagent in the preparation of [18F]fluoride-labelled radiopharmaceuticals. However, it has considerable acute toxicity. The aim of this study was to develop and validate a method for rapid (within 1 min), specific and sensitive quantification of Kryptofix 2.2.2 at trace levels. Chromatographic separations were carried out by rapid-resolution liquid chromatography (Agilent ZORBAX SB-C18 rapid-resolution column, 2.1 × 30 mm, 3.5 μm). Tandem mass spectra were acquired using a triple quadrupole mass spectrometer equipped with an electrospray ionization interface. Quantitative mass spectrometric analysis was conducted in positive ion mode and multiple reaction monitoring mode for the m/z 377.3 → 114.1 transition for Kryptofix 2.2.2. The external standard method was used for quantification. The method met the precision and efficiency requirements for PET radiopharmaceuticals, providing satisfactory results for specificity, matrix effect, stability, linearity (0.5-100 ng/ml, r2 = 0.9975), precision (coefficient of variation < 5%), accuracy (relative error < ± 3%), sensitivity (lower limit of quantification = 0.5 ng) and detection time (<1 min). Fluorodeoxyglucose (n=6) was analysed, and the Kryptofix 2.2.2 content was found to be well below the maximum permissible levels approved by the US Food and Drug Administration. The developed method has a short analysis time (<1 min) and high sensitivity (lower limit of quantification = 0.5 ng/ml) and can be successfully applied to rapid quantification of Kryptofix 2.2.2 at trace levels in fluorodeoxyglucose. This method could also be applied to other [18F]fluorine-labelled radiopharmaceuticals that use Kryptofix 2.2.2 as a phase-transfer reagent.
Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J
2016-08-01
Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residuals values. The PHA in these cultures ranged from low to medium concentration (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass with PHA concentration ranging from low to high (0-93% of dried biomass content), the PLS method was most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range, for predicting PHA content in biomass. Currently no one universal method exists for processing FTIR data for polyhydroxyalkanoates (PHA) quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification of medium range in pure culture. However, in our study we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.
Idilman, Ilkay S; Keskin, Onur; Celik, Azim; Savas, Berna; Elhan, Atilla Halil; Idilman, Ramazan; Karcaaltincaba, Musturay
2016-03-01
Many imaging methods have been defined for quantification of hepatic steatosis in non-alcoholic fatty liver disease (NAFLD). However, studies comparing the efficiency of magnetic resonance imaging-proton density fat fraction (MRI-PDFF), magnetic resonance spectroscopy (MRS), and liver histology for quantification of liver fat content are limited. To compare the efficiency of MRI-PDFF and MRS in the quantification of liver fat content in individuals with NAFLD. A total of 19 NAFLD patients underwent MRI-PDFF, MRS, and liver biopsy for quantification of liver fat content. The MR examinations were performed on a 1.5 HDx MRI system. The MRI protocol included T1-independent volumetric multi-echo gradient-echo imaging with T2* correction and spectral fat modeling, and MRS with the STEAM technique. A close correlation was observed between liver MRI-PDFF- and histology-determined steatosis (r = 0.743, P < 0.001) and between liver MRS- and histology-determined steatosis (r = 0.712, P < 0.001), with no superiority between them (z = 0.19, P = 0.849). For quantification of hepatic steatosis, a high correlation was observed between the two MRI methods (r = 0.986, P < 0.001). MRI-PDFF and MRS accurately differentiated moderate/severe steatosis from mild/no hepatic steatosis (P = 0.007 and 0.013, respectively), with no superiority between them (AUC(MRI-PDFF) = 0.881 ± 0.0856 versus AUC(MRS) = 0.857 ± 0.0924, P = 0.461). Both MRI-PDFF and MRS can be used for accurate quantification of hepatic steatosis. © The Foundation Acta Radiologica 2015.
3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Flach, Greg; Wohlwend, Jen
2017-10-02
This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.
A conceptually and computationally simple method for the definition, display, quantification, and comparison of the shapes of three-dimensional mathematical molecular models is presented. Molecular or solvent-accessible volume and surface area can also be calculated. Algorithms, ...
DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR
Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...
Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi
2008-07-23
A novel quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of the Roundup Ready soybean (RRS) was developed. The system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor for the detection and quantification of the genetically modified soy RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), each carrying a 21-bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to adjust the copy number of targets precisely and stably. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism contents.
Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick
2014-01-01
Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)) as a source of energy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and compromise conversion into electricity by power plants, so a high level of purification is required. Assessing treatment efficiency is still difficult: no consensus has been reached on standardized sampling and quantification of VOSiCs in gases, because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption, and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). With this objective, this paper presents and discusses the optimization of a patented method consisting of VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES. PMID:25379538
Round robin test on quantification of amyloid-β 1-42 in cerebrospinal fluid by mass spectrometry.
Pannee, Josef; Gobom, Johan; Shaw, Leslie M; Korecka, Magdalena; Chambers, Erin E; Lame, Mary; Jenkins, Rand; Mylott, William; Carrillo, Maria C; Zegers, Ingrid; Zetterberg, Henrik; Blennow, Kaj; Portelius, Erik
2016-01-01
Cerebrospinal fluid (CSF) amyloid-β 1-42 (Aβ42) is an important biomarker for Alzheimer's disease, both in diagnostics and to monitor disease-modifying therapies. However, there is a great need for standardization of methods used for quantification. To overcome problems associated with immunoassays, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has emerged as a critical orthogonal alternative. We compared results for CSF Aβ42 quantification in a round robin study performed in four laboratories using similar sample preparation methods and LC-MS instrumentation. The LC-MS results showed excellent correlation between laboratories (r2 > 0.98), high analytical precision, and good correlation with enzyme-linked immunosorbent assay (r2 > 0.85). The use of a common reference sample further decreased interlaboratory variation. Our results indicate that LC-MS is suitable for absolute quantification of Aβ42 in CSF and highlight the importance of developing a certified reference material. Copyright © 2016 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification
NASA Astrophysics Data System (ADS)
Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.
MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks already discussed by the authors. On the other hand, a preliminary application of Genetic Algorithms (GA) to the peak detection problem encountered in MRS quantification using the Voigt line shape model has already been reported in the literature by the authors. This paper investigates a novel constrained genetic algorithm involving a generic, adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experiments on artificial MRS signals interleaved with noise, regarding its signal fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
Digital Quantification of Proteins and mRNA in Single Mammalian Cells.
Albayrak, Cem; Jordi, Christian A; Zechner, Christoph; Lin, Jing; Bichsel, Colette A; Khammash, Mustafa; Tay, Savaş
2016-03-17
Absolute quantification of macromolecules in single cells is critical for understanding and modeling biological systems that feature cellular heterogeneity. Here we show extremely sensitive and absolute quantification of both proteins and mRNA in single mammalian cells by a very practical workflow that combines proximity ligation assay (PLA) and digital PCR. This digital PLA method has femtomolar sensitivity, which enables the quantification of very small protein concentration changes over its entire 3-log dynamic range, a quality necessary for accounting for single-cell heterogeneity. We counted both endogenous (CD147) and exogenously expressed (GFP-p65) proteins from hundreds of single cells and determined the correlation between CD147 mRNA and the protein it encodes. Using our data, a stochastic two-state model of the central dogma was constructed and verified using joint mRNA/protein distributions, allowing us to estimate transcription burst sizes and extrinsic noise strength and calculate the transcription and translation rate constants in single mammalian cells. Copyright © 2016 Elsevier Inc. All rights reserved.
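Digital assays of the kind described above (digital PCR, and the digital PLA built on it) reach absolute counts by partitioning the sample and applying a Poisson correction: with random loading, the fraction of empty partitions estimates exp(-λ). A minimal sketch; the partition counts are hypothetical, not the paper's data:

```python
import math

def poisson_corrected_count(positive, total):
    """Estimate the number of molecules loaded into a digital assay.

    Random loading implies P(partition empty) = exp(-lam), so
    lam = -ln(1 - positive/total); the molecule count is lam * total.
    """
    lam = -math.log1p(-positive / total)  # mean molecules per partition
    return lam * total

# Hypothetical: 300 of 765 digital-PCR partitions read positive
molecules = poisson_corrected_count(300, 765)
```

The correction matters most at high occupancy, where multiple molecules frequently share one partition and the raw positive count undercounts the input.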
Artifacts Quantification of Metal Implants in MRI
NASA Astrophysics Data System (ADS)
Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.
2017-11-01
The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculating the image gradient. The artifact is then quantified in terms of its extent, as an image area percentage, using an automated cross-entropy thresholding method. The proposed quantification method was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, taken as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 for the titanium and 0.802 for the stainless steel implant). The automated character of the proposed quantification method seems promising for MRI acquisition parameter optimization.
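The automated thresholding step can be sketched with a brute-force minimum cross-entropy criterion in the style of Li's method: each candidate threshold splits the intensities into two groups, and the threshold minimizing the summed cross entropy between pixels and their group means is chosen. This is a generic sketch under the assumption of Li-style cross entropy, not necessarily the exact formulation used in the study; intensities are assumed positive.

```python
import numpy as np

def min_cross_entropy_threshold(values):
    """Brute-force minimum cross-entropy (Li-style) threshold.

    Picks the cut t minimizing
        sum(g*log(g/mean_below)) + sum(g*log(g/mean_above))
    over all candidate thresholds; values must be positive.
    """
    v = np.asarray(values, dtype=float)
    best_t, best_ce = None, np.inf
    for t in np.unique(v)[1:]:          # candidate cuts between observed values
        below, above = v[v < t], v[v >= t]
        if below.size == 0 or above.size == 0:
            continue
        ce = np.sum(below * np.log(below / below.mean())) + \
             np.sum(above * np.log(above / above.mean()))
        if ce < best_ce:
            best_ce, best_t = ce, t
    return best_t
```

On a bimodal intensity distribution the minimum falls in the gap between the two modes, separating artifact from background without a manually chosen cutoff.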
Takemori, Nobuaki; Takemori, Ayako; Tanaka, Yuki; Endo, Yaeta; Hurst, Jane L.; Gómez-Baena, Guadalupe; Harman, Victoria M.; Beynon, Robert J.
2017-01-01
A major challenge in proteomics is the absolute accurate quantification of large numbers of proteins. QconCATs, artificial proteins that are concatenations of multiple standard peptides, are well established as an efficient means to generate standards for proteome quantification. Previously, QconCATs have been expressed in bacteria, but we now describe QconCAT expression in a robust, cell-free system. The new expression approach rescues QconCATs that previously were unable to be expressed in bacteria and can reduce the incidence of proteolytic damage to QconCATs. Moreover, it is possible to cosynthesize QconCATs in a highly-multiplexed translation reaction, coexpressing tens or hundreds of QconCATs simultaneously. By obviating bacterial culture and through the gain of high level multiplexing, it is now possible to generate tens of thousands of standard peptides in a matter of weeks, rendering absolute quantification of a complex proteome highly achievable in a reproducible, broadly deployable system. PMID:29055021
Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen
2011-11-09
A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol determination, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.
Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S
2016-03-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
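The benchmarking idea behind LFQbench can be illustrated with a much-reduced sketch: for a species spiked at a known ratio between two samples, accuracy is the deviation of the median measured log-ratio from the expected one, and precision is the dispersion of the measured log-ratios. The actual R package computes a richer set of metrics; the function names and the choice of standard deviation as the dispersion measure here are illustrative assumptions.

```python
import numpy as np

def lfq_metrics(measured_log2_ratios, expected_ratio):
    """Accuracy (median deviation from the expected log2 ratio) and
    precision (sample standard deviation of measured log2 ratios)."""
    r = np.asarray(measured_log2_ratios, dtype=float)
    expected = np.log2(expected_ratio)
    accuracy = np.median(r) - expected
    precision = r.std(ddof=1)
    return accuracy, precision
```

For a hybrid proteome with a species spiked at 2:1, perfect software would report log2 ratios centered on 1.0 with minimal spread.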
Chottier, Claire; Chatain, Vincent; Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick
2014-01-01
Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)) as part of energy policy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and compromise conversion into electricity by power plants, so a high purification level is required. Assessing treatment efficiency is still difficult: no consensus has been reached on standardized sampling and quantification of VOSiCs in gases, because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption, and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). To this end, this paper presents and discusses the optimization of a patented method consisting of VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES.
Arkas: Rapid reproducible RNAseq analysis
Colombo, Anthony R.; J. Triche Jr, Timothy; Ramsingh, Giridharan
2017-01-01
The recently introduced Kallisto pseudoaligner has radically simplified the quantification of transcripts in RNA-sequencing experiments. We offer the cloud-scale RNAseq pipelines Arkas-Quantification and Arkas-Analysis within Illumina's BaseSpace cloud application platform, which expedite Kallisto preparatory routines, reliably calculate differential expression, and perform gene-set enrichment of REACTOME pathways. To address the inherent inefficiencies of scale, Illumina's BaseSpace computing platform offers a massively parallel distributed environment with improved data management services and data importing. Arkas-Quantification deploys Kallisto for parallel cloud computations and is conveniently integrated downstream of the BaseSpace Sequence Read Archive (SRA) import/conversion application titled SRA Import. Arkas-Analysis annotates the Kallisto results by extracting structured information directly from source FASTA files with per-contig metadata, and performs differential expression and gene-set enrichment analysis on both coding genes and transcripts. The Arkas cloud pipeline supports ENSEMBL transcriptomes and can be used downstream of SRA Import, facilitating raw sequencing import, SRA FASTQ conversion, RNA quantification and analysis steps. PMID:28868134
Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David
2016-01-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.
Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier
2018-06-01
Since its first description, Western blot has been widely used in molecular biology labs. It is a multistep method that allows the detection and/or quantification of proteins, from simple to complex protein mixtures. The quantification step is critical for obtaining accurate and reproducible results. Given the technical knowledge required for densitometry analysis, together with resource availability, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that could be used where resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Mallah, Muhammad Ali; Sherazi, Syed Tufail Hussain; Bhanger, Muhammad Iqbal; Mahesar, Sarfaraz Ahmed; Bajeer, Muhammad Ashraf
2015-04-01
A transmission FTIR spectroscopic method was developed for the direct, inexpensive and fast quantification of paracetamol content in solid pharmaceutical formulations. In this method the paracetamol content is analyzed directly, without solvent extraction. KBr pellets were formulated for the acquisition of FTIR spectra in transmission mode. Two chemometric models, simple Beer's law and partial least squares, employed over the spectral region of 1800-1000 cm⁻¹ for quantification of paracetamol content, both had a regression coefficient (R²) of 0.999. The limits of detection and quantification using FTIR spectroscopy were 0.005 mg g⁻¹ and 0.018 mg g⁻¹, respectively. An interference study was also carried out to check the effect of the excipients; there was no significant interference from the sample matrix. The results clearly show the sensitivity of the transmission FTIR spectroscopic method for pharmaceutical analysis. The method is green in the sense that it does not require large volumes of hazardous solvents or long run times, and avoids prior sample preparation.
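The Beer's-law branch of such a method amounts to a linear calibration plus the usual limit-of-detection arithmetic. The sketch below assumes the common IUPAC-style conventions (LOD = 3.3·σ/slope, LOQ = 10·σ/slope); the exact procedure used in the study may differ, and the numbers in the usage note are illustrative, not the paper's data.

```python
import numpy as np

def calibrate(conc, signal):
    """Least-squares Beer's-law line: signal = slope*conc + intercept."""
    slope, intercept = np.polyfit(conc, signal, 1)
    return slope, intercept

def detection_limits(slope, sd_blank):
    """IUPAC-style estimates: LOD = 3.3*sd/slope, LOQ = 10*sd/slope."""
    return 3.3 * sd_blank / slope, 10.0 * sd_blank / slope
```

Calibrating against standards of known concentration and feeding the blank's standard deviation into `detection_limits` yields the LOD/LOQ pair reported in validation tables.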
Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.
Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew
2018-02-01
Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The results suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
Xia, Yun; Yan, Shuangqian; Zhang, Xian; Ma, Peng; Du, Wei; Feng, Xiaojun; Liu, Bi-Feng
2017-03-21
Digital loop-mediated isothermal amplification (dLAMP) is an attractive approach for absolute quantification of nucleic acids with high sensitivity and selectivity. Theoretical and numerical analysis of dLAMP provides necessary guidance for the design and analysis of dLAMP devices. In this work, a mathematical model was proposed on the basis of the Monte Carlo method and the theories of Poisson statistics and chemometrics. To examine the established model, we fabricated a spiral chip with 1200 uniform and discrete reaction chambers (9.6 nL) for absolute quantification of pathogenic DNA samples by dLAMP. Under the optimized conditions, dLAMP analysis on the spiral chip realized quantification of nucleic acids spanning over 4 orders of magnitude in concentration with sensitivity as low as 8.7 × 10 -2 copies/μL in 40 min. The experimental results were consistent with the proposed mathematical model, which could provide useful guideline for future development of dLAMP devices.
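The Monte Carlo side of such a model can be sketched by randomly distributing molecules into discrete chambers and recovering the input concentration from the positive-chamber fraction via Poisson statistics. This is a generic simulation of digital-partition counting under stated assumptions (uniform random partitioning, every occupied chamber amplifies), not the paper's full chemometric model; the chamber count matches the spiral chip's 1200 chambers, but the input level is hypothetical.

```python
import numpy as np

def simulate_dlamp(true_copies_per_chamber, n_chambers, rng):
    """Monte Carlo: distribute molecules at random into chambers, count positives."""
    total = rng.poisson(true_copies_per_chamber * n_chambers)
    chambers = rng.integers(0, n_chambers, size=total)
    return np.unique(chambers).size  # chambers holding >= 1 molecule

def estimate_lambda(positives, n_chambers):
    """Poisson estimate of mean copies per chamber from the positive fraction."""
    return -np.log(1.0 - positives / n_chambers)

rng = np.random.default_rng(0)
pos = simulate_dlamp(1.0, 1200, rng)
lam_hat = estimate_lambda(pos, 1200)
```

At one copy per chamber on average, only about 63% of chambers turn positive, yet the Poisson estimator recovers the true loading, which is exactly why the model predicts a usable dynamic range well past one molecule per chamber.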
Fujii, Yosuke; Ding, Yuqi; Umezawa, Taichi; Akimoto, Takafumi; Xu, Jiawei; Uchida, Takashi; Fujino, Tatsuya
2018-01-01
Food additives generally used in carbonated drinks, such as 4-methylimidazole (4MI), caffeine (Caf), citric acid (CA), and aspartame (Apm), were measured by matrix-assisted laser desorption ionization mass spectrometry (MALDI MS) using nanometer-sized particles of iron oxide (Fe2O3 NPs). The quantification of 4MI in Coca Cola (C-cola) was carried out. In order to improve the reproducibility of the peak intensities, Fe2O3 NPs loaded on ZSM5 zeolite were used as the matrix for quantification. Using 2-ethylimidazole (2EI) as the internal standard, the amount of 4MI in C-cola was determined to range between 65 and 88 μg/355 mL. The results agree with the published value (approx. 72 μg/355 mL). It was found that MALDI using Fe2O3 is applicable to the quantification of 4MI in C-cola.
Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-02-18
In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications were not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications were not clear. The NATO Science and Technology Organization Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.
NASA Astrophysics Data System (ADS)
Font, Marianne; Lagarde, Jean-Louis; Amorese, Daniel; Coutard, Jean-Pierre; Ozouf, Jean-Claude
The degradation of the Jobourg fault scarp occurred by cryoclastic processes in a periglacial environment during part of Quaternary time. An attempt at quantification indicates a bulk scarp erosion of about 39 m³ m⁻², while the head accumulated at the bottom of the fault scarp represents only 4.6 m³ m⁻². To cite this article: M. Font et al., C. R. Geoscience 334 (2002) 171-178.
[Detection of recombinant-DNA in foods from stacked genetically modified plants].
Sorokina, E Iu; Chernyshova, O N
2012-01-01
A quantitative real-time multiplex polymerase chain reaction method was applied to the detection and quantification of MON863 and MON810 in the stacked genetically modified maize MON810×MON863. The limit of detection was approximately 0.1%. The accuracy of the quantification, measured as bias from the accepted value, and the relative repeatability standard deviation, which measures intra-laboratory variability, were within 25% at each GM level. Method verification demonstrated that the MON863 and MON810 methods can be equally applied to quantification of the respective events in the stacked MON810×MON863.
McEwen, I; Mulloy, B; Hellwig, E; Kozerski, L; Beyer, T; Holzgrabe, U; Wanko, R; Spieser, J-M; Rodomonte, A
2008-12-01
Oversulphated Chondroitin Sulphate (OSCS) and Dermatan Sulphate (DS) in unfractionated heparins can be identified by nuclear magnetic resonance spectrometry (NMR). The limit of detection (LoD) of OSCS is 0.1% relative to the heparin content. This LoD is obtained at a signal-to-noise ratio (S/N) of 2000:1 of the heparin methyl signal. Quantification is best obtained by comparing peak heights of the OSCS and heparin methyl signals. Reproducibility of less than 10% relative standard deviation (RSD) has been obtained. The accuracy of quantification was good.
Weidolf, L O; Chichila, T M; Henion, J D
1988-12-09
Methods for screening by thin-layer chromatography, quantification by high-performance liquid chromatography with ultraviolet detection and confirmation by gas chromatography-mass spectrometry of boldenone sulfate in equine urine after administration of boldenone undecylenate (Equipoise) are presented. Sample work-up was done with C18 liquid-solid extraction followed by solvolytic cleavage of the sulfate ester. Confirmatory evidence of boldenone sulfate in equine urine was obtained from 2 h to 42 days following a therapeutic intramuscular dose of Equipoise. The use of 19-nortestosterone sulfate as the internal standard for quantification of boldenone sulfate is discussed.
Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer
NASA Astrophysics Data System (ADS)
Schulte, Horst
2016-09-01
A new method for quantifying uncertain models for robust wind turbine control using sliding-mode techniques is presented, with the objective of improving active load mitigation. The approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term that establishes and maintains motion on a so-called sliding surface. The injection signal is evaluated directly to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four degree-of-freedom model of the NREL 5MW reference turbine containing uncertainties.
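The equivalent-output-injection idea can be demonstrated on a toy system. The sketch below uses a scalar first-order plant with a hypothetical constant disturbance, not the four degree-of-freedom turbine model: a sliding-mode observer forces its estimation error to zero with a switching term, and the average of that term reconstructs the unknown disturbance.

```python
import numpy as np

# Toy plant x' = -a*x + d with unknown constant disturbance d
# (all parameter values are hypothetical).
a, d_true = 1.0, 0.7
k, dt, steps = 2.0, 1e-4, 50000   # switching gain k must exceed |d|
x, x_hat = 0.0, 0.5
nu_log = []
for _ in range(steps):
    nu = -k * np.sign(x_hat - x)      # discontinuous injection signal
    x += dt * (-a * x + d_true)       # plant (Euler step)
    x_hat += dt * (-a * x_hat + nu)   # observer copy driven by injection
    nu_log.append(nu)
# In sliding mode the injection chatters, but its average -- the
# equivalent output injection -- equals the disturbance.
d_est = float(np.mean(nu_log[-10000:]))
```

The estimate `d_est` converges to the disturbance magnitude, which is exactly the quantity used to quantify uncertainty bounds in the paper's framework.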
Quantification of mixed chimerism by real time PCR on whole blood-impregnated FTA cards.
Pezzoli, N; Silvy, M; Woronko, A; Le Treut, T; Lévy-Mozziconacci, A; Reviron, D; Gabert, J; Picard, C
2007-09-01
This study investigated quantification of chimerism in sex-mismatched transplantations by quantitative real-time PCR (RQ-PCR) using FTA paper for blood sampling. First, we demonstrate that the quantification of DNA from EDTA blood deposited on an FTA card is accurate and reproducible. Second, we show that the fraction of recipient cells detected by RQ-PCR was concordant between the FTA and salting-out methods, the latter being the reference DNA extraction method. Furthermore, the sensitivity of detection of recipient cells is similar with the two methods. Our results show that this innovative method can be used for mixed chimerism (MC) assessment by RQ-PCR.
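The arithmetic linking Ct values to a recipient-cell fraction can be sketched with a simplified ΔCt model. This is a generic illustration, not the study's exact calibration (which would typically rely on standard curves); the function name and the assumption of equal, perfect amplification efficiency are hypothetical.

```python
def recipient_fraction(ct_recipient_marker, ct_reference, efficiency=2.0):
    """Percent recipient cells from qPCR Ct values via a simplified
    delta-Ct model.

    Assumes the recipient-specific marker and the reference gene amplify
    with the same efficiency (default: perfect doubling per cycle).
    """
    delta_ct = ct_recipient_marker - ct_reference
    return 100.0 * efficiency ** (-delta_ct)
```

Each ten-cycle lag of the recipient marker behind the reference corresponds to roughly a thousandfold dilution of recipient DNA, which is why the method's sensitivity at low chimerism levels matters.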
Practical quantification of necrosis in histological whole-slide images.
Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K
2013-06-01
Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Wiemker, Rafael; Opfer, Roland; Bülow, Thomas; Rogalla, Patrik; Steinberg, Amnon; Dharaiya, Ekta; Subramanyan, Krishna
2007-03-01
Computer-aided quantification of emphysema in high-resolution CT data is based on identifying low-attenuation areas below clinically determined Hounsfield thresholds. However, emphysema quantification is prone to error, since a gravity effect can shift the mean attenuation of healthy lung parenchyma by up to ±50 HU between ventral and dorsal lung areas. Comparing ultra-low-dose (7 mAs) and standard-dose (70 mAs) CT scans of each patient, we show that the ventrodorsal gravity effect is patient-specific but reproducible. It can be measured and corrected in an unsupervised way using robust fitting of a linear function.
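The correction step can be sketched as fitting a line to mean attenuation versus dorsoventral position and subtracting the trend while preserving the scan-wide mean. The sketch uses ordinary least squares for brevity; the paper uses robust fitting, which would replace `np.polyfit` with an outlier-resistant estimator.

```python
import numpy as np

def gravity_corrected(mean_hu, dorsoventral_pos):
    """Fit mean attenuation vs. dorsoventral position and remove the trend.

    Returns HU values with the linear ventrodorsal gradient subtracted,
    leaving the scan-wide mean attenuation intact.
    """
    pos = np.asarray(dorsoventral_pos, dtype=float)
    hu = np.asarray(mean_hu, dtype=float)
    slope, intercept = np.polyfit(pos, hu, 1)
    return hu - slope * (pos - pos.mean())
```

After correction, ventral and dorsal regions can be compared against a single emphysema threshold instead of a position-dependent one.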
Harvey, Craig A.; Eash, David A.
1996-01-01
Statistical comparison tests indicate Basinsoft quantifications are not significantly different from manual topographic-map measurements for 9 of 10 basin characteristics tested. The results also indicate that elevation contours generated by ARC/INFO from 1:250,000-scale digital elevation model (DEM) data are over-generalized when compared to elevation contours shown on 1:250,000-scale topographic maps, and that quantification of basin slope thus is underestimated using DEM data. A qualitative comparison test indicated that the Basinsoft module used to quantify basin slope is valid and that differences in the quantification of basin slope are due to source-data differences.
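Slope quantification from gridded DEM data boils down to finite differences over the elevation grid. The sketch below is a generic illustration of that computation, not Basinsoft's implementation; it shows why coarse, over-generalized elevation data flattens the computed slope, since smoothing the grid shrinks the local differences.

```python
import numpy as np

def mean_basin_slope(dem, cell_size):
    """Mean slope (rise/run) of a DEM grid via central differences."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)  # per-axis elevation change
    slope = np.sqrt(dz_dx ** 2 + dz_dy ** 2)    # slope magnitude per cell
    return slope.mean()
```

On a uniformly tilted plane the function returns the plane's gradient exactly; on a smoothed (over-generalized) DEM of the same terrain it returns a smaller value, matching the underestimation the study reports.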
Source separation on hyperspectral cube applied to dermatology
NASA Astrophysics Data System (ADS)
Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.
2010-03-01
This paper proposes a method for quantification of the components underlying human skin that are responsible for the effective reflectance spectrum of the skin over the visible wavelength range. The method is based on independent component analysis, assuming that the epidermal melanin and dermal haemoglobin absorbance spectra are independent of each other. The method extracts source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is corrected using a polynomial fit, and the quantifications associated with it are re-estimated. The results produce feasible quantifications of each source component in the examined skin patch.
Quantification is Neither Necessary Nor Sufficient for Measurement
NASA Astrophysics Data System (ADS)
Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark
2013-09-01
Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.
Samplers for evaluation and quantification of ultra-low volume space sprays
USDA-ARS?s Scientific Manuscript database
A field study was conducted to investigate the suitability of sampling devices for quantification of spray deposition from ULV space sprays. Five different samplers were included in an experiment conducted in an open grassy field. Samplers included horizontally stretched stationary cotton ribbon at ...
A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.
John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192).
We have developed a simple, mild extraction procedure using methanol which, when...
Quantification of confocal images of biofilms grown on irregular surfaces
Ross, Stacy Sommerfeld; Tu, Mai Han; Falsetta, Megan L.; Ketterer, Margaret R.; Kiedrowski, Megan R.; Horswill, Alexander R.; Apicella, Michael A.; Reinhardt, Joseph M.; Fiegel, Jennifer
2014-01-01
Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515
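The core of a connected-volume filtration can be illustrated in two dimensions: starting from the pixels on the substratum surface, a breadth-first search keeps only foreground connected to that surface and discards floating clusters. This is a simplified 2D analogue of the MCVF algorithm, with hypothetical function and argument names, using 4-connectivity; the published method operates on 3D confocal stacks over irregular surfaces.

```python
from collections import deque

def connected_to_surface(mask, surface_row):
    """Keep only foreground pixels 4-connected to the given surface row.

    mask: 2D list of 0/1 values; returns a same-shape list with
    unconnected clusters (e.g. floating debris) removed.
    """
    rows, cols = len(mask), len(mask[0])
    keep = [[0] * cols for _ in range(rows)]
    queue = deque((surface_row, c) for c in range(cols) if mask[surface_row][c])
    for r, c in queue:                       # seed pixels on the surface
        keep[r][c] = 1
    while queue:                             # breadth-first flood fill
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and mask[nr][nc] and not keep[nr][nc]:
                keep[nr][nc] = 1
                queue.append((nr, nc))
    return keep
```

Summing the retained pixels gives a connected-biofilm coverage figure analogous to the modified substratum coverage parameter.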
Mehle, Nataša; Dobnik, David; Ravnikar, Maja; Pompe Novak, Maruša
2018-05-03
RNA viruses have great potential for high genetic variability and rapid evolution, generated by mutation and recombination under selection pressure. This is also the case for Potato virus Y (PVY), which comprises a high diversity of different recombinant and non-recombinant strains. Consequently, it is hard to develop a reverse transcription real-time quantitative PCR (RT-qPCR) assay with the same amplification efficiency for all PVY strains, which would enable their balanced quantification; this is especially needed in mixed infections and other studies of pathogenesis. To achieve this, we initially transferred the PVY universal RT-qPCR assay to a reverse transcription droplet digital PCR (RT-ddPCR) format. RT-ddPCR is an absolute quantification method for which a calibration curve is not needed, and it is less prone to inhibitors. The RT-ddPCR assay developed and validated in this study achieved a dynamic range of quantification over five orders of magnitude, and in terms of sensitivity it was comparable to, or even better than, RT-qPCR. RT-ddPCR also showed lower measurement variability. We have shown that RT-ddPCR can be used as a reference tool for the evaluation of different RT-qPCR assays. In addition, it can be used for quantification of RNA based on in-house reference materials that can then be used as calibrators in diagnostic laboratories.
Advances in targeted proteomics and applications to biomedical research
Shi, Tujin; Song, Ehwang; Nie, Song; Rodland, Karin D.; Liu, Tao; Qian, Wei-Jun; Smith, Richard D.
2016-01-01
Targeted proteomics has emerged as a powerful protein quantification tool in systems biology and biomedical research, and increasingly for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension to our previous review on advances in SRM sensitivity, herein we review recent advances in methods and technology for further enhancing SRM sensitivity (from 2012 to present) and highlight its broad biomedical applications in human bodily fluids, tissue and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast-scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification, by monitoring all target product ions, effectively addresses SRM limitations in specificity and multiplexing; however, compared to SRM, PRM and DIA are still in their infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed. PMID:27302376
Kennedy, Jacob J.; Yan, Ping; Zhao, Lei; Ivey, Richard G.; Voytovich, Uliana J.; Moore, Heather D.; Lin, Chenwei; Pogosova-Agadjanyan, Era L.; Stirewalt, Derek L.; Reding, Kerryn W.; Whiteaker, Jeffrey R.; Paulovich, Amanda G.
2016-01-01
A major goal in cell signaling research is the quantification of phosphorylation pharmacodynamics following perturbations. Traditional methods of studying cellular phospho-signaling measure one analyte at a time with poor standardization, rendering them inadequate for interrogating network biology and contributing to the irreproducibility of preclinical research. In this study, we test the feasibility of circumventing these issues by coupling immobilized metal affinity chromatography (IMAC)-based enrichment of phosphopeptides with targeted multiple reaction monitoring (MRM) mass spectrometry to achieve precise, specific, standardized, multiplex quantification of phospho-signaling responses. A multiplex IMAC-MRM assay targeting phospho-analytes responsive to DNA damage was configured, analytically characterized, and deployed to generate phospho-pharmacodynamic curves from primary and immortalized human cells experiencing genotoxic stress. The multiplexed assays demonstrated linear ranges of ≥3 orders of magnitude, median lower limit of quantification of 0.64 fmol on column, median intra-assay variability of 9.3%, median inter-assay variability of 12.7%, and median total CV of 16.0%. The multiplex IMAC-MRM assay enabled robust quantification of 107 DNA damage-responsive phosphosites from human cells following DNA damage. The assays have been made publicly available as a resource to the community. The approach is generally applicable, enabling wide interrogation of signaling networks. PMID:26621847
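The reported median total CV is consistent with intra- and inter-assay variance components adding in quadrature, a common convention (assumed here; the abstract does not state how the total was computed):

```python
import math

def total_cv(cv_intra, cv_inter):
    """Combine intra- and inter-assay CVs (in %) assuming independent
    variance components, so the squared CVs add."""
    return math.sqrt(cv_intra ** 2 + cv_inter ** 2)

cv = total_cv(9.3, 12.7)  # the median values reported in the abstract
```

The result, about 15.7%, rounds to roughly the 16.0% median total CV the study reports.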
Pandit, Prachi; Rivoire, Julien; King, Kevin; Li, Xiaojuan
2016-03-01
Quantitative T1ρ imaging is beneficial for early detection of osteoarthritis but has seen limited clinical use due to long scan times. In this study, we evaluated the feasibility of accelerated T1ρ mapping for knee cartilage quantification using a combination of compressed sensing (CS) and data-driven parallel imaging (ARC, Autocalibrating Reconstruction for Cartesian sampling). A sequential combination of ARC and CS, both during data acquisition and reconstruction, was used to accelerate the acquisition of T1ρ maps. Phantom, ex vivo (porcine knee), and in vivo (human knee) imaging was performed on a GE 3T MR750 scanner. T1ρ quantification after CS-accelerated acquisition was compared with non-CS-accelerated acquisition for various cartilage compartments. Accelerating image acquisition using CS did not introduce major deviations in quantification. The coefficient of variation for the root mean squared error increased with increasing acceleration, but for in vivo measurements, it stayed under 5% for a net acceleration factor up to 2, where the acquisition was 25% faster than the reference (ARC only). To the best of our knowledge, this is the first implementation of CS for in vivo T1ρ quantification. These early results show that this technique holds great promise in making quantitative imaging techniques more accessible for clinical applications. © 2015 Wiley Periodicals, Inc.
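After reconstruction, per-voxel T1ρ values are typically obtained by fitting a mono-exponential decay across spin-lock times. A minimal log-linear fit on synthetic data (illustrative; the study's actual fitting procedure is not specified in the abstract):

```python
import math

def fit_t1rho(tsl_ms, signals):
    """Mono-exponential T1rho fit, S(TSL) = S0 * exp(-TSL / T1rho),
    via least-squares linear regression on log(S)."""
    n = len(tsl_ms)
    ys = [math.log(s) for s in signals]
    mx = sum(tsl_ms) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(tsl_ms, ys)) / \
            sum((x - mx) ** 2 for x in tsl_ms)
    return -1.0 / slope  # T1rho in ms

# synthetic decay with a known T1rho of 45 ms
tsl = [0, 10, 20, 30, 40]
sig = [100.0 * math.exp(-t / 45.0) for t in tsl]
t1rho = fit_t1rho(tsl, sig)  # -> 45.0
```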
Nasso, Sara; Goetze, Sandra; Martens, Lennart
2015-09-04
Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improved Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
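The external calibration curve method reduces, per peptide, to fitting a line through the dilution-series responses and inverting it for unknown samples. A sketch with hypothetical numbers (not data from the study):

```python
def calibrate(known_fmol, responses):
    """Least-squares line: response = a * amount + b."""
    n = len(known_fmol)
    mx = sum(known_fmol) / n
    my = sum(responses) / n
    a = sum((x - mx) * (y - my) for x, y in zip(known_fmol, responses)) / \
        sum((x - mx) ** 2 for x in known_fmol)
    b = my - a * mx
    return a, b

def quantify(response, a, b):
    """Invert the calibration line to get an absolute amount."""
    return (response - b) / a

# hypothetical dilution series with a perfectly linear response
a, b = calibrate([0.1, 0.5, 1.0, 5.0, 10.0],
                 [12.0, 52.0, 102.0, 502.0, 1002.0])
amount = quantify(252.0, a, b)  # -> 2.5 fmol
```

Linearity over the measured dynamic range, which the abstract emphasizes, is exactly what licenses this inversion.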
Monjure, C. J.; Tatum, C. D.; Panganiban, A. T.; Arainga, M.; Traina-Dorge, V.; Marx, P. A.; Didier, E. S.
2014-01-01
Introduction: Quantification of plasma viral load (PVL) is used to monitor disease progression in SIV-infected macaques. This study aimed to optimize the performance characteristics of the quantitative PCR (qPCR) PVL assay. Methods: The PVL quantification procedure was optimized by inclusion of an exogenous control, Hepatitis C Virus armored RNA (aRNA), a plasma concentration step, extended digestion with proteinase K, and a second RNA elution step. Efficiency of viral RNA (vRNA) extraction was compared using several commercial vRNA extraction kits. Various parameters of the qPCR targeting the gag region of SIVmac239 and SIVsmE660 and the LTR region of SIVagmSAB were also optimized. Results: Modifications of the SIV PVL qPCR procedure increased vRNA recovery, reduced inhibition and improved analytical sensitivity. The PVL values determined by this SIV PVL qPCR correlated with quantification results of SIV RNA in the same samples using the "industry standard" method of branched-DNA (bDNA) signal amplification. Conclusions: Quantification of SIV genomic RNA in plasma of rhesus macaques using this optimized SIV PVL qPCR is equivalent to the bDNA signal amplification method, less costly and more versatile. Use of heterologous aRNA as an internal control is useful for optimizing performance characteristics of PVL qPCRs. PMID:24266615
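An exogenous control such as the armored RNA lets extraction and RT losses be monitored per sample. One possible use (an assumption for illustration; the abstract does not state exactly how the control was applied) is to correct the measured load by the control's recovery:

```python
def corrected_pvl(measured_copies_ml, arna_measured, arna_spiked):
    """Scale a measured plasma viral load by the recovery of a spiked
    exogenous armored-RNA control to compensate for extraction losses."""
    recovery = arna_measured / arna_spiked
    return measured_copies_ml / recovery

# 80% control recovery implies the true load is 1.25x the measured one
pvl = corrected_pvl(8e4, arna_measured=8e3, arna_spiked=1e4)  # -> 1e5 copies/mL
```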
Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A
2018-02-06
A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex ¹H NMR spectra. AQuA operates based on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r²) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely, occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large data set comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.
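The essence of interference accounting can be illustrated: if library spectra give each interferent's response at the target's preselected signal position, its contribution can be subtracted before converting the integral to a concentration. This is a deliberately simplified stand-in for AQuA, not its actual algorithm:

```python
def quantify_with_interference(observed_integral, response_factor, interferents):
    """Subtract known interferent contributions at the target signal's
    position, then convert the corrected integral to a concentration.
    `interferents` is a list of (response_at_position, concentration)."""
    corrected = observed_integral - sum(r * c for r, c in interferents)
    return corrected / response_factor

# observed integral 10.0; one interferent contributes 0.5 * 4.0 = 2.0,
# and the target's own response factor is 2.0 per concentration unit
conc = quantify_with_interference(10.0, 2.0, [(0.5, 4.0)])  # -> 4.0
```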
A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.
Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo
2008-01-01
Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4(+) T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within 1 d from blood withdrawal.
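The protocol's second and third stages, cellular DNA quantification and proviral load assessment, imply a normalization of provirus copies to cell input. A sketch of that arithmetic, assuming (hypothetically; the normalizer gene is not named in the abstract) a diploid single-copy cellular gene such as CCR5:

```python
def proviral_load_per_million(hiv_copies, cellular_gene_copies):
    """Normalize proviral copies to cell input, where cells are counted
    via qPCR of a diploid single-copy gene (2 copies per cell)."""
    cells = cellular_gene_copies / 2.0
    return hiv_copies * 1e6 / cells

# 150 proviral copies against 400,000 normalizer-gene copies (200,000 cells)
load = proviral_load_per_million(150, 400000)  # -> 750 copies per 1e6 cells
```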
Quantification of taurine in energy drinks using ¹H NMR.
Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike
2014-05-01
The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed the separation of a distinct taurine signal in ¹H NMR spectra, which was applied for integration and quantification. Quantification was performed using external calibration (R² > 0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed using both ¹H NMR and LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and at worst 10.4%. Given the close agreement with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR measurement presents a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.
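Quantification by ¹H NMR with external calibration rests on signal area being proportional to concentration times the number of contributing protons. A sketch with hypothetical integrals (taurine's CH₂ triplets each represent 2 protons; the 9-proton 5 mM reference is an assumed example, not the study's calibrant):

```python
def nmr_concentration(integral, n_protons,
                      ref_integral, ref_n_protons, ref_conc_mm):
    """Signal area scales with concentration * proton count, so the
    unknown concentration follows from the area ratio to a reference
    of known concentration."""
    return ref_conc_mm * (integral / ref_integral) * (ref_n_protons / n_protons)

# taurine CH2 signal (2 protons) against a hypothetical 5 mM reference
# singlet representing 9 protons
c = nmr_concentration(integral=4.0, n_protons=2,
                      ref_integral=3.0, ref_n_protons=9, ref_conc_mm=5.0)
```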
Lowe, Ross H.; Karschner, Erin L.; Schwilke, Eugene W.; Barnes, Allan J.; Huestis, Marilyn A.
2009-01-01
A two-dimensional (2D) gas chromatography/electron impact-mass spectrometry (GC/EI-MS) method for simultaneous quantification of Δ9-tetrahydrocannabinol (THC), 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC), and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH) in human plasma was developed and validated. The method employs 2D capillary GC and cryofocusing for enhanced resolution and sensitivity. THC, 11-OH-THC, and THCCOOH were extracted by precipitation with acetonitrile followed by solid-phase extraction. GC separation of trimethylsilyl derivatives of analytes was accomplished with two capillary columns in series coupled via a pneumatic Deans switch system. Detection and quantification were accomplished with a bench-top single quadrupole mass spectrometer operated in electron impact-selected ion monitoring mode. Limits of quantification (LOQ) were 0.125, 0.25 and 0.125 ng/mL for THC, 11-OH-THC, and THCCOOH, respectively. Accuracy ranged from 86.0 to 113.0% for all analytes. Intra- and inter-assay precision, as percent relative standard deviation, was less than 14.1% for THC, 11-OH-THC, and THCCOOH. The method was successfully applied to quantification of THC and its 11-OH-THC and THCCOOH metabolites in plasma specimens following controlled administration of THC. PMID:17640656
Quantification of DNA using the luminescent oxygen channeling assay.
Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S
2000-09-01
Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI™) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
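The linear relationship between threshold cycle and log target number is the usual real-time standard curve, and quantification inverts it. A sketch with hypothetical curve parameters (a slope of about -3.32 cycles per decade corresponds to roughly 100% PCR efficiency):

```python
def copies_from_ct(ct, slope, intercept):
    """Invert a real-time standard curve Ct = slope * log10(N) + intercept
    to recover the starting copy number N."""
    return 10 ** ((ct - intercept) / slope)

# hypothetical curve: Ct = -3.32 * log10(copies) + 38.0
n = copies_from_ct(24.72, slope=-3.32, intercept=38.0)  # ~10,000 copies
```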
Salmona, Maud; Fourati, Slim; Feghoul, Linda; Scieux, Catherine; Thiriez, Aline; Simon, François; Resche-Rigon, Matthieu; LeGoff, Jérôme
2016-08-01
Accurate quantification of Epstein-Barr virus (EBV) load in blood is essential for the management of post-transplant lymphoproliferative disorders. The automation of DNA extraction and amplification may improve accuracy and reproducibility. We evaluated the EBV PCR Kit V1 with fully automated DNA extraction and amplification on the m2000 system (Abbott assay). Conversion factor between copies and international units (IU), lower limit of quantification, imprecision and linearity were determined in a whole blood (WB) matrix. Results from 339 clinical WB specimens were compared with a home-brew real-time PCR assay used in our laboratory (in-house assay). The conversion factor between copies and IU was 3.22 copies/IU. The lower limit of quantification (LLQ) was 1000 copies/mL. Intra- and inter-assay coefficients of variation were 3.1% and 7.9% respectively for samples with EBV load higher than the LLQ. The comparison between Abbott assay and in-house assay showed a good concordance (kappa = 0.77). Loads were higher with the Abbott assay (mean difference = 0.62 log10 copies/mL). The EBV PCR Kit V1 assay on the m2000 system provides a reliable and easy-to-use method for quantification of EBV DNA in WB. Copyright © 2016 Elsevier Inc. All rights reserved.
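With the reported conversion factor, loads in copies/mL map directly to IU/mL, and the mean log10 offset against the in-house assay can be re-expressed as a fold difference:

```python
def copies_to_iu(copies_per_ml, copies_per_iu=3.22):
    """Convert a whole-blood EBV load from copies/mL to IU/mL using the
    conversion factor reported for this assay."""
    return copies_per_ml / copies_per_iu

def log10_bias_to_fold(bias_log10):
    """Express a mean log10 difference between two assays as a fold change."""
    return 10 ** bias_log10

iu = copies_to_iu(3220.0)        # -> 1000 IU/mL
fold = log10_bias_to_fold(0.62)  # the reported mean inter-assay offset, ~4.2x
```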
Dapic, Irena; Kobetic, Renata; Brkljacic, Lidija; Kezic, Sanja; Jakasa, Ivone
2018-02-01
Free fatty acids (FFAs) are among the major lipid components of the stratum corneum (SC), the uppermost layer of the skin. Relative composition of FFAs has been proposed as a biomarker of the skin barrier status in patients with atopic dermatitis (AD). Here, we developed an LC-ESI-MS/MS method for simultaneous quantification of a range of FFAs with long and very long chain length in the SC collected by adhesive tape (D-Squame). The method, based on derivatization with 2-bromo-1-methylpyridinium iodide and 3-carbinol-1-methylpyridinium iodide, allowed highly sensitive detection and quantification of FFAs using multiple reaction monitoring. For the quantification, we applied a surrogate analyte approach and internal standardization using isotope-labeled derivatives of FFAs. Adhesive tapes showed the presence of several FFAs that are also present in the SC, a problem encountered in previous studies. Therefore, the levels of FFAs in the SC were corrected using C12:0, which was present on the adhesive tape but not detected in the SC. The method was applied to SC samples from patients with atopic dermatitis and healthy subjects. Quantification using multiple reaction monitoring allowed sufficient sensitivity to analyze FFAs of chain lengths C16-C28 in the SC collected on only one tape strip. Copyright © 2017 John Wiley & Sons, Ltd.
Quantitative proton magnetic resonance spectroscopy without water suppression
NASA Astrophysics Data System (ADS)
Özdemir, M. S.; DeDeene, Y.; Fieremans, E.; Lemahieu, I.
2009-06-01
The suppression of the abundant water signal has been traditionally employed to decrease the dynamic range of the NMR signal in proton MRS (1H MRS) in vivo. When using this approach, if the intent is to utilize the water signal as an internal reference for the absolute quantification of metabolites, additional measurements are required for the acquisition of the water signal. This can be prohibitively time-consuming and is not desired clinically. Additionally, traditional water suppression can lead to metabolite alterations. This can be overcome by performing quantitative 1H MRS without water suppression. However, non-water-suppressed spectra suffer from gradient-induced frequency modulations, resulting in sidebands in the spectrum. Sidebands may overlap with the metabolites, which renders the spectral analysis and quantification problematic. In this paper, we performed absolute quantification of metabolites without water suppression. Sidebands were removed by utilizing the phase of a single-resonance external reference signal to observe the time-varying static field fluctuations induced by gradient vibration and deconvolving this phase contamination from the desired NMR signal. Metabolite concentrations were determined after sideband correction by calibrating the metabolite signal intensities against the recorded water signal. The method was evaluated by phantom and in vivo measurements in the human brain. The maximum systematic error for the quantified metabolite concentrations was found to be 10.8%, showing the feasibility of quantification after sideband correction.
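The correction amounts to measuring the gradient-induced phase on a single-resonance reference and dividing it out of the acquired FID. A sketch on synthetic complex data (illustrative, not the authors' implementation):

```python
import cmath
import math

def remove_sidebands(fid, ref_fid):
    """Deconvolve the time-varying phase observed on a single-resonance
    reference signal: s_corr(t) = s(t) * exp(-i * phi_ref(t))."""
    return [s * cmath.exp(-1j * cmath.phase(r)) for s, r in zip(fid, ref_fid)]

# synthetic tone corrupted by a vibration-like phase modulation
n = 64
phase_noise = [0.3 * math.sin(0.5 * k) for k in range(n)]
clean = [cmath.exp(1j * 0.2 * k) for k in range(n)]
fid = [c * cmath.exp(1j * p) for c, p in zip(clean, phase_noise)]
ref = [2.0 * cmath.exp(1j * p) for p in phase_noise]  # reference sees same phase
corrected = remove_sidebands(fid, ref)
```

After the correction, the modulated FID matches the clean tone, which is what removes the sidebands from its spectrum.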
Modeling qRT-PCR dynamics with application to cancer biomarker quantification.
Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A
2017-01-01
Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low abundance transcripts. The critical step for quantification is accurate estimation of efficiency needed for computing a relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling dynamics of polymerase chain reaction amplification. In contrast, only models for fluorescence intensity as a function of polymerase chain reaction cycle have been used so far for quantification. The dynamics of qRT-PCR efficiency is modeled using an ordinary differential equation model, and the fitted ordinary differential equation model is used to obtain effective polymerase chain reaction efficiency estimates needed for efficiency-adjusted quantification. The proposed new qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed polymerase chain reaction efficiency estimates provided clinically meaningful results for association between time to recurrence and longitudinal trends in GUCY2C expression.
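The idea of modeling efficiency dynamics rather than fluorescence alone can be sketched with a simple difference-equation analogue in which per-cycle efficiency decays as product accumulates (a toy model for intuition, not the authors' ODE):

```python
def amplify(e0, fmax, f0, cycles):
    """Logistic-style amplification sketch: F_{n+1} = F_n * (1 + E_n)
    with per-cycle efficiency E_n = e0 * (1 - F_n / fmax), so efficiency
    falls from its initial value toward zero as product saturates."""
    f, eff = f0, []
    for _ in range(cycles):
        e = e0 * (1.0 - f / fmax)
        eff.append(e)
        f = f * (1.0 + e)
    return f, eff

f, eff = amplify(e0=0.95, fmax=1e10, f0=100.0, cycles=40)
```

Fitting such a model to an observed amplification curve yields an effective efficiency per cycle, which is the quantity needed for efficiency-adjusted relative quantification.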
Single cell genomic quantification by non-fluorescence nonlinear microscopy
NASA Astrophysics Data System (ADS)
Kota, Divya; Liu, Jing
2017-02-01
Human epidermal growth factor receptor 2 (Her2) is a gene that plays a major role in breast cancer development. The quantification of Her2 expression in single cells is limited by several drawbacks in existing fluorescence-based single-molecule techniques, such as low signal-to-noise ratio (SNR), strong autofluorescence and background signals from biological components. For rigorous genomic quantification, a robust method of orthogonal detection is highly desirable, and we demonstrated it by two non-fluorescent imaging techniques: transient absorption microscopy (TAM) and second harmonic generation (SHG). In TAM, gold nanoparticles (AuNPs) were chosen as orthogonal probes for detection of single molecules, which gives background-free quantification of single mRNA transcripts. In SHG, emission from barium titanium oxide (BTO) nanoprobes was demonstrated, which allows a stable signal beyond the autofluorescence window. Her2 mRNA was specifically labeled with nanoprobes conjugated with antibodies or oligonucleotides and quantified at single-copy sensitivity in cancer cells and tissues. Furthermore, a non-fluorescent super-resolution concept, termed second harmonic super-resolution microscopy (SHaSM), was proposed to quantify individual Her2 transcripts in cancer cells beyond the diffraction limit. These non-fluorescent imaging modalities will provide new dimensions in biomarker quantification at single-molecule sensitivity in turbid biological samples, offering a strong cross-platform strategy for clinical monitoring at single-cell resolution.
Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard
2016-01-01
Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The "non-outlier fragment ion" (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high-priority fragment ions, these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e., the SWATH Gold Standard) indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the area of the Top3-5 NOFIs produces similar coefficients of variation as compared to the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80 against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
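A crude stand-in for the idea: fragments whose observed-to-library intensity ratio is a strong outlier from the consensus ratio get down-ranked. NOFI itself uses four feature dimensions and multivariate outlier detection; this MAD-based univariate sketch is only illustrative:

```python
def flag_interfered_fragments(fragment_areas, library_ratios, cutoff=3.5):
    """Flag fragments whose observed/library ratio deviates strongly
    from the median ratio, using the median absolute deviation (MAD).
    Returns True for fragments to assign low priority."""
    ratios = [a / r for a, r in zip(fragment_areas, library_ratios)]
    med = sorted(ratios)[len(ratios) // 2]
    devs = sorted(abs(x - med) for x in ratios)
    mad = devs[len(devs) // 2] or 1e-12   # guard against zero MAD
    return [abs(x - med) / mad > cutoff for x in ratios]

# three consistent fragments plus one inflated by co-eluting interference
flags = flag_interfered_fragments([100.0, 51.0, 24.0, 400.0],
                                  [1.0, 0.5, 0.25, 0.1])
```

Summing the areas of only the unflagged (high-priority) fragments then plays the role of the Top-N NOFI quantification described in the abstract.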
New approach for the quantification of processed animal proteins in feed using light microscopy.
Veys, P; Baeten, V
2010-07-01
A revision of the European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved straightforward to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.
Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie
2012-07-06
Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.
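The quantification step compares MS signals of labeled amino acids with their unlabeled counterparts from the co-hydrolyzed reference material. In the simplest reading (an assumption for illustration; the exact computation is not given in the abstract), each labeled/unlabeled signal ratio scales the known standard concentration, and averaging over amino acids adds robustness:

```python
def aaa_ms_quantify(signal_ratios, std_conc_um):
    """Estimate the labeled-standard concentration from per-amino-acid
    labeled/unlabeled MS signal ratios against a co-hydrolyzed reference
    of known concentration, averaging across amino acids."""
    estimates = [r * std_conc_um for r in signal_ratios]
    return sum(estimates) / len(estimates)

# hypothetical ratios for three amino acids against a 10 uM reference
conc = aaa_ms_quantify([0.52, 0.48, 0.50], std_conc_um=10.0)  # -> 5.0 uM
```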
Wang, Xiaomin; Zhang, Xiaojing; Ma, Lin; Li, Shengli
2018-06-20
Quantification of hepatic fat and iron content is important for early detection and monitoring of nonalcoholic fatty liver disease (NAFLD) patients. This study evaluated the quantification efficiency of hepatic proton density fat fraction (PDFF) by MRI using NAFLD rabbits. R2* was also measured to investigate whether it correlates with fat levels in NAFLD. The NAFLD rabbit model was successfully established by a high-fat, high-cholesterol diet. Rabbits underwent MRI examination for fat and iron analyses, which were compared with liver histological findings. MR examinations were performed on a 3.0T MR system using a multi-echo 3D gradient recalled echo (GRE) sequence. MRI-PDFF showed significant differences between steatosis grades, with medians of 3.72% (normal), 5.43% (mild), 9.11% (moderate) and 11.17% (severe), whereas R2* did not. A close correlation between MRI-PDFF and histological steatosis was observed (r=0.78, P<0.001). Hepatic iron deposits were not found in any rabbits, and there was no correlation between R2* and either MRI-PDFF or histological steatosis. Measuring MRI-PDFF and R2* simultaneously provides promising quantification of steatosis and iron. The rabbit NAFLD model confirmed the accuracy of MRI-PDFF for liver fat quantification; R2* measurement and the relationship between fat and iron in the NAFLD liver need further experimental investigation.
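The abstract above does not give the PDFF formula, but the standard definition is the fat signal as a fraction of total (fat plus water) proton signal. A minimal sketch of that definition, with made-up signal amplitudes for illustration:

```python
def pdff(fat_signal: float, water_signal: float) -> float:
    """Proton density fat fraction in percent: fat / (fat + water).

    Inputs are proton-density-weighted signal amplitudes, e.g. from a
    multi-echo GRE fit; both must be non-negative.
    """
    total = fat_signal + water_signal
    if total <= 0:
        raise ValueError("no signal")
    return 100.0 * fat_signal / total

# Hypothetical amplitudes chosen so the result lands near the
# 'moderate' steatosis median reported above (~9.11%)
print(pdff(9.11, 90.89))
```

This is only the textbook fat-fraction definition, not the confounder-corrected fitting pipeline an MRI vendor implements.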
Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel
2018-05-01
Quantification of selenated amino-acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino-acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of selenium/sulfur substitution rate in Met. Moreover the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, low limit of quantification and wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.
The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.
Frégeau, Chantal J; Laurin, Nancy
2015-05-01
The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive with a limit of quantification and limit of detection of 0.0049ng/μL and 0.0003ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios expressed in percentages derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors compared to the human and male DNA targets included in the Investigator® Quantiplex HYres kit serving as a good quality assessor of DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.
Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments
USDA-ARS's Scientific Manuscript database
Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...
ERIC Educational Resources Information Center
Solow, Mike
2004-01-01
Quantification of a contaminant in water provides first-year general chemistry students with a tangible application of mass spectrometry. The relevance of chemistry to assessing and solving environmental problems is highlighted for students when they perform mass spectrometry experiments.
In this study, a new analytical technique was developed for the identification and quantification of multi-functional compounds containing simultaneously at least one hydroxyl or one carboxylic group, or both. This technique is based on derivatizing first the carboxylic group(s) ...
Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students
ERIC Educational Resources Information Center
Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.
2014-01-01
A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…
Identification and Quantification of Soil Redoximorphic Features by Digital Image Processing
USDA-ARS's Scientific Manuscript database
Soil redoximorphic features (SRFs) have provided scientists and land managers with insight into relative soil moisture for approximately 60 years. The overall objective of this study was to develop a new method of SRF identification and quantification from soil cores using a digital camera and imag...
USDA-ARS's Scientific Manuscript database
Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive and time-consuming. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. The spe...
Colorimetric Quantification and in Situ Detection of Collagen
ERIC Educational Resources Information Center
Esteban, Francisco J.; del Moral, Maria L.; Sanchez-Lopez, Ana M.; Blanco, Santos; Jimenez, Ana; Hernandez, Raquel; Pedrosa, Juan A.; Peinado, Maria A.
2005-01-01
A simple multidisciplinary and inexpensive laboratory exercise is proposed, in which the undergraduate student may correlate biochemical and anatomical findings. The entire practical session can be completed in one 2.5-3 hour laboratory period, and consists of the quantification of collagen and total protein content from tissue sections--without…
Uncertainty quantification in nanomechanical measurements using the atomic force microscope
Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman
2011-01-01
Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...
Quantification of micro stickies
Mahendra Doshi; Jeffrey Dyer; Salman Aziz; Kristine Jackson; Said M. Abubakr
1997-01-01
The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...
USDA-ARS's Scientific Manuscript database
Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...
USDA-ARS's Scientific Manuscript database
The pathogen causing corky root on lettuce, Sphingobium suberifaciens, is recalcitrant to standard epidemiological methods. Primers were selected from 16S rDNA sequences useful for the specific detection and quantification of S. suberifaciens. Conventional (PCR) and quantitative (qPCR) PCR protocols...
Quantification of excess water loss in plant canopies warmed with infrared heating
USDA-ARS's Scientific Manuscript database
Here we investigate the extent to which infrared heating used to warm plant canopies in climate manipulation experiments increases transpiration. Concerns regarding the impact of the infrared heater technique on the water balance have been raised before, but a quantification is lacking. We calculate...
The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...
DOT National Transportation Integrated Search
2016-09-13
The Hampton Roads Climate Impact Quantification Initiative (HRCIQI) is a multi-part study sponsored by the U.S. Department of Transportation (DOT) Climate Change Center with the goals that include developing a cost tool that provides methods for volu...
21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...
43 CFR 11.71 - Quantification phase-service reduction quantification.
Code of Federal Regulations, 2012 CFR
2012-10-01
... identified geohydrological units, which are aquifers or confining layers, within the assessment area. (2)(i... determined by determining: (1) The surface area of soil with reduced ability to sustain the growth of vegetation from the baseline level; (2) The surface area or volume of soil with reduced suitability as...
43 CFR 11.71 - Quantification phase-service reduction quantification.
Code of Federal Regulations, 2013 CFR
2013-10-01
... identified geohydrological units, which are aquifers or confining layers, within the assessment area. (2)(i... determined by determining: (1) The surface area of soil with reduced ability to sustain the growth of vegetation from the baseline level; (2) The surface area or volume of soil with reduced suitability as...
43 CFR 11.71 - Quantification phase-service reduction quantification.
Code of Federal Regulations, 2014 CFR
2014-10-01
... identified geohydrological units, which are aquifers or confining layers, within the assessment area. (2)(i... determined by determining: (1) The surface area of soil with reduced ability to sustain the growth of vegetation from the baseline level; (2) The surface area or volume of soil with reduced suitability as...
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2012 CFR
2012-07-01
... was applied by the Commission in its most recent Annual Compliance Determination unless a different analytical principle subsequently was accepted by the Commission in a final rule. (b) Accepted quantification technique refers to a quantification technique that was applied in the most recent iteration of the periodic...
39 CFR 3050.1 - Definitions applicable to this part.
Code of Federal Regulations, 2014 CFR
2014-07-01
... was applied by the Commission in its most recent Annual Compliance Determination unless a different analytical principle subsequently was accepted by the Commission in a final rule. (b) Accepted quantification technique refers to a quantification technique that was applied in the most recent iteration of the periodic...
The quantification of pattern is a key element of landscape analyses. One aspect of this quantification of particular importance to landscape ecologists regards the classification of continuous variables to produce categorical variables such as land-cover type or elevation strat...
Good quantification practices of flavours and fragrances by mass spectrometry.
Begnaud, Frédéric; Chaintreau, Alain
2016-10-28
Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanding list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analyte methods. In this article, we present the experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.
Elmer-Dixon, Margaret M; Bowler, Bruce E
2018-05-19
A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method requires only a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input to a Matlab program that calculates the total vesicle concentration and the concentration of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.
2012-01-01
Naturally occurring native peptides provide important information about the physiological state of an organism and its changes in disease conditions, but protocols and methods for assessing their abundance are not well developed. In this paper, we describe a simple procedure for the quantification of non-tryptic peptides in body fluids. The workflow includes an enrichment step followed by two-dimensional fractionation of native peptides and MS/MS data management, facilitating the design and validation of LC-MRM MS assays. The added value of the workflow is demonstrated through the development of a triplex LC-MRM MS assay used for quantification of peptides potentially associated with the progression of liver disease to hepatocellular carcinoma. PMID:22304756
Lesion Quantification in Dual-Modality Mammotomography
NASA Astrophysics Data System (ADS)
Li, Heng; Zheng, Yibin; More, Mitali J.; Goodale, Patricia J.; Williams, Mark B.
2007-02-01
This paper describes a novel x-ray/SPECT dual-modality breast imaging system that provides 3D structural and functional information. While only a limited number of views on one side of the breast can be acquired due to mechanical and time constraints, we developed a technique to compensate for the limited-angle artifact in reconstructed images and to accurately estimate both the lesion size and radioactivity concentration. Various angular sampling strategies were evaluated using both simulated and experimental data. It was demonstrated that quantification of lesion size to an accuracy of 10% and quantification of radioactivity to an accuracy of 20% are feasible from limited-angle data acquired with clinically practical dosage and acquisition time.
Mota, Maria Fernanda S; Souza, Marcella F; Bon, Elba P S; Rodrigues, Marcoaurelio A; Freitas, Suely Pereira
2018-05-24
The use of colorimetric methods for protein quantification in microalgae is hindered by their elevated amounts of membrane-embedded intracellular proteins. In this work, the protein content of three species of microalgae was determined by the Lowry method after the cells were dried, ball-milled, and treated with the detergent sodium dodecyl sulfate (SDS). Results demonstrated that the association of milling and SDS treatment resulted in a 3- to 7-fold increase in protein quantification. Milling promoted microalgal disaggregation and cell wall disruption enabling access of the SDS detergent to the microalgal intracellular membrane proteins and their efficient solubilization and quantification. © 2018 Phycological Society of America.
Frauen, M; Steinhart, H; Rapp, C; Hintze, U
2001-07-01
A simple, rapid and reproducible method for identification and quantification of iodopropynyl butylcarbamate (IPBC) in different cosmetic formulations is presented. The determination was carried out using a high-performance liquid chromatography (HPLC) procedure on a reversed phase column coupled to a single quadrupole mass spectrometer (MS) via an electrospray ionization (ESI) interface. Detection was performed in the positive selected ion-monitoring mode. In methanol/water extracts from different cosmetic formulations a detection limit between 50 and 100 ng/g could be achieved. A routine analytical procedure could be set up with good quantification reliability (relative standard deviation between 0.9 and 2.9%).
Quantification of Wine Mixtures with an Electronic Nose and a Human Panel.
Aleixandre, Manuel; Cabellos, Juan M; Arroyo, Teresa; Horrillo, M C
2018-01-01
In this work, an electronic nose and a human panel were used for the quantification of wines formed by binary mixtures of four white grape varieties and two varieties of red wines at different percentages (from 0 to 100% in 10% steps for the electronic nose and from 0 to 100% in 25% steps for the human panel). The wines were prepared using the traditional method with commercial yeasts. Both techniques were able to quantify the mixtures tested, but it is important to note that the technology of the electronic nose is faster, simpler, and more objective than the human panel. In addition, better results of quantification were also obtained using the electronic nose.
Quantification and Formalization of Security
2010-02-01
Contents excerpt: Quantification of Information Flow (p. 30); 2.4 Language Semantics (p. 46). ... system behavior observed by users holding low clearances. This policy, or a variant of it, is enforced by many programming-language-based mechanisms ... illustrates with a particular programming language (while-programs plus probabilistic choice). The model is extended in §2.5 to programs in which ...
Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Fang; Liu, Tao; Qian, Weijun
2011-07-22
Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics has become increasingly applied for a broad range of biological applications due to growing capabilities for broad proteome coverage and good accuracy in quantification. Herein, we review the current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... DEPARTMENT OF AGRICULTURE [Docket Number: USDA-2013-0003] Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From Agriculture and Forestry Practices AGENCY: Office of the... of Agriculture (USDA) has prepared a report containing methods for quantifying entity-scale...
ERIC Educational Resources Information Center
Lazarsfeld, Paul F., Ed.
Part two of a seven-section final report on the Multi-Disciplinary Graduate Program in Educational Research, this document contains discussions of quantification and reason analysis. Quantification is presented as a language consisting of sentences (graphs and tables), words (classificatory instruments), and grammar (rules for constructing and…
Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...
Osorio, I; Frei, M G
2009-11-01
Substantive advances in clinical epileptology may be realized through the judicious use of real-time automated seizure detection, quantification, warning, and delivery of therapy in subjects with pharmacoresistant seizures. Materialization of these objectives is likely to elevate epileptology to the level of a mature clinical science.
Being Something: Prospects for a Property-Based Approach to Predicative Quantification
ERIC Educational Resources Information Center
Rieppel, Michael Olivier
2013-01-01
Few questions concerning the character of our talk about the world are more basic than how predicates combine with names to form truth-evaluable sentences. One particularly intriguing fact that any account of predication needs to make room for is that natural language allows for quantification into predicate position, through constructions like…
Quantification of protein carbonylation.
Wehr, Nancy B; Levine, Rodney L
2013-01-01
Protein carbonylation is the most commonly used measure of oxidative modification of proteins. It is most often measured spectrophotometrically or immunochemically by derivatizing proteins with the classical carbonyl reagent 2,4 dinitrophenylhydrazine (DNPH). We present protocols for the derivatization and quantification of protein carbonylation with these two methods, including a newly described dot blot with greatly increased sensitivity.
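The spectrophotometric DNPH assay mentioned above rests on the Beer-Lambert law: carbonyl concentration is computed from the hydrazone absorbance (commonly read near 370 nm) using a molar absorption coefficient of about 22,000 M⁻¹ cm⁻¹, then normalized to protein content. A minimal sketch of that arithmetic, with hypothetical readings:

```python
def carbonyl_nmol_per_mg(a370: float, protein_mg_per_ml: float,
                         epsilon: float = 22_000.0, path_cm: float = 1.0) -> float:
    """Carbonyl content (nmol per mg protein) via Beer-Lambert.

    a370: absorbance of the DNP-hydrazone; epsilon: molar absorption
    coefficient (~22,000 M^-1 cm^-1); path_cm: cuvette path length.
    """
    molar = a370 / (epsilon * path_cm)   # mol/L of hydrazone
    nmol_per_ml = molar * 1e6            # 1 mol/L == 1e6 nmol/mL
    return nmol_per_ml / protein_mg_per_ml

# Hypothetical example: A370 = 0.044 at 0.5 mg/mL protein -> 4 nmol/mg
print(carbonyl_nmol_per_mg(0.044, 0.5))
```

The example readings are invented for illustration; only the coefficient and the unit conversion are standard.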
Project research optimized the quantification technique for carbohydrates that also allows quantification of other non-polar molecular markers based on using an isotopically labeled internal standard (D-glucose-1,2,3,4,5,6,6-d7) to monitor extraction efficiency, extraction usi...
USDA-ARS's Scientific Manuscript database
Citrus limonoid glucosides are found in large quantities in citrus fruits and seeds. Characterization and quantification of these compounds is important because they contribute to citrus quality and are reported to be biologically active. Unlike other bioactive compounds (e.g., flavonoids) present...
USDA-ARS's Scientific Manuscript database
A general method was developed for the quantification of hydroxycinnamic acid derivatives and flavones, flavonols, and their glycosides based on the UV molar relative response factors (MRRF) of the standards. Each of these phenolic compounds contains a cinnamoyl structure and has a maximum absorban...
Quantification of Spatial Heterogeneity in Old-Growth Forest of Korean Pine
Wang Zhengquan; Wang Qingcheng; Zhang Yandong
1997-01-01
Spatial heterogeneity is a very important issue in studying the functions and processes of ecological systems at various scales. Semivariogram analysis is an effective technique for summarizing spatial data and quantifying spatial heterogeneity. In this paper, we propose some principles for using semivariograms to characterize and compare the spatial heterogeneity of...
USDA-ARS's Scientific Manuscript database
Satellite remote sensing provides continuous temporal and spatial information of terrestrial ecosystems. Using these remote sensing data and eddy flux measurements and biogeochemical models, such as the Terrestrial Ecosystem Model (TEM), should provide a more adequate quantification of carbon dynami...
USDA-ARS's Scientific Manuscript database
The ectoparasitic stubby root nematode Paratrichodorus allius transmits Tobacco rattle virus, which causes corky ringspot disease resulting in significant economic losses in the potato industry. This study developed a diagnostic method for direct quantification of P. allius from soil DNA using a Taq...
USDA-ARS's Scientific Manuscript database
Thirteen molecular species of tetraacylglycerols in the seed oil of Physaria fendleri were recently identified. We report here the quantification of these tetraacylglycerols using HPLC with evaporative light scattering detector and the MS of the HPLC fractions. Ion signal intensities of MS1 from th...
Quantification of fungicides in snow-melt runoff from turf: A comparison of four extraction methods
USDA-ARS's Scientific Manuscript database
A variety of pesticides are used to control diverse stressors to turf. These pesticides have a wide range in physical and chemical properties. The objective of this project was to develop an extraction and analysis method for quantification of chlorothalonil and PCNB (pentachloronitrobenzene), two p...
USDA-ARS's Scientific Manuscript database
The root-lesion nematode Pratylenchus penetrans is a major pathogen of potato world-wide. Yield losses may be exacerbated by interaction with the fungus Verticillium dahliae in the Potato early dying disease complex. Accurate identification and quantification of P. penetrans prior to planting are es...
USDA-ARS's Scientific Manuscript database
RNA integrity is critical for successful RNA quantification. The level of integrity required differs among sources and extraction procedures and has not been determined for bacterial RNA. Three RNA isolation methods were evaluated for their ability to produce high quality RNA from D. dadantii. The i...
USDA-ARS's Scientific Manuscript database
Hyperspectral imaging technology is increasingly regarded as a powerful tool for the classification and spatial quantification of a wide range of agrofood product properties. Taking into account the difficulties involved in validating hyperspectral calibrations, the models constructed here proved mo...
Calderón-Celis, Francisco; Sanz-Medel, Alfredo; Encinar, Jorge Ruiz
2018-01-23
We present a novel and highly sensitive ICP-MS approach for the absolute quantification of all important target biomolecules containing P, S, Se, As, Br, and/or I (e.g., proteins and phosphoproteins, metabolites, pesticides, drugs), under the same simple instrumental conditions and without requiring any specific and/or isotopically enriched standard.
Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li
2010-07-01
The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expression is a result of true biological or methodical variation. MATERIAL & METHODS: Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We observed significant variations in protein concentrations following assessment with the Lowry versus the Bradford method using identical samples. Greater variations in protein concentration readings were observed over time, and in samples with higher concentrations, with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on western blot. We show for the first time that the methodical variations observed in these protein assay techniques can translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider the methodical approach to protein quantification in techniques that report quantitative differences.
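Both the Lowry and Bradford assays discussed above are colorimetric: an unknown is read against a standard curve (typically BSA dilutions) fitted by least squares. A minimal sketch with hypothetical, perfectly linear A595 readings to show the calibration arithmetic:

```python
# Hypothetical BSA standards: (concentration in ug/mL, A595 reading)
standards = [(0.0, 0.00), (5.0, 0.06), (10.0, 0.12), (20.0, 0.24), (40.0, 0.48)]

def fit_line(points):
    """Ordinary least-squares slope and intercept for a standard curve."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

slope, intercept = fit_line(standards)
unknown_a595 = 0.18
# Invert the curve: concentration = (absorbance - intercept) / slope
print((unknown_a595 - intercept) / slope)  # 15.0 ug/mL for these made-up readings
```

Real assay data are not perfectly linear (Bradford curves in particular flatten at high concentration), which is one reason the two methods can disagree on the same sample.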
[DNA quantification of blood samples pre-treated with pyramidon].
Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan
2014-06-01
To study DNA quantification and STR typing of samples pre-treated with pyramidon. Blood samples from ten unrelated individuals were anticoagulated in EDTA and bloodstains were made on filter paper. The samples were divided into six experimental groups according to storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given extraction method, the recovered DNA decreased gradually with time after pre-treatment with pyramidon; for a given storage time, DNA yields differed significantly between extraction methods. Complete sixteen-locus STR profiles were obtained for 90.56% of the samples. Pyramidon pre-treatment causes DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction was the best method for DNA extraction and STR profiling.
Quantification of allantoin in various Zea mays L. hybrids by RP-HPLC with UV detection.
Maksimović, Z; Malenović, A; Jancić, B; Kovacević, N
2004-07-01
An RP-HPLC method for quantification of allantoin in the silk of fifteen maize hybrids (Zea mays L., Poaceae) was described. Following extraction of the plant material with an acetone-water (7:3, V/V) mixture, filtration and dilution, the extracts were analyzed without prior chemical derivatization. Separation and quantification were achieved using an Alltech Econosil C18 column under isocratic conditions at 40 degrees C. The mobile phase (20% methanol, 80% water with 5 mM sodium lauryl sulfate added at pH 2.5, adjusted with 85% orthophosphoric acid; the pH of the aqueous phase was finally adjusted to 6.0 by addition of triethylamine) was maintained at a flow of 1.0 mL/min. Column effluent was monitored at 235 nm. This simple procedure afforded efficient separation and quantification of allantoin in plant material, without interference from polyphenols or other plant constituents of medium to high polarity or similar UV absorption. Our study revealed that the silk of all investigated maize hybrids could be considered relatively rich in allantoin, covering the concentration range between 215 and 289 mg per 100 g of dry plant material.
Protein quantification using a cleavable reporter peptide.
Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno
2015-02-06
Peptide and protein quantification based on isotope dilution and mass spectrometry analysis are widely employed for the measurement of biomarkers and in systems biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable-isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as a surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable-isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.
Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.
Becerra, Valentina; Odermatt, Jürgen
2013-02-01
This article analyses the interferences in the quantification of traces of bisphenol S in paper by applying the direct analytical method "analytical pyrolysis gas chromatography mass spectrometry" (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are analysed simultaneously with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples that include bisphenol S derivative compounds. As free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of tetramethylammonium hydroxide (TMAH) under pyrolytic conditions. In order to avoid the formation of bisphenol S, trimethylsulphonium hydroxide (TMSH) is introduced. Different parameters are optimised in the development of the quantification method with TMSH. The quantification method based on TMSH thermochemolysis has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
Sandra, Koen; Mortier, Kjell; Jorge, Lucie; Perez, Luis C; Sandra, Pat; Priem, Sofie; Poelmans, Sofie; Bouche, Marie-Paule
2014-05-01
Nanobodies(®) are therapeutic proteins derived from the smallest functional fragments of heavy chain-only antibodies. The development and validation of an LC-MS/MS-based method for the quantification of an IgE binding Nanobody in cynomolgus monkey plasma is presented. Nanobody quantification was performed making use of a proteotypic tryptic peptide chromatographically enriched prior to LC-MS/MS analysis. The validated LLOQ at 36 ng/ml was measured with an intra- and inter-assay precision and accuracy <20%. The required sensitivity could be obtained based on the selectivity of 2D LC combined with MS/MS. No analyte specific tools for affinity purification were used. Plasma samples originating from a PK/PD study were analyzed and compared with the results obtained with a traditional ligand-binding assay. Excellent correlations between the two techniques were obtained, and similar PK parameters were estimated. A 2D LC-MS/MS method was successfully developed and validated for the quantification of a next generation biotherapeutic.
Standardless quantification by parameter optimization in electron probe microanalysis
NASA Astrophysics Data System (ADS)
Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.
2012-11-01
A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
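The core of the parameter-optimization approach is a least-squares fit of an analytical spectrum model to the measured spectrum. The sketch below uses a toy model (one Gaussian line on a linear background) rather than POEMA's full physical description; all peak parameters and noise levels are invented for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

def model(params, e):
    """Toy analytical spectrum: one Gaussian characteristic line on a linear background."""
    amp, center, width, b0, b1 = params
    return amp * np.exp(-0.5 * ((e - center) / width) ** 2) + b0 + b1 * e

def residuals(params, e, spectrum):
    """Differences whose sum of squares is minimized."""
    return model(params, e) - spectrum

# Simulate an "experimental" spectrum from known parameters plus noise.
e = np.linspace(0.0, 10.0, 200)
true = np.array([100.0, 5.0, 0.3, 2.0, -0.1])
rng = np.random.default_rng(0)
spectrum = model(true, e) + rng.normal(0, 0.5, e.size)

# Optimize the model parameters against the spectrum.
fit = least_squares(residuals, x0=[80.0, 4.8, 0.5, 1.0, 0.0], args=(e, spectrum))
print(np.round(fit.x[:3], 2))  # fitted amplitude, line energy, width
```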
Simpson, Deborah M; Beynon, Robert J
2012-09-01
Systems biology requires knowledge of the absolute amounts of proteins in order to model biological processes and simulate the effects of changes in specific model parameters. Quantification concatamers (QconCATs) are established as a method to provide multiplexed absolute peptide standards for a set of target proteins in isotope dilution standard experiments. Two or more quantotypic peptides representing each of the target proteins are concatenated into a designer gene that is metabolically labelled with stable isotopes in Escherichia coli or other cellular or cell-free systems. Co-digestion of a known amount of QconCAT with the target proteins generates a set of labelled reference peptide standards for the unlabelled analyte counterparts, and by using an appropriate mass spectrometry platform, comparison of the intensities of the peptide ratios delivers absolute quantification of the encoded peptides and in turn the target proteins for which they are surrogates. In this review, we discuss the criteria and difficulties associated with surrogate peptide selection and provide examples in the design of QconCATs for quantification of the proteins of the nuclear factor κB pathway.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzke, Melissa M.; Brown, Joseph N.; Gritsenko, Marina A.
2013-02-01
Liquid chromatography coupled with mass spectrometry (LC-MS) is widely used to identify and quantify peptides in complex biological samples. In particular, label-free shotgun proteomics is highly effective for the identification of peptides and subsequently obtaining a global protein profile of a sample. As a result, this approach is widely used for discovery studies. Typically, the objective of these discovery studies is to identify proteins that are affected by some condition of interest (e.g. disease, exposure). However, for complex biological samples, label-free LC-MS proteomics experiments measure peptides and do not directly yield protein quantities. Thus, protein quantification must be inferred from one or more measured peptides. In recent years, many computational approaches to relative protein quantification of label-free LC-MS data have been published. In this review, we examine the most commonly employed quantification approaches to relative protein abundance from peak intensity values, evaluate their individual merits, and discuss challenges in the use of the various computational approaches.
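A typical intensity-based rollup infers a protein's relative abundance from its measured peptides, e.g. by averaging the top-N most intense ones. A minimal sketch (the top-3 rule and the intensities are illustrative, not the review's prescription):

```python
from collections import defaultdict

def protein_abundance(peptide_rows, top_n=3):
    """Infer relative protein abundance as the mean of the top-N most intense peptides
    (one common label-free rollup; alternatives use sums or medians)."""
    by_protein = defaultdict(list)
    for protein, intensity in peptide_rows:
        by_protein[protein].append(intensity)
    return {p: sum(sorted(v, reverse=True)[:top_n]) / min(len(v), top_n)
            for p, v in by_protein.items()}

# Hypothetical (protein, peptide peak intensity) pairs:
rows = [("P1", 5e6), ("P1", 3e6), ("P1", 1e6), ("P1", 2e5), ("P2", 8e5)]
print(protein_abundance(rows))  # P1 -> 3e6 (mean of top 3), P2 -> 8e5
```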
Lamb Wave Damage Quantification Using GA-Based LS-SVM.
Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong
2017-06-12
Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
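The modeling step above can be sketched as follows: LS-SVM regression reduces to solving one linear system per hyperparameter pair, and the hyperparameters are then optimized globally. In this illustration a plain random search stands in for the paper's genetic algorithm, and the three "damage features" mapped to crack size are synthetic:

```python
import numpy as np

def rbf(X1, X2, sigma):
    """RBF (Gaussian) kernel matrix between two sample sets."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Solve the LS-SVM regression linear system for the bias b and coefficients alpha."""
    n = len(y)
    K = rbf(X, X, sigma) + np.eye(n) / gamma
    A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
                  [np.ones((n, 1)), K]])
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]

def lssvm_predict(X, Xtr, b, alpha, sigma):
    return rbf(X, Xtr, sigma) @ alpha + b

# Hypothetical damage-feature data: 3 features -> crack size (synthetic).
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, (40, 3))
y = 2.0 * X[:, 0] + X[:, 1] ** 2 + 0.05 * rng.normal(size=40)

# Random search stands in here for the paper's genetic algorithm over (gamma, sigma).
best = None
for _ in range(50):
    gamma, sigma = 10 ** rng.uniform(-1, 3), 10 ** rng.uniform(-1, 1)
    b, alpha = lssvm_fit(X[:30], y[:30], gamma, sigma)
    err = np.mean((lssvm_predict(X[30:], X[:30], b, alpha, sigma) - y[30:]) ** 2)
    if best is None or err < best[0]:
        best = (err, gamma, sigma)
print(round(best[0], 4))  # held-out MSE of the best hyperparameter pair
```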
Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.
Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer's Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
Chai, Liuying; Zhang, Jianwei; Zhang, Lili; Chen, Tongsheng
2015-03-01
Spectral measurement of fluorescence resonance energy transfer (FRET), spFRET, is a widely used FRET quantification method in living cells today. We set up a spectrometer-microscope platform that consists of a miniature fiber optic spectrometer and a widefield fluorescence microscope for the spectral measurement of absolute FRET efficiency (E) and acceptor-to-donor concentration ratio (R(C)) in single living cells. The microscope was used for guiding cells and the spectra were simultaneously detected by the miniature fiber optic spectrometer. Moreover, our platform has independent excitation and emission controllers, so different excitations can share the same emission channel. In addition, we developed a modified spectral FRET quantification method (mlux-FRET) for the multiple donors and multiple acceptors FRET construct (mD∼nA) sample, and we also developed a spectra-based 2-channel acceptor-sensitized FRET quantification method (spE-FRET). We implemented these modified FRET quantification methods on our platform to measure the absolute E and R(C) values of tandem constructs with different acceptor/donor stoichiometries in single living Huh-7 cells.
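Spectral FRET methods rest on unmixing the measured emission spectrum into donor and acceptor reference spectra; the fitted coefficients then feed the E and R(C) formulas. A minimal linear-unmixing sketch with synthetic reference spectra (not the mlux-FRET algorithm itself):

```python
import numpy as np

wl = np.linspace(450, 650, 100)
donor_ref = np.exp(-0.5 * ((wl - 510) / 15) ** 2)     # hypothetical donor emission shape
acceptor_ref = np.exp(-0.5 * ((wl - 580) / 20) ** 2)  # hypothetical acceptor emission shape

# Simulated single-cell spectrum: 0.7 donor + 0.4 acceptor plus detector noise.
rng = np.random.default_rng(4)
measured = 0.7 * donor_ref + 0.4 * acceptor_ref + 0.005 * rng.normal(size=wl.size)

# Least-squares unmixing recovers the donor and acceptor contributions.
A = np.column_stack([donor_ref, acceptor_ref])
coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(np.round(coeffs, 2))  # close to [0.7, 0.4]
```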
Quantification of Efficiency of Beneficiation of Lunar Regolith
NASA Technical Reports Server (NTRS)
Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya
2011-01-01
Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for oxygen production during in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, making the standard analysis methods time-prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy-dispersive spectroscopy (SEM/EDS) were used to automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, given the known limitations of the techniques.
Quantification of free circulating tumor DNA as a diagnostic marker for breast cancer.
Catarino, Raquel; Ferreira, Maria M; Rodrigues, Helena; Coelho, Ana; Nogal, Ana; Sousa, Abreu; Medeiros, Rui
2008-08-01
To determine whether the amounts of circulating DNA could discriminate between breast cancer patients and healthy individuals, we used real-time PCR quantification methodology. Our standard protocol for quantification of cell-free plasma DNA involved 175 consecutive patients with breast cancer and 80 healthy controls. We found increased levels of circulating DNA in breast cancer patients compared to control individuals (105.2 vs. 77.06 ng/mL, p < 0.001). We also found statistically significant differences in circulating DNA amounts in patients before and after breast surgery (105.2 vs. 59.0 ng/mL, p = 0.001). Increased plasma cell-free DNA concentration was a strong risk factor for breast cancer (OR, 12.32; 95% CI, 2.09-52.28; p < 0.001). Quantification of circulating DNA by real-time PCR may be a good and simple tool for detection of breast cancer, with potential clinical applicability alongside other current methods used for monitoring the disease.
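The reported odds ratio with its 95% confidence interval can be reproduced from a 2x2 table with the standard log-OR (Woolf) method. The counts below are hypothetical, chosen only to illustrate the calculation:

```python
import math

def odds_ratio(a, b, c, d):
    """OR and 95% CI (Woolf/log method) for a 2x2 table:
    a = exposed cases, b = unexposed cases, c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts for high vs. low cell-free DNA in patients and controls:
print(tuple(round(v, 2) for v in odds_ratio(120, 55, 20, 60)))  # (6.55, 3.6, 11.91)
```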
Pérez-Castaño, Estefanía; Sánchez-Viñas, Mercedes; Gázquez-Evangelista, Domingo; Bagur-González, M Gracia
2018-01-15
This paper describes and discusses the application of chromatographic fingerprints of trimethylsilyl (TMS) 4,4'-desmethylsterol derivatives (obtained from an off-line HPLC-GC-FID system) for the quantification of extra virgin olive oil in commercial vinaigrettes, salad dressings and in-house reference materials (i-HRM) using two different partial least squares regression (PLS-R) multivariate quantification methods. Different data pre-processing strategies were evaluated, the complete sequence being: (i) internal normalization; (ii) down-sampling based on the Nyquist theorem; (iii) internal correlation optimized shifting (icoshift); (iv) baseline correction; (v) mean centering; and (vi) zone selection. The first model corresponds to a matrix of dimensions 'n×911' variables and the second one to a matrix of dimensions 'n×431' variables. It has to be highlighted that the two proposed PLS-R models allow the quantification of extra virgin olive oil in binary blends, foodstuffs, etc., when its percentage exceeds 25%. Copyright © 2017 Elsevier Ltd. All rights reserved.
Consistency of flow quantifications in tridirectional phase-contrast MRI
NASA Astrophysics Data System (ADS)
Unterhinninghofen, R.; Ley, S.; Dillmann, R.
2009-02-01
Tridirectionally encoded phase-contrast MRI is a technique to non-invasively acquire time-resolved velocity vector fields of blood flow. These may not only be used to analyze pathological flow patterns, but also to quantify flow at arbitrary positions within the acquired volume. In this paper we examine the validity of this approach by analyzing the consistency of related quantifications instead of comparing them with an external reference measurement. Datasets of the thoracic aorta were acquired from 6 pigs, 1 healthy volunteer and 3 patients with artificial aortic valves. Using in-house software, an elliptical flow quantification plane was placed manually at 6 positions along the descending aorta, where it was rotated to 5 different angles. For each configuration, flow was computed based on the original data and on data that had been corrected for phase offsets. Results reveal that quantifications are more dependent on changes in position than on changes in angle. Phase offset correction considerably reduces this dependency. Overall consistency is good, with a maximum variation coefficient of 9.9% and a mean variation coefficient of 7.2%.
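The consistency metric used here, the coefficient of variation across repeated quantifications of the same vessel, is straightforward to compute. A sketch with hypothetical flow values:

```python
import numpy as np

def variation_coefficient(flows):
    """CV (%) of repeated flow quantifications; a consistency measure when
    no external reference measurement is available."""
    flows = np.asarray(flows, dtype=float)
    return 100.0 * flows.std(ddof=1) / flows.mean()

# Hypothetical flow values (mL/s) from one plane rotated to 5 angles:
print(round(variation_coefficient([82.0, 79.5, 85.1, 80.2, 83.4]), 1))  # -> 2.8
```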
Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan
2012-01-01
This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC-ICP-MS), and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections an overview on general aspects of biomolecule quantification, as well as of labelling, will be presented, emphasizing the potential which lies in such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). Fundamental methodology of elemental labelling will be highlighted and analytical, as well as biomedical, applications will be presented. A special focus will lie on established applications underlining benefits and bottlenecks of such approaches for the implementation in real life analysis. Key research made in this field will be summarized and a perspective for future developments including sophisticated and innovative applications will be given. PMID:23062431
Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS
Chitranshi, Priyanka; da Costa, Gonçalo Gamboa
2016-01-01
We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
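Single-point standard addition, as used above, infers the analyte concentration from the signal increase produced by a known spike, assuming a linear detector response. A sketch with hypothetical SIR responses:

```python
def standard_addition(signal_sample, signal_spiked, added_conc):
    """Single-point standard addition: the analyte concentration follows from the
    signal increase caused by a known spike (assumes a linear response)."""
    return added_conc * signal_sample / (signal_spiked - signal_sample)

# Hypothetical SIR responses: unspiked drink vs. the same drink spiked with 10 ug/mL BVO.
print(round(standard_addition(1.50e5, 2.50e5, 10.0), 1))  # -> 15.0 ug/mL
```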
Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS
Brown, C. S.; Zhang, Hongbin
2016-05-24
Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS), a coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform these analyses with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel centerline temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel centerline temperature, and maximum outer clad surface temperature.
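The sensitivity measures named above (Pearson, Spearman, and partial correlation between each uncertain input and a figure of merit) can be sketched on synthetic samples; the input distributions and the linear MDNBR response below are invented for illustration:

```python
import numpy as np
from scipy import stats

def partial_corr(x, y, z):
    """Partial correlation of x and y controlling for z (residual method)."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return stats.pearsonr(rx, ry)[0]

# Synthetic uncertain inputs and a toy MDNBR response (all values hypothetical).
rng = np.random.default_rng(3)
n = 500
inlet_T = rng.normal(565.0, 3.0, n)   # coolant inlet temperature (K)
power = rng.normal(1.0, 0.02, n)      # relative core power
mdnbr = 2.5 - 0.03 * (inlet_T - 565.0) - 0.5 * (power - 1.0) + rng.normal(0, 0.02, n)

pearson = stats.pearsonr(inlet_T, mdnbr)[0]
spearman = stats.spearmanr(inlet_T, mdnbr)[0]
partial = partial_corr(inlet_T, mdnbr, power)
print(round(pearson, 2), round(spearman, 2), round(partial, 2))
```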
Kauppinen, Ari; Toiviainen, Maunu; Korhonen, Ossi; Aaltonen, Jaakko; Järvinen, Kristiina; Paaso, Janne; Juuti, Mikko; Ketolainen, Jarkko
2013-02-19
During the past decade, near-infrared (NIR) spectroscopy has been applied for in-line moisture content quantification during a freeze-drying process. However, NIR has been used as a single-vial technique and thus is not representative of the entire batch. This has been considered as one of the main barriers for NIR spectroscopy becoming widely used in process analytical technology (PAT) for freeze-drying. Clearly it would be essential to monitor samples that reliably represent the whole batch. The present study evaluated multipoint NIR spectroscopy for in-line moisture content quantification during a freeze-drying process. Aqueous sucrose solutions were used as model formulations. NIR data was calibrated to predict the moisture content using partial least-squares (PLS) regression with Karl Fischer titration being used as a reference method. PLS calibrations resulted in root-mean-square error of prediction (RMSEP) values lower than 0.13%. Three noncontact, diffuse reflectance NIR probe heads were positioned on the freeze-dryer shelf to measure the moisture content in a noninvasive manner, through the side of the glass vials. The results showed that the detection of unequal sublimation rates within a freeze-dryer shelf was possible with the multipoint NIR system in use. Furthermore, in-line moisture content quantification was reliable especially toward the end of the process. These findings indicate that the use of multipoint NIR spectroscopy can achieve representative quantification of moisture content and hence a drying end point determination to a desired residual moisture level.
Matsumoto, Yasunori; Kano, Masayuki; Akutsu, Yasunori; Hanari, Naoyuki; Hoshino, Isamu; Murakami, Kentaro; Usui, Akihiro; Suito, Hiroshi; Takahashi, Masahiko; Otsuka, Ryota; Xin, Hu; Komatsu, Aki; Iida, Keiko; Matsubara, Hisahiro
2016-11-01
Exosomes play important roles in cancer progression. Although their contents (e.g., proteins and microRNAs) have been a focus of cancer research, particularly as potential diagnostic markers, exosome behavior and methods for exosome quantification remain unclear. In the present study, we analyzed tumor-derived exosome behavior and assessed the quantification of exosomes in patient plasma as a biomarker for esophageal squamous cell carcinoma (ESCC). A CD63-GFP-expressing human ESCC cell line (TE2-CD63-GFP) was established by transfection, and mouse subcutaneous tumor models were prepared. Fluorescence imaging was performed on tumors and plasma exosomes harvested from mice. GFP-positive small vesicles were confirmed in the plasma obtained from TE2-CD63-GFP tumor-bearing mice. Patient plasma was collected at Chiba University Hospital (n=86). Exosomes were extracted from 100 µl of plasma and quantified by acetylcholinesterase (AChE) activity. The relationship between exosome quantification and patient clinical characteristics was assessed. Quantification of exosomes isolated from patient plasma revealed that esophageal cancer patients (n=66) expressed higher exosome levels than non-malignant patients (n=20) (P=0.0002). Although there was no correlation between tumor progression and exosome levels, exosome number was an independent prognostic marker, and low exosome levels predicted a poor prognosis (P=0.03). In conclusion, exosome levels may be useful as an independent prognostic factor for ESCC patients.
Jeanneau, Laurent; Faure, Pierre
2010-09-01
The quantitative multimolecular approach (QMA), based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM), has recently been developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) the comparison between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However, QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence of the Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. This highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
Hynstova, Veronika; Sterbova, Dagmar; Klejdus, Borivoj; Hedbavny, Josef; Huska, Dalibor; Adam, Vojtech
2018-01-30
In this study, 14 commercial products (dietary supplements) containing the alga Chlorella vulgaris and the cyanobacterium Spirulina platensis, originating from China and Japan, were analysed. A UV-vis spectrophotometric method was applied for rapid determination of chlorophylls, carotenoids and pheophytins (degradation products of chlorophylls). High-performance thin-layer chromatography (HPTLC) was used for effective separation of these compounds, and atomic absorption spectrometry for determination of heavy metals as indicators of environmental pollution. Based on the results obtained from the UV-vis spectrophotometric determination of photosynthetic pigments (chlorophylls and carotenoids), it was confirmed that Chlorella vulgaris contains more of all these pigments than the cyanobacterium Spirulina platensis. The compound with the fastest mobility identified in Chlorella vulgaris and Spirulina platensis using the HPTLC method was β-carotene. Spectral analysis and the standard calibration curve method were used for identification and quantification of the substances separated on the thin-layer chromatographic plate. Copper (Cu(2+), at 324.7 nm) and zinc (Zn(2+), at 213.9 nm) were quantified by flame atomic absorption spectrometry with air-acetylene flame atomization; cadmium (Cd(2+), at 228.8 nm), nickel (Ni(2+), at 232.0 nm) and lead (Pb(2+), at 283.3 nm) by electrothermal graphite furnace atomic absorption spectrometry; and mercury (Hg(2+), at 254 nm) by cold vapour atomic absorption spectrometry. Copyright © 2017 Elsevier B.V. All rights reserved.
Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D
2018-02-01
Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has emerged as a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distributions at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of isotope dilution (ID) for quantification by LA-ICP-MS has also been described, but it has so far been useful mainly for bulk analysis and not feasible for spatially resolved measurements. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform elemental quantification in different areas of bio-images. For the ID experiments, (194)Pt-enriched platinum was used. The methodology was validated by deposition of droplets of a natural Pt standard with a known amount of Pt onto the surface of a control tissue, where as little as 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
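Isotope dilution determines the analyte amount purely from a measured isotope ratio, the spike amount, and the isotopic abundances. A sketch using the natural 194Pt/195Pt abundances and a hypothetical enriched-spike composition:

```python
def idms_amount(n_spike, r_meas, ab_sp_194, ab_sp_195, ab_nat_194, ab_nat_195):
    """Amount of natural-composition analyte from the measured 194/195 signal ratio
    after adding n_spike units of isotopically enriched spike."""
    return n_spike * (ab_sp_194 - r_meas * ab_sp_195) / (r_meas * ab_nat_195 - ab_nat_194)

# Natural Pt abundances (194Pt, 195Pt) and a hypothetical 194Pt-enriched spike:
nat194, nat195 = 0.3286, 0.3378
sp194, sp195 = 0.957, 0.020

# Forward-simulate a blend of 2.0 units natural Pt with 1.0 unit of spike...
n_x, n_s = 2.0, 1.0
r = (n_x * nat194 + n_s * sp194) / (n_x * nat195 + n_s * sp195)

# ...then recover the natural amount from the measured ratio alone:
print(round(idms_amount(n_s, r, sp194, sp195, nat194, nat195), 3))  # -> 2.0
```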
Kim, Myeong Hee; Cha, Choong Hwan; An, Dongheui; Choi, Sung Eun; Oh, Heung Bum
2008-04-01
Hepatitis B virus (HBV) DNA quantification is necessary for starting and monitoring antiviral therapy in patients with chronic hepatitis B. This study was intended to assess the clinical performance of the Abbott RealTime HBV Quantification kit (Abbott Laboratories, USA). The performance was evaluated in terms of precision, linearity, detection sensitivity, cross-reactivity, and carry-over. A correlation with the Real-Q HBV Quantification kit (BioSewoom Inc., Korea) was also examined using serum samples from 64 patients diagnosed with chronic hepatitis B who underwent lamivudine therapy at Asan Medical Center. We verified the trueness of the system by comparing the outputs with the assigned values of the BBI panel (BBI Diagnostics, USA). Within-run and between-run coefficients of variation (CV) were 3.56-4.71% and 3.03-4.98%, respectively. Linearity was demonstrated over the range of 53 to 10(9) copies/mL, and the detection sensitivity was verified to be 51 copies/mL. No cross-reactivity with hepatitis C virus was observed. No cross-contamination occurred when negative and positive samples were placed alternately in a row. The kit showed a good correlation with the Real-Q HBV (r(2)=0.9609), and the test results for the BBI panel agreed well with the assigned values (r(2)=0.9933). The performance of the Abbott RealTime HBV Quantification kit was excellent; thus, it should be widely useful for starting and monitoring antiviral therapy in Korean patients with chronic hepatitis B.
Shen, L P; Sheridan, P; Cao, W W; Dailey, P J; Salazar-Gonzalez, J F; Breen, E C; Fahey, J L; Urdea, M S; Kolberg, J A
1998-06-01
Changes in the patterns of cytokine expression are thought to be of central importance in human infectious and inflammatory diseases. As such, there is a need for precise, reproducible assays for quantification of cytokine mRNA that are amenable to routine use in a clinical setting. In this report, we describe the design and performance of a branched DNA (bDNA) assay for the direct quantification of multiple cytokine mRNA levels in peripheral blood mononuclear cells (PBMCs). Oligonucleotide target probe sets were designed for several human cytokines, including TNFalpha, IL-2, IL-4, IL-6, IL-10, and IFNgamma. The bDNA assay yielded highly reproducible quantification of cytokine mRNAs, exhibited a broad linear dynamic range of over 3-log10, and showed a sensitivity sufficient to measure as few as 3000 molecules. The potential clinical utility of the bDNA assay was explored by measuring cytokine mRNA levels in PBMCs from healthy and immunocompromised individuals. Cytokine expression levels in PBMCs from healthy blood donors were found to remain relatively stable over a one-month period. Elevated levels of IFNgamma mRNA were detected in PBMCs from HIV-1 seropositive individuals, but no differences in mean levels of TNFalpha or IL-6 mRNA were detected between seropositive and seronegative individuals. By providing a reproducible method for quantification of low-abundance transcripts in clinical specimens, the bDNA assay may be useful for studies addressing the role of cytokine expression in disease.
Qian, Yiyun; Zhu, Zhenhua; Duan, Jin-Ao; Guo, Sheng; Shang, Erxin; Tao, Jinhua; Su, Shulan; Guo, Jianming
2017-01-15
A highly sensitive method using ultra-high-pressure liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap-MS) has been developed and validated for the simultaneous identification and quantification of ginkgolic acids, and for semi-quantification of their metabolites, in rat plasma. For the five selected ginkgolic acids, the method showed good linearity (r>0.9991), good intra- and inter-day precision (RSD<15%), and good accuracy (RE from -10.33% to 4.92%). Extraction recoveries, matrix effects and stabilities for rat plasma samples were within the required limits. The validated method was successfully applied to investigate the pharmacokinetics of the five ginkgolic acids in rat plasma after oral administration at three dose levels (900 mg/kg, 300 mg/kg and 100 mg/kg). Meanwhile, six metabolites of GA (15:1) and GA (17:1) were identified by comparison of MS data with reported values. Validation results in terms of linear ranges, precision and stability were established for semi-quantification of the metabolites. Curves of the relative changes of these metabolites during the metabolic process were constructed by plotting the peak area ratios of the metabolites to salicylic acid (internal standard, IS). Double peaks were observed in all 3 dose groups. Both the type of metabolite and the dose affected Tmax. Copyright © 2016 Elsevier B.V. All rights reserved.
Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J
2018-01-01
There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed, and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was performed via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text peer-reviewed articles in English involving female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology as direct or indirect measures, and the benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies for determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent of studies lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are improving sensitivity. However, the most common methods of quantification remain digital palpation and perineometry, techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by its inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.
An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe
2015-12-26
Comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights into endogenous proteolytic processing and its utility in disease diagnosis and prognosis. Along with the advances in MS instrumentation, a plethora of proteomics data analysis tools have been applied for direct use in peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we began by evaluating the results of several popular MS/MS database search engines, including MS-GF+, SEQUEST and MS-Align+, for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag approach and the newly developed informed quantification (IQ) approach, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from the MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and fewer missing values for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides correlating well with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage than AMT. Taken together, we propose an optimal informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT) for identification and label-free quantification for high-throughput, comprehensive and quantitative peptidomics analysis.
Pyschik, Marcelina; Klein-Hitpaß, Marcel; Girod, Sabrina; Winter, Martin; Nowak, Sascha
2017-02-01
In this study, an optimized method using capillary electrophoresis (CE) with a contactless conductivity detector (C4D) is presented for a new application field: the quantification of fluoride in commonly used lithium ion battery (LIB) electrolytes based on LiPF6 in organic carbonate solvents, and in ionic liquids (ILs) after contact with Li metal. Method development to find the right buffer and suitable CE conditions for the quantification of fluoride is described. The fluoride concentrations determined in different LIB electrolyte samples were compared to results from an ion-selective electrode (ISE). Relative standard deviations (RSDs) and recovery rates for fluoride were obtained with very high accuracy by both methods, and the fluoride concentrations in the LIB electrolytes agreed very well between the two methods. In addition, limit of detection (LOD) and limit of quantification (LOQ) values were determined for the CE method. The CE method was also applied to the quantification of fluoride in ILs. In the fresh IL sample, the fluoride concentration was below the LOD; in another IL sample that had been mixed with Li metal, the fluoride concentration could be quantified. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
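LOD/LOQ determination from a calibration curve is commonly done the ICH way (LOD = 3.3 σ/slope, LOQ = 10 σ/slope, with σ the residual standard deviation of the fit); a minimal sketch with made-up fluoride calibration data, not values from this study:

```python
def line_fit(xs, ys):
    """Ordinary least-squares slope/intercept plus residual standard deviation."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return slope, intercept, sigma

# Hypothetical calibration: fluoride concentration (ug/mL) vs. peak area
conc = [0.5, 1.0, 2.0, 5.0, 10.0]
area = [1.04, 2.01, 3.97, 10.05, 19.96]   # roughly slope 2 with small scatter
slope, intercept, sigma = line_fit(conc, area)
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(slope, lod, loq)
```

The same fit also supplies the sensitivity (slope) used when comparing CE against the ISE results.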
Samanipour, Saer; Dimitriou-Christidis, Petros; Gros, Jonas; Grange, Aureline; Samuel Arey, J
2015-01-02
Comprehensive two-dimensional gas chromatography (GC×GC) is used widely to separate and measure organic chemicals in complex mixtures. However, approaches to quantify analytes in real, complex samples have not been critically assessed. We quantified 7 PAHs in a certified diesel fuel using GC×GC coupled to a flame ionization detector (FID), and we quantified 11 target chlorinated hydrocarbons in a lake water extract using GC×GC with an electron capture detector (μECD), further confirmed qualitatively by GC×GC with an electron capture negative chemical ionization time-of-flight mass spectrometer (ENCI-TOFMS). Target analyte peak volumes were determined using several existing baseline correction algorithms and peak delineation algorithms. Analyte quantifications were conducted using external standards and also using standard additions, enabling us to diagnose matrix effects. We then applied several chemometric tests to these data. We find that the choices of baseline correction algorithm and peak delineation algorithm strongly influence the reproducibility of the analyte signal, the error of the calibration offset, the proportionality of the integrated signal response, and the accuracy of quantifications. Additionally, the choices of baseline correction and peak delineation algorithms are essential for correctly discriminating analyte signal from unresolved complex mixture signal, and this is the chief consideration for controlling matrix effects during quantification. The diagnostic approaches presented here provide guidance for analyte quantification using GC×GC. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
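The standard-additions quantification used here to diagnose matrix effects extrapolates a line of signal versus spiked concentration back to its x-intercept, whose magnitude is the native analyte concentration. A minimal sketch with illustrative numbers:

```python
def standard_additions(added, signal):
    """Fit signal vs. spiked concentration; |x-intercept| of the line is the
    analyte concentration already present in the sample (intercept/slope)."""
    n = len(added)
    mx, my = sum(added) / n, sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in added)
    sxy = sum((x - mx) * (y - my) for x, y in zip(added, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept / slope   # = original concentration c0

# Simulated additions: sample already holds 2.5 ng/mL, response factor 40
added = [0.0, 1.0, 2.0, 4.0]
signal = [40.0 * (2.5 + a) for a in added]
print(standard_additions(added, signal))  # recovers 2.5
```

Comparing this estimate against the external-standard result is exactly the matrix-effect diagnostic the abstract describes: if the matrix suppresses or enhances the response, the two numbers diverge.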
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR, using dilutions of an integration standard and samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
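The Poisson estimate from binomial positive/negative replicate data works like limiting-dilution analysis: the fraction of negative reactions estimates P(0 copies) = exp(-λ). A minimal sketch (the 42-replicate count matches the assay; the example split is illustrative):

```python
import math

def poisson_copies(n_negative, n_total):
    """Mean template copies per reaction from the fraction of negative
    replicates: P(0 copies) = exp(-lam)  =>  lam = -ln(neg/total)."""
    p0 = n_negative / n_total
    if p0 <= 0.0:
        raise ValueError("all replicates positive: above the quantifiable range")
    return -math.log(p0)

# Example: 21 of 42 Alu-HIV PCR replicates negative -> lam = ln 2 per reaction
lam = poisson_copies(21, 42)
print(round(lam, 4))  # 0.6931
```

This is why no standard dilution curve is needed: the estimate is absolute, and the binomial sampling error on the negative fraction directly yields the confidence interval mentioned in the abstract.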
Development and validation of an open source quantification tool for DSC-MRI studies.
Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J
2015-03-01
This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without having to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, so that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package, and the resulting perfusion parameters were compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, excellent agreement with the tool used as a gold standard was obtained (R^2>0.8, with values within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
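The Bland-Altman agreement check used in the validation can be sketched as follows (synthetic paired measurements, not study data): the bias is the mean paired difference and the 95% limits of agreement are bias ± 1.96 SD.

```python
def bland_altman(a, b):
    """Bias and 95% limits of agreement between two paired measurement sets."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical rCBV-like values from the plugin vs. a clinical package
plugin   = [1.02, 0.98, 1.10, 0.95, 1.05, 1.00, 0.99]
clinical = [1.00, 1.00, 1.08, 0.97, 1.04, 1.02, 0.98]
bias, lo, hi = bland_altman(plugin, clinical)
inside = sum(lo <= x - y <= hi for x, y in zip(plugin, clinical))
print(bias, lo, hi, inside)
```

Agreement in the paper's sense means the paired differences fall within these limits and the bias is close to zero.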
Targeted quantification of low ng/mL level proteins in human serum without immunoaffinity depletion
Shi, Tujin; Sun, Xuefei; Gao, Yuqian; Fillmore, Thomas L.; Schepmoes, Athena A.; Zhao, Rui; He, Jintang; Moore, Ronald J.; Kagan, Jacob; Rodland, Karin D.; Liu, Tao; Liu, Alvin Y.; Smith, Richard D.; Tang, Keqi; Camp, David G.; Qian, Wei-Jun
2013-01-01
We recently reported an antibody-free targeted protein quantification strategy, termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM), for achieving significantly enhanced sensitivity using selected reaction monitoring (SRM) mass spectrometry. By integrating PRISM with front-end IgY14 immunoaffinity depletion, we demonstrated sensitive detection of targeted proteins at 50–100 pg/mL levels in human blood plasma/serum. However, immunoaffinity depletion is often associated with undesired losses of target proteins of interest. Herein we report further evaluation of PRISM-SRM quantification of low-abundance serum proteins without immunoaffinity depletion. Limits of quantification (LOQ) at low ng/mL levels with a median coefficient of variation (CV) of ~12% were achieved for proteins spiked into human female serum. PRISM-SRM provided >100-fold improvement in the LOQ when compared to conventional LC-SRM measurements. PRISM-SRM was then applied to measure several low-abundance endogenous serum proteins, including prostate-specific antigen (PSA), in clinical prostate cancer patient sera. PRISM-SRM enabled confident detection of all target endogenous serum proteins except the low pg/mL-level cardiac troponin T. A correlation coefficient >0.99 was observed for PSA between the results from PRISM-SRM and immunoassays. Our results demonstrate that PRISM-SRM can successfully quantify low ng/mL proteins in human plasma or serum without depletion. We anticipate broad applications for PRISM-SRM quantification of low-abundance proteins in candidate biomarker verification and systems biology studies. PMID:23763644
Chaouachi, Maher; Alaya, Akram; Ali, Imen Ben Haj; Hafsa, Ahmed Ben; Nabi, Nesrine; Bérard, Aurélie; Romaniuk, Marcel; Skhiri, Fethia; Saïd, Khaled
2013-01-01
KEY MESSAGE: Here, we describe a newly developed quantitative real-time PCR method for the detection and quantification of a new specific endogenous reference gene used in GMO analysis. The key requirement of this study was the identification of a new reference gene for differentiating the four genomic sections of the genus Beta (Beta, Corollinae, Nanae and Procumbentes), suitable for quantification of genetically modified sugar beet (Beta vulgaris L.). A specific qualitative polymerase chain reaction (PCR) assay was designed to detect sugar beet by amplifying a region of the adenylate transporter (ant) gene only from species of genomic section I of the genus Beta (cultivated and wild relatives), with negative PCR results for 7 species of the 3 other sections, 8 related species and 20 non-sugar beet plants. The sensitivity of the assay was 15 haploid genome copies (HGC). A quantitative real-time polymerase chain reaction (QRT-PCR) assay was also performed, showing high linearity (R^2 > 0.994) over sugar beet standard concentrations ranging from 20,000 to 10 HGC of sugar beet DNA per PCR. The QRT-PCR assay described in this study was specific and more sensitive for sugar beet quantification than the validated test previously reported by the European Reference Laboratory. This assay is suitable for GMO quantification in routine analysis of a wide variety of matrices.
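Quantification over a standard range like 10-20,000 HGC rests on a Ct-versus-log10(copies) standard curve; a minimal sketch with an idealized 100%-efficient curve (Ct values are illustrative, not from the study):

```python
import math

def fit_standard_curve(copies, cts):
    """Least-squares fit Ct = slope*log10(copies) + intercept;
    amplification efficiency = 10**(-1/slope) - 1 (1.0 means 100%)."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cts)) / sxx
    intercept = my - slope * mx
    efficiency = 10.0 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def copies_from_ct(ct, slope, intercept):
    return 10.0 ** ((ct - intercept) / slope)

# Idealized dilution series: template doubles every cycle => slope -3.3219
standards = [10, 100, 1000, 10000, 20000]
slope_ideal = -1.0 / math.log10(2.0)
cts = [35.0 + slope_ideal * (math.log10(c) - 1.0) for c in standards]
slope, intercept, eff = fit_standard_curve(standards, cts)
print(round(slope, 4), round(eff, 3))
print(round(copies_from_ct(30.0, slope, intercept)))  # unknown at Ct 30 -> 320
```

Real curves deviate from the ideal slope; the fitted efficiency is the usual check before trusting copy-number estimates from the curve.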
Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc
2014-02-07
We present here a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex-matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by hollow fiber liquid phase microextraction (HF-LPME) using 1-octanol as the acceptor phase, which provides an enrichment (preconcentration) factor of 800. In addition, we developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly (13)C1-labeled compounds. To this end, the minimally labeled (13)C1-4-(3,6-dimethyl-3-heptyl)-phenol and (13)C1-t-octylphenol isomers were synthesized; they coelute with the natural compounds, which allows compensation of the matrix effect. Quantification was carried out using isotope pattern deconvolution (IPD), which yields the concentration of both compounds without the need to build any calibration graph, reducing the total analysis time. The combination of the extraction and determination techniques allowed us to validate, for the first time, an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng mL(-1) and recoveries within 97-109%. Given the low cost of HF-LPME and the short total analysis time, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.
Meisser Redeuil, Karine; Longet, Karin; Bénet, Sylvie; Munari, Caroline; Campos-Giménez, Esther
2015-11-27
This manuscript reports a validated analytical approach for the quantification of 21 water-soluble vitamins and their main circulating forms in human plasma. Isotope dilution-based sample preparation consisted of protein precipitation using acidic methanol enriched with stable isotope labelled internal standards. Separation was achieved by reversed-phase liquid chromatography, and detection was performed by tandem mass spectrometry in positive electrospray ionization mode. Instrumental lower limits of detection and quantification reached <0.1-10 nM and 0.2-25 nM, respectively. Commercially available pooled human plasma was used to build matrix-matched calibration curves ranging over 2-500, 5-1250, 20-5000 or 150-37500 nM depending on the analyte. The overall performance of the method was adequate, with intra- and inter-day precisions of 2.8-20.9% and 5.2-20.0%, respectively, and accuracies averaging 91-108%. Recovery experiments were also performed, with recoveries averaging 82%. This analytical approach was then applied to the quantification of circulating water-soluble vitamins in single-donor human plasma samples. The present report provides a sensitive and reliable approach for the quantification of water-soluble vitamins and their main circulating forms in human plasma. In the future, this analytical approach will support comprehensive assessments of water-soluble vitamin nutritional status and bioavailability studies in humans. Copyright © 2015 Elsevier B.V. All rights reserved.
QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.
Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus
2018-03-01
Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of molecules at low concentration with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improved accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of the exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning that the saturation amplitude is smaller than the exchange rate (γB1 < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles by which to execute the experiments and perform the numerical evaluation. The proposed methodology was evaluated in the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data from the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply to smaller-shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
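The refined QUESP equations themselves are beyond a short sketch, but the basic idea of extracting an exchange rate from saturation-power-dependent data can be illustrated with the simpler steady-state linearization (the "omega-plot" form, where 1/PTR is linear in 1/ω1²); all parameter values below are illustrative, not from the study:

```python
def omega_plot_rate(omega1_list, ptr_list):
    """Estimate exchange rate k from steady-state QUESP-type data.

    Model: PTR = (f*k/R1w) * omega1^2/(omega1^2 + k^2), hence
    1/PTR = R1w/(f*k) + (R1w*k/f) * (1/omega1^2)  -- linear in 1/omega1^2,
    so k = sqrt(slope/intercept) of that line."""
    xs = [1.0 / w ** 2 for w in omega1_list]
    ys = [1.0 / p for p in ptr_list]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sxx
    intercept = my - slope * mx
    return (slope / intercept) ** 0.5

# Simulate saturation amplitudes gamma*B1 of 1-10 uT at k = 2000 /s
f, k, r1w = 1e-3, 2000.0, 0.33                       # pool fraction, rate, water R1
omega1 = [267.5 * b1 for b1 in (1, 2, 4, 6, 8, 10)]  # rad/s per uT
ptr = [f * k / r1w * w ** 2 / (w ** 2 + k ** 2) for w in omega1]
print(round(omega_plot_rate(omega1, ptr), 1))  # recovers 2000.0
```

The paper's contribution is precisely that this idealized steady-state picture breaks down for nonequilibrium initial conditions and weak labeling (γB1 < k), which the refined QUESP/QUEST expressions correct.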
Liu, Dian; Steingoetter, Andreas; Curcic, Jelena; Kozerke, Sebastian
2018-01-01
To investigate and exploit the effect of intravoxel off-resonance compartments in the triple-echo steady-state (TESS) sequence without fat suppression for T2 mapping, and to leverage the results for fat fraction quantification. In multicompartment tissue, where at least one compartment is excited off-resonance, the total signal exhibits periodic modulations as a function of echo time (TE). Simulated multicompartment TESS signals were synthesized at various TEs. Fat emulsion phantoms were prepared and scanned at the same TE combinations using TESS. In vivo knee data were obtained with TESS to validate the simulations. The multicompartment effect was exploited for fat fraction quantification in the stomach by acquiring TESS signals at two TE combinations. Simulated and measured multicompartment signal intensities were in good agreement. Multicompartment effects caused erroneous T2 offsets, even at low water-fat ratios. The choice of TE caused T2 variations of as much as 28% in cartilage. The feasibility of fat fraction quantification to monitor the decrease of fat content in the stomach during digestion is demonstrated. Intravoxel off-resonance compartments are a confounding factor for T2 quantification using TESS, causing errors that depend on the TE. At the same time, off-resonance effects may allow for efficient fat fraction mapping using steady-state imaging. Magn Reson Med 79:423-429, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
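The TE-periodic modulation of a water-plus-fat voxel, and the two-point (in-phase/opposed-phase) fat-fraction estimate it enables, can be sketched with an idealized signal model (no relaxation, single fat peak; values are illustrative):

```python
import cmath
import math

CHEM_SHIFT_HZ = 3.4e-6 * 42.58e6 * 3.0   # ~434 Hz water-fat shift at 3 T

def voxel_signal(te, water, fat, dfreq=CHEM_SHIFT_HZ):
    """Magnitude of a two-compartment voxel: the fat signal accrues
    phase exp(i*2*pi*dfreq*TE) relative to water, so |S| oscillates in TE."""
    return abs(water + fat * cmath.exp(2j * math.pi * dfreq * te))

def two_point_fat_fraction(s_in, s_op):
    """Fat fraction from in-phase (W+F) and opposed-phase (|W-F|) magnitudes,
    assuming the water signal dominates."""
    return (s_in - s_op) / (2.0 * s_in)

w, f = 0.8, 0.2                      # water and fat signal fractions
te_in = 1.0 / CHEM_SHIFT_HZ          # fat back in phase with water
te_op = 0.5 / CHEM_SHIFT_HZ          # fat opposed to water
ff = two_point_fat_fraction(voxel_signal(te_in, w, f), voxel_signal(te_op, w, f))
print(round(ff, 3))  # 0.2
```

Evaluating `voxel_signal` at intermediate TEs shows the periodic modulation the abstract describes; sampling it at badly chosen TEs is exactly what biases the apparent T2.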
van Hamersvelt, Robbert W; Willemink, Martin J; de Jong, Pim A; Milles, Julien; Vlassenbroek, Alain; Schilham, Arnold M R; Leiner, Tim
2017-09-01
The aim of this study was to evaluate the feasibility and accuracy of dual-layer spectral detector CT (SDCT) for the quantification of clinically encountered gadolinium concentrations. The cardiac chamber of an anthropomorphic thoracic phantom was equipped with 14 tubular inserts containing gadolinium concentrations ranging from 0 to 26.3 mg/mL (0.0, 0.1, 0.2, 0.4, 0.5, 1.0, 2.0, 3.0, 4.0, 5.1, 10.6, 15.7, 20.7 and 26.3 mg/mL). Images were acquired using a novel 64-detector-row SDCT system at 120 and 140 kVp. Acquisitions were repeated five times to assess reproducibility. Regions of interest (ROIs) were drawn on three slices per insert. A spectral plot was extracted for every ROI, and mean attenuation profiles were fitted to known attenuation profiles of water and pure gadolinium using in-house-developed software to calculate gadolinium concentrations. At both 120 and 140 kVp, excellent correlations between scan repetitions and between true and measured gadolinium concentrations were found (R > 0.99, P < 0.001; ICCs > 0.99, CI 0.99-1.00). Relative mean measurement errors stayed below 10% down to a true gadolinium concentration of 2.0 mg/mL at 120 kVp, and below 5% down to 1.0 mg/mL at 140 kVp. SDCT allows for accurate quantification of gadolinium at both 120 and 140 kVp; the lowest measurement errors were found for 140 kVp acquisitions. • Gadolinium quantification may be useful in patients with contraindication to iodine. • Dual-layer spectral detector CT allows for overall accurate quantification of gadolinium. • Interscan variability of gadolinium quantification using SDCT material decomposition is excellent.
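Fitting a measured attenuation profile to known water and gadolinium basis profiles amounts to a linear least-squares problem; a toy sketch with made-up basis profiles (the real software fits calibrated spectral attenuation curves):

```python
def decompose(measured, basis_water, basis_gd):
    """Solve measured ~= a*water + c*gd by 2x2 normal equations;
    returns (a, c), where c is the Gd concentration in basis units."""
    sww = sum(w * w for w in basis_water)
    sgg = sum(g * g for g in basis_gd)
    swg = sum(w * g for w, g in zip(basis_water, basis_gd))
    smw = sum(m * w for m, w in zip(measured, basis_water))
    smg = sum(m * g for m, g in zip(measured, basis_gd))
    det = sww * sgg - swg * swg
    a = (smw * sgg - smg * swg) / det
    c = (sww * smg - swg * smw) / det
    return a, c

# Hypothetical attenuation profiles over 5 virtual energy bins (HU-like units)
water = [60.0, 45.0, 35.0, 28.0, 24.0]
gd_1mg = [90.0, 55.0, 30.0, 18.0, 12.0]              # per mg/mL, illustrative
blend = [w + 5.1 * g for w, g in zip(water, gd_1mg)]  # mimic the 5.1 mg/mL insert
a, conc = decompose(blend, water, gd_1mg)
print(round(a, 3), round(conc, 3))  # ~1.0, ~5.1
```

With noise-free synthetic data the decomposition is exact; the phantom experiment quantifies how noise and spectral overlap degrade it at low concentrations.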
Technical advances in proteomics: new developments in data-independent acquisition.
Hu, Alex; Noble, William S; Wolf-Yadlin, Alejandro
2016-01-01
The ultimate aim of proteomics is to fully identify and quantify the entire complement of proteins and post-translational modifications in biological samples of interest. For the last 15 years, liquid chromatography-tandem mass spectrometry (LC-MS/MS) in data-dependent acquisition (DDA) mode has been the standard for proteomics when sampling breadth and discovery were the main objectives; multiple reaction monitoring (MRM) LC-MS/MS has been the standard for targeted proteomics when precise quantification, reproducibility, and validation were the main objectives. Recently, improvements in mass spectrometer design and bioinformatics algorithms have resulted in the rediscovery and development of another sampling method: data-independent acquisition (DIA). DIA comprehensively and repeatedly samples every peptide in a protein digest, producing a complex set of mass spectra that is difficult to interpret without external spectral libraries. Currently, DIA approaches the identification breadth of DDA while achieving the reproducible quantification characteristic of MRM or its newest version, parallel reaction monitoring (PRM). In comparative de novo identification and quantification studies in human cell lysates, DIA identified up to 89% of the proteins detected in a comparable DDA experiment while providing reproducible quantification of over 85% of them. DIA analysis aided by spectral libraries derived from prior DIA experiments or auxiliary DDA data produces identification and quantification as reproducible and precise as that achieved by MRM/PRM, except on low-abundance peptides that are obscured by stronger signals. DIA is still a work in progress toward the goal of sensitive, reproducible, and precise quantification without external spectral libraries. New software tools applied to DIA analysis have to deal with deconvolution of complex spectra as well as proper filtering of false positives and false negatives. However, the future outlook is positive, and various researchers are working on novel bioinformatics techniques to address these issues and increase the reproducibility, fidelity, and identification breadth of DIA.
Stillhart, Cordula; Kuentz, Martin
2012-02-05
Self-emulsifying drug delivery systems (SEDDS) are complex mixtures in which drug quantification can become a challenging task. Thus, a general need exists for novel analytical methods, and a particular interest lies in techniques with the potential for process monitoring. This article compares Raman spectroscopy with high-resolution ultrasonic resonator technology (URT) for drug quantification in SEDDS. The model drugs fenofibrate, indomethacin, and probucol were quantitatively assayed in different self-emulsifying formulations. We measured ultrasound velocity and attenuation in the bulk formulation containing drug at different concentrations. The formulations were also studied by Raman spectroscopy. We used both an in-line immersion probe for the bulk formulation and a multi-fiber sensor for measuring through hard-gelatin capsules that were filled with SEDDS. Each method was assessed by calculating the relative standard error of prediction (RSEP) as well as the limit of quantification (LOQ) and the mean recovery. Raman spectroscopy led to excellent calibration models for the bulk formulation as well as the capsules. The RSEP depended on the SEDDS type, with values of 1.5-3.8%, while the LOQ was between 0.04 and 0.35% (w/w) for drug quantification in the bulk. Similarly, the analysis of the capsules led to RSEP of 1.9-6.5% and LOQ of 0.01-0.41% (w/w). On the other hand, ultrasound attenuation resulted in RSEP of 2.3-4.4% and LOQ of 0.1-0.6% (w/w). Moreover, ultrasound velocity provided an interesting analytical response in cases where the drug strongly affected the density or compressibility of the SEDDS. We conclude that ultrasonic resonator technology and Raman spectroscopy constitute suitable methods for drug quantification in SEDDS, which is promising for their use as process analytical technologies. Copyright © 2011 Elsevier B.V. All rights reserved.
Coelho-Lima, Jose; Mohammed, Ashfaq; Cormack, Suzanne; Jones, Samuel; Das, Rajiv; Egred, Mohaned; Panahi, Pedram; Ali, Simi; Spyridopoulos, Ioakim
2018-06-11
Cardiac-enriched micro ribonucleic acids (miRNAs) are released into the circulation following ST-elevation myocardial infarction (STEMI). The lack of standardized approaches for reverse transcription quantitative real-time polymerase chain reaction (RT-qPCR) data normalization and the presence of RT-qPCR inhibitors (e.g. heparin) in patient blood samples have prevented reproducible miRNA quantification in this cohort and the subsequent translation of these biomarkers to clinical practice. Using an RT-qPCR miRNA screening platform, we identified and validated an endogenous circulating miRNA as a normalization control. In addition, we assessed the effects of in vivo and in vitro administration of anticoagulant drugs (heparin and bivalirudin) on three RT-qPCR normalization strategies (global miRNA mean, an exogenous spike-in control [cel-miR-39] and an endogenous miRNA control). Finally, we evaluated the effect of heparin and its in vitro inhibition with heparinase on the quantification of cardiac-enriched miRNAs in STEMI patients. miR-425-5p was validated as an endogenous miRNA control. Heparin administration in vitro and in vivo inhibited all RT-qPCR normalization strategies. In contrast, bivalirudin had no effect on cel-miR-39 or miR-425-5p quantification. In vitro treatment of RNA samples with 0.3 U of heparinase overcame heparin-induced overestimation of cardiac-enriched miRNA levels and improved their correlation with high-sensitivity troponin T. miRNA quantification in STEMI patients receiving heparin is jeopardized by heparin's effect on all RT-qPCR normalization approaches. Use of samples from bivalirudin-treated patients or in vitro treatment of heparin-contaminated samples with heparinase are suitable alternatives for miRNA quantification in this cohort. Finally, we reinforce the evidence that cardiac-enriched miRNA levels early after myocardial reperfusion reflect the severity of cardiac injury. Schattauer GmbH Stuttgart.
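Normalization to an endogenous control such as miR-425-5p follows the usual comparative 2^-ΔCt form; a minimal sketch (Ct values are illustrative, not from the study):

```python
def relative_level(ct_target, ct_reference):
    """2**-dCt relative abundance of a target miRNA vs. an endogenous control.
    Lower Ct means more template, so a target 5 cycles later than the
    reference is ~32-fold less abundant."""
    return 2.0 ** -(ct_target - ct_reference)

# Hypothetical cardiac miRNA at Ct 25 vs. miR-425-5p reference at Ct 20:
print(relative_level(25.0, 20.0))   # 0.03125 = 2**-5
```

This also makes the heparin problem concrete: an inhibitor that shifts the reference and target Ct values unequally distorts ΔCt, and hence the reported fold level, which is why heparinase treatment or bivalirudin samples are needed.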
Linscheid, Michael W
2018-03-30
To understand biological processes, not only reliable identification but also quantification of the constituents involved plays a pivotal role. This is especially true for the proteome: protein quantification must follow protein identification, since sometimes minute changes in abundance tell the real tale. To obtain quantitative data, many sophisticated strategies using electrospray and MALDI mass spectrometry (MS) have been developed in recent years. All of them have advantages and limitations. Several years ago, we started to work on strategies that are in principle capable of overcoming some of these limits. The fundamental idea is to use elemental signals as a measure of quantities. We began by replacing the radioactive 32P with the "cold" natural 31P to quantify modified nucleotides and phosphorylated peptides and proteins, and later used tagging strategies for the more general quantification of proteins. To do this, we introduced inductively coupled plasma mass spectrometry (ICP-MS) into the bioanalytical workflows, allowing not only reliable and sensitive detection but also absolute quantification based on isotope dilution measurements using poly-isotopic elements. The detection capability of ICP-MS becomes particularly attractive with heavy metals. The covalently bound protein tags developed in our group are based on the well-known DOTA chelate complex (1,4,7,10-tetraazacyclododecane-N,N',N″,N‴-tetraacetic acid) carrying lanthanoid ions as the metal core. In this review, I outline the development of this mutual assistance between molecular and elemental mass spectrometry and discuss the scope and limitations, particularly for peptide and protein quantification. The lanthanoid tags provide low detection limits and offer multiplexing capabilities owing to the number of very similar lanthanoids and their isotopes. With isotope dilution comes previously unattainable accuracy. 
Separation techniques such as electrophoresis and HPLC were combined with only slightly adapted workflows already in use for quantification in bioanalysis. Imaging mass spectrometry (MSI) with MALDI and laser ablation ICP-MS has complemented the range of applications in recent years. © 2018 Wiley Periodicals, Inc.
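The isotope dilution principle behind the absolute measurements can be written as a single amount-ratio equation. A sketch under the usual simplifying conventions (amounts expressed in moles of the reference isotope; names illustrative):

```python
def isotope_dilution_amount(n_spike, r_spike, r_sample, r_mix):
    """Moles of the reference isotope contributed by the sample.

    r_spike, r_sample, r_mix: (enriched isotope)/(reference isotope)
    amount ratios of the pure spike, the unspiked sample, and the
    measured blend. Follows from isotope mass balance in the mixture.
    """
    return n_spike * (r_spike - r_mix) / (r_mix - r_sample)
```

The closer the measured blend ratio lies between the spike and sample ratios, the better conditioned the calculation; this is why spiking levels are matched to the expected analyte amount.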
Targeted quantification of low ng/mL level proteins in human serum without immunoaffinity depletion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Tujin; Sun, Xuefei; Gao, Yuqian
2013-07-05
We recently reported an antibody-free targeted protein quantification strategy, termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM), for achieving significantly enhanced sensitivity with selected reaction monitoring (SRM) mass spectrometry. By integrating PRISM with front-end IgY14 immunoaffinity depletion, sensitive detection of targeted proteins at 50-100 pg/mL levels in human blood plasma/serum was demonstrated. However, immunoaffinity depletion is often associated with undesired losses of the target proteins of interest. Herein we report further evaluation of PRISM-SRM quantification of low-abundance serum proteins without immunoaffinity depletion, as well as the multiplexing potential of this technique. Limits of quantification (LOQs) at low ng/mL levels with a median CV of ~12% were achieved for proteins spiked into human female serum using as little as 2 µL of serum. PRISM-SRM provided up to ~1000-fold improvement in the LOQ compared to conventional SRM measurements. The multiplexing capability of PRISM-SRM was also evaluated using two sets of serum samples with 6 and 21 target peptides spiked at low attomole/µL levels. The results from SRM measurements of pooled or post-concatenated samples were comparable to those obtained from individual peptide fractions in terms of signal-to-noise ratios and SRM peak-area ratios of light to heavy peptides. PRISM-SRM was applied to measure several ng/mL-level endogenous plasma proteins, including prostate-specific antigen, in clinical patient sera, where correlation coefficients > 0.99 were observed between the results from PRISM-SRM and ELISA assays. Our results demonstrate that PRISM-SRM can be successfully used for the quantification of low-abundance endogenous proteins in highly complex samples. Moderate throughput (50 samples/week) can be achieved by applying the post-concatenation or fraction-multiplexing strategies. 
We anticipate broad applications for targeted PRISM-SRM quantification of low-abundance cellular proteins in systems biology studies as well as of candidate biomarkers in biofluids.
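The light-to-heavy peak-area ratios mentioned above translate directly into concentrations once a known amount of heavy (stable-isotope-labeled) peptide is spiked in. A minimal sketch, assuming equal response factors for the co-eluting light and heavy forms (the standard SRM assumption; names illustrative):

```python
def srm_concentration(area_light, area_heavy, heavy_conc):
    """Endogenous (light) peptide concentration from an SRM ratio.

    heavy_conc is the known spiked concentration of the labeled
    internal standard; light and heavy forms differ only in mass,
    so their signal responses are assumed identical.
    """
    return heavy_conc * area_light / area_heavy
```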
Franquesa, Marcella; Hoogduijn, Martin J.; Ripoll, Elia; Luk, Franka; Salih, Mahdi; Betjes, Michiel G. H.; Torras, Juan; Baan, Carla C.; Grinyó, Josep M.; Merino, Ana Maria
2014-01-01
The research field on extracellular vesicles (EV) has rapidly expanded in recent years due to the therapeutic potential of EV. Human adipose tissue-derived mesenchymal stem cells (ASC) may be a suitable source of therapeutic EV. A major limitation in the field is the lack of standardization of the challenging techniques used to isolate and characterize EV. The aim of our study was to incorporate new controls for the detection and quantification of EV derived from ASC and to analyze the applicability and limitations of the available techniques. ASC were cultured in medium supplemented with 5% vesicle-free fetal bovine serum. The EV were isolated from conditioned medium by differential centrifugation with size filtration (0.2 μm). As a control, non-conditioned culture medium was used (control medium). To detect EV, electron microscopy, conventional flow cytometry, and western blot were used. The EV were quantified by total protein quantification, ExoELISA immunoassay, and Nanosight. Cytokines and growth factors in the EV samples were measured by a multiplex bead array kit. The EV were detected by electron microscopy. Total protein measurement was not useful for quantifying EV, as the control medium showed similar protein content to the EV samples. The ExoELISA kits had technical problems, and it was not possible to quantify the concentration of exosomes in the samples. The use of Nanosight enabled quantification and size determination of the EV. It is, however, not possible to distinguish protein aggregates from EV with this method. The technologies for quantification and characterization of EV need to be improved. In addition, we detected protein contaminants in the EV samples, which make it difficult to determine the real effect of EV in experimental models. It will be crucial in the future to optimize existing methods and design novel ones for the purification and characterization of EV. PMID:25374572
Lombard-Banek, Camille; Reddy, Sushma; Moody, Sally A; Nemes, Peter
2016-08-01
Quantification of protein expression in single cells promises to advance a systems-level understanding of normal development. Using a bottom-up proteomic workflow and multiplexed quantification by tandem mass tags, we recently demonstrated relative quantification between single embryonic cells (blastomeres) in the frog (Xenopus laevis) embryo. In this study, we minimize derivatization steps to enhance analytical sensitivity and use label-free quantification (LFQ) for single Xenopus cells. The technology builds on a custom-designed capillary electrophoresis microflow-electrospray ionization high-resolution mass spectrometry platform and LFQ by MaxLFQ (MaxQuant). By judiciously tailoring performance to peptide separation, ionization, and data-dependent acquisition, we demonstrate an ∼75-amol (∼11 nM) lower limit of detection and quantification for proteins in complex cell digests. The platform enabled the identification of 438 nonredundant protein groups by measuring 16 ng of protein digest, or <0.2% of the total protein contained in a blastomere of the 16-cell embryo. LFQ intensity was validated as a quantitative proxy for protein abundance. Correlation analysis was performed to compare protein quantities between the embryo and n = 3 different single D11 blastomeres, which are fated to develop into the nervous system. A total of 335 nonredundant protein groups were quantified across the single D11 cells, spanning a 4-log-order concentration range. LFQ and correlation analysis detected the expected proteomic differences between the whole embryo and blastomeres, and also found translational differences between individual D11 cells. LFQ on single cells raises exciting possibilities for studying gene expression in other cells and models to help better understand cell processes at a systems biology level. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
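The correlation analysis between blastomeres rests on comparing per-protein LFQ intensities across samples. A minimal Pearson correlation over (typically log-transformed) intensity vectors, independent of any particular software:

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between two intensity vectors
    (e.g. log LFQ intensities of shared protein groups in two cells)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

In practice only protein groups quantified in both samples enter the calculation, so missing values must be filtered first.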
Motion-aware stroke volume quantification in 4D PC-MRI data of the human aorta.
Köhler, Benjamin; Preim, Uta; Grothoff, Matthias; Gutberlet, Matthias; Fischbach, Katharina; Preim, Bernhard
2016-02-01
4D PC-MRI enables the noninvasive measurement of time-resolved, three-dimensional blood flow data that allow quantification of the hemodynamics. Stroke volumes are essential to assess the cardiac function and evolution of different cardiovascular diseases. The calculation depends on the wall position and vessel orientation, which both change during the cardiac cycle due to the heart muscle contraction and the pumped blood. However, current systems for the quantitative 4D PC-MRI data analysis neglect the dynamic character and instead employ a static 3D vessel approximation. We quantify differences between stroke volumes in the aorta obtained with and without consideration of its dynamics. We describe a method that uses the approximating 3D segmentation to automatically initialize segmentation algorithms that require regions inside and outside the vessel for each temporal position. This enables the use of graph cuts to obtain 4D segmentations, extract vessel surfaces including centerlines for each temporal position and derive motion information. The stroke volume quantification is compared using measuring planes in static (3D) vessels, planes with fixed angulation inside dynamic vessels (this corresponds to the common 2D PC-MRI) and moving planes inside dynamic vessels. Seven datasets with different pathologies such as aneurysms and coarctations were evaluated in close collaboration with radiologists. Compared to the experts' manual stroke volume estimations, motion-aware quantification performs, on average, 1.57% better than calculations without motion consideration. The mean difference between stroke volumes obtained with the different methods is 7.82%. Automatically obtained 4D segmentations overlap by 85.75% with manually generated ones. Incorporating motion information in the stroke volume quantification yields slight but not statistically significant improvements. 
The presented method is feasible for the clinical routine, since computation times are low and essential parts run fully automatically. The 4D segmentations can be used for other algorithms as well. The simultaneous visualization and quantification may support the understanding and interpretation of cardiac blood flow.
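However the measuring plane is placed, the stroke volume itself is the flow rate through that plane integrated over one cardiac cycle. A sketch under the assumption that per-frame flow rates (mean through-plane velocity times lumen area) have already been extracted from the 4D PC-MRI data:

```python
def stroke_volume(flow_rates_ml_per_s, dt_s):
    """Stroke volume (mL) from per-frame flow rates Q_i (mL/s) sampled
    at fixed temporal resolution dt_s over one cardiac cycle
    (rectangle-rule integration of Q over time)."""
    return sum(q * dt_s for q in flow_rates_ml_per_s)
```

Motion-aware quantification changes the inputs to this sum: with a dynamic segmentation, both the lumen area and the plane orientation (and hence each Q_i) are re-derived per temporal position instead of being fixed.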
NASA Astrophysics Data System (ADS)
Buongiorno, J.; Lloyd, K. G.; Shumaker, A.; Schippers, A.; Webster, G.; Weightman, A.; Turner, S.
2015-12-01
Nearly 75% of the Earth's surface is covered by marine sediment that is home to an estimated 2.9 × 10^29 microbial cells. A substantial impediment to understanding the abundance and distribution of cells within marine sediment is the lack of a consistent and reliable method for their taxon-specific quantification. Catalyzed reporter deposition fluorescent in situ hybridization (CARD-FISH) provides taxon-specific enumeration, but this process requires passing a large enzyme through cell membranes, decreasing its precision relative to general cell counts using a small DNA stain. In 2015, Yamaguchi et al. developed FISH with hybridization chain reaction (FISH-HCR) as an in situ whole-cell detection method for environmental microorganisms. FISH-HCR amplifies the fluorescent signal, as does CARD-FISH, but it allows for milder cell permeabilization methods that might prevent yield loss. To compare FISH-HCR to CARD-FISH, we examined bacterial and archaeal cell counts within two sediment cores, Lille Belt (~78 meters deep) and Landsort Deep (90 meters deep), which were retrieved from the Baltic Sea Basin during IODP Expedition 347. Preliminary analysis shows that CARD-FISH counts are below the quantification limit for most depths across both cores. By contrast, quantification of cells was possible with FISH-HCR at all examined depths. When quantification with CARD-FISH was above the limit of detection, counts with FISH-HCR were up to 11-fold higher for Bacteria and 3-fold higher for Archaea from the same sediment sample. Further, FISH-HCR counts closely follow the trends of shipboard counts, indicating that FISH-HCR may better reflect the cellular abundance within marine sediment than other quantification methods, including qPCR. Using FISH-HCR, we found that archaeal cell counts were on average greater than bacterial cell counts, but within the same order of magnitude.
Esquinas, Pedro L; Uribe, Carlos F; Gonzalez, M; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna
2017-07-20
The main applications of 188Re in radionuclide therapies include trans-arterial liver radioembolization and palliation of painful bone metastases. In order to optimize 188Re therapies, the accurate determination of the radiation dose delivered to tumors and organs at risk is required. Single photon emission computed tomography (SPECT) can be used to perform such dosimetry calculations. However, the accuracy of dosimetry estimates strongly depends on the accuracy of activity quantification in 188Re images. In this study, we performed a series of phantom experiments to investigate the accuracy of activity quantification for 188Re SPECT using high-energy and medium-energy collimators. Objects of different shapes and sizes were scanned in air, non-radioactive water (cold water) and water with activity (hot water). The ordered subset expectation maximization algorithm with clinically available corrections (CT-based attenuation, triple-energy window (TEW) scatter, and resolution recovery) was used. For high activities, dead-time corrections were applied. The accuracy of activity quantification was evaluated using the ratio of the reconstructed activity in each object to the object's true activity. Each object's activity was determined with three segmentation methods: a 1% fixed threshold (for cold background), a 40% fixed threshold and a CT-based segmentation. Additionally, the activity recovered in the entire phantom, as well as the average activity concentration of the phantom background, were compared to their true values. Finally, Monte-Carlo simulations of a commercial γ-camera were performed to investigate the accuracy of the TEW method. Good quantification accuracy (errors <10%) was achieved for the entire phantom, the hot-background activity concentration and for objects in cold background segmented with a 1% threshold.
However, the accuracy of activity quantification for objects segmented with 40% threshold or CT-based methods decreased (errors >15%), mostly due to partial-volume effects. The Monte-Carlo simulations confirmed that TEW-scatter correction applied to 188 Re, although practical, yields only approximate estimates of the true scatter.
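The TEW scatter correction evaluated above estimates scatter in the photopeak window from two narrow windows flanking it. A sketch of the standard trapezoidal estimator (window widths in keV; names illustrative):

```python
def tew_scatter(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Triple-energy-window scatter estimate for photopeak counts:
    trapezoidal interpolation between the counts per keV in the lower
    and upper sub-windows, scaled to the photopeak window width."""
    return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

def primary_counts(c_peak, scatter):
    """Scatter-corrected photopeak counts, clamped at zero."""
    return max(c_peak - scatter, 0.0)
```

For 188Re, with its complex emission spectrum, this linear interpolation is only an approximation of the true scatter component, which is consistent with the Monte-Carlo finding above.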
Mannheim, Julia G; Schmid, Andreas M; Pichler, Bernd J
2017-12-01
Non-invasive in vivo positron emission tomography (PET) provides high detection sensitivity in the nano- to picomolar range and, among other advantages, the possibility of absolutely quantifying the acquired data. The present study focuses on the comparison of transmission data acquired with an X-ray computed tomography (CT) scanner or a Co-57 source for the Inveon small animal PET scanner (Siemens Healthcare, Knoxville, TN, USA), and determines their influence on quantification accuracy and the partial volume effect (PVE). A special focus was the impact of the performed calibration on quantification accuracy. Phantom measurements were carried out to determine the quantification accuracy, the influence of the object size on quantification, and the PVE for different sphere sizes, along the field of view and for different contrast ratios. An influence of the emission activity on the Co-57 transmission measurements was discovered (deviations of measured to true activity of up to 24.06%), whereas no influence of the emission activity on the CT attenuation correction was identified (deviations of measured to true activity <3%). The quantification accuracy was substantially influenced by the applied calibration factor and by the object size. The PVE demonstrated a dependency on the sphere size, the position within the field of view, the reconstruction and correction algorithms, and the count statistics. Depending on the reconstruction algorithm, only ∼30-40% of the true activity within a small sphere could be resolved. The iterative 3D reconstruction algorithms yielded substantially increased recovery values compared to the analytical and 2D iterative reconstruction algorithms (up to 70.46% and 80.82% recovery for the smallest and largest sphere, respectively). The transmission measurement (CT or Co-57 source) used to correct for attenuation did not severely influence the PVE. 
The analysis of the quantification accuracy and the PVE revealed an influence of the object size, the reconstruction algorithm and the applied corrections. In particular, the influence of the emission activity on transmission measurements performed with a Co-57 source must be considered. To obtain comparable results, including across different scanner configurations, standardization of the acquisition (imaging parameters as well as applied reconstruction and correction protocols) is necessary.
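Recovery values like the ∼30-40% quoted above are conventionally expressed as recovery coefficients, which can in turn be used for a first-order partial-volume correction of measured activities. A hedged sketch (the correction assumes a phantom-derived coefficient for a matching object size and contrast):

```python
def recovery_coefficient(measured_activity, true_activity):
    """Fraction of the true activity recovered in a small object,
    as obtained from a phantom measurement."""
    return measured_activity / true_activity

def pve_corrected(measured_activity, rc):
    """First-order partial-volume correction: divide the measured
    activity by the recovery coefficient of a comparable object."""
    return measured_activity / rc
```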
Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md.; Singh Rawat, Ajay Kumar; Srivastava, Sharad
2017-01-01
Background: Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to its high content of colchicine alkaloids. Objective: This study aimed to develop an easy, cheap, precise, and accurate validated high-performance thin-layer chromatography (HPTLC) method for the simultaneous quantification of the bioactive alkaloids colchicine and gloriosine in G. superba L. and to identify its elite chemotype(s) from the Sikkim Himalayas (India). Methods: The HPTLC method was developed using a mobile phase of chloroform:acetone:diethylamine (5:4:1) at a λmax of 350 nm. Results: Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) varies from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Linearity of the method was obtained in the concentration range of 100–400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recoveries of 97.79% ± 3.86% and 100.023% ± 0.01%, respectively. Limits of detection and limits of quantification were determined as 6.245 and 18.926 ng (colchicine) and 8.024 and 24.316 ng (gloriosine), respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. Conclusion: The developed method is validated in terms of accuracy, recovery, and precision as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant for exploring chemotypic variability in metabolite content for commercial and medicinal purposes. PMID:29142436
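The LOD and LOQ figures reported here are conventionally derived from the calibration curve via the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation and S the slope. A sketch including an ordinary least-squares calibration fit (illustrative, not the authors' software):

```python
def fit_line(xs, ys):
    """Least-squares slope and intercept of a calibration curve
    (xs: amounts in ng/spot, ys: peak areas)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    slope = num / den
    return slope, my - slope * mx

def lod(sigma, slope):
    return 3.3 * sigma / slope   # ICH limit of detection

def loq(sigma, slope):
    return 10.0 * sigma / slope  # ICH limit of quantification
```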
Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md; Singh Rawat, Ajay Kumar; Srivastava, Sharad
2017-10-01
Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to its high content of colchicine alkaloids. This study aimed to develop an easy, cheap, precise, and accurate validated high-performance thin-layer chromatography (HPTLC) method for the simultaneous quantification of the bioactive alkaloids colchicine and gloriosine in G. superba L. and to identify its elite chemotype(s) from the Sikkim Himalayas (India). The HPTLC method was developed using a mobile phase of chloroform:acetone:diethylamine (5:4:1) at a λmax of 350 nm. Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) varies from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Linearity of the method was obtained in the concentration range of 100-400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recoveries of 97.79% ± 3.86% and 100.023% ± 0.01%, respectively. Limits of detection and limits of quantification were determined as 6.245 and 18.926 ng (colchicine) and 8.024 and 24.316 ng (gloriosine), respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. The developed method is validated in terms of accuracy, recovery, and precision as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant for exploring chemotypic variability in metabolite content for commercial and medicinal purposes.
PET Quantification of the Norepinephrine Transporter in Human Brain with (S,S)-18F-FMeNER-D2.
Moriguchi, Sho; Kimura, Yasuyuki; Ichise, Masanori; Arakawa, Ryosuke; Takano, Harumasa; Seki, Chie; Ikoma, Yoko; Takahata, Keisuke; Nagashima, Tomohisa; Yamada, Makiko; Mimura, Masaru; Suhara, Tetsuya
2017-07-01
Norepinephrine transporter (NET) in the brain plays important roles in human cognition and the pathophysiology of psychiatric disorders. Two radioligands, (S,S)-11C-MRB and (S,S)-18F-FMeNER-D2, have been used for PET imaging of NETs in the thalamus and midbrain (including the locus coeruleus) in humans. However, NET density in the equally important cerebral cortex has not been well quantified because of unfavorable kinetics with (S,S)-11C-MRB and defluorination with (S,S)-18F-FMeNER-D2, which can complicate NET quantification in the cerebral cortex adjacent to the skull containing defluorinated 18F radioactivity. In this study, we established methods for quantification of NET density in the brain, including the cerebral cortex, using (S,S)-18F-FMeNER-D2 PET. Methods: We analyzed our previous (S,S)-18F-FMeNER-D2 PET data of 10 healthy volunteers dynamically acquired for 240 min with arterial blood sampling. The effect of defluorination on NET quantification in the superficial cerebral cortex was evaluated by establishing the time stability of NET density estimations with an arterial input 2-tissue-compartment model, which guided the less-invasive reference tissue model and area under the time-activity curve methods to accurately quantify NET density in all brain regions, including the cerebral cortex. Results: Defluorination of (S,S)-18F-FMeNER-D2 became prominent toward the latter half of the 240-min scan. Total distribution volumes in the superficial cerebral cortex increased with scan durations beyond 120 min. We verified that 90-min dynamic scans provided a sufficient amount of data for quantification of NET density unaffected by defluorination. Reference tissue model binding potential values from the 90-min scan data and area under the time-activity curve ratios of 70- to 90-min data allowed accurate quantification of NET density in the cerebral cortex. 
Conclusion: We have established methods for quantification of NET density in the brain, including the cerebral cortex, unaffected by defluorination using (S,S)-18F-FMeNER-D2. These results suggest that NET density can be accurately quantified with a 90-min (S,S)-18F-FMeNER-D2 scan in broad brain areas. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
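The area-under-the-curve approach in the Results reduces to a ratio of trapezoid-integrated time-activity curves in a target region and a reference region. A minimal sketch of that idea (the binding-potential-like index assumes a reference region devoid of specific binding; not the authors' exact pipeline):

```python
def auc(times, values):
    """Trapezoidal area under a time-activity curve."""
    return sum((values[i] + values[i + 1]) / 2.0 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))

def auc_ratio_bp(times, tac_target, tac_reference):
    """Binding-potential-like index: target/reference AUC ratio minus 1,
    computed over a late interval (e.g. 70-90 min in this study)."""
    return auc(times, tac_target) / auc(times, tac_reference) - 1.0
```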
Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin
2017-09-01
Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing the lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from the planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between the planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe.
There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
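The Bland-Altman agreement analysis used above computes the bias (mean difference) between paired measurements and 95% limits of agreement around it. A minimal sketch:

```python
def bland_altman(a, b):
    """Bias and 95% limits of agreement between paired measurements
    (e.g. two observers' lobar perfusion percentages).
    Returns (bias, lower_limit, upper_limit)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Narrow limits of agreement, as reported here, mean the two observers can be used interchangeably for clinical purposes.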
How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?
NASA Astrophysics Data System (ADS)
Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.
2013-12-01
From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain the model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help understand the cause and characteristics of the parametric uncertainty and thus turn the data into a blessing. In this study, we illustrate the impacts of data on uncertainty quantification and reduction using the example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorbed on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include the fractions of the total surface site density belonging to the two sites and the surface complex formation constants of the three reactions. A total of seven experiments were conducted under different geochemical conditions to estimate these parameters. The experiments with low initial concentrations of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because those experiments cannot distinguish the relative adsorption affinities of the strong and weak sites. Therefore, experiments with high initial concentrations of U(VI) are needed, because in those experiments the strong site is nearly saturated and the weak-site parameters can be determined.
The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial concentration help modelers turn a curse into a blessing. The data impacts on uncertainty quantification and reduction are quantified using probability density functions of model parameters obtained from Markov Chain Monte Carlo simulation using the DREAM algorithm. This study provides insights to model calibration, uncertainty quantification, experiment design, and data collection in groundwater reactive transport modeling and other environmental modeling.
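The posterior densities discussed above are obtained by MCMC sampling (DREAM in this study). A minimal random-walk Metropolis sketch of the same idea; DREAM itself is a more elaborate differential-evolution sampler, so this is an illustration of the principle, not of the DREAM algorithm:

```python
import math
import random

def metropolis(log_post, x0, n_samples, step=1.0, seed=1):
    """Random-walk Metropolis sampler for a 1-D posterior density,
    given its log-density up to a constant."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # accept with probability min(1, post(proposal) / post(x))
        if math.log(rng.random()) < log_post(proposal) - log_post(x):
            x = proposal
        samples.append(x)
    return samples

# Example: sample a standard-normal "posterior" for one parameter.
draws = metropolis(lambda x: -0.5 * x * x, 0.0, 2000)
```

A poorly constraining dataset shows up directly in such chains as wide, strongly correlated posterior samples, which is exactly the diagnostic used in the study.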
USDA-ARS's Scientific Manuscript database
The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...
A DDDAS Framework for Volcanic Ash Propagation and Hazard Analysis
2012-01-01
The orthogonal polynomials are chosen according to the probability distribution of the input variables (for example, Hermite polynomials for normally distributed parameters, or Legendre polynomials for uniformly distributed ones). Uncertain source parameters and windfields will drive our simulations. We will use uncertainty quantification methodology – polynomial chaos quadrature in combination with data integration – to complete the DDDAS loop.
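Polynomial chaos quadrature propagates input uncertainty by evaluating the model at quadrature nodes of the input distribution and combining the outputs with the corresponding weights. A sketch using a 3-point probabilists' Gauss-Hermite rule for a single normally distributed input (exact for polynomial models up to degree 5; illustrative, not the project's code):

```python
def gauss_hermite_mean(model, mu=0.0, sigma=1.0):
    """Mean of model(X) for X ~ N(mu, sigma^2), computed with the
    3-point probabilists' Gauss-Hermite quadrature rule
    (nodes 0 and +/- sqrt(3), weights 2/3 and 1/6)."""
    nodes = [-3.0 ** 0.5, 0.0, 3.0 ** 0.5]
    weights = [1.0 / 6.0, 2.0 / 3.0, 1.0 / 6.0]
    return sum(w * model(mu + sigma * z) for w, z in zip(weights, nodes))
```

For a uniformly distributed input, the same construction uses Gauss-Legendre nodes and weights instead, which is the polynomial-matching rule stated above.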
Intrinsic Bioprobes, Inc. (Tempe, AZ)
Nelson, Randall W [Phoenix, AZ]; Williams, Peter [Phoenix, AZ]; Krone, Jennifer Reeve [Granbury, TX]
2008-07-15
Rapid mass spectrometric immunoassay methods for detecting and/or quantifying antibody and antigen analytes utilizing affinity capture to isolate the analytes and internal reference species (for quantification) followed by mass spectrometric analysis of the isolated analyte/internal reference species. Quantification is obtained by normalizing and calibrating obtained mass spectrum against the mass spectrum obtained for an antibody/antigen of known concentration.
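The quantification step described in this immunoassay — normalizing the analyte signal against an internal reference species of known concentration — amounts to a simple ratio once the spectrum is calibrated. A minimal sketch with hypothetical peak intensities (the equal-response assumption is a simplification):

```python
def quantify_by_internal_reference(analyte_peak, reference_peak, reference_conc):
    """Estimate analyte concentration from a mass spectrum by normalizing its
    peak intensity to that of an internal reference species of known
    concentration (assumes equal response factors for both species)."""
    return (analyte_peak / reference_peak) * reference_conc

# Hypothetical peak intensities from the calibrated spectrum
conc = quantify_by_internal_reference(analyte_peak=4.2e5,
                                      reference_peak=2.1e5,
                                      reference_conc=50.0)  # ng/mL
print(conc)  # 100.0 (ng/mL)
```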
USDA-ARS's Scientific Manuscript database
Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...
Mass spectrometric immunoassay
Nelson, Randall W; Williams, Peter; Krone, Jennifer Reeve
2007-12-04
Rapid mass spectrometric immunoassay methods for detecting and/or quantifying antibody and antigen analytes utilizing affinity capture to isolate the analytes and internal reference species (for quantification) followed by mass spectrometric analysis of the isolated analyte/internal reference species. Quantification is obtained by normalizing and calibrating obtained mass spectrum against the mass spectrum obtained for an antibody/antigen of known concentration.
Mass spectrometric immunoassay
Nelson, Randall W; Williams, Peter; Krone, Jennifer Reeve
2013-07-16
Rapid mass spectrometric immunoassay methods for detecting and/or quantifying antibody and antigen analytes utilizing affinity capture to isolate the analytes and internal reference species (for quantification) followed by mass spectrometric analysis of the isolated analyte/internal reference species. Quantification is obtained by normalizing and calibrating obtained mass spectrum against the mass spectrum obtained for an antibody/antigen of known concentration.
Mass spectrometric immunoassay
Nelson, Randall W.; Williams, Peter; Krone, Jennifer Reeve
2005-12-13
Rapid mass spectrometric immunoassay methods for detecting and/or quantifying antibody and antigen analytes utilizing affinity capture to isolate the analytes and internal reference species (for quantification) followed by mass spectrometric analysis of the isolated analyte/internal reference species. Quantification is obtained by normalizing and calibrating obtained mass spectrum against the mass spectrum obtained for an antibody/antigen of known concentration.
Quantification of RNA Content in Reconstituted Ebola Virus Nucleocapsids by Immunoprecipitation.
Banadyga, Logan; Ebihara, Hideki
2017-01-01
Immunoprecipitations are commonly used to isolate proteins or protein complexes and assess protein-protein interactions; however, they can also be used to assess protein-RNA complexes. Here we describe an adapted RNA immunoprecipitation technique that permits the quantification of RNA content in Ebola virus nucleocapsids that have been reconstituted in vitro by transient transfection.
How to Improve Your Impact Factor: Questioning the Quantification of Academic Quality
ERIC Educational Resources Information Center
Smeyers, Paul; Burbules, Nicholas C.
2011-01-01
A broad-scale quantification of the measure of quality for scholarship is under way. This trend has fundamental implications for the future of academic publishing and employment. In this essay we want to raise questions about these burgeoning practices, particularly how they affect philosophy of education and similar sub-disciplines. First,…
Clinical applications of MS-based protein quantification.
Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter
2016-04-01
Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and nowadays are already commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is still rather rare, despite excellent analytical specificity and good sensitivity. Here, we give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach are discussed with a focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Venhuizen, Freerk G; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I
2018-04-01
We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies.
Quantification of Wine Mixtures with an Electronic Nose and a Human Panel
Aleixandre, Manuel; Cabellos, Juan M.; Arroyo, Teresa; Horrillo, M. C.
2018-01-01
In this work, an electronic nose and a human panel were used for the quantification of wines formed by binary mixtures of four white grape varieties and two varieties of red wines at different percentages (from 0 to 100% in 10% steps for the electronic nose and from 0 to 100% in 25% steps for the human panel). The wines were prepared using the traditional method with commercial yeasts. Both techniques were able to quantify the mixtures tested, but it is important to note that the technology of the electronic nose is faster, simpler, and more objective than the human panel. In addition, better results of quantification were also obtained using the electronic nose. PMID:29484296
Collagen Quantification in Tissue Specimens.
Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I
2017-01-01
Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.
A quantitative witness for Greenberger-Horne-Zeilinger entanglement.
Eltschka, Christopher; Siewert, Jens
2012-01-01
Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
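For the two-qubit case, the Wootters concurrence mentioned above has a closed form that is easy to compute numerically. A minimal numpy sketch (illustrative only — this is the standard two-qubit concurrence, not the authors' three-qubit witness construction):

```python
import numpy as np

sy = np.array([[0, -1j], [1j, 0]])   # Pauli-Y
YY = np.kron(sy, sy)

def concurrence(rho):
    """Wootters concurrence C(rho) = max(0, l1 - l2 - l3 - l4), where the l_i
    are the square roots of the eigenvalues of rho * (sy x sy) rho^* (sy x sy),
    sorted in decreasing order."""
    rho_tilde = YY @ rho.conj() @ YY
    lams = np.sqrt(np.abs(np.linalg.eigvals(rho @ rho_tilde)))
    lams = np.sort(lams.real)[::-1]
    return max(0.0, lams[0] - lams[1] - lams[2] - lams[3])

# Maximally entangled Bell state |Phi+> = (|00> + |11>)/sqrt(2): C = 1
phi = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho_bell = np.outer(phi, phi.conj())
print(round(concurrence(rho_bell), 6))           # 1.0

# Product state |00>: C = 0 (no entanglement)
e00 = np.zeros(4); e00[0] = 1
print(round(concurrence(np.outer(e00, e00)), 6))  # 0.0
```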
Multiscale recurrence quantification analysis of order recurrence plots
NASA Astrophysics Data System (ADS)
Xu, Mengjia; Shang, Pengjian; Lin, Aijing
2017-03-01
In this paper, we propose a new method, multiscale recurrence quantification analysis (MSRQA), to analyze the structure of order recurrence plots. MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), MSRQA reveals richer and more recognizable information on the local characteristics of diverse systems and successfully describes their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ considerably from those at a single time scale; some systems present more accurate recurrence patterns at large time scales. This demonstrates that the new approach is effective for distinguishing three similar stock market systems and for revealing some of their inherent differences.
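As a toy illustration of the general idea — recurrence statistics computed over a range of time scales — the following sketch computes the simplest RQA measure, the recurrence rate, on coarse-grained copies of a signal. It is a simplified stand-in, not the order-pattern MSRQA of the paper:

```python
import numpy as np

def recurrence_rate(x, eps):
    """Fraction of point pairs (i, j) with |x_i - x_j| < eps, i.e. the density
    of recurrent points in a one-dimensional recurrence plot."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).mean()

def coarse_grain(x, scale):
    """Non-overlapping window averages, the usual multiscale construction."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

rng = np.random.default_rng(0)
x = np.sin(0.2 * np.arange(500)) + 0.3 * rng.standard_normal(500)

# Recurrence rate as a function of time scale
for scale in (1, 2, 5, 10):
    y = coarse_grain(x, scale)
    print(scale, round(recurrence_rate(y, eps=0.2), 3))
```

How the recurrence rate changes with `scale` is the kind of scale-dependent signature the multiscale approach exploits to separate otherwise similar systems.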
Javorka, M; Turianikova, Z; Tonhajzerova, I; Javorka, K; Baumert, M
2009-01-01
The purpose of this paper is to investigate the effect of orthostatic challenge on recurrence plot based complexity measures of heart rate and blood pressure variability (HRV and BPV). HRV and BPV complexities were assessed in 28 healthy subjects over 15 min in the supine and standing positions. The complexity of HRV and BPV was assessed based on recurrence quantification analysis. HRV complexity was reduced along with the HRV magnitude after changing from the supine to the standing position. In contrast, the BPV magnitude increased and BPV complexity decreased upon standing. Recurrence quantification analysis (RQA) of HRV and BPV is sensitive to orthostatic challenge and might therefore be suited to assess changes in autonomic neural outflow to the cardiovascular system.
Statistical modeling of isoform splicing dynamics from RNA-seq time series data.
Huang, Yuanhua; Sanguinetti, Guido
2016-10-01
Isoform quantification is an important goal of RNA-seq experiments, yet it remains problematic for genes with low expression or several isoforms. These difficulties may in principle be ameliorated by exploiting correlated experimental designs, such as time series or dosage response experiments. Time series RNA-seq experiments, in particular, are becoming increasingly popular, yet there are no methods that explicitly leverage the experimental design to improve isoform quantification. Here, we present DICEseq, the first isoform quantification method tailored to correlated RNA-seq experiments. DICEseq explicitly models the correlations between different RNA-seq experiments to aid the quantification of isoforms across experiments. Numerical experiments on simulated datasets show that DICEseq yields more accurate results than state-of-the-art methods, an advantage that can become considerable at low coverage levels. On real datasets, our results show that DICEseq provides substantially more reproducible and robust quantifications, increasing the correlation of estimates from replicate datasets by up to 10% on genes with low or moderate expression levels (bottom third of all genes). Furthermore, DICEseq makes it possible to quantify the trade-off between temporal sampling of RNA and depth of sequencing, frequently an important choice when planning experiments. Our results have strong implications for the design of RNA-seq experiments and offer a novel tool for improved analysis of such datasets. Python code is freely available at http://diceseq.sf.net. Contact: G.Sanguinetti@ed.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Whole farm quantification of GHG emissions within smallholder farms in developing countries
NASA Astrophysics Data System (ADS)
Seebauer, Matthias
2014-03-01
The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology, complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and in the emissions estimated by applying the two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals; the mitigation benefits range between 4 and 6.5 tCO2 ha-1 yr-1, differing significantly depending on the typology of the crop-livestock system, its agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty in the emission factors applied by the accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty in the activity data, the assessment confirms the high variability within farm types as well as between the different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.
Mercurio, Meagan D; Smith, Paul A
2008-07-23
Quantification of red grape tannin and red wine tannin using the methyl cellulose precipitable (MCP) tannin assay and the Adams-Harbertson (A-H) tannin assay were investigated. The study allowed for direct comparison between the repeatability of the assays and for the assessment of other practical considerations such as time efficiency, ease of practice, and throughput, and assessed the relationships between tannin quantification by both analytical techniques. A strong correlation between the two analytical techniques was observed when quantifying grape tannin (r(2) = 0.96), and a good correlation was observed for wine tannins (r(2) = 0.80). However, significant differences in the reported tannin values for the analytical techniques were observed (approximately 3-fold). To explore potential reasons for the difference, investigations were undertaken to determine how several variables influenced the final tannin quantification for both assays. These variables included differences in the amount of tannin precipitated (monitored by HPLC), assay matrix variables, and the monomers used to report the final values. The relationship between tannin quantification and wine astringency was assessed for the MCP and A-H tannin assays, and both showed strong correlations with perceived wine astringency (r(2) = 0.83 and r(2) = 0.90, respectively). The work described here gives guidance to those wanting to understand how the values between the two assays relate; however, a conclusive explanation for the differences in values between the MCP and A-H tannin assays remains unclear, and further work in this area is required.
Seibert, Cathrin; Davidson, Brian R; Fuller, Barry J; Patterson, Laurence H; Griffiths, William J; Wang, Yuqin
2009-04-01
Here we report the identification and approximate quantification of cytochrome P450 (CYP) proteins in human liver microsomes as determined by nano-LC-MS/MS with application of the exponentially modified protein abundance index (emPAI) algorithm during database searching. Protocols based on 1D-gel protein separation and 2D-LC peptide separation gave comparable results. In total, 18 CYP isoforms were unambiguously identified based on unique peptide matches. Further, we have determined the absolute quantity of two CYP enzymes (2E1 and 1A2) in human liver microsomes using stable-isotope dilution mass spectrometry, where microsomal proteins were separated by 1D-gel electrophoresis, digested with trypsin in the presence of either a CYP2E1- or 1A2-specific stable-isotope labeled tryptic peptide and analyzed by LC-MS/MS. Using multiple reaction monitoring (MRM) for the isotope-labeled tryptic peptides and their natural unlabeled analogues quantification could be performed over the range of 0.1-1.5 pmol on column. Liver microsomes from four individuals were analyzed for CYP2E1 giving values of 88-200 pmol/mg microsomal protein. The CYP1A2 content of microsomes from a further three individuals ranged from 165 to 263 pmol/mg microsomal protein. Although, in this proof-of-concept study for CYP quantification, the two CYP isoforms were quantified from different samples, there are no practical reasons to prevent multiplexing the method to allow the quantification of multiple CYP isoforms in a single sample.
Siebenhaar, Markus; Küllmer, Kai; Fernandes, Nuno Miguel de Barros; Hüllen, Volker; Hopf, Carsten
2015-09-01
Desorption electrospray ionization (DESI) mass spectrometry is an emerging technology for direct therapeutic drug monitoring in dried blood spots (DBS). Current DBS methods require manual application of small molecules as internal standards for absolute drug quantification. With industrial standardization in mind, we superseded the manual addition of standard and built a three-layer setup for robust quantification of salicylic acid directly from DBS. We combined a dioctyl sodium sulfosuccinate weave facilitating sample spreading with a cellulose layer for addition of isotope-labeled salicylic acid as internal standard and a filter paper for analysis of the standard-containing sample by DESI-MS. Using this setup, we developed a quantification method for salicylic acid from whole blood with a validated linear curve range from 10 to 2000 mg/L, a relative standard deviation (RSD%) ≤14%, and determination coefficients of 0.997. The limit of detection (LOD) was 8 mg/L and the lower limit of quantification (LLOQ) was 10 mg/L. Recovery rates in method verification by LC-MS/MS were 97 to 101% for blinded samples. Most importantly, a study in healthy volunteers after administration of a single dose of Aspirin provides evidence to suggest that the three-layer setup may enable individual pharmacokinetic and endpoint testing following blood collection by finger pricking by patients at home. Taken together, our data suggests that DBS-based quantification of drugs by DESI-MS on pre-manufactured three-layer cartridges may be a promising approach for future near-patient therapeutic drug monitoring.
Chaouachi, Maher; El Malki, Redouane; Berard, Aurélie; Romaniuk, Marcel; Laval, Valérie; Brunel, Dominique; Bertheau, Yves
2008-03-26
The labeling of products containing genetically modified organisms (GMO) is linked to their quantification since a threshold for the presence of fortuitous GMOs in food has been established. This threshold is calculated from a combination of two absolute quantification values: one for the specific GMO target and the second for an endogenous reference gene specific to the taxon. Thus, the development of reliable methods to quantify GMOs using endogenous reference genes in complex matrixes such as food and feed is needed. Plant identification can be difficult in the case of closely related taxa, which moreover are subject to introgression events. Based on the homology of beta-fructosidase sequences obtained from public databases, two couples of consensus primers were designed for the detection, quantification, and differentiation of four Solanaceae: potato (Solanum tuberosum), tomato (Solanum lycopersicum), pepper (Capsicum annuum), and eggplant (Solanum melongena). Sequence variability was studied first using lines and cultivars (intraspecies sequence variability), then using taxa involved in gene introgressions, and finally, using taxonomically close taxa (interspecies sequence variability). This study allowed us to design four highly specific TaqMan-MGB probes. A duplex real time PCR assay was developed for simultaneous quantification of tomato and potato. For eggplant and pepper, only simplex real time PCR tests were developed. The results demonstrated the high specificity and sensitivity of the assays. We therefore conclude that beta-fructosidase can be used as an endogenous reference gene for GMO analysis.
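The two absolute quantifications mentioned above are each typically read off a qPCR standard curve, after which the GMO content is reported as the ratio of target copies to reference-gene copies. A hedged sketch with hypothetical standard-curve parameters and Cq values (the curve constants are invented for illustration):

```python
def copies_from_cq(cq, intercept, slope):
    """Absolute quantification from a qPCR standard curve:
    Cq = intercept + slope * log10(copies), so copies = 10**((cq - intercept) / slope)."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical standard curves (a slope near -3.32 corresponds to ~100% PCR efficiency)
gmo_copies = copies_from_cq(cq=31.0, intercept=40.0, slope=-3.32)  # GMO-specific target
ref_copies = copies_from_cq(cq=24.0, intercept=39.5, slope=-3.32)  # endogenous reference gene

gmo_percent = 100.0 * gmo_copies / ref_copies
print(f"GMO content ~ {gmo_percent:.2f}%")
```

The reference-gene assay (here, one like the beta-fructosidase assay of the paper) anchors the denominator, so the reported percentage is independent of how much total DNA was loaded.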
Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M
2012-07-01
With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.
Tasaniyananda, Natt; Tungtrongchitr, Anchalee; Seesuay, Watee; Sakolvaree, Yuwaporn; Aiumurai, Pisinee; Indrawattana, Nitaya; Chaicumpa, Wanpen; Sookrung, Nitat
2018-03-01
Avoidance of allergen exposure is an effective measure for preventing naïve and allergic individuals from sensitization (primary intervention) and disease aggravation (secondary intervention), respectively. Regular monitoring of allergens in the environment is required for effective intervention; thus, there is a need for cost-effective test kits for environmental allergen quantification. The aim was to develop a test kit for quantification of the cat major allergen, Fel d 1. A mouse monoclonal antibody (MAb) specific to the newly identified IgE-binding conformational epitope of Fel d 1 and rabbit polyclonal IgG to recombinant Fel d 1 were used as allergen capture and detection reagents, respectively. Native Fel d 1 was used in constructing a standard curve. Sixteen of 36 dust samples collected from the houses of cat-allergic subjects in Bangkok contained Fel d 1 above 0.29 μg/gram of dust, which is considered a novel threshold level for causing cat allergy sensitization or symptoms. Among them, 7 samples contained the allergen at levels exceeding 2.35 μg/gram of dust, the level that would aggravate asthma. Results of allergen quantification using the locally made test kit showed strong correlation (r = 0.923) with quantification using commercialized reagents. The assay using the MAb to the Fel d 1 IgE-binding epitope has potential application as an economical and practical tool for cat allergy intervention, especially in localities where health resources are relatively limited.
Sobrado, Laura Alonso; Freije-Carrelo, Laura; Moldovan, Mariella; Encinar, Jorge Ruiz; Alonso, J Ignacio García
2016-07-29
GC-FID has been effectively used as a universal quantification technique for volatile organic compounds for a long time. In most cases, the use of the effective carbon number (ECN) allows for quantification by GC-FID without external calibration, using only the response of a single internal standard. In this paper we compare the performance characteristics of GC-FID with those of post-column (13)C Isotope Dilution GC-Combustion-MS for the absolute quantification of organic compounds without the need for individual standards. For this comparison we selected the quantification of FAMEs in biodiesel. The selection of the right internal standard was critical for GC-FID even when ECNs were considered. On the other hand, the nature of the internal standard was not relevant when GC-Combustion-MS was employed. The proposed method was validated with the analysis of the certified reference material SRM 2772, and comparative data were obtained on real biodiesel samples. The analysis of the SRM 2772 biodiesel provided recoveries in the range 100.6-103.5% and 96.4-103.6% for GC-Combustion-MS and GC-FID, respectively. The detection limit for GC-Combustion-MS was found to be 4.2 ng compound/g of injected sample. In conclusion, the quantitative performance of GC-Combustion-MS compared satisfactorily with that of GC-FID, constituting a viable alternative for the quantification of organic compounds without the need for individual standards. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham
2009-02-01
Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical changes associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm can provide information about emphysema from CT. We therefore propose new, non-density-based measures of the curvature of the diaphragm that allow for robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of lung height and diaphragm width to diaphragm height as curvature estimates, with the emphysema index used for comparison. Pearson correlation coefficients showed that several of the proposed diaphragm curvature measures tended to correlate more strongly (up to r=0.57) with DLCO% and VA than did the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that they evaluate different aspects of the disease.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for VBS systems is highly desirable. The innovative, value-based idea is realized on the basis of the stakeholder requirements. The quality of a VBS system depends on a concrete set of valuable requirements, which can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of VBS systems; however, their focus on valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for VBS systems; they are time-consuming, complex, and inconsistent, which makes the initiation process difficult. Moreover, the main motivation for this research is that the existing SIQ approaches provide neither low-level implementation details for SIQ initiation nor stakeholder metrics for quantification. Hence, in view of the existing SIQ problems, this research contributes a new SIQ framework called 'StakeMeter', which is verified and validated through case studies. Compared with other methods, the proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria, and an application procedure. It addresses the issues of stakeholder quantification or prioritization, high time consumption, complexity, and process initiation, and it helps in the selection of highly critical stakeholders for VBS systems with less judgmental error.
Barrera-Escorcia, Guadalupe; Wong-Chang, Irma; Fernández-Rendón, Carlos Leopoldo; Botello, Alfonso Vázquez; Gómez-Gil, Bruno; Lizárraga-Partida, Marcial Leonardo
2016-11-01
Oysters can accumulate potentially pathogenic water bacteria. The objective of this study was to compare two procedures for quantifying the Vibrio species present in oysters and to determine the more sensitive method. We analyzed oyster samples from the Gulf of Mexico, commercialized in Mexico City. The samples were inoculated into tubes with alkaline peptone water (APW), based on three tubes and four dilutions (10⁻¹ to 10⁻⁴). From these tubes, the first quantification of Vibrio species was performed (most probable number (MPN) from tubes) and bacteria were inoculated by streaking on thiosulfate-citrate-bile salts-sucrose (TCBS) petri dishes. Colonies were isolated for a second quantification (MPN from dishes). Polymerase chain reaction (PCR) was used to determine species with specific primers: ompW for Vibrio cholerae, tlh for Vibrio parahaemolyticus, and vvhA for Vibrio vulnificus. Simultaneously, the sanitary quality of the oysters was determined. The quantification of V. parahaemolyticus was significantly higher in APW tubes than in TCBS dishes. Regarding V. vulnificus counts, the differences between the two approaches were not significant. In contrast, the MPNs of V. cholerae obtained from dishes were higher than those from tubes. Quantification by PCR of the MPNs of V. parahaemolyticus and V. vulnificus obtained from APW was sensitive and recommendable for the detection of both species. In contrast, to quantify V. cholerae it was necessary to isolate colonies on TCBS prior to PCR; culturing in APW at 42 °C could be an alternative that avoids colony isolation. The MPNs of V. cholerae from dishes were associated with the poor sanitary quality of the samples.
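MPN estimates like those used in both procedures derive from a maximum-likelihood principle: a tube inoculated with volume v is negative with probability e^(-λv), and λ is chosen to best explain the observed positive/negative pattern. A minimal sketch, with a generic 3-tube dilution series rather than the study's data:

```python
import math

def mpn(volumes, n_tubes, n_pos, lo=1e-6, hi=1e6, iters=200):
    """Maximum-likelihood MPN estimate (organisms per unit volume).

    volumes: sample volume inoculated per tube at each dilution
    n_tubes: tubes per dilution; n_pos: positive tubes per dilution
    """
    def score(lam):
        # Derivative of the log-likelihood; its root is the MPN estimate.
        s = 0.0
        for v, n, p in zip(volumes, n_tubes, n_pos):
            if p:
                s += p * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v))
            s -= (n - p) * v
        return s

    # score() decreases monotonically in lam; bisect in log space for its root.
    for _ in range(iters):
        mid = math.sqrt(lo * hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Classic series: 0.1, 0.01, 0.001 g per tube, three tubes each,
# positive pattern 3-1-0 (estimate is approximately 43 MPN/g,
# consistent with standard MPN tables for this pattern).
estimate = mpn([0.1, 0.01, 0.001], [3, 3, 3], [3, 1, 0])
```

The same routine applies to MPNs read either from APW tubes or from the colony-based dishes, only the positive counts differ.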
Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.
Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P
2012-08-01
The solid-state purity of cocrystals critically affects their performance, so it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with a 1:1 (w/w) mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R(2) = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R(2) = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
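LOD and LOQ figures like those reported above are conventionally derived from the calibration line, as 3.3 and 10 times the residual standard deviation divided by the slope (the ICH convention). A minimal sketch with invented peak-intensity data, not the paper's measurements:

```python
import numpy as np

# Illustrative calibration data (invented): cocrystal content (% w/w)
# vs. integrated intensity of the characteristic diffraction peak.
x = np.array([0.0, 10.0, 25.0, 50.0, 75.0, 100.0])
y = np.array([0.4, 10.9, 25.2, 50.5, 74.8, 99.9])

slope, intercept = np.polyfit(x, y, 1)
pred = slope * x + intercept
# Residual standard deviation about the regression line (n - 2 dof).
residual_sd = np.sqrt(np.sum((y - pred) ** 2) / (len(x) - 2))

# ICH-style detection / quantification limits from the calibration line.
lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```

With real intensities, `lod` and `loq` come out in the same units as `x`, here % (w/w), matching the form of the values quoted in the abstract.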
Martins, Ayrton F; Frank, Carla da S; Altissimo, Joseline; de Oliveira, Júlia A; da Silva, Daiane S; Reichert, Jaqueline F; Souza, Darliana M
2017-08-24
Statins are among the most prescribed agents for treating hypercholesterolaemia and preventing vascular diseases. In this study, a rapid and effective liquid chromatography method, assisted by diode array detection, was designed and validated for the simultaneous quantification of atorvastatin (ATO) and simvastatin (SIM) in hospital effluent samples. The solid-phase extraction (SPE) of the analytes was optimized with regard to sorbent material and pH, and the dispersive liquid-liquid microextraction (DLLME) in terms of pH, ionic strength, and the type and volume of the extraction and disperser solvents. The performance of both extraction procedures was evaluated in terms of linearity, quantification limits, accuracy (recovery %), precision, and matrix effects for each analyte. The methods proved to be linear over the concentration range considered; the quantification limits were 0.45 µg L⁻¹ for ATO and 0.75 µg L⁻¹ for SIM; the matrix effect was almost absent in both methods; the average recoveries remained between 81.5 and 90.0%; and the RSD values were <20%. The validated methods were applied to the quantification of the statins in real samples of hospital effluent; the concentrations ranged from 18.8 µg L⁻¹ to 35.3 µg L⁻¹ for ATO, and from 30.3 µg L⁻¹ to 38.5 µg L⁻¹ for SIM. Given the calculated risk quotients of up to 192, the occurrence of ATO and SIM in hospital effluent poses a potentially serious risk to human health and the aquatic ecosystem.
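The recovery and matrix-effect figures of merit reported above are simple ratios. A minimal sketch with invented numbers (the spike levels and slopes below are illustrative, not the study's values):

```python
def recovery_pct(measured, spiked):
    """Spike recovery: measured concentration as a percentage of the spiked amount."""
    return 100.0 * measured / spiked

def matrix_effect_pct(slope_matrix, slope_solvent):
    """Matrix effect expressed as % deviation of the matrix-matched
    calibration slope from the pure-solvent calibration slope."""
    return 100.0 * (slope_matrix / slope_solvent - 1.0)

# Illustrative: 17.3 ug/L found for a 20.0 ug/L spike -> 86.5% recovery,
# within the 81.5-90.0% range quoted in the abstract.
rec = recovery_pct(17.3, 20.0)
# Illustrative: near-identical slopes -> matrix effect close to zero.
me = matrix_effect_pct(0.98, 1.00)
```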
Simultaneous detection of valine and lactate using MEGA-PRESS editing in pyogenic brain abscess.
Lange, Thomas; Ko, Cheng-Wen; Lai, Ping-Hong; Dacko, Michael; Tsai, Shang-Yueh; Buechert, Martin
2016-12-01
Valine and lactate have been recognized as important metabolic markers for diagnosing brain abscess by means of MRS. However, unambiguous in vivo detection and quantification is hampered by macromolecular contamination. In this work, MEGA-PRESS difference editing of valine and lactate is proposed. The method is validated in vitro and applied in quantitative in vivo experiments in one healthy subject and two brain abscess patients. It is demonstrated that with this technique the overlapping lipid signal can be reduced by more than an order of magnitude, and thus the robustness of valine and lactate detection in vivo can be enhanced. Quantification of the two abscess MEGA-PRESS spectra yielded valine/lactate concentration ratios of 0.10 and 0.27. These ratios agreed with the concentration ratios determined from concomitantly acquired short-TE PRESS data and were in line with literature values. The quantification accuracy of lactate (as measured with Cramér-Rao lower bounds in LCModel processing) was better for MEGA-PRESS than for short-TE PRESS in all acquired in vivo datasets. The Cramér-Rao lower bounds of valine were better for MEGA-PRESS in only one of the two abscess cases; in the other case, coediting of isoleucine confounded the quantification in the MEGA-PRESS analysis. MEGA-PRESS and short-TE PRESS should therefore be combined for unambiguous quantification of amino acids in abscess measurements. Simultaneous valine/lactate MEGA-PRESS editing might benefit the distinction of brain abscesses from tumors, and the further categorization of bacteria, with reasonable sensitivity and specificity.
A refined methodology for modeling volume quantification performance in CT
NASA Astrophysics Data System (ADS)
Chen, Baiyu; Wilson, Joshua; Samei, Ehsan
2014-03-01
The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was extended to two nodule sizes (5 and 10 mm) and two segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield
NASA Technical Reports Server (NTRS)
Baurle, R. A.; Axdahl, E. L.
2017-01-01
Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
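The nonintrusive approach described above treats the flow solver as a black box: draw input samples from assumed distributions, run the solver once per sample, and compute output statistics. A minimal Monte Carlo sketch, with a stand-in surrogate function in place of a real CFD solver (the function, parameter names, and distributions are all illustrative assumptions):

```python
import random
import statistics

def solver(mach, wall_temp):
    """Stand-in for one black-box CFD run (an invented surrogate, not a real solver).
    Returns a scalar quantity of interest for the sampled inputs."""
    return 0.8 * mach - 0.001 * (wall_temp - 300.0)

random.seed(1)

# Aleatory inputs sampled from assumed distributions and propagated through
# the "black box" without modifying the solver itself.
samples = [
    solver(random.gauss(2.5, 0.05), random.uniform(290.0, 310.0))
    for _ in range(2000)
]

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
```

Epistemic (lack-of-knowledge) sources would be handled on top of this, e.g. by repeating the aleatory loop over interval-valued model parameters, which is what makes the approach easy to automate around an existing solver.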
Bihan, Kevin; Sauzay, Chloé; Goldwirt, Lauriane; Charbonnier-Beaupel, Fanny; Hulot, Jean-Sebastien; Funck-Brentano, Christian; Zahr, Noël
2015-02-01
Vemurafenib (Zelboraf) is a new tyrosine kinase inhibitor that selectively targets the activated BRAF V600E gene product and is indicated for the treatment of advanced BRAF mutation-positive melanoma. We developed a simple method for vemurafenib quantification using liquid chromatography-tandem mass spectrometry, and also performed a stability study of vemurafenib in human plasma. (13)C(6)-vemurafenib was used as the internal standard. A single-step protein precipitation was used for plasma sample preparation. Chromatography was performed on an Acquity UPLC system (Waters), with chromatographic separation on an Acquity UPLC BEH C18 column (2.1 × 50 mm, 1.7-µm particle size; Waters). Quantification was performed using multiple reaction monitoring of the following transitions: m/z 488.2 → 381.0 for vemurafenib and m/z 494.2 → 387.0 for the internal standard. The method was linear over the range from 1.0 to 100.0 mcg/mL. The lower limit of quantification was 0.1 mcg/mL for vemurafenib in plasma. Vemurafenib remained stable for 1 month at all levels tested, whether stored at room temperature (20 °C), at +4 °C, or at -20 °C. The method was used successfully to perform a plasma pharmacokinetic study of vemurafenib in a patient after oral administration at steady state. This liquid chromatography-tandem mass spectrometry method for vemurafenib quantification in human plasma is simple, rapid, specific, sensitive, accurate, precise, and reliable.
ERIC Educational Resources Information Center
Yeung, Brendan; Ng, Tuck Wah; Tan, Han Yen; Liew, Oi Wah
2012-01-01
The use of different types of stains in the quantification of proteins separated on gels using electrophoresis offers the capability of deriving good outcomes in terms of linear dynamic range, sensitivity, and compatibility with specific proteins. An inexpensive, simple, and versatile lighting system based on liquid crystal display backlighting is…
Peñuelas-Urquides, Katia; Villarreal-Treviño, Licet; Silva-Ramírez, Beatriz; Rivadeneyra-Espinoza, Liliana; Said-Fernández, Salvador; de León, Mario Bermúdez
2013-01-01
Colony forming unit (cfu) quantification, turbidity, and optical density at 600 nm (OD600) measurements were used to evaluate Mycobacterium tuberculosis growth. Turbidity and OD600 measurements displayed similar growth curves, while cfu quantification showed a continuous growth curve. We determined the cfu equivalents to McFarland and OD600 units.
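Determining cfu equivalents for OD600 readings amounts to fitting a calibration line between the two measurements. A sketch with invented, exactly proportional data (real cultures would scatter around the line):

```python
# Illustrative OD600 readings and matched plate counts (cfu/mL);
# these values are invented, not the paper's data.
od600 = [0.05, 0.10, 0.20, 0.40, 0.80]
cfu = [1.5e7, 3.0e7, 6.0e7, 1.2e8, 2.4e8]

# Ordinary least-squares line through the calibration points.
n = len(od600)
mx = sum(od600) / n
my = sum(cfu) / n
slope = sum((x - mx) * (y - my) for x, y in zip(od600, cfu)) / \
        sum((x - mx) ** 2 for x in od600)
intercept = my - slope * mx

def cfu_equivalent(od):
    """Estimate cfu/mL from an OD600 reading via the fitted line."""
    return slope * od + intercept
```

The same fit against McFarland standards would give the McFarland-to-cfu equivalents mentioned in the abstract.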
Quantification of Fluorine Content in AFFF Concentrates
2017-09-29
For quantitative integrations, a 100 ppm spectral window (FID resolution 0.215 Hz) was scanned using the acquisition parameters described in the report (Naval Research Laboratory, Washington, DC, NRL/MR/6120--17-9752, September 29, 2017).
Background: Next generation air measurement (NGAM) technologies are enabling new regulatory and compliance approaches that will help EPA better understand and meet emerging challenges associated with fugitive and area source emissions from industrial and oil and gas sectors. In...
Utilization of remote sensing techniques for the quantification of fire behavior in two pine stands
Eric V. Mueller; Nicholas Skowronski; Kenneth Clark; Michael Gallagher; Robert Kremens; Jan C. Thomas; Mohamad El Houssami; Alexander Filkov; Rory M. Hadden; William Mell; Albert Simeoni
2017-01-01
Quantification of field-scale fire behavior is necessary to improve the current scientific understanding of wildland fires and to develop and test relevant, physics-based models. In particular, detailed descriptions of individual fires are required, for which the available literature is limited. In this work, two such field-scale experiments, carried out in pine stands...
Rakesh Minocha; P. Thangavel; Om Parkash Dhankher; Stephanie Long
2008-01-01
The HPLC method presented here for the quantification of metal-binding thiols is considerably shorter than most previously published methods. It is a sensitive and highly reproducible method that separates monobromobimane-tagged monothiols (cysteine, glutathione, γ-glutamylcysteine) along with polythiols (PC2, PC3...
NASA Astrophysics Data System (ADS)
Bejjani, A.; Roumié, M.; Akkad, S.; El-Yazbi, F.; Nsouli, B.
2016-03-01
We have demonstrated in previous studies that Particle Induced X-ray Emission (PIXE) is one of the most rapid and accurate choices for quantifying an active ingredient in a solid drug from the reactions induced on its specific heteroatom, using pellets made from the original tablets. In this work, PIXE is used for the first time for the simultaneous quantification of two active ingredients, amoxicillin trihydrate and potassium clavulanate, in six different commercial antibiotic drugs. Since the quality control process of a drug covers a large number of samples, the scope of this study was also to find the most rapid and lowest-cost sample preparation needed to analyze these drugs with good precision. The chosen drugs were analyzed in their "as received" tablet form, in pellets made from the powder of the tablets, and in pellets made from the powder of the tablets after being heated to 70 °C until constant weight, to remove humidity while avoiding any molecular destruction. The validity of the quantification with respect to each sample preparation (homogeneity of the drug components and humidity) is presented and discussed.
Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L
2017-02-01
The aim of this manuscript was to study the application of a new method of protein quantification to commercial Candida antarctica lipase B solutions. Error sources associated with the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. The magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS); CALB was then adsorbed on the modified support. The proposed novel protein quantification method included the determination of sulfur (from the protein in the CALB solution) by means of atomic emission with inductively coupled plasma (AE-ICP). Four different protocols were applied, combining AE-ICP and classical Bradford assays as well as carbon, hydrogen, and nitrogen (CHN) analysis. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as the standard ranged from 400 to 1200% when protein in CALB solution was quantified. These errors were calculated taking as the "true" protein content the amount of immobilized protein obtained with the improved method. The optimum quantification procedure involved the combination of the Bradford method, ICP, and CHN analysis.
Nahar, Limon Khatun; Cordero, Rosa Elena; Nutt, David; Lingford-Hughes, Anne; Turton, Samuel; Durant, Claire; Wilson, Sue; Paterson, Sue
2016-01-01
A highly sensitive and fully validated method was developed for the quantification of baclofen in human plasma. After adjusting the pH of the plasma samples using a phosphate buffer solution (pH 4), baclofen was purified using mixed-mode (C8/cation exchange) solid-phase extraction (SPE) cartridges. Endogenous water-soluble compounds and lipids were removed from the cartridges before the samples were eluted and concentrated. The samples were analyzed using triple-quadrupole liquid chromatography-tandem mass spectrometry (LC-MS/MS) with triggered dynamic multiple reaction monitoring mode for simultaneous quantification and confirmation. The assay was linear from 25 to 1,000 ng/mL (r² > 0.999; n = 6). Intraday (n = 6) and interday (n = 15) imprecisions (% relative standard deviation) were <5%, and the average recovery was 30%. The limit of detection of the method was 5 ng/mL, and the limit of quantification was 25 ng/mL. Plasma samples from healthy male volunteers (n = 9, median age: 22) given two single oral doses of baclofen (10 and 60 mg) on nonconsecutive days were analyzed to demonstrate method applicability.
[Analysis of free foetal DNA in maternal plasma using STR loci].
Vodicka, R; Vrtel, R; Procházka, M; Santavá, A; Dusek, L; Vrbická, D; Singh, R; Krejciríková, E; Schneiderová, E; Santavý, J
2006-01-01
The differentiation of maternal and foetal genotypes in the plasma of pregnant women is generally accomplished with real-time PCR systems, in which specific probes are used to distinguish the genotypes; mostly, gonosomal sequences are used to recognise a male foetus. This work describes possibilities for the detection and quantification of free foetal DNA using STR loci. Artificial genotype mixtures ranging from 0.2% to 100%, simulating maternal and paternal genotypes, and 27 DNA samples from pregnant women at different stages of pregnancy were used for DNA quantification and detection. The foetal genotype was confirmed by genotyping the biological father. Detection was performed on STRs from the Down syndrome (DS) responsible region of chromosome 21 by an innovated quantitative fluorescent PCR (IQF PCR), which can reveal and quantify even very rare DNA mosaics. STR quantification was assessed in the artificial genotype mixtures, and the discriminability of particular genotypes was at the level of a few percent. Foetal DNA was detected in 74% of the tested samples. The application of IQF PCR to the quantification and differentiation of maternal and foetal genotypes by STR loci could be of value in non-invasive prenatal diagnostics as another possible marker for DS risk assessment.
Moody, Jonathan B; Lee, Benjamin C; Corbett, James R; Ficaro, Edward P; Murthy, Venkatesh L
2015-10-01
A number of exciting advances in PET/CT technology and improvements in methodology have recently converged to enhance the feasibility of routine clinical quantification of myocardial blood flow and flow reserve. Recent promising clinical results are pointing toward an important role for myocardial blood flow in the care of patients. Absolute blood flow quantification can be a powerful clinical tool, but its utility will depend on maintaining precision and accuracy in the face of numerous potential sources of methodological error. Here we review recent data and highlight the impact of PET instrumentation, image reconstruction, and quantification methods, and we emphasize (82)Rb cardiac PET, which currently has the widest clinical application. It will be apparent that more data are needed, particularly in relation to newer PET technologies, as well as clinical standardization of PET protocols and methods. We provide recommendations for the methodological factors considered here. At present, myocardial flow reserve appears to be remarkably robust to various methodological errors; however, with greater attention to and more detailed understanding of these sources of error, the clinical benefits of stress-only blood flow measurement may eventually be more fully realized.
Venturelli, Gustavo L; Brod, Fábio C A; Rossi, Gabriela B; Zimmermann, Naíra F; Oliveira, Jaison P; Faria, Josias C; Arisi, Ana C M
2014-11-01
The Embrapa 5.1 genetically modified (GM) common bean was approved for commercialization in Brazil, so methods for the quantification of this new genetically modified organism (GMO) are necessary. The development of a suitable endogenous reference is essential for GMO quantification by real-time PCR. On this basis, a new taxon-specific endogenous reference quantification assay was developed for Phaseolus vulgaris L. Three genes encoding common bean proteins (phaseolin, arcelin, and lectin) were selected as candidates for the endogenous reference. Primers targeting these candidate genes were designed, and detection was evaluated using SYBR Green chemistry. The assay targeting the lectin gene showed higher specificity than the remaining assays, and a hydrolysis probe was then designed. This assay showed high specificity for 50 common bean samples from two gene pools, Andean and Mesoamerican. For GM common bean varieties, the results were similar to those obtained for non-GM isogenic varieties, with PCR efficiency values ranging from 92 to 101%. Moreover, the assay presented a limit of detection of ten haploid genome copies. The primers and probe developed in this work are suitable for detecting and quantifying either GM or non-GM common bean.
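PCR efficiency values such as the 92-101% quoted above are conventionally derived from the slope of a standard curve of Cq versus log10 template copies, via E = 10^(-1/slope) - 1. A sketch with invented Cq values for a hypothetical dilution series (not the paper's data):

```python
import math

# Illustrative standard curve: log10(template copies) vs. measured Cq values
# (values invented for the sketch).
log_copies = [2.0, 3.0, 4.0, 5.0, 6.0]
cq = [30.1, 26.8, 23.4, 20.1, 16.8]

# Least-squares slope of Cq against log10(copies).
n = len(log_copies)
mx = sum(log_copies) / n
my = sum(cq) / n
slope = sum((x - mx) * (y - my) for x, y in zip(log_copies, cq)) / \
        sum((x - mx) ** 2 for x in log_copies)

# Amplification efficiency from the standard-curve slope:
# E = 10^(-1/slope) - 1; a slope of about -3.32 corresponds to 100% efficiency.
efficiency = 10 ** (-1.0 / slope) - 1.0
```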
Phylogenetic Quantification of Intra-tumour Heterogeneity
Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian
2014-01-01
Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data.
Xiang, Yun; Koomen, John M.
2012-01-01
Protein quantification with liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) has emerged as a powerful platform for assessing panels of biomarkers. In this study, direct infusion, using automated, chip-based nanoelectrospray ionization, coupled with MRM (DI-MRM) is used for protein quantification. Removal of the LC separation step increases the importance of evaluating the ratios between the transitions. Therefore, the effects of solvent composition, analyte concentration, spray voltage, and quadrupole resolution settings on fragmentation patterns have been studied using peptide and protein standards. After DI-MRM quantification was evaluated for standards, quantitative assays for the expression of heat shock proteins (HSPs) were translated from LC-MRM to DI-MRM for implementation in cell line models of multiple myeloma. Requirements for DI-MRM assay development are described. Then, the two methods are compared; criteria for effective DI-MRM analysis are reported based on the analysis of HSP expression in digests of whole cell lysates. The increased throughput of DI-MRM analysis is useful for rapid analysis of large batches of similar samples, such as time course measurements of cellular responses to therapy.
Quan, Phenix-Lan; Sauzade, Martin
2018-01-01
Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods.
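The Poisson relationship at the heart of dPCR can be written directly: if a fraction p of partitions is positive, the mean number of copies per partition is λ = -ln(1 - p), and dividing by the partition volume gives an absolute concentration with no calibration curve. A minimal sketch (the partition counts and volume below are illustrative):

```python
import math

def dpcr_concentration(n_positive, n_total, partition_volume_nl):
    """Absolute target concentration (copies/µL) from a digital PCR read-out.

    A partition is negative with probability exp(-lambda), so the mean
    copies per partition is lambda = -ln(1 - p), p being the positive fraction.
    """
    p = n_positive / n_total
    lam = -math.log(1.0 - p)                   # mean copies per partition
    return lam / (partition_volume_nl * 1e-3)  # nL -> µL, then copies per µL

# Illustrative read-out: 4000 of 20000 partitions positive, 0.85 nL partitions.
conc = dpcr_concentration(4000, 20000, 0.85)
```

Note that the estimate depends only on the positive fraction and the partition volume, which is exactly why dPCR needs no standard curve, unlike qPCR.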
Liu, Ruijuan; Wang, Mengmeng; Ding, Li
2014-10-01
Menadione (VK3), an essential fat-soluble naphthoquinone, plays very important physiological and pathological roles, but its detection and quantification is challenging. Herein, a new method was developed for the quantification of VK3 in human plasma by liquid chromatography-tandem mass spectrometry (LC-MS/MS) after derivatization with 3-mercaptopropionic acid via a Michael addition reaction. The derivative was identified from the mass spectra, and the derivatization conditions were optimized by considering different parameters. The method demonstrated high sensitivity, with a low limit of quantification of 0.03 ng mL⁻¹ for VK3, about 33-fold better than that for the direct analysis of the underivatized compound, and also showed good precision and reproducibility. It was applied in the determination of basal VK3 in human plasma and in a clinical pharmacokinetic study of menadiol sodium diphosphate. Furthermore, this is the first report of the quantification of VK3 using LC-MS/MS, and it will provide an important strategy for further research on VK3 and menadione analogs.
Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.
2018-02-01
In this study, a collaborative contest focused on LIBS data processing was conducted in an original way: the participants did not analyze the same samples on their own LIBS experiments, but instead shared a set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. A parametric study was then conducted to investigate the influence of each step of the data processing. This study was based on several analytical figures of merit, such as the determination coefficient, uncertainty, limit of quantification, and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing that the type of data extraction, the baseline modeling, and the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure, with the aim of optimizing the methodological steps toward the standardization of LIBS.
HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.
Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil
2017-04-01
Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for the quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the combination of selective extraction of the xanthophylls with analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery, and limits of detection and quantification. The method was applied for the simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations.
Brunner, J; Krummenauer, F; Lehr, H A
2000-04-01
Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with inbuilt graphic capture board provides versatile, easy to use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model
NASA Technical Reports Server (NTRS)
Nikbay, Melike; Heeg, Jennifer
2017-01-01
This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and to represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework and implemented reduced-order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.
In vivo behavior of NTBI revealed by automated quantification system.
Ito, Satoshi; Ikuta, Katsuya; Kato, Daisuke; Lynda, Addo; Shibusa, Kotoe; Niizeki, Noriyasu; Toki, Yasumichi; Hatayama, Mayumi; Yamamoto, Masayo; Shindo, Motohiro; Iizuka, Naomi; Kohgo, Yutaka; Fujiya, Mikihiro
2016-08-01
Non-transferrin-bound iron (NTBI), which appears in serum in iron overload, is thought to contribute to organ damage; the monitoring of serum NTBI levels may therefore be clinically useful in iron-overloaded patients. However, NTBI quantification methods remain complex, limiting their use in clinical practice. To overcome the technical difficulties often encountered, we recently developed a novel automated NTBI quantification system capable of measuring large numbers of samples. In the present study, we investigated the in vivo behavior of NTBI in human and animal serum using this newly established automated system. Average NTBI in healthy volunteers was 0.44 ± 0.076 μM (median 0.45 μM, range 0.28-0.66 μM), with no significant difference between sexes. Additionally, serum NTBI rapidly increased after iron loading, followed by a sudden disappearance. NTBI levels also decreased in inflammation. These results indicate that NTBI is a unique marker of iron metabolism, unlike other markers of iron metabolism such as serum ferritin. Our new automated NTBI quantification method may help to reveal the clinical significance of NTBI and contribute to our understanding of iron overload.
NASA Astrophysics Data System (ADS)
Jayasena, T.; Poljak, A.; Braidy, N.; Zhong, L.; Rowlands, B.; Muenchhoff, J.; Grant, R.; Smythe, G.; Teo, C.; Raftery, M.; Sachdev, P.
2016-10-01
Sirtuin proteins have a variety of intracellular targets, thereby regulating multiple biological pathways including neurodegeneration. However, relatively little is currently known about the role or expression of the 7 mammalian sirtuins in the central nervous system. Western blotting, PCR and ELISA are the main techniques currently used to measure sirtuin levels. To achieve sufficient sensitivity and selectivity in a multiplex format, a targeted mass spectrometric assay was developed and validated for the quantification of all seven mammalian sirtuins (SIRT1-7). Quantification of all peptides was performed by multiple reaction monitoring (MRM) using three mass transitions per protein-specific peptide, two specific peptides for each sirtuin and a stable isotope labelled internal standard. The assay was applied to a variety of samples including cultured brain cells, mammalian brain tissue, CSF and plasma. All sirtuin peptides were detected in the human brain, with SIRT2 being the most abundant. Sirtuins were also detected in human CSF and plasma, and in guinea pig and mouse tissues. In conclusion, we have successfully applied MRM mass spectrometry for the detection and quantification of sirtuin proteins in the central nervous system, paving the way for more quantitative and functional studies.
Singh, Gurmit; Koerner, Terence; Gelinas, Jean-Marc; Abbott, Michael; Brady, Beth; Huet, Anne-Catherine; Charlier, Caroline; Delahaut, Philippe; Godefroy, Samuel Benrejeb
2011-01-01
Malachite green (MG), a member of the N-methylated triphenylmethane class of dyes, has long been used to control fungal and protozoan infections in fish. MG is easily absorbed by fish during waterborne exposure and is rapidly metabolized into leucomalachite green (LMG), which is known for its long residence time in edible fish tissue. This paper describes the development of an enzyme-linked immunosorbent assay (ELISA) for the detection and quantification of LMG in fish tissue. This development includes a simple and versatile method for the conversion of LMG to monodesmethyl-LMG, which is then conjugated to bovine serum albumin (BSA) to produce an immunogenic material. Rabbit polyclonal antibodies are generated against this immunogen, purified and used to develop a direct competitive enzyme-linked immunosorbent assay (ELISA) for the screening and quantification of LMG in fish tissue. The assay performed well, with a limit of detection (LOD) and limit of quantification (LOQ) of 0.1 and 0.3 ng g−1 of fish tissue, respectively. The average extraction efficiency from a matrix of tilapia fillets was approximately 73% and the day-to-day reproducibility for these extractions in the assay was between 5 and 10%. PMID:21623496
Kruse, Niels; Mollenhauer, Brit
2015-11-01
The quantification of α-Synuclein in cerebrospinal fluid (CSF) as a biomarker has gained tremendous interest in recent years. Several commercially available immunoassays are emerging. We here describe the full validation of one commercially available ELISA assay for the quantification of α-Synuclein in human CSF (Covance alpha-Synuclein ELISA kit). The study was conducted within the BIOMARKAPD project in the European initiative Joint Program for Neurodegenerative Diseases (JPND). We investigated the effect of several pre-analytical and analytical confounders: (1) need for centrifugation of freshly drawn CSF, (2) sample stability, (3) delay of freezing, (4) volume of storage aliquots, (5) freeze/thaw cycles, (6) thawing conditions, (7) dilution linearity, (8) parallelism, (9) spike recovery, and (10) precision. None of these confounders influenced the levels of α-Synuclein in CSF significantly. We found a very high intra-assay precision. The inter-assay precision was lower than expected owing to different performance of the kit lots used. Overall, the validated immunoassay is useful for the quantification of α-Synuclein in human CSF. Copyright © 2015 Elsevier B.V. All rights reserved.
Hunsche, Mauricio; Noga, Georg
2009-12-01
In the present study the principle of energy dispersive X-ray microanalysis (EDX), i.e. the detection of elements based on their characteristic X-rays, was used to localise and quantify organic and inorganic pesticides on enzymatically isolated fruit cuticles. Pesticides could be discriminated from the plant surface because of their distinctive elemental composition. Findings confirm the close relation between net intensity (NI) and area covered by the active ingredient (AI area). Using wide and narrow concentration ranges of glyphosate and glufosinate, respectively, results showed that quantification of AI requires the selection of appropriate regression equations while considering NI, peak-to-background (P/B) ratio, and AI area. The use of selected internal standards (ISs) such as Ca(NO(3))(2) improved the accuracy of the quantification slightly but led to the formation of particular, non-typical microstructured deposits. The suitability of SEM-EDX as a general technique to quantify pesticides was evaluated additionally on 14 agrochemicals applied at diluted or regular concentration. Among the pesticides tested, spatial localisation and quantification of AI amount could be done for inorganic copper and sulfur as well for the organic agrochemicals glyphosate, glufosinate, bromoxynil and mancozeb. (c) 2009 Society of Chemical Industry.
NASA Astrophysics Data System (ADS)
Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie
2014-12-01
To address the multicollinearity issue and the unequal contribution of vascular parameters in the quantification of angiogenesis, we developed a quantitative evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemia model mice. Taking vascular volume as the ground-truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of the vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junction, connectivity density, segment number and vascular length, indicating that these were the key vascular parameters for the quantification of angiogenesis. The proposed quantitative evaluation method was compared with both ABTs directly using the nine vascular parameters and Pearson correlation, and the results were consistent. In contrast to ABTs directly using the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all the key vascular parameters were assembled into the sparse PCs with the highest relative importance.
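As a minimal sketch of the Pearson-correlation cross-check mentioned above (the function name and data layout are illustrative assumptions, not the study's code):

```python
import numpy as np

def rank_parameters_by_correlation(X, y, names):
    """Rank candidate vascular parameters (columns of X) by absolute
    Pearson correlation with the ground-truth parameter y
    (e.g., vascular volume)."""
    r = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    order = np.argsort(r)[::-1]          # strongest correlation first
    return [(names[j], r[j]) for j in order]
```

Unlike the sparse-PC approach of the study, a per-parameter correlation screen cannot account for redundancy among correlated parameters; it serves only as a consistency check.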
Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie
2012-01-01
Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variabilities because of upstream sample handling or incomplete trypsin digestion still need to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
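The isotope-dilution principle underlying PSAQ-SRM reduces, for each monitored transition, to a ratio against the labeled standard. The helper below is a schematic illustration (the function name and the relative response factor are assumptions), not the authors' pipeline:

```python
def quantify_by_internal_standard(analyte_area, standard_area,
                                  standard_conc, response_factor=1.0):
    """Stable-isotope dilution: the analyte concentration follows from
    the analyte/labeled-standard peak-area ratio, the known standard
    concentration, and a relative response factor (1.0 for a perfectly
    co-behaving isotopologue)."""
    return (analyte_area / standard_area) * standard_conc / response_factor
```

Because a full-length labeled protein standard is spiked before digestion, losses during sample handling and incomplete trypsin digestion affect analyte and standard alike, which is what makes the simple ratio accurate.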
Chang, Po-Chih; Reddy, P Muralidhar; Ho, Yen-Peng
2014-09-01
Stable-isotope dimethyl labeling was applied to the quantification of genetically modified (GM) soya. The herbicide-resistance-related protein 5-enolpyruvylshikimate-3-phosphate synthase (CP4 EPSPS) was labeled using a dimethyl labeling reagent, formaldehyde-H2 or -D2. The identification and quantification of CP4 EPSPS was performed using matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS). The CP4 EPSPS protein was separated from high-abundance proteins using strong anion exchange chromatography and sodium dodecyl sulfate-polyacrylamide gel electrophoresis. The tryptic peptides from the samples and the reference were then labeled with formaldehyde-H2 and formaldehyde-D2, respectively. The two labeled pools were mixed and analyzed using MALDI-MS. The data showed a good correlation between the peak ratio of the H- and D-labeled peptides and the GM soya percentages at 0.5, 1, 3, and 5%, with an R² of 0.99. The labeling reagents are readily available, and the labeling experiments and detection procedures are simple. The approach is useful for the quantification of GM soya at a level as low as 0.5%.
Seyer, Alexandre; Fenaille, François; Féraudet-Tarisse, Cecile; Volland, Hervé; Popoff, Michel R; Tabet, Jean-Claude; Junot, Christophe; Becher, François
2012-06-05
Epsilon toxin (ETX) is one of the most lethal toxins produced by Clostridium species and is considered as a potential bioterrorist weapon. Here, we present a rapid mass spectrometry-based method for ETX quantification in complex matrixes. As a prerequisite, naturally occurring prototoxin and toxin species were first structurally characterized by top-down and bottom-up experiments, to identify the most pertinent peptides for quantification. Following selective ETX immunoextraction and trypsin digestion, two proteotypic peptides shared by all the toxin forms were separated by ultraperformance liquid chromatography (UPLC) and monitored by ESI-MS (electrospray ionization-mass spectrometry) operating in the multiple reaction monitoring mode (MRM) with collision-induced dissociation. Thorough protocol optimization, i.e., a 15 min immunocapture, a 2 h enzymatic digestion, and an UPLC-MS/MS detection, allowed the whole quantification process including the calibration curve to be performed in less than 4 h, without compromising assay robustness and sensitivity. The assay sensitivity in milk and serum was estimated at 5 ng·mL(-1) for ETX, making this approach complementary to enzyme linked immunosorbent assay (ELISA) techniques.
NASA Astrophysics Data System (ADS)
Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana
2016-10-01
The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with some development and optimization, higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
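Underlying any droplet digital PCR quantification, multiplexed or not, is Poisson correction of the positive-droplet fraction. A minimal sketch follows; the ~0.85 nL droplet volume is the commonly cited figure for Bio-Rad droplet generators and is an assumption here, not a value from the study.

```python
import math

def ddpcr_concentration(positive_droplets, total_droplets,
                        droplet_volume_ul=0.00085):
    """Estimate target concentration (copies per µL of reaction) from
    droplet counts via Poisson statistics: lambda = -ln(1 - p), where p
    is the fraction of positive droplets."""
    p = positive_droplets / total_droplets
    mean_copies_per_droplet = -math.log(1.0 - p)
    return mean_copies_per_droplet / droplet_volume_ul
```

The Poisson correction accounts for droplets that received more than one target copy, which is why the estimate remains accurate well above 50% positive droplets.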
Ullrich, Sebastian; Neef, Sylvia K; Schmarr, Hans-Georg
2018-02-01
Low-molecular-weight volatile sulfur compounds such as thiols, sulfides, disulfides and thioacetates cause a sulfidic off-flavor in wines even at low concentration levels. The proposed analytical method for quantification of these compounds in wine is based on headspace solid-phase microextraction, followed by gas chromatographic analysis with sulfur-specific detection using a pulsed flame photometric detector. Robust quantification was achieved via a stable isotope dilution assay using commercial and synthesized deuterated isotopic standards. The necessary chromatographic separation of analytes and isotopic standards benefits from the inverse isotope effect realized on an apolar polydimethylsiloxane stationary phase of increased film thickness. Interferences with sulfur-specific detection in wine caused by sulfur dioxide were minimized by addition of propanal. The method provides adequate validation data, with good repeatability and limits of detection and quantification. It suits the requirements of wine quality management, allowing oenological treatments to be controlled so as to counteract the formation of excessively high concentrations of such malodorous compounds. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Nitric Oxide Analyzer Quantification of Plant S-Nitrosothiols.
Hussain, Adil; Yun, Byung-Wook; Loake, Gary J
2018-01-01
Nitric oxide (NO) is a small diatomic molecule that regulates multiple physiological processes in animals, plants, and microorganisms. In animals, it is involved in vasodilation and neurotransmission and is present in exhaled breath. In plants, it regulates both plant immune function and numerous developmental programs. The high reactivity and short half-life of NO and the cross-reactivity of its various derivatives make its quantification difficult. Different methods based on colorimetric, fluorometric, and chemiluminescent detection of NO and its derivatives are available, but all of them have significant limitations. Here we describe a method for the chemiluminescence-based quantification of NO using ozone-chemiluminescence technology in plants. This approach provides a sensitive, robust, and flexible means of determining the levels of NO and its signaling products, protein S-nitrosothiols.
NASA Astrophysics Data System (ADS)
Swinburne, Thomas D.; Perez, Danny
2018-05-01
A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.
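As a toy illustration of Bayesian residence-time estimation (not the paper's estimator): for an exponential escape process under a flat prior, observed escape counts and total observed residence time yield a Gamma posterior over the escape rate, from which a credible interval on the expected residence time follows by sampling.

```python
import numpy as np

rng = np.random.default_rng(0)

def residence_time_posterior(n_escapes, total_time, n_samples=10000):
    """Sample the Gamma posterior over the escape rate k (conjugate to
    exponential waiting times under a flat prior), then summarize the
    residence time tau = 1/k with its mean and a 95% credible interval."""
    k = rng.gamma(shape=n_escapes, scale=1.0 / total_time, size=n_samples)
    tau = 1.0 / k
    return tau.mean(), np.percentile(tau, [2.5, 97.5])
```

This is the single-state analogue of the idea in the abstract: the posterior width directly quantifies how well the known state space has been explored, and can steer further sampling.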
[Clinical and analytical toxicology of opiates, cocaine and amphetamines].
Feliu, Catherine; Fouley, Aurélie; Millart, Hervé; Gozalo, Claire; Marty, Hélène; Djerada, Zoubir
2015-01-01
In several circumstances, the determination and quantification of illicit drugs in biological fluids is decisive. Contexts are varied: driving under the influence, traffic accidents, clinical and forensic toxicology, doping analysis, and drug-facilitated crime (chemical submission). Whole blood is the preferred matrix for the quantification of illicit drugs. Gas chromatography coupled with mass spectrometry (GC-MS) is the gold standard for these analyses, and any newly developed method must perform at least as well. Nowadays, new technologies are available to biologists and clinicians: liquid chromatography coupled with mass spectrometry (LC/MS) or with tandem mass spectrometry (LC/MS/MS). The aim of this paper is to describe the state of the art of the mass spectrometric confirmation techniques used for the quantification of conventional drugs of abuse, cannabis excepted.
Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.
Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio
2018-01-01
Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification which competes with classical immunoaffinity techniques. It provides a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics and particularly for non-model species since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop a MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database underrepresented species Eriobotrya japonica Lindl.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dong, C.; Department of Physics, SAPIENZA University of Rome, Piazzale A. Moro 5, 00185, Rome; Corsetti, S.
2015-06-23
We report a phenomenological approach for the quantification of the diameter of magnetic nanoparticles (MNPs) incorporated in non-ionic surfactant vesicles (niosomes) using magnetic force microscopy (MFM). After a simple specimen preparation, i.e., by putting a drop of solution containing MNP-loaded niosomes on flat substrates, topography and MFM phase images are collected. To quantify the diameter of entrapped MNPs, the method is calibrated on bare MNPs deposited on the same substrates by analyzing the MFM signal as a function of the MNP diameter (at fixed tip-sample distance) and of the tip-sample distance (for selected MNPs). After calibration, the effective diameter of the MNPs entrapped in some niosomes is quantitatively deduced from MFM images.
Watkins, Preston S; Castellon, Benjamin T; Tseng, Chiyen; Wright, Moncie V; Matson, Cole W; Cobb, George P
2018-04-13
A consistent analytical method incorporating sulfuric acid (H2SO4) digestion and ICP-MS quantification has been developed for TiO2 quantification in biotic and abiotic environmentally relevant matrices. Sample digestion in H2SO4 at 110°C provided consistent results without using hydrofluoric acid or microwave digestion. Analysis of seven replicate samples for four matrices on each of 3 days produced Ti recoveries of 97% ± 2.5%, 91% ± 4.0%, 94% ± 1.8%, and 73% ± 2.6% (mean ± standard deviation) from water, fish tissue, periphyton, and sediment, respectively. The method demonstrated consistent performance in the analysis of water collected over a 1-month period.
Sobanska, Anna W; Pyzowski, Jaroslaw
2012-01-01
Ethylhexyl triazone (ET) was separated from other sunscreens such as avobenzone, octocrylene, octyl methoxycinnamate, and diethylamino hydroxybenzoyl hexyl benzoate and from parabens by normal-phase HPTLC on silica gel 60 as stationary phase. Two mobile phases were particularly effective: (A) cyclohexane-diethyl ether 1 : 1 (v/v) and (B) cyclohexane-diethyl ether-acetone 15 : 1 : 2 (v/v/v), since apart from ET analysis they facilitated separation and quantification of other sunscreens present in the formulations. Densitometric scanning was performed at 300 nm. Calibration curves for ET were nonlinear (second-degree polynomials), with R > 0.998. For both mobile phases, limits of detection (LOD) were 0.03 μg spot(-1) and limits of quantification (LOQ) 0.1 μg spot(-1). Both methods were validated.
Digital Droplet PCR: CNV Analysis and Other Applications.
Mazaika, Erica; Homsy, Jason
2014-07-14
Digital droplet PCR (ddPCR) is an assay that combines state-of-the-art microfluidics technology with TaqMan-based PCR to achieve precise target DNA quantification at high levels of sensitivity and specificity. Because quantification is achieved without the need for standard assays in an easy to interpret, unambiguous digital readout, ddPCR is far simpler, faster, and less error prone than real-time qPCR. The basic protocol can be modified with minor adjustments to suit a wide range of applications, such as CNV analysis, rare variant detection, SNP genotyping, and transcript quantification. This unit describes the ddPCR workflow in detail for the Bio-Rad QX100 system, but the theory and data interpretation are generalizable to any ddPCR system. Copyright © 2014 John Wiley & Sons, Inc.
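For the CNV application named above, the copy number follows from the ratio of target to reference concentrations measured by ddPCR. The sketch below is a hypothetical helper assuming a reference locus present at two copies per diploid genome:

```python
def copy_number(target_conc, reference_conc, reference_copies=2.0):
    """Estimate target copy number from ddPCR concentrations
    (copies/µL) relative to a reference locus assumed present at
    `reference_copies` per genome."""
    return reference_copies * target_conc / reference_conc
```

Because both concentrations come from the same partitioned reaction, pipetting and input-amount errors cancel in the ratio, which is what gives ddPCR CNV calls their precision without a standard curve.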
NASA Technical Reports Server (NTRS)
Goldman, Aaron
1999-01-01
The Langley-D.U. collaboration on the analysis of high-resolution infrared atmospheric spectra covered a number of important studies of trace gas identification and quantification from field spectra, and of spectral line parameter analysis. The collaborative work included: quantification and monitoring of trace gases from ground-based spectra available from various locations and seasons and from balloon flights; studies toward the identification and quantification of isotopic species, mostly oxygen and sulfur isotopes; searches for new species in the available spectra; updates of spectroscopic line parameters, combining laboratory and atmospheric spectra with theoretical spectroscopy methods; study of trends of atmospheric trace constituents; and algorithm development, retrieval intercomparisons and automation of the analysis of NDSC spectra, for both column amounts and vertical profiles.
Naveen, P.; Lingaraju, H. B.; Prasad, K. Shyam
2017-01-01
Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used as a traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase at a flow rate of 1.5 ml/min. The separation was done at 26°C using a Kinetex XB-C18 column as stationary phase, with detection at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. In linearity, a correlation coefficient above 0.999 indicated good fitting of the curve and good linearity. The intra- and inter-day precision showed < 1% relative standard deviation of peak area, indicating high reliability and reproducibility of the method. The recovery values at three spiking levels (50%, 100%, and 150%) were 100.47, 100.89, and 100.99, respectively, and a standard deviation < 1% shows the high accuracy of the method. The results remained unaffected by small variations in the analytical parameters, which shows the robustness of the method. Liquid chromatography-mass spectrometry analysis confirmed the presence of mangiferin with an m/z value of 421. The developed HPLC assay is simple, rapid, and reliable for the determination of mangiferin from M. indica. SUMMARY: The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica.
The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification and robustness by the International Conference on Harmonisation guidelines. This study showed that the developed HPLC assay is simple, rapid and reliable for the quantification of mangiferin from M. indica. Abbreviations used: M. indica: Mangifera indica; RP-HPLC: reversed-phase high-performance liquid chromatography; m/z: mass-to-charge ratio; ICH: International Conference on Harmonisation; % RSD: percentage relative standard deviation; ppm: parts per million; LOD: limit of detection; LOQ: limit of quantification. PMID:28539748
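Two of the validation metrics used above (precision as %RSD, spike recovery) reduce to one-liners. These helpers are illustrative sketches, not code from the paper:

```python
import statistics

def percent_rsd(replicate_areas):
    """Precision: relative standard deviation of replicate peak areas, in %."""
    return 100.0 * statistics.stdev(replicate_areas) / statistics.mean(replicate_areas)

def percent_recovery(found, added, baseline=0.0):
    """Accuracy: spike recovery, (found - baseline) / added, in %."""
    return 100.0 * (found - baseline) / added
```

Note that `statistics.stdev` is the sample (n-1) standard deviation, the convention usually intended when %RSD is reported for a small set of replicate injections.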
Prediction of autosomal STR typing success in ancient and Second World War bone samples.
Zupanič Pajnič, Irena; Zupanc, Tomaž; Balažic, Jože; Geršak, Živa Miriam; Stojković, Oliver; Skadrić, Ivan; Črešnar, Matija
2017-03-01
Human-specific quantitative PCR (qPCR) has been developed for forensic use in the last 10 years and is the preferred DNA quantification technique, since it is very accurate, sensitive, objective, time-effective and automatable. The amount of information that can be gleaned from a single quantification reaction using commercially available quantification kits has increased from the quantity of nuclear DNA to the amount of male DNA, the presence of inhibitors and, most recently, the degree of DNA degradation. In skeletal remains from disaster victims, missing persons and war conflict victims, the DNA is usually degraded. Therefore the new commercial qPCR kits able to assess the degree of degradation are potentially able to predict the success of downstream short tandem repeat (STR) typing. The goal of this study was to verify the quantification step using the PowerQuant kit with regard to its suitability as a screening method for autosomal STR typing success on ancient and Second World War (WWII) skeletal remains. We analysed 60 skeletons excavated from five archaeological sites and four WWII mass graves in Slovenia. The bones were cleaned, surface contamination was removed and the bones were ground to a powder. Genomic DNA was obtained from 0.5 g of bone powder after total demineralization. The DNA was purified using a Biorobot EZ1 device. Following PowerQuant quantification, DNA samples were subjected to autosomal STR amplification using the NGM kit. Up to 2.51 ng DNA/g of powder was extracted. No inhibition was detected in any of the bones analysed. 82% of the WWII bones gave full profiles, while 73% of the ancient bones gave profiles not suitable for interpretation. Four bone extracts yielded no detectable amplification or zero quantification results, and no profiles were obtained from any of them. Full or useful partial profiles were produced only from bone extracts in which the short autosomal (Auto) and long degradation (Deg) PowerQuant targets were both detected.
It is concluded that STR typing of old bones after quantification with the PowerQuant kit should be performed only when both the Auto and Deg targets are detected simultaneously, irrespective of the [Auto]/[Deg] ratio. Prediction of STR typing success can be made according to successful amplification of the Deg fragment. The PowerQuant kit is capable of identifying bone DNA samples that will not yield useful STR profiles with the NGM kit, and it can be used as a predictor of autosomal STR typing success for bone extracts obtained from ancient and WWII skeletal remains. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
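The screening rule the authors arrive at can be sketched as a small decision function (a hypothetical illustration, not the authors' code; the function and parameter names are my own):

```python
# Sketch of the screening rule described above: proceed to STR typing only
# when both the short autosomal (Auto) and long degradation (Deg) PowerQuant
# targets are detected; the [Auto]/[Deg] ratio itself is ignored.

def str_typing_recommended(auto_conc, deg_conc):
    """Return True if both Auto and Deg targets gave a positive quantification.

    auto_conc, deg_conc -- target concentrations (None or 0 = not detected).
    """
    def detected(c):
        return c is not None and c > 0
    return detected(auto_conc) and detected(deg_conc)

# Hypothetical bone extracts:
print(str_typing_recommended(0.12, 0.03))  # both targets detected -> True
print(str_typing_recommended(0.05, 0.0))   # Deg target missing    -> False
```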
Naveen, P; Lingaraju, H B; Prasad, K Shyam
2017-01-01
Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used in traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase at a flow rate of 1.5 ml/min. The separation was done at 26°C using a Kinetex XB-C18 column as the stationary phase, with detection at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation (ICH) guidelines. In the linearity study, a correlation coefficient greater than 0.999 indicated good fitting of the calibration curve. The intra- and inter-day precision showed < 1% relative standard deviation of peak area, indicating high reliability and reproducibility of the method. The recovery values at three spiking levels (50%, 100%, and 150%) were found to be 100.47%, 100.89%, and 100.99%, respectively, and a standard deviation < 1% shows the high accuracy of the method. The results remained unaffected by small variations in the analytical parameters, demonstrating the robustness of the method. Liquid chromatography-mass spectrometry analysis confirmed the presence of mangiferin with an m/z value of 421. The developed HPLC assay is thus simple, rapid, and reliable for the determination of mangiferin from M. indica.
Abbreviations used: M. indica: Mangifera indica; RP-HPLC: reversed-phase high-performance liquid chromatography; m/z: mass-to-charge ratio; ICH: International Conference on Harmonisation; % RSD: percentage relative standard deviation; ppm: parts per million; LOD: limit of detection; LOQ: limit of quantification.
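Two of the ICH validation metrics reported above, the calibration correlation coefficient and the percent relative standard deviation (%RSD) of replicate peak areas, can be computed as follows (a minimal sketch with hypothetical data, not the authors' calculation):

```python
import statistics

def correlation_coefficient(xs, ys):
    """Pearson r between concentrations xs and peak areas ys."""
    n = len(xs)
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def percent_rsd(values):
    """%RSD of replicate measurements (precision criterion: < 1%)."""
    return 100 * statistics.stdev(values) / statistics.fmean(values)

# Hypothetical calibration (concentration in ppm vs peak area), nearly linear:
conc = [10, 20, 40, 60, 80]
area = [102, 198, 405, 597, 801]
print(correlation_coefficient(conc, area) > 0.999)  # linearity acceptance
print(percent_rsd([405, 403, 407]) < 1.0)           # precision acceptance
```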
Dakota Graphical User Interface v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest; Glickman, Matthew; Gibson, Marcus
Graphical analysis environment for Sandia's Dakota software for optimization and uncertainty quantification. The Dakota GUI is an interactive graphical analysis environment for creating, running, and interpreting Dakota optimization and uncertainty quantification studies. It includes problem (Dakota study) set-up, option specification, simulation interfacing, analysis execution, and results visualization. Through the use of wizards, templates, and views, the Dakota GUI helps users navigate Dakota's complex capability landscape.
ERIC Educational Resources Information Center
Tweedie, John W.; Stowell, Kathryn M.
2005-01-01
A two-session laboratory exercise for advanced undergraduate students in biochemistry and molecular biology is described. The first session introduces students to DNA quantification by ultraviolet absorbance and agarose gel electrophoresis followed by ethidium bromide staining. The second session involves treatment of various topological forms of…
2014-04-01
A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems (report TR-14-33, April 2014; grant HDTRA1-09-1-0036; Donald Estep and Michael Holst). Approved for public release; distribution is unlimited. Related publication: "Barrier methods for critical exponent problems in geometric analysis and mathematical physics", J. Erway and M. Holst, submitted for publication.
Film Cooling in Fuel Rich Environments
2013-03-27
[List of figures: heat release analysis; enhanced blue value photographs for flame length quantification (M = 2.0, φ = 1.175, single row and triple row configurations); charts used to calculate trip height; spanwise view of coolant and quantification of flame length. The side window is shown with a ruler used to calibrate the images.]
Thermal In-Pouch Microwave Sterilization
2012-01-09
The report incorporates annexes addressing the above-mentioned five goals, the first being "Quantification of Hexanal in Yogurt and Extra Virgin Olive Oil as an Indicator of Photo-Oxidation". Hexanal arises in these foods (yogurt and extra virgin olive oil) from light-catalyzed degradation of linoleic acid; several alternative opacifying tactics were evaluated.
Neutron-Encoded Protein Quantification by Peptide Carbamylation
NASA Astrophysics Data System (ADS)
Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.
2014-01-01
We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.
Simple, Fast, and Sensitive Method for Quantification of Tellurite in Culture Media
Molina, Roberto C.; Burra, Radhika; Pérez-Donoso, José M.; Elías, Alex O.; Muñoz, Claudia; Montes, Rebecca A.; Chasteen, Thomas G.; Vásquez, Claudio C.
2010-01-01
A fast, simple, and reliable chemical method for tellurite quantification is described. The procedure is based on the NaBH₄-mediated reduction of TeO₃²⁻ followed by the spectrophotometric determination of elemental tellurium in solution. The method is highly reproducible, is stable at different pH values, and exhibits linearity over a broad range of tellurite concentrations. PMID:20525868
Carratalà, Anna; Rusinol, Marta; Hundesa, Ayalkibet; Biarnes, Mar; Rodriguez-Manzano, Jesus; Vantarakis, Apostolos; Kern, Anita; Suñen, Ester; Girones, Rosina; Bofill-Mas, Sílvia
2012-10-01
Poultry farming may introduce pathogens into the environment and food chains. High concentrations of chicken/turkey parvoviruses were detected in chicken stools and slaughterhouse and downstream urban wastewaters by applying new PCR-based specific detection and quantification techniques. Our results confirm that chicken/turkey parvoviruses may be useful viral indicators of poultry fecal contamination.
Dual Approach To Superquantile Estimation And Applications To Density Fitting
2016-06-01
We incorporate additional constraints to improve the fidelity of density estimates in tail regions. We limit our investigation to data with heavy tails, where risk quantification is typically the most difficult. Demonstrations are provided on samples of various heavy-tailed distributions. Subject terms: probability density estimation, epi-splines, optimization, risk quantification.
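The quantity at the centre of such reports, the superquantile (also known as conditional value-at-risk), has a simple empirical estimator: the average of the sample values at or beyond the α-quantile. The dual and epi-spline machinery of the report is not reproduced here; this is only the plain empirical estimator, as a sketch:

```python
def superquantile(samples, alpha):
    """Empirical superquantile (CVaR) at level alpha, 0 <= alpha < 1:
    the mean of the upper (1 - alpha) fraction of the sorted sample."""
    xs = sorted(samples)
    k = int(alpha * len(xs))   # index of the alpha-quantile in the sorted data
    tail = xs[k:]              # upper tail of the sample
    return sum(tail) / len(tail)

data = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
print(superquantile(data, 0.8))  # mean of the top 20%: (9 + 10) / 2 = 9.5
print(superquantile(data, 0.0))  # alpha = 0 reduces to the plain mean: 5.5
```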
Statistical evaluation of vibration analysis techniques
NASA Technical Reports Server (NTRS)
Milner, G. Martin; Miller, Patrice S.
1987-01-01
An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
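The detection-performance quantification mentioned above, probability of detection versus probability of false alarm, amounts to sweeping a threshold over detector scores from faulty and healthy runs. A hedged sketch with made-up scores (not the study's data or code):

```python
# Quantify detection performance as (Pfa, Pd) pairs: for each threshold,
# Pd = fraction of faulty-machinery scores at or above it,
# Pfa = fraction of healthy-machinery scores at or above it.

def roc_points(fault_scores, healthy_scores, thresholds):
    points = []
    for t in thresholds:
        pd = sum(s >= t for s in fault_scores) / len(fault_scores)
        pfa = sum(s >= t for s in healthy_scores) / len(healthy_scores)
        points.append((pfa, pd))
    return points

# Hypothetical vibration-metric scores:
faulty = [0.9, 0.8, 0.7, 0.6]
healthy = [0.5, 0.4, 0.3, 0.2]
print(roc_points(faulty, healthy, [0.55]))  # perfect separation: [(0.0, 1.0)]
```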
Lin, Yong-Qing; Zhang, Yilu; Li, Connie; Li, Louis; Zhang, Kelley; Li, Shawn
2012-01-01
To evaluate the dried blood spot (DBS) technique for ELISA quantification of larger biomolecular drugs, an anti-CD20 monoclonal antibody drug was used as an example. A method for the quantification of the anti-CD20 drug in human DBS was developed and validated. The drug standard and quality control samples prepared in fresh human blood were spotted on DBS cards and then extracted. A luminescent ELISA was used for quantification of the drug from DBS samples. The assay range of the anti-CD20 drug standards in DBS was 100-2500 ng/mL. The intra-assay precision (%CV) ranged from 0.4% to 10.1%, and the accuracy (%Recovery) ranged from 77.9% to 113.9%. The inter-assay precision (%CV) ranged from 5.9% to 17.4%, and the accuracy ranged from 81.5% to 110.5%. The DBS samples diluted 500- and 50-fold yielded recoveries of 88.7% and 90.7%, respectively. The preparation of DBS under higher and lower hematocrit conditions (53% and 35%) did not affect the recovery of the drug. Furthermore, the storage stability of the anti-CD20 drug on DBS cards was tested at various conditions. It was found that the anti-CD20 drug was stable for one week in DBS stored at room temperature. However, the stability was compromised in DBS stored at high humidity, at high temperature (55°C), or exposed to direct daylight for a week, as well as in samples stored at room temperature and high humidity for a month. Stability did not change significantly in samples that underwent 3 freeze/thaw cycles. Our results demonstrate the successful use of the DBS technique in ELISA quantification of an anti-CD20 monoclonal antibody drug in human blood. The stability data provide information regarding sample storage and shipping for future clinical studies. It is, therefore, concluded that the DBS technique is applicable to the quantification of other large biomolecule drugs or biomarkers. Copyright © 2011 Elsevier Inc. All rights reserved.
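The two validation statistics reported throughout the abstract, precision (%CV) and accuracy (%Recovery), follow standard formulas; a minimal sketch with hypothetical quality-control data (the formulas are the usual definitions, not taken from the paper):

```python
import statistics

def percent_cv(replicates):
    """Precision: coefficient of variation of replicates, in percent."""
    return 100 * statistics.stdev(replicates) / statistics.fmean(replicates)

def percent_recovery(measured_mean, nominal):
    """Accuracy: mean measured concentration relative to nominal, in percent."""
    return 100 * measured_mean / nominal

# Hypothetical QC sample, nominally 500 ng/mL, measured in triplicate:
qc = [480, 505, 490]
print(round(percent_cv(qc), 2))                                   # ~2.56 %CV
print(round(percent_recovery(statistics.fmean(qc), 500), 1))      # ~98.3 %
```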
qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch; Gallati, Sabina, E-mail: sabina.gallati@insel.ch; Schaller, Andre, E-mail: andre.schaller@insel.ch
2012-07-06
Highlights: • Serial qPCR accurately determines the fragmentation state of any given DNA sample. • Serial qPCR demonstrates different preservation of the nuclear and mitochondrial genome. • Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. • Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined to specific tissues; thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure reliable measurement of abnormal DNA content (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNase I digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, since analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be specified, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample.
By extrapolation of measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA), we present an approach that may allow measurements in degraded samples to be corrected in the future. To our knowledge, this is the first time the different degradation behaviour of the two genomes has been demonstrated and the impact of DNA degradation on quantification of mtDNA copy number systematically evaluated.
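A serial assay of this kind lends itself to a simple exponential-decay model: measured copy numbers C at increasing amplicon lengths l are assumed to follow C(l) = C0·exp(−λl), so a log-linear fit recovers the decay constant λ and the extrapolated undegraded copy number C0. The sketch below is my own simplification of that idea, not the authors' code:

```python
import math

def fit_decay(lengths, copies):
    """Fit C(l) = C0 * exp(-lam * l) by linear regression on log(copies)."""
    n = len(lengths)
    xs, ys = lengths, [math.log(c) for c in copies]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    lam = -slope                    # decay constant (per bp)
    c0 = math.exp(my - slope * mx)  # extrapolated copies at amplicon length 0
    return lam, c0

# Synthetic degraded sample generated with C0 = 1000 copies, lam = 0.005 per bp:
lengths = [100, 200, 400, 800]
copies = [1000 * math.exp(-0.005 * l) for l in lengths]
lam, c0 = fit_decay(lengths, copies)
print(round(lam, 4), round(c0))  # recovers 0.005 and 1000
```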
Matoušková, Petra; Bártíková, Hana; Boušová, Iva; Hanušová, Veronika; Szotáková, Barbora; Skálová, Lenka
2014-01-01
Obesity and the metabolic syndrome are an increasing health problem worldwide. Among other approaches, nutritional intervention using phytochemicals is an important method for the treatment and prevention of this disease. Recent studies have shown that certain phytochemicals can alter the expression of specific genes and microRNAs (miRNAs) that play a fundamental role in the pathogenesis of obesity. For the study of obesity and its treatment, monosodium glutamate (MSG)-injected mice, which develop central obesity, insulin resistance and liver lipid accumulation, are frequently used animal models. To understand the mechanism of phytochemical action in obese animals, the study of selected gene expression together with miRNA quantification is extremely important. For this purpose, real-time quantitative PCR is a sensitive and reproducible method, but it depends entirely on proper normalization. The aim of the present study was to identify appropriate reference genes for mRNA and miRNA quantification in MSG mice treated with green tea catechins, potential anti-obesity phytochemicals. Two sets of reference genes were tested: the first set contained seven genes commonly used for normalization of messenger RNA; the second set of candidate reference genes included ten small RNAs for normalization of miRNA. The expression stability of these reference genes was tested upon treatment of mice with catechins using the geNorm, NormFinder and BestKeeper algorithms. The selected normalizers for mRNA quantification were tested and validated on the expression of quinone oxidoreductase, a biotransformation enzyme known to be modified by catechins. The effect of the selected normalizers for miRNA quantification was tested on two obesity- and diabetes-related miRNAs, miR-221 and miR-29b, respectively. Finally, the combinations B2M/18S/HPRT1 and miR-16/sno234 were validated as optimal reference genes for mRNA and miRNA quantification in liver, and 18S/RPlP0/HPRT1 and sno234/miR-186 in small intestine of MSG mice.
These reference genes will be used for mRNA and miRNA normalization in further study of green tea catechins action in obese mice.
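The geNorm stability measure M used in such screens can be reimplemented in a few lines: for each candidate reference gene, M is the mean standard deviation of its log2 expression ratios against every other candidate across samples, and lower M means more stable. This is my own minimal sketch of that published measure, with invented expression values:

```python
import math
import statistics

def genorm_m(expr):
    """geNorm-style stability: expr maps gene -> expression values per sample."""
    genes = list(expr)
    m = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[h])]
            sds.append(statistics.stdev(ratios))
        m[g] = sum(sds) / len(sds)  # mean pairwise variation of gene g
    return m

# Hypothetical data: B2M and 18S co-vary tightly; 'noisy' fluctuates wildly.
expr = {"B2M":   [100, 110, 105, 95],
        "18S":   [200, 220, 210, 190],
        "noisy": [100, 300, 50, 400]}
m = genorm_m(expr)
print(min(m, key=m.get))  # one of the two stable, co-varying genes
```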
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
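The standard-curve workflow that quantGenius automates can be sketched as follows; the function names and the idealized dilution series are my own assumptions, not the tool's API:

```python
# Standard-curve qPCR quantification: fit Cq against log10(standard quantity),
# back-calculate sample quantities from their Cq values, then normalize the
# target gene to a reference gene.

def fit_standard_curve(log10_qty, cq):
    n = len(cq)
    mx, my = sum(log10_qty) / n, sum(cq) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(log10_qty, cq)) / \
            sum((x - mx) ** 2 for x in log10_qty)
    intercept = my - slope * mx
    efficiency = 10 ** (-1 / slope) - 1  # ideal slope ~ -3.32 -> ~100%
    return slope, intercept, efficiency

def quantity_from_cq(cq, slope, intercept):
    return 10 ** ((cq - intercept) / slope)

# Ideal synthetic dilution series (log10 quantity, Cq), slope -3.32:
standards = [(5, 15.0), (4, 18.32), (3, 21.64), (2, 24.96)]
slope, intercept, eff = fit_standard_curve(*zip(*standards))
target = quantity_from_cq(20.0, slope, intercept)
reference = quantity_from_cq(18.0, slope, intercept)
print(round(target / reference, 3))  # normalized expression ratio -> 0.25
```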
Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.
Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A
2017-04-01
Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, its main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for human plasma and entailed a single sample-preparation procedure, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time-consuming than previously reported methods because it requires only a single and simple sample-preparation procedure followed by an LC-MS method with a short run time. This analytical method is therefore useful for both clinical and research purposes.
Onofrejová, Lucia; Farková, Marta; Preisler, Jan
2009-04-13
The application of an internal standard in quantitative analysis is desirable in order to correct for variations in sample preparation and instrumental response. In mass spectrometry of organic compounds, the internal standard is preferably labelled with a stable isotope, such as (18)O, (15)N or (13)C. In this study, a method for the quantification of fructo-oligosaccharides using matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI TOF MS) was proposed and tested on raftilose, a partially hydrolysed inulin with a degree of polymerisation of 2-7. The tetrasaccharide nystose, which is chemically identical to the raftilose tetramer, was used as an internal standard rather than an isotope-labelled analyte. Two mathematical approaches used for data processing, conventional calculations and artificial neural networks (ANN), were compared. The conventional data processing relies on the assumption of a constant oligomer dispersion profile, which changes after the addition of the internal standard, and on some simple numerical calculations. The ANN, on the other hand, was found to compensate for a non-linear MALDI response and for variations in the oligomer dispersion profile with raftilose concentration. As a result, the application of ANN led to lower quantification errors and excellent day-to-day repeatability compared to the conventional data analysis. The developed method is feasible for MS quantification of raftilose in the range of 10-750 pg with errors below 7%. The content of raftilose was determined in dietary cream; the application can be extended to other similar polymers. It should be stressed that no special optimisation of the MALDI process was carried out. A common MALDI matrix and sample preparation were used, and only the basic parameters, such as sampling and laser energy, were optimised prior to quantification.
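The conventional internal-standard calculation mentioned above (the ANN variant is not reproduced here) reduces to reading the analyte amount off the ratio of analyte signal to the signal of a known spike of internal standard; a hedged sketch with hypothetical peak areas:

```python
def quantify_with_internal_standard(analyte_signal, is_signal, is_amount,
                                    response_factor=1.0):
    """Amount = (analyte signal / IS signal) * IS amount / response factor.

    response_factor corrects for unequal instrument response between the
    analyte and the internal standard (1.0 assumes equal response).
    """
    return analyte_signal / is_signal * is_amount / response_factor

# Hypothetical MALDI peak areas with a 100 pg nystose spike:
print(quantify_with_internal_standard(2400, 1200, 100))  # -> 200.0 pg
```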
Chevolleau, S; Noguer-Meireles, M-H; Jouanin, I; Naud, N; Pierre, F; Gueraud, F; Debrauwer, L
2018-04-15
Diets rich in red or processed meat have been shown to be associated with an elevated risk of colorectal cancer (CRC). One major hypothesis involves dietary heme iron, which induces lipid peroxidation. The quantification of the resulting reactive aldehydes (e.g. HNE and HHE) in the colon lumen is therefore of great interest, since these compounds are known for their cytotoxic and genotoxic properties. A UHPLC-ESI-MS/MS method has been developed and validated for HNE and HHE quantification in rat faeces. Samples were derivatised using a brominated reagent (BBHA) in the presence of pre-synthesized deuterated internal standards (HNE-d11/HHE-d5), extracted by solid phase extraction, and then analysed by LC-positive ESI-MS/MS (MRM) on a TSQ Vantage mass spectrometer. The use of BBHA allowed the efficient stabilisation of the unstable and reactive hydroxy-alkenals HNE and HHE. The MRM method allowed selective detection of HNE and HHE on the basis of characteristic transitions monitored from both the 79 and 81 bromine isotopic peaks. The method was validated according to the European Medicines Agency (EMA) guidelines, by determining selectivity, sensitivity, linearity, carry-over effect, recovery, matrix effect, repeatability, trueness and intermediate precision. The performance of the method enabled the quantification of HNE and HHE in faecal water at concentrations of 0.10-0.15 μM. Results are presented for the quantification of HNE and HHE in different faecal waters obtained from the faeces of rats fed diets with various fatty acid compositions, corresponding to different pro-oxidative features. Copyright © 2018 Elsevier B.V. All rights reserved.
Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier
2012-02-23
Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW=872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs such as the small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide with a high yield. Quantification was achieved by LC-MS in positive full-scan mode on a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a calibration range from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to analyze samples at higher concentrations of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and its internal standard (IS) was demonstrated under all conditions that samples would experience during routine analysis. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.
Srivastava, Nishi; Srivastava, Amit; Srivastava, Sharad; Rawat, Ajay Kumar Singh; Khan, Abdul Rahman
2016-03-01
A rapid, sensitive, selective and robust quantitative densitometric high-performance thin-layer chromatographic method was developed and validated for the separation and quantification of syringic acid (SYA) and kaempferol (KML) in the hydrolyzed extracts of Bergenia ciliata and Bergenia stracheyi. The separation was performed on silica gel 60F254 high-performance thin-layer chromatography plates using toluene:ethyl acetate:formic acid (5:4:1, v/v/v) as the mobile phase. The quantification of SYA and KML was carried out using a densitometric reflection/absorption mode at 290 nm. Dense spots of SYA and KML appeared on the developed plate at retention factor values of 0.61 ± 0.02 and 0.70 ± 0.01, respectively. A precise and accurate quantification was performed using linear regression analysis by plotting peak area vs concentration over 100-600 ng/band (correlation coefficient: r = 0.997, regression coefficient: R(2) = 0.996) for SYA and 100-600 ng/band (correlation coefficient: r = 0.995, regression coefficient: R(2) = 0.991) for KML. The developed method was validated in terms of accuracy, recovery and inter- and intraday precision as per the International Conference on Harmonisation guidelines. The limits of detection for SYA and KML were determined as 91.63 and 142.26 ng, and the limits of quantification as 277.67 and 431.09 ng, respectively. The statistical data analysis showed that the method is reproducible and selective for the estimation of SYA and KML in extracts of B. ciliata and B. stracheyi. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
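Limits of detection and quantification such as those reported above are commonly derived from the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S, where σ is the standard deviation of the response and S the calibration slope. The sketch below uses invented σ and S values, not the authors' data:

```python
def lod_loq(sigma, slope):
    """ICH-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical values: sigma = 2.5 area units, slope = 0.09 area units per ng
lod, loq = lod_loq(2.5, 0.09)
print(round(lod, 1), round(loq, 1))  # -> 91.7 277.8
```

Note that LOQ/LOD is always 10/3.3 ≈ 3.03 under these formulas, a quick consistency check on reported limit pairs.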
Silva, Raquel V S; Tessarolo, Nathalia S; Pereira, Vinícius B; Ximenes, Vitor L; Mendes, Fábio L; de Almeida, Marlon B B; Azevedo, Débora A
2017-03-01
The elucidation of bio-oil composition is important to evaluate the processes of biomass conversion and its upgrading, and to suggest the proper use for each sample. Comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (GC×GC-TOFMS) is a widely applied analytical approach for bio-oil investigation due to the higher separation and resolution capacity of this technique. This work addresses the issue of analytical performance to assess the comprehensive characterization of real bio-oil samples via GC×GC-TOFMS. The approach was applied to the individual quantification of compounds in real thermal (PWT), catalytic process (CPO), and hydrodeoxygenation process (HDO) bio-oils. Quantification was performed reliably using the analytical curves of oxygenated and hydrocarbon standards as well as deuterated internal standards. The limit of quantification was set at 1 ng µL⁻¹ for major standards, except for hexanoic acid, which was set at 5 ng µL⁻¹. The GC×GC-TOFMS method provided good precision (<10%) and excellent accuracy (recovery range of 70-130%) for the quantification of individual hydrocarbons and oxygenated compounds in real bio-oil samples. Sugars, furans, and alcohols appear as the major constituents of the PWT, CPO, and HDO samples, respectively. In order to obtain bio-oils with better quality, the catalytic pyrolysis process may be a better option than hydrogenation, due to the effective reduction of oxygenated compound concentrations and the lower cost of the process when hydrogen is not required to promote deoxygenation. Copyright © 2016 Elsevier B.V. All rights reserved.
Guehrs, Erik; Schneider, Michael; Günther, Christian M; Hessing, Piet; Heitz, Karen; Wittke, Doreen; López-Serrano Oliver, Ana; Jakubowski, Norbert; Plendl, Johanna; Eisebitt, Stefan; Haase, Andrea
2017-03-21
Quantification of nanoparticle (NP) uptake in cells or tissues is very important for safety assessment. Often, electron microscopy based approaches are used for this purpose, which allow imaging at very high resolution. However, precise quantification of NP numbers in cells and tissues remains challenging. The aim of this study was to present a novel approach that combines precise quantification of NPs in individual cells together with high resolution imaging of their intracellular distribution, based on focused ion beam/scanning electron microscopy (FIB/SEM) slice-and-view approaches. We quantified cellular uptake of 75 nm diameter citrate-stabilized silver NPs (Ag75Cit) into an individual human macrophage derived from monocytic THP-1 cells using a FIB/SEM slice-and-view approach. Cells were treated with 10 μg/ml for 24 h. We investigated a single cell and found in total 3138 ± 722 silver NPs inside this cell. Most of the silver NPs were located in large agglomerates; only a few were found in clusters of fewer than five NPs. Furthermore, we cross-checked our results by using inductively coupled plasma mass spectrometry and could confirm the FIB/SEM results. Our approach based on FIB/SEM slice-and-view is currently the only one that allows the quantification of the absolute dose of silver NPs in individual cells and, at the same time, assessment of their intracellular distribution at high resolution. We therefore propose to use FIB/SEM slice-and-view to systematically analyse the cellular uptake of various NPs as a function of size, concentration and incubation time.
Kapke, G E; Watson, G; Sheffler, S; Hunt, D; Frederick, C
1997-01-01
Several assays for quantification of DNA have been developed and are currently used in research and clinical laboratories. However, comparison of assay results has been difficult owing to the use of different standards and units of measurement, as well as differences between assays in dynamic range and quantification limits. Although a few studies have compared results generated by different assays, there has been no consensus on conversion factors, and thorough analysis has been precluded by small sample sizes and the limited dynamic ranges studied. In this study, we have compared the Chiron branched DNA (bDNA) and Abbott liquid hybridization assays for quantification of hepatitis B virus (HBV) DNA in clinical specimens and have derived conversion factors to facilitate comparison of assay results. Additivity and variance stabilizing (AVAS) regression, a form of non-linear regression analysis, was performed on assay results for specimens from HBV clinical trials. Our results show that there is a strong linear relationship (R2 = 0.96) between log Chiron and log Abbott assay results. Conversion factors derived from the regression analyses were found to be non-constant and ranged from 6 to 40. Analysis of paired assay results below and above each assay's limit of quantification (LOQ) indicated that a significantly (P < 0.01) larger proportion of observations were below the Abbott assay LOQ but above the Chiron assay LOQ, indicating that the Chiron assay is significantly more sensitive than the Abbott assay. Testing of replicate specimens showed that the Chiron assay consistently yielded lower percent coefficients of variation (%CVs) than the Abbott assay, indicating that the Chiron assay provides superior precision.
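Why a log-log relationship between two assays yields a non-constant conversion factor can be illustrated with a simpler linear fit than the AVAS regression used in the study (the numbers below are synthetic, not the study's data): if log10(Chiron) = a + b·log10(Abbott) with slope b ≠ 1, the ratio Chiron/Abbott necessarily changes with titre.

```python
import math

def fit_loglog(abbott, chiron):
    """Least-squares fit of log10(chiron) = a + b * log10(abbott)."""
    xs = [math.log10(v) for v in abbott]
    ys = [math.log10(v) for v in chiron]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def conversion_factor(abbott_value, a, b):
    """Predicted Chiron/Abbott ratio at a given Abbott result."""
    chiron = 10 ** (a + b * math.log10(abbott_value))
    return chiron / abbott_value

# Synthetic paired results generated with a = 1, b = 1.2:
abbott = [10, 100, 1000, 10000]
chiron = [10 ** (1 + 1.2 * math.log10(v)) for v in abbott]
a, b = fit_loglog(abbott, chiron)
print(round(conversion_factor(10, a, b), 1),
      round(conversion_factor(1000, a, b), 1))  # factor grows with titre
```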