Sample records for recurrence quantification analysis (RQA)

  1. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748
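The core object behind RQA and its multidimensional variant is the recurrence matrix of a series of state vectors. As a rough illustration of that idea, here is a minimal NumPy sketch (function names and the example radius are assumptions; this is not the MATLAB implementation from the paper's Appendix):

```python
import numpy as np

def recurrence_matrix(x, radius):
    """Binary recurrence matrix of a (possibly multidimensional) series.

    x has shape (n_samples, n_dims): each row is one state vector, e.g.
    the simultaneous signals of all group members in the MdRQA setting.
    Point j counts as a recurrence of point i when the Euclidean distance
    between the two state vectors is at most `radius`.
    """
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return (dist <= radius).astype(int)

def recurrence_rate(rm):
    """%REC: fraction of recurrent point pairs, main diagonal excluded."""
    n = rm.shape[0]
    return (rm.sum() - np.trace(rm)) / (n * (n - 1))
```

For MdRQA each row of `x` holds one sample of every component signal at once, so recurrences are defined at the group level rather than per individual.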

  2. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    PubMed

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels, from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.

  3. RP and RQA Analysis for Floating Potential Fluctuations in a DC Magnetron Sputtering Plasma

    NASA Astrophysics Data System (ADS)

    Sabavath, Gopikishan; Banerjee, I.; Mahapatra, S. K.

    2016-04-01

The nonlinear dynamics of a direct current magnetron sputtering plasma are visualized using the recurrence plot (RP) technique. RPs support recurrence quantification analysis (RQA), an efficient method for observing critical regime transitions in dynamics; RQA also provides further insight into the system's behavior. We observed the floating potential fluctuations of the plasma as a function of discharge voltage using a Langmuir probe. The system exhibits quasi-periodic-chaotic-quasi-periodic-chaotic transitions. These transitions are quantified using the determinism, Lmax, and entropy measures of RQA. Statistical measures such as kurtosis and skewness were also studied for these transitions and are in good agreement with the RQA results.

  4. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics.

    PubMed

    Jackson, Eric S; Tiede, Mark; Riley, Michael A; Whalen, D H

    2016-12-01

Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach, recurrence quantification analysis (RQA), via a procedural example and subsequent analysis of kinematic data. To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing just in length or in length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). Percent determinism (%DET) decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system.
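Two of the indices used here, %DET and MAXLINE, are both derived from the diagonal line structures of a recurrence plot. A minimal sketch of how they can be computed from a precomputed binary recurrence matrix (illustrative only; the function names and the minimum line length `lmin` are assumptions, not the study's settings):

```python
import numpy as np

def diagonal_lines(rm):
    """Lengths of diagonal line segments in a recurrence matrix,
    excluding the main diagonal. Only the upper triangle is scanned;
    for a symmetric RP this leaves %DET and MAXLINE unchanged."""
    lengths = []
    for k in range(1, rm.shape[0]):
        run = 0
        for v in np.diagonal(rm, offset=k):
            if v:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    return lengths

def det_and_maxline(rm, lmin=2):
    """%DET: fraction of recurrent points lying on diagonal lines of
    length >= lmin. MAXLINE: length of the longest diagonal line."""
    lengths = diagonal_lines(rm)
    total = sum(lengths)
    if total == 0:
        return 0.0, 0
    det = sum(l for l in lengths if l >= lmin) / total
    return det, max(lengths)
```

High %DET and MAXLINE indicate that recurrent points form long diagonal runs, i.e. stretches where the trajectory revisits earlier behavior in the same order.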

  5. Recurrence quantification analysis of global stock markets

    NASA Astrophysics Data System (ADS)

    Bastos, João A.; Caiado, Jorge

    2011-04-01

    This study investigates the presence of deterministic dependencies in international stock markets using recurrence plots and recurrence quantification analysis (RQA). The results are based on a large set of free float-adjusted market capitalization stock indices, covering a period of 15 years. The statistical tests suggest that the dynamics of stock prices in emerging markets is characterized by higher values of RQA measures when compared to their developed counterparts. The behavior of stock markets during critical financial events, such as the burst of the technology bubble, the Asian currency crisis, and the recent subprime mortgage crisis, is analyzed by performing RQA in sliding windows. It is shown that during these events stock markets exhibit a distinctive behavior that is characterized by temporary decreases in the fraction of recurrence points contained in diagonal and vertical structures.
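The sliding-window procedure used to track RQA measures through time can be illustrated with the simplest such measure, the recurrence rate. This is a hedged sketch: the window length and radius below are arbitrary, windows are non-overlapping for brevity, and a real analysis would work on embedded, normalized index data:

```python
import numpy as np

def windowed_recurrence_rate(series, window, radius):
    """Recurrence rate (%REC) computed in consecutive windows of a scalar
    series -- a minimal stand-in for sliding-window RQA."""
    series = np.asarray(series, dtype=float)
    rates = []
    for start in range(0, len(series) - window + 1, window):
        w = series[start:start + window]
        dist = np.abs(w[:, None] - w[None, :])
        rm = dist <= radius
        n = len(w)
        rates.append((rm.sum() - n) / (n * (n - 1)))  # exclude self-matches
    return np.array(rates)
```

Tracking such a measure through time is what reveals the temporary drops in diagonal and vertical recurrence structures that the study reports around critical financial events.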

  6. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics

    PubMed Central

    Tiede, Mark; Riley, Michael A.; Whalen, D. H.

    2016-01-01

    Purpose Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach—recurrence quantification analysis (RQA)—via a procedural example and subsequent analysis of kinematic data. Method To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing just in length or in length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). Results Percent determinism (%DET) decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. Conclusions This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system. PMID:27824987

  7. Assessing Spontaneous Combustion Instability with Recurrence Quantification Analysis

    NASA Technical Reports Server (NTRS)

    Eberhart, Chad J.; Casiano, Matthew J.

    2016-01-01

Spontaneous instabilities can pose a significant challenge to the verification of combustion stability, and characterizing their onset is an important avenue of improvement for stability assessments of liquid propellant rocket engines. Recurrence Quantification Analysis (RQA) is used here to explore nonlinear combustion dynamics that might give insight into instability. Multiple types of patterns representative of different dynamical states are identified within fluctuating chamber pressure data, and markers for impending instability are found. A class of metrics which describe these patterns is also calculated. RQA metrics are compared with and interpreted against another metric from nonlinear time series analysis, the Hurst exponent, to help better distinguish between stable and unstable operation.

  8. Recurrence quantification analysis of electrically evoked surface EMG signal.

    PubMed

    Liu, Chunling; Wang, Xu

    2005-01-01

The recurrence plot is a useful tool in time-series analysis, in particular for measuring unstable periodic orbits embedded in a chaotic dynamical system. This paper introduces the structure of the recurrence plot and how the plot is constructed, and then defines how the recurrence plot is quantified. One possible application of the recurrence quantification analysis (RQA) strategy, the analysis of electrically evoked surface EMG, is presented. The results show that percent determinism increases with stimulation intensity.

  9. Characterising non-linear dynamics in nocturnal breathing patterns of healthy infants using recurrence quantification analysis.

    PubMed

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2013-05-01

Breathing dynamics vary between infant sleep states and are likely to exhibit non-linear behaviour. This study applied the non-linear analytical tool recurrence quantification analysis (RQA) to 400-breath interval periods of REM and N-REM sleep, and then to an overlapping moving window. The RQA variables differed between sleep states, with REM radius 150% greater than N-REM radius, and REM laminarity 79% greater than N-REM laminarity. RQA allowed the observation of temporal variations in non-linear breathing dynamics across a night's sleep at 30 s resolution, and provides a basis for quantifying changes in complex breathing dynamics with physiology and pathology.
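Laminarity, one of the measures reported here, is the vertical-line analogue of determinism: the fraction of recurrent points that form vertical line structures. A hedged sketch of its computation from a binary recurrence matrix (the function name and minimum line length `vmin` are assumptions):

```python
import numpy as np

def laminarity(rm, vmin=2):
    """LAM: fraction of recurrent points forming vertical lines of
    length >= vmin in a recurrence matrix (main diagonal excluded)."""
    rm = rm.copy()
    np.fill_diagonal(rm, 0)          # ignore trivial self-recurrences
    lengths = []
    for col in rm.T:                 # scan each column for runs of ones
        run = 0
        for v in col:
            if v:
                run += 1
            elif run:
                lengths.append(run)
                run = 0
        if run:
            lengths.append(run)
    total = sum(lengths)
    if total == 0:
        return 0.0
    return sum(l for l in lengths if l >= vmin) / total
```

High laminarity indicates intervals in which the system is trapped in one state, which is why it is sensitive to the intermittent, state-dependent character of breathing dynamics.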

  10. Investigation of complexity dynamics in a DC glow discharge magnetized plasma using recurrence quantification analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitra, Vramori; Sarma, Bornali; Sarma, Arun

Recurrence is a ubiquitous feature which provides deep insights into the dynamics of real dynamical systems. A suitable tool for investigating recurrences is recurrence quantification analysis (RQA). It allows, e.g., the detection of regime transitions with respect to varying control parameters. We investigate the complexity of different coexisting nonlinear dynamical regimes of the plasma floating potential fluctuations at different magnetic fields and discharge voltages by using recurrence quantification variables, in particular DET, Lmax, and entropy. The recurrence analysis reveals that the predictability of the system strongly depends on the discharge voltage. Furthermore, the persistent behaviour of the plasma time series is characterized by the detrended fluctuation analysis technique to explore the complexity in terms of long-range correlation. Enhancing the discharge voltage at constant magnetic field increases the nonlinear correlations; hence, the complexity of the system decreases, which corroborates the RQA analysis.

  11. Recurrence quantification analysis of human postural fluctuations in older fallers and non-fallers.

    PubMed

    Ramdani, Sofiane; Tallon, Guillaume; Bernard, Pierre Louis; Blain, Hubert

    2013-08-01

We investigate postural sway data dynamics in older adult fallers and non-fallers. Center of pressure (COP) signals were recorded during quiet standing in 28 older adults. The subjects were divided into two groups: with and without a history of falls. COP time series were analyzed using recurrence quantification analysis (RQA) in both the anteroposterior and mediolateral (ML) directions. Classical stabilometric variables (path length and range) were also computed. The results showed that the RQA outputs quantifying predictability of COP fluctuations, and the Shannon entropy of the recurrence plot diagonal line length distribution, were significantly higher in fallers, but only for the ML direction. In addition, the range of the ML COP signals was also significantly higher in fallers. This result is in accordance with some findings in the literature and could be interpreted as an increased hip strategy in fallers. The RQA results seem coherent with the theory of loss of complexity with aging and disease. Our results suggest that RQA is a promising approach for the investigation of COP fluctuations in a frail population.

  12. Recurrence quantification as potential bio-markers for diagnosis of pre-cancer

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sabyasachi; Pratiher, Sawon; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2017-03-01

In this paper, spectroscopy signals are analyzed using recurrence plots (RPs), and recurrence quantification analysis (RQA) parameters are extracted from the RPs in order to classify tissues into normal and different precancerous grades. Three RQA parameters were quantified in order to extract the important features in the spectroscopy data. These features were fed to different classifiers for classification. Simulation results validate the efficacy of recurrence quantification as a potential bio-marker for the diagnosis of pre-cancer.

  13. Recurrence quantification analysis applied to spatiotemporal pattern analysis in high-density mapping of human atrial fibrillation.

    PubMed

    Zeemering, Stef; Bonizzi, Pietro; Maesen, Bart; Peeters, Ralf; Schotten, Ulrich

    2015-01-01

Spatiotemporal complexity of atrial fibrillation (AF) patterns is often quantified by annotated intracardiac contact mapping. We introduce a new approach that applies recurrence plot (RP) construction followed by recurrence quantification analysis (RQA) to epicardial atrial electrograms, recorded with a high-density grid of electrodes. In 32 patients with no history of AF (aAF, n=11), paroxysmal AF (PAF, n=12) and persistent AF (persAF, n=9), RPs were constructed using a phase space electrogram embedding dimension equal to the estimated AF cycle length. Spatial information was incorporated by 1) averaging the recurrence over all electrodes, and 2) by applying principal component analysis (PCA) to the matrix of embedded electrograms and selecting the first principal component as a representation of spatial diversity. Standard RQA parameters were computed on the constructed RPs and correlated to the number of fibrillation waves per AF cycle (NW). Averaged RP RQA parameters showed no correlation with NW. Correlations improved when applying PCA, with maximum correlation achieved between RP threshold and NW (RR1%, r=0.68, p < 0.001) and RP determinism (DET, r=-0.64, p < 0.001). All studied RQA parameters based on the PCA RP were able to discriminate between persAF and aAF/PAF (DET persAF 0.40 ± 0.11 vs. 0.59 ± 0.14/0.62 ± 0.16, p < 0.01). RP construction and RQA combined with PCA provide a quick and reliable tool to visualize dynamical behaviour and to assess the complexity of contact mapping patterns in AF.
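The phase-space embedding step mentioned above, reconstructing state vectors from a scalar signal with a chosen dimension and delay, can be sketched as a standard time-delay (Takens) embedding. This is a generic illustration, not the electrogram-specific construction of the study:

```python
import numpy as np

def delay_embed(x, dim, delay):
    """Time-delay (Takens) embedding of a scalar series.

    Returns an array of shape (n - (dim - 1) * delay, dim) whose rows are
    the reconstructed state vectors
    [x[i], x[i + delay], ..., x[i + (dim - 1) * delay]].
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay
    if n <= 0:
        raise ValueError("series too short for this dim/delay")
    return np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
```

A recurrence plot is then the pairwise-distance thresholding of these embedded vectors; choosing the dimension on the order of one oscillation cycle, as done here with the AF cycle length, makes each state vector span roughly one cycle of the signal.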

  14. An online sleep apnea detection method based on recurrence quantification analysis.

    PubMed

    Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen

    2014-07-01

This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture the nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. In order to obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several fixed-amount-of-nearest-neighbors thresholds for the recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and, hence, to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machine and neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results compared with the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a practical, efficient sleep apnea detection system.

  15. Multiscale recurrence quantification analysis of order recurrence plots

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian; Lin, Aijing

    2017-03-01

In this paper, we propose a new method, multiscale recurrence quantification analysis (MSRQA), to analyze the structure of order recurrence plots. MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), MSRQA can show richer and more recognizable information on the local characteristics of diverse systems, successfully describing their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ considerably from those at a single time scale, and some systems present more accurate recurrence patterns at large time scales. This demonstrates that the new approach is effective for distinguishing three similar stock market systems and for showing some inherent differences among them.

  16. Characterization of high-intensity, long-duration continuous auroral activity (HILDCAA) events using recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Mendes, Odim; Oliveira Domingues, Margarete; Echer, Ezequiel; Hajra, Rajkumar; Everton Menconi, Varlei

    2017-08-01

Considering magnetic reconnection and viscous interaction as the fundamental mechanisms for transferring particles and energy into the magnetosphere, we study the dynamical characteristics of the auroral electrojet (AE) index during high-intensity, long-duration continuous auroral activity (HILDCAA) events, using a long-term geomagnetic database (1975-2012) and other distinct interplanetary conditions (geomagnetically quiet intervals, co-rotating interaction regions (CIRs)/high-speed streams (HSSs) not followed by HILDCAAs, and AE events comprised in globally intense geomagnetic disturbances). We also study active but non-HILDCAA intervals. Examining the geomagnetic AE index, we apply a dynamics analysis composed of the phase space, recurrence plot (RP), and recurrence quantification analysis (RQA) methods. The quantification finds two distinct clusterings of dynamical behaviour occurring in the interplanetary medium: one corresponding to a geomagnetically quiet regime and the other to an interplanetary activity regime. Furthermore, the HILDCAAs seem to be unique events with respect to visible, intense manifestations of interplanetary Alfvénic waves; however, they are similar to the other kinds of conditions with respect to their dynamical signature (based on RQA), because they involve the same complex mechanism of generating geomagnetic disturbances. Also, by characterizing the transitions from quiescent conditions to weaker geomagnetic disturbances inside the magnetosphere-ionosphere system, the RQA method clearly indicates the two fundamental dynamics (geomagnetically quiet intervals and HILDCAA events) to be evaluated with magneto-hydrodynamic simulations to better understand the critical processes related to energy and particle transfer into the magnetosphere-ionosphere system. Finally, with this work, we have also reinforced the potential applicability of the RQA method for

  17. Application of recurrence quantification analysis to automatically estimate infant sleep states using a single channel of respiratory data.

    PubMed

    Terrill, Philip I; Wilson, Stephen J; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2012-08-01

Previous work has identified that non-linear variables calculated from respiratory data vary between sleep states, and that variables derived from the non-linear analytical tool recurrence quantification analysis (RQA) are accurate infant sleep state discriminators. This study aims to apply these discriminators to automatically classify 30 s epochs of infant sleep as REM, non-REM and wake. Polysomnograms were obtained from 25 healthy infants at 2 weeks, 3, 6 and 12 months of age, and manually sleep staged as wake, REM and non-REM. Inter-breath interval data were extracted from the respiratory inductive plethysmograph, and RQA applied to calculate radius, determinism and laminarity. Time-series statistics and spectral analysis variables were also calculated. A nested cross-validation method was used to identify the optimal feature subset, and to train and evaluate a linear discriminant analysis-based classifier. The RQA features radius and laminarity were reliably selected. Mean agreement was 79.7, 84.9, 84.0 and 79.2% at 2 weeks, 3, 6 and 12 months, and the classifier performed better than a comparison classifier not including RQA variables. The performance of this sleep-staging tool compares favourably with inter-human agreement rates, and improves upon previous systems using only respiratory data. Applications include diagnostic screening and population-based sleep research.

  18. Reliability of recurrence quantification analysis measures of the center of pressure during standing in individuals with musculoskeletal disorders.

    PubMed

    Mazaheri, Masood; Negahban, Hossein; Salavati, Mahyar; Sanjari, Mohammad Ali; Parnianpour, Mohamad

    2010-09-01

Although the application of nonlinear tools including recurrence quantification analysis (RQA) has grown in recent years, especially in balance-disordered populations, few studies have determined their measurement properties. Therefore, a methodological study was performed to estimate the intersession and intrasession reliability of some dynamic features provided by RQA for nonlinear analysis of center of pressure (COP) signals recorded during quiet standing in a sample of patients with musculoskeletal disorders (MSDs), including low back pain (LBP), anterior cruciate ligament (ACL) injury and functional ankle instability (FAI). The subjects completed postural measurements with three levels of difficulty (rigid surface-eyes open, rigid surface-eyes closed, and foam surface-eyes closed). Four RQA measures (% recurrence, % determinism, entropy, and trend) were extracted from the recurrence plot. Relative reliability of these measures was assessed using the intraclass correlation coefficient, and absolute reliability using the standard error of measurement and the coefficient of variation. % Determinism and entropy were the most reliable features of RQA for both the intersession and intrasession reliability measures. The high level of reliability of % determinism and entropy in this preliminary investigation may show their clinical promise for discriminative and evaluative purposes in balance performance.

  19. Recurrence quantification analysis of electroencephalograph signals during standard tasks of Waterloo-Stanford group scale of hypnotic susceptibility.

    PubMed

    Yargholi, Elahe'; Nasrabadi, Ali Motie

    2015-01-01

The purpose of this study was to apply recurrence quantification analysis (RQA) to hypnotic electroencephalograph (EEG) signals recorded after hypnotic induction, while subjects were performing standard tasks of the Waterloo-Stanford Group Scale (WSGS) of hypnotic susceptibility. Recurrence quantifiers were then used to analyse the influence of hypnotic depth on the EEGs. By applying this method, the capability of the tasks to distinguish subjects of different hypnotizability levels was determined. In addition, medium hypnotizable subjects showed the highest disposition to be inducted by the hypnotizer. Similarities between the brain's governing dynamics during tasks of the same type were also observed. The present study offers two notable innovations: investigating the EEGs of hypnotized subjects performing mental tasks of the WSGS, and applying RQA to hypnotic EEGs.

  20. Investigating chaotic features in solar radiation over a tropical station using recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Ogunjo, Samuel T.; Adediji, Adekunle T.; Dada, Joseph B.

    2017-01-01

The use of solar energy for power generation and other purposes is increasing. This demand necessitates a better understanding of the underlying dynamics for better prediction, and nonlinear dynamics and its associated tools readily lend themselves to such analysis. In this paper, nonlinearity in solar radiation data from a tropical station is tested using recurrence plots (RPs) and recurrence quantification analysis (RQA). The data were obtained from an ongoing campaign at the Federal University of Technology, Akure, Southwestern Nigeria, using an Integrated Sensor Suite (Vantage2 Pro). Half-hourly and daily values were tested for each month of the year, and both were found to be nonlinear. The dry months of the year exhibit higher chaoticity than the wet months, and the daily average values were found to be mildly chaotic. Using RQA, features in the solar radiation data due to external effects, such as the harmattan and the intertropical discontinuity (ITD), were uniquely identified.

  1. Characterization of QT and RR interval series during acute myocardial ischemia by means of recurrence quantification analysis.

    PubMed

    Peng, Yi; Sun, Zhongwei

    2011-01-01

This study aims to investigate the nonlinear dynamic properties of the fluctuations in ventricular repolarization and heart rate, and their correlation, during acute myocardial ischemia. From 13 ECG records in the long-term ST-T database, 170 ischemic episodes were selected, with durations of 34 s to 23 min 18 s, together with the two 5-min episodes immediately before and after each ischemic episode as non-ischemic episodes for comparison. QT intervals (QTI) and RR intervals (RRI) were extracted and ectopic beats were removed. Recurrence quantification analysis (RQA) was performed on the QTI and RRI series, respectively, and cross recurrence quantification analysis (CRQA) on the paired normalized QTI and RRI series. The Wilcoxon signed-rank test was used for statistical analysis. Results revealed that the RQA indexes for the QTI and RRI series had the same changing trend during ischemia, with more significantly changed indexes in the QTI series. In the CRQA, indexes related to the vertical and horizontal structures in the recurrence plot significantly increased, representing a decreased dependency of QTI on RRI. Both the QTI and RRI series showed reduced complexity during ischemia, with higher sensitivity in ventricular repolarization. The weakened coupling between QTI and RRI suggests a decreased influence of the sinoatrial node on QTI modulation during ischemia.
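Cross recurrence quantification starts from a cross-recurrence matrix, which compares the states of two different series rather than one series with itself. A minimal scalar sketch (the function name is an assumption; real CRQA, as in the study, would use embedded and normalized series):

```python
import numpy as np

def cross_recurrence(x, y, radius):
    """Cross-recurrence matrix between two scalar series:
    CR[i, j] = 1 when |x[i] - y[j]| <= radius. Unlike an auto-recurrence
    plot, the main diagonal is not trivially recurrent and the matrix
    need not be symmetric."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return (np.abs(x[:, None] - y[None, :]) <= radius).astype(int)
```

Line structures in this matrix quantify how strongly the two series co-evolve, which is the sense in which the study reads weakened QTI-RRI coupling from CRQA indexes.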

  2. The effect of orthostasis on recurrence quantification analysis of heart rate and blood pressure dynamics.

    PubMed

    Javorka, M; Turianikova, Z; Tonhajzerova, I; Javorka, K; Baumert, M

    2009-01-01

The purpose of this paper is to investigate the effect of orthostatic challenge on recurrence-plot-based complexity measures of heart rate and blood pressure variability (HRV and BPV). HRV and BPV complexities were assessed, based on recurrence quantification analysis, in 28 healthy subjects over 15 min in the supine and standing positions. HRV complexity was reduced along with the HRV magnitude after changing from the supine to the standing position. In contrast, the BPV magnitude increased and BPV complexity decreased upon standing. Recurrence quantification analysis (RQA) of HRV and BPV is sensitive to orthostatic challenge and might therefore be suited to assessing changes in autonomic neural outflow to the cardiovascular system.

  3. Application of recurrence quantification analysis for the automated identification of epileptic EEG signals.

    PubMed

    Acharya, U Rajendra; Sree, S Vinitha; Chattopadhyay, Subhagata; Yu, Wenwei; Ang, Peng Chuan Alvin

    2011-06-01

Epilepsy is a common neurological disorder characterized by the recurrence of seizures. Electroencephalogram (EEG) signals are widely used to diagnose seizures. Because of the non-linear and dynamic nature of EEG signals, it is difficult to decipher the subtle changes in these signals by visual inspection and by using linear techniques; therefore, non-linear methods are being researched to analyze them. In this work, we represent the recorded EEG signals as Recurrence Plots (RPs) and extract Recurrence Quantification Analysis (RQA) parameters from the RPs in order to classify the EEG signals into normal, ictal, and interictal classes. A Recurrence Plot is a graph that shows all the times at which a state of the dynamical system recurs. Studies have reported significantly different RQA parameters for the three classes; however, more studies are needed to develop classifiers that use these promising features and achieve good classification accuracy in differentiating the three types of EEG segments. Therefore, in this work, we used ten RQA parameters to quantify the important features in the EEG signals. These features were fed to seven different classifiers: Support Vector Machine (SVM), Gaussian Mixture Model (GMM), Fuzzy Sugeno Classifier, K-Nearest Neighbor (KNN), Naive Bayes Classifier (NBC), Decision Tree (DT), and Radial Basis Probabilistic Neural Network (RBPNN). Our results show that the SVM classifier was able to identify the EEG class with an average efficiency of 95.6%, and sensitivity and specificity of 98.9% and 97.8%, respectively.

  4. Recurrence analysis of ant activity patterns

    PubMed Central

    2017-01-01

In this study, we used recurrence quantification analysis (RQA) and recurrence plots (RPs) to compare the movement activity of individual workers of three ant species, as well as a gregarious beetle species. RQA and RPs quantify the number and duration of recurrences of a dynamical system, including a detailed quantification of signals that could be stochastic, deterministic, or both. First, we found substantial differences between the activity dynamics of beetles and ants, with the results suggesting that the beetles have quasi-periodic dynamics and the ants do not. Second, workers from different ant species varied with respect to their dynamics, presenting degrees of predictability as well as stochastic signals. Finally, differences were found between the minor and major castes of the same (dimorphic) ant species. Our results underscore the potential of RQA and RPs in the analysis of complex behavioral patterns, as well as in general inferences on animal behavior and other biological phenomena. PMID:29016648

  5. Improving the understanding of sleep apnea characterization using Recurrence Quantification Analysis by defining overall acceptable values for the dimensionality of the system, the delay, and the distance threshold

    PubMed Central

    Navarro-Mesa, Juan L.; Juliá-Serdá, Gabriel; Ramírez-Ávila, G. Marcelo; Ravelo-García, Antonio G.

    2018-01-01

    Our contribution focuses on the characterization of sleep apnea from a cardiac rate point of view, using Recurrence Quantification Analysis (RQA), based on a Heart Rate Variability (HRV) feature selection process. Three parameters are crucial in RQA: those related to the embedding process (dimension and delay) and the threshold distance. There are no overall accepted parameters for the study of HRV using RQA in sleep apnea. We focus on finding an overall acceptable combination, sweeping a range of values for each of them simultaneously. Together with the commonly used RQA measures, we include features related to recurrence times, and features originating in the complex network theory. To the best of our knowledge, no author has used them all for sleep apnea previously. The best performing feature subset is entered into a Linear Discriminant classifier. The best results in the “Apnea-ECG Physionet database” and the “HuGCDN2014 database” are, according to the area under the receiver operating characteristic curve, 0.93 (Accuracy: 86.33%) and 0.86 (Accuracy: 84.18%), respectively. Our system outperforms, using a relatively small set of features, previously existing studies in the context of sleep apnea. We conclude that working with dimensions around 7–8 and delays about 4–5, and using for the threshold distance the Fixed Amount of Nearest Neighbours (FAN) method with 5% of neighbours, yield the best results. Therefore, we would recommend these reference values for future work when applying RQA to the analysis of HRV in sleep apnea. We also conclude that, together with the commonly used vertical and diagonal RQA measures, there are newly used features that contribute valuable information for apnea minutes discrimination. Therefore, they are especially interesting for characterization purposes. Using two different databases supports that the conclusions reached are potentially generalizable, and are not limited by database variability. PMID:29621264

  6. Improving the understanding of sleep apnea characterization using Recurrence Quantification Analysis by defining overall acceptable values for the dimensionality of the system, the delay, and the distance threshold.

    PubMed

    Martín-González, Sofía; Navarro-Mesa, Juan L; Juliá-Serdá, Gabriel; Ramírez-Ávila, G Marcelo; Ravelo-García, Antonio G

    2018-01-01

    Our contribution focuses on the characterization of sleep apnea from a cardiac rate point of view, using Recurrence Quantification Analysis (RQA), based on a Heart Rate Variability (HRV) feature selection process. Three parameters are crucial in RQA: those related to the embedding process (dimension and delay) and the threshold distance. There are no generally accepted parameters for the study of HRV using RQA in sleep apnea. We focus on finding an overall acceptable combination, sweeping a range of values for each of them simultaneously. Together with the commonly used RQA measures, we include features related to recurrence times, and features originating in complex network theory. To the best of our knowledge, no author has used them all for sleep apnea previously. The best performing feature subset is entered into a Linear Discriminant classifier. The best results in the "Apnea-ECG Physionet database" and the "HuGCDN2014 database" are, according to the area under the receiver operating characteristic curve, 0.93 (Accuracy: 86.33%) and 0.86 (Accuracy: 84.18%), respectively. Using a relatively small set of features, our system outperforms previous studies in the context of sleep apnea. We conclude that working with dimensions around 7-8 and delays of about 4-5, and using the Fixed Amount of Nearest Neighbours (FAN) method with 5% of neighbours for the threshold distance, yields the best results. Therefore, we would recommend these reference values for future work when applying RQA to the analysis of HRV in sleep apnea. We also conclude that, together with the commonly used vertical and diagonal RQA measures, there are newly used features that contribute valuable information for the discrimination of apnea minutes. Therefore, they are especially interesting for characterization purposes. Using two different databases supports that the conclusions reached are potentially generalizable, and are not limited by database variability.

  7. Recurrence quantification analysis and support vector machines for golf handicap and low back pain EMG classification.

    PubMed

    Silva, Luís; Vaz, João Rocha; Castro, Maria António; Serranho, Pedro; Cabri, Jan; Pezarat-Correia, Pedro

    2015-08-01

    The quantification of non-linear characteristics of electromyography (EMG) should carry information that discriminates between neuromuscular strategies during dynamic skills. There is a lack of studies on muscle coordination under motor constraints during dynamic contractions. In golf, both handicap (Hc) and low back pain (LBP) are the main factors associated with the occurrence of injuries. The aim of this study was to analyze the accuracy of support vector machines (SVM) on EMG-based classification to discriminate Hc (low and high handicap) and LBP (with and without LBP) in the main phases of the golf swing. For this purpose, recurrence quantification analysis (RQA) features of the trunk and lower limb muscles were used to feed an SVM classifier. Recurrence rate (RR) and the ratio between determinism (DET) and RR showed a high discriminant power. The Hc accuracy for the swing, backswing, and downswing was 94.4±2.7%, 97.1±2.3%, and 95.3±2.6%, respectively. For LBP, the accuracy was 96.9±3.8% for the swing, and 99.7±0.4% in the backswing. External oblique (EO), biceps femoris (BF), semitendinosus (ST) and rectus femoris (RF) showed high accuracy depending on the laterality within the phase. RQA features and SVM showed a high muscle discriminant capacity within swing phases by Hc and by LBP. Low back pain golfers showed different neuromuscular coordination strategies when compared with asymptomatic golfers. Copyright © 2015 Elsevier Ltd. All rights reserved.
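
    The two features reported above as most discriminant, recurrence rate (RR) and the DET/RR ratio, can be computed from a binary recurrence matrix roughly as follows. This is a simplified sketch with illustrative data, not the published feature-extraction pipeline:

```python
def recurrence_matrix(series, radius):
    """Unembedded recurrence matrix: R[i][j] = 1 when the samples are close."""
    return [[1 if abs(a - b) <= radius else 0 for b in series] for a in series]

def recurrence_rate(rp):
    """RR: fraction of matrix cells that are recurrence points."""
    n = len(rp)
    return sum(map(sum, rp)) / (n * n)

def determinism(rp, lmin=2):
    """DET: fraction of recurrence points (above the main diagonal) that lie
    on diagonal lines of length >= lmin; high DET means a deterministic rhythm."""
    n = len(rp)
    on_lines = total = 0
    for d in range(1, n):                 # each diagonal above the main one
        run = 0
        for i in range(n - d):
            if rp[i][i + d]:
                total += 1
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
        if run >= lmin:
            on_lines += run
    return on_lines / total if total else 0.0

# A strictly periodic toy signal is fully deterministic: DET = 1.0
series = [t % 5 for t in range(40)]
rp = recurrence_matrix(series, radius=0.1)
rr, det = recurrence_rate(rp), determinism(rp)
```

    For this periodic toy signal every recurrence point sits on a long diagonal, so DET is 1.0 and the DET/RR ratio is driven entirely by how dense the recurrences are.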

  8. Comparative study of chaotic features in hourly wind speed using recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Adeniji, A. E.; Olusola, O. I.; Njah, A. N.

    2018-02-01

    Due to the shortage in electricity supply in Nigeria, there is a need to improve alternative power generation from wind energy by analysing the wind speed data available in some parts of the country, for a better understanding of its underlying dynamics for the purpose of good prediction and modelling. The wind speed data used in this study were collected over a period of two years by the National Space Research and Development Agency (NASRDA) from five different stations in the tropics, namely Abuja (7°50'02.09"N and 6°04'29.97"E), Akungba (6°59'05.40"N and 5°35'52.23"E), Nsukka (6°51'28.14"N and 7°24'28.15"E), Port Harcourt (4°47'05.41"N and 6°59'30.62"E), and Yola (9°17'33.58"N and 12°23'26.69"E). In this paper, recurrence plot (RP) and recurrence quantification analysis (RQA) are applied to investigate a non-linear deterministic dynamical process and non-stationarity in hourly wind speed data from the study areas. Using RQA for each month of the two years, it is observed that wind speed data for the wet months exhibit higher chaoticity than those of the dry months for all the stations, due to the strong and weak monsoonal effects during the wet and dry seasons, respectively. The results show that recurrence techniques are able to identify areas and periods for which the harvest of wind energy for power generation is good (high predictability) and poor (low predictability) in the study areas. This work also validates the RQA measures (Lmax, DET and ENT) used and establishes that they are related, as they give similar results for the dynamical characterization of the wind speed data.

  9. Prediction of protein structural classes by recurrence quantification analysis based on chaos game representation.

    PubMed

    Yang, Jian-Yi; Peng, Zhen-Ling; Yu, Zu-Guo; Zhang, Rui-Jie; Anh, Vo; Wang, Desheng

    2009-04-21

    In this paper, we intend to predict protein structural classes (alpha, beta, alpha+beta, or alpha/beta) for low-homology data sets. Two widely used data sets were studied, 1189 (containing 1092 proteins) and 25PDB (containing 1673 proteins), with sequence homology of 40% and 25%, respectively. We propose to decompose the chaos game representation of proteins into two kinds of time series. Then, a novel and powerful nonlinear analysis technique, recurrence quantification analysis (RQA), is applied to analyze these time series. For a given protein sequence, a total of 16 characteristic parameters can be calculated with RQA, which are treated as a feature representation of the protein sequence. Based on such feature representation, the structural class for each protein is predicted with Fisher's linear discriminant algorithm. The jackknife test is used to test and compare our method with other existing methods. The overall accuracies with the step-by-step procedure are 65.8% and 64.2% for the 1189 and 25PDB data sets, respectively. With the widely used one-against-others procedure, we compare our method with five other existing methods. In particular, the overall accuracies of our method are 6.3% and 4.1% higher for the two data sets, respectively. Furthermore, only 16 parameters are used in our method, fewer than in other methods. This suggests that the current method may play a complementary role to the existing methods and is promising for the prediction of protein structural classes.

  10. Attractor structure discriminates sleep states: recurrence plot analysis applied to infant breathing patterns.

    PubMed

    Terrill, Philip Ian; Wilson, Stephen James; Suresh, Sadasivam; Cooper, David M; Dakin, Carolyn

    2010-05-01

    Breathing patterns are characteristically different between infant active sleep (AS) and quiet sleep (QS), and statistical quantifications of interbreath interval (IBI) data have previously been used to discriminate between infant sleep states. It has also been identified that breathing patterns are governed by a nonlinear controller. This study aims to investigate whether nonlinear quantifications of infant IBI data are characteristically different between AS and QS, and whether they may be used to discriminate between these infant sleep states. Polysomnograms were obtained from 24 healthy infants at six months of age. Periods of AS and QS were identified, and IBI data extracted. Recurrence quantification analysis (RQA) was applied to each period, and recurrence was calculated for fixed radii in the range 0-8 in steps of 0.02, and for embedding dimensions of 4, 6, 8, and 16. When a threshold classifier was trained, the RQA variable recurrence was able to correctly classify 94.3% of periods in a test dataset. It was concluded that RQA of IBI data is able to accurately discriminate between infant sleep states. This is a promising step toward the development of a minimal-channel automatic sleep state classification system.
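
    The core of this approach, computing the RQA variable recurrence at a fixed radius and embedding dimension and then classifying with a learned threshold, can be sketched as follows. The signals, parameter values, and helper names are illustrative, not taken from the study:

```python
import math

def embed(series, dim, delay=1):
    """Time-delay embedding of a scalar series into dim-dimensional states."""
    n = len(series) - (dim - 1) * delay
    return [tuple(series[i + k * delay] for k in range(dim)) for i in range(n)]

def recurrence_rate(series, dim, radius):
    """Fraction of embedded state pairs within `radius` (maximum norm)."""
    pts = embed(series, dim)
    n = len(pts)
    hits = sum(1 for i in range(n) for j in range(n)
               if max(abs(a - b) for a, b in zip(pts[i], pts[j])) <= radius)
    return hits / (n * n)

def midpoint_threshold(class_a_rates, class_b_rates):
    """A minimal 'threshold classifier': the midpoint between class means."""
    mean = lambda xs: sum(xs) / len(xs)
    return (mean(class_a_rates) + mean(class_b_rates)) / 2

# Toy stand-ins: "quiet sleep" as a regular rhythm, "active sleep" as a more
# irregular two-frequency signal; the regular signal recurs more often
qs = [math.sin(0.4 * t) for t in range(80)]
act = [math.sin(0.4 * t) + 0.8 * math.sin(1.7 * t) for t in range(80)]
r_qs = recurrence_rate(qs, dim=4, radius=0.5)
r_act = recurrence_rate(act, dim=4, radius=0.5)
boundary = midpoint_threshold([r_qs], [r_act])
```

    In this toy setting the regular signal recurs more often than the irregular one, so a midpoint threshold on the recurrence values separates the two classes.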

  11. Recurrence quantity analysis based on singular value decomposition

    NASA Astrophysics Data System (ADS)

    Bian, Songhan; Shang, Pengjian

    2017-05-01

    The recurrence plot (RP) has turned into a powerful tool in many different sciences over the last three decades. To quantify the complexity and structure of an RP, recurrence quantification analysis (RQA) has been developed based on measures of recurrence density, diagonal lines, vertical lines and horizontal lines. This paper studies the RP based on singular value decomposition, which offers a new perspective on RP analysis. The principal singular value proportion (PSVP) is proposed as a new RQA measure, where a bigger PSVP means higher complexity for a system. In contrast, a smaller PSVP reflects a regular and stable system. Considering the advantage of this method in detecting the complexity and periodicity of systems, several simulated and real-data experiments are chosen to examine the performance of this new RQA measure.
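
    Our reading of the proposed measure can be sketched as follows, using an illustrative toy matrix (the paper's exact construction of the recurrence matrix is not reproduced here):

```python
import numpy as np

def psvp(recurrence_matrix):
    """Principal singular value proportion: the largest singular value's share
    of the sum of all singular values of the recurrence matrix."""
    s = np.linalg.svd(np.asarray(recurrence_matrix, dtype=float),
                      compute_uv=False)
    return float(s[0] / s.sum())

# Toy recurrence matrix of a period-3 signal: R[i][j] = 1 iff i = j (mod 3).
# The matrix has rank 3 with three equal singular values, so PSVP = 1/3.
series = [t % 3 for t in range(30)]
rp = [[1.0 if a == b else 0.0 for b in series] for a in series]
value = psvp(rp)
```

    For this rank-3 toy matrix the three nonzero singular values are equal, so the principal one carries exactly a third of the total.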

  12. Detection of traffic incidents using nonlinear time series analysis

    NASA Astrophysics Data System (ADS)

    Fragkou, A. D.; Karakasidis, T. E.; Nathanail, E.

    2018-06-01

    In this study, we present results of the application of nonlinear time series analysis on traffic data for incident detection. More specifically, we analyze daily volume records of Attica Tollway (Greece) collected from sensors at various locations. The analysis was performed using the Recurrence Plot (RP) and Recurrence Quantification Analysis (RQA) methods on the volume data of the lane closest to the median. The results show that it is possible to identify, through the abrupt change in the dynamics of the system revealed by RPs and RQA, the occurrence of incidents on the freeway and to differentiate them from recurrent traffic congestion. The proposed methodology could be of interest for big data traffic analysis.

  13. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics

    ERIC Educational Resources Information Center

    Jackson, Eric S.; Tiede, Mark; Riley, Michael A.; Whalen, D. H.

    2016-01-01

    Purpose: Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach--recurrence quantification analysis (RQA)--via a procedural example…

  14. Detecting intrinsic dynamics of traffic flow with recurrence analysis and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Xiong, Hui; Shang, Pengjian; Bian, Songhan

    2017-05-01

    In this paper, we apply the empirical mode decomposition (EMD) method to the recurrence plot (RP) and recurrence quantification analysis (RQA) to evaluate the frequency- and time-evolving dynamics of traffic flow. Based on the cumulative intrinsic mode functions extracted by the EMD, the frequency-evolving RP regarding different oscillation modes suggests that the apparent dynamics of the data considered are mainly dominated by the medium- and low-frequency components while severely affected by fast-oscillating noise contained in the signal. The noise is then eliminated to analyze the intrinsic dynamics and, consequently, the denoised time-evolving RQA diversely characterizes the properties of the signal and marks crucial points more accurately where white bands in the RP occur, whereas a strongly qualitative agreement exists between all the non-denoised RQA measures. Generally, the EMD combined with recurrence analysis sheds more reliable and abundant light on the intrinsic dynamics of traffic flow, which is meaningful for the empirical analysis of complex systems.

  15. Characterization of local complex structures in a recurrence plot to improve nonlinear dynamic discriminant analysis.

    PubMed

    Ding, Hang

    2014-01-01

    Structures in recurrence plots (RPs), preserving the rich information of nonlinear invariants and trajectory characteristics, have been increasingly analyzed in dynamic discrimination studies. The conventional analysis of RPs is mainly focused on quantifying the overall diagonal and vertical line structures through a method called recurrence quantification analysis (RQA). This study extensively explores the information in RPs by quantifying local complex RP structures. To do this, an approach was developed to analyze the combination of three major RQA variables: determinism, laminarity, and recurrence rate (DLR) in a metawindow moving over an RP. It was then evaluated in two experiments discriminating (1) ideal nonlinear dynamic series emulated from the Lorenz system with different control parameters and (2) data sets of human heart rate regulation with normal sinus rhythm (n = 18) and congestive heart failure (n = 29). Finally, DLR was compared with seven major RQA variables in terms of discriminatory power, measured by standardized mean difference (DSMD). In the two experiments, DLR resulted in the highest discriminatory power, with DSMD = 2.53 and 0.98, respectively, which were 7.41 and 2.09 times the best performance from RQA. The study also revealed that the optimal RP structures for the discriminations were neither typical diagonal structures nor vertical structures. These findings indicate that local complex RP structures contain rich information unexploited by RQA. Therefore, future research to extensively analyze complex RP structures could potentially improve the effectiveness of RP analysis in dynamic discrimination studies.
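
    The metawindow idea can be sketched as follows: slide a window over the RP and collect local RQA variables at each position. This is a schematic illustration (window size, step, toy signal, and the reduced RR/LAM feature pair are ours, not the study's DLR implementation):

```python
def recurrence_matrix(series, radius):
    """Binary recurrence matrix of a scalar series."""
    return [[1 if abs(a - b) <= radius else 0 for b in series] for a in series]

def window_features(rp, top, left, size):
    """Local RR and LAM inside one metawindow of the recurrence plot."""
    window = [row[left:left + size] for row in rp[top:top + size]]
    points = sum(map(sum, window))
    rr = points / (size * size)
    # laminarity: fraction of recurrence points on vertical lines of length >= 2
    on_lines = 0
    for j in range(size):
        run = 0
        for i in range(size):
            if window[i][j]:
                run += 1
            else:
                if run >= 2:
                    on_lines += run
                run = 0
        if run >= 2:
            on_lines += run
    lam = on_lines / points if points else 0.0
    return rr, lam

def scan(rp, size, step):
    """Collect (RR, LAM) for every metawindow position."""
    n = len(rp)
    return [window_features(rp, i, j, size)
            for i in range(0, n - size + 1, step)
            for j in range(0, n - size + 1, step)]

# Degenerate but instructive toy: a constant signal is maximally laminar,
# so every metawindow reports RR = 1.0 and LAM = 1.0
rp = recurrence_matrix([1.0] * 12, radius=0.1)
features = scan(rp, size=4, step=4)
```

    On real data the per-window feature vectors differ across the plot, and it is that local variation that a metawindow approach feeds into the discriminant analysis.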

  16. Applying the recurrence quantification analysis method for analyzing the recurrence of simulated multiple African easterly waves in 2006

    NASA Astrophysics Data System (ADS)

    Reyes, T.; Shen, B. W.; Wu, Y.; Faghih-Naini, S.; Li, J.

    2017-12-01

    In late August 2006, six African easterly waves (AEWs) appeared sequentially over the African continent during a 30-day period. With a global model of 1/4 degree resolution, the statistics of these AEWs were realistically captured. More interestingly, the formation, subsequent intensification, and movement of Hurricane Helene (2006) were simulated to a degree of satisfaction during the model integration from Day 22 to 30 (Shen et al., 2010). We then developed a parallel ensemble empirical mode decomposition method (PEEMD; Shen et al. 2012; 2017; Cheung et al. 2013) to reveal the role of downscaling processes associated with the environmental flows in determining the timing and location of Helene's formation (Wu and Shen, 2016), supporting its practical predictability at extended-range time scales. Recently, further analysis of the correlation coefficients (CCs) between the simulated temperature and reanalysis data showed that the CCs are above 0.65 during the 30-day simulations but display oscillations. While high CCs are consistent with the accurate simulations of the AEWs and Hurricane Helene, the oscillations may indicate inaccurate simulations of moving speeds (i.e., an inaccurate phase) as compared to observations. The observed AEWs have comparable but slightly different periods. To quantitatively examine this space-varying feature in the observations and the temporal oscillations in the CCs of the simulations, we select recurrence quantification analysis (RQA) methods and the recurrence plot (RP) in order to account for the local nature of these features. A recurrence is defined when the trajectory returns to the neighborhood of a previously visited state. With the RQA methods, we can compute the "recurrence rate" and "determinism" present in the RP in order to reveal the degree of recurrence and determinism (or "predictability") of the recurrent solutions. To verify our implementations in Python, we applied our methods to analyze idealized solutions (e

  17. IEC 61267: Feasibility of type 1100 aluminium and a copper/aluminium combination for RQA beam qualities.

    PubMed

    Leong, David L; Rainford, Louise; Zhao, Wei; Brennan, Patrick C

    2016-01-01

    In the course of performance acceptance testing, benchmarking or quality control of X-ray imaging systems, it is sometimes necessary to harden the X-ray beam spectrum. IEC 61267 specifies materials and methods to accomplish beam hardening and, unfortunately, requires the use of 99.9% pure aluminium (Alloy 1190) for the RQA beam quality, which is expensive and difficult to obtain. Less expensive and more readily available filters, such as Alloy 1100 (99.0% pure) aluminium and copper/aluminium combinations, have been used clinically to produce RQA series without rigorous scientific investigation to support their use. In this paper, simulation and experimental methods are developed to determine the differences in beam quality using Alloy 1190 and Alloy 1100. Additional simulation investigated copper/aluminium combinations to produce RQA5 and outputs from this simulation are verified with laboratory tests using different filter samples. The results of the study demonstrate that although Alloy 1100 produces a harder beam spectrum compared to Alloy 1190, it is a reasonable substitute. A combination filter of 0.5 mm copper and 2 mm aluminium produced a spectrum closer to that of Alloy 1190 than Alloy 1100 with the added benefits of lower exposures and lower batch variability. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  18. Quantification of scaling exponents and dynamical complexity of microwave refractivity in a tropical climate

    NASA Astrophysics Data System (ADS)

    Fuwape, Ibiyinka A.; Ogunjo, Samuel T.

    2016-12-01

    The radio refractivity index is used to quantify the effect of atmospheric parameters on communication systems. Scaling and dynamical complexities of radio refractivity across different climatic zones of Nigeria have been studied. The scaling property of the radio refractivity across Nigeria was estimated from the Hurst exponent obtained using two different scaling methods, namely the rescaled range (R/S) analysis and detrended fluctuation analysis (DFA). The delay vector variance (DVV), largest Lyapunov exponent (λ1) and correlation dimension (D2) methods were used to investigate nonlinearity, and the results confirm the presence of a deterministic nonlinear profile in the radio refractivity time series. Recurrence quantification analysis (RQA) was used to quantify the degree of chaoticity in the radio refractivity across the different climatic zones. RQA was found to be a good measure for identifying the unique fingerprint and signature of chaotic time series data. Microwave radio refractivity was found to be persistent and chaotic in all the study locations. The dynamics of radio refractivity increases in complexity and chaoticity from the coastal region towards the Sahelian climate. The design, development and deployment of robust and reliable microwave communication links in the region will be greatly affected by the chaotic nature of radio refractivity in the region.

  19. Detection of the fracture zone by the method of recurrence plot

    NASA Astrophysics Data System (ADS)

    Hilarov, V. L.

    2017-12-01

    Recurrence plots (RPs) and recurrence quantification analysis (RQA) characteristics for the normal component of the displacement vector upon excitation of a defective steel plate by a sound pulse are analyzed. Different cases of the spatial distribution of defects (uniform and normal) are considered, and a difference in the RQA parameters in these cases is revealed.

  20. Nonlinear analysis of heart rate variability to assess the reaction of ewe fetuses undergoing fetal cardiac surgery.

    PubMed

    Del Gaudio, Costantino; Carotti, Adriano; Grigioni, Mauro; Morbiducci, Umberto

    2012-05-01

    Fetal cardiac surgery (FCS) represents a challenging issue for the in utero treatment of congenital heart defects. However, FCS has still not gained sufficient reliability for clinical practice due to an incompletely elucidated fetal stress response. For example, blood sampling can contribute to its onset, leading to fetoplacental unit dysfunction, one of the main causes of failure of the surgical procedure. In order to address this issue, the role of the autonomic control system during an experimental procedure of cardiac bypass on ewe fetuses was investigated by means of recurrence quantification analysis (RQA), a well-recognized method for the analysis of nonlinear systems. RQA was applied to time series extracted from fetal arterial pressure recordings before and after the cardiac bypass established by means of an extracorporeal circuit, including an axial blood pump, and taking advantage of the capability of the placenta to work as a natural oxygenator. Statistically significant correlations were found among RQA-based metrics and fetal blood gas data, suggesting the possibility of inferring the clinical status of the fetus from its hemodynamic signals. This study shows the relevance of RQA as a complementary tool for the monitoring of the fetal status during cardiac bypass.

  1. Recurrence quantification analysis to characterize cyclical components of environmental elemental exposures during fetal and postnatal development

    PubMed Central

    Austin, Christine; Gennings, Chris; Tammimies, Kristiina; Bölte, Sven; Arora, Manish

    2017-01-01

    Environmental exposures to essential and toxic elements may alter health trajectories, depending on the timing, intensity, and mixture of exposures. In epidemiologic studies, these factors are typically analyzed as a function of elemental concentrations in biological matrices measured at one or more points in time. Such an approach, however, fails to account for the temporal cyclicity in the metabolism of environmental chemicals, which, if perturbed, may lead to adverse health outcomes. Here, we conceptualize and apply a non-linear method, recurrence quantification analysis (RQA), to quantify cyclical components of prenatal and early postnatal exposure profiles for elements essential to normal development, including Zn, Mn, Mg, and Ca, and elements associated with deleterious health effects or narrow tolerance ranges, including Pb, As, and Cr. We found robust evidence of cyclical patterns in the metabolic profiles of nutrient elements, which we validated against randomized twin-surrogate time-series, and further found that nutrient dynamical properties differ from those of Cr, As, and Pb. Furthermore, we extended this approach to provide a novel method of quantifying dynamic interactions between two environmental exposures. To achieve this, we used cross-recurrence quantification analysis (CRQA), and found that elemental nutrient-nutrient interactions differed from those involving toxicants. These rhythmic regulatory interactions, which we characterize in two geographically distinct cohorts, have not previously been uncovered using traditional regression-based approaches, and may provide a critical unit of analysis for environmental and dietary exposures in epidemiological studies. PMID:29112980
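
    The dynamic coupling idea can be illustrated with a minimal cross-recurrence sketch (not the authors' CRQA pipeline): a cross-recurrence point marks times where one series is close to the other, and the recurrence rate along each diagonal recovers the lag at which two rhythms are coupled. All names and data here are toy assumptions:

```python
import math

def diagonal_cross_recurrence(x, y, radius, max_lag):
    """Cross-recurrence rate along each diagonal: the fraction of time points
    where x(t) falls within `radius` of y(t + lag)."""
    n = len(x)
    profile = {}
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(x[t], y[t + lag]) for t in range(n) if 0 <= t + lag < n]
        profile[lag] = sum(1 for a, b in pairs if abs(a - b) <= radius) / len(pairs)
    return profile

# Two synthetic rhythms; y is x delayed by exactly 5 samples
x = [math.sin(0.2 * t) for t in range(200)]
y = [math.sin(0.2 * (t - 5)) for t in range(200)]
profile = diagonal_cross_recurrence(x, y, radius=0.05, max_lag=10)
best_lag = max(profile, key=profile.get)   # the lag with maximal cross-recurrence
```

    The profile peaks at lag 5, the delay built into the toy data; on real exposure series the shape and height of this profile is the kind of coupling information CRQA summarizes.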

  2. Structure & Coupling of Semiotic Sets

    NASA Astrophysics Data System (ADS)

    Orsucci, Franco; Giuliani, Alessandro; Zbilut, Joseph

    2004-12-01

    We investigated the informational structure of written texts (also in the form of speech transcriptions) using Recurrence Quantification Analysis (RQA). The RQA technique provides a quantitative description of text sequences at the orthographic level in terms of structuring, and may be useful for a variety of linguistics-related studies. We used RQA to measure differences in linguistic samples from different subjects, who were divided into subgroups based on personality and cultural differences. We used RQA and KRQA (cross recurrence) to measure the coupling and synchronization during the conversation (semiotic interaction) of different subjects. We discuss the results both for the improvement of the methodology and for some general implications for neurocognitive science.

  3. The Social Attitude Empowerment of Biology Students: Implementation JiRQA Learning Strategy in Different Ethnics

    ERIC Educational Resources Information Center

    Bustami, Yakobus; Corebima, Aloysius Duran; Suarsini, Endang; Ibrohim

    2017-01-01

    The empowerment of social attitudes in higher education is indispensable. The aim of this research was to uncover the effect of the empowerment efforts on the social attitudes of multiethnic biology students through the implementation of the JiRQA learning strategy. This research was a quasi-experimental study with a 2 x 3 factorial design implemented on the…

  4. Mathematical and Computational Foundations of Recurrence Quantifications

    NASA Astrophysics Data System (ADS)

    Marwan, Norbert; Webber, Charles L.

    Real-world systems possess deterministic trajectories, phase singularities and noise. Dynamic trajectories have been studied in the temporal and frequency domains, but these are linear approaches. Basic to the field of nonlinear dynamics is the representation of trajectories in phase space. A variety of nonlinear tools, such as the Lyapunov exponent, Kolmogorov-Sinai entropy and correlation dimension, have successfully characterized trajectories in phase space, provided the systems studied were stationary in time. Ubiquitous in nature, however, are systems that are nonlinear and nonstationary and exist in noisy environments, all of which break the assumptions of otherwise powerful linear tools. What has been unfolding over the last quarter of a century, however, is the timely discovery and practical demonstration that the recurrences of system trajectories in phase space can provide important clues to the system designs from which they derive. In this chapter we will introduce the basics of recurrence plots (RP) and their quantification analysis (RQA). We will begin by summarizing the concept of phase space reconstruction. Then we will provide the mathematical underpinnings of recurrence plots, followed by the details of recurrence quantifications. Finally, we will discuss computational approaches that have been implemented to make recurrence strategies feasible and useful. As computers become faster and computer languages advance, younger generations of researchers will be stimulated and encouraged to capture nonlinear recurrence patterns and quantifications in even better formats. This particular branch of nonlinear dynamics remains wide open for the definition of new recurrence variables and new applications untouched to date.
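
    The pipeline summarized here, phase-space reconstruction followed by the recurrence plot, can be sketched in a few lines (a minimal illustration with assumed toy parameters):

```python
import math

def delay_embedding(series, dim, delay):
    """Reconstruct phase-space states from a scalar series (time-delay embedding)."""
    n = len(series) - (dim - 1) * delay
    return [[series[i + k * delay] for k in range(dim)] for i in range(n)]

def recurrence_plot(states, radius):
    """Binary recurrence matrix under the maximum (Chebyshev) norm."""
    def close(a, b):
        return max(abs(p - q) for p, q in zip(a, b)) <= radius
    return [[1 if close(a, b) else 0 for b in states] for a in states]

# Illustrative parameters only; in practice dim and delay are typically chosen
# with methods such as false nearest neighbours and mutual information
series = [math.sin(0.3 * t) for t in range(50)]
states = delay_embedding(series, dim=3, delay=2)
rp = recurrence_plot(states, radius=0.1)
```

    The main diagonal of the resulting matrix is always filled (every state recurs with itself), and a fixed-radius plot is symmetric; the diagonal and vertical structures that RQA quantifies live in the off-diagonal pattern.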

  5. Recurrence quantification analysis of extremes of maximum and minimum temperature patterns for different climate scenarios in the Mesochora catchment in Central-Western Greece

    NASA Astrophysics Data System (ADS)

    Panagoulia, Dionysia; Vlahogianni, Eleni I.

    2018-06-01

    A methodological framework based on nonlinear recurrence analysis is proposed to examine the historical evolution of extremes of maximum and minimum daily mean areal temperature patterns over time under different climate scenarios. The methodology is based on both historical data and atmospheric General Circulation Model (GCM) produced climate scenarios for the periods 1961-2000 and 2061-2100, which correspond to 1 × CO2 and 2 × CO2 scenarios. Historical data were derived from the actual daily observations coupled with atmospheric circulation patterns (CPs). The dynamics of the temperature was reconstructed in phase space from the time series of temperatures. The statistical comparison of different temperature patterns was based on discriminating statistics obtained by Recurrence Quantification Analysis (RQA). Moreover, the bootstrap method of Schinkel et al. (2009) was adopted to calculate the confidence bounds of the RQA parameters based on a structure-preserving resampling. The overall methodology was applied to the mountainous Mesochora catchment in Central-Western Greece. The results reveal substantial similarities between the historical maximum and minimum daily mean areal temperature statistical patterns and their confidence bounds, as well as the maximum and minimum temperature patterns in evolution under the 2 × CO2 scenario. Significant variability and non-stationary behaviour characterize all the climate series analyzed. Fundamental differences are produced from the historical and maximum 1 × CO2 scenarios, the maximum 1 × CO2 and minimum 1 × CO2 scenarios, as well as the confidence bounds for the two CO2 scenarios. The 2 × CO2 scenario reflects the strongest shifts in intensity, duration and frequency in temperature patterns. Such transitions can help scientists and policy makers to understand the effects of extreme temperature changes on water resources, economic development, and the health of ecosystems and hence to proceed to

  6. Recurrence quantity analysis based on matrix eigenvalues

    NASA Astrophysics Data System (ADS)

    Yang, Pengbo; Shang, Pengjian

    2018-06-01

    The recurrence plot is a powerful tool for the visualization and analysis of dynamical systems. Recurrence quantification analysis (RQA), based on the point density and the diagonal and vertical line structures in the recurrence plot, provides alternative measures to quantify the complexity of dynamical systems. In this paper, we present a new measure based on the recurrence matrix to quantify the dynamical properties of a given system. Matrix eigenvalues can reflect the basic characteristics of complex systems, so we show the properties of the system by exploring the eigenvalues of the recurrence matrix. Considering that Shannon entropy has been defined as a complexity measure, we propose the definition of the entropy of matrix eigenvalues (EOME) as a new RQA measure. We confirm that EOME can be used as a metric to quantify the behavior changes of the system. As a given dynamical system changes from a non-chaotic to a chaotic regime, the EOME will increase as well. Bigger EOME values imply higher complexity and lower predictability. We also study the effect of several factors on EOME, including the data length, the recurrence threshold, the embedding dimension, and additional noise. Finally, we demonstrate an application in physiology. The advantage of this measure lies in its high sensitivity and simple computation.
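
    A hedged sketch of how such a measure could be computed follows; this is our reading of the abstract, and the paper's exact definition of the eigenvalue distribution may differ:

```python
import math
import numpy as np

def eome(recurrence_matrix):
    """Entropy of matrix eigenvalues (EOME), read here as the Shannon entropy
    of the normalized absolute eigenvalues of the symmetric recurrence matrix."""
    eig = np.abs(np.linalg.eigvalsh(np.asarray(recurrence_matrix, dtype=float)))
    p = eig / eig.sum()
    # skip numerically-zero weights, which contribute nothing to the entropy
    return float(-sum(pi * math.log(pi) for pi in p if pi > 1e-12))

# Toy case: a period-4 recurrence matrix has four equal nonzero eigenvalues,
# a uniform eigenvalue distribution over four states, hence EOME = ln 4
series = [t % 4 for t in range(32)]
rp = [[1 if a == b else 0 for b in series] for a in series]
value = eome(rp)
```

    A regular system concentrates the eigenvalue weight on a few values (low entropy), while a more complex system spreads it out, which matches the abstract's claim that EOME grows as the system becomes chaotic.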

  7. Structure-related statistical singularities along protein sequences: a correlation study.

    PubMed

    Colafranceschi, Mauro; Colosimo, Alfredo; Zbilut, Joseph P; Uversky, Vladimir N; Giuliani, Alessandro

    2005-01-01

    A data set composed of 1141 proteins representative of all eukaryotic protein sequences in the Swiss-Prot Protein Knowledgebase was coded by seven physicochemical properties of amino acid residues. The resulting numerical profiles were submitted to correlation analysis after the application of a linear (simple mean) and a nonlinear (Recurrence Quantification Analysis, RQA) filter. The main RQA variables, Recurrence and Determinism, were subsequently analyzed by Principal Component Analysis. The RQA descriptors showed that (i) specific information, present neither in the codes nor in the amino acid composition, is embedded within protein sequences, and (ii) the most sensitive code for detecting ordered recurrent (deterministic) patterns of residues in protein sequences is the Miyazawa-Jernigan hydrophobicity scale. The most deterministic proteins in terms of the autocorrelation properties of their primary structures were found (i) to be involved in protein-protein and protein-DNA interactions and (ii) to display a significantly higher proportion of structural disorder with respect to the average of the data set. A study of the scaling behavior of the average determinism with the setting parameters of RQA (embedding dimension and radius) allows for the identification of patterns of minimal length (six residues) as possible markers of zones specifically prone to inter- and intramolecular interactions.

  8. Research on Zheng Classification Fusing Pulse Parameters in Coronary Heart Disease

    PubMed Central

    Guo, Rui; Wang, Yi-Qin; Xu, Jin; Yan, Hai-Xia; Yan, Jian-Jun; Li, Fu-Feng; Xu, Zhao-Xia; Xu, Wen-Jie

    2013-01-01

    This study was conducted to illustrate that nonlinear dynamic variables of Traditional Chinese Medicine (TCM) pulse can improve the performances of TCM Zheng classification models. Pulse recordings of 334 coronary heart disease (CHD) patients and 117 normal subjects were collected in this study. Recurrence quantification analysis (RQA) was employed to acquire nonlinear dynamic variables of pulse. TCM Zheng models in CHD were constructed, and predictions using a novel multilabel learning algorithm based on different datasets were carried out. Datasets were designed as follows: dataset1, TCM inquiry information including inspection information; dataset2, time-domain variables of pulse and dataset1; dataset3, RQA variables of pulse and dataset1; and dataset4, major principal components of RQA variables and dataset1. The performances of the different models for Zheng differentiation were compared. The model for Zheng differentiation based on RQA variables integrated with inquiry information had the best performance, whereas that based only on inquiry had the worst performance. Meanwhile, the model based on time-domain variables of pulse integrated with inquiry fell between the above two. This result showed that RQA variables of pulse can be used to construct models of TCM Zheng and improve the performance of Zheng differentiation models. PMID:23737839

  9. The Impact of Social-Cognitive Stress on Speech Variability, Determinism, and Stability in Adults Who Do and Do Not Stutter

    ERIC Educational Resources Information Center

    Jackson, Eric S.; Tiede, Mark; Beal, Deryk; Whalen, D. H.

    2016-01-01

    Purpose: This study examined the impact of social-cognitive stress on sentence-level speech variability, determinism, and stability in adults who stutter (AWS) and adults who do not stutter (AWNS). We demonstrated that complementing the spatiotemporal index (STI) with recurrence quantification analysis (RQA) provides a novel approach to both…

  10. Numerical characteristics of recurrence plots as applied to the evaluation of mechanical damage in materials

    NASA Astrophysics Data System (ADS)

    Hilarov, V. L.

    2017-09-01

    The response of a material with a random uniform distribution of pores to a sound impulse was studied. The behavior of the numerical characteristics of the recurrence plots (RP) of the normal displacement vector component depending on the degree of damage was investigated. It was shown that the recurrence quantification analysis (RQA) parameters could be very informative for sonic fault detection.

  11. Children’s looking preference for biological motion may be related to an affinity for mathematical chaos

    PubMed Central

    Haworth, Joshua L.; Kyvelidou, Anastasia; Fisher, Wayne; Stergiou, Nicholas

    2015-01-01

    Recognition of biological motion is pervasive in early child development. Further, viewing the movement behavior of others is a primary component of a child’s acquisition of complex, robust movement repertoires, through imitation and real-time coordinated action. We theorize that inherent to biological movements are particular qualities of mathematical chaos and complexity. We further posit that this character affords the rich and complex inter-dynamics throughout early motor development. Specifically, we explored whether children’s preference for biological motion may be related to an affinity for mathematical chaos. Cross recurrence quantification analysis (cRQA) was used to investigate the coordination of gaze and posture with various temporal structures (periodic, chaotic, and aperiodic) of the motion of an oscillating visual stimulus. Children appear to competently perceive and respond to chaotic motion, both in rate (cRQA-percent determinism) and duration (cRQA-maxline) of coordination. We interpret this to indicate that children not only recognize chaotic motion structures, but also have a preference for coordination with them. Further, stratification of our sample by age suggests that this preference may become refined with age. PMID:25852600

  12. Non Linear Assessment of Musical Consonance

    NASA Astrophysics Data System (ADS)

    Trulla, Lluis Lligoña; Guiliani, Alessandro; Zimatore, Giovanna; Colosimo, Alfredo; Zbilut, Joseph P.

    2005-12-01

    The position of intervals and the degree of musical consonance can be objectively explained by temporal series formed by mixing two pure sounds covering an octave. This result is achieved by means of Recurrence Quantification Analysis (RQA) without considering either overtones or physiological hypotheses. The obtained prediction of consonance can be considered a novel solution to Galileo's conjecture on the nature of consonance. It constitutes an objective link between musical performance and listeners' hearing activity.

  13. The Impact of Social-Cognitive Stress on Speech Variability, Determinism, and Stability in Adults Who Do and Do Not Stutter.

    PubMed

    Jackson, Eric S; Tiede, Mark; Beal, Deryk; Whalen, D H

    2016-12-01

    This study examined the impact of social-cognitive stress on sentence-level speech variability, determinism, and stability in adults who stutter (AWS) and adults who do not stutter (AWNS). We demonstrated that complementing the spatiotemporal index (STI) with recurrence quantification analysis (RQA) provides a novel approach to both assessing and interpreting speech variability in stuttering. Twenty AWS and 21 AWNS repeated sentences in audience and nonaudience conditions while their lip movements were tracked. Across-sentence variability was assessed via the STI; within-sentence determinism and stability were assessed via RQA. Compared with the AWNS, the AWS produced speech that was more variable across sentences and more deterministic and stable within sentences. Audience presence contributed to greater within-sentence determinism and stability in the AWS. A subset of AWS who were more susceptible to experiencing anxiety exhibited reduced across-sentence variability in the audience condition compared with the nonaudience condition. This study extends the assessment of speech variability in AWS and AWNS into the social-cognitive domain and demonstrates that the characterization of speech within sentences using RQA is complementary to the across-sentence STI measure. AWS seem to adopt a more restrictive, less flexible speaking approach in response to social-cognitive stress, which is presumably a strategy for maintaining observably fluent speech.
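The within-sentence determinism measure (%DET) used throughout the RQA literature can be sketched as the fraction of recurrent points that fall on diagonal lines of length at least `lmin`. This is a generic scalar-signal sketch with assumed names (`determinism`, `lmin`), not the authors' lip-kinematics pipeline, which operates on embedded multidimensional movement data.

```python
import numpy as np

def determinism(x, eps, lmin=2):
    """%DET: fraction of recurrent points lying on diagonal lines of
    length >= lmin in the thresholded recurrence plot of x."""
    x = np.asarray(x, dtype=float)
    R = (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)
    np.fill_diagonal(R, 0)               # exclude the line of identity
    n = R.shape[0]
    diag_pts = 0
    for k in range(-(n - 1), n):
        if k == 0:
            continue
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:  # sentinel 0 closes last run
            if v:
                run += 1
            else:
                if run >= lmin:          # count points in long-enough runs
                    diag_pts += run
                run = 0
    total = R.sum()
    return diag_pts / total if total else 0.0
```

A smooth periodic signal produces long diagonal structures (high %DET), whereas white noise scatters its recurrences (lower %DET), which is the sense in which "more deterministic" speech is read above.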

  14. An exploratory analysis of emotion dynamics between mothers and adolescents during conflict discussions.

    PubMed

    Main, Alexandra; Paxton, Alexandra; Dale, Rick

    2016-09-01

    Dynamic patterns of influence between parents and children have long been considered key to understanding family relationships. Despite this, most observational research on emotion in parent-child interactions examines global behaviors at the expense of exploring moment-to-moment fluctuations in emotions that are important for relational outcomes. Using recurrence quantification analysis (RQA) and growth curve analysis, this investigation explored emotion dynamics during parent-adolescent conflict interactions, focusing not only on concurrently shared emotional states but also on time-lagged synchrony of parents' and adolescents' emotions relative to one another. Mother-adolescent dyads engaged in a 10-min conflict discussion and reported on their satisfaction with the process and outcome of the discussion. Emotions were coded using the Specific Affect Coding System (SPAFF) and were collapsed into the following categories: negativity, positivity, and validation/interest. RQA and growth curve analyses revealed that negative and positive emotions were characterized by a concurrently synchronous pattern across all dyads, with the highest recurrence rates occurring around simultaneity. However, lower levels of concurrent synchrony of negative emotions were associated with higher discussion satisfaction. We also found that patterns of negativity differed with age: Mothers led negativity in dyads with younger adolescents, and adolescents led negativity in dyads with older adolescents. In contrast to negative and positive emotions, validation/interest showed the time-lagged pattern characteristic of turn-taking, and more highly satisfied dyads showed stronger patterns of time-lagged coordination in validation/interest. Our findings underscore the dynamic nature of emotions in parent-adolescent interactions and highlight the important contributions of these moment-to-moment dynamics toward overall interaction quality. (PsycINFO Database Record (c) 2016 APA, all rights reserved)
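The time-lagged synchrony analysis described above rests on cross-recurrence: the recurrence rate is computed along diagonals offset from the main one, and the lag with the highest rate indicates which partner leads. A minimal sketch with assumed names (`cross_recurrence`, `diagonal_profile`) and continuous signals; the study itself used categorical SPAFF emotion codes rather than the thresholded numeric series assumed here.

```python
import numpy as np

def cross_recurrence(a, b, eps):
    """Cross-recurrence matrix: CR[i, j] = 1 if |a_i - b_j| <= eps."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return (np.abs(a[:, None] - b[None, :]) <= eps).astype(float)

def diagonal_profile(CR, max_lag):
    """Recurrence rate at each diagonal offset; the peak lag estimates
    the lead-lag relationship between the two series."""
    return {k: float(np.diagonal(CR, k).mean())
            for k in range(-max_lag, max_lag + 1)}
```

If one series is a time-shifted copy of the other, the profile peaks at exactly that shift, which is the logic behind "mothers led negativity" versus "adolescents led negativity".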

  15. Dynamical characteristics of surface EMG signals of hand grasps via recurrence plot.

    PubMed

    Ouyang, Gaoxiang; Zhu, Xiangyang; Ju, Zhaojie; Liu, Honghai

    2014-01-01

    Recognizing human hand grasp movements through surface electromyogram (sEMG) is a challenging task. In this paper, we investigated nonlinear measures based on recurrence plot, as a tool to evaluate the hidden dynamical characteristics of sEMG during four different hand movements. A series of experimental tests in this study show that the dynamical characteristics of sEMG data with recurrence quantification analysis (RQA) can distinguish different hand grasp movements. Meanwhile, adaptive neuro-fuzzy inference system (ANFIS) is applied to evaluate the performance of the aforementioned measures to identify the grasp movements. The experimental results show that the recognition rate (99.1%) based on the combination of linear and nonlinear measures is much higher than those with only linear measures (93.4%) or nonlinear measures (88.1%). These results suggest that the RQA measures might be a potential tool to reveal the sEMG hidden characteristics of hand grasp movements and an effective supplement for the traditional linear grasp recognition methods.

  16. Dynamical transitions associated with turbulence in a helicon plasma

    NASA Astrophysics Data System (ADS)

    Light, Adam D.; Tian, Li; Chakraborty Thakur, Saikat; Tynan, George R.

    2017-10-01

    Diagnostic capabilities are often cited as a limiting factor in our understanding of transport in fusion devices. Increasingly advanced multichannel diagnostics are being applied to classify transport regimes and to search for "trigger" features that signal an oncoming dynamical event, such as an ELM or an L-H transition. In this work, we explore a technique that yields information about global properties of plasma dynamics from a single time series of a relevant plasma quantity. Electrostatic probe data from the Controlled Shear Decorrelation eXperiment (CSDX) is analyzed using recurrence quantification analysis (RQA) in the context of previous work on the transition to weak drift-wave turbulence. The recurrence characteristics of a phase space trajectory provide a quantitative means to classify dynamics and identify transitions in a complex system. We present and quantify dynamical variations in the plasma variables as a function of the background magnetic field strength. A dynamical transition corresponding to the emergence of broadband fluctuations is identified using RQA measures.

  17. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral count, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
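The length-normalized spectral-count idea mentioned above can be illustrated with a generic NSAF-style calculation. This is a textbook stand-in under assumed names (`nsaf`), not freeQuant's actual algorithm, which additionally handles shared peptides and MS/MS ion intensity.

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor (NSAF-style): each protein's
    spectral count is divided by its sequence length, then the ratios are
    normalized to sum to 1 across the sample."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]
```

Dividing by sequence length corrects for the fact that longer proteins yield more peptides (and hence more spectra) at equal molar abundance.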

  18. The Impact of Social–Cognitive Stress on Speech Variability, Determinism, and Stability in Adults Who Do and Do Not Stutter

    PubMed Central

    Tiede, Mark; Beal, Deryk; Whalen, D. H.

    2016-01-01

    Purpose This study examined the impact of social–cognitive stress on sentence-level speech variability, determinism, and stability in adults who stutter (AWS) and adults who do not stutter (AWNS). We demonstrated that complementing the spatiotemporal index (STI) with recurrence quantification analysis (RQA) provides a novel approach to both assessing and interpreting speech variability in stuttering. Method Twenty AWS and 21 AWNS repeated sentences in audience and nonaudience conditions while their lip movements were tracked. Across-sentence variability was assessed via the STI; within-sentence determinism and stability were assessed via RQA. Results Compared with the AWNS, the AWS produced speech that was more variable across sentences and more deterministic and stable within sentences. Audience presence contributed to greater within-sentence determinism and stability in the AWS. A subset of AWS who were more susceptible to experiencing anxiety exhibited reduced across-sentence variability in the audience condition compared with the nonaudience condition. Conclusions This study extends the assessment of speech variability in AWS and AWNS into the social–cognitive domain and demonstrates that the characterization of speech within sentences using RQA is complementary to the across-sentence STI measure. AWS seem to adopt a more restrictive, less flexible speaking approach in response to social–cognitive stress, which is presumably a strategy for maintaining observably fluent speech. PMID:27936276

  19. Dynamics of a two-phase flow through a minichannel: Transition from churn to slug flow

    NASA Astrophysics Data System (ADS)

    Górski, Grzegorz; Litak, Grzegorz; Mosdorf, Romuald; Rysak, Andrzej

    2016-04-01

    The churn-to-slug flow bifurcations of two-phase (air-water) flow patterns in a 2mm diameter minichannel were investigated. With increasing a water flow rate, we observed the transition of slugs to bubbles of different sizes. The process was recorded by a digital camera. The sequences of light transmission time series were recorded by a laser-phototransistor sensor, and then analyzed using the recurrence plots and recurrence quantification analysis (RQA). Due to volume dependence of bubbles velocities, we observed the formation of periodic modulations in the laser signal.

  20. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    DOE PAGES

    Brown, C. S.; Zhang, Hongbin

    2016-05-24

    Uncertainty quantification and sensitivity analysis are important for nuclear reactor safety design and analysis. A 2x2 fuel assembly core design was developed and simulated by the Virtual Environment for Reactor Applications, Core Simulator (VERA-CS), a coupled neutronics and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed, and a new toolkit was created to perform uncertainty quantification and sensitivity analysis with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in the sensitivity analysis, and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
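The correlation-based sensitivity measures named above (Pearson and Spearman coefficients between an uncertain input and a figure of merit) can be sketched directly. The helper names `pearson` and `spearman` are assumptions, ties in the rank transform are not handled, and the partial correlation coefficient is omitted for brevity.

```python
import numpy as np

def pearson(x, y):
    """Pearson correlation: linear association between input and output."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm @ ym) / np.sqrt((xm @ xm) * (ym @ ym)))

def spearman(x, y):
    """Spearman correlation: Pearson correlation of the ranks, which
    captures monotone but nonlinear input-output relationships."""
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(np.asarray(x)), rank(np.asarray(y)))
```

A monotone nonlinear response (for example, output proportional to the square of the input) gives Spearman 1 but Pearson below 1, which is why sensitivity studies typically report both.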

  1. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
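The core quantification task above, estimating what fraction of a bounded input domain satisfies a path condition, can be sketched with plain Monte Carlo sampling. This deliberately omits the paper's interval-constraint-propagation refinement that focuses sampling on promising regions; `estimate_fraction` and the one-dimensional domain are assumptions for illustration.

```python
import random

def estimate_fraction(predicate, low, high, n=100_000, seed=0):
    """Monte Carlo estimate of the fraction of [low, high] on which
    `predicate` (a stand-in for a symbolic path condition) holds."""
    rng = random.Random(seed)
    hits = sum(predicate(rng.uniform(low, high)) for _ in range(n))
    return hits / n
```

The statistical error shrinks only as 1/sqrt(n), which is the convergence limitation the abstract notes; constraint propagation improves accuracy by shrinking the region that must be sampled.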

  2. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    PubMed

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection, and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.

  3. Onset of normal and inverse homoclinic bifurcation in a double plasma system near a plasma fireball

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitra, Vramori; Sarma, Bornali; Sarma, Arun

    Plasma fireballs are generated due to a localized discharge and appear as a luminous glow with a sharp boundary, which suggests the presence of a localized electric field such as an electrical sheath or double layer structure. The present work reports the observation of normal and inverse homoclinic bifurcation phenomena in plasma oscillations that are excited in the presence of a fireball in a double plasma device. The controlling parameters for these observations are the ratio of target to source chamber densities (n_T/n_S) and the applied electrode voltage. Homoclinic bifurcation is noticed in the plasma potential fluctuations as the system evolves from narrow to long time period oscillations and vice versa with the change of the control parameter. The dynamical transition in the plasma fireball is demonstrated by spectral analysis, recurrence quantification analysis (RQA), and statistical measures, viz., skewness and kurtosis. The increasing trend of the normalized variance reflects that enhancing n_T/n_S induces irregularity in the plasma dynamics. The exponential growth of the time period is strongly indicative of homoclinic bifurcation in the system. The gradual decrease of skewness and increase of kurtosis with the increase of n_T/n_S also reflect growing complexity in the system. The visual change of the recurrence plot and the gradual enhancement of the RQA variables DET, L_max, and ENT reflect the bifurcation behavior in the dynamics. The combination of RQA and spectral analysis is clear evidence that homoclinic bifurcation occurs due to the presence of the plasma fireball at different density ratios. However, inverse bifurcation takes place due to the change of fireball voltage. Some of the features observed in the experiment are consistent with a model that describes the dynamics of ionization instabilities.

  4. Fault Detection and Severity Analysis of Servo Valves Using Recurrence Quantification Analysis

    DTIC Science & Technology

    2014-10-02

    Fault Detection and Severity Analysis of Servo Valves Using Recurrence Quantification Analysis. M. Samadani, C. A. Kitio Kwuimy, and C. Nataraj. ... diagnostics of nonlinear systems. A detailed nonlinear mathematical model of a servo electro-hydraulic system has been used to demonstrate the procedure. ... Two faults have been considered associated with the servo valve, including the increased friction between spool and sleeve and the degradation of the

  5. PSEA-Quant: a protein set enrichment analysis on label-free and label-based protein quantification data.

    PubMed

    Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2014-12-05

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.
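A generic protein-set enrichment calculation can be illustrated with the hypergeometric tail probability: the chance of seeing at least k set members among n selected proteins from a background of N containing K set members. This is a textbook stand-in, not PSEA-Quant's actual statistic, which additionally models quantification reproducibility and annotation bias.

```python
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """P(X >= k) under the hypergeometric distribution: N background
    proteins, K annotated with the term, n selected, k annotated hits."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)
```

A small tail probability flags the set (for example, a GO term) as enriched among the selected proteins.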

  6. PSEA-Quant: A Protein Set Enrichment Analysis on Label-Free and Label-Based Protein Quantification Data

    PubMed Central

    2015-01-01

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights. PMID:25177766

  7. Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.

    PubMed

    De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik

    2018-01-01

    Proton-induced (p,α) reactions are one type of nuclear reaction analysis (NRA) especially suitable for light element quantification. In the case of the lithium quantification presented in this work, accelerated protons with an energy of about 850 keV were used to induce the 7Li(p,α)4He reaction in standard reference and geological samples such as tourmaline and other Li minerals. It is shown that this technique for lithium quantification allowed for the measurement of concentrations down to below one ppm. The possibility of relating the lithium content to the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, particle-induced X-ray emission (PIXE) was utilized as a complementary IBA technique for simultaneous mapping of elements heavier than sodium.

  8. Nonlinear analysis of electromyogram following gait training with myoelectrically triggered neuromuscular electrical stimulation in stroke survivors

    NASA Astrophysics Data System (ADS)

    Dutta, Anirban; Khattar, Bhawna; Banerjee, Alakananda

    2012-12-01

    Neuromuscular electrical stimulation (NMES) facilitates ambulatory function after paralysis by activating the muscles of the lower extremities. The NMES-assisted stepping can either be triggered by a heel-switch (switch-trigger), or by an electromyogram (EMG)-based gait event detector (EMG-trigger). The command sources—switch-trigger or EMG-trigger—were presented to each group of six chronic (>6 months post-stroke) hemiplegic stroke survivors. The switch-trigger group underwent transcutaneous NMES-assisted gait training for 1 h, five times a week for 2 weeks, where the stimulation of the tibialis anterior muscle of the paretic limb was triggered with a heel-switch detecting heel-rise of the same limb. The EMG-trigger group underwent transcutaneous NMES-assisted gait training of the same duration and frequency where the stimulation was triggered with surface EMG from medial gastrocnemius (MG) of the paretic limb in conjunction with a heel-switch detecting heel-rise of the same limb. During the baseline and post-intervention surface EMG assessment, a total of 10 s of surface EMG was recorded from bilateral MG muscle while the subjects tried to stand steady on their toes. A nonlinear tool—recurrence quantification analysis (RQA)—was used to analyze the surface EMG. The objective of this study was to find the effect of NMES-assisted gait training with switch-trigger or EMG-trigger on two RQA parameters—the percentage of recurrence (%Rec) and determinism (%Det), which were extracted from surface EMG during fatiguing contractions of the paretic muscle. The experimental results showed that during fatiguing contractions, (1) %Rec and %Det have a higher initial value for paretic muscle than the non-paretic muscle, (2) the rate of change in %Rec and %Det was negative for the paretic muscle but positive for the non-paretic muscle, (3) the rate of change in %Rec and %Det significantly increased from baseline for the paretic muscle after EMG-triggered NMES

  9. Responses of bistable piezoelectric-composite energy harvester by means of recurrences

    NASA Astrophysics Data System (ADS)

    Syta, Arkadiusz; Bowen, Christopher R.; Kim, H. Alicia; Rysak, Andrzej; Litak, Grzegorz

    2016-08-01

    In this paper we examine the modal response of a bistable electro-mechanical energy harvesting device based on characterization of the experimental time-series. A piezoelectric element attached to a vibrating bistable carbon-fibre reinforced polymer laminate plate was used for the conversion of mechanical vibrations to electrical energy under harmonic excitations at a variety of frequencies and amplitudes. The inherent bistability of the mechanical resonator and snap-through phenomenon between stable states were exploited for energy harvesting. To identify the dynamics of the response of the studied harvesting structure and the associated output power generation we used the Fourier spectrum and Recurrence Quantification Analysis (RQA).

  10. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advances in imaging technology have brought new challenges and opportunities for the automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable, and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical error. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, study their interactions, and introduce a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error, and that there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at the feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we achieve a lower variance than naïve averaging. Simulated experiments are used to validate the theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
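The optimal linear fusion claim, that variance-aware weighting beats naive averaging, can be illustrated with inverse-variance weighting of independent measurements. `fuse` is an assumed name, and the scalar setting is a simplification of the per-scan anisotropy weighting discussed above.

```python
import numpy as np

def fuse(estimates, variances):
    """Inverse-variance weighted fusion of independent measurements.
    Returns the fused estimate and its variance; the fused variance is
    never larger than that of the best single measurement."""
    inv = 1.0 / np.asarray(variances, dtype=float)
    w = inv / inv.sum()                    # weights proportional to 1/variance
    fused = float(np.dot(w, np.asarray(estimates, dtype=float)))
    fused_var = float(1.0 / inv.sum())
    return fused, fused_var
```

With unequal variances the fused estimate leans toward the more precise scan, mirroring the recommendation above to up-weight scans whose voxel anisotropy aligns with lesion elongation.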

  11. Analysis of laser fluorosensor systems for remote algae detection and quantification

    NASA Technical Reports Server (NTRS)

    Browell, E. V.

    1977-01-01

    The development and performance of single- and multiple-wavelength laser fluorosensor systems for use in the remote detection and quantification of algae are discussed. The appropriate equation for the fluorescence power received by a laser fluorosensor system is derived in detail. Experimental development of a single wavelength system and a four wavelength system, which selectively excites the algae contained in the four primary algal color groups, is reviewed, and test results are presented. A comprehensive error analysis is reported which evaluates the uncertainty in the remote determination of the chlorophyll a concentration contained in algae by single- and multiple-wavelength laser fluorosensor systems. Results of the error analysis indicate that the remote quantification of chlorophyll a by a laser fluorosensor system requires optimum excitation wavelength(s), remote measurement of marine attenuation coefficients, and supplemental instrumentation to reduce uncertainties in the algal fluorescence cross sections.

  12. 3.8 Proposed approach to uncertainty quantification and sensitivity analysis in the next PA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, Greg; Wohlwend, Jen

    2017-10-02

    This memorandum builds upon Section 3.8 of SRNL (2016) and Flach (2017) by defining key error analysis, uncertainty quantification, and sensitivity analysis concepts and terms, in preparation for the next E-Area Performance Assessment (WSRC 2008) revision.

  13. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that detects transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamics states for a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for time windows large enough to ensure good statistical quality of the recurrence complexity measures needed to detect the transitions.
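    The bootstrap-plus-RQA idea can be sketched in a few lines. Everything below (the signal, the recurrence threshold, the bootstrap band) is an invented toy, not the paper's protocol:

```python
import numpy as np

def recurrence_rate(window, eps):
    """Fraction of point pairs within eps: the simplest RQA complexity measure."""
    d = np.abs(window[:, None] - window[None, :])
    return (d < eps).mean()

rng = np.random.default_rng(0)
# Synthetic scalar "trajectory": a quiet baseline regime, then a transition
# to much larger fluctuations (standing in for a conformational change).
x = np.concatenate([rng.normal(0, 0.1, 300), rng.normal(0, 1.0, 100)])

win = 50
rr = np.array([recurrence_rate(x[i:i + win], eps=0.2)
               for i in range(0, len(x) - win + 1, win)])

# Bootstrap the "baseline" recurrence structure from the first segment;
# windows whose recurrence rate falls below the bootstrap band are flagged
# as statistically significant deviations, i.e. transitions.
boot = np.array([recurrence_rate(rng.choice(x[:300], size=win), eps=0.2)
                 for _ in range(200)])
lo = np.percentile(boot, 0.5)
transitions = np.where(rr < lo)[0]
```

    The last two windows (drawn from the high-variance regime) fall far below the bootstrap band and are flagged, while the baseline windows are not.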

  14. Plasma cell quantification in bone marrow by computer-assisted image analysis.

    PubMed

    Went, P; Mayer, S; Oberholzer, M; Dirnhofer, S

    2006-09-01

    Minor and major criteria for the diagnosis of multiple myeloma according to the definition of the WHO classification include different categories of the bone marrow plasma cell count: a shift from the 10-30% group to the > 30% group equals a shift from a minor to a major criterion, while the < 10% group does not contribute to the diagnosis. The plasma cell fraction in the bone marrow is therefore critical for the classification and optimal clinical management of patients with plasma cell dyscrasias. The aim of this study was (i) to establish a digital image analysis system able to quantify bone marrow plasma cells and (ii) to evaluate two quantification techniques in bone marrow trephines, i.e. computer-assisted digital image analysis and conventional light-microscopic evaluation. The results were compared regarding inter-observer variation. Eighty-seven patients, 28 with multiple myeloma, 29 with monoclonal gammopathy of undetermined significance, and 30 with reactive plasmocytosis were included in the study. Plasma cells in H&E- and CD138-stained slides were quantified by two investigators using light-microscopic estimation and computer-assisted digital analysis. The sets of results were correlated with rank correlation coefficients. Patients were categorized according to WHO criteria addressing the plasma cell content of the bone marrow (group 1: 0-10%, group 2: 11-30%, group 3: > 30%), and the results compared by kappa statistics. The degree of agreement in CD138-stained slides was higher for results obtained using the computer-assisted image analysis system compared to light-microscopic evaluation (corr.coeff. = 0.782), as was seen in the intra- (corr.coeff. = 0.960) and inter-individual result correlations (corr.coeff. = 0.899). Inter-observer agreement for categorized results (SM/PW: kappa 0.833) was high. Computer-assisted image analysis demonstrated a higher reproducibility of bone marrow plasma cell quantification.
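    The kappa statistics used for the categorized counts can be illustrated with an unweighted Cohen's kappa; the two observers' ratings below are hypothetical, not study data:

```python
import numpy as np

def cohens_kappa(a, b, categories):
    """Unweighted Cohen's kappa between two raters (illustrative only)."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                                   # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c)             # chance agreement
             for c in categories)                          # from marginals
    return (po - pe) / (1 - pe)

# Hypothetical WHO groups (1: 0-10%, 2: 11-30%, 3: >30%) from two observers.
obs1 = [1, 1, 2, 2, 3, 3, 1, 2, 3, 2]
obs2 = [1, 1, 2, 3, 3, 3, 1, 2, 3, 2]
kappa = cohens_kappa(obs1, obs2, categories=[1, 2, 3])
```

    Nine of ten cases agree (po = 0.9) against a chance agreement of 0.33, giving kappa of about 0.85, a "high range" agreement on the usual scale.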

  15. Recurrence plots and recurrence quantification analysis of human motion data

    NASA Astrophysics Data System (ADS)

    Josiński, Henryk; Michalczuk, Agnieszka; Świtoński, Adam; Szczesna, Agnieszka; Wojciechowski, Konrad

    2016-06-01

    The authors present exemplary application of recurrence plots, cross recurrence plots and recurrence quantification analysis for the purpose of exploration of experimental time series describing selected aspects of human motion. Time series were extracted from treadmill gait sequences which were recorded in the Human Motion Laboratory (HML) of the Polish-Japanese Academy of Information Technology in Bytom, Poland by means of the Vicon system. Analysis was focused on the time series representing movements of hip, knee, ankle and wrist joints in the sagittal plane.
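    A recurrence plot of the kind applied to these gait time series can be built in a few lines; the embedding parameters and threshold below are illustrative defaults, not the study's settings:

```python
import numpy as np

def recurrence_matrix(x, dim=2, delay=1, eps=0.5):
    """Binary recurrence plot of a scalar series after time-delay embedding."""
    n = len(x) - (dim - 1) * delay
    # Time-delay embedding: rows are state vectors (x[i], x[i+delay], ...).
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (d < eps).astype(int)

# A periodic "joint angle" signal produces the diagonal line structure
# that RQA measures such as determinism quantify.
t = np.linspace(0, 8 * np.pi, 200)
R = recurrence_matrix(np.sin(t), dim=2, delay=5, eps=0.3)
```

    The matrix is symmetric with a recurrent main diagonal, and the periodic signal fills it with the off-diagonal lines that RQA measures summarize.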

  16. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer’s Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
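    The contrast between single-variate Beer's-law calibration and a multivariate model can be sketched with synthetic overlapping spectra. The band shapes and concentrations below are invented, and classical least squares stands in for whatever chemometric model was actually used:

```python
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(400, 900, 120)          # wavelength grid, nm

def band(center, width):
    """Gaussian absorption band (stand-in for real component spectra)."""
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

# Hypothetical pure-component spectra: an analyte band overlapping a broad
# acid-dependent band (loosely mimicking Pu(IV) in nitric acid).
s_analyte, s_acid = band(650, 40), band(700, 80)
c_analyte = rng.uniform(0.1, 1.0, 30)    # analyte concentrations
c_acid = rng.uniform(0.5, 4.0, 30)       # varying acid strength
A = np.outer(c_analyte, s_analyte) + np.outer(c_acid, s_acid)  # Beer-Lambert

# Single-variate Beer's law at the analyte peak: biased by the overlap.
i_peak = int(np.argmax(s_analyte))
c_single = A[:, i_peak]                  # unit slope for the pure analyte

# Multivariate (classical least squares) using both pure spectra at once.
S = np.column_stack([s_analyte, s_acid])
C_hat, *_ = np.linalg.lstsq(S, A.T, rcond=None)
c_multi = C_hat[0]
```

    The single-wavelength estimate absorbs the acid band and is badly biased, while the multivariate fit recovers the analyte concentrations without knowing the acid strength.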

  17. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The Western blot quantification method constitutes a critical step in order to obtain accurate and reproducible results. Due to the technical knowledge required for densitometry analysis together with resource availability, standard office scanners are often used for the image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with ImageJ software together with a new image background subtraction method for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that could be used where resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
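    A local-background subtraction of the kind described can be sketched on a synthetic lane profile; the linear-interpolation background below is an illustration, not the paper's exact method:

```python
import numpy as np

def band_volume(profile, band, margin=5):
    """Integrated band intensity above a local linear background.

    The background is interpolated between flanking regions on either side
    of the band; this mimics (but is not) the paper's subtraction method.
    """
    lo, hi = band
    left = profile[max(0, lo - margin):lo].mean()
    right = profile[hi:hi + margin].mean()
    baseline = np.linspace(left, right, hi - lo)
    return float(np.sum(profile[lo:hi] - baseline))

# Synthetic lane profile: a Gaussian band on a sloped film background.
x = np.arange(100)
background = 10 + 0.05 * x
band_signal = 50 * np.exp(-0.5 * ((x - 50) / 4) ** 2)
profile = background + band_signal

vol = band_volume(profile, band=(35, 65))
true_vol = float(band_signal[35:65].sum())
vol_no_sub = float(profile[35:65].sum())   # without background subtraction
```

    Without subtraction the sloped film background inflates the band volume; with the local baseline the estimate lands within a few percent of the true integrated signal.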

  18. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
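    The frequency-wavenumber representation mentioned above is obtained with a 2-D FFT of the wavefield. The following sketch recovers the frequency and wavenumber of a synthetic single-mode wavefield; all parameters are invented:

```python
import numpy as np

# Synthetic guided-wave wavefield u(t, x): a single traveling mode.
nt, nx = 256, 128
dt, dx = 1e-6, 1e-3                      # 1 MHz sampling, 1 mm scan pitch
t = np.arange(nt) * dt
x = np.arange(nx) * dx
f0 = 125e3                               # 125 kHz excitation (exact FFT bin)
k0 = 2 * np.pi * 312.5                   # ~1963 rad/m (also an exact bin)
T, X = np.meshgrid(t, x, indexing="ij")
u = np.cos(2 * np.pi * f0 * T - k0 * X)

# Frequency-wavenumber representation via a 2-D FFT of the wavefield.
U = np.fft.fftshift(np.abs(np.fft.fft2(u)))
freqs = np.fft.fftshift(np.fft.fftfreq(nt, dt))            # Hz
ks = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(nx, dx))   # rad/m

# The dominant peak recovers the mode's frequency and wavenumber; in the
# delamination case, new wavenumbers would appear over the damage region.
i, j = np.unravel_index(int(np.argmax(U)), U.shape)
f_peak, k_peak = abs(freqs[i]), abs(ks[j])
```

    Trapped waves over a delamination show up as additional peaks along the wavenumber axis, which is what the quantification algorithms localize in space.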

  19. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainties which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, meant to support decisions that have important social and commercial implications. Residual moveout analysis, an important step in seismic data processing, is usually performed by a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  20. Rapid quantification and sex determination of forensic evidence materials.

    PubMed

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
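    The externally standardized kinetic quantification amounts to reading an unknown sample's Ct off a standard curve. A minimal sketch with invented Ct values (not the assay's calibration data):

```python
import numpy as np

# Hypothetical standard curve: Ct values for known nuclear-DNA copy numbers.
copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7])   # synthetic, ~100% efficiency

# Ct is linear in log10(copy number); fit slope and intercept.
slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1           # amplification efficiency

def quantify(ct_sample):
    """Copy number of an unknown from its Ct via the external standard curve."""
    return 10 ** ((ct_sample - intercept) / slope)

n0 = quantify(26.4)   # an unknown whose Ct matches the 1e3-copy standard
```

    A slope near -3.32 corresponds to ~100% amplification efficiency; the unknown's Ct of 26.4 maps back to roughly a thousand copies, consistent with the single-copy sensitivity claims being about the low end of such a curve.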

  1. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
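    First-order variance-based sensitivity indices can be illustrated on a toy model: for a linear model with independent inputs, the Sobol index reduces to a squared input-output correlation. The model below is an invented stand-in, not a scramjet simulation:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
# Three hypothetical uncertain inputs (standardized, independent); the third
# is inert, mimicking a parameter that sensitivity analysis would screen out.
x = rng.normal(size=(n, 3))
y = 3.0 * x[:, 0] + 1.0 * x[:, 1] + 0.0 * x[:, 2]   # stand-in for the model

# For a linear model with independent inputs, the first-order Sobol index
# S_i = Var(E[y|x_i]) / Var(y) equals the squared correlation corr(x_i, y)^2.
S = np.corrcoef(x.T, y)[-1, :3] ** 2
```

    The estimated indices recover the analytic values (0.9, 0.1, 0.0), ranking the first input as dominant, which is the kind of screening used to shrink the stochastic dimension before expensive simulations.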

  2. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  3. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  4. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  5. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

    In this study, a collaborative contest focused on LIBS data processing was conducted in an original way: rather than analyzing the same samples on their own LIBS experiments, all participants processed a shared set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. Then, a parametric study was conducted to investigate the influence of each step of the data processing. This study was based on several analytical figures of merit such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, baseline modeling, and calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure with the aim of optimizing the methodological steps toward the standardization of LIBS.

  6. Toward a dynamical theory of body movement in musical performance

    PubMed Central

    Demos, Alexander P.; Chaffin, Roger; Kant, Vivek

    2014-01-01

    Musicians sway expressively as they play in ways that seem clearly related to the music, but quantifying the relationship has been difficult. We suggest that a complex systems framework and its accompanying tools for analyzing non-linear dynamical systems can help identify the motor synergies involved. Synergies are temporary assemblies of parts that come together to accomplish specific goals. We assume that the goal of the performer is to convey musical structure and expression to the audience and to other performers. We provide examples of how dynamical systems tools, such as recurrence quantification analysis (RQA), can be used to examine performers' movements and relate them to the musical structure and to the musician's expressive intentions. We show how detrended fluctuation analysis (DFA) can be used to identify synergies and discover how they are affected by the performer's expressive intentions. PMID:24904490

  7. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating separately subduction and background (crustal) earthquakes allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  8. Numerical Uncertainty Quantification for Radiation Analysis Tools

    NASA Technical Reports Server (NTRS)

    Anderson, Brooke; Blattnig, Steve; Clowdsley, Martha

    2007-01-01

    Recently a new emphasis has been placed on engineering applications of space radiation analyses, and thus a systematic effort of Verification, Validation and Uncertainty Quantification (VV&UQ) of the tools commonly used for radiation analysis for vehicle design and mission planning has begun. There are two sources of uncertainty in geometric discretization addressed in this paper that need to be quantified in order to understand the total uncertainty in estimating space radiation exposures. One source of uncertainty is in ray tracing: as the number of rays increases, the associated uncertainty decreases, but the computational expense increases. Thus, a cost-benefit analysis optimizing computational time versus uncertainty is needed and is addressed in this paper. The second source of uncertainty results from the interpolation over the dose vs. depth curves that is needed to determine the radiation exposure. The question, then, is how many thicknesses are needed to obtain an accurate result, so convergence testing is performed to quantify the uncertainty associated with interpolating over different shield thickness spatial grids.
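    The ray-count cost-benefit trade-off follows the usual Monte Carlo 1/√N convergence. A sketch with an invented path-length distribution (not the actual vehicle geometry):

```python
import numpy as np

rng = np.random.default_rng(3)

def shield_thickness_estimate(n_rays):
    """Monte Carlo estimate of the mean shield thickness seen by rays.

    A toy stand-in for the ray-tracing step: path lengths through the
    vehicle are pretended to follow a fixed exponential distribution.
    """
    return rng.exponential(scale=5.0, size=n_rays).mean()

# Empirical standard error vs number of rays: it should roughly halve
# every time the ray count is quadrupled (the 1/sqrt(N) law).
errs = []
for n_rays in (100, 400, 1600):
    reps = [shield_thickness_estimate(n_rays) for _ in range(500)]
    errs.append(float(np.std(reps)))
```

    Each 4x increase in rays buys roughly a 2x reduction in uncertainty at 4x the cost, which is exactly the trade-off the cost-benefit analysis must optimize.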

  9. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    NASA Astrophysics Data System (ADS)

    Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  10. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis.

    PubMed

    Gallego, Sandra F; Højlund, Kurt; Ejsing, Christer S

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  11. Automated Quantification and Integrative Analysis of 2D and 3D Mitochondrial Shape and Network Properties

    PubMed Central

    Nikolaisen, Julie; Nilsson, Linn I. H.; Pettersen, Ina K. N.; Willems, Peter H. G. M.; Lorens, James B.; Koopman, Werner J. H.; Tronstad, Karl J.

    2014-01-01

    Mitochondrial morphology and function are coupled in healthy cells, during pathological conditions and (adaptation to) endogenous and exogenous stress. In this sense mitochondrial shape can range from small globular compartments to complex filamentous networks, even within the same cell. Understanding how mitochondrial morphological changes (i.e. “mitochondrial dynamics”) are linked to cellular (patho)physiology is currently the subject of intense study and requires detailed quantitative information. During the last decade, various computational approaches have been developed for automated 2-dimensional (2D) analysis of mitochondrial morphology and number in microscopy images. Although these strategies are well suited for analysis of adhering cells with a flat morphology, they are not applicable to thicker cells, which require a three-dimensional (3D) image acquisition and analysis procedure. Here we developed and validated an automated image analysis algorithm allowing simultaneous 3D quantification of mitochondrial morphology and network properties in human endothelial cells (HUVECs). Cells expressing a mitochondria-targeted green fluorescence protein (mitoGFP) were visualized by 3D confocal microscopy and mitochondrial morphology was quantified using both the established 2D method and the new 3D strategy. We demonstrate that both analyses can be used to characterize and discriminate between various mitochondrial morphologies and network properties. However, the results from 2D and 3D analysis were not equivalent when filamentous mitochondria in normal HUVECs were compared with circular/spherical mitochondria in metabolically stressed HUVECs treated with rotenone (ROT). 2D quantification suggested that metabolic stress induced mitochondrial fragmentation and loss of biomass. In contrast, 3D analysis revealed that the mitochondrial network structure was dissolved without affecting the amount and size of the organelles.

  12. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    PubMed

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible histological quantitative analysis. A major limitation of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis we developed an automated software-based histological image analysis performed on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that tissue density indexes gave access to a very accurate and reliable quantification of morphological changes induced by BLM even for the lowest concentration used (0.25 mg/kg). A 2D image of the entire lung section was reconstructed at high resolution (3.6 μm/pixel) from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between automated analysis and the above standard evaluation methods.

  13. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis

    PubMed Central

    Gilhodes, Jean-Claude; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis of digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that the tissue density indexes gave a very accurate and reliable quantification of morphological changes induced by BLM, even for the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was generated from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the above standard evaluation methods. This correlation establishes
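The two indexes described in this record reduce to simple statistics over per-tile density values. A minimal sketch, assuming each micro-tile is summarized by a tissue-density fraction; the 0.5 high-density threshold and the example values are illustrative choices, not taken from the study:

```python
def tissue_density_indexes(tile_densities, high_threshold=0.5):
    """Return (mean density, high-density frequency) for one lung section.

    tile_densities: fraction of tissue-covered pixels per micro-tile.
    The high_threshold default is an assumption for illustration.
    """
    if not tile_densities:
        raise ValueError("no tiles")
    n = len(tile_densities)
    mean_density = sum(tile_densities) / n
    high_freq = sum(d > high_threshold for d in tile_densities) / n
    return mean_density, high_freq

# Example: a mostly aerated section with a few dense (fibrotic-like) tiles.
tiles = [0.1, 0.15, 0.2, 0.8, 0.9, 0.1]
mean_d, high_f = tissue_density_indexes(tiles)
```

A fibrotic lung would shift both indexes upward: denser tiles raise the mean, and more tiles cross the high-density threshold.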

  14. Characterization of a clinical unit for digital radiography based on irradiation side sampling technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco

    2013-10-15

    Purpose: A characterization of a clinical unit for digital radiography (FUJIFILM FDR D-EVO) is presented. This system is based on irradiation side sampling (ISS) technology and can be equipped with two different scintillators: a traditional gadolinium-oxysulphide (GOS) phosphor and a needle-structured cesium iodide (CsI) phosphor panel. Methods: The characterization was achieved in terms of response curve, modulation transfer function (MTF), noise power spectra (NPS), detective quantum efficiency (DQE), and psychophysical parameters (contrast-detail analysis with automatic reading of CDRAD images). For both scintillation screens the authors performed the measurements with four standard beam conditions: RQA3, RQA5, RQA7, and RQA9. Results: At the Nyquist frequency (3.33 lp/mm) the MTF is about 35% and 25% for the CsI and GOS detectors, respectively. The CsI scintillator has better noise properties than the GOS screen in almost all conditions. This is particularly true for low-energy beams, where the noise for the GOS system can be up to a factor of 2 greater than that found for CsI. The DQE of the CsI detector reaches a peak of 60%, 60%, 58%, and 50% for the RQA3, RQA5, RQA7, and RQA9 beams, respectively, whereas for the GOS screen the maximum DQE is 40%, 44%, 44%, and 35%. The contrast-detail analysis confirms that in the majority of cases the CsI scintillator provides improved outcomes compared with those obtained with the GOS screen. Conclusions: The limited diffusion of light afforded by the ISS readout makes very good spatial resolution achievable. In fact, the MTF of the unit with the CsI panel is only slightly lower than that achieved with direct-conversion detectors. The combination of very good spatial resolution and the good noise properties reached with the CsI screen yields a DQE on average about 1.5 times greater than that obtained with GOS. In fact, the DQE of the unit equipped with CsI is comparable to the
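Characterizations like this one summarize resolution by reading values off the measured MTF curve (e.g. the MTF at the Nyquist frequency, or the frequency where the MTF falls to 50%). A minimal sketch of the usual linear interpolation for the 50% point, using hypothetical MTF samples rather than this unit's measured data:

```python
def mtf50(freqs, mtf):
    """Linearly interpolate the spatial frequency (lp/mm) where MTF = 0.5.

    freqs: increasing spatial frequencies; mtf: MTF samples, assumed to
    start above 0.5 and decrease monotonically.
    """
    for i in range(1, len(mtf)):
        if mtf[i] <= 0.5:
            f0, f1 = freqs[i - 1], freqs[i]
            m0, m1 = mtf[i - 1], mtf[i]
            # Straight-line interpolation between the bracketing samples.
            return f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)
    raise ValueError("MTF never falls to 0.5 over the sampled range")

# Hypothetical MTF samples (not measured data from this study):
freqs = [0.0, 1.0, 2.0, 3.0]
curve = [1.0, 0.8, 0.4, 0.2]
f50 = mtf50(freqs, curve)
```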

  15. Quantification of fibre polymerization through Fourier space image analysis

    PubMed Central

    Nekouzadeh, Ali; Genin, Guy M.

    2011-01-01

    Quantification of changes in the total length of randomly oriented and possibly curved lines appearing in an image is a necessity in a wide variety of biological applications. Here, we present an automated approach based upon Fourier space analysis. Scaled, band-pass filtered power spectral densities of greyscale images are integrated to provide a quantitative measurement of the total length of lines of a particular range of thicknesses appearing in an image. A procedure is presented to correct for changes in image intensity. The method is most accurate for two-dimensional processes with fibres that do not occlude one another. PMID:24959096

  16. Technical characterization of five x-ray detectors for paediatric radiography applications

    NASA Astrophysics Data System (ADS)

    Marshall, N. W.; Smet, M.; Hofmans, M.; Pauwels, H.; De Clercq, T.; Bosmans, H.

    2017-12-01

    Physical image quality of five x-ray detectors used in the paediatric imaging department is characterized with the aim of establishing the range of imaging performance provided by these detectors for neonatal imaging. Two computed radiography (CR) detectors (MD4.0 powder imaging plate (PIP) and HD5.0 needle imaging plate (NIP), Agfa HealthCare NV, B-2640 Mortsel, Belgium) and three flat panel detectors (FPD) (the Agfa DX-D35C and DX-D45C and the DRX-2530C (Carestream Health Inc., Rochester, NY 14608, USA)) were assessed. Physical image quality was characterized using the detector metrics given by the International Electrotechnical Commission (IEC 62220-1) to measure the modulation transfer function (MTF), the noise power spectrum (NPS) and the detective quantum efficiency (DQE) using the IEC-specified beam qualities RQA3 and RQA5. The DQE was evaluated at the normal operating detector air kerma (DAK) level, defined as 2.5 µGy for all detectors, and at factors of 1/3.2 and 3.2 times the normal level. MTF curves for the different detectors were similar at both RQA3 and RQA5 energies; the average spatial frequency for the 50% point (MTF0.5) at RQA3 was 1.26 mm-1, with a range from 1.20 mm-1 to 1.37 mm-1. The DQE of the NIP CR was notably greater than that of the PIP CR and similar to that of the FPD devices. At RQA3, the average DQE for the FPD and NIP (at 0.5 mm-1 and 2.5 µGy) was 0.57, compared to 0.26 for the PIP CR. At the RQA5 energy, the DRX-2530C and the DX-D45C had the highest DQE (~0.6 at 0.5 mm-1 and 2.5 µGy). Noise separation analysis using the polynomial model showed higher electronic noise for the DX-D35C and DRX-2530C detectors; this explains the reduced DQE seen at 0.7 µGy/image. The NIP CR detector offers notably improved DQE performance compared to the PIP CR system, with a value similar to the DQE for the FPD devices at the RQA3 energy.

  17. Combinatorics and synchronization in natural semiotics

    NASA Astrophysics Data System (ADS)

    Orsucci, Franco; Giuliani, Alessandro; Webber, Charles; Zbilut, Joseph; Fonagy, Peter; Mazza, Marianna

    2006-03-01

    This study derives an objective metric to assess the degree of structuring of written and spoken texts. The proposed metric is based on the scoring of recurrences inside a text by means of recurrence quantification analysis (RQA), a nonlinear technique widely used in other fields of science. The adopted approach allowed us to rank different poems in a way strictly related to their prosodic structure and, more importantly, to recognize the same structure across different languages, to define a level of structuring typical of spoken texts, and to identify the progressive synchronization of a dyadic relation between two speakers in terms of the relative complexity of their speeches. These results suggest the possibility of introducing objective measurement methods into humanities studies.
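Several records in this listing rely on recurrence quantification analysis. As a rough illustration of its two most basic measures, recurrence rate (RR) and determinism (DET), here is a minimal sketch for a scalar series, without time-delay embedding and with the main diagonal excluded; the threshold, minimum line length and example series are illustrative, not values from any of these studies:

```python
def recurrence_matrix(x, eps):
    """Boolean recurrence matrix: True where |x[i] - x[j]| <= eps."""
    n = len(x)
    return [[abs(x[i] - x[j]) <= eps for j in range(n)] for i in range(n)]

def rqa_measures(x, eps, lmin=2):
    """Recurrence rate (RR) and determinism (DET) of a scalar series."""
    R = recurrence_matrix(x, eps)
    n = len(x)
    # RR: fraction of recurrent points, excluding the main diagonal.
    rec = sum(R[i][j] for i in range(n) for j in range(n) if i != j)
    rr = rec / (n * (n - 1))
    # DET: fraction of recurrent points on diagonal lines of length >= lmin.
    in_lines = 0
    for d in range(1, n):            # diagonals above the main one
        run = 0
        for i in range(n - d):
            if R[i][i + d]:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
        if run >= lmin:
            in_lines += run
    in_lines *= 2                    # the matrix is symmetric
    det = in_lines / rec if rec else 0.0
    return rr, det

# A strictly periodic series recurs only along diagonals, so DET is 1.
rr, det = rqa_measures([0, 1, 0, 1, 0, 1], 0.1)
```

Real RQA tools add time-delay embedding and further measures (laminarity, trapping time, entropy of line lengths) on top of this skeleton.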

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yasumatsu, S; Iwase, K; Shimizu, Y

    Purpose: The exposure index (EI) proposed by the International Electrotechnical Commission (IEC) 62494-1 is expected to be utilized as a standard dose index by every manufacturer. The IEC recommended the use of RQA5 for the EI. However, X-ray beam qualities, particularly in clinical practice, vary depending on the examination objects and exposure conditions, including the use of anti-scatter grids. We investigated the effects of X-ray beam qualities other than RQA5 on the EI. Methods: The X-ray beam qualities RQA3, 5, 7, and 9 in IEC 61267 Ed. 1.0 were adopted in a computed radiography system. A uniform exposure without objects was performed to measure the exposure indicators (S values) and air kerma (K). The relational equations between the S values and K were derived for the determination of the EI values. The EI values for RQA3, 7, and 9 were compared to those for RQA5 at the fixed S values of 100, 200, 400, and 600. Finally, the half-value layers (HVLs) using four grids (ratio 6:1, 8:1, 10:1, and 12:1) for the RQA5 X-ray were compared to those for RQA3–9. Results: The EI values for RQA3, 7, and 9 were up to 35.3%, 11.8%, and 38.7% higher, respectively, than that for RQA5 at the S value of 600. The HVLs without grids and with various grids for RQA5 were 6.85 mm Al and in the range of 6.94–7.29 mm Al (ΔHVL: up to 0.44 mm Al), respectively. This variation in the HVLs with grids was smaller than that observed for RQA3–9 (ΔHVL: 2.0–7.5 mm Al). Conclusion: Although the usage of grids may not greatly affect the EI, the X-ray beam quality used for the determination of the EI cannot be ignored in the clinical evaluation of the dose index.

  19. Chaotic behavior in Malaysian stock market: A study with recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Niu, Betty Voon Wan; Noorani, Mohd Salmi Md; Jaaman, Saiful Hafizah

    2016-11-01

    The dynamics of the stock market has been questioned for decades. Its behavior appears random, yet some have found it to behave chaotically. Up to 5000 daily adjusted closing values of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (KLSE) were investigated through recurrence plots and recurrence quantification analysis. Results were compared against a stochastic system, a chaotic system and a deterministic system. The results show that the KLSE daily adjusted closing data behave chaotically.

  20. Quantification of epithelial cells in coculture with fibroblasts by fluorescence image analysis.

    PubMed

    Krtolica, Ana; Ortiz de Solorzano, Carlos; Lockett, Stephen; Campisi, Judith

    2002-10-01

    To demonstrate that senescent fibroblasts stimulate the proliferation and neoplastic transformation of premalignant epithelial cells (Krtolica et al.: Proc Natl Acad Sci USA 98:12072-12077, 2001), we developed methods to quantify the proliferation of epithelial cells cocultured with fibroblasts. We stained epithelial-fibroblast cocultures with the fluorescent DNA-intercalating dye 4',6-diamidino-2-phenylindole (DAPI), or expressed green fluorescent protein (GFP) in the epithelial cells, and then cultured them with fibroblasts. The cocultures were photographed under an inverted microscope with appropriate filters, and the fluorescent images were captured with a digital camera. We modified an image analysis program to selectively recognize the smaller, more intensely fluorescent epithelial cell nuclei in DAPI-stained cultures and used the program to quantify areas with DAPI fluorescence generated by epithelial nuclei or GFP fluorescence generated by epithelial cells in each field. Analysis of the image areas with DAPI and GFP fluorescences produced nearly identical quantification of epithelial cells in coculture with fibroblasts. We confirmed these results by manual counting. In addition, GFP labeling permitted kinetic studies of the same coculture over multiple time points. The image analysis-based quantification method we describe here is an easy and reliable way to monitor cells in coculture and should be useful for a variety of cell biological studies. Copyright 2002 Wiley-Liss, Inc.
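The quantification step here amounts to measuring what fraction of each field is occupied by fluorescent (epithelial) signal. A minimal sketch using a fixed intensity threshold on a grayscale image; the threshold and the tiny example image are illustrative, and the study's software used more sophisticated nucleus recognition than a global cutoff:

```python
def fluorescent_area_fraction(image, threshold):
    """Fraction of pixels whose intensity exceeds `threshold`.

    `image` is a 2D list of grayscale intensities. The threshold
    separating fluorescence from background is an assumed input here,
    not the study's segmentation procedure.
    """
    total = sum(len(row) for row in image)
    bright = sum(1 for row in image for px in row if px > threshold)
    return bright / total

# Hypothetical 3x3 field: three bright (fluorescent) pixels on dark background.
img = [
    [10, 12, 200],
    [11, 210, 205],
    [9, 10, 11],
]
frac = fluorescent_area_fraction(img, 100)
```

Comparing this fraction between DAPI and GFP channels of the same field is the kind of cross-check the abstract describes.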

  1. Overview of the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.

  2. Nonlinear dynamics and intermittency in a turbulent reacting wake with density ratio as bifurcation parameter

    NASA Astrophysics Data System (ADS)

    Suresha, Suhas; Sujith, R. I.; Emerson, Benjamin; Lieuwen, Tim

    2016-10-01

    The flame or flow behavior of a turbulent reacting wake is known to be fundamentally different at high and low values of flame density ratio (ρu/ρb), as the flow transitions from globally stable to unstable. This paper analyzes the nonlinear dynamics present in a bluff-body stabilized flame, and identifies the transition characteristics in the wake as ρu/ρb is varied over a Reynolds number (based on the bluff-body lip velocity) range of 1000-3300. Recurrence quantification analysis (RQA) of the experimentally obtained time series of the flame edge fluctuations reveals that the time series is highly aperiodic at high values of ρu/ρb and transitions to increasingly correlated or nearly periodic behavior at low values. From the RQA of the transverse velocity time series, we observe that periodicity in the flame oscillations is related to periodicity in the flow. Therefore, we hypothesize that this transition from aperiodic to nearly periodic behavior in the flame edge time series is a manifestation of the transition in the flow from globally stable, convective instability to global instability as ρu/ρb decreases. The recurrence analysis further reveals that the transition in periodicity is not a sudden shift; rather, it occurs through an intermittent regime present at low and intermediate ρu/ρb. During intermittency, the flow behavior switches between aperiodic oscillations, reminiscent of a globally stable, convective instability, and periodic oscillations, reminiscent of a global instability. Analysis of the distribution of the lengths of the periodic regions in the intermittent time series and of the first return map indicates the presence of type-II intermittency.

  3. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  4. Quantification of protein expression in cells and cellular subcompartments on immunohistochemical sections using a computer supported image analysis system.

    PubMed

    Braun, Martin; Kirsten, Robert; Rupp, Niels J; Moch, Holger; Fend, Falko; Wernert, Nicolas; Kristiansen, Glen; Perner, Sven

    2013-05-01

    Quantification of protein expression based on immunohistochemistry (IHC) is an important step for translational research and clinical routine. Several manual ('eyeballing') scoring systems are used in order to semi-quantify protein expression based on chromogenic intensities and distribution patterns. However, manual scoring systems are time-consuming and subject to significant intra- and interobserver variability. The aim of our study was to explore, whether new image analysis software proves to be sufficient as an alternative tool to quantify protein expression. For IHC experiments, one nucleus specific marker (i.e., ERG antibody), one cytoplasmic specific marker (i.e., SLC45A3 antibody), and one marker expressed in both compartments (i.e., TMPRSS2 antibody) were chosen. Stainings were applied on TMAs, containing tumor material of 630 prostate cancer patients. A pathologist visually quantified all IHC stainings in a blinded manner, applying a four-step scoring system. For digital quantification, image analysis software (Tissue Studio v.2.1, Definiens AG, Munich, Germany) was applied to obtain a continuous spectrum of average staining intensity. For each of the three antibodies we found a strong correlation of the manual protein expression score and the score of the image analysis software. Spearman's rank correlation coefficient was 0.94, 0.92, and 0.90 for ERG, SLC45A3, and TMPRSS2, respectively (p<0.01). Our data suggest that the image analysis software Tissue Studio is a powerful tool for quantification of protein expression in IHC stainings. Further, since the digital analysis is precise and reproducible, computer supported protein quantification might help to overcome intra- and interobserver variability and increase objectivity of IHC based protein assessment.
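The agreement between the manual score and the software's intensity score is summarized with Spearman's rank correlation. As a reference for how that statistic is computed, here is a minimal pure-Python sketch (average ranks with tie handling, then the Pearson correlation of the ranks); it is not part of the study's software:

```python
def _ranks(values):
    """1-based average ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # mean rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5
```

Any monotone agreement between the two scoring systems yields rho near 1 regardless of the scales used, which is why it suits a four-step manual score versus a continuous intensity.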

  5. Summary Findings from the AVT-191 Project to Assess Sensitivity Analysis and Uncertainty Quantification Methods for Military Vehicle Design

    NASA Technical Reports Server (NTRS)

    Benek, John A.; Luckring, James M.

    2017-01-01

    A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.

  6. A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems

    DTIC Science & Technology

    2014-04-01

    Report TR-14-33, April 2014. Contract HDTRA1-09-1-0036; Donald Estep and Michael. Approved for public release; distribution is unlimited. Related publication listed in the report: "Barrier methods for critical exponent problems in geometric analysis and mathematical physics," J. Erway and M. Holst, submitted for publication.

  7. Assessing nonlinear structures in real exchange rates using recurrence plot strategies

    NASA Astrophysics Data System (ADS)

    Belaire-Franch, Jorge; Contreras, Dulce; Tordera-Lledó, Lorena

    2002-11-01

    Purchasing power parity (PPP) is an important theory at the basis of a large number of economic models. However, the implication derived from the theory that real exchange rates must follow stationary processes is not conclusively supported by empirical studies. In a recent paper, Serletis and Gogas [Appl. Finance Econ. 10 (2000) 615] show evidence of deterministic chaos in several OECD exchange rates. As a consequence, PPP rejections could be spurious. In this work, we follow a two-stage testing procedure to test for nonlinearities and chaos in real exchange rates, using a new set of techniques designed by Webber and Zbilut [J. Appl. Physiol. 76 (1994) 965], called recurrence quantification analysis (RQA). Our conclusions differ slightly from Serletis and Gogas [Appl. Finance Econ. 10 (2000) 615], but they are also supportive of chaos for some exchange rates.

  8. A novel approach to the dynamical complexity of the Earth's magnetosphere at geomagnetic storm time-scales based on recurrences

    NASA Astrophysics Data System (ADS)

    Donner, Reik; Balasis, Georgios; Stolbova, Veronika; Wiedermann, Marc; Georgiou, Marina; Kurths, Jürgen

    2016-04-01

    Magnetic storms are the most prominent global manifestations of out-of-equilibrium magnetospheric dynamics. Investigating the dynamical complexity exhibited by geomagnetic observables can provide valuable insights into relevant physical processes as well as temporal scales associated with this phenomenon. In this work, we introduce several innovative data analysis techniques enabling a quantitative analysis of the non-stationary behavior of the Dst index. Using recurrence quantification analysis (RQA) and recurrence network analysis (RNA), we obtain a variety of complexity measures serving as markers of quiet- and storm-time magnetospheric dynamics. We additionally apply these techniques to the main driver of Dst index variations, the VBsouth coupling function, and to the interplanetary medium parameters Bz and Pdyn, in order to discriminate internal processes from the magnetosphere's response directly induced by the external forcing of the solar wind. The derived recurrence-based measures allow us to improve the accuracy with which magnetospheric storms can be classified based on ground-based observations. The new methodology presented here could be of significant interest for the space weather research community working on time series analysis for magnetic storm forecasts.

  9. Evidence of Mixed-mode oscillations and Farey arithmetic in double plasma system in presence of fireball

    NASA Astrophysics Data System (ADS)

    Mitra, Vramori; Sarma, Bornali; Sarma, Arun

    2017-10-01

    Plasma fireballs are luminous glowing regions formed around a positively biased electrode. The present work reports the observation of mixed-mode oscillations (MMOs) in the dynamics of plasma oscillations excited in the presence of a fireball in a double plasma device. Source voltage and applied electrode voltage are the controlling parameters for the experiment. Many sequences of distinct multi-peaked periodic states reflect the presence of MMOs as the control parameter is varied. The sequences of states with two patterns are well characterized by Farey arithmetic, which provides rational approximations of irrational numbers. These states can be characterized by a firing number: the ratio of the number of small-amplitude oscillations to the total number of oscillations per period. The dynamical transition in the plasma fireball is also demonstrated by spectral analysis, recurrence quantification analysis (RQA) and statistical measures, viz. skewness and kurtosis. The mixed-mode phenomenon observed in the experiment is consistent with a model that describes the dynamics of ionization instabilities.

  10. Development of a Protein Standard Absolute Quantification (PSAQ™) assay for the quantification of Staphylococcus aureus enterotoxin A in serum.

    PubMed

    Adrait, Annie; Lebert, Dorothée; Trauchessec, Mathieu; Dupuis, Alain; Louwagie, Mathilde; Masselon, Christophe; Jaquinod, Michel; Chevalier, Benoît; Vandenesch, François; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-06-06

    Enterotoxin A (SEA) is a staphylococcal virulence factor which is suspected to worsen septic shock prognosis. However, the presence of SEA in the blood of sepsis patients has never been demonstrated. We have developed a mass spectrometry-based assay for the targeted and absolute quantification of SEA in serum. To enhance sensitivity and specificity, we combined an immunoaffinity-based sample preparation with mass spectrometry analysis in the selected reaction monitoring (SRM) mode. Absolute quantification of SEA was performed using the PSAQ™ method (Protein Standard Absolute Quantification), which uses a full-length isotope-labeled SEA as internal standard. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were estimated at 352 pg/mL and 1057 pg/mL, respectively. SEA recovery after immunocapture was determined to be 7.8 ± 1.4%. Therefore, we assumed that less than 1 femtomole of each SEA proteotypic peptide was injected on the liquid chromatography column before SRM analysis. From a 6-point titration experiment, quantification accuracy was determined to be 77% and precision at LLOQ was lower than 5%. With this sensitive PSAQ-SRM assay, we expect to contribute to decipher the pathophysiological role of SEA in severe sepsis. This article is part of a Special Issue entitled: Proteomics: The clinical link. Copyright © 2011 Elsevier B.V. All rights reserved.
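Label-based absolute quantification of the kind described here reduces, at its core, to scaling the known spiked amount of isotope-labeled standard by the measured light/heavy peak-area ratio. A simplified sketch with hypothetical numbers; it omits the recovery correction and calibration that the study handles explicitly:

```python
def labeled_standard_concentration(light_area, heavy_area,
                                   spiked_amount_pg, serum_volume_ml):
    """Endogenous analyte concentration (pg/mL) from an SRM light/heavy ratio.

    Simplified sketch of label-based absolute quantification: the
    endogenous ('light') amount equals the light/heavy peak-area ratio
    times the known spiked amount of isotope-labeled standard.
    """
    ratio = light_area / heavy_area
    amount_pg = ratio * spiked_amount_pg
    return amount_pg / serum_volume_ml

# Hypothetical peak areas and spike level (not values from the study):
conc = labeled_standard_concentration(light_area=2.0e5, heavy_area=4.0e5,
                                      spiked_amount_pg=500.0,
                                      serum_volume_ml=0.25)
```

Because the labeled standard co-elutes and co-fragments with the endogenous protein's peptides, the area ratio cancels most matrix and instrument effects.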

  11. Dynamical Coordination of Hand Intrinsic Muscles for Precision Grip in Diabetes Mellitus.

    PubMed

    Li, Ke; Wei, Na; Cheng, Mei; Hou, Xingguo; Song, Jun

    2018-03-12

    This study investigated the effects of diabetes mellitus (DM) on the dynamical coordination of hand intrinsic muscles during precision grip. Precision grip was tested using a custom-designed apparatus with stable and unstable loads, during which the surface electromyographic (sEMG) signals of the abductor pollicis brevis (APB) and first dorsal interosseous (FDI) were recorded simultaneously. Recurrence quantification analysis (RQA) was applied to quantify the dynamical structure of the sEMG signals of the APB and FDI, and cross recurrence quantification analysis (CRQA) was used to assess the intermuscular coupling between the two intrinsic muscles. This study revealed that DM altered the dynamical structure of muscle activation for the FDI and the dynamical intermuscular coordination between the APB and FDI during precision grip. A reinforced feedforward mechanism that compensates for the loss of sensory feedback in DM may be responsible for the stronger intermuscular coupling between the APB and FDI muscles. Sensory deficits in DM remarkably decreased the capacity for online motor adjustment based on sensory feedback, rendering a lower adaptability to the uncertainty of the environment. This study sheds light on the inherent dynamical properties underlying intrinsic muscle activation and intermuscular coordination for precision grip, and on the effects of DM on hand sensorimotor function.

  12. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    PubMed

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has proven to be a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distributions at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of isotope dilution (ID) for quantification by LA-ICP-MS has also been described, being mainly useful for bulk analysis but so far not feasible for spatial measurements. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform an elemental quantification in different areas of the bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by depositing droplets of a natural Pt standard with a known amount of Pt onto the surface of a control tissue, where even 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
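Single isotope dilution solves a two-isotope mass balance for the analyte amount: after spiking, the measured isotope ratio of the blend fixes how much natural-abundance analyte must have been present. A sketch of that equation; the natural Pt abundances used in the example (194Pt ≈ 32.9%, 195Pt ≈ 33.8%) are standard values, while the spike composition is hypothetical:

```python
def isotope_dilution_amount(n_spike, a_spike, b_spike,
                            a_sample, b_sample, r_blend):
    """Moles of analyte in the sample by single isotope dilution.

    a_*: abundance of the spike-enriched isotope (e.g. 194Pt),
    b_*: abundance of a second reference isotope (e.g. 195Pt),
    r_blend: measured b/a isotope ratio in the spiked blend.

    Derived from the mass balance
      r_blend = (n_x*b_sample + n_spike*b_spike)
              / (n_x*a_sample + n_spike*a_spike)
    solved for the analyte amount n_x.
    """
    return n_spike * (b_spike - r_blend * a_spike) / (r_blend * a_sample - b_sample)

# Hypothetical spike: 95% 194Pt, 3% 195Pt; natural Pt: 32.9% / 33.8%.
# If 1 nmol of sample is blended with 1 nmol of spike, the blend ratio is:
r = (0.338 + 0.03) / (0.329 + 0.95)
n_x = isotope_dilution_amount(1.0, 0.95, 0.03, 0.329, 0.338, r)
```

The printed isotope-enriched ink in this record plays the role of the spike, delivered at a known surface density so the same algebra can be applied per image region.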

  13. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Bhat, Kabekode Ghanasham

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  14. Quantification method for the appearance of melanin pigmentation using independent component analysis

    NASA Astrophysics Data System (ADS)

    Ojima, Nobutoshi; Okiyama, Natsuko; Okaguchi, Saya; Tsumura, Norimichi; Nakaguchi, Toshiya; Hori, Kimihiko; Miyake, Yoichi

    2005-04-01

    In the cosmetics industry, skin color is very important because skin color gives a direct impression of the face. In particular, many people suffer from melanin pigmentation such as liver spots and freckles. However, it is very difficult to evaluate melanin pigmentation using conventional colorimetric values, because these values contain information on various skin chromophores simultaneously. It is therefore necessary to extract information on each chromophore independently, as density information. The isolation of the melanin component image from a single skin image, based on independent component analysis (ICA), was reported in 2003. However, a quantification method for melanin pigmentation had not been developed from this technique. This paper introduces a quantification method based on ICA of a skin color image to isolate melanin pigmentation. The image acquisition system we used consists of commercially available equipment such as digital cameras and lighting sources with polarized light. The images taken were analyzed using ICA to extract the melanin component images, and a Laplacian of Gaussian (LoG) filter was applied to extract the pigmented area. As a result, the method worked well for skin images including those showing melanin pigmentation and acne. Finally, the total extracted area corresponded strongly to the subjective rating values for the appearance of pigmentation. Further analysis is needed to relate the appearance of pigmentation to the size of the pigmented area and its spatial gradation.
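The pigmented-area extraction step uses a Laplacian of Gaussian (LoG) filter, a standard blob detector. A minimal sketch that builds the discrete LoG kernel one would convolve with the melanin component image; the kernel size and sigma are illustrative choices, not the paper's parameters:

```python
import math

def log_kernel(size, sigma):
    """Discrete Laplacian-of-Gaussian kernel (size x size, size odd).

    LoG(x, y) = -1/(pi*sigma^4) * (1 - r^2/(2*sigma^2)) * exp(-r^2/(2*sigma^2))
    with r^2 = x^2 + y^2; negative at the center, positive in the surround,
    so convolution responds strongly to blob-like pigmented spots.
    """
    half = size // 2
    s2 = sigma * sigma
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            r2 = x * x + y * y
            v = -(1.0 / (math.pi * s2 * s2)) * (1 - r2 / (2 * s2)) \
                * math.exp(-r2 / (2 * s2))
            row.append(v)
        kernel.append(row)
    return kernel

k = log_kernel(7, 1.0)
```

Thresholding the convolved response then yields the candidate pigmented regions whose total area is quantified.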

  15. Practical considerations of image analysis and quantification of signal transduction IHC staining.

    PubMed

    Grunkin, Michael; Raundahl, Jakob; Foged, Niels T

    2011-01-01

    The dramatic increase in computer processing power, combined with the availability of high-quality digital cameras over the last 10 years, has laid the groundwork for quantitative microscopy based on digital image analysis. With the present introduction of robust scanners for whole slide imaging in both research and routine practice, the benefits of automation and objectivity in the analysis of tissue sections will be even more obvious. For in situ studies of signal transduction, the combination of tissue microarrays, immunohistochemistry, digital imaging, and quantitative image analysis will be central operations. However, immunohistochemistry is a multistep procedure with many technical pitfalls that lead to intra- and interlaboratory variability in its outcome. The resulting variations in staining intensity and disruption of original morphology are an extra challenge for image analysis software, which should therefore preferably be dedicated to the detection and quantification of histomorphometrical end points.

  16. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  17. Quantification of sensory and food quality: the R-index analysis.

    PubMed

    Lee, Hye-Seong; van Hout, Danielle

    2009-08-01

    The accurate quantification of sensory difference/similarity between foods, as well as consumer acceptance/preference and concepts, is greatly needed to optimize and maintain food quality. The R-Index is one class of measures of the degree of difference/similarity, and was originally developed for sensory difference tests for food quality control, product development, and so on. The index is based on signal detection theory and is free of the response bias that can invalidate difference testing protocols, including categorization and same-different and A-Not A tests. It is also a nonparametric analysis, making no assumptions about sensory distributions, and is simple to compute and understand. The R-Index is also flexible in its application. Methods based on R-Index analysis have been used as detection and sensory difference tests, as simple alternatives to hedonic scaling, and for the measurement of consumer concepts. This review indicates the various computational strategies for the R-Index and its practical applications to consumer and sensory measurements in food science.
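    Because the R-Index is simply the probability that a "signal" sample is rated above a "noise" sample (ties counted half), it is easy to compute; a minimal sketch with invented rating data:

```python
import numpy as np

def r_index(signal_ratings, noise_ratings):
    """R-Index: P(signal rated higher than noise), ties counted half.

    Nonparametric (no distributional assumptions); 0.5 means no
    discrimination between the two products, 1.0 means perfect.
    """
    s = np.asarray(signal_ratings)[:, None]
    n = np.asarray(noise_ratings)[None, :]
    wins = (s > n).sum()   # pairs where the signal rating is higher
    ties = (s == n).sum()  # tied pairs count half
    return (wins + 0.5 * ties) / (s.size * n.size)

# Invented 1-5 category ratings for two products:
r = r_index([4, 5, 5, 3], [2, 3, 3, 1])  # 0.9375
```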

  18. Relative quantification in seed GMO analysis: state of art and bottlenecks.

    PubMed

    Chaouachi, Maher; Bérard, Aurélie; Saïd, Khaled

    2013-06-01

    Reliable quantitative methods are needed to comply with current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) and GMO-derived food and feed products with a GMO content above 0.9%. EU Commission Recommendation 2004/787/EC on technical guidance for sampling and detection, meant as a helpful tool for the practical implementation of EC Regulation 1830/2003, states that "the results of quantitative analysis should be expressed as the number of target DNA sequences per target taxon specific sequences calculated in terms of haploid genomes". This has led to an intense debate on the type of calibrator best suited for GMO quantification. The main question addressed in this review is whether reference materials and calibrators should be matrix based or whether pure DNA analytes should be used for relative quantification in GMO analysis. The state of the art, including the advantages and drawbacks of using plasmid DNA (compared to genomic DNA reference materials) as a calibrator, is described in detail. In addition, the influence of the genetic structure of seeds on quantitative real-time PCR results obtained for seed lots is discussed. The specific composition of a seed kernel, the mode of inheritance, and the ploidy level lead to discordance between a GMO% expressed as haploid genome equivalents and a GMO% based on numbers of seeds. This means that a threshold fixed as a percentage of seeds cannot be used as such for real-time PCR. All critical points that affect the expression of the GMO content in seeds are discussed in this paper.
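    The seed-count vs. haploid-genome discordance can be illustrated with a toy conversion (a deliberate simplification that ignores the triploid endosperm and assumes equal DNA per seed; the function is hypothetical):

```python
def gmo_percent_haploid(seed_fraction_gm, gm_alleles=1, ploidy=2):
    """GMO% in haploid genome equivalents for a seed lot.

    seed_fraction_gm: fraction of seeds carrying the GM event
    gm_alleles: GM allele copies per GM seed (1 = hemizygous)
    ploidy: genome copies per cell (2 = diploid)
    """
    return 100.0 * seed_fraction_gm * gm_alleles / ploidy

# 0.9% of seeds hemizygous for the event in a diploid crop:
pct = gmo_percent_haploid(0.009)  # 0.45% by haploid genomes, not 0.9%
```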

  19. Comparison of analysis methods for airway quantification

    NASA Astrophysics Data System (ADS)

    Odry, Benjamin L.; Kiraly, Atilla P.; Novak, Carol L.; Naidich, David P.

    2012-03-01

    Diseased airways have been recognized for several years as a possible contributing factor to airflow limitation in Chronic Obstructive Pulmonary Disease (COPD). Quantification of disease severity through the evaluation of airway dimensions - wall thickness and lumen diameter - has gained increased attention thanks to the availability of multi-slice computed tomography (CT). Novel approaches have focused on automated measurement as a faster and more objective means than the visual assessment routinely employed in the clinic. Since the Full-Width Half-Maximum (FWHM) method of airway measurement was introduced two decades ago [1], several new techniques for quantifying airways have been detailed in the literature, but no approach has truly become a standard for such analysis. Our own research group has presented two alternative approaches for determining airway dimensions, one involving a minimum path and the other active contours [2, 3]. With an increasing number of techniques dedicated to the same goal, we decided to take a step back and analyze the differences between these methods. We consequently put our two methods of analysis and the FWHM approach to the test. We first measured a set of 5 airways from a phantom of known dimensions. We then compared measurements from the three methods to those of two independent readers, performed on 35 airways in 5 patients. We elaborate on the differences between the approaches and draw conclusions about which could be considered the best.
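    For reference, the FWHM principle can be sketched on a one-dimensional intensity profile (a simplified illustration, not any of the compared implementations; it assumes the profile falls below the half level on both sides of the peak):

```python
import numpy as np

def _cross(x0, y0, x1, y1, level):
    # x where the segment (x0, y0)-(x1, y1) crosses `level`
    return x0 + (level - y0) * (x1 - x0) / (y1 - y0)

def fwhm_width(profile, x=None):
    """Full-Width Half-Maximum of a 1-D profile, with linear interpolation."""
    y = np.asarray(profile, float)
    x = np.arange(y.size, dtype=float) if x is None else np.asarray(x, float)
    i = int(np.argmax(y))
    level = y.min() + 0.5 * (y.max() - y.min())  # half-max above baseline
    j = i
    while j > 0 and y[j - 1] > level:            # walk left until below half-max
        j -= 1
    left = _cross(x[j - 1], y[j - 1], x[j], y[j], level)
    k = i
    while k < y.size - 1 and y[k + 1] > level:   # walk right
        k += 1
    right = _cross(x[k], y[k], x[k + 1], y[k + 1], level)
    return right - left
```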

  20. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of a combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  1. Reliability and quality assurance on the MOD 2 wind system

    NASA Technical Reports Server (NTRS)

    Mason, W. E. B.; Jones, B. G.

    1981-01-01

    The Safety, Reliability, and Quality Assurance (R&QA) approach developed for the largest wind turbine generator, the Mod 2, is described. The R&QA approach assures that the machine is not hazardous to the public or to the operating personnel, can be operated unattended on a utility grid, and demonstrates reliable operation, and it helps establish the quality assurance and maintainability requirements for future wind turbine projects. The key elements were a failure modes and effects analysis (FMEA) during the design phase, hardware inspections during parts fabrication, and three simple documents to control activities during machine construction and operation.

  2. HPLC-MRM relative quantification analysis of fatty acids based on a novel derivatization strategy.

    PubMed

    Cai, Tie; Ting, Hu; Xin-Xiang, Zhang; Jiang, Zhou; Jin-Lan, Zhang

    2014-12-07

    Fatty acids (FAs) are associated with a series of diseases including tumors, diabetes, and heart diseases. As potential biomarkers, FAs have attracted increasing attention from both biological researchers and the pharmaceutical industry. However, poor ionization efficiency, extreme diversity, strict dependence on internal standards and complicated multiple reaction monitoring (MRM) optimization protocols have challenged efforts to quantify FAs. In this work, a novel derivatization strategy based on 2,4-bis(diethylamino)-6-hydrazino-1,3,5-triazine was developed to enable quantification of FAs. The sensitivity of FA detection was significantly enhanced as a result of the derivatization procedure. FA quantities as low as 10 fg could be detected by high-performance liquid chromatography coupled with triple-quadrupole mass spectrometry. General MRM conditions were developed for any FA, which facilitated the quantification and extended the application of the method. The FA quantification strategy based on HPLC-MRM was carried out using deuterated derivatization reagents. "Heavy" derivatization reagents were used as internal standards (ISs) to minimize matrix effects. Prior to statistical analysis, amounts of each FA species were normalized by their corresponding IS, which guaranteed the accuracy and reliability of the method. FA changes in plasma induced by ageing were studied using this strategy. Several FA species were identified as potential ageing biomarkers. The sensitivity, accuracy, reliability, and full coverage of the method ensure that this strategy has strong potential for both biomarker discovery and lipidomic research.
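    The heavy-IS normalization step amounts to a simple ratio per FA species; a sketch with invented peak areas and amounts (keys and values are illustrative):

```python
def normalize_by_is(areas, is_areas, is_amount):
    """Relative quantification against stable-isotope internal standards.

    Each analyte peak area is divided by the area of its co-eluting
    "heavy" internal standard and scaled by the known IS amount, so
    matrix effects that hit both forms equally cancel out.
    """
    return {fa: areas[fa] / is_areas[fa] * is_amount for fa in areas}

# Invented areas; IS spiked at 5.0 pmol per species.
amounts = normalize_by_is({"FA 16:0": 2000.0, "FA 18:1": 500.0},
                          {"FA 16:0": 1000.0, "FA 18:1": 1000.0},
                          is_amount=5.0)
```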

  3. Protein quantification using a cleavable reporter peptide.

    PubMed

    Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno

    2015-02-06

    Peptide and protein quantification based on isotope dilution and mass spectrometry analysis is widely employed for the measurement of biomarkers and in systems biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable-isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as a surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples, and compared with results obtained with conventional stable-isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.

  4. Rapid method for the quantification of hydroquinone concentration: chemiluminescent analysis.

    PubMed

    Chen, Tung-Sheng; Liou, Show-Yih; Kuo, Wei-Wen; Wu, Hsi-Chin; Jong, Gwo-Ping; Wang, Hsueh-Fang; Shen, Chia-Yao; Padma, V Vijaya; Huang, Chih-Yang; Chang, Yen-Lin

    2015-11-01

    Topical hydroquinone serves as a skin whitener and is usually available in cosmetics or on prescription depending on the hydroquinone concentration. Quantification of hydroquinone content is therefore an important issue for topical agents. High-performance liquid chromatography (HPLC) is the commonest method for determining hydroquinone content in topical agents, but it is time-consuming and uses many solvents, which can become an environmental issue. We report a rapid method for quantifying hydroquinone content by chemiluminescent analysis. Hydroquinone induces the production of hydrogen peroxide in the presence of basic compounds. The hydrogen peroxide induced by hydroquinone oxidizes light-emitting materials such as lucigenin, producing an ultra-weak chemiluminescence that is detected by a chemiluminescence analyzer. The intensity of the chemiluminescence was found to be proportional to the hydroquinone concentration. We suggest that the rapid (measurement time, 60 s) and virtually solvent-free (solvent volume, <2 mL) chemiluminescent method described here for quantifying hydroquinone content may be an alternative to HPLC analysis. Copyright © 2015 John Wiley & Sons, Ltd.
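    Since the emitted intensity is proportional to concentration, quantification reduces to a linear calibration curve; a generic sketch (the calibration points are invented):

```python
import numpy as np

def calibrate(conc, intensity):
    """Least-squares line through the calibration points."""
    slope, intercept = np.polyfit(conc, intensity, 1)
    return slope, intercept

def quantify(signal, slope, intercept):
    """Invert the calibration line to estimate an unknown concentration."""
    return (signal - intercept) / slope

# Invented standards (mM) and chemiluminescence counts:
slope, intercept = calibrate([0.0, 1.0, 2.0, 4.0], [10.0, 13.0, 16.0, 22.0])
```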

  5. Comparison of colorimetric assays with quantitative amino acid analysis for protein quantification of Generalized Modules for Membrane Antigens (GMMA).

    PubMed

    Rossi, Omar; Maggiore, Luana; Necchi, Francesca; Koeberling, Oliver; MacLennan, Calman A; Saul, Allan; Gerke, Christiane

    2015-01-01

    Genetically induced outer membrane particles from Gram-negative bacteria, called Generalized Modules for Membrane Antigens (GMMA), are being investigated as vaccines. Rapid methods are required for estimating the protein content in in-process assays during production. Since GMMA are complex biological structures containing lipid and polysaccharide as well as protein, protein determinations are not necessarily straightforward. We compared protein quantification by the Bradford, Lowry, and Non-Interfering assays using bovine serum albumin (BSA) as the standard with quantitative amino acid (AA) analysis, the most accurate currently available method for protein quantification. The Lowry assay has the lowest inter- and intra-assay variation and gives the best linearity between protein amount and absorbance. In all three assays, the color yield (optical density per mass of protein) of GMMA was markedly different from that of BSA: the ratio was approximately 4 for the Bradford assay (and highly variable between different GMMA) and approximately 0.7 for the Lowry and Non-Interfering assays, highlighting the need to calibrate the standard used in the colorimetric assay against GMMA quantified by AA analysis. In terms of the combination of ease, reproducibility, proportionality of protein measurement, and comparability between samples, the Lowry assay was superior to the Bradford and Non-Interfering assays for GMMA quantification.

  6. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
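    The idea of colour thresholding, as opposed to monochrome densitometry, can be sketched with per-channel bounds (the pixel values and bounds are invented):

```python
import numpy as np

def color_threshold(rgb, lower, upper):
    """Select pixels whose R, G, and B all fall inside per-channel bounds.

    Monochrome densitometry collapses hue information; thresholding each
    colour channel separately keeps closely related hues apart.
    """
    rgb = np.asarray(rgb)
    mask = np.all((rgb >= lower) & (rgb <= upper), axis=-1)
    return mask, int(mask.sum())

# A 2x2 image: two brownish (stain-like) pixels, one bluish, one grey.
img = np.array([[[150, 75, 40], [60, 70, 180]],
                [[155, 80, 45], [200, 200, 200]]], dtype=np.uint8)
mask, n = color_threshold(img, lower=(120, 50, 20), upper=(180, 110, 70))
# n == 2: only the two brownish pixels pass
```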

  7. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be quantified spectrally using a spectrometer, and to identify the optimal wavebands for quantifying that content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and stepwise multivariate regression were used to select the optimal wavebands for cannabinoid quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves can be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis in the field.

  8. Pore network quantification of sandstones under experimental CO2 injection using image analysis

    NASA Astrophysics Data System (ADS)

    Berrezueta, Edgar; González-Menéndez, Luís; Ordóñez-Casado, Berta; Olaya, Peter

    2015-04-01

    Automated image identification and quantification of minerals, pores, and textures, together with petrographic analysis, can improve pore system characterization in sedimentary rocks. Our case study applies these techniques to the evolution of a rock pore network subjected to supercritical CO2 injection. We propose a Digital Image Analysis (DIA) protocol that guarantees measurement reproducibility and reliability. It can be summarized in the following stages: (i) detailed description of mineralogy and texture (before and after CO2 injection) by optical and scanning electron microscopy (SEM) using thin sections; (ii) adjustment and calibration of DIA tools; (iii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of polarizers); (iv) DIA study and quantification that allows (a) identification and isolation of pixels belonging to the same category (minerals vs. pores in each sample) and (b) measurement of changes in the pore network after the samples have been exposed to new conditions (here, SC-CO2 injection). Finally, the petrography and the measured data were interpreted using an automated approach. In our study, the DIA results highlight the changes observed by SEM and optical microscopy, which consisted of a porosity increase after CO2 treatment. Other changes were minor: variations in the roughness and roundness of pore edges and in pore aspect ratio, seen in the larger pore population. Additionally, statistical tests of the measured pore parameters were applied to verify that the differences observed between samples before and after CO2 injection were significant.

  9. Evaluation of digital PCR for absolute RNA quantification.

    PubMed

    Sanders, Rebecca; Mason, Deborah J; Foy, Carole A; Huggett, Jim F

    2013-01-01

    Gene expression measurements detailing mRNA quantities are widely employed in molecular biology and are increasingly important in diagnostics. Reverse transcription (RT), necessary for generating complementary DNA, can be both inefficient and imprecise, but remains an essential step in RNA analysis by qPCR. This study developed a Transcriptomic Calibration Material and assessed the RT reaction using digital (d)PCR for RNA measurement. While many studies characterise dPCR capabilities for DNA quantification, less work has investigated similar parameters using RT-dPCR for RNA analysis. RT-dPCR measurement using three one-step RT-qPCR kits was evaluated in single and multiplex formats when measuring endogenous and synthetic RNAs. The best performing kit was compared to UV quantification, and sensitivity and technical reproducibility were investigated. Our results demonstrate that assay- and kit-dependent RT-dPCR measurements differed significantly from UV quantification. Different values were reported by different kits for each target, despite evaluation of identical samples on the same instrument. RT-dPCR did not display the strong inter-assay agreement previously described for DNA analysis. This study demonstrates that, as with DNA measurement, RT-dPCR is capable of accurate quantification of low copy RNA targets, but the results are both kit and target dependent, supporting the need for calibration controls.

  10. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Quantification of early cutaneous manifestations of chronic venous insufficiency by automated analysis of photographic images: Feasibility and technical considerations.

    PubMed

    Becker, François; Fourgeau, Patrice; Carpentier, Patrick H; Ouchène, Amina

    2018-06-01

    We postulate that blue telangiectasia and brownish pigmentation at ankle level, early markers of chronic venous insufficiency, can be quantified for longitudinal studies of chronic venous disease in Caucasian people. Objectives and methods: To describe a photographic technique specially developed for this purpose, and to test the short-term reproducibility of the measures. The pictures were acquired using a dedicated photo stand to position the foot in a reproducible way, with a normalized lighting and acquisition protocol. The image analysis was performed with a tool built on algorithms optimized to detect and quantify blue telangiectasia and brownish pigmentation and their relative surface in the region of interest. Results: The quantification of blue telangiectasia and brownish pigmentation by automated digital photo analysis is feasible. The short-term reproducibility is good for blue telangiectasia quantification; it is less accurate for the brownish pigmentation. Conclusion: The blue telangiectasia of the corona phlebectatica and the ankle flare can be assessed using a clinimetric approach based on automated digital photo analysis.

  12. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  13. Use of recurrence plot and recurrence quantification analysis in Taiwan unemployment rate time series

    NASA Astrophysics Data System (ADS)

    Chen, Wei-Shing

    2011-04-01

    The aim of this article is to answer the question of whether the dynamics of the Taiwan unemployment rate are generated by a non-linear deterministic process. The paper applies a recurrence plot and recurrence quantification approach based on the analysis of non-stationary hidden transition patterns in the unemployment rate of Taiwan. The case study uses time series data on Taiwan's unemployment rate for the period 1978/01 to 2010/06. The results show that recurrence techniques are able to identify various phases in the evolution of the unemployment transition in Taiwan.
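    The recurrence-plot measures used in such studies can be sketched as follows (time-delay embedding plus a binary recurrence matrix; the parameter values are illustrative, not those of the study):

```python
import numpy as np

def _run_lengths(bools):
    """Lengths of consecutive runs of True values."""
    runs, count = [], 0
    for b in bools:
        if b:
            count += 1
        else:
            if count:
                runs.append(count)
            count = 0
    if count:
        runs.append(count)
    return runs

def rqa_measures(series, dim=2, delay=1, radius=0.5, lmin=2):
    """Recurrence rate (RR) and determinism (DET) of a scalar series."""
    x = np.asarray(series, float)
    n = x.size - (dim - 1) * delay
    # time-delay embedding: n points in `dim` dimensions
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    rec = dist <= radius
    np.fill_diagonal(rec, False)     # drop the trivial line of identity
    total = int(rec.sum())
    rr = total / (n * n - n)         # share of recurrent point pairs
    # DET: share of recurrent points on diagonal lines of length >= lmin
    diag = sum(run for k in range(1, n)
               for run in _run_lengths(np.diag(rec, k)) if run >= lmin)
    det = 2 * diag / total if total else 0.0   # matrix is symmetric
    return rr, det
```

    A deterministic (e.g. periodic) series yields a DET close to 1, while uncorrelated noise scatters recurrent points off diagonal lines.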

  14. Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.

    PubMed

    Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian

    2017-04-01

    Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques

  15. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part of molecular cloning, as well as of protein expression and purification. Parallel quantification of yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable, and simple protocol for gel quantification and documentation, applicable to single samples and with special features for protein expression screens. As the major component of the protocol, the fully annotated code of a proprietary open-source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented in the C-based macro language of the widespread IGOR Pro integrated development environment. Copyright © 2014 Elsevier Inc. All rights reserved.
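    A densitometric lane quantification can be sketched as follows (a generic illustration, not GelQuant's actual algorithm; lane positions and intensities are invented):

```python
import numpy as np

def lane_quantities(gel, lane_slices, background=None):
    """Integrated band intensity per lane of a digitized gel image.

    Subtracts a flat background estimate (image median unless supplied)
    and sums the positive residual over each lane's column range.
    """
    bg = float(np.median(gel)) if background is None else background
    return [float(np.clip(gel[:, sl] - bg, 0.0, None).sum())
            for sl in lane_slices]

# Synthetic gel: flat background of 10 with one band in the first lane.
gel = np.full((10, 9), 10.0)
gel[4:6, 0:3] += 50.0
q = lane_quantities(gel, [slice(0, 3), slice(3, 6), slice(6, 9)])
```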

  16. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis for handling attribute data as predictor variables. It is very useful in medical research for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on multiple attribute data. In Japan, the method is so well known that most computer program packages include the Hayashi Quantification, but it still seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles on recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to clarifying how well the findings provided by the method satisfied the researchers. At the same time, some recommendations are made about terminology and program packages, and a brief discussion of the background of the quantification methods is given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  17. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background: Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results: Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
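    Standard-curve quantification and the associated PCR efficiency can be sketched as follows (the Cq values are constructed to be ideal; the helper names are illustrative):

```python
import numpy as np

def standard_curve(copies, cq):
    """Fit Cq = slope*log10(copies) + intercept.

    Returns slope, intercept, and amplification efficiency
    E = 10**(-1/slope) - 1, where E = 1.0 means perfect doubling
    per cycle (slope of about -3.32).
    """
    slope, intercept = np.polyfit(np.log10(copies), cq, 1)
    return slope, intercept, 10.0 ** (-1.0 / slope) - 1.0

def copies_from_cq(cq, slope, intercept):
    """Invert the standard curve for an unknown sample."""
    return 10.0 ** ((cq - intercept) / slope)

# Ideal dilution series at 100% efficiency:
dil = np.array([1e2, 1e3, 1e4, 1e5])
cq_vals = 40.0 - 3.3219280948873623 * np.log10(dil)
slope, intercept, eff = standard_curve(dil, cq_vals)
```

    A sample whose matrix changes the efficiency relative to the standards will be misquantified by exactly this inversion, which is the point the abstract makes.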

  18. Archetypal Analysis for Sparse Representation-Based Hyperspectral Sub-Pixel Quantification

    NASA Astrophysics Data System (ADS)

    Drees, L.; Roscher, R.

    2017-05-01

This paper focuses on the quantification of land cover fractions in an urban area of Berlin, Germany, using simulated hyperspectral EnMAP data with a spatial resolution of 30 m × 30 m. For this, sparse representation is applied, where each pixel with unknown surface characteristics is expressed by a weighted linear combination of elementary spectra with known land cover class. The elementary spectra are determined from image reference data using simplex volume maximization, which is a fast heuristic technique for archetypal analysis. In the experiments, the estimation of class fractions based on the archetypal spectral library is compared to the estimation obtained by a manually designed spectral library by means of reconstruction error, mean absolute error of the fraction estimates, sum of fractions and the number of used elementary spectra. We will show that a collection of archetypes can be an adequate and efficient alternative to the spectral library with respect to the mentioned criteria.
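The unmixing model described here, each pixel as a nonnegative weighted combination of library spectra, can be sketched with nonnegative least squares. This is a minimal illustration on a random synthetic library, not the paper's simplex-volume-maximization implementation.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical library of elementary spectra (columns), one per land-cover class.
rng = np.random.default_rng(0)
library = rng.random((50, 3))            # 50 spectral bands, 3 classes

true_fractions = np.array([0.6, 0.3, 0.1])
pixel = library @ true_fractions         # noise-free mixed pixel

weights, _ = nnls(library, pixel)        # nonnegative least-squares unmixing
fractions = weights / weights.sum()      # normalize so fractions sum to 1
```

With noise-free data the true fractions are recovered exactly; the abstract's "sum of fractions" criterion checks how far raw weights deviate from 1 before any normalization.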

  19. Automated flow quantification in valvular heart disease based on backscattered Doppler power analysis: implementation on matrix-array ultrasound imaging systems.

    PubMed

    Buck, Thomas; Hwang, Shawn M; Plicht, Björn; Mucci, Ronald A; Hunold, Peter; Erbel, Raimund; Levine, Robert A

    2008-06-01

Cardiac ultrasound imaging systems are limited in the noninvasive quantification of valvular regurgitation due to indirect measurements and inaccurate hemodynamic assumptions. We recently demonstrated that the principle of integration of backscattered acoustic Doppler power times velocity can be used for flow quantification in valvular regurgitation directly at the vena contracta of a regurgitant flow jet. We now aimed to accomplish implementation of automated Doppler power flow analysis software on a standard cardiac ultrasound system utilizing novel matrix-array transducer technology, with a detailed description of the system requirements, components and software contributing to the system. This system, based on a 3.5 MHz matrix-array cardiac ultrasound scanner (Sonos 5500, Philips Medical Systems), was validated by means of comprehensive experimental signal generator trials, in vitro flow phantom trials and in vivo testing in 48 patients with mitral regurgitation of different severity and etiology, using magnetic resonance imaging (MRI) for reference. All measurements displayed good correlation to the reference values, indicating successful implementation of automated Doppler power flow analysis on a matrix-array ultrasound imaging system. Systematic underestimation of effective regurgitant orifice areas >0.65 cm(2) and volumes >40 ml was found due to currently limited Doppler beam width, which could be readily overcome by the use of new generation 2D matrix-array technology. Automated flow quantification in valvular heart disease based on backscattered Doppler power can be fully implemented on board routinely used matrix-array ultrasound imaging systems. Such automated Doppler power flow analysis quantifies valvular regurgitant flow directly, noninvasively, and user-independently, overcoming the practical limitations of current techniques.

  20. A comparison of the imaging characteristics of the new Kodak Hyper Speed G film with the current T-MAT G/RA film and the CR 9000 system.

    PubMed

    Monnin, P; Gutierrez, D; Bulling, S; Lepori, D; Verdun, F R

    2005-10-07

    Three standard radiation qualities (RQA 3, RQA 5 and RQA 9) and two screens, Kodak Lanex Regular and Insight Skeletal, were used to compare the imaging performance and dose requirements of the new Kodak Hyper Speed G and the current Kodak T-MAT G/RA medical x-ray films. The noise equivalent quanta (NEQ) and detective quantum efficiencies (DQE) of the four screen-film combinations were measured at three gross optical densities and compared with the characteristics for the Kodak CR 9000 system with GP (general purpose) and HR (high resolution) phosphor plates. The new Hyper Speed G film has double the intrinsic sensitivity of the T-MAT G/RA film and a higher contrast in the high optical density range for comparable exposure latitude. By providing both high sensitivity and high spatial resolution, the new film significantly improves the compromise between dose and image quality. As expected, the new film has a higher noise level and a lower signal-to-noise ratio than the standard film, although in the high frequency range this is compensated for by a better resolution, giving better DQE results--especially at high optical density. Both screen-film systems outperform the phosphor plates in terms of MTF and DQE for standard imaging conditions (Regular screen at RQA 5 and RQA 9 beam qualities). At low energy (RQA 3), the CR system has a comparable low-frequency DQE to screen-film systems when used with a fine screen at low and middle optical densities, and a superior low-frequency DQE at high optical density.

  1. One month of contemporary dance modulates fractal posture in aging

    PubMed Central

    Coubard, Olivier A.; Ferrufino, Lena; Nonaka, Tetsushi; Zelada, Oscar; Bril, Blandine; Dietrich, Gilles

    2013-01-01

Understanding the human aging of postural control and how physical or motor activity improves balance and gait is challenging for both clinicians and researchers. Previous studies have evidenced that physical and sporting activity focusing on cardiovascular and strength conditioning helps older adults develop their balance and gait and/or decrease their frequency of falls. Motor activity based on motor-skill learning has also been put forward as an alternative to develop balance and/or prevent falls in aging. Specifically, dance has been advocated as a promising program to boost motor control. In this study, we examined the effects of contemporary dance (CD) on postural control of older adults. Upright stance posturography was performed in 38 participants aged 54–89 years before and after the intervention period, during which one half of the randomly assigned participants was trained in CD and the other half was not trained at all (no dance, ND). CD training lasted 4 weeks, 3 times a week. We computed classical statistical scores of the postural signal and performed dynamic analyses, namely signal diffusion analysis (SDA), recurrence quantification analysis (RQA), and detrended fluctuation analysis (DFA). CD modulated postural control in older trainees, as revealed in the eyes closed condition by a decrease in fractal dimension and an increase in the DFA alpha component in the mediolateral plane. The ND group showed an increase in length and mean velocity of the postural signal, and, in the eyes open condition, a decrease in RQA maximal diagonal line in the anteroposterior plane and an increase in the DFA alpha component in the mediolateral plane. No change was found in SDA in either group. We suggest that such massed practice of CD reduced the quantity of exchange between the subject and the environment by increasing their postural confidence. Since CD has low-physical but high-motor impact, we conclude that it may be recommended as a useful program to rehabilitate posture in aging. PMID:24611047
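For readers unfamiliar with the RQA measures used above (recurrence rate, maximal diagonal line), a minimal sketch follows. It is a toy example on a raw 1D signal with a fixed radius; real posturographic analyses typically embed the series in a phase space first.

```python
import numpy as np

def rqa_measures(x, radius):
    """Recurrence rate and longest diagonal line (main diagonal excluded)."""
    n = len(x)
    # Recurrence matrix: pairs of points closer than `radius` are recurrent.
    R = (np.abs(x[:, None] - x[None, :]) < radius).astype(int)
    rr = R.sum() / (n * n)
    # Longest run of recurrent points along any off-main diagonal.
    lmax = 0
    for k in range(1, n):
        run = best = 0
        for v in np.diagonal(R, offset=k):
            run = run + 1 if v else 0
            best = max(best, run)
        lmax = max(lmax, best)
    return rr, lmax

# A periodic signal revisits the same states, producing long diagonal lines.
t = np.linspace(0, 8 * np.pi, 200)
rr, lmax = rqa_measures(np.sin(t), radius=0.1)
```

A shrinking maximal diagonal line, as reported for the ND group, indicates shorter stretches during which the trajectory shadows an earlier piece of itself.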

  2. A research design for the quantification of the neuropeptides substance p and calcitonin gene-related Peptide in rat skin using Western blot analysis.

    PubMed

    Lapin, Guilherme Abbud Franco; Hochman, Bernardo; Nishioka, Michele Akemi; Maximino, Jessica Ruivo; Chadi, Gerson; Ferreira, Lydia Masako

    2015-06-01

    To describe and standardize a protocol that overcomes the technical limitations of Western blot (WB) analysis in the quantification of the neuropeptides substance P (SP) and calcitonin gene-related peptide (CGRP) following nociceptive stimuli in rat skin. Male Wistar rats (Rattus norvegicus albinus) weighing 250 to 350 g were used in this study. Elements of WB analysis were adapted by using specific manipulation of samples, repeated cycles of freezing and thawing, more thorough maceration, and a more potent homogenizer; increasing lytic reagents; promoting greater inhibition of protease activity; and using polyvinylidene fluoride membranes as transfer means for skin-specific protein. Other changes were also made to adapt the WB analysis to a rat model. University research center. Western blot analysis adapted to a rat model. This research design has proven effective in collecting and preparing skin samples to quantify SP and CGRP using WB analysis in rat skin. This study described a research design that uses WB analysis as a reproducible, technically accessible, and cost-effective method for the quantification of SP and CGRP in rat skin that overcomes technical biases.

  3. Recommendations and Standardization of Biomarker Quantification Using NMR-Based Metabolomics with Particular Focus on Urinary Analysis.

    PubMed

    Emwas, Abdul-Hamid; Roy, Raja; McKay, Ryan T; Ryan, Danielle; Brennan, Lorraine; Tenori, Leonardo; Luchinat, Claudio; Gao, Xin; Zeri, Ana Carolina; Gowda, G A Nagana; Raftery, Daniel; Steinbeck, Christoph; Salek, Reza M; Wishart, David S

    2016-02-05

    NMR-based metabolomics has shown considerable promise in disease diagnosis and biomarker discovery because it allows one to nondestructively identify and quantify large numbers of novel metabolite biomarkers in both biofluids and tissues. Precise metabolite quantification is a prerequisite to move any chemical biomarker or biomarker panel from the lab to the clinic. Among the biofluids commonly used for disease diagnosis and prognosis, urine has several advantages. It is abundant, sterile, and easily obtained, needs little sample preparation, and does not require invasive medical procedures for collection. Furthermore, urine captures and concentrates many "unwanted" or "undesirable" compounds throughout the body, providing a rich source of potentially useful disease biomarkers; however, incredible variation in urine chemical concentrations makes analysis of urine and identification of useful urinary biomarkers by NMR challenging. We discuss a number of the most significant issues regarding NMR-based urinary metabolomics with specific emphasis on metabolite quantification for disease biomarker applications and propose data collection and instrumental recommendations regarding NMR pulse sequences, acceptable acquisition parameter ranges, relaxation effects on quantitation, proper handling of instrumental differences, sample preparation, and biomarker assessment.

  4. Automatic quantification of morphological features for hepatic trabeculae analysis in stained liver specimens

    PubMed Central

    Ishikawa, Masahiro; Murakami, Yuri; Ahi, Sercan Taha; Yamaguchi, Masahiro; Kobayashi, Naoki; Kiyuna, Tomoharu; Yamashita, Yoshiko; Saito, Akira; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2016-01-01

This paper proposes a digital image analysis method to support quantitative pathology by automatically segmenting the hepatocyte structure and quantifying its morphological features. To structurally analyze histopathological hepatic images, we isolate the trabeculae by extracting the sinusoids, fat droplets, and stromata. We then measure the morphological features of the extracted trabeculae, divide the image into cords, and calculate the feature values of the local cords. We propose a method of calculating the nuclear–cytoplasmic ratio, nuclear density, and number of layers using the local cords. Furthermore, we evaluate the effectiveness of the proposed method using surgical specimens. The proposed method was found to be an effective method for the quantification of the Edmondson grade. PMID:27335894

  5. Recent advances in hopanoids analysis: Quantification protocols overview, main research targets and selected problems of complex data exploration.

    PubMed

    Zarzycki, Paweł K; Portka, Joanna K

    2015-09-01

Pentacyclic triterpenoids, particularly hopanoids, are organism-specific compounds and are generally considered useful biomarkers that allow fingerprinting and classification of biological, environmental and geological samples. Simultaneous quantification of various hopanoids together with a battery of related non-polar and low-molecular mass compounds may provide principal information for geochemical and environmental research focusing on both modern and ancient investigations. Target compounds can be derived from microbial biomass, water columns, sediments, coals, crude fossils or rocks. This creates a number of analytical problems due to the differing composition of the analytical matrix and interfering compounds; therefore, proper optimization of quantification protocols for such biomarkers is still a challenge. In this work we summarize typical analytical protocols that have recently been applied for the quantification of hopanoid-like compounds from different samples. The main steps, including extraction of the components of interest, pre-purification, fractionation, derivatization and quantification involving gas (1D and 2D) as well as liquid separation techniques (liquid-liquid extraction, solid-phase extraction, planar and low resolution column chromatography, high-performance liquid chromatography), are described and discussed from a practical point of view, based mainly on the experimental papers published within the last two years, a period in which a significant increase in hopanoid research was noticed. The second aim of this review is to describe the latest research trends concerning the determination of hopanoids and related low-molecular mass lipids analyzed in various samples including sediments, rocks, coals, crude oils and plant fossils as well as stromatolites and microbial biomass cultivated under different conditions. It has been found that the majority of the most recent papers are based on a uni- or bivariate approach to complex data analysis. Data interpretation involves

  6. Myocardial blood flow quantification by Rb-82 cardiac PET/CT: A detailed reproducibility study between two semi-automatic analysis programs.

    PubMed

    Dunet, Vincent; Klein, Ran; Allenbach, Gilles; Renaud, Jennifer; deKemp, Robert A; Prior, John O

    2016-06-01

Several analysis software packages for myocardial blood flow (MBF) quantification from cardiac PET studies exist, but they have not been compared using concordance analysis, which can characterize precision and bias separately. Reproducible measurements are needed for quantification to fully develop its clinical potential. Fifty-one patients underwent dynamic Rb-82 PET at rest and during adenosine stress. Data were processed with PMOD and FlowQuant (Lortie model). MBF and myocardial flow reserve (MFR) polar maps were quantified and analyzed using a 17-segment model. Comparisons used Pearson's correlation ρ (measuring precision), Bland and Altman limit-of-agreement and Lin's concordance correlation ρc = ρ·Cb (Cb measuring systematic bias). Lin's concordance and Pearson's correlation values were very similar, suggesting no systematic bias between software packages, with an excellent precision ρ for MBF (ρ = 0.97, ρc = 0.96, Cb = 0.99) and good precision for MFR (ρ = 0.83, ρc = 0.76, Cb = 0.92). On a per-segment basis, no mean bias was observed on Bland-Altman plots, although PMOD provided slightly higher values than FlowQuant at higher MBF and MFR values (P < .0001). Concordance between software packages was excellent for MBF and MFR, despite higher values by PMOD at higher MBF values. Both software packages can be used interchangeably for quantification in daily practice of Rb-82 cardiac PET.
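Lin's concordance coefficient used in this study factors agreement into precision (Pearson's ρ) and a bias term. A small sketch with made-up MBF-like values (not the study data):

```python
import numpy as np

def concordance(x, y):
    """Pearson's rho (precision) and Lin's rho_c = rho * C_b (C_b: bias term)."""
    rho = np.corrcoef(x, y)[0, 1]
    sxy = np.cov(x, y, bias=True)[0, 1]          # population covariance
    rho_c = 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
    return rho, rho_c, rho_c / rho

# Hypothetical paired measurements: y tracks x perfectly but with a small offset,
# so precision (rho) is 1 while the bias term pulls rho_c below 1.
x = np.array([0.8, 1.0, 1.5, 2.1, 2.6, 3.0])
y = x + 0.1
rho, rho_c, c_b = concordance(x, y)
```

This is exactly why concordance analysis can separate the two failure modes the abstract mentions: scatter lowers ρ, while a systematic shift lowers Cb even when ρ stays high.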

  7. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    PubMed

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.

  8. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    PubMed

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  9. Isotope-coded ESI-enhancing derivatization reagents for differential analysis, quantification and profiling of metabolites in biological samples by LC/MS: A review.

    PubMed

    Higashi, Tatsuya; Ogawa, Shoujiro

    2016-10-25

The analysis of the qualitative and quantitative changes of metabolites in body fluids and tissues yields valuable information for the diagnosis, pathological analysis and treatment of many diseases. Recently, liquid chromatography/electrospray ionization-(tandem) mass spectrometry [LC/ESI-MS(/MS)] has been widely used for these purposes due to the high separation capability of LC, the broad coverage of ESI for various compounds and the high specificity of MS(/MS). However, there are still two major problems to be solved regarding biological sample analysis: lack of sensitivity and limited availability of stable isotope-labeled analogues (internal standards, ISs) for most metabolites. Stable isotope-coded derivatization (ICD) can be the answer to these problems. In ICD, different isotope-coded moieties are introduced to the metabolites and one of the resulting derivatives can serve as the IS, which minimizes matrix effects. Furthermore, the derivatization can improve the ESI efficiency, fragmentation properties in MS/MS and chromatographic behavior of the metabolites, leading to high sensitivity and specificity in the various detection modes. Based on this background, this article reviews the recently reported isotope-coded ESI-enhancing derivatization (ICEED) reagents, which are key components for ICD-based LC/MS(/MS) studies, and their applications to the detection, identification, quantification and profiling of metabolites in human and animal samples. LC/MS(/MS) using ICEED reagents is a powerful method especially for the differential analysis (relative quantification) of metabolites in two comparative samples, simultaneous quantification of multiple metabolites whose stable isotope-labeled ISs are not available, and submetabolome profiling. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.

    PubMed

    de Wit, C; Fautz, C; Xu, Y

    2000-09-01

    Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic uses. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in production cell culture and demonstration of sufficient elimination of such retrovirus-like particles by the down-stream purification process are required for product market registration worldwide. EM, with a detection limit of 1x10(6) particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA is equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'-->3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to the EM analysis. In addition, it offers significant advantages over the EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost. Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell
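The two-copies-of-pRNA-per-particle conversion described above lends itself to a one-line calculation once the qPCR standard curve is known. The curve parameters below are hypothetical, chosen only to show the arithmetic:

```python
# Hypothetical standard curve fitted from a pRNA dilution series:
# Ct = slope * log10(copies) + intercept.
slope, intercept = -3.4, 38.0

def particles_per_ml(ct, volume_ml):
    copies = 10 ** ((ct - intercept) / slope)   # invert the standard curve
    return copies / 2 / volume_ml               # 2 pRNA copies per particle

# A 1 microliter aliquot giving Ct = 26.0 in this illustrative setup.
n = particles_per_ml(ct=26.0, volume_ml=0.001)
```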

  11. CometQ: An automated tool for the detection and quantification of DNA damage using comet assay image analysis.

    PubMed

    Ganapathy, Sreelatha; Muraleedharan, Aparna; Sathidevi, Puthumangalathu Savithri; Chand, Parkash; Rajkumar, Ravi Philip

    2016-09-01

DNA damage analysis plays an important role in determining the approaches for treatment and prevention of various diseases like cancer, schizophrenia and other heritable diseases. Comet assay is a sensitive and versatile method for DNA damage analysis. The main objective of this work is to implement a fully automated tool for the detection and quantification of DNA damage by analysing comet assay images. The comet assay image analysis consists of four stages: (1) classifier, (2) comet segmentation, (3) comet partitioning and (4) comet quantification. The main features of the proposed software are the design and development of four comet segmentation methods, and the automatic routing of the input comet assay image to the most suitable one among these methods depending on the type of the image (silver stained or fluorescent stained) as well as the level of DNA damage (heavily damaged or lightly/moderately damaged). A classifier stage, based on a support vector machine (SVM), is designed and implemented at the front end to categorise the input image into one of the above four groups to ensure proper routing. Comet segmentation is followed by comet partitioning, which is implemented using a novel technique coined modified fuzzy clustering. Comet parameters are calculated in the comet quantification stage and are saved in an excel file. Our dataset consists of 600 silver stained images obtained from 40 schizophrenia patients with different levels of severity, admitted to a tertiary hospital in South India, and 56 fluorescent stained images obtained from different internet sources. The performance of "CometQ", the proposed standalone application for automated analysis of comet assay images, is evaluated by a clinical expert and is also compared with that of a recent, closely related software package, OpenComet. CometQ gave 90.26% positive predictive value (PPV) and 93.34% sensitivity, which are much higher than those of OpenComet, especially in the case of silver stained images. The
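The PPV and sensitivity figures reported above come straight from detection counts. A minimal sketch with hypothetical counts (not the study's raw numbers):

```python
def ppv_sensitivity(tp, fp, fn):
    """Positive predictive value TP/(TP+FP) and sensitivity TP/(TP+FN)."""
    return tp / (tp + fp), tp / (tp + fn)

# Hypothetical counts: comets flagged by an automated tool vs. expert ground truth.
ppv, sens = ppv_sensitivity(tp=271, fp=29, fn=19)
```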

  12. Adaptive quantification and longitudinal analysis of pulmonary emphysema with a hidden Markov measure field model.

    PubMed

    Hame, Yrjo; Angelini, Elsa D; Hoffman, Eric A; Barr, R Graham; Laine, Andrew F

    2014-07-01

    The extent of pulmonary emphysema is commonly estimated from CT scans by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols, and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the presented model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was applied on a longitudinal data set with 87 subjects and a total of 365 scans acquired with varying imaging protocols. The resulting emphysema estimates had very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. The generated emphysema delineations promise advantages for regional analysis of emphysema extent and progression.
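The thresholding baseline this paper improves upon (the proportional area of lung voxels below a fixed attenuation value, often called %LAA) is easy to state in code. A synthetic sketch, not the paper's hidden Markov measure field model:

```python
import numpy as np

def emphysema_percent(hu, lung_mask, threshold=-950):
    """Percent of lung voxels below an attenuation threshold (e.g. %LAA-950)."""
    lung = hu[lung_mask]
    return 100.0 * np.mean(lung < threshold)

# Synthetic HU volume with a known low-attenuation fraction.
hu = np.full((10, 10, 10), -800.0)
hu[:2] = -970.0                         # 20% of voxels fall below -950 HU
mask = np.ones_like(hu, dtype=bool)
pct = emphysema_percent(hu, mask)
```

The abstract's point is that this single global threshold is fragile under protocol and inspiration-level changes, which is what the adaptive parametric model addresses.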

  13. Analysis of actuator delay and its effect on uncertainty quantification for real-time hybrid simulation

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Xu, Weijie; Guo, Tong; Chen, Kai

    2017-10-01

Uncertainties in structure properties can result in different responses in hybrid simulations. Quantification of the effect of these uncertainties would enable researchers to estimate the variances of structural responses observed from experiments. This poses challenges for real-time hybrid simulation (RTHS) due to the existence of actuator delay. Polynomial chaos expansion (PCE) projects the model outputs onto a basis of orthogonal stochastic polynomials to account for the influences of model uncertainties. In this paper, PCE is utilized to evaluate the effect of actuator delay on the maximum displacement from real-time hybrid simulation of a single degree of freedom (SDOF) structure when accounting for uncertainties in structural properties. The PCE is first applied to RTHS without delay to determine the order of the PCE, the number of sample points and the method for coefficient calculation. The PCE is then applied to RTHS with actuator delay. The mean, variance and Sobol indices are compared and discussed to evaluate the effects of actuator delay on uncertainty quantification for RTHS. Results show that the mean and the variance of the maximum displacement increase linearly and exponentially with respect to actuator delay, respectively. Sensitivity analysis through Sobol indices also indicates that the influence of each single random variable decreases while the coupling effect increases with increasing actuator delay.
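How a PCE yields the mean and variance discussed above can be shown on a toy model: expand a function of a standard-normal input in probabilists' Hermite polynomials; the mean is the zeroth coefficient and the variance follows from the higher-order coefficients. This is a generic one-variable illustration, not the paper's SDOF/RTHS model.

```python
import numpy as np
from math import factorial

rng = np.random.default_rng(1)
xi = rng.standard_normal(2000)     # samples of the standard-normal input
y = xi ** 2                        # toy "response"; exactly He_0(xi) + He_2(xi)

deg = 4
# Probabilists' Hermite basis He_k(xi); fit PCE coefficients by least squares.
V = np.polynomial.hermite_e.hermevander(xi, deg)
c, *_ = np.linalg.lstsq(V, y, rcond=None)

mean = c[0]                                # E[y] is the zeroth coefficient
# Var[y] = sum over k >= 1 of c_k^2 * E[He_k^2], with E[He_k^2] = k!.
var = sum(c[k] ** 2 * factorial(k) for k in range(1, deg + 1))
```

For y = xi² the exact answers are mean 1 and variance 2, which the fitted coefficients reproduce; Sobol indices generalize this by attributing portions of the variance sum to individual inputs and their interactions.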

  14. Recommendations and Standardization of Biomarker Quantification Using NMR-Based Metabolomics with Particular Focus on Urinary Analysis

    PubMed Central

    2016-01-01

    NMR-based metabolomics has shown considerable promise in disease diagnosis and biomarker discovery because it allows one to nondestructively identify and quantify large numbers of novel metabolite biomarkers in both biofluids and tissues. Precise metabolite quantification is a prerequisite to move any chemical biomarker or biomarker panel from the lab to the clinic. Among the biofluids commonly used for disease diagnosis and prognosis, urine has several advantages. It is abundant, sterile, and easily obtained, needs little sample preparation, and does not require invasive medical procedures for collection. Furthermore, urine captures and concentrates many “unwanted” or “undesirable” compounds throughout the body, providing a rich source of potentially useful disease biomarkers; however, incredible variation in urine chemical concentrations makes analysis of urine and identification of useful urinary biomarkers by NMR challenging. We discuss a number of the most significant issues regarding NMR-based urinary metabolomics with specific emphasis on metabolite quantification for disease biomarker applications and propose data collection and instrumental recommendations regarding NMR pulse sequences, acceptable acquisition parameter ranges, relaxation effects on quantitation, proper handling of instrumental differences, sample preparation, and biomarker assessment. PMID:26745651

  15. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.

  16. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification. PMID:25152923

  17. Microbial quantification in activated sludge: the hits and misses.

    PubMed

    Hall, S J; Keller, J; Blackall, L L

    2003-01-01

    Since the implementation of the activated sludge process for treating wastewater, there has been a reliance on chemical and physical parameters to monitor the system. However, in biological nutrient removal (BNR) processes, the microorganisms responsible for some of the transformations should themselves be used to monitor the processes, with the overall goal of achieving better treatment performance. The development of in situ identification and rapid quantification techniques for key microorganisms involved in BNR is required to achieve this goal. This study explored the quantification of Nitrospira, a key organism in the oxidation of nitrite to nitrate in BNR. Two molecular genetic microbial quantification techniques were evaluated: real-time polymerase chain reaction (PCR) and fluorescence in situ hybridisation (FISH) followed by digital image analysis. A correlation between the Nitrospira quantitative data and the nitrate production rate, determined in batch tests, was attempted. The advantages and disadvantages of both methods are discussed.
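    The correlation attempted above can be sketched as a simple Pearson computation; the abundance and rate values below are invented for illustration, not the study's data.

```python
import numpy as np

# Illustrative only: Nitrospira abundance (e.g. % of biovolume from FISH image
# analysis) vs. nitrate production rate from batch tests; values are invented.
abundance = np.array([1.2, 2.5, 3.1, 4.8, 6.0])     # % of total biovolume
nitrate_rate = np.array([0.8, 1.9, 2.2, 3.5, 4.6])  # mg NO3-N / (g MLSS * h)

r = np.corrcoef(abundance, nitrate_rate)[0, 1]      # Pearson correlation
print(f"Pearson r = {r:.3f}")
```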

  18. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: validation of the quantitative multimolecular approach by radiocarbon analysis.

    PubMed

    Jeanneau, Laurent; Faure, Pierre

    2010-09-01

    The quantitative multimolecular approach (QMA), based on exhaustive identification and quantification of molecules from the extractable organic matter (EOM), has recently been developed to investigate organic contamination in sediments more completely than the restrictive quantification of target contaminants. Such an approach allows (i) comparison between natural and anthropogenic inputs, (ii) discrimination between modern and fossil organic matter and (iii) differentiation between several anthropogenic sources. However, QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent only a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses were performed on organic extracts and decarbonated sediments. This analysis allows (i) differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) calculation of the modern carbon percentage (PMC). At the confluence of the Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. This highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
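    Because fossil organic matter is (14)C-free while modern biomass carries contemporary (14)C, the PMC value maps onto a two-end-member mixing fraction. A minimal sketch, assuming the modern end-member is taken as 100 pMC (real studies would use a calibrated end-member value):

```python
def fossil_fraction(pmc_sample, pmc_modern=100.0):
    """Two-end-member mixing: fossil organic matter is 14C-free (0 pMC),
    so the fossil carbon fraction is 1 - PMC_sample / PMC_modern."""
    if not 0.0 <= pmc_sample <= pmc_modern:
        raise ValueError("PMC outside the mixing range")
    return 1.0 - pmc_sample / pmc_modern

# A sediment extract measured at 35 pMC (invented value) would contain
# about 65 % fossil carbon under these assumptions.
print(fossil_fraction(35.0))
```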

  19. Uncertainty Reduction using Bayesian Inference and Sensitivity Analysis: A Sequential Approach to the NASA Langley Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar

    2016-01-01

    This paper presents a computational framework for uncertainty characterization and propagation, and sensitivity analysis in the presence of aleatory and epistemic uncertainty, and develops a rigorous methodology for efficient refinement of epistemic uncertainty by identifying important epistemic variables that significantly affect the overall performance of an engineering system. The proposed methodology is illustrated using the NASA Langley Uncertainty Quantification Challenge (NASA-LUQC) problem, which deals with uncertainty analysis of a generic transport model (GTM). First, Bayesian inference is used to infer subsystem-level epistemic quantities using the subsystem-level model and corresponding data. Second, tools of variance-based global sensitivity analysis are used to identify four important epistemic variables (this limitation specified in the NASA-LUQC is reflective of practical engineering situations where not all epistemic variables can be refined due to time/budget constraints) that significantly affect system-level performance. The most significant contribution of this paper is the development of the sequential refinement methodology, where epistemic variables for refinement are not identified all at once. Instead, only one variable is first identified; then Bayesian inference and global sensitivity calculations are repeated to identify the next important variable. This procedure is continued until all four variables are identified and the refinement in the system-level performance is computed. The advantages of the proposed sequential refinement methodology over the all-at-once uncertainty refinement approach are explained, and the method is then applied to the NASA Langley Uncertainty Quantification Challenge problem.
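    The sequential refinement loop can be illustrated on a toy linear model with independent inputs, where first-order Sobol indices are analytic (S_i = a_i^2 var_i / sum_j a_j^2 var_j). This is a sketch of the loop structure only, not the paper's GTM analysis; shrinking a variance stands in for collecting data and Bayesian updating, and in a real nonlinear model the indices would have to be re-estimated numerically at each step.

```python
def sobol_indices(coeffs, variances):
    """First-order Sobol indices for Y = sum(a_i * X_i), independent X_i."""
    contrib = [a * a * v for a, v in zip(coeffs, variances)]
    total = sum(contrib)
    return [c / total for c in contrib]

def sequential_refinement(coeffs, variances, n_refine, shrink=0.25):
    """Sequentially pick the most influential *unrefined* variable, then
    'refine' it by shrinking its variance (a stand-in for data collection
    plus Bayesian updating) before recomputing the sensitivities."""
    variances = list(variances)
    order = []
    for _ in range(n_refine):
        s = sobol_indices(coeffs, variances)
        remaining = [i for i in range(len(s)) if i not in order]
        i = max(remaining, key=s.__getitem__)
        order.append(i)
        variances[i] *= shrink   # refinement reduces epistemic uncertainty
    return order

coeffs = [3.0, 2.0, 1.0, 0.5]      # toy model Y = 3*X1 + 2*X2 + X3 + 0.5*X4
variances = [1.0, 1.0, 1.0, 1.0]   # epistemic variances before refinement
order = sequential_refinement(coeffs, variances, n_refine=4)
print(order)
```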

  20. Comparison of algorithms to quantify muscle fatigue in upper limb muscles based on sEMG signals.

    PubMed

    Kahl, Lorenz; Hofmann, Ulrich G

    2016-11-01

    This work compared the performance of six different fatigue detection algorithms quantifying muscle fatigue based on electromyographic signals. Surface electromyography (sEMG) was obtained in an experiment from upper-arm contractions at three different load levels in twelve volunteers. The fatigue detection algorithms mean frequency (MNF), spectral moments ratio (SMR), the wavelet method WIRM1551, sample entropy (SampEn), fuzzy approximate entropy (fApEn) and recurrence quantification analysis (RQA%DET) were calculated. The resulting fatigue signals were compared with respect to the disturbances incorporated in fatiguing situations as well as to the possibility of differentiating the load levels based on the fatigue signals. Furthermore, we investigated the influence of the electrode locations on the fatigue detection quality and whether an optimized channel set is reasonable. The results of the MNF, SMR, WIRM1551 and fApEn algorithms fell close together; due to the small number of subjects in this study, significant differences could not be found. In terms of disturbances, the SMR algorithm showed a slight tendency to outperform the others. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
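    Of the six algorithms, MNF has the simplest standard definition: the centroid of the sEMG power spectrum, which shifts toward lower frequencies as fatigue develops. A minimal sketch on a synthetic signal:

```python
import numpy as np

def mean_frequency(signal, fs):
    """Mean (centroid) frequency of the power spectrum,
    MNF = sum(f * P(f)) / sum(P(f)) -- a classic sEMG fatigue index
    (spectral compression toward low frequencies indicates fatigue)."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return np.sum(freqs * power) / np.sum(power)

# Synthetic check: a pure 50 Hz tone over an integer number of cycles
# has its spectral centroid exactly at 50 Hz.
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
tone = np.sin(2 * np.pi * 50.0 * t)
print(round(mean_frequency(tone, fs), 1))
```

    On real sEMG one would track MNF over successive analysis windows; a downward trend is the fatigue signature.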

  1. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, which made the standard analysis methods time-prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy-dispersive spectroscopy (SEM/EDS) were used, which could automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, within the known limitations of the techniques.

  2. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, the size-exclusion chromatography method was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained a SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
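    The bookkeeping behind the sulfur route reduces to simple arithmetic: subtract impurity sulfur (here attributed to the SEC-ICP-MS step) from total sulfur, then divide by the number of sulfur atoms per protein molecule. The numbers below, including the assumption of 7 sulfur atoms for hGH (4 Cys + 3 Met), are illustrative and should be checked against the actual sequence.

```python
# Sketch of the element-based bookkeeping behind sulfur ID-ICP-MS protein
# quantification. Assumption: hGH is taken here to carry 7 sulfur atoms
# (4 Cys + 3 Met); verify against the actual sequence before relying on it.
def protein_conc_from_sulfur(total_s_umol_l, impurity_s_umol_l, s_atoms_per_protein):
    """Protein concentration (umol/L) from total sulfur, after subtracting
    sulfur attributable to S-containing impurities."""
    protein_s = total_s_umol_l - impurity_s_umol_l
    if protein_s < 0:
        raise ValueError("impurity sulfur exceeds total sulfur")
    return protein_s / s_atoms_per_protein

# Invented numbers for illustration:
conc = protein_conc_from_sulfur(total_s_umol_l=70.0, impurity_s_umol_l=7.0,
                                s_atoms_per_protein=7)
print(conc)   # umol/L
```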

  3. Technical Note: Detective quantum efficiency simulation of a-Se imaging detectors using ARTEMIS.

    PubMed

    Fang, Yuan; Ito, Takaaki; Nariyuki, Fumito; Kuwabara, Takao; Badano, Aldo; Karim, Karim S

    2017-08-01

    This work studies the detective quantum efficiency (DQE) of a-Se-based solid state x-ray detectors for medical imaging applications using ARTEMIS, a Monte Carlo simulation tool for modeling x-ray photon, electron and charged carrier transport in semiconductors in the presence of an applied electric field. ARTEMIS is used to model the signal formation process in a-Se. The simulation model includes x-ray photon and high-energy electron interactions, and detailed electron-hole pair transport with applied detector bias taking into account drift, diffusion, Coulomb interactions, recombination and trapping. For experimental validation, the DQE performance of prototype a-Se detectors is measured following IEC Testing Standard 62220-1-3. Comparison of simulated and experimental DQE results shows reasonable agreement for RQA beam qualities: the percentage difference between simulated and experimental DQE was within 5% for spatial frequencies above 0.25 cycles/mm, using a uniform applied electric field, for the RQA5, RQA7 and RQA9 beam qualities. Results include two different prototype detectors with thicknesses of 240 μm and 1 mm. ARTEMIS can be used to model the DQE of a-Se detectors as a function of x-ray energy, detector thickness, and spatial frequency. The ARTEMIS model can be used to improve understanding of the physics of x-ray interactions in a-Se and in optimization studies for the development of novel medical imaging applications. © 2017 American Association of Physicists in Medicine.
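    In the IEC framework referenced above, DQE is computed from the measured MTF, the normalized noise power spectrum, and the input photon fluence: DQE(f) = MTF(f)^2 / (q * NNPS(f)). A sketch with an invented toy detector (the MTF model, NNPS level, and fluence are all illustrative numbers, not the paper's data):

```python
import numpy as np

def dqe(mtf, nnps, q):
    """Frequency-dependent detective quantum efficiency,
    DQE(f) = MTF(f)^2 / (q * NNPS(f)),
    with NNPS the noise power spectrum normalized by the squared mean
    signal (units mm^2) and q the input photon fluence (photons/mm^2)."""
    return np.asarray(mtf) ** 2 / (q * np.asarray(nnps))

# Invented numbers for illustration: a detector with a flat NNPS.
f = np.array([0.0, 0.25, 0.5, 1.0, 2.0])   # cycles/mm
mtf = np.exp(-0.5 * f)                     # toy MTF model
nnps = np.full_like(f, 5.0e-7)             # mm^2
q = 2.0e6                                  # photons/mm^2
print(dqe(mtf, nnps, q))
```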

  4. Development of a Framework for Model-Based Analysis, Uncertainty Quantification, and Robust Control Design of Nonlinear Smart Composite Systems

    DTIC Science & Technology

    2015-06-04

    …control, vibration and noise control, health monitoring, and energy harvesting. However, these advantages come at the cost of rate-dependent hysteresis… configuration used for energy harvesting. Uncertainty Quantification: uncertainty quantification is pursued in two steps: (i) determination of densities… Crews and R.C. Smith, "Quantification of parameter and model uncertainty for shape memory alloy bending actuators," Journal of Intelligent Material…

  5. Crack Imaging and Quantification in Aluminum Plates with Guided Wave Wavenumber Analysis Methods

    NASA Technical Reports Server (NTRS)

    Yu, Lingyu; Tian, Zhenhua; Leckey, Cara A. C.

    2015-01-01

    Guided wavefield analysis methods for detection and quantification of crack damage in an aluminum plate are presented in this paper. New wavenumber components created by abrupt wave changes at the structural discontinuity are identified in the frequency-wavenumber spectra. It is shown that the new wavenumbers can be used to detect and characterize the crack dimensions. Two imaging based approaches, filter reconstructed imaging and spatial wavenumber imaging, are used to demonstrate how the cracks can be evaluated with wavenumber analysis. The filter reconstructed imaging is shown to be a rapid method to map the plate and any existing damage, but with less precision in estimating crack dimensions; while the spatial wavenumber imaging provides an intensity image of spatial wavenumber values with enhanced resolution of crack dimensions. These techniques are applied to simulated wavefield data, and the simulation based studies show that spatial wavenumber imaging method is able to distinguish cracks of different severities. Laboratory experimental validation is performed for a single crack case to confirm the methods' capabilities for imaging cracks in plates.
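    The frequency-wavenumber spectra underlying both imaging approaches come from a 2D FFT of the space-time wavefield u(t, x). A minimal sketch on a synthetic single-mode wavefield (not the paper's simulated data) verifies that a propagating wave shows up at its (f, k) pair; crack-induced components would appear as additional wavenumbers in such a spectrum:

```python
import numpy as np

# Minimal frequency-wavenumber analysis sketch: 2D FFT of a space-time
# wavefield u(t, x). Here a single propagating mode is synthesized and its
# (frequency, wavenumber) peak is located.
dt, dx = 1.0e-3, 1.0e-2            # s, m
nt, nx = 64, 64
t = np.arange(nt) * dt
x = np.arange(nx) * dx
f0, k0 = 125.0, 12.5               # Hz, cycles/m (chosen to fall on FFT bins)
u = np.sin(2 * np.pi * (k0 * x[None, :] - f0 * t[:, None]))

spec = np.abs(np.fft.fft2(u))
freqs = np.fft.fftfreq(nt, d=dt)   # temporal frequency axis
waves = np.fft.fftfreq(nx, d=dx)   # spatial wavenumber axis

half = nt // 2                     # keep positive temporal frequencies only
i, j = np.unravel_index(np.argmax(spec[1:half, :]), spec[1:half, :].shape)
print(freqs[i + 1], abs(waves[j]))
```

    The same machinery, applied window-by-window in space, is the basis of spatial wavenumber imaging: the dominant local wavenumber is mapped as an image intensity.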

  6. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

    Background and aims Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10^5. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with low coefficients of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793

  7. Absolute quantification by droplet digital PCR versus analog real-time PCR

    PubMed Central

    Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh

    2014-01-01

    Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
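    The absolute quantification in ddPCR rests on Poisson statistics: a positive droplet contains at least one target copy, so the mean number of copies per droplet is lambda = -ln(1 - p), with p the fraction of positive droplets. A sketch; the ~0.85 nL droplet volume is an assumed value for illustration, not taken from the paper:

```python
import math

def copies_per_droplet(n_positive, n_total):
    """Poisson correction: a positive droplet holds >= 1 target copy, so
    the mean number of copies per droplet is lambda = -ln(1 - p)."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

def concentration_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
    """Absolute concentration in the partitioned volume (copies/uL).
    The ~0.85 nL droplet volume is an assumption of this sketch."""
    lam = copies_per_droplet(n_positive, n_total)
    return lam / (droplet_volume_nl * 1e-3)  # nL -> uL

# Half the droplets positive -> about 0.693 copies per droplet on average,
# not 1.0: some droplets carry multiple copies.
print(round(copies_per_droplet(5000, 10000), 4))
```

    This correction is why ddPCR needs no standard curve: the count of positive partitions plus Poisson statistics yields an absolute concentration directly.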

  8. How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?

    NASA Astrophysics Data System (ADS)

    Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.

    2013-12-01

    From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help understand the cause and characteristics of parametric uncertainty and thus turn data into a blessing. In this study, we illustrate the impacts of data on uncertainty quantification and reduction using the example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include the fractions of the total surface site density of the two sites and the surface complex formation constants of the three reactions. A total of seven experiments were conducted under different geochemical conditions to estimate these parameters. The experiments with low initial concentration of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because the experiments cannot distinguish the relative adsorption affinity of the strong and weak sites for uranium. Therefore, the experiments with high initial concentration of U(VI) are needed, because in those experiments the strong site is nearly saturated and the weak site can be determined. The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial

  9. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical variations associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm can provide information about emphysema from CT. We therefore propose a new, non-density-based measure of the curvature of the diaphragm that allows for robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of the lung height and of the diaphragm width to the diaphragm height as curvature estimates, with the emphysema index as comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.

  10. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    PubMed

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are examined critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.

  11. Comparing model-based and model-free analysis methods for QUASAR arterial spin labeling perfusion quantification.

    PubMed

    Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J

    2013-05-01

    Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
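    The "model-free" branch of the comparison is numerical deconvolution: the tissue curve is the AIF convolved with F·R(t), so inverting a convolution matrix built from the AIF recovers the residue function, whose maximum gives the flow. A truncated-SVD sketch on noiseless synthetic curves (all shapes and numbers are invented, not QUASAR data):

```python
import numpy as np

def conv_matrix(aif, dt):
    """Lower-triangular discrete convolution matrix built from the AIF."""
    n = len(aif)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1):
            A[i, j] = aif[i - j] * dt
    return A

def deconvolve(aif, tissue, dt, cutoff=1e-8):
    """Truncated-SVD deconvolution; with noisy data the relative cutoff
    is typically raised substantially to regularize the inversion."""
    U, s, Vt = np.linalg.svd(conv_matrix(aif, dt))
    s_inv = np.where(s > cutoff * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ tissue))      # recovered F * R(t)

dt = 0.5                                        # s
t = np.arange(40) * dt
aif = (t + dt) * np.exp(-(t + dt) / 2.0)        # toy arterial input function
F, mtt = 0.6, 4.0
residue = F * np.exp(-t / mtt)                  # true F*R(t); flow = F = 0.6
tissue = conv_matrix(aif, dt) @ residue         # noiseless synthetic tissue curve

flow = deconvolve(aif, tissue, dt).max()
print(round(flow, 3))
```

    With noise, the choice of SVD cutoff (and bolus dispersion, as the abstract notes) dominates the discrepancy between this model-free estimate and a model-based fit.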

  12. Individual and population pharmacokinetic compartment analysis: a graphic procedure for quantification of predictive performance.

    PubMed

    Eksborg, Staffan

    2013-01-01

    Pharmacokinetic studies are important for optimizing drug dosing, but require proper validation of the pharmacokinetic procedures used. However, simple and reliable statistical methods suitable for evaluating the predictive performance of pharmacokinetic analysis are essentially lacking. The aim of the present study was to construct and evaluate a graphic procedure for quantification of the predictive performance of individual and population pharmacokinetic compartment analysis. Original data from previously published pharmacokinetic compartment analyses after intravenous, oral, and epidural administration, and digitized data obtained from published scatter plots of observed vs predicted drug concentrations from population pharmacokinetic studies using the NPEM algorithm and NONMEM computer program and Bayesian forecasting procedures, were used for estimating the predictive performance according to the proposed graphical method and by the method of Sheiner and Beal. The graphical plot proposed in the present paper proved to be a useful tool for evaluation of the predictive performance of both individual and population compartment pharmacokinetic analysis. The proposed method is simple to use and gives valuable information concerning time- and concentration-dependent inaccuracies that might occur in individual and population pharmacokinetic compartment analysis. Predictive performance can be quantified by the fraction of concentration ratios within arbitrarily specified ranges, e.g. within the range 0.8-1.2.
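    The proposed quantification, the fraction of observed/predicted concentration ratios inside a specified range, is a one-liner; the concentrations below are invented for illustration:

```python
def fraction_within(observed, predicted, lo=0.8, hi=1.2):
    """Predictive performance as the fraction of observed/predicted
    concentration ratios falling inside an arbitrary acceptance range."""
    ratios = [o / p for o, p in zip(observed, predicted)]
    return sum(lo <= r <= hi for r in ratios) / len(ratios)

# Invented observed vs. predicted concentrations (same units):
observed = [10.2, 5.1, 2.0, 1.1, 0.6]
predicted = [10.0, 5.0, 2.5, 1.0, 0.4]
print(fraction_within(observed, predicted))
```

    Computing the same fraction in sliding windows over time or concentration gives the time- and concentration-dependent picture the graphical method is built around.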

  13. Fluorescent quantification of melanin.

    PubMed

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. Proof-of-Concept Study for Uncertainty Quantification and Sensitivity Analysis using the BRL Shaped-Charge Example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hughes, Justin Matthew

    These are the slides for a graduate presentation at Mississippi State University. They cover the following: the BRL shaped-charge geometry in PAGOSA, a mesh refinement study, surrogate modeling using a radial basis function network (RBFN), ruling out parameters using sensitivity analysis (equation of state study), uncertainty quantification (UQ) methodology, and sensitivity analysis (SA) methodology. In summary, a mesh convergence study was used to ensure that solutions were numerically stable by comparing PDV data between simulations. A Design of Experiments (DOE) method was used to reduce the simulation space to study the effects of the Jones-Wilkins-Lee (JWL) parameters for the Composition B main charge. Uncertainty was quantified by computing the 95% data range about the median of the simulation output using a brute-force Monte Carlo (MC) random sampling method. Parameter sensitivities were quantified using the Fourier Amplitude Sensitivity Test (FAST) spectral analysis method, where it was determined that detonation velocity, initial density, C1, and B1 controlled jet tip velocity.
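    The brute-force Monte Carlo step, computing the 95% data range about the median of the simulation output, can be sketched as follows; the response function and input distributions are invented stand-ins, not the PAGOSA model:

```python
import random
import statistics

def central_95(samples):
    """Central 95 % data range (2.5th to 97.5th percentile) of MC output."""
    xs = sorted(samples)
    lo = xs[int(0.025 * (len(xs) - 1))]
    hi = xs[int(0.975 * (len(xs) - 1))]
    return lo, hi

def jet_tip_velocity(det_velocity, density):
    """Hypothetical stand-in response model, NOT the PAGOSA simulation."""
    return 0.8 * det_velocity + 1.5 * density

random.seed(0)
outputs = [jet_tip_velocity(random.gauss(7.9, 0.1), random.gauss(1.7, 0.02))
           for _ in range(10_000)]            # brute-force MC sampling
med = statistics.median(outputs)
lo, hi = central_95(outputs)
print(round(med, 2), round(hi - lo, 2))
```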

  15. Computational analysis of PET by AIBL (CapAIBL): a cloud-based processing pipeline for the quantification of PET images

    NASA Astrophysics Data System (ADS)

    Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier

    2015-03-01

    With the advances of PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.
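    The reported figure of merit, mean absolute percentage error against the MR-based reference, is straightforward to compute; the SUVR-like values below are invented for illustration:

```python
def mean_absolute_percentage_error(reference, estimate):
    """Mean absolute percentage error of one quantification (e.g. PET-only)
    against a reference quantification (e.g. MR-based)."""
    errors = [abs(e - r) / abs(r) * 100.0 for r, e in zip(reference, estimate)]
    return sum(errors) / len(errors)

# Invented SUVR-like values for illustration:
mr_based = [1.10, 1.45, 2.10, 1.80]
pet_only = [1.12, 1.40, 2.05, 1.86]
mape = mean_absolute_percentage_error(mr_based, pet_only)
print(round(mape, 2))
```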

  16. Standardless quantification by parameter optimization in electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
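    The core idea, minimizing the quadratic difference between an experimental spectrum and an analytical prediction by optimizing its parameters, can be sketched in the linear special case where only peak amplitudes and a linear background are free; POEMA itself optimizes a much richer nonlinear parameter set. Peak positions approximate Si, Ca and Fe K-lines; all intensities are invented.

```python
import numpy as np

# Linear-least-squares sketch of spectrum fitting by parameter optimization:
# the spectrum is modeled as fixed-shape characteristic peaks plus a linear
# background, and the quadratic misfit is minimized via np.linalg.lstsq.
energies = np.linspace(0.0, 10.0, 500)      # keV

def gaussian(e, center, width=0.08):
    return np.exp(-0.5 * ((e - center) / width) ** 2)

peak_centers = [1.74, 3.69, 6.40]           # approx. Si, Ca, Fe K-lines (keV)
design = np.column_stack([gaussian(energies, c) for c in peak_centers]
                         + [np.ones_like(energies), energies])  # + background

true_params = np.array([120.0, 80.0, 45.0, 5.0, -0.2])  # invented amplitudes
spectrum = design @ true_params             # noiseless synthetic spectrum

fitted, *_ = np.linalg.lstsq(design, spectrum, rcond=None)
print(np.round(fitted[:3], 1))              # recovered peak amplitudes
```

    In the full nonlinear problem (peak widths, detector response, matrix corrections), the same quadratic objective is minimized iteratively rather than in one linear solve.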

  17. The role of PET quantification in cardiovascular imaging.

    PubMed

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. The myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18FDG) and rest perfusion imaging. The myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of the coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18FDG and 18F-sodium fluoride tracers in carotids, aorta and coronary arteries

  18. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    PubMed

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with inbuilt graphic capture board provides versatile, easy to use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.

  19. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
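    The Poisson step follows directly from the fraction of negative replicates: if template molecules are distributed randomly across reactions, the probability that a reaction receives zero copies is P(negative) = e^(-λ), so λ = -ln(f_neg). A minimal sketch (the replicate counts are illustrative, not the paper's data):

```python
import math

def poisson_quantify(n_replicates, n_negative):
    """Mean integrated-HIV copies per reaction, estimated from the
    fraction of negative replicates under a Poisson model:
    P(negative) = exp(-lambda)  =>  lambda = -ln(f_neg)."""
    if n_negative == 0:
        raise ValueError("all replicates positive: above quantifiable range")
    f_neg = n_negative / n_replicates
    return -math.log(f_neg)

# Hypothetical 42-replicate Alu-HIV PCR run with 17 negative reactions
lam = poisson_quantify(42, 17)
```

    The paper additionally derives confidence intervals from the binomial positive/negative counts, which is what motivates its guidance on the minimal number of technical replicates.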

  20. Adaptive Quantification and Longitudinal Analysis of Pulmonary Emphysema with a Hidden Markov Measure Field Model

    PubMed Central

    Häme, Yrjö; Angelini, Elsa D.; Hoffman, Eric A.; Barr, R. Graham; Laine, Andrew F.

    2014-01-01

    The extent of pulmonary emphysema is commonly estimated from CT images by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions in the lung and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the present model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was used to quantify emphysema on a cohort of 87 subjects, with repeated CT scans acquired over a time period of 8 years using different imaging protocols. The scans were acquired approximately annually, and the data set included a total of 365 scans. The results show that the emphysema estimates produced by the proposed method have very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. In addition, the generated emphysema delineations promise great advantages for regional analysis of emphysema extent and progression, possibly advancing disease subtyping. PMID:24759984
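    The standard approach that the abstract contrasts against, the proportional area of voxels below a fixed attenuation threshold, can be sketched as follows. The -950 HU cutoff is a common convention; the toy intensity values are illustrative only.

```python
import numpy as np

def percent_low_attenuation_area(lung_hu, threshold=-950):
    """Conventional emphysema index: percentage of lung voxels whose
    CT attenuation falls below a fixed threshold (commonly -950 HU)."""
    return 100.0 * np.mean(np.asarray(lung_hu) < threshold)

# Toy lung: normal parenchyma around -850 HU plus an emphysematous
# pocket around -980 HU (values illustrative)
rng = np.random.default_rng(1)
normal = rng.normal(-850, 30, 9000)
emphysema = rng.normal(-980, 10, 1000)
laa950 = percent_low_attenuation_area(np.concatenate([normal, emphysema]))
```

    Because this index depends on the absolute intensity histogram, changes in imaging protocol or inspiration level shift it directly, which is exactly the sensitivity the hidden Markov measure field model is designed to reduce.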

  1. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    NASA Technical Reports Server (NTRS)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel from NATO Science and Technology Organization, with the Task Group number AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the SemiSpan Supersonic Transport wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework, and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.

  2. The Challenges of Credible Thermal Protection System Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2013-01-01

    The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.

  3. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    PubMed

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
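    The "linear regression of efficiency" idea can be illustrated on simulated data: under a logistic amplification model, per-cycle efficiency declines linearly with fluorescence, so regressing the observed cycle efficiency against fluorescence recovers the maximal efficiency without a standard curve. The sketch below is not the LRE Analyzer's actual code, and the parameter values are illustrative.

```python
import numpy as np

def simulate_amplification(f0, emax, fmax, cycles):
    """Logistic amplification: per-cycle efficiency declines linearly
    with fluorescence, the core assumption behind LRE."""
    f = [f0]
    for _ in range(cycles):
        f.append(f[-1] * (1 + emax * (1 - f[-1] / fmax)))
    return np.array(f)

def regress_efficiency(f):
    """Linear regression of cycle efficiency E_C against prior-cycle
    fluorescence; the intercept estimates Emax, the x-intercept Fmax."""
    e = f[1:] / f[:-1] - 1          # observed per-cycle efficiency
    slope, intercept = np.polyfit(f[:-1], e, 1)
    return intercept, -intercept / slope  # Emax, Fmax

f = simulate_amplification(f0=1e-4, emax=0.95, fmax=10.0, cycles=45)
emax_hat, fmax_hat = regress_efficiency(f)
```

    In practice LRE restricts the regression to the central, well-behaved region of the amplification profile and derives the target quantity F0 from the fitted sigmoid, which is the labor-intensive part the LRE Analyzer automates.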

  4. Protein, enzyme and carbohydrate quantification using smartphone through colorimetric digitization technique.

    PubMed

    Dutta, Sibasish; Saikia, Gunjan Prasad; Sarma, Dhruva Jyoti; Gupta, Kuldeep; Das, Priyanka; Nath, Pabitra

    2017-05-01

    In this paper the utilization of a smartphone as a detection platform for colorimetric quantification of biological macromolecules has been demonstrated. Using the V-channel of HSV color space, the quantification of BSA protein, catalase enzyme and carbohydrate (using D-glucose) has been successfully investigated. A custom-designed Android application has been developed for estimating the total concentration of biological macromolecules. The results have been compared with those of a standard spectrophotometer, which is generally used for colorimetric quantification in laboratory settings by measuring absorbance at a specific wavelength. The results obtained with the designed sensor are found to be similar to the spectrophotometer data. The designed sensor is low cost and robust, and we envision that it could promote diverse fields of bio-analytical investigation. Schematic illustration of the smartphone sensing mechanism for colorimetric analysis of biomolecular samples. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
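    The V-channel idea is simple to sketch: V in HSV space is just max(R, G, B) of the normalized pixel, and concentration is read off a linear calibration curve built from standards. The calibration points and colors below are hypothetical, not the paper's data.

```python
import colorsys
import numpy as np

def v_channel(rgb):
    """V (value) channel of HSV; equals max(R, G, B) of the normalized pixel."""
    r, g, b = rgb
    return colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[2]

def calibrate(concentrations, v_values):
    """Fit a linear calibration curve V = m*c + b (the assay color
    darkens, i.e. V drops, as analyte concentration rises)."""
    m, b = np.polyfit(concentrations, v_values, 1)
    return m, b

def estimate_concentration(v, m, b):
    return (v - b) / m

# Hypothetical Bradford-style calibration series
conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])          # mg/mL BSA
vols = np.array([0.90, 0.78, 0.66, 0.54, 0.42])     # mean V of assay ROI
m, b = calibrate(conc, vols)
unknown = estimate_concentration(v_channel((120, 80, 150)), m, b)
```

    In a phone application the V value would be averaged over a region of interest in the camera image rather than taken from a single pixel, which suppresses sensor noise.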

  5. Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.

    PubMed

    Huang, Chia-Chia; Pan, Tzu-Ming

    2005-05-18

    The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.

  6. Evaluation of Direct Infusion-Multiple Reaction Monitoring Mass Spectrometry for Quantification of Heat Shock Proteins

    PubMed Central

    Xiang, Yun; Koomen, John M.

    2012-01-01

    Protein quantification with liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) has emerged as a powerful platform for assessing panels of biomarkers. In this study, direct infusion, using automated, chip-based nanoelectrospray ionization, coupled with MRM (DI-MRM) is used for protein quantification. Removal of the LC separation step increases the importance of evaluating the ratios between the transitions. Therefore, the effects of solvent composition, analyte concentration, spray voltage, and quadrupole resolution settings on fragmentation patterns have been studied using peptide and protein standards. After DI-MRM quantification was evaluated for standards, quantitative assays for the expression of heat shock proteins (HSPs) were translated from LC-MRM to DI-MRM for implementation in cell line models of multiple myeloma. Requirements for DI-MRM assay development are described. Then, the two methods are compared; criteria for effective DI-MRM analysis are reported based on the analysis of HSP expression in digests of whole cell lysates. The increased throughput of DI-MRM analysis is useful for rapid analysis of large batches of similar samples, such as time course measurements of cellular responses to therapy. PMID:22293045

  7. RNA-Skim: a rapid method for RNA-Seq quantification at transcript level

    PubMed Central

    Zhang, Zhaojun; Wang, Wei

    2014-01-01

    Motivation: RNA-Seq technique has been demonstrated as a revolutionary means for exploring transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to Microarray technique in gene expression study, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignments of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering comparable quantification accuracy.

  8. Deep-Dive Targeted Quantification for Ultrasensitive Analysis of Proteins in Nondepleted Human Blood Plasma/Serum and Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, Song; Shi, Tujin; Fillmore, Thomas L.

    Mass spectrometry-based targeted proteomics (e.g., selected reaction monitoring, SRM) is emerging as an attractive alternative to immunoassays for protein quantification. Recently we have made significant progress in SRM sensitivity for enabling quantification of low ng/mL to sub-ng/mL level proteins in nondepleted human blood plasma/serum without affinity enrichment. However, precise quantification of extremely low abundant but biologically important proteins (e.g., ≤100 pg/mL in blood plasma/serum) using targeted proteomics approaches still remains challenging. To address this need, we have developed an antibody-independent Deep-Dive SRM (DD-SRM) approach that capitalizes on multidimensional high-resolution reversed-phase liquid chromatography (LC) separation for target peptide enrichment combined with precise selection of target peptide fractions of interest, significantly improving SRM sensitivity by ~5 orders of magnitude when compared to conventional LC-SRM. Application of DD-SRM to human serum and tissue has been demonstrated to enable precise quantification of endogenous proteins at ~10 pg/mL level in nondepleted serum and at <10 copies per cell level in tissue. Thus, DD-SRM holds great promise for precisely measuring extremely low abundance proteins or protein modifications, especially when high-quality antibody is not available.

  9. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    PubMed Central

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantifications in phantom and ex vivo acquisitions. PMID:24123362
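    The physical basis of the confound is compact enough to state: the water proton resonance frequency drifts with temperature at roughly -0.01 ppm/°C (the relation underlying PRF-shift thermometry), while triglyceride resonances are essentially temperature-independent, so the fat-water frequency offsets assumed by a standard signal model are only valid near the calibration temperature. As a hedged summary, with T0 a reference temperature such as 37 °C:

```latex
f_{\mathrm{w}}(T) \;\approx\; f_{\mathrm{w}}(T_0) + \alpha\,\gamma B_0\,(T - T_0),
\qquad \alpha \approx -0.01\ \mathrm{ppm}/{}^{\circ}\mathrm{C}
```

    At 1.5 T (γB0 ≈ 64 MHz) the 0 to 40 °C range covered by the phantom experiments corresponds to a water shift of about 0.4 ppm, roughly 26 Hz, which is the order of the frequency error that the temperature-corrected signal model absorbs.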

  10. Breast density quantification with cone-beam CT: A post-mortem study

    PubMed Central

    Johnson, Travis; Ding, Huanjun; Le, Huy Q.; Ducote, Justin L.; Molloi, Sabee

    2014-01-01

    Forty post-mortem breasts were imaged with a flat-panel based cone-beam x-ray CT system at 50 kVp. The feasibility of breast density quantification has been investigated using standard histogram thresholding and an automatic segmentation method based on the fuzzy c-means algorithm (FCM). The breasts were chemically decomposed into water, lipid, and protein immediately after image acquisition was completed. The percent fibroglandular volume (%FGV) from chemical analysis was used as the gold standard for breast density comparison. Both image-based segmentation techniques showed good precision in breast density quantification with high linear coefficients between the right and left breast of each pair. When comparing with the gold standard using %FGV from chemical analysis, Pearson’s r-values were estimated to be 0.983 and 0.968 for the FCM clustering and the histogram thresholding techniques, respectively. The standard error of the estimate (SEE) was also reduced from 3.92% to 2.45% by applying the automatic clustering technique. The results of the postmortem study suggested that breast tissue can be characterized in terms of water, lipid and protein contents with high accuracy by using chemical analysis, which offers a gold standard for breast density studies comparing different techniques. In the investigated image segmentation techniques, the FCM algorithm had high precision and accuracy in breast density quantification. In comparison to conventional histogram thresholding, it was more efficient and reduced inter-observer variation. PMID:24254317
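    The fuzzy c-means segmentation used for density quantification can be sketched in a few lines for 1-D voxel intensities. The two-mode histogram below is a toy stand-in (values loosely HU-like, not calibrated cone-beam CT data), and the %FGV readout via hard assignment is a simplification of the published pipeline.

```python
import numpy as np

def fuzzy_c_means(x, c=2, m=2.0, iters=50):
    """Minimal 1-D fuzzy c-means: soft memberships u and cluster centers."""
    x = np.asarray(x, dtype=float)
    centers = np.quantile(x, np.linspace(0.1, 0.9, c))      # spread-out init
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12   # point-center distances
        u = d ** (-2.0 / (m - 1.0))                         # membership kernel
        u /= u.sum(axis=0)                                  # normalize per voxel
        um = u ** m
        centers = um @ x / um.sum(axis=1)                   # membership-weighted means
    return np.sort(centers), u

# Toy two-tissue histogram: adipose-like mode near -100,
# fibroglandular-like mode near +40
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-100, 15, 700), rng.normal(40, 15, 300)])
centers, u = fuzzy_c_means(x)
# Hard-assign each voxel to its nearest center to read off %FGV
percent_fgv = 100.0 * np.mean(np.argmin(np.abs(x[:, None] - centers), axis=1) == 1)
```

    The appeal over histogram thresholding, as the study reports, is that the cluster centers adapt to each image's intensity distribution instead of relying on a manually chosen cutoff.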

  11. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    PubMed

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  12. Quantification of confocal images of biofilms grown on irregular surfaces

    PubMed Central

    Ross, Stacy Sommerfeld; Tu, Mai Han; Falsetta, Megan L.; Ketterer, Margaret R.; Kiedrowski, Megan R.; Horswill, Alexander R.; Apicella, Michael A.; Reinhardt, Joseph M.; Fiegel, Jennifer

    2014-01-01

    Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515

  13. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
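    The standard-curve quantification at the core of this workflow can be sketched as follows. The dilution series and Cq values are made up for illustration; quantGenius layers its QC and decision-support logic on top of this arithmetic.

```python
import numpy as np

def fit_standard_curve(copies, cq):
    """Fit Cq = slope*log10(copies) + intercept; amplification
    efficiency follows from the slope as E = 10^(-1/slope) - 1."""
    slope, intercept = np.polyfit(np.log10(copies), cq, 1)
    efficiency = 10 ** (-1 / slope) - 1
    return slope, intercept, efficiency

def quantify(cq, slope, intercept):
    """Interpolate an unknown's copy number from its Cq."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical 10-fold dilution series of a qPCR standard
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
cq = np.array([33.2, 29.9, 26.6, 23.3, 20.0])   # ideal: slope ≈ -3.3
slope, intercept, eff = fit_standard_curve(copies, cq)
target = quantify(25.0, slope, intercept)       # gene of interest
ref = quantify(24.0, slope, intercept)          # reference gene
normalized = target / ref                        # normalized expression
```

    The QC checks the abstract describes, such as pipetting error, assay efficiency, and limits of detection and quantification, all hang off quantities visible in this sketch: the slope (efficiency), the scatter of the standards around the line, and the lowest standard that still amplifies reliably.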

  14. A Java Program for LRE-Based Real-Time qPCR that Enables Large-Scale Absolute Quantification

    PubMed Central

    Rutledge, Robert G.

    2011-01-01

    Background Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Findings Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. Conclusions The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples. PMID:21407812

  15. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  16. Accurate quantification of chromosomal lesions via short tandem repeat analysis using minimal amounts of DNA

    PubMed Central

    Jann, Johann-Christoph; Nowak, Daniel; Nolte, Florian; Fey, Stephanie; Nowak, Verena; Obländer, Julia; Pressler, Jovita; Palme, Iris; Xanthopoulos, Christina; Fabarius, Alice; Platzbecker, Uwe; Giagounidis, Aristoteles; Götze, Katharina; Letsch, Anne; Haase, Detlef; Schlenk, Richard; Bug, Gesine; Lübbert, Michael; Ganser, Arnold; Germing, Ulrich; Haferlach, Claudia; Hofmann, Wolf-Karsten; Mossner, Maximilian

    2017-01-01

    Background Cytogenetic aberrations such as deletion of chromosome 5q (del(5q)) represent key elements in routine clinical diagnostics of haematological malignancies. Currently established methods such as metaphase cytogenetics, FISH or array-based approaches have limitations due to their dependency on viable cells, high costs or semi-quantitative nature. Importantly, they cannot be used on low abundance DNA. We therefore aimed to establish a robust and quantitative technique that overcomes these shortcomings. Methods For precise determination of del(5q) cell fractions, we developed an inexpensive multiplex-PCR assay requiring only nanograms of DNA that simultaneously measures allelic imbalances of 12 independent short tandem repeat markers. Results Application of this method to n=1142 samples from n=260 individuals revealed strong intermarker concordance (R²=0.77–0.97) and reproducibility (mean SD: 1.7%). Notably, the assay showed accurate quantification via standard curve assessment (R²>0.99) and high concordance with paired FISH measurements (R²=0.92) even with subnanogram amounts of DNA. Moreover, cytogenetic response was reliably confirmed in del(5q) patients with myelodysplastic syndromes treated with lenalidomide. While the assay demonstrated good diagnostic accuracy in receiver operating characteristic analysis (area under the curve: 0.97), we further observed robust correlation between bone marrow and peripheral blood samples (R²=0.79), suggesting its potential suitability for less-invasive clonal monitoring. Conclusions In conclusion, we present an adaptable tool for quantification of chromosomal aberrations, particularly in problematic samples, which should be easily applicable to further tumour entities. PMID:28600436

  17. Processing and domain selection: Quantificational variability effects

    PubMed Central

    Harris, Jesse A.; Clifton, Charles; Frazier, Lyn

    2014-01-01

    Three studies investigated how readers interpret sentences with variable quantificational domains, e.g., The army was mostly in the capital, where mostly may quantify over individuals or parts (Most of the army was in the capital) or over times (The army was in the capital most of the time). It is proposed that a general conceptual economy principle, No Extra Times (Majewski 2006, in preparation), discourages the postulation of potentially unnecessary times, and thus favors the interpretation quantifying over parts. Disambiguating an ambiguously quantified sentence to a quantification over times interpretation was rated as less natural than disambiguating it to a quantification over parts interpretation (Experiment 1). In an interpretation questionnaire, sentences with similar quantificational variability were constructed so that both interpretations of the sentence would require postulating multiple times; this resulted in the elimination of the preference for a quantification over parts interpretation, suggesting the parts preference observed in Experiment 1 is not reducible to a lexical bias of the adverb mostly (Experiment 2). An eye movement recording study showed that, in the absence of prior evidence for multiple times, readers exhibit greater difficulty when reading material that forces a quantification over times interpretation than when reading material that allows a quantification over parts interpretation (Experiment 3). These experiments contribute to understanding readers’ default assumptions about the temporal properties of sentences, which is essential for understanding the selection of a domain for adverbial quantifiers and, more generally, for understanding how situational constraints influence sentence processing. PMID:25328262

  18. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the application of this methodology for selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and for just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.

  19. Metering error quantification under voltage and current waveform distortion

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because of its negative effects on metering accuracy and fairness, the combined error of energy metering is an important subject of study. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a method for quantifying the metering mode error under waveform distortion is proposed. Based on metering and time-division multiplier principles, a method for quantifying the metering accuracy error is also proposed. By analyzing the mode error and the accuracy error together, a comprehensive error analysis method is presented that is suitable for renewable energy sources and nonlinear loads. The proposed method is validated by simulation.
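
    The mode error discussed here can be illustrated numerically: a meter that registers only fundamental-frequency power under-reads a load whose voltage and current share a harmonic. A minimal sketch (amplitudes chosen arbitrarily, not from the paper):

```python
import numpy as np

N = 2048
t = np.linspace(0.0, 2 * np.pi, N, endpoint=False)   # one fundamental period

V1, V3 = 230.0, 11.5     # fundamental / 3rd-harmonic voltage amplitudes
I1, I3 = 10.0, 2.0       # current amplitudes
phi = np.pi / 6          # fundamental displacement angle

v = V1 * np.sin(t) + V3 * np.sin(3 * t)
i = I1 * np.sin(t - phi) + I3 * np.sin(3 * t)

p_true = np.mean(v * i)                  # total active power (all harmonics)
p_fund = 0.5 * V1 * I1 * np.cos(phi)     # what a fundamental-only mode registers

mode_error = (p_fund - p_true) / p_true
print(f"true P = {p_true:.1f} W, fundamental-only P = {p_fund:.1f} W, "
      f"mode error = {mode_error:.2%}")
```

    Because the 3rd-harmonic voltage and current are in phase here, the fundamental-only mode misses their power contribution entirely, giving a negative mode error.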

  20. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    PubMed

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

    Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6%±8.6). Results suggest that software packages employing the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
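
    The peak fitting approach can be sketched with scipy: fit a Gaussian line to the creatine resonance and take its analytic area as the concentration estimate. The ppm position, linewidth and amplitude below are invented for illustration, not taken from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

ppm = np.linspace(2.7, 3.3, 400)

def gaussian(x, amp, center, sigma):
    return amp * np.exp(-0.5 * ((x - center) / sigma) ** 2)

# Simulated spectrum: a creatine-like singlet near 3.03 ppm plus noise.
rng = np.random.default_rng(1)
true_amp, true_center, true_sigma = 1.0, 3.03, 0.02
y = gaussian(ppm, true_amp, true_center, true_sigma) + rng.normal(0, 0.01, ppm.size)

popt, _ = curve_fit(gaussian, ppm, y, p0=[0.5, 3.0, 0.05])
amp, center, sigma = popt
area = amp * abs(sigma) * np.sqrt(2 * np.pi)     # analytic Gaussian area
print(f"fitted center {center:.3f} ppm, peak area {area:.4f}")
```

    Basis-spectrum packages instead fit the whole spectrum as a weighted sum of measured metabolite basis functions, which is one reason the two methods can respond differently to small concentration changes.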

  1. Sequence signatures of allosteric proteins towards rational design.

    PubMed

    Namboodiri, Saritha; Verma, Chandra; Dhar, Pawan K; Giuliani, Alessandro; Nair, Achuthsankar S

    2010-12-01

    Allostery is the phenomenon of changes in the structure and activity of proteins that appear as a consequence of ligand binding at sites other than the active site. Studying the mechanistic basis of allostery, with a view to protein design with predetermined functional endpoints, is an important unmet need of synthetic biology. Here, we screened the amino acid sequence landscape in search of sequence signatures of allostery using the Recurrence Quantification Analysis (RQA) method. A characteristic vector comprising 10 features extracted from RQA was defined for amino acid sequences. Using Principal Component Analysis, four factors were found to be important determinants of allosteric behavior. Our sequence-based predictor shows 82.6% accuracy, 85.7% sensitivity and 77.9% specificity on the current dataset. Further, we show that Laminarity-Mean-Hydrophobicity, representing repeated hydrophobic patches, is the most crucial indicator of allostery. To the best of our knowledge, this is the first report that describes sequence determinants of allostery based on hydrophobicity. As an outcome of these findings, we plan to explore the possibility of inducing allostery in proteins.
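
    RQA features of this kind come from a recurrence matrix built over a numeric encoding of the sequence. A minimal sketch, assuming a Kyte-Doolittle-style hydrophobicity encoding (only a subset of residues shown) and computing determinism (DET), one member of the feature family the paper draws on:

```python
import numpy as np

# Hydrophobicity values (Kyte-Doolittle scale, subset for illustration).
KD = {"A": 1.8, "I": 4.5, "L": 3.8, "V": 4.2, "G": -0.4,
      "S": -0.8, "K": -3.9, "E": -3.5, "F": 2.8, "T": -0.7}

def recurrence_matrix(x, eps):
    """Binary recurrence matrix: 1 where two values are within eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

def determinism(rp, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n = rp.shape[0]
    in_lines = 0
    for k in range(-(n - 1), n):              # walk every diagonal
        run = 0
        for v in list(np.diagonal(rp, offset=k)) + [0]:  # sentinel flushes run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    total = rp.sum()
    return in_lines / total if total else 0.0

seq = "AILVAILVAILV"                          # hypothetical repetitive segment
x = [KD[a] for a in seq]
rp = recurrence_matrix(x, eps=0.5)
print(f"DET = {determinism(rp):.2f}")
```

    Laminarity is computed the same way over vertical rather than diagonal line structures.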

  2. Quantification of applied dose in irradiated citrus fruits by DNA Comet Assay together with image analysis.

    PubMed

    Cetinkaya, Nurcan; Ercin, Demet; Özvatan, Sümer; Erel, Yakup

    2016-02-01

    The experiments were conducted to quantify the applied dose for quarantine control of irradiated citrus fruits. Citrus fruits were exposed to doses of 0.1 to 1.5 kGy and analyzed by the DNA Comet Assay. The observed comets were evaluated by image analysis. The tail length, tail moment and tail DNA% of the comets were used for their interpretation. Irradiated citrus fruits showed tails increasingly separated from the comet head as the applied dose increased from 0.1 to 1.5 kGy. The mean tail length and mean tail moment values of irradiated citrus fruits at all doses were significantly different (p < 0.01) from the control, even at the lowest dose of 0.1 kGy. Thus, the DNA Comet Assay combined with image analysis may be a practical quarantine control method for irradiated citrus fruits, since it was possible to estimate applied doses as low as 0.1 kGy. Copyright © 2015 Elsevier Ltd. All rights reserved.
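
    The comet parameters named above can be computed from a one-dimensional intensity profile once the head/tail boundary is known. A minimal sketch (the profile and boundary are made-up numbers; "tail moment" is taken here as tail length × tail DNA fraction, one common definition):

```python
def comet_metrics(profile, head_end, px_um=1.0):
    """Tail length, tail DNA% and tail moment from a comet intensity
    profile (head at the start, tail after index `head_end`)."""
    total = sum(profile)
    tail = profile[head_end:]
    tail_dna_pct = 100.0 * sum(tail) / total
    # Tail length: distance from head boundary to last pixel above background.
    last = max((i for i, v in enumerate(tail) if v > 0), default=-1)
    tail_length = (last + 1) * px_um
    tail_moment = tail_length * tail_dna_pct / 100.0
    return tail_length, tail_dna_pct, tail_moment

profile = [50, 80, 90, 60, 20, 10, 8, 5, 2, 0, 0]   # hypothetical intensities
length, pct, moment = comet_metrics(profile, head_end=4)
print(f"tail length {length:.0f} px, tail DNA {pct:.1f}%, tail moment {moment:.2f}")
```

    Higher doses produce more DNA fragments migrating into the tail, which raises all three metrics together.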

  3. Recurrence quantification analysis of heart rate variability and respiratory flow series in patients on weaning trials.

    PubMed

    Arcentales, Andrés; Giraldo, Beatriz F; Caminal, Pere; Benito, Salvador; Voss, Andreas

    2011-01-01

    The autonomic nervous system regulates the behavior of the cardiac and respiratory systems. Its assessment during ventilator weaning can provide information about physio-pathological imbalances. This work proposes a nonlinear analysis of the complexity of heart rate variability (HRV) and breathing duration (T(Tot)) applying recurrence plots (RP), and of their interaction using joint recurrence plots (JRP). A total of 131 patients on weaning trials from mechanical ventilation were analyzed: 92 patients with successful weaning (group S) and 39 patients who failed to maintain spontaneous breathing (group F). The results show that parameters such as determinism (DET), average diagonal line length (L), and entropy (ENTR) are statistically significant with RP for the T(Tot) series, but not for HRV. When comparing the groups with JRP, all parameters were relevant. In all cases, mean values of the recurrence quantification analysis measures were higher in group S than in group F. The main differences between groups were found in the diagonal and vertical structures of the joint recurrence plot.
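
    A joint recurrence plot is the element-wise product of the individual recurrence plots, so a joint recurrence occurs only when both signals recur at the same pair of time indices. A minimal sketch with made-up HRV and breath-duration series:

```python
import numpy as np

def rp(x, eps):
    """Binary recurrence plot of a 1D series with threshold eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) <= eps).astype(int)

# Hypothetical HRV and breath-duration series (arbitrary units).
hrv  = np.array([0.80, 0.82, 0.95, 0.81, 0.79, 0.96, 0.80, 0.83])
ttot = np.array([3.1, 3.0, 4.2, 3.2, 3.1, 4.1, 3.0, 3.2])

rp_hrv, rp_ttot = rp(hrv, 0.03), rp(ttot, 0.15)
jrp = rp_hrv * rp_ttot                 # joint recurrence plot
rr = jrp.sum() / jrp.size              # joint recurrence rate
print(f"joint recurrence rate = {rr:.2f}")
```

    RQA measures such as DET, L, and ENTR can then be read off the diagonal and vertical line structures of `jrp`, which is where the paper found the clearest group differences.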

  4. Quantification of brain lipids by FTIR spectroscopy and partial least squares regression

    NASA Astrophysics Data System (ADS)

    Dreissig, Isabell; Machill, Susanne; Salzer, Reiner; Krafft, Christoph

    2009-01-01

    Brain tissue is characterized by high lipid content. This content decreases and the lipid composition changes during transformation from normal brain tissue to tumors. Therefore, the analysis of brain lipids might complement the existing diagnostic tools to determine tumor type and grade. The objective of this work was to extract lipids from gray matter and white matter of porcine brain tissue, record infrared (IR) spectra of these extracts, and develop a quantification model for the main lipids based on partial least squares (PLS) regression. IR spectra of the pure lipids cholesterol, cholesterol ester, phosphatidic acid, phosphatidylcholine, phosphatidylethanolamine, phosphatidylserine, phosphatidylinositol, sphingomyelin, galactocerebroside and sulfatide were used as references. Two lipid mixtures were prepared for training and validation of the quantification model. The composition of lipid extracts predicted by PLS regression of the IR spectra was compared with lipid quantification by thin layer chromatography.

  5. Ensemble-based uncertainty quantification for coordination and control of thermostatically controlled loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lian, Jianming; Engel, Dave

    2017-07-27

    This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.
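
    An ensemble-based UQ loop of the kind this framework describes can be sketched generically: sample the uncertain model parameters, run the load model for each ensemble member, and summarize the spread of the controlled quantity. The thermostatic load model below is a toy stand-in, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(42)

def aggregate_power(setpoint_offsets, outdoor_temp=32.0):
    """Toy model: each thermostatically controlled load draws up to 1 kW,
    scaled by how far outdoor temperature exceeds its (uncertain) setpoint."""
    setpoints = 22.0 + setpoint_offsets
    return np.sum(np.clip(0.1 * (outdoor_temp - setpoints), 0.0, 1.0))

# Ensemble: 500 draws of uncertain setpoint offsets for 100 loads.
members = np.array([aggregate_power(rng.normal(0.0, 1.0, size=100))
                    for _ in range(500)])

mean = members.mean()
lo, hi = np.percentile(members, [2.5, 97.5])
print(f"aggregate power: {mean:.1f} kW, 95% band [{lo:.1f}, {hi:.1f}] kW")
```

    A controller can then act on the whole predicted band rather than a single point estimate, which is the practical payoff of ensemble UQ for control.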

  6. Direct liquid chromatography method for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines.

    PubMed

    Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen

    2011-11-09

    A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on the analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.
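
    Detection and quantification limits of this kind are commonly derived from a calibration curve as LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope (ICH-style definitions). A minimal sketch with invented calibration data, not the paper's:

```python
import numpy as np

# Hypothetical calibration: concentration (mg/L) vs fluorescence signal.
conc   = np.array([0.05, 0.10, 0.25, 0.50, 1.00, 2.00])
signal = np.array([0.42, 0.81, 2.05, 4.10, 8.15, 16.30])

slope, intercept = np.polyfit(conc, signal, 1)
residuals = signal - (slope * conc + intercept)
sigma = residuals.std(ddof=2)        # residual SD (two fitted parameters)

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(f"slope {slope:.2f}, LOD {lod:.4f} mg/L, LOQ {loq:.4f} mg/L")
```

    By construction LOQ/LOD = 10/3.3, which is roughly the ratio visible in the reported limits for each analyte.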

  7. Quantification method analysis of the relationship between occupant injury and environmental factors in traffic accidents.

    PubMed

    Ju, Yong Han; Sohn, So Young

    2011-01-01

    Injury analysis following a vehicle crash is one of the most important research areas. However, most injury analyses have focused on one-dimensional injury variables, such as the AIS (Abbreviated Injury Scale) or the IIS (Injury Impairment Scale), considered one at a time in relation to various traffic accident factors; such studies cannot reflect the various injury phenomena that appear simultaneously. In this paper, we apply quantification method II to the NASS (National Automotive Sampling System) CDS (Crashworthiness Data System) to find the relationship between categorical injury phenomena, such as the injury scale, injury position, and injury type, and various traffic accident condition factors, such as speed, collision direction, vehicle type, and seat position. Our empirical analysis indicated the importance of safety devices, such as restraint equipment and airbags. In addition, we found that narrow impact, ejection, airbag deployment, and higher speed are associated with more severe than minor injury to the thigh, ankle, and leg in terms of dislocation, abrasion, or laceration. Copyright © 2010 Elsevier Ltd. All rights reserved.

  8. The variable and chaotic nature of professional golf performance.

    PubMed

    Stöckl, Michael; Lamb, Peter F

    2018-05-01

    In golf, unlike most other sports, individual performance is not the result of direct interactions between players. Instead decision-making and performance is influenced by numerous constraining factors affecting each shot. This study looked at the performance of PGA TOUR golfers in 2011 in terms of stability and variability on a shot-by-shot basis. Stability and variability were assessed using Recurrence Quantification Analysis (RQA) and standard deviation, respectively. About 10% of all shots comprised short stable phases of performance (3.7 ± 1.1 shots per stable phase). Stable phases tended to consist of shots of typical performance, rather than poor or exceptional shots; this finding was consistent for all shot categories. Overall, stability measures were not correlated with tournament performance. Variability across all shots was not related to tournament performance; however, variability in tee shots and short approach shots was higher than for other shot categories. Furthermore, tee shot variability was related to tournament standing: decreased variability was associated with better tournament ranking. The findings in this study showed that PGA TOUR golf performance is chaotic. Further research on amateur golf performance is required to determine whether the structure of amateur golf performance is universal.

  9. Quantification of synthesized hydration products using synchrotron microtomography and spectral analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboodt, Tyler; Ideker, Jason H.; Isgor, O. Burkan

    2017-12-01

    The use of x-ray computed tomography (CT) as a standalone method has primarily been used to characterize pore structure, cracking and mechanical damage in cementitious systems, due to low contrast among the hydrated phases. These limitations have resulted in the inability to extract quantifiable information on such phases. The goal of this research was to address the limitations caused by low contrast and to improve the ability to distinguish the four primary hydrated phases in portland cement: C-S-H, calcium hydroxide, monosulfate, and ettringite. Individual phases, binary mixtures of phases, and quaternary mixtures of phases representing a hydrated portland cement paste were imaged by x-ray CT with synchrotron radiation. Known masses of each phase were converted to volumes and compared to the segmented image volumes. It was observed that adequate contrast in binary mixtures of phases allowed for segmentation, and subsequent image analysis indicated that quantifiable volumes could be extracted from the tomographic volume. However, low contrast was observed when C-S-H and monosulfate were paired together, leading to difficulties segmenting in an unbiased manner. Quantification of phases in quaternary mixtures carried larger errors than binary mixtures due to histogram overlaps of monosulfate, C-S-H, and calcium hydroxide.
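
    The segmentation step described, thresholding a grayscale tomogram histogram to separate two phases, can be sketched with Otsu's method implemented directly in NumPy (synthetic bimodal gray levels stand in for real CT data):

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Return the gray level maximizing between-class variance (Otsu)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                         # class-0 cumulative weight
    w1 = 1.0 - w0
    csum = np.cumsum(p * centers)
    mu = (p * centers).sum()
    m0 = csum / np.where(w0 > 0, w0, 1)       # class means (guarded)
    m1 = (mu - csum) / np.where(w1 > 0, w1, 1)
    between = w0 * w1 * (m0 - m1) ** 2
    return centers[np.argmax(between)]

rng = np.random.default_rng(7)
# Synthetic volume: 30% bright phase, 70% dark matrix (arbitrary gray levels).
vol = np.concatenate([rng.normal(60, 8, size=70_000),
                      rng.normal(160, 10, size=30_000)])

t = otsu_threshold(vol)
fraction = (vol > t).mean()
print(f"threshold {t:.1f}, bright-phase volume fraction {fraction:.3f}")
```

    When two phase histograms overlap heavily, as with C-S-H and monosulfate, no single threshold separates the classes well, which is exactly the segmentation bias the authors report.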

  10. Subnuclear foci quantification using high-throughput 3D image cytometry

    NASA Astrophysics Data System (ADS)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are low-throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and hence are limited to counting a low number of foci per cell (~5 foci per nucleus), as the quantification process is extremely labour intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended maxima transform based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
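
    The counting step, a threshold followed by 3D connected-component labeling (a simplification of the extended-maxima approach used in the paper), can be sketched with scipy.ndimage on a synthetic nucleus:

```python
import numpy as np
from scipy import ndimage

# Synthetic 3D "nucleus": dim background with three bright foci.
vol = np.zeros((32, 32, 32))
for z, y, x in [(8, 10, 10), (16, 20, 6), (24, 12, 22)]:   # focus centers
    vol[z - 1:z + 2, y - 1:y + 2, x - 1:x + 2] = 1.0       # 3x3x3 blobs
vol += np.random.default_rng(3).normal(0.0, 0.05, vol.shape)

mask = vol > 0.5                        # intensity threshold
labels, n_foci = ndimage.label(mask)    # 3D connected-component labeling
sizes = ndimage.sum(mask, labels, index=range(1, n_foci + 1))
print(f"foci per nucleus: {n_foci}, voxel sizes: {sizes.astype(int)}")
```

    Working on the full 3D stack is what lets touching foci at different depths be resolved, which a 2D projection would merge into one.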

  11. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL, and we describe benchmark data sets for evaluating uncertainty quantification, along with an approach for using our benchmark generator to produce such data sets.

  12. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    PubMed Central

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variability caused by upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
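
    The quantification principle behind PSAQ-SRM, stable isotope dilution, reduces to a ratio calculation: a known amount of heavy-labeled protein standard is spiked into the sample, and the endogenous amount follows from the light/heavy SRM peak-area ratio. A minimal sketch with invented transition areas:

```python
def isotope_dilution_amount(light_area: float, heavy_area: float,
                            spiked_fmol: float) -> float:
    """Endogenous amount from the light/heavy SRM peak-area ratio and the
    known spiked amount of isotope-labeled standard."""
    return (light_area / heavy_area) * spiked_fmol

# Hypothetical transitions for one peptide; average over transitions.
transitions = [(12000.0, 15000.0), (8300.0, 10100.0), (4400.0, 5600.0)]
spiked = 50.0                               # fmol of heavy standard spiked
amounts = [isotope_dilution_amount(l, h, spiked) for l, h in transitions]
endogenous = sum(amounts) / len(amounts)
print(f"endogenous peptide: {endogenous:.1f} fmol")
```

    Because PSAQ spikes the full-length labeled protein before digestion, losses in sample handling and incomplete trypsin digestion affect the light and heavy forms equally and cancel in the ratio.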

  13. Accurate quantification of chromosomal lesions via short tandem repeat analysis using minimal amounts of DNA.

    PubMed

    Jann, Johann-Christoph; Nowak, Daniel; Nolte, Florian; Fey, Stephanie; Nowak, Verena; Obländer, Julia; Pressler, Jovita; Palme, Iris; Xanthopoulos, Christina; Fabarius, Alice; Platzbecker, Uwe; Giagounidis, Aristoteles; Götze, Katharina; Letsch, Anne; Haase, Detlef; Schlenk, Richard; Bug, Gesine; Lübbert, Michael; Ganser, Arnold; Germing, Ulrich; Haferlach, Claudia; Hofmann, Wolf-Karsten; Mossner, Maximilian

    2017-09-01

    Cytogenetic aberrations such as deletion of chromosome 5q (del(5q)) represent key elements in routine clinical diagnostics of haematological malignancies. Currently established methods such as metaphase cytogenetics, FISH or array-based approaches have limitations due to their dependency on viable cells, high costs or semi-quantitative nature. Importantly, they cannot be used on low abundance DNA. We therefore aimed to establish a robust and quantitative technique that overcomes these shortcomings. For precise determination of del(5q) cell fractions, we developed an inexpensive multiplex-PCR assay requiring only nanograms of DNA that simultaneously measures allelic imbalances of 12 independent short tandem repeat markers. Application of this method to n=1142 samples from n=260 individuals revealed strong intermarker concordance (R²=0.77-0.97) and reproducibility (mean SD: 1.7%). Notably, the assay showed accurate quantification via standard curve assessment (R²>0.99) and high concordance with paired FISH measurements (R²=0.92) even with subnanogram amounts of DNA. Moreover, cytogenetic response was reliably confirmed in del(5q) patients with myelodysplastic syndromes treated with lenalidomide. While the assay demonstrated good diagnostic accuracy in receiver operating characteristic analysis (area under the curve: 0.97), we further observed robust correlation between bone marrow and peripheral blood samples (R²=0.79), suggesting its potential suitability for less-invasive clonal monitoring. In conclusion, we present an adaptable tool for quantification of chromosomal aberrations, particularly in problematic samples, which should be easily applicable to further tumour entities. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  14. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, data about their composition are scarce. In this study, two chromatographic methods (GC-FID and HPLC-RID) were used in combination to quantify the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the contents of FOS, GOS and GOS/FOS were in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formulas and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Improving microstructural quantification in FIB/SEM nanotomography.

    PubMed

    Taillon, Joshua A; Pellegrinelli, Christopher; Huang, Yi-Lin; Wachsman, Eric D; Salamanca-Riba, Lourdes G

    2018-01-01

    FIB/SEM nanotomography (FIB-nt) is a powerful technique for the determination and quantification of the three-dimensional microstructure of subsurface features. Oftentimes, the microstructure of a sample is the ultimate determiner of the overall performance of a system, and a detailed understanding of its properties is crucial in advancing the materials engineering of a resulting device. While the FIB-nt technique has developed significantly in the 15 years since its introduction, advanced nanotomographic analysis is still far from routine, and a number of challenges remain in data acquisition and post-processing. In this work, we present a number of techniques to improve the quality of the acquired data, together with easy-to-implement methods to obtain "advanced" microstructural quantifications. The techniques are applied to a solid oxide fuel cell cathode of interest to the electrochemistry community, but the methodologies are easily adaptable to a wide range of material systems. Finally, results from an analyzed sample are presented as a practical example of how these techniques can be implemented. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. 43 CFR 11.71 - Quantification phase-service reduction quantification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...

  17. 43 CFR 11.71 - Quantification phase-service reduction quantification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...

  18. An accurate proteomic quantification method: fluorescence labeling absolute quantification (FLAQ) using multidimensional liquid chromatography and tandem mass spectrometry.

    PubMed

    Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin

    2012-08-01

    A facile proteomic quantification method, fluorescence labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, FLAQ performs chromatography-based quantification combined with MS for identification. Multidimensional liquid chromatography (MDLC) with high-accuracy laser-induced fluorescence (LIF) detection and a tandem MS system were employed for FLAQ. Several requirements must be met by the fluorescent label for MS identification: labeling completeness, minimal side-reactions, simple MS spectra, and no extra tandem MS fragmentation for structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was chosen to label proteins on all cysteine residues. The dye was compatible with trypsin digestion and MALDI MS identification. Quantitative labeling was achieved by optimizing the reaction conditions. A synthesized peptide and the model proteins BSA (35 cysteines) and OVA (five cysteines) were used to verify the completeness of labeling. Proteins were separated by MDLC and quantified based on fluorescence intensities, followed by MS identification. High accuracy (RSD% < 1.58) and wide quantification linearity (1-10(5)) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. A subset of proteins in the human liver proteome was quantified as a demonstration of FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Multiplexed data independent acquisition (MSX-DIA) applied by high resolution mass spectrometry improves quantification quality for the analysis of histone peptides.

    PubMed

    Sidoli, Simone; Fujiwara, Rina; Garcia, Benjamin A

    2016-08-01

    We present the MS-based application of the innovative, although scarcely exploited, multiplexed data-independent acquisition (MSX-DIA) for the analysis of histone PTMs. Histones are a gold standard for complexity in MS-based proteomics, due to their large number of combinatorial modifications, which lead to isobaric peptides after proteolytic digestion. DIA has thus gained popularity for this purpose, as it allows MS/MS-based quantification without upfront assay development. In this work, we evaluated the performance of traditional DIA versus MSX-DIA in terms of MS/MS spectra quality, instrument scan rate and quantification precision using histones from HeLa cells. We used an MS/MS isolation window of 10 and 6 m/z for DIA and MSX-DIA, respectively. Four MS/MS scans were multiplexed for MSX-DIA. Although MSX-DIA was programmed to perform two-fold more MS/MS events than traditional DIA, it acquired on average ∼5% more full MS scans, indicating an even faster scan rate. Results highlighted an overall decrease of background ion signals using MSX-DIA, and we illustrate specific examples where peptides of different precursor masses were co-fragmented by DIA but not by MSX-DIA. Taken together, MSX-DIA thus proved to be a more favorable method for histone analysis in data-independent mode. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Multiplexed data independent acquisition (MSX-DIA) applied by high resolution mass spectrometry improves quantification quality for the analysis of histone peptides

    PubMed Central

    Sidoli, Simone; Fujiwara, Rina; Garcia, Benjamin A.

    2016-01-01

    We present the mass spectrometry (MS)-based application of the innovative, although scarcely exploited, multiplexed data-independent acquisition (MSX-DIA) for the analysis of histone post-translational modifications (PTMs). Histones are a gold standard for complexity in MS-based proteomics, owing to their large number of combinatorial modifications, which lead to isobaric peptides after proteolytic digestion. DIA has thus gained popularity for this purpose, as it allows MS/MS-based quantification without upfront assay development. In this work, we evaluated the performance of traditional DIA versus MSX-DIA in terms of MS/MS spectral quality, instrument scan rate and quantification precision using histones from HeLa cells. We used MS/MS isolation windows of 10 and 6 m/z for DIA and MSX-DIA, respectively. Four MS/MS scans were multiplexed for MSX-DIA. Although MSX-DIA was programmed to perform 2-fold more MS/MS events than traditional DIA, it acquired on average ~5% more full MS scans, indicating an even faster scan rate. The results highlighted an overall decrease of background ion signals using MSX-DIA, and we illustrate specific examples in which peptides of different precursor masses were co-fragmented by DIA but not by MSX-DIA. Taken together, MSX-DIA thus proved to be a more favorable method for histone analysis in data-independent mode. PMID:27193262

  1. Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.

    PubMed

    Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro

    2016-09-02

    Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards.

  2. Headspace solid-phase microextraction and gas chromatographic analysis of low-molecular-weight sulfur volatiles with pulsed flame photometric detection and quantification by a stable isotope dilution assay.

    PubMed

    Ullrich, Sebastian; Neef, Sylvia K; Schmarr, Hans-Georg

    2018-02-01

    Low-molecular-weight volatile sulfur compounds such as thiols, sulfides, disulfides and thioacetates cause a sulfidic off-flavor in wines even at low concentration levels. The proposed analytical method for quantification of these compounds in wine is based on headspace solid-phase microextraction, followed by gas chromatographic analysis with sulfur-specific detection using a pulsed flame photometric detector. Robust quantification was achieved via a stable isotope dilution assay using commercial and synthesized deuterated isotopic standards. The necessary chromatographic separation of analytes and isotopic standards benefits from the inverse isotope effect realized on an apolar polydimethylsiloxane stationary phase of increased film thickness. Interferences with sulfur-specific detection in wine caused by sulfur dioxide were minimized by the addition of propanal. The method provides adequate validation data, with good repeatability and low limits of detection and quantification. It suits the requirements of wine quality management, allowing oenological treatments to be controlled so as to counteract the eventual formation of excessively high concentrations of such malodorous compounds. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Mixture quantification using PLS in plastic scintillation measurements.

    PubMed

    Bagán, H; Tarancón, A; Rauret, G; García, J F

    2011-06-01

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares; PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made toward this goal using liquid scintillation (LS), none had used PS, which has the great advantage of not producing mixed waste after the measurements are performed. To this end, ternary mixtures of alpha and beta emitters ((241)Am, (137)Cs and (90)Sr/(90)Y) were quantified. Procedure optimisation evaluated the use of net spectra versus sample spectra, the inclusion of spectra obtained at different values of the pulse shape analysis parameter, and the application of the PLS1 or PLS2 algorithms. The conclusions show that PS+PLS2 applied to the sample spectra, without any pulse shape discrimination, allows quantification of the activities with relative errors below 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required), and it does not require detectors that include the pulse shape analysis parameter. Copyright © 2011 Elsevier Ltd. All rights reserved.
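
    The PLS2-on-sample-spectra idea above can be sketched with synthetic data. This is a minimal illustration, not the paper's calibration: the Gaussian-shaped stand-ins for the (241)Am, (137)Cs and (90)Sr/(90)Y pulse-height spectra, the activity ranges, and the noise level are all invented for the example.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    channels = np.arange(256)

    # Invented single-nuclide reference spectra (Gaussian-shaped stand-ins).
    def peak(mu, sigma):
        return np.exp(-0.5 * ((channels - mu) / sigma) ** 2)

    pure = np.vstack([peak(60, 10), peak(120, 18), peak(190, 25)])

    # Training set: mixture spectra with known activities plus counting noise.
    Y_train = rng.uniform(0.1, 1.0, size=(40, 3))            # activities (a.u.)
    X_train = Y_train @ pure + rng.normal(0, 0.01, (40, 256))

    pls = PLSRegression(n_components=3)                       # PLS2: one joint model
    pls.fit(X_train, Y_train)

    # Quantify an "unknown" ternary mixture spectrum.
    y_true = np.array([[0.5, 0.3, 0.8]])
    x_unknown = y_true @ pure + rng.normal(0, 0.01, (1, 256))
    y_pred = pls.predict(x_unknown)
    ```

    Using PLS2 (one model predicting all three activities jointly) mirrors the paper's conclusion that a single multivariate model on raw sample spectra suffices, without pulse-shape discrimination.
    
    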

  4. Applicability of plasmid calibrant pTC1507 in quantification of TC1507 maize: an interlaboratory study.

    PubMed

    Meng, Yanan; Liu, Xin; Wang, Shu; Zhang, Dabing; Yang, Litao

    2012-01-11

    To enforce labeling regulations for genetically modified organisms (GMOs), the application of DNA plasmids as calibrants is becoming essential for the practical quantification of GMOs. This study reports the construction of the plasmid pTC1507 for a quantification assay of genetically modified (GM) maize TC1507 and an international collaborative ring trial validating its applicability as a plasmid calibrant. pTC1507 includes one event-specific sequence of TC1507 maize and one unique sequence of the maize endogenous gene zSSIIb. A total of eight GMO detection laboratories worldwide were invited to join the validation process, and test results were returned by all eight participants. Statistical analysis of the returned results showed that real-time PCR assays using pTC1507 as a calibrant in both GM event-specific and endogenous gene quantifications had high PCR efficiency (ranging from 0.80 to 1.15) and good linearity (ranging from 0.9921 to 0.9998). In a quantification assay of five blind samples, the bias between the test values and the true values ranged from 2.6 to 24.9%. All results indicated that the developed pTC1507 plasmid is applicable for the quantitative analysis of TC1507 maize and can be used as a suitable substitute for dried-powder certified reference materials (CRMs).

  5. Robust high-resolution quantification of time signals encoded by in vivo magnetic resonance spectroscopy

    NASA Astrophysics Data System (ADS)

    Belkić, Dževad; Belkić, Karen

    2018-01-01

    This paper on molecular imaging emphasizes improving specificity of magnetic resonance spectroscopy (MRS) for early cancer diagnostics by high-resolution data analysis. Sensitivity of magnetic resonance imaging (MRI) is excellent, but specificity is insufficient. Specificity is improved with MRS by going beyond morphology to assess the biochemical content of tissue. This is contingent upon accurate data quantification of diagnostically relevant biomolecules. Quantification is spectral analysis which reconstructs chemical shifts, amplitudes and relaxation times of metabolites. Chemical shifts inform on electronic shielding of resonating nuclei bound to different molecular compounds. Oscillation amplitudes in time signals retrieve the abundance of MR sensitive nuclei whose number is proportional to metabolite concentrations. Transverse relaxation times, the reciprocal of decay probabilities of resonances, arise from spin-spin coupling and reflect local field inhomogeneities. In MRS single voxels are used. For volumetric coverage, multi-voxels are employed within a hybrid of MRS and MRI called magnetic resonance spectroscopic imaging (MRSI). Common to MRS and MRSI is encoding of time signals and subsequent spectral analysis. Encoded data do not provide direct clinical information. Spectral analysis of time signals can yield the quantitative information, of which metabolite concentrations are the most clinically important. This information is equivocal with standard data analysis through the non-parametric, low-resolution fast Fourier transform and post-processing via fitting. By applying the fast Padé transform (FPT) with high-resolution, noise suppression and exact quantification via quantum mechanical signal processing, advances are made, presented herein, focusing on four areas of critical public health importance: brain, prostate, breast and ovarian cancers.

  6. Uncertainty quantification for PZT bimorph actuators

    NASA Astrophysics Data System (ADS)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, including the Robobee. We developed the model using the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem on the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  7. Monte Carlo Modeling-Based Digital Loop-Mediated Isothermal Amplification on a Spiral Chip for Absolute Quantification of Nucleic Acids.

    PubMed

    Xia, Yun; Yan, Shuangqian; Zhang, Xian; Ma, Peng; Du, Wei; Feng, Xiaojun; Liu, Bi-Feng

    2017-03-21

    Digital loop-mediated isothermal amplification (dLAMP) is an attractive approach for absolute quantification of nucleic acids with high sensitivity and selectivity. Theoretical and numerical analysis of dLAMP provides necessary guidance for the design and analysis of dLAMP devices. In this work, a mathematical model was proposed on the basis of the Monte Carlo method and the theories of Poisson statistics and chemometrics. To examine the established model, we fabricated a spiral chip with 1200 uniform and discrete reaction chambers (9.6 nL) for absolute quantification of pathogenic DNA samples by dLAMP. Under the optimized conditions, dLAMP analysis on the spiral chip achieved quantification of nucleic acids spanning over 4 orders of magnitude in concentration, with sensitivity as low as 8.7 × 10^-2 copies/μL in 40 min. The experimental results were consistent with the proposed mathematical model, which can provide a useful guideline for the future development of dLAMP devices.
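
    The Poisson statistics underlying digital absolute quantification can be shown in a short sketch. The chip geometry (1200 chambers of 9.6 nL) comes from the record above; the count of positive chambers is invented for the example.

    ```python
    import math

    def dlamp_concentration(n_total, n_positive, chamber_nl):
        """Poisson estimate of target concentration from a digital assay.

        With a fraction p of chambers positive, the mean number of copies
        per chamber is lambda = -ln(1 - p); dividing by the chamber volume
        gives the sample concentration.
        """
        p = n_positive / n_total
        lam = -math.log(1.0 - p)           # mean copies per chamber
        return lam * 1000.0 / chamber_nl   # copies/nL -> copies/uL

    # Example: 300 of 1200 chambers positive at 9.6 nL each -> ~30 copies/uL.
    c = dlamp_concentration(1200, 300, 9.6)
    ```

    The logarithmic correction accounts for chambers that received more than one template copy, which is why digital assays stay quantitative well above one copy per chamber.
    
    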

  8. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics

    PubMed Central

    Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-01-01

    Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329
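
    The nonlinear retention-time correction at the heart of cross-run alignment can be illustrated with a toy transfer function learned from anchor peptides seen in both runs. This is a simplified stand-in, not TRIC's graph-based algorithm: the moving-median smoother substitutes for the nonlinear smoothers real alignment tools use, and all retention times below are simulated.

    ```python
    import numpy as np

    def rt_alignment(ref_rt, run_rt):
        """Build a function mapping retention times of one run onto the
        reference run, from anchor peptides identified in both runs."""
        order = np.argsort(run_rt)
        x = np.asarray(run_rt, float)[order]
        y = np.asarray(ref_rt, float)[order]
        k = max(1, len(x) // 10)
        # moving-median smoothing suppresses occasional mis-assigned anchors
        ys = np.array([np.median(y[max(0, i - k):i + k + 1])
                       for i in range(len(x))])
        return lambda rt: np.interp(rt, x, ys)

    # Simulated nonlinear drift between two runs of the same sample.
    ref = np.linspace(10.0, 100.0, 60)
    run = ref + 0.002 * ref ** 1.5      # drift grows nonlinearly with RT
    warp = rt_alignment(ref, run)       # warp(run RT) approximates reference RT
    ```

    Once such a warp exists for every run, peak groups can be matched across runs in a common time axis, which is what enables consistent peak-picking.
    
    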

  9. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  10. Quantification of Spatial Heterogeneity in Old Growth Forest of Korean Pine

    Treesearch

    Wang Zhengquan; Wang Qingcheng; Zhang Yandong

    1997-01-01

    Spatial heterogeneity is a very important issue in studying functions and processes of ecological systems at various scales. Semivariogram analysis is an effective technique to summarize spatial data and quantify spatial heterogeneity. In this paper, we propose some principles for using semivariograms to characterize and compare spatial heterogeneity of...
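
    For a one-dimensional transect, the empirical semivariogram γ(h) = Σ(z_i − z_j)²/2N(h), taken over the N(h) pairs separated by distance ≈ h, can be computed directly; the tiny alternating transect below is an invented example, not the paper's data.

    ```python
    import numpy as np

    def empirical_semivariogram(coords, values, lags, tol):
        """gamma(h): half the mean squared difference between all sample
        pairs whose separation is within tol of lag h."""
        coords = np.asarray(coords, float)
        values = np.asarray(values, float)
        d = np.abs(coords[:, None] - coords[None, :])
        sq = (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(coords), k=1)   # count each pair once
        d, sq = d[iu], sq[iu]
        out = []
        for h in lags:
            mask = np.abs(d - h) <= tol
            out.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
        return np.array(out)

    # Alternating transect 0,1,0,1: semivariance is 0.5 at lag 1, 0 at lag 2.
    g = empirical_semivariogram([0, 1, 2, 3], [0, 1, 0, 1], lags=[1, 2], tol=0.1)
    ```

    The shape of γ(h) versus h (nugget, sill, range) is what carries the information about spatial heterogeneity that the paper's principles address.
    
    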

  11. Spot quantification in two dimensional gel electrophoresis image analysis: comparison of different approaches and presentation of a novel compound fitting algorithm

    PubMed Central

    2014-01-01

    Background Various computer-based methods exist for the detection and quantification of protein spots in two-dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot, and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of the spot signal. Other methods use the optical density, i.e., the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study, we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two-dimensional Gaussian function curves for the extraction of data from two-dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our method showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and can be combined with various spot detection methods. The algorithm was scripted in MATLAB (MathWorks) and is available as a supplemental file. PMID:24915860
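
    The volume-from-fit idea can be sketched on a single synthetic spot: fit a two-dimensional Gaussian and integrate it analytically (volume = 2πAσxσy). The spot parameters and noise level are invented; the paper's compound-fitting algorithm is more elaborate, in particular for overlapping spots.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss2d(xy, A, x0, y0, sx, sy, offset):
        """Elliptical 2-D Gaussian plus constant background, flattened."""
        x, y = xy
        return (A * np.exp(-0.5 * (((x - x0) / sx) ** 2
                                   + ((y - y0) / sy) ** 2)) + offset).ravel()

    # Synthetic spot on a 40x40 patch of a 2-DE-like image.
    yy, xx = np.mgrid[0:40, 0:40].astype(float)
    true = dict(A=100.0, x0=20.0, y0=18.0, sx=3.0, sy=4.0, offset=5.0)
    img = gauss2d((xx, yy), **true).reshape(40, 40)
    img += np.random.default_rng(1).normal(0, 1.0, img.shape)

    # Fit and compute the analytic spot volume from the parameters.
    p0 = (img.max() - img.min(), 20, 20, 2, 2, img.min())
    popt, _ = curve_fit(gauss2d, (xx, yy), img.ravel(), p0=p0)
    A_fit, sx_fit, sy_fit = popt[0], popt[3], popt[4]
    volume = 2 * np.pi * A_fit * abs(sx_fit) * abs(sy_fit)
    ```

    Unlike pixel summation over a fixed area, the fitted volume separates the spot from the background offset, which is one reason fitting remains robust when spots overlap.
    
    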

  12. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software tools for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition, acquired on two different MS instruments using different SWATH isolation window setups. For consistent evaluation, we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled the developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  13. USACM Thematic Workshop On Uncertainty Quantification And Data-Driven Modeling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stewart, James R.

    The USACM Thematic Workshop on Uncertainty Quantification and Data-Driven Modeling was held on March 23-24, 2017, in Austin, TX. The organizers of the technical program were James R. Stewart of Sandia National Laboratories and Krishna Garikipati of the University of Michigan. The administrative organizer was Ruth Hengst, who serves as Program Coordinator for the USACM. The organization of this workshop was coordinated through the USACM Technical Thrust Area on Uncertainty Quantification and Probabilistic Analysis. The workshop website (http://uqpm2017.usacm.org) includes the presentation agenda as well as links to several of the presentation slides (permission to access the presentations was granted by each of those speakers). Herein, this final report contains the complete workshop program, including the presentation agenda, the presentation abstracts, and the list of posters.

  14. Reductive amination derivatization for the quantification of garlic components by isotope dilution analysis.

    PubMed

    Lin, Yi-Reng; Huang, Mei-Fang; Wu, You-Ying; Liu, Meng-Chieh; Huang, Jing-Heng; Chen, Ziyu; Shiue, Yow-Ling; Wu, Chia-En; Liang, Shih-Shin

    2017-09-01

    In this work, we synthesized internal standards for four garlic organosulfur compounds (OSCs) by reductive amination with 13C,D2-formaldehyde, and developed an isotope dilution analysis method to quantify these organosulfur components in garlic samples. Internal standards were synthesized for absolute quantification of S-allylcysteine (SAC), S-allylcysteine sulfoxide (alliin), S-methylcysteine (SMC), and S-ethylcysteine (SEC). We used multiple reaction monitoring (MRM) to detect the 13C,D2-formaldehyde-modified OSCs by ultrahigh-performance liquid chromatography coupled with tandem mass spectrometry (UHPLC-MS/MS) and obtained MS spectra showing different ratios of 13C,D2-formaldehyde-modified and H2-formaldehyde-modified compounds. The resulting labeled and unlabeled OSCs exhibited correlation coefficients (R^2) ranging from 0.9989 to 0.9994. The average recoveries for the four OSCs at three concentration levels ranged from 89% to 105%. With 13C,D2-formaldehyde and sodium cyanoborohydride, the reductive amination-based method can be used to generate novel internal standards for isotope dilution and to extend its quantitative applications. Copyright © 2017 Elsevier Ltd. All rights reserved.
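
    The quantitation step in single-point isotope dilution reduces to a peak-area ratio against the spiked labeled standard. The peak areas and spike concentration below are hypothetical numbers for illustration only.

    ```python
    def isotope_dilution_conc(area_analyte, area_standard, standard_conc):
        """Single-point isotope dilution: the analyte concentration follows
        from the analyte / labeled-internal-standard peak-area ratio,
        assuming equal MS response for the light and heavy (13C,D2) forms."""
        return area_analyte / area_standard * standard_conc

    # Hypothetical MRM peak areas for SAC with the labeled standard spiked
    # at 2.0 ug/mL (all numbers illustrative) -> 2.8 ug/mL.
    c_sac = isotope_dilution_conc(8.4e5, 6.0e5, 2.0)
    ```

    Because the labeled standard co-elutes with the analyte and experiences the same matrix effects, the ratio cancels most sources of signal variation, which is what gives isotope dilution its accuracy.
    
    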

  15. Targeted Proteomic Quantification on Quadrupole-Orbitrap Mass Spectrometer*

    PubMed Central

    Gallien, Sebastien; Duriez, Elodie; Crone, Catharina; Kellmann, Markus; Moehring, Thomas; Domon, Bruno

    2012-01-01

    There is an immediate need for improved methods to systematically and precisely quantify large sets of peptides in complex biological samples. To date protein quantification in biological samples has been routinely performed on triple quadrupole instruments operated in selected reaction monitoring mode (SRM), and two major challenges remain. Firstly, the number of peptides to be included in one survey experiment needs to be increased to routinely reach several hundreds, and secondly, the degree of selectivity should be improved so as to reliably discriminate the targeted analytes from background interferences. High resolution and accurate mass (HR/AM) analysis on the recently developed Q-Exactive mass spectrometer can potentially address these issues. This instrument presents a unique configuration: it is constituted of an orbitrap mass analyzer equipped with a quadrupole mass filter as the front-end for precursor ion mass selection. This configuration enables new quantitative methods based on HR/AM measurements, including targeted analysis in MS mode (single ion monitoring) and in MS/MS mode (parallel reaction monitoring). The ability of the quadrupole to select a restricted m/z range allows one to overcome the dynamic range limitations associated with trapping devices, and the MS/MS mode provides an additional stage of selectivity. When applied to targeted protein quantification in urine samples and benchmarked with the reference SRM technique, the quadrupole-orbitrap instrument exhibits similar or better performance in terms of selectivity, dynamic range, and sensitivity. This high performance is further enhanced by leveraging the multiplexing capability of the instrument to design novel acquisition methods and apply them to large targeted proteomic studies for the first time, as demonstrated on 770 tryptic yeast peptides analyzed in one 60-min experiment. The increased quality of quadrupole-orbitrap data has the potential to improve existing protein

  16. Cues, quantification, and agreement in language comprehension.

    PubMed

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  17. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.

    PubMed

    Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina

    2017-11-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. © 2017 American Society of Plant Biologists. All Rights Reserved.

  18. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

    The standard-curve-based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimation at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines), and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union-authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogenous gene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
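
    GM content in ddPCR is typically reported as the ratio of event copies to endogenous-gene copies, each Poisson-corrected from droplet counts. The droplet counts below are invented to illustrate the arithmetic; they are not from the protocols above.

    ```python
    import math

    def copies_per_droplet(n_positive, n_total):
        """Poisson-corrected mean target copies per partition."""
        return -math.log(1.0 - n_positive / n_total)

    def gm_percentage(pos_event, pos_endogene, n_total):
        """GM content as the event/endogene copy ratio, in percent
        (copy-number basis)."""
        return (100.0 * copies_per_droplet(pos_event, n_total)
                / copies_per_droplet(pos_endogene, n_total))

    # Invented counts: 900 of 15000 droplets positive for the GM
    # event-specific assay, 9000 positive for the maize endogene assay.
    pct = gm_percentage(900, 9000, 15000)
    ```

    Taking the ratio to the endogene measured in the same reaction is what removes any dependence on the DNA amount loaded, so no standard curve is needed.
    
    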

  19. A new clinical unit for digital radiography based on a thick amorphous selenium plate: physical and psychophysical characterization.

    PubMed

    Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco; Acchiappati, Domenico

    2011-08-01

    Here, we present a physical and psychophysical characterization of a new clinical unit (named AcSelerate) for digital radiography based on a thick a-Se layer. We also compared images acquired with and without a software filter (named CRF) developed to reduce the sharpness and noise of the images and make them similar to images from traditional computed radiography systems. The characterization was performed in terms of physical figures of merit [modulation transfer function (MTF), noise power spectra (NPS), detective quantum efficiency (DQE)] and psychophysical parameters (contrast-detail analysis with automatic reading of CDRAD images). We carried out measurements with four standard beam conditions: RQA3, RQA5, RQA7, and RQA9. The system shows an excellent MTF (about 50% at the Nyquist frequency). The DQE is about 55% at 0.5 lp/mm and above 20% at the Nyquist frequency, and it is almost independent of exposure. The contrast-detail curves are comparable to some of the best published data for other systems devoted to general radiography. The CRF filter influences both the MTF and the NPS, but it leads to only very small changes in the DQE. The visibility of CDRAD details is also essentially unaltered when the filter is activated. As is typical of detectors based on direct conversion, the system presents an excellent MTF. The improved efficiency afforded by the thick layer yields good noise characteristics and DQE results that are better (by about 10% on average) than many computed radiography (CR) systems and comparable to those obtained by the best digital radiography systems available on the market.

  20. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    PubMed Central

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
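
    The random-positioning test described above can be sketched as a Monte Carlo simulation: place both cell populations uniformly at random and ask how often chance alone produces at least the observed number of contacts. The field size, cell counts, contact radius and observed-contact number below are all invented for the example.

    ```python
    import numpy as np

    def contact_pvalue(obs_contacts, n_a, n_b, radius, width, height,
                       trials=500, seed=0):
        """Monte Carlo co-localization test: fraction of random placements
        in which at least obs_contacts A cells have a B cell within
        `radius` (2-D field of width x height)."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(trials):
            a = rng.uniform([0.0, 0.0], [width, height], (n_a, 2))
            b = rng.uniform([0.0, 0.0], [width, height], (n_b, 2))
            d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
            contacts = int((d.min(axis=1) <= radius).sum())
            if contacts >= obs_contacts:
                hits += 1
        return hits / trials

    # Invented numbers: 12 observed contacts among 50 cells of each type in
    # a 1000 x 1000 um field, contact radius 10 um.
    p = contact_pvalue(12, 50, 50, 10.0, 1000.0, 1000.0)
    ```

    A small p suggests the observed co-localization is unlikely under random positioning; this is especially useful when the two cell types differ strongly in frequency, exactly the situation the protocol highlights.
    
    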

  1. Quantification and characterization of Si in Pinus Insignis Dougl by TXRF

    NASA Astrophysics Data System (ADS)

    Navarro, Henry; Bennun, Leonardo; Marcó, Lué M.

    2015-03-01

    A simple quantification of silicon is described in woods such as Pinus Insignis Dougl obtained from the 8th region of Bío-Bío (37°15′ South, 73°19′ West), Chile. The samples were prepared through fractional calcination, and the ashes were directly analyzed by the total reflection X-ray fluorescence (TXRF) technique. The analysis of 16 calcined samples is presented. The samples were weighed on plastic reflectors in a microbalance with a sensitivity of 0.1 µg. The samples were then irradiated in a TXRF PICOFOX spectrometer for 350 and 700 s. Cobalt was added to each sample as an internal standard. Silicon concentrations above 1% were observed in each sample, along with a self-absorption effect on the quantification for masses higher than 100 μg.
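TXRF quantification with an internal standard reduces to a ratio of net peak intensities scaled by relative sensitivities. The sketch below illustrates that standard relation; the function name and example values are illustrative, not from the paper:

```python
def txrf_concentration(n_x, n_is, s_x, s_is, c_is):
    """Concentration of analyte x by TXRF internal-standard quantification.

    n_x, n_is : net fluorescence peak intensities of analyte and internal
                standard (counts)
    s_x, s_is : relative elemental sensitivities of analyte and standard
    c_is      : known concentration of the internal standard (e.g. Co) added
    """
    return (n_x / n_is) * (s_is / s_x) * c_is
```

For example, an analyte peak twice as intense as the standard's, with half the standard's sensitivity, corresponds to four times the standard's concentration.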

  2. Automated quantification of myocardial perfusion SPECT using simplified normal limits.

    PubMed

    Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido

    2005-01-01

    To simplify the development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and with visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 ± 0.02) was higher than by visual scoring (0.83 ± 0.03) (P = .039) or standard quantification (0.82 ± 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 ± 0.02) than for standard quantification (0.85 ± 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.
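A TPD-style score folds defect extent and severity into a single number by comparing each polar-map sample against the normal database. The sketch below is a simplified illustration of that idea only; the severity floor and normalization are assumptions, not the paper's exact algorithm:

```python
import numpy as np

def total_perfusion_deficit(counts, normal_mean, normal_sd, floor=3.0):
    """TPD-like score over polar-map samples (illustrative, not the
    published algorithm).

    counts      : normalized perfusion counts per polar-map sample
    normal_mean : per-sample means from the low-likelihood normal database
    normal_sd   : per-sample standard deviations from the same database
    floor       : severity cap, in SDs below the normal mean

    Returns percent of the maximal possible deficit (0 = normal scan).
    """
    z = (normal_mean - counts) / normal_sd   # positive where hypoperfused
    severity = np.clip(z, 0.0, floor)        # ignore supra-normal samples
    return 100.0 * severity.mean() / floor
```

A scan matching the normal database everywhere scores 0%, while one uniformly `floor` SDs below normal scores 100%.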

  3. Quantification of immobilized Candida antarctica lipase B (CALB) using ICP-AES combined with Bradford method.

    PubMed

    Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L

    2017-02-01

    The aim of this work was to study the application of a new method of protein quantification to commercial Candida antarctica lipase B solutions. Error sources associated with the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. Magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS). CALB was then adsorbed on the modified support. The proposed protein quantification method included the determination of sulfur (from protein in the CALB solution) by atomic emission with inductively coupled plasma (AE-ICP). Four different protocols were applied combining AE-ICP and classical Bradford assays, as well as carbon, hydrogen, and nitrogen (CHN) analysis. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as standard ranged from 400 to 1200% when protein in CALB solution was quantified. These errors were calculated by taking the amount of immobilized protein obtained with the improved method as the "true" protein content. The optimum quantification procedure involved the combination of the Bradford method, ICP, and CHN analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. Localized 2D COSY sequences: Method and experimental evaluation for a whole metabolite quantification approach

    NASA Astrophysics Data System (ADS)

    Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène

    2015-11-01

    Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole-metabolite quantification approach, the quantification performance of correlation spectroscopy is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY, and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable the computation of the lower bound on error variance, generally known as the Cramér-Rao bounds (CRBs, a standard of precision), for the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared to JPRESS. A quick analysis might suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
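For a least-squares fit with Gaussian noise, the Fisher matrix formalism mentioned above gives the Cramér-Rao bounds directly from the model Jacobian. A minimal sketch of that computation (a generic illustration, not the paper's specific signal model):

```python
import numpy as np

def cramer_rao_bounds(jacobian, noise_sd):
    """Lower bounds on the SD of unbiased parameter estimates.

    jacobian : (n_samples, n_params) matrix of model derivatives with
               respect to each fitted parameter
    noise_sd : standard deviation of the (white Gaussian) noise

    Fisher matrix F = J^T J / sigma^2; CRB_k = sqrt([F^-1]_kk).
    """
    fisher = jacobian.T @ jacobian / noise_sd**2
    return np.sqrt(np.diag(np.linalg.inv(fisher)))
```

Relative CRBs, as used in the abstract's LCOSY-vs-JPRESS comparison, are obtained by dividing each bound by the corresponding concentration estimate.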

  5. Direct quantification of lipopeptide biosurfactants in biological samples via HPLC and UPLC-MS requires sample modification with an organic solvent.

    PubMed

    Biniarz, Piotr; Łukaszewicz, Marcin

    2017-06-01

    The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.

  6. Ferromagnetic resonance for the quantification of superparamagnetic iron oxide nanoparticles in biological materials

    PubMed Central

    Gamarra, Lionel F; daCosta-Filho, Antonio J; Mamani, Javier B; de Cassia Ruiz, Rita; Pavon, Lorena F; Sibov, Tatiana T; Vieira, Ernanni D; Silva, André C; Pontuschka, Walter M; Amaro, Edson

    2010-01-01

    The aim of the present work is to present a quantification methodology for controlling the amount of superparamagnetic iron oxide nanoparticles (SPIONs) administered in biological materials by means of the ferromagnetic resonance (FMR) technique, applied to studies both in vivo and in vitro. The in vivo study consisted of the analysis of the elimination and biodistribution kinetics of SPIONs after intravenous administration in Wistar rats. The results were corroborated by X-ray fluorescence. For the in vitro study, a quantitative analysis of the concentration of SPIONs bound to specific AC133 monoclonal antibodies was carried out in order to detect the expression of the antigenic epitopes (CD133) in stem cells from human umbilical cord blood. In both studies FMR proved to be an efficient technique for the quantification of SPIONs per unit volume (in vivo) or per labeled cell (in vitro). PMID:20463936

  7. Multiplex quantification of protein toxins in human biofluids and food matrices using immunoextraction and high-resolution targeted mass spectrometry.

    PubMed

    Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François

    2015-08-18

    The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to direct measurement and protein characterization capability. We developed an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of three potential biological warfare agents, ricin, staphylococcal enterotoxin B, and epsilon toxin, in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early into the sample. Lower limits of quantification were determined at or close to 1 ng·mL⁻¹. The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability, with no evidence of toxin degradation in milk within a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.
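Quantification with a stable-isotope-labeled standard spiked into the sample reduces, at its core, to a peak-area ratio against the known spiked concentration, since analyte and standard co-elute and ionize identically. A minimal sketch of that principle (names and numbers are illustrative):

```python
def toxin_concentration(area_light, area_heavy, spiked_conc_ng_ml):
    """Isotope-dilution quantification.

    area_light        : MS peak area of the endogenous (light) peptide
    area_heavy        : MS peak area of the isotope-labeled (heavy) standard
    spiked_conc_ng_ml : known concentration of the spiked standard (ng/mL)

    The analyte concentration is the light/heavy area ratio times the
    spiked standard concentration.
    """
    return (area_light / area_heavy) * spiked_conc_ng_ml
```

Spiking the standard early in the workflow, as the abstract notes, means losses during immunoextraction affect both species equally and cancel in the ratio.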

  8. Automated quantification of pancreatic β-cell mass

    PubMed Central

    Golson, Maria L.; Bush, William S.

    2014-01-01

    β-Cell mass is a parameter commonly measured in studies of islet biology and diabetes. However, the rigorous quantification of pancreatic β-cell mass using conventional histological methods is a time-consuming process. Rapidly evolving virtual slide technology with high-resolution slide scanners and newly developed image analysis tools has the potential to transform β-cell mass measurement. To test the effectiveness and accuracy of this new approach, we assessed pancreata from normal C57Bl/6J mice and from mouse models of β-cell ablation (streptozotocin-treated mice) and β-cell hyperplasia (leptin-deficient mice), using a standardized systematic sampling of pancreatic specimens. Our data indicate that automated analysis of virtual pancreatic slides is highly reliable and yields results consistent with those obtained by conventional morphometric analysis. This new methodology will allow investigators to dramatically reduce the time required for β-cell mass measurement by automating high-resolution image capture and analysis of entire pancreatic sections. PMID:24760991
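Whether measured by conventional morphometry or by automated virtual-slide analysis, the underlying β-cell mass calculation is the insulin-positive area fraction scaled by the pancreas wet weight. A minimal sketch (function name and units are illustrative):

```python
def beta_cell_mass(insulin_pos_area, total_tissue_area, pancreas_weight_mg):
    """Morphometric beta-cell mass estimate.

    insulin_pos_area   : insulin-positive (beta-cell) area summed over the
                         sampled sections (any consistent area unit)
    total_tissue_area  : total pancreatic tissue area in the same sections
    pancreas_weight_mg : wet weight of the pancreas (mg)

    mass = area fraction x pancreas weight.
    """
    return (insulin_pos_area / total_tissue_area) * pancreas_weight_mg
```

For instance, a 5% insulin-positive area fraction in a 200 mg pancreas corresponds to 10 mg of β-cell mass.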

  9. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    PubMed Central

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-01-01

    The advantages of digital PCR technology are already well documented. One way to achieve better cost efficiency with the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization, higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters of such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets. PMID:27739510
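Droplet digital PCR converts the count of positive droplets into an absolute concentration through a Poisson correction, since a single positive droplet may contain more than one target copy. A minimal sketch; the 0.85 nL droplet volume is the nominal Bio-Rad value and is stated here as an assumption:

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
    """Target concentration (copies/uL) from droplet counts.

    n_positive        : number of fluorescence-positive droplets
    n_total           : total number of accepted droplets
    droplet_volume_ul : droplet volume in uL (0.85 nL assumed)

    Poisson correction: lambda = -ln(fraction of negative droplets) gives
    the mean number of target copies per droplet.
    """
    lam = -math.log((n_total - n_positive) / n_total)
    return lam / droplet_volume_ul
```

With half the droplets positive, the mean occupancy is ln 2 ≈ 0.69 copies per droplet, noticeably higher than the naive 0.5.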

  10. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    NASA Astrophysics Data System (ADS)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are already well documented. One way to achieve better cost efficiency with the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization, higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters of such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  11. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection.

    PubMed

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-14

    The advantages of digital PCR technology are already well documented. One way to achieve better cost efficiency with the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization, higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters of such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  12. WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marques da Silva, A; Fischer, A

    2015-06-15

    Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV) performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV recovery coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization, in each scanner, are those which achieve the EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy's effectiveness, the maximum differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that obtained the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for qualitative PET/CT analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17mm). Conclusion: The harmonization strategy for SUV quantification implemented with these devices was effective in reducing the variability of small-structure quantification, minimizing inter-scanner and inter-institution differences in quantification. However, it is essential that, in addition to the harmonization of quantification, the standardization of
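The MARP selection step described above is a straightforward minimum-RMSE search over candidate reconstruction parameter sets, comparing each set's sphere recovery coefficients against EARL targets. A minimal sketch, where the target values are assumed to be, e.g., midpoints of the EARL accepted RC bands:

```python
import numpy as np

def pick_harmonizing_params(measured_rc, target_rc):
    """Select the reconstruction parameter set with the lowest RMSE.

    measured_rc : dict mapping a parameter-set label to the measured
                  recovery coefficients of the phantom spheres
    target_rc   : array of target RCs (e.g. EARL band midpoints), one
                  per sphere, in the same order

    Returns the label whose RC curve best matches the targets.
    """
    rmse = {label: float(np.sqrt(np.mean((np.asarray(rc) - target_rc) ** 2)))
            for label, rc in measured_rc.items()}
    return min(rmse, key=rmse.get)
```

The same dictionary of RMSE values can also report how far each clinical protocol sits from the harmonized standard.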

  13. Pesticide residue quantification analysis by hyperspectral imaging sensors

    NASA Astrophysics Data System (ADS)

    Liao, Yuan-Hsun; Lo, Wei-Sheng; Guo, Horng-Yuh; Kao, Ching-Hua; Chou, Tau-Meu; Chen, Junne-Jih; Wen, Chia-Hsien; Lin, Chinsu; Chen, Hsian-Min; Ouyang, Yen-Chieh; Wu, Chao-Cheng; Chen, Shih-Yu; Chang, Chein-I.

    2015-05-01

    Pesticide residue detection in agricultural crops is a challenging issue, and it is even more difficult to quantify pesticide residues in agricultural produce and fruits. This paper conducts a series of baseline experiments particularly designed for three specific pesticides commonly used in Taiwan. The materials used for the experiments are single leaves of vegetable produce contaminated by various concentrations of pesticides. Two sensors are used to collect data. One is a Fourier transform infrared (FTIR) spectroscope. The other is a hyperspectral sensor, the Geophysical and Environmental Research (GER) 2600 spectroradiometer, which is a battery-operated, field-portable spectroradiometer with full real-time data acquisition from 350 nm to 2500 nm. In order to quantify data with different levels of pesticide residue concentration, several measures for spectral discrimination are developed. More specifically, new measures for calculating the relative power of the two sensors are designed to evaluate the effectiveness of each sensor in quantifying the used pesticide residues. The experimental results show that the GER is a better sensor than FTIR for pesticide residue quantification.
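The abstract does not spell out the discrimination measures used; a common baseline for comparing spectra independent of illumination scale is the spectral angle, sketched here as an assumed stand-in rather than the paper's actual measure:

```python
import numpy as np

def spectral_angle(s1, s2):
    """Spectral angle (radians) between two spectra.

    Smaller angles mean more similar spectral shapes; the measure is
    invariant to a uniform scaling of either spectrum, which makes it
    robust to illumination differences.
    """
    s1 = np.asarray(s1, float)
    s2 = np.asarray(s2, float)
    cos = np.dot(s1, s2) / (np.linalg.norm(s1) * np.linalg.norm(s2))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```

Ranking leaf spectra by their angle to a clean-leaf reference then gives a scalar proxy that can be correlated with pesticide concentration.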

  14. Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.

    PubMed

    Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio

    2018-01-01

    Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification, competing with classical immunoaffinity techniques. They provide a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides, typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics, particularly for non-model species, since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop an MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model, database-underrepresented species Eriobotrya japonica Lindl.

  15. Quantification of myocardial fibrosis by digital image analysis and interactive stereology.

    PubMed

    Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas

    2014-06-09

    Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered the ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using the automated Colocalization and Genie software, by stereology grid count, and manually by the pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and the pathologist's score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie software (r>0.9, p<0.001) as well as with the pathologist's visual score. Differences in fibrosis quantification between Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by Genie versus RVS and by the pathologist's score versus RVS, with mean differences of -1.61% and 2.24%, respectively. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: the Colocalization software overestimated the area fraction of fibrosis at the lower end and underestimated it at the higher end of the RVS values, while the Genie software and the pathologist's score showed more uniform results throughout, with a slight underestimation in the mid-range for both. Both applied digital image analysis
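The Bland-Altman comparison used above summarizes method agreement by the mean difference (bias) and 95% limits of agreement. A minimal sketch of that standard computation (names are illustrative):

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired measurement
    methods (e.g. software area fraction vs. stereology reference).

    Returns (bias, lower_limit, upper_limit), where the limits are
    bias +/- 1.96 * SD of the paired differences.
    """
    d = np.asarray(method_a, float) - np.asarray(method_b, float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias, bias - half_width, bias + half_width
```

A magnitude-dependent bias, as reported for the Colocalization software, shows up as a trend of the differences against the pairwise means rather than in these summary numbers alone.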

  16. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is also necessary for reliability certification. To quantify the uncertainty, it is most important to analyze how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  17. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    PubMed

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    through knowledge-based rules employing colour space transformations and structural feature extraction from the images. In particular, renal glomerulus identification is based on multiscale textural feature analysis and a support vector machine. The regions of the biopsy representing interstitial fibrosis are deduced by eliminating non-interstitial-fibrosis structures from the biopsy area. The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnoses. A 40-image ground truth dataset was manually prepared in consultation with an experienced pathologist for the validation of the segmentation algorithms. Experiments involving experienced pathologists demonstrated an average error of 9 percentage points between the quantification results of the automated system and the pathologists' visual evaluation. Experiments investigating variability among pathologists, involving samples from 70 kidney patients, also showed the automated quantification error rate to be on par with the average intra-observer variability in the pathologists' quantification. The accuracy of the proposed quantification system has been validated with the ground truth dataset and compared against the pathologists' quantification results. It has been shown that the correlation between different pathologists' estimates of interstitial fibrosis area significantly improved, demonstrating the effectiveness of the quantification system as a diagnostic aide. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Sources of hydrocarbons in urban road dust: Identification, quantification and prediction.

    PubMed

    Mummullage, Sandya; Egodawatta, Prasanna; Ayoko, Godwin A; Goonetilleke, Ashantha

    2016-09-01

    Among urban stormwater pollutants, hydrocarbons are a significant environmental concern due to their toxicity and relatively stable chemical structure. This study focused on the identification of sources contributing hydrocarbons to urban road dust and on approaches for quantifying pollutant loads to enhance the design of source control measures. The study confirmed the validity of the mathematical techniques of principal component analysis (PCA) and hierarchical cluster analysis (HCA) for source identification, and of the principal component analysis/absolute principal component scores (PCA/APCS) receptor model for pollutant load quantification. Study outcomes identified non-combusted lubrication oils, non-combusted diesel fuels, and tyre and asphalt wear as the three most critical urban hydrocarbon sources. The site-specific variability of the contributions from sources was replicated using three mathematical models. The models employed the predictor variables of daily traffic volume (DTV), road surface texture depth (TD), slope of the road section (SLP), effective population (EPOP), and effective impervious fraction (EIF), which can be considered the five governing parameters of pollutant generation, deposition, and redistribution. The models were developed such that they are applicable to determining hydrocarbon contributions from urban sites, enabling effective design of source control measures. Copyright © 2016 Elsevier Ltd. All rights reserved.
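The PCA/APCS receptor model rescales principal component scores into "absolute" scores by subtracting the score of a hypothetical zero-concentration sample, then regresses the measured total load on those absolute scores to apportion it among sources. A simplified sketch of that pipeline under stated assumptions (z-scored data, covariance-based PCA, no factor rotation), not the paper's exact implementation:

```python
import numpy as np

def apcs_contributions(conc, n_sources):
    """Rough PCA/APCS source-apportionment sketch.

    conc      : (n_samples, n_species) matrix of measured concentrations
    n_sources : number of retained components (assumed sources)

    Returns regression coefficients [intercept, source_1, ..., source_k]
    from fitting the total load on the absolute principal component scores.
    """
    conc = np.asarray(conc, float)
    mu, sd = conc.mean(0), conc.std(0)
    z = (conc - mu) / sd
    # Leading principal components from the covariance eigendecomposition
    # (eigh returns ascending eigenvalues, so reverse the columns).
    _, vecs = np.linalg.eigh(np.cov(z.T))
    pcs = vecs[:, ::-1][:, :n_sources]
    scores = z @ pcs
    z0 = (0.0 - mu) / sd              # z-score of a true-zero sample
    apcs = scores - z0 @ pcs          # absolute principal component scores
    total = conc.sum(1)
    design = np.column_stack([np.ones(len(total)), apcs])
    coef, *_ = np.linalg.lstsq(design, total, rcond=None)
    return coef
```

In a full receptor-model workflow the rotated loadings would first be matched to physical sources (e.g. lubrication oils, diesel, tyre/asphalt wear) before interpreting the slopes as source contributions.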

  19. Quantification of the resist dissolution process: an in situ analysis using high speed atomic force microscopy

    NASA Astrophysics Data System (ADS)

    Santillan, Julius Joseph; Shichiri, Motoharu; Itani, Toshiro

    2016-03-01

    This work focuses on the application of a high-speed atomic force microscope (HS-AFM) to the in situ visualization and quantification of the resist dissolution process. This technique, as reported in the past, has provided useful pointers on the formation of resist patterns during dissolution. This paper discusses an investigation of the quantification of what we refer to as the "dissolution unit size", or the basic units of patterning material dissolution. This was done through an originally developed analysis method which extracts the difference between two successive temporal states of the material film surface (images) to indicate the amount of change occurring in the film over a specific span of time. Preliminary experiments with actual patterning materials were done using a positive-tone EUV model resist composed only of polyhydroxystyrene (PHS)-based polymer with a molecular weight of 2,500 and a polydispersity index of 1.2. In the absence of a protecting group, the material was used at a 50 nm film thickness with a post-application bake of 90°C/60 s. The resulting film is soluble in the alkali-based developer even without exposure. Results have shown that the dissolution components (dissolution unit sizes) of the PHS-based material are not of fixed size. Instead, it was found that aside from one constantly dissolving unit size, another, much larger dissolution unit size trend also occurs during material dissolution. The presence of this larger dissolution unit size suggests an occurrence of "polymer clustering". Such polymer clustering was not significantly present during the initial stages of dissolution (near the original film surface) but becomes increasingly evident after the dissolution process reaches a certain film thickness below the initial surface.
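The core of the described analysis is differencing successive AFM height images to measure how much material left the film in each time interval. A minimal sketch of that idea (the clipping of apparent height increases, treated here as noise or swelling, is an assumption, not the paper's stated procedure):

```python
import numpy as np

def dissolved_volume_per_frame(frames, pixel_area_nm2=1.0):
    """Material volume lost between successive AFM height images.

    frames         : (n_frames, h, w) stack of height maps (nm)
    pixel_area_nm2 : lateral area represented by one pixel (nm^2)

    For each pair of consecutive frames, sum only the per-pixel height
    decreases (clipping apparent growth to zero) and scale by pixel area.
    Returns an array of n_frames - 1 volumes.
    """
    frames = np.asarray(frames, float)
    drops = np.clip(frames[:-1] - frames[1:], 0.0, None)
    return drops.sum(axis=(1, 2)) * pixel_area_nm2
```

A histogram of the per-interval volumes (or of connected regions within each difference image) would then reveal the "dissolution unit size" distribution, including the larger clustered events the abstract reports.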

  20. A flow-cytometry-based method to simplify the analysis and quantification of protein association to chromatin in mammalian cells

    PubMed Central

    Forment, Josep V.; Jackson, Stephen P.

    2016-01-01

    Protein accumulation on chromatin has traditionally been studied using immunofluorescence microscopy or biochemical cellular fractionation followed by western immunoblot analysis. As a way to improve the reproducibility of this kind of analysis, make it easier to quantify, and allow a streamlined application in high-throughput screens, we recently combined a classical immunofluorescence microscopy detection technique with flow cytometry. In addition to the features described above, by combining it with detection of both DNA content and DNA replication, this method allows unequivocal and direct assignment of the cell-cycle distribution of protein association to chromatin without the need for cell culture synchronization. Furthermore, it is relatively quick (no more than a working day from sample collection to quantification), requires less starting material than standard biochemical fractionation methods, and overcomes the need for the flat, adherent cell types required for immunofluorescence microscopy. PMID:26226461

  1. An efficient assisted history matching and uncertainty quantification workflow using Gaussian processes proxy models and variogram based sensitivity analysis: GP-VARS

    NASA Astrophysics Data System (ADS)

    Rana, Sachin; Ertekin, Turgay; King, Gregory R.

    2018-05-01

    Reservoir history matching is frequently viewed as an optimization problem that involves minimizing the misfit between simulated and observed data. Many gradient- and evolutionary-strategy-based optimization algorithms have been proposed to solve this problem, which typically require a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study, which uses forward and inverse Gaussian process (GP) based proxy models combined with a novel application of variogram analysis of response surface (VARS) based sensitivity analysis to efficiently solve high-dimensional history matching problems. An empirical Bayes approach is proposed to optimally train the GP proxy models for any given data. The history matching solutions are found via Bayesian optimization (BO) on the forward GP models and via predictions of the inverse GP model in an iterative manner. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology to the PUNQ-S3 reservoir is presented, in which it is shown that GP-VARS finds history match solutions with approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true values for the PUNQ-S3 reservoir.
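    As a rough illustration of the proxy-modeling step, a Gaussian process regressor can be trained on a handful of simulator runs and then queried cheaply in place of the simulator. The sketch below is a generic numpy-only GP fit to a toy one-parameter misfit curve, under assumed kernel settings; it is not the GP-VARS implementation.

```python
import numpy as np

def rbf_kernel(A, B, length=0.3, var=1.0):
    # Squared-exponential covariance between row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_fit_predict(X, y, Xs, noise=1e-4):
    # GP posterior mean and variance at test points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(Xs, X)
    mean = Ks @ np.linalg.solve(K, y)
    var = rbf_kernel(Xs, Xs).diagonal() - np.einsum(
        "ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mean, var

# Toy "simulator": misfit as a smooth function of one reservoir parameter.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (20, 1))
y = np.sin(6.0 * X[:, 0])
mean, var = gp_fit_predict(X, y, X)
print(np.mean((mean - y) ** 2) < 0.01 * np.var(y))  # proxy fits training runs
```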

  2. Noninvasive bi-graphical analysis for the quantification of slowly reversible radioligand binding

    NASA Astrophysics Data System (ADS)

    Seo, Seongho; Kim, Su Jin; Yoo, Hye Bin; Lee, Jee-Young; Kyeong Kim, Yu; Lee, Dong Soo; Zhou, Yun; Lee, Jae Sung

    2016-09-01

    In this paper, we present a novel reference-region-based (noninvasive) bi-graphical analysis for the quantification of reversible radiotracer binding that may be too slow to reach the relative equilibrium (RE) state during positron emission tomography (PET) scans. The proposed method indirectly implements the noninvasive Logan plot through an arithmetic combination of the parameters of two other noninvasive methods and the apparent tissue-to-plasma efflux rate constant for the reference region (k2′). We investigated its validity and statistical properties by performing a simulation study with various noise levels and k2′ values, and also evaluated its feasibility for [18F]FP-CIT PET in the human brain. The results revealed that the proposed approach provides distribution volume ratio estimates comparable to the Logan plot at low noise levels, while reducing the underestimation caused by the non-RE state to a degree depending on k2′. Furthermore, the proposed method was able to avoid the noise-induced bias of the Logan plot, and the variability of its results was less dependent on k2′ than that of the Logan plot. Therefore, this approach, which avoids the issues related to arterial blood sampling given a pre-estimate of k2′ (e.g. population-based), could be useful in parametric image generation for slow kinetic tracers staying in a non-RE state within a PET scan.

  3. Perfusion quantification in contrast-enhanced ultrasound (CEUS)--ready for research projects and routine clinical use.

    PubMed

    Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M

    2012-07-01

    With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using the measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated into daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most available (nonlinear contrast-enabled) ultrasound platforms, has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool that can be used to standardize perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Cross-recurrence quantification analysis of categorical and continuous time series: an R package

    PubMed Central

    Coco, Moreno I.; Dale, Rick

    2014-01-01

    This paper describes the R package crqa to perform cross-recurrence quantification analysis of two time series of either a categorical or continuous nature. Streams of behavioral information, from eye movements to linguistic elements, unfold over time. When two people interact, such as in conversation, they often adapt to each other, leading these behavioral levels to exhibit recurrent states. In dialog, for example, interlocutors adapt to each other by exchanging interactive cues: smiles, nods, gestures, choice of words, and so on. In order to capture closely the goings-on of dynamic interaction, and to uncover the extent of coupling between two individuals, we need to quantify how much recurrence is taking place at these levels. The methods available in crqa allow researchers in cognitive science to pose questions such as how recurrent two people are at some level of analysis, what the characteristic lag time is for one person to maximally match another, or whether one person is leading the other. First, we set the theoretical ground for understanding the difference between “correlation” and “co-visitation” when comparing two time series, using an aggregative or cross-recurrence approach. Then, we describe more formally the principles of cross-recurrence, and show with the current package how to carry out analyses applying them. We end the paper by comparing the computational efficiency and the consistency of results of the crqa R package with the benchmark MATLAB toolbox crptoolbox (Marwan, 2013). We show perfect comparability between the two libraries on both levels. PMID:25018736
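    The core of a cross-recurrence analysis is a binary matrix marking when one series revisits the states of the other; summary measures such as the recurrence rate are then read off that matrix. Below is a minimal numpy sketch for two short categorical streams; the package itself offers far more (delay/embedding parameters, diagonal-line measures, and so on).

```python
import numpy as np

def cross_recurrence(x, y, radius=0.0):
    # CR[i, j] = 1 when state x[i] matches state y[j] (within radius).
    x, y = np.asarray(x, float), np.asarray(y, float)
    return (np.abs(x[:, None] - y[None, :]) <= radius).astype(int)

def recurrence_rate(cr):
    # Fraction of recurrent points in the cross-recurrence plot.
    return cr.mean()

# Two categorical streams (e.g. coded gaze targets of two speakers).
a = [1, 2, 2, 3, 1, 2]
b = [2, 2, 1, 3, 2, 1]
cr = cross_recurrence(a, b)
print(round(recurrence_rate(cr), 3))  # → 0.389
```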

  5. An inexpensive and worldwide available digital image analysis technique for histological fibrosis quantification in chronic hepatitis C.

    PubMed

    Campos, C F F; Paiva, D D; Perazzo, H; Moreira, P S; Areco, L F F; Terra, C; Perez, R; Figueiredo, F A F

    2014-03-01

    Hepatic fibrosis staging is based on semiquantitative scores. Digital image analysis (DIA) appears more accurate because fibrosis is quantified on a continuous scale. However, high cost, lack of standardization and worldwide unavailability restrict its use in clinical practice. We developed an inexpensive and widely available DIA technique for fibrosis quantification in hepatitis C, and here we evaluate its reproducibility and correlation with semiquantitative scores, and determine the fibrosis percentages associated with septal fibrosis and cirrhosis. 282 needle biopsies staged by the Ishak and METAVIR scores were included. Images of trichrome-stained sections were captured and processed using Adobe® Photoshop® CS3 and Adobe® Bridge® software. The percentage of fibrosis (fibrosis index) was determined by the ratio between the fibrosis area and the total sample area, expressed in pixels and calculated in an automated way. An excellent correlation between the DIA fibrosis index and the Ishak and METAVIR scores was observed (Spearman's r = 0.95 and 0.92; P < 0.001, respectively). Excellent intra-observer reproducibility was observed in a randomly chosen subset of 39 biopsies, with an intraclass correlation index of 0.99 (95% CI, 0.95-0.99). The best cut-offs associated with septal fibrosis and cirrhosis were 6% (AUROC 0.97, 95% CI, 0.95-0.99) and 27% (AUROC 1.0, 95% CI, 0.99-1), respectively. This new DIA technique had high correlation with semiquantitative scores in hepatitis C. The method is reproducible, inexpensive and available worldwide, allowing its use in clinical practice. The incorporation of the DIA technique provides a more complete evaluation of fibrosis, adding quantification to architectural patterns. © 2013 John Wiley & Sons Ltd.
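    The fibrosis index itself is just an area ratio: stained pixels over total tissue pixels. The toy sketch below uses synthetic boolean masks, not the authors' Photoshop/Bridge pipeline.

```python
import numpy as np

def fibrosis_index(stain_mask, tissue_mask):
    # Percentage of tissue area occupied by fibrosis (boolean masks).
    return 100.0 * stain_mask.sum() / tissue_mask.sum()

# Toy 10x10 "section": 80 tissue pixels, 6 of them trichrome-positive.
tissue = np.zeros((10, 10), bool)
tissue[1:9, :] = True            # 80 tissue pixels
stain = np.zeros((10, 10), bool)
stain[1, :6] = True              # 6 fibrotic pixels
print(fibrosis_index(stain, tissue))  # → 7.5
```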

  6. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics

    PubMed Central

    Poeschl, Yvonne; Plötner, Romina

    2017-01-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. PMID:28931626

  7. Three-dimensional morphological analysis of intracranial aneurysms: a fully automated method for aneurysm sac isolation and quantification.

    PubMed

    Larrabide, Ignacio; Cruz Villa-Uriol, Maria; Cárdenes, Rubén; Pozo, Jose Maria; Macho, Juan; San Roman, Luis; Blasco, Jordi; Vivas, Elio; Marzo, Alberto; Hose, D Rod; Frangi, Alejandro F

    2011-05-01

    Morphological descriptors are practical and essential biomarkers for diagnosis and treatment selection in intracranial aneurysm management according to the current guidelines in use. Nevertheless, relatively little work has been dedicated to improving the three-dimensional quantification of aneurysmal morphology, to automating the analysis, and hence to reducing the inherent intra- and interobserver variability of manual analysis. In this paper we propose a methodology for the automated isolation and morphological quantification of saccular intracranial aneurysms based on a 3D representation of the vascular anatomy. This methodology is based on the analysis of the topology of the vasculature skeleton and the subsequent application of concepts from deformable cylinders. These are expanded inside the parent vessel to identify different regions and discriminate the aneurysm sac from the parent vessel wall. The method renders as output the surface representation of the isolated aneurysm sac, which can then be quantified automatically. The proposed method provides the means for identifying the aneurysm neck in a deterministic way. The results obtained by the method were assessed in two ways: they were compared to manual measurements obtained by three independent clinicians, as normally done during diagnosis, and to automated measurements from manually isolated aneurysms by three independent operators, nonclinicians, experts in vascular image analysis. All the measurements were obtained using in-house tools. The results were qualitatively and quantitatively compared for a set of saccular intracranial aneurysms (n = 26). Measurements performed on a synthetic phantom showed that the automated measurements obtained from manually isolated aneurysms were the most accurate. The differences between the measurements obtained by the clinicians and the manually isolated sacs were statistically significant (neck width: p < 0.001, sac height: p = 0.002). When comparing clinicians

  8. Experimental artefacts occurring during atom probe tomography analysis of oxide nanoparticles in metallic matrix: Quantification and correction

    NASA Astrophysics Data System (ADS)

    Hatzoglou, C.; Radiguet, B.; Pareige, P.

    2017-08-01

    Oxide Dispersion Strengthened (ODS) steels are promising candidates for future nuclear reactors, partly due to the fine dispersion of the nanoparticles they contain. Until now, there has been no consensus as to the nature of the nanoparticles, because their analysis pushes the techniques to their limits and, as a consequence, introduces some artefacts. In this study, the artefacts that occur during atom probe tomography analysis are quantified. The artefact quantification reveals that the particles' morphology, chemical composition and atomic density are biased. A model is suggested to correct these artefacts in order to obtain a fine and accurate characterization of the nanoparticles. This model is based on volume fraction calculation and an analytical expression of the atomic density. The studied ODS steel then reveals nanoparticles, composed purely of Y, Ti and O, with a core/shell structure. The shell is rich in Cr, and its Cr content depends on that of the matrix by a factor of 1.5. This study also shows that 15% of the atoms that were initially in the particles are not detected during the analysis; this only affects O atoms. The particle stoichiometry evolves from YTiO2 for the smallest particles observed (<2 nm) to Y2TiO5 for the biggest (>8 nm).

  9. Multiple products monitoring as a robust approach for peptide quantification.

    PubMed

    Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee

    2009-07-01

    Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because the determination of quantifiable product ions using either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous process of ion selection. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarities between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves the sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
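    One common way to score the similarity between a query MS2 spectrum and a reference spectrum is a normalized dot product over matched product-ion intensities. The sketch below illustrates that idea on invented peak lists; it is not the authors' exact scoring system, which also weighs absolute intensities.

```python
import math

def cosine_similarity(query, reference):
    # Normalized dot product between two MS2 spectra given as
    # {m/z: intensity} dicts (0 = no overlap, 1 = identical shape).
    keys = sorted(set(query) | set(reference))
    q = [query.get(k, 0.0) for k in keys]
    r = [reference.get(k, 0.0) for k in keys]
    dot = sum(a * b for a, b in zip(q, r))
    norm = math.sqrt(sum(a * a for a in q)) * math.sqrt(sum(b * b for b in r))
    return dot / norm if norm else 0.0

ref = {175.1: 100.0, 304.2: 60.0, 401.3: 30.0}   # reference product ions
query = {175.1: 90.0, 304.2: 70.0, 401.3: 20.0}  # observed MS2 scan
score = cosine_similarity(query, ref)
print(0.9 < score < 1.0)  # close, but not identical, fragmentation pattern
```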

  10. Quantification of biofilm in microtiter plates: overview of testing conditions and practical recommendations for assessment of biofilm production by staphylococci.

    PubMed

    Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip

    2007-08-01

    The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on the assessment of biofilm production by staphylococci, gained both by direct experience and by analysis of methods for assaying biofilm production. The results obtained should simplify the quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.
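    A widely used readout of such microtiter assays classifies isolates by comparing the measured optical density (OD) with a cutoff ODc derived from negative-control wells. The sketch below follows the commonly cited four-category ODc/2ODc/4ODc breakpoints; the numeric values are hypothetical, and the exact scheme should be taken from the protocol itself.

```python
def classify_biofilm(od, odc):
    # Four-category scheme based on a cutoff ODc (commonly the mean OD
    # of negative-control wells plus 3 standard deviations).
    if od <= odc:
        return "non-producer"
    if od <= 2 * odc:
        return "weak"
    if od <= 4 * odc:
        return "moderate"
    return "strong"

odc = 0.120  # hypothetical cutoff from negative-control wells
for od in (0.10, 0.20, 0.40, 0.60):
    print(od, classify_biofilm(od, odc))
```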

  11. Uncertainty quantification analysis of the dynamics of an electrostatically actuated microelectromechanical switch model

    NASA Astrophysics Data System (ADS)

    Snow, Michael G.; Bajaj, Anil K.

    2015-08-01

    This work presents an uncertainty quantification (UQ) analysis of a comprehensive model for an electrostatically actuated microelectromechanical system (MEMS) switch. The goal is to elucidate the effects of parameter variations on certain key performance characteristics of the switch. A sufficiently detailed model of the electrostatically actuated switch in the basic configuration of a clamped-clamped beam is developed. This multi-physics model accounts for various physical effects, including the electrostatic fringing field, the finite length of the electrodes, squeeze-film damping, and contact between the beam and the dielectric layer. The performance characteristics of immediate interest are the static and dynamic pull-in voltages for the switch. Numerical approaches for evaluating these characteristics are developed and described. Using Latin Hypercube Sampling and other sampling methods, the model is evaluated to find these performance characteristics when variability in the model's geometric and physical parameters is specified. Response surfaces of these results are constructed via a Multivariate Adaptive Regression Splines (MARS) technique. Using a Direct Simulation Monte Carlo (DSMC) technique on these response surfaces gives smooth probability density functions (PDFs) of the output characteristics when input probability characteristics are specified. The relative variation in the two pull-in voltages due to each of the input parameters is used to determine the critical parameters.
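    Latin Hypercube Sampling, the first step in the workflow above, stratifies each input dimension into equal-probability bins and draws exactly one point per bin, with the bins shuffled independently per dimension. A minimal numpy version (a generic sketch, not the paper's sampler):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    # One random point in each of n_samples equal-probability strata
    # per dimension, with the strata independently shuffled per column.
    u = rng.random((n_samples, n_dims))
    samples = (np.arange(n_samples)[:, None] + u) / n_samples
    for j in range(n_dims):
        rng.shuffle(samples[:, j])
    return samples

rng = np.random.default_rng(1)
pts = latin_hypercube(8, 2, rng)
# Every stratum [k/8, (k+1)/8) contains exactly one sample per dimension:
print(sorted(np.floor(pts[:, 0] * 8).astype(int).tolist()))  # → [0, 1, 2, 3, 4, 5, 6, 7]
```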

  12. Quantification of differential gene expression by multiplexed targeted resequencing of cDNA

    PubMed Central

    Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.

    2017-01-01

    Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample reagent and sequencing cost can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677
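    Single-molecule counting rests on collapsing reads that share the same random molecular tag, so that PCR duplicates count as one original cDNA molecule. A toy sketch of that idea (gene names and tags are hypothetical, and this is not the authors' pipeline):

```python
from collections import defaultdict

def count_molecules(reads):
    # reads: (gene, random_tag) pairs; identical tags within a gene
    # collapse into one original cDNA molecule.
    tags = defaultdict(set)
    for gene, tag in reads:
        tags[gene].add(tag)
    return {gene: len(t) for gene, t in tags.items()}

reads = [("GENE_A", "ACGT"), ("GENE_A", "ACGT"),  # PCR duplicates
         ("GENE_A", "TTAG"), ("GENE_B", "GGCA")]
print(count_molecules(reads))  # → {'GENE_A': 2, 'GENE_B': 1}
```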

  13. Multiple headspace-solid-phase microextraction: an application to quantification of mushroom volatiles.

    PubMed

    Costa, Rosaria; Tedone, Laura; De Grazia, Selenia; Dugo, Paola; Mondello, Luigi

    2013-04-03

    Multiple headspace solid-phase microextraction (MHS-SPME) followed by gas chromatography/mass spectrometry (GC-MS) and flame ionization detection (GC-FID) was applied to the identification and quantification of volatiles released by the mushroom Agaricus bisporus, also known as champignon. MHS-SPME allows quantitative analysis of volatiles from solid matrices to be performed free of matrix interferences. The samples analyzed were fresh mushrooms (chopped and homogenized) and mushroom-containing food dressings. 1-Octen-3-ol, 3-octanol, 3-octanone, 1-octen-3-one and benzaldehyde were common constituents of the samples analyzed. Method performance was tested through the evaluation of the limit of detection (LoD, range 0.033-0.078 ng), limit of quantification (LoQ, range 0.111-0.259 ng) and analyte recovery (92.3-108.5%). The results obtained showed quantitative differences among the samples, which can be attributed to critical factors, discussed here, such as the degree of cell damage during sample preparation. Considerations on mushroom biochemistry and on the basic principles of MHS analysis are also presented. Copyright © 2013 Elsevier B.V. All rights reserved.
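    Multiple headspace extraction rests on the peak areas of consecutive extractions decaying geometrically, so the total analyte amount follows from the first area and the decay ratio: A_total = A1 / (1 - q). The sketch below estimates q by a log-linear fit over synthetic areas; it is an illustration of the general MHE principle, not the authors' calibration procedure.

```python
import math

def mhs_total_area(areas):
    # Total peak area over infinitely many extractions, assuming
    # exponential decay A_i = A_1 * q**(i-1), so the sum is A_1/(1-q).
    # q is taken from a least-squares fit of ln(A_i) vs extraction index.
    n = len(areas)
    xs = list(range(n))
    ys = [math.log(a) for a in areas]
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    q = math.exp(slope)
    return areas[0] / (1.0 - q)

# Synthetic three-step extraction with q = 0.5:
print(round(mhs_total_area([1000.0, 500.0, 250.0]), 6))  # → 2000.0
```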

  14. The impact of carbon-13 and deuterium on relative quantification of proteins using stable isotope diethyl labeling.

    PubMed

    Koehler, Christian J; Arntzen, Magnus Ø; Thiede, Bernd

    2015-05-15

    Stable isotopic labeling techniques are useful for quantitative proteomics. A cost-effective and convenient method for diethylation by reductive amination was established, and the impact of using either carbon-13 or deuterium on quantification accuracy and precision was investigated. We established an effective approach for stable isotope labeling by diethylation of the amino groups of peptides. The approach was validated using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) and nanospray liquid chromatography/electrospray ionization (nanoLC/ESI)-ion trap/orbitrap instruments for mass spectrometric analysis, as well as MaxQuant for quantitative data analysis. Reaction conditions with low reagent costs, high yields and minor side reactions were established for diethylation. Furthermore, we showed that diethylation can be applied to up to sixplex labeling. For duplex experiments, we compared diethylation in the analysis of the proteome of HeLa cells using acetaldehyde-13C2/12C2 and acetaldehyde-2H4/1H4. Equal numbers of proteins could be identified and quantified; however, 13C4/12C4-diethylation revealed a lower variance of quantitative peptide ratios within proteins, resulting in a higher precision of quantified proteins and fewer falsely regulated proteins. The results were compared with dimethylation, which showed minor effects because of the lower number of deuteriums. The described approach for diethylation of primary amines is a cost-effective and accurate method for up to sixplex relative quantification of proteomes. 13C4/12C4-diethylation enables duplex quantification based on chemical labeling without using deuterium, which reduces the identification of false negatives and increases the quality of the quantification results. Copyright © 2015 John Wiley & Sons, Ltd.

  15. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    NASA Astrophysics Data System (ADS)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

    Methane, a key component of natural gas, has a 25x higher global warming potential than carbon dioxide on a 100-year basis. Accurately monitoring and mitigating methane emissions requires cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies, adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow for rapid and automatic leak quantification. In particular, we utilize two-stream deep Convolutional Networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation purposes by collecting about 20 videos (i.e., 397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10-60 ft. Leak sources included natural gas well-heads, separators, and tanks. All frames were labeled with a true leak size, which has eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that two-stream ConvNets provide a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet can achieve an accuracy of 65.2% by extracting important features, including texture, plume area, and pattern. The temporal stream, fed by the results of optical flow analysis, results in an accuracy of 58.3%. The integration of the two streams gives a combined accuracy of 77.6%. For future work, we will split the training and testing datasets in distinct ways in order to test the generalization of the algorithm to different leak sources. Several analytic metrics, including the confusion matrix and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help to find and fix super-emitters, and improve the cost-effectiveness of leak detection and repair.
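    The final fusion step of a two-stream network can be as simple as a weighted average of the per-class scores from the spatial and temporal streams. The sketch below uses invented scores and an assumed equal weighting; it illustrates late fusion generically, not the authors' trained networks.

```python
def fuse_streams(spatial_scores, temporal_scores, w=0.5):
    # Late fusion: weighted average of per-class scores from both streams.
    return [w * s + (1 - w) * t
            for s, t in zip(spatial_scores, temporal_scores)]

# Hypothetical class scores over three leak-size bins:
spatial = [0.1, 0.6, 0.3]   # from still plume frames
temporal = [0.2, 0.3, 0.5]  # from optical-flow motion between frames
fused = fuse_streams(spatial, temporal)
print(fused.index(max(fused)))  # predicted leak-size bin → 1
```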

  16. Intramyocellular lipid quantification: repeatability with 1H MR spectroscopy.

    PubMed

    Torriani, Martin; Thomas, Bijoy J; Halpern, Elkan F; Jensen, Megan E; Rosenthal, Daniel I; Palmer, William E

    2005-08-01

    To prospectively determine the repeatability and variability of tibialis anterior intramyocellular lipid (IMCL) quantification performed by using 1.5-T hydrogen 1 (1H) magnetic resonance (MR) spectroscopy in healthy subjects. Institutional review board approval and written informed consent were obtained for this Health Insurance Portability and Accountability Act-compliant study. The authors examined the anterior tibial muscles of 27 healthy subjects aged 19-48 years (12 men, 15 women; mean age, 25 years) by using single-voxel short-echo-time point-resolved 1H MR spectroscopy. During a first visit, the subjects underwent 1H MR spectroscopy before and after being repositioned in the magnet bore, with voxels carefully placed on the basis of osseous landmarks. Measurements were repeated after a mean interval of 12 days. All spectra were fitted by using the Java-based MR user interface (jMRUI) and LCModel software, and lipid peaks were scaled to the unsuppressed water peak (at 4.7 ppm) and the total creatine peak (at approximately 3.0 ppm). A one-way random-effects variance components model was used to determine intraday and intervisit coefficients of variation (CVs). A power analysis was performed to determine the detectable percentage change in lipid measurements for two subject sample sizes. Measurements of the IMCL methylene proton peak at a resonance of 1.3 ppm scaled to the unsuppressed water peak (IMCL(W)) that were obtained by using the jMRUI software yielded the lowest CVs overall (intraday and intervisit CVs, 13.4% and 14.4%, respectively). The random-effects variance components model revealed that nonbiologic factors (equipment and repositioning) accounted for 50% of the total variability in IMCL quantification. Power analysis for a sample size of 20 subjects revealed that changes in IMCL(W) of greater than 15% could be confidently detected between 1H MR spectroscopic measurements obtained on different days. 1H MR spectroscopy is feasible for repeatable
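    The repeatability statistics above reduce to coefficients of variation over repeated measurements. A small sketch with hypothetical readings (the study itself uses a one-way random-effects variance components model, which partitions variance further):

```python
import statistics

def coefficient_of_variation(values):
    # CV (%) = sample standard deviation / mean * 100.
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical repeated IMCL(W) readings for one subject:
scans = [2.0, 2.3, 1.9, 2.2]
print(round(coefficient_of_variation(scans), 1))  # → 8.7
```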

  17. Dynamic Coupling Between Respiratory and Cardiovascular System

    NASA Astrophysics Data System (ADS)

    Censi, Federica; Calcagnini, Giovanni; Cerutti, Sergio

    The analysis of the non-linear dynamics of the coupling among interacting quantities can be very useful for understanding cardiorespiratory and cardiovascular control mechanisms. In this chapter, recurrence plots (RP) are used to detect and quantify the degree of non-linear coupling between respiration and the spontaneous rhythms of both heart rate and blood pressure variability signals. RQA turned out to be suitable for a quantitative evaluation of the observed coupling patterns among rhythms, in both simulated and real data, providing different degrees of coupling. The results from the simulated data showed that an increased degree of coupling between the signals was marked by an increase of PR and PD, and by a decrease of ER. When RQA was applied to experimental data, PD and ER turned out to be the most significant variables, compared to PR. A remarkable finding is the detection of transient 1:2 PL episodes between respiration and the cardiovascular variability signals. This phenomenon can be associated with a sub-harmonic synchronization between the two main rhythms of the HR and BP variability series.
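    Two of the standard RQA measures can illustrate the kind of variables used in this chapter: the percentage of recurrent points, and the share of them that form diagonal lines (determinism). The sketch below applies the textbook definitions to a toy periodic series; it is a generic illustration, not the chapter's implementation, and the chapter's PR/PD/ER variables may be defined with different normalizations.

```python
import numpy as np

def recurrence_matrix(x, radius):
    # R[i, j] = 1 when the trajectory revisits itself within the radius.
    x = np.asarray(x, float)
    return (np.abs(x[:, None] - x[None, :]) <= radius).astype(int)

def percent_recurrence(rm):
    # Percentage of recurrent points, excluding the trivial main diagonal.
    n = rm.shape[0]
    return 100.0 * (rm.sum() - n) / (n * n - n)

def percent_determinism(rm, lmin=2):
    # Share of off-diagonal recurrent points lying on diagonal lines of
    # length >= lmin (upper triangle counted, then doubled by symmetry).
    n = rm.shape[0]
    total = rm.sum() - n
    if total == 0:
        return 0.0
    on_lines = 0
    for k in range(1, n):
        run = 0
        for v in list(np.diagonal(rm, k)) + [0]:
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return 100.0 * (2 * on_lines) / total

x = [0, 1, 2, 0, 1, 2, 0, 1, 2]           # perfectly periodic toy series
rm = recurrence_matrix(x, radius=0.0)
print(percent_recurrence(rm), percent_determinism(rm))  # → 25.0 100.0
```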

  18. Application of Photoshop and Scion Image analysis to quantification of signals in histochemistry, immunocytochemistry and hybridocytochemistry.

    PubMed

    Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva

    2006-02-01

    To describe a simple method for achieving the differential selection and subsequent quantification of signal strength using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry or hybridocytochemistry without the use of specific commercial image analysis systems rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal, we used the "Color range" option of the Adobe Photoshop program, which provides a specific file for a particular chromogen selection that can be applied to similar sections. The measurement of the chromogen signal strength of the specific staining is achieved with the Scion Image software program. The method described in this paper can also be applied to the simultaneous detection of different signals on the same section, or to different parameters (area of particles, number of particles, etc.) when the "Analyze particles" tool of the Scion program is used.

  19. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high-frequency guided wave interrogation. The proposed methodologies can be considered first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. The simulation data are utilized to enhance the information provided by instantaneous and local wavenumbers and to mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.

  20. Optimized approaches for quantification of drug transporters in tissues and cells by MRM proteomics.

    PubMed

    Prasad, Bhagwat; Unadkat, Jashvant D

    2014-07-01

    Drug transporter expression in tissues (in vivo) usually differs from that in cell lines used to measure transporter activity (in vitro). Therefore, quantification of transporter expression in tissues and cell lines is important for developing scaling factors for in vitro-to-in vivo extrapolation (IVIVE) of transporter-mediated drug disposition. Since traditional immunoquantification methods are semiquantitative, targeted proteomics is now emerging as a superior method to quantify proteins, including membrane transporters. This superiority derives from the selectivity, precision, accuracy, and speed of analysis by liquid chromatography tandem mass spectrometry (LC-MS/MS) in multiple reaction monitoring (MRM) mode. Moreover, LC-MS/MS proteomics has broader applicability because it does not require selective antibodies for individual proteins. There are a number of recent research and review papers that discuss the use of LC-MS/MS for transporter quantification. Here, we have compiled from the literature various elements of MRM proteomics to provide a comprehensive systematic strategy to quantify drug transporters. This review emphasizes practical aspects and challenges in surrogate peptide selection, peptide qualification, peptide synthesis and characterization, membrane protein isolation, protein digestion, sample preparation, LC-MS/MS parameter optimization, method validation, and sample analysis. In particular, bioinformatic tools used in method development and sample analysis are discussed in detail. Various pre-analytical and analytical sources of variability that should be considered during transporter quantification are highlighted. All these steps are illustrated using P-glycoprotein (P-gp) as a case example. Greater use of quantitative transporter proteomics will lead to a better understanding of the role of drug transporters in drug disposition.

  1. Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems

    DTIC Science & Technology

    2017-11-27

    ARL-TR-8218, November 2017. US Army Research Laboratory. Technical Report: Survey of Existing Uncertainty Quantification Capabilities for Army-Relevant Problems, by James J Ramsey.

  2. An Efficient Approach to Evaluate Reporter Ion Behavior from MALDI-MS/MS Data for Quantification Studies using Isobaric Tags

    PubMed Central

    Cologna, Stephanie M.; Crutchfield, Christopher A.; Searle, Brian C.; Blank, Paul S.; Toth, Cynthia L.; Ely, Alexa M.; Picache, Jaqueline A.; Backlund, Peter S.; Wassif, Christopher A.; Porter, Forbes D.; Yergey, Alfred L.

    2017-01-01

    Protein quantification, identification and abundance determination are important aspects of proteome characterization and are crucial in understanding biological mechanisms and human diseases. Different strategies are available to quantify proteins using mass spectrometric detection; most are performed at the peptide level and include both targeted and un-targeted methodologies. Discovery-based or un-targeted approaches oftentimes use covalent tagging strategies (i.e., iTRAQ®, TMT™) where reporter ion signals collected in the tandem MS experiment are used for quantification. Herein we investigate the behavior of the iTRAQ 8-plex chemistry using MALDI-TOF/TOF instrumentation. The experimental design and data analysis approach described is simple and straightforward, allowing researchers to optimize data collection and perform proper analysis within a laboratory. iTRAQ reporter ion signals were normalized within each spectrum to remove peptide biases. An advantage of this approach is that missing reporter ion values can be accepted for purposes of protein identification and quantification without the need for ANOVA analysis. We investigate the distribution of reporter ion peak areas in an equimolar system and a mock biological system and provide recommendations for establishing fold-change cutoff values at the peptide level for iTRAQ datasets. These data provide a unique dataset available to the community for informatics training and analysis. PMID:26288259
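
    A minimal sketch of the within-spectrum reporter-ion normalisation described above (an illustration, not the authors' code). Treating a missing channel as NaN rather than dropping the spectrum follows the idea that missing reporter values need not disqualify a peptide from identification:

```python
import numpy as np

def normalize_reporters(reporter_areas):
    """Normalise the 8 iTRAQ reporter-ion peak areas within one MS/MS
    spectrum so they sum to 1, removing per-peptide intensity bias.
    Missing channels are passed through as NaN rather than dropped."""
    a = np.asarray(reporter_areas, dtype=float)
    total = np.nansum(a)
    return a / total if total > 0 else a

# One spectrum's 8-plex reporter areas; channel 3 is missing
spec = [100.0, 200.0, np.nan, 100.0, 0.0, 50.0, 25.0, 25.0]
norm = normalize_reporters(spec)
```

    After normalisation, channel ratios can be compared across spectra of the same protein regardless of overall peptide ionization efficiency.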

  3. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    NASA Astrophysics Data System (ADS)

    Seebauer, Matthias

    2014-03-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate existing GHG quantification tools to comprehensively quantify GHG emissions and removals in smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology, complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and in the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals; the mitigation benefits range between 4 and 6.5 tCO2 ha-1 yr-1, differing significantly depending on the typologies of the crop-livestock systems, their agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.
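
    Farm-scale accounting of this kind ultimately reduces to activity data multiplied by emission factors and summed over source categories. In this sketch both the categories and the factor values are hypothetical placeholders, not IPCC defaults or outputs of the tools named above:

```python
def farm_emissions(activity, factors):
    """Whole-farm GHG total (tCO2e/yr) as the sum over source categories of
    activity data times emission factors. All numbers here are illustrative."""
    return sum(activity[k] * factors[k] for k in activity)

# Hypothetical smallholder farm: activity data and matching emission factors
activity = {"dairy_cattle_head": 2, "fertiliser_kgN": 50, "residue_burn_kg": 0}
factors = {"dairy_cattle_head": 2.1,   # tCO2e per head per year
           "fertiliser_kgN": 0.01,     # tCO2e per kg N applied
           "residue_burn_kg": 0.0015}  # tCO2e per kg residue burned
total = farm_emissions(activity, factors)
```

    The abstract's point about emission-factor uncertainty corresponds to varying the `factors` values between tools: the same activity survey then yields substantially different reported totals.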

  4. Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq

    PubMed Central

    Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H.; Keleş, Sündüz; Dewey, Colin N.

    2016-01-01

    RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. PMID:27405803

  5. Multi-tissue partial volume quantification in multi-contrast MRI using an optimised spectral unmixing approach.

    PubMed

    Collewet, Guylaine; Moussaoui, Saïd; Deligny, Cécile; Lucas, Tiphaine; Idier, Jérôme

    2018-06-01

    Multi-tissue partial volume estimation in MRI images is investigated from a viewpoint related to spectral unmixing as used in hyperspectral imaging. The main contribution of this paper is twofold. First, it proposes a theoretical analysis of the statistical optimality conditions of the proportion estimation problem, which, in the context of multi-contrast MRI data acquisition, allows the imaging sequence parameters to be set appropriately. Second, an efficient proportion quantification algorithm is proposed, based on the minimisation of a penalised least-squares criterion incorporating a regularity constraint on the spatial distribution of the proportions. The resulting developments are discussed using empirical simulations. The practical usefulness of the spectral unmixing approach for partial volume quantification in MRI is illustrated through an application to food analysis on the proving of a Danish pastry. Copyright © 2018 Elsevier Inc. All rights reserved.
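
    The per-voxel proportion estimation can be illustrated with a least-squares solve: each voxel's multi-contrast signal is modelled as a mixture of pure-tissue signatures. This sketch enforces the sum-to-one constraint softly via an augmented row and omits the spatial regularity penalty used in the paper:

```python
import numpy as np

def estimate_proportions(S, y, weight=1e3):
    """Least-squares estimate of tissue proportions p from multi-contrast
    signals y = S @ p. The sum-to-one constraint is enforced softly by
    appending a heavily weighted row of ones (a simplification of the
    penalised criterion; spatial regularisation is omitted)."""
    A = np.vstack([S, weight * np.ones((1, S.shape[1]))])
    b = np.append(y, weight)
    p, *_ = np.linalg.lstsq(A, b, rcond=None)
    return p

# Two tissues observed under three contrasts (columns = pure-tissue signatures)
S = np.array([[1.0, 0.2], [0.4, 1.0], [0.7, 0.5]])
p_true = np.array([0.3, 0.7])
y = S @ p_true
p_hat = estimate_proportions(S, y)
```

    The paper's optimality analysis addresses how the rows of S (i.e. the imaging sequence parameters) should be chosen so that this inverse problem is well conditioned.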

  6. TRAP: automated classification, quantification and annotation of tandemly repeated sequences.

    PubMed

    Sobreira, Tiago José P; Durham, Alan M; Gruber, Arthur

    2006-02-01

    TRAP, the Tandem Repeats Analysis Program, is a Perl program that provides a unified set of analyses for the selection, classification, quantification and automated annotation of tandemly repeated sequences. TRAP uses the results of the Tandem Repeats Finder program to perform a global analysis of the satellite content of DNA sequences, permitting researchers to easily assess the tandem repeat content for both individual sequences and whole genomes. The results can be generated in convenient formats such as HTML and comma-separated values. TRAP can also be used to automatically generate annotation data in the format of feature table and GFF files.

  7. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING LIGHT TRANSMISSION VISUALIZATION METHOD

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  8. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the US Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and on the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of guidance grounded in theory for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis

  9. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R² > 0.8, and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Quantification of Kryptofix 2.2.2 in [18F]fluorine-labelled radiopharmaceuticals by rapid-resolution liquid chromatography.

    PubMed

    Lao, Yexing; Yang, Cuiping; Zou, Wei; Gan, Manquan; Chen, Ping; Su, Weiwei

    2012-05-01

    The cryptand Kryptofix 2.2.2 is used extensively as a phase-transfer reagent in the preparation of [18F]fluoride-labelled radiopharmaceuticals. However, it has considerable acute toxicity. The aim of this study was to develop and validate a method for rapid (within 1 min), specific and sensitive quantification of Kryptofix 2.2.2 at trace levels. Chromatographic separations were carried out by rapid-resolution liquid chromatography (Agilent ZORBAX SB-C18 rapid-resolution column, 2.1 × 30 mm, 3.5 μm). Tandem mass spectra were acquired using a triple quadrupole mass spectrometer equipped with an electrospray ionization interface. Quantitative mass spectrometric analysis was conducted in positive ion mode and multiple reaction monitoring mode for the m/z 377.3 → 114.1 transition for Kryptofix 2.2.2. The external standard method was used for quantification. The method met the precision and efficiency requirements for PET radiopharmaceuticals, providing satisfactory results for specificity, matrix effect, stability, linearity (0.5-100 ng/ml, r² = 0.9975), precision (coefficient of variation < 5%), accuracy (relative error < ± 3%), sensitivity (lower limit of quantification = 0.5 ng) and detection time (<1 min). Fluorodeoxyglucose (n=6) was analysed, and the Kryptofix 2.2.2 content was found to be well below the maximum permissible levels approved by the US Food and Drug Administration. The developed method has a short analysis time (<1 min) and high sensitivity (lower limit of quantification = 0.5 ng/ml) and can be successfully applied to rapid quantification of Kryptofix 2.2.2 at trace levels in fluorodeoxyglucose. This method could also be applied to other [18F]fluorine-labelled radiopharmaceuticals that use Kryptofix 2.2.2 as a phase-transfer reagent.

  11. Quantification of polyhydroxyalkanoates in mixed and pure cultures biomass by Fourier transform infrared spectroscopy: comparison of different approaches.

    PubMed

    Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J

    2016-08-01

    Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residual values. The PHA in these cultures ranged from low to medium concentration (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass with PHA concentration ranging from low to high (0-93% of dried biomass content), the PLS method was most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range for predicting PHA content in biomass. No single universal method currently exists for processing FTIR data for PHA quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification over a medium range in pure culture. In our study, however, we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.
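
    The simplest of the three compared approaches, a univariate regression calibration, can be sketched as follows: fit a line relating a PHA-sensitive absorbance band to the GC reference values, then predict unknowns from their band height. The band heights and reference values below are invented for illustration:

```python
import numpy as np

def fit_calibration(absorbance, pha_ref):
    """Ordinary least-squares calibration line: predicted PHA (wt%) =
    slope * band absorbance + intercept. Which band to use (e.g. the ester
    carbonyl region) is an assumption of this sketch, not a claim about
    the study's wavenumber choice."""
    slope, intercept = np.polyfit(absorbance, pha_ref, 1)
    return slope, intercept

absorb = np.array([0.05, 0.10, 0.20, 0.40])   # hypothetical band heights
pha = np.array([5.0, 10.0, 20.0, 40.0])       # wt% PHA from GC reference
slope, intercept = fit_calibration(absorb, pha)
pred = slope * 0.25 + intercept               # predict an unknown sample
```

    PLS generalises this by regressing on many spectral variables at once, which is why it copes better with the wide 0-93% range reported above.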

  12. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA's accuracy is comparable to, and sometimes better than, that of standard methods on simulated as well as real RNA-sequencing data benchmarked against experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
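
    The core computation behind such tools is a relative inclusion (PSI) value derived from transcript abundances: the abundance of the isoforms containing the inclusion form of an event, over the abundance of all isoforms defining the event. A minimal sketch, with hypothetical transcript IDs:

```python
def psi(tpm, inclusion_ids, event_ids):
    """Percent-spliced-in of one splicing event from transcript-level
    abundances (TPM): TPM of inclusion isoforms divided by TPM of all
    isoforms that define the event."""
    inc = sum(tpm[t] for t in inclusion_ids)
    tot = sum(tpm[t] for t in event_ids)
    return inc / tot if tot > 0 else float("nan")

# tx1 includes the alternative exon, tx2 skips it, tx3 is not part of the event
tpm = {"tx1": 30.0, "tx2": 10.0, "tx3": 60.0}
value = psi(tpm, ["tx1"], ["tx1", "tx2"])
```

    Because this reuses transcript abundances that a fast quantifier has already produced, no read re-processing is needed per event, which is where the reported speedup comes from.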

  13. Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.

    PubMed

    Becerra, Valentina; Odermatt, Jürgen

    2013-02-01

    This article analyses the interferences in the quantification of traces of bisphenol S in paper when applying the direct analytical method "analytical pyrolysis gas chromatography mass spectrometry" (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are analysed together with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples that include bisphenol S derivative compounds. As free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of TMAH under pyrolytic conditions. In order to avoid this formation, trimethylsulphonium hydroxide (TMSH) is introduced instead. Different parameters are optimised in the development of the quantification method with TMSH. The quantification method based on TMSH thermochemolysis has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. The Non-linear Trajectory of Change in Play Profiles of Three Children in Psychodynamic Play Therapy

    PubMed Central

    Halfon, Sibel; Çavdar, Alev; Orsucci, Franco; Schiepek, Gunter K.; Andreassi, Silvia; Giuliani, Alessandro; de Felice, Giulio

    2016-01-01

    Aim: Even though there is substantial evidence that play based therapies produce significant change, the specific play processes in treatment remain unexamined. For that purpose, processes of change in long-term psychodynamic play therapy are assessed through a repeated systematic assessment of three children’s “play profiles,” which reflect patterns of organization among play variables that contribute to play activity in therapy, indicative of the children’s coping strategies, and an expression of their internal world. The main aims of the study are to investigate the kinds of play profiles expressed in treatment, and to test whether there is emergence of new and more adaptive play profiles using dynamic systems theory as a methodological framework. Methods and Procedures: Each session from the long-term psychodynamic treatment (mean number of sessions = 55) of three 6-year-old good outcome cases presenting with Separation Anxiety were recorded, transcribed and coded using items from the Children’s Play Therapy Instrument (CPTI), created to assess the play activity of children in psychotherapy, generating discrete and measurable units of play activity arranged along a continuum of four play profiles: “Adaptive,” “Inhibited,” “Impulsive,” and “Disorganized.” The play profiles were clustered through K-means Algorithm, generating seven discrete states characterizing the course of treatment and the transitions between these states were analyzed by Markov Transition Matrix, Recurrence Quantification Analysis (RQA) and odds ratios comparing the first and second halves of psychotherapy. Results: The Markov Transitions between the states scaled almost perfectly and also showed the ergodicity of the system, meaning that the child can reach any state or shift to another one in play. The RQA and odds ratios showed two trends of change, first concerning the decrease in the use of “less adaptive” strategies, second regarding the reduction of play

  15. The Non-linear Trajectory of Change in Play Profiles of Three Children in Psychodynamic Play Therapy.

    PubMed

    Halfon, Sibel; Çavdar, Alev; Orsucci, Franco; Schiepek, Gunter K; Andreassi, Silvia; Giuliani, Alessandro; de Felice, Giulio

    2016-01-01

    Aim: Even though there is substantial evidence that play based therapies produce significant change, the specific play processes in treatment remain unexamined. For that purpose, processes of change in long-term psychodynamic play therapy are assessed through a repeated systematic assessment of three children's "play profiles," which reflect patterns of organization among play variables that contribute to play activity in therapy, indicative of the children's coping strategies, and an expression of their internal world. The main aims of the study are to investigate the kinds of play profiles expressed in treatment, and to test whether there is emergence of new and more adaptive play profiles using dynamic systems theory as a methodological framework. Methods and Procedures: Each session from the long-term psychodynamic treatment (mean number of sessions = 55) of three 6-year-old good outcome cases presenting with Separation Anxiety were recorded, transcribed and coded using items from the Children's Play Therapy Instrument (CPTI), created to assess the play activity of children in psychotherapy, generating discrete and measurable units of play activity arranged along a continuum of four play profiles: "Adaptive," "Inhibited," "Impulsive," and "Disorganized." The play profiles were clustered through the K-means algorithm, generating seven discrete states characterizing the course of treatment, and the transitions between these states were analyzed by Markov Transition Matrix, Recurrence Quantification Analysis (RQA) and odds ratios comparing the first and second halves of psychotherapy. Results: The Markov Transitions between the states scaled almost perfectly and also showed the ergodicity of the system, meaning that the child can reach any state or shift to another one in play. The RQA and odds ratios showed two trends of change, first concerning the decrease in the use of "less adaptive" strategies, second regarding the reduction of play interruptions. Conclusion
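
    The RQA measures used in such studies can be illustrated on a toy series: build the recurrence matrix (pairs of time points whose values fall within a radius of each other), then compute the recurrence rate and the determinism (the fraction of recurrent points lying on diagonal lines). This sketch works on a 1-D numeric sequence without embedding, a simplification of a full RQA on coded session states:

```python
import numpy as np

def rqa(series, radius=0.1, lmin=2):
    """Recurrence rate (RR) and determinism (DET) of a 1-D series.
    RR = fraction of recurrent point pairs; DET = fraction of recurrent
    points on diagonal lines of length >= lmin (off the main diagonal)."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    R = (np.abs(x[:, None] - x[None, :]) <= radius).astype(int)
    rr = R.sum() / n**2
    diag_points = line_points = 0
    for k in range(1, n):                      # upper off-diagonals
        run = 0
        for v in list(np.diagonal(R, offset=k)) + [0]:
            if v:
                run += 1
            else:
                diag_points += run
                if run >= lmin:
                    line_points += run
                run = 0
    det = line_points / diag_points if diag_points else float("nan")
    return rr, det

# A perfectly periodic toy sequence is fully deterministic
rr, det = rqa([0, 1, 0, 1, 0, 1, 0, 1], radius=0.1)
```

    High determinism indicates that the system revisits the same trajectories of states, which is the kind of patterning RQA is used to detect in the session-by-session play profiles.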

  16. Quantification through TLC-densitometric analysis, repellency and anticholinesterase activity of the homemade extract of Indian cloves.

    PubMed

    Affonso, Raphael S; Lima, Josélia A; Lessa, Bruno M; Caetano, João V O; Obara, Marcos T; Nóbrega, Andréa B; Nepovimova, Eugenie; Musilek, Kamil; Kuca, Kamil; Slana, Gláucia B C A; França, Tanos C C

    2018-02-01

    The rise of mosquito-transmitted diseases such as dengue, Zika and chikungunya in Brazil in recent years has increased concerns about protection against mosquito bites. However, the prohibitive prices of commercially available repellents for the majority of the Brazilian population have prompted a search for cheaper solutions, such as the use of a homemade ethanolic extract of Indian clove (Syzygium aromaticum L.) as a repellent, which has been reported as quite efficient by the local press. In order to verify this, we quantified the main components of this extract through high-performance thin-layer chromatography (HPTLC)-densitometry and evaluated its efficiency as a repellent and its acetylcholinesterase (AChE) inhibition capacity. Our results prove HPTLC-densitometry to be an efficient and appropriate method for this quantification and confirm the repellency activity, as well as the extract's capacity for AChE inhibition. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Quantification of myocardial fibrosis by digital image analysis and interactive stereology

    PubMed Central

    2014-01-01

    Background Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered the ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Methods Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using the automated Colocalization and Genie software, by stereology grid count, and manually by the pathologist's visual score. Results A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and the pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r > 0.9, p < 0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained using Genie versus RVS and the pathologist score versus RVS, with mean differences of -1.61% and 2.24%, respectively. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: the Colocalization software overestimated the area fraction of fibrosis at the lower end, and underestimated it at the higher end of the RVS values. Meanwhile, the Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid
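
    The Bland-Altman comparison used above boils down to the bias (mean difference between the two methods) and its 95% limits of agreement. A minimal sketch; the fibrosis percentages below are invented for illustration:

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman bias and 95% limits of agreement between two methods:
    mean difference +/- 1.96 standard deviations of the differences."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical fibrosis area fractions (%) from software vs. stereology
software = [10.0, 14.0, 20.0, 8.0]
stereology = [12.0, 13.0, 21.0, 9.0]
bias, lo, hi = bland_altman(software, stereology)
```

    A magnitude-dependent (bidirectional) bias, as reported above, shows up as a trend when the differences are plotted against the pairwise means rather than as a constant offset.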

  18. Biodiesel production from microalgal isolates of southern Pakistan and quantification of FAMEs by GC-MS/MS analysis

    PubMed Central

    2012-01-01

    Background Microalgae have attracted major interest as a sustainable source for biodiesel production on a commercial scale. This paper describes the screening of six microalgal species, Scenedesmus quadricauda, Scenedesmus acuminatus, Nannochloropsis sp., Anabaena sp., Chlorella sp. and Oscillatoria sp., isolated from fresh and marine water resources of southern Pakistan for biodiesel production, and the GC-MS/MS analysis of their fatty acid methyl esters (FAMEs). Results The growth rate, biomass productivity and oil content of each algal species were investigated under autotrophic conditions. Biodiesel was produced from algal oil by acid-catalyzed transesterification and the resulting fatty acid methyl ester (FAME) content was analyzed by GC/MS. Fatty acid profiling of the biodiesel obtained from the various microalgal oils showed high content of C-16:0, C-18:0, cis-Δ9C-18:1, cis-Δ11C-18:1 (except Scenedesmus quadricauda) and 10-hydroxyoctadecanoic acid (except Scenedesmus acuminatus). Absolute amounts of C-14:0, C-16:0 and C-18:0, determined by a validated GC-MS/MS method, were found to be 1.5-1.7, 15.0-42.5 and 4.2-18.4 mg/g, respectively, in biodiesel obtained from the various microalgal oils. The biodiesel was also characterized in terms of cetane number, kinematic viscosity, density and higher heating value, and compared with the standard values. Conclusion Six microalgae of local origin were screened for biodiesel production. A method for absolute quantification of three important saturated fatty acid methyl esters (C-14, C-16 and C-18) by gas chromatography-tandem mass spectrometry (GC-MS/MS), using multiple reaction monitoring (MRM) mode, was employed for the identification and quantification of biodiesels obtained from the various microalgal oils. The results suggest that locally found microalgae can be sustainably harvested for the production of biodiesel. This offers a tremendous economic opportunity for an energy-deficient nation. PMID:23216896

  19. Quantification of regional fat volume in rat MRI

    NASA Astrophysics Data System (ADS)

    Sacha, Jaroslaw P.; Cockman, Michael D.; Dufresne, Thomas E.; Trokhan, Darren

    2003-05-01

    Multiple initiatives in the pharmaceutical and beauty care industries are directed at identifying therapies for weight management. Body composition measurements are critical for such initiatives. Imaging technologies that can be used to measure body composition noninvasively include DXA (dual energy x-ray absorptiometry) and MRI (magnetic resonance imaging). Unlike other approaches, MRI provides the ability to perform localized measurements of fat distribution. Several factors complicate the automatic delineation of fat regions and quantification of fat volumes. These include motion artifacts, field non-uniformity, brightness and contrast variations, chemical shift misregistration, and ambiguity in delineating anatomical structures. We have developed an approach to deal practically with these challenges. The approach is implemented in a package, the Fat Volume Tool, for automatic detection of fat tissue in MR images of the rat abdomen, including automatic discrimination between abdominal and subcutaneous regions. We suppress motion artifacts using masking based on detection of implicit landmarks in the images. Adaptive object extraction is used to compensate for intensity variations. This approach enables us to perform fat tissue detection and quantification in a fully automated manner. The package can also operate in manual mode, which can be used for verification of the automatic analysis or for performing supervised segmentation. In supervised segmentation, the operator has the ability to interact with the automatic segmentation procedures to touch up or completely overwrite intermediate segmentation steps. The operator's interventions steer the automatic segmentation steps that follow. This improves the efficiency and quality of the final segmentation. Semi-automatic segmentation tools (interactive region growing, live-wire, etc.) improve both the accuracy and throughput of the operator when working in manual mode. The quality of automatic segmentation has been

  20. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. It has been critically assessed, and requirements for method performance have been set. Nevertheless, some challenges should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use next-generation sequencing for quantitative purposes as well, although accurate quantification of GMO content using this technology remains a challenge for the future, especially for mixed samples. New approaches are also needed for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
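
    In practice, each absolute qPCR quantification reads copy numbers off a standard curve, and the GMO content is the ratio of transgene to taxon-specific reference-gene copies. A minimal sketch; the slope, intercept and Cq values are hypothetical, with slope ≈ -3.32 corresponding to ~100% PCR efficiency:

```python
def copies_from_cq(cq, slope=-3.32, intercept=37.0):
    """Invert a qPCR standard curve Cq = slope*log10(copies) + intercept.
    slope and intercept are hypothetical calibration values."""
    return 10.0 ** ((cq - intercept) / slope)

def gmo_percent(cq_target, cq_reference, **curve):
    """GMO content as the ratio of transgene copies to taxon-specific
    reference-gene copies, expressed in percent."""
    return (100.0 * copies_from_cq(cq_target, **curve)
            / copies_from_cq(cq_reference, **curve))

# transgene amplifies ~5 cycles later than the reference gene
print(round(gmo_percent(30.0, 25.0), 2))  # -> 3.12 (% GMO)
```

With a shared, near-ideal curve the ratio depends only on the Cq difference, which is why calibration quality and matched efficiencies matter so much in practice.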

  1. A simple and fast method for extraction and quantification of cryptophyte phycoerythrin.

    PubMed

    Thoisen, Christina; Hansen, Benni Winding; Nielsen, Søren Laurentius

    2017-01-01

    The microalgal pigment phycoerythrin (PE) is of commercial interest as a natural colorant in food and cosmetics, as well as a fluoroprobe for laboratory analysis. Several methods for extraction and quantification of PE are available, but they typically involve various extraction buffers, repeated freeze-thaw cycles and liquid nitrogen, making the extraction procedure complicated. A simple method for extraction of PE from cryptophytes is described using standard laboratory materials and equipment. The cryptophyte cells on the filters were disrupted at -80 °C, and phosphate buffer was added for extraction at 4 °C, followed by absorbance measurement. The cryptophyte Rhodomonas salina was used as a model organism. •Simple method for extraction and quantification of phycoerythrin from cryptophytes.•Minimal usage of equipment and chemicals, and low labor costs.•Applicable for industrial and biological purposes.
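
    The quantification step of such an extraction protocol is plain Beer-Lambert arithmetic on the measured absorbance. In the sketch below, the extinction coefficient, volumes and absorbance readings are illustrative placeholders, not values from the paper:

```python
def pe_per_ml_culture(a_sample, a_blank, ext_coeff=8.2, path_cm=1.0,
                      extract_ml=5.0, culture_ml=50.0):
    """Beer-Lambert estimate of phycoerythrin in the original culture (ug/mL).
    ext_coeff (mL mg^-1 cm^-1), volumes and readings are placeholder values."""
    mg_per_ml_extract = (a_sample - a_blank) / (ext_coeff * path_cm)
    mg_in_extract = mg_per_ml_extract * extract_ml   # total pigment extracted
    return mg_in_extract / culture_ml * 1000.0       # ug per mL of culture

print(round(pe_per_ml_culture(0.45, 0.02), 2))  # -> 5.24
```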

  2. Quantification and statistical significance analysis of group separation in NMR-based metabonomics studies

    PubMed Central

    Goodpaster, Aaron M.; Kennedy, Michael A.

    2015-01-01

    Currently, no standard metrics are used to quantify cluster separation in PCA or PLS-DA scores plots for metabonomics studies or to determine if cluster separation is statistically significant. Lack of such measures makes it virtually impossible to compare independent or inter-laboratory studies and can lead to confusion in the metabonomics literature when authors putatively identify metabolites distinguishing classes of samples based on visual and qualitative inspection of scores plots that exhibit marginal separation. While previous papers have addressed quantification of cluster separation in PCA scores plots, none have advocated routine use of a quantitative measure of separation that is supported by a standard and rigorous assessment of whether or not the cluster separation is statistically significant. Here quantification and statistical significance of separation of group centroids in PCA and PLS-DA scores plots are considered. The Mahalanobis distance is used to quantify the distance between group centroids, and the two-sample Hotelling's T2 test is computed for the data, related to an F-statistic, and then an F-test is applied to determine if the cluster separation is statistically significant. We demonstrate the value of this approach using four datasets containing various degrees of separation, ranging from groups that had no apparent visual cluster separation to groups that had no visual cluster overlap. Widespread adoption of such concrete metrics to quantify and evaluate the statistical significance of PCA and PLS-DA cluster separation would help standardize reporting of metabonomics data. PMID:26246647
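
    The distance-plus-significance recipe above is straightforward to compute. A minimal standard-library sketch for 2-D scores (PC1/PC2), illustrative rather than the authors' implementation:

```python
import math
from statistics import mean

def hotelling_2d(group1, group2):
    """Mahalanobis distance between group centroids plus two-sample
    Hotelling's T^2 for 2-D scores (e.g. PC1/PC2); returns (D, T2, F).
    Compare F against an F(2, n1 + n2 - 3) distribution for significance."""
    n1, n2 = len(group1), len(group2)
    m1 = [mean(p[0] for p in group1), mean(p[1] for p in group1)]
    m2 = [mean(p[0] for p in group2), mean(p[1] for p in group2)]
    d = [m1[0] - m2[0], m1[1] - m2[1]]
    s = [[0.0, 0.0], [0.0, 0.0]]                 # pooled covariance matrix
    for grp, m in ((group1, m1), (group2, m2)):
        for p in grp:
            dx, dy = p[0] - m[0], p[1] - m[1]
            s[0][0] += dx * dx; s[0][1] += dx * dy
            s[1][0] += dy * dx; s[1][1] += dy * dy
    s = [[v / (n1 + n2 - 2) for v in row] for row in s]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    d2 = sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))
    t2 = n1 * n2 / (n1 + n2) * d2                # Hotelling's T^2
    f = (n1 + n2 - 3) / (2.0 * (n1 + n2 - 2)) * t2
    return math.sqrt(d2), t2, f

# two toy clusters of PCA scores with clear separation
g1 = [(0, 0), (1, 0), (0, 1), (1, 1)]
g2 = [(3, 3), (4, 3), (3, 4), (4, 4)]
print(hotelling_2d(g1, g2))
```

Restricting to two dimensions keeps the matrix algebra explicit; for p-dimensional scores the same formulas hold with a p×p pooled covariance and F(p, n1 + n2 - p - 1).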

  3. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald; El-Azab, Anter; Pernice, Michael

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  4. A quantification model for the structure of clay materials.

    PubMed

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are discussed. The authors consider that the purpose of quantifying clay structure is to determine parameters that can be used to quantitatively characterize the impact of clay structure on macro-mechanical behaviour. According to system theory and the law of energy conservation, a quantification model for the structure characteristics of clay materials is established, three quantitative parameters (i.e., deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these quantitative parameters accurately reflect the influence of clay structure on deformation behaviour and strength behaviour, and the relative magnitude of the structural influence on each. These quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influence of clay on its mechanical behaviour.

  5. Artifacts Quantification of Metal Implants in MRI

    NASA Astrophysics Data System (ADS)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculating the image gradient. The artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross-entropy thresholding method. The proposed quantification method was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method from the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising for MRI acquisition parameter optimization.

  6. Illite polytype quantification using Wildfire© calculated x-ray diffraction patterns

    USGS Publications Warehouse

    Grathoff, Georg H.; Moore, D.M.

    1996-01-01

    Illite polytype quantification allows the differentiation of diagenetic and detrital illite components. In Paleozoic shales from the Illinois Basin, we observe 3 polytypes: 1Md, 1M and 2M1. 1Md and 1M are of diagenetic origin and 2M1 is of detrital origin. In this paper, we compare experimental X-ray diffraction (XRD) traces with traces calculated using WILDFIRE© and quantify mixtures of all 3 polytypes, adjusting for the effects of preferred orientation and overlapping peaks. The broad intensity (“illite hump”) around the illite 003, which is very common in illite from shales, is caused by the presence of 1Md illite and mixing of illite polytypes, and is not an artifact of sample preparation or other impurities in the sample. Illite polytype quantification provides a tool to extrapolate the K/Ar age and chemistry of the detrital and diagenetic end-members by analysis of different size fractions containing different proportions of diagenetic and detrital illite polytypes.

  7. A novel approach for quantification and analysis of the color Doppler twinkling artifact with application in noninvasive surface roughness characterization: an in vitro phantom study.

    PubMed

    Jamzad, Amoon; Setarehdan, Seyed Kamaledin

    2014-04-01

    The twinkling artifact is an undesired phenomenon within color Doppler sonograms that usually appears at the site of internal calcifications. Since the appearance of the twinkling artifact is correlated with the roughness of the calculi, noninvasive roughness estimation of internal stones may be considered a potential twinkling artifact application. This article proposes a novel quantitative approach for measurement and analysis of twinkling artifact data for roughness estimation. A phantom was developed with 7 quantified levels of roughness. The Doppler system was initially calibrated by the proposed procedure to facilitate the analysis. A total of 1050 twinkling artifact images were acquired from the phantom, and 32 novel numerical measures were introduced and computed for each image. The measures were then ranked on the basis of roughness quantification ability using different methods. The performance of the proposed twinkling artifact-based surface roughness quantification method was finally investigated for different combinations of features and classifiers. Eleven features were shown to be the most efficient numerical twinkling artifact measures in roughness characterization. The linear classifier outperformed the other methods for twinkling artifact classification. The pixel-count measures produced the best results among the categories. The sequential selection method showed higher accuracy than the other individual rankings. The best average roughness recognition accuracy of 98.33% was obtained with the first 5 principal components and the linear classifier. The proposed twinkling artifact analysis method could recognize the phantom surface roughness with an average accuracy of 98.33%. This method may also be applicable for noninvasive calculi characterization in treatment management.

  8. Integrative analysis with ChIP-seq advances the limits of transcript quantification from RNA-seq.

    PubMed

    Liu, Peng; Sanalkumar, Rajendran; Bresnick, Emery H; Keleş, Sündüz; Dewey, Colin N

    2016-08-01

    RNA-seq is currently the technology of choice for global measurement of transcript abundances in cells. Despite its successes, isoform-level quantification remains difficult because short RNA-seq reads are often compatible with multiple alternatively spliced isoforms. Existing methods rely heavily on uniquely mapping reads, which are not available for numerous isoforms that lack regions of unique sequence. To improve quantification accuracy in such difficult cases, we developed a novel computational method, prior-enhanced RSEM (pRSEM), which uses a complementary data type in addition to RNA-seq data. We found that ChIP-seq data of RNA polymerase II and histone modifications were particularly informative in this approach. In qRT-PCR validations, pRSEM was shown to be superior to competing methods in estimating relative isoform abundances within or across conditions. Data-driven simulations suggested that pRSEM has a greatly decreased false-positive rate at the expense of a small increase in false-negative rate. In aggregate, our study demonstrates that pRSEM transforms existing capacity to precisely estimate transcript abundances, especially at the isoform level. © 2016 Liu et al.; Published by Cold Spring Harbor Laboratory Press.

  9. Quantification of fungicides in snow-melt runoff from turf: A comparison of four extraction methods

    USDA-ARS?s Scientific Manuscript database

    A variety of pesticides are used to control diverse stressors to turf. These pesticides have a wide range in physical and chemical properties. The objective of this project was to develop an extraction and analysis method for quantification of chlorothalonil and PCNB (pentachloronitrobenzene), two p...

  10. Collagen Quantification in Tissue Specimens.

    PubMed

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  11. Ultra-high Performance Liquid Chromatography Tandem Mass-Spectrometry for Simple and Simultaneous Quantification of Cannabinoids

    PubMed Central

    Jamwal, Rohitash; Topletz, Ariel R.; Ramratnam, Bharat; Akhlaghi, Fatemeh

    2017-01-01

    Cannabis is used widely in the United States, both recreationally and for medical purposes. Current methods for analysis of cannabinoids in human biological specimens rely on complex extraction processes and lengthy analysis times. We established a rapid and simple assay for quantification of Δ9-tetrahydrocannabinol (THC), cannabidiol (CBD), 11-hydroxy Δ9-tetrahydrocannabinol (11-OH THC) and 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THC-COOH) in human plasma by U-HPLC-MS/MS using Δ9-tetrahydrocannabinol-D3 as the internal standard. Chromatographic separation was achieved on an Acquity BEH C18 column using a gradient comprising water (0.1% formic acid) and methanol (0.1% formic acid) over a 6 min run time. Analytes from 200 µL plasma were extracted using acetonitrile (containing 1% formic acid and THC-D3). Mass spectrometry was performed in positive ionization mode, and the total ion chromatogram was used for quantification of analytes. The assay was validated according to guidelines set forth by the Food and Drug Administration of the United States. An eight-point calibration curve was fitted with quadratic regression (r2>0.99) from 1.56 to 100 ng mL−1, and a lower limit of quantification (LLOQ) of 1.56 ng mL−1 was achieved. Accuracy and precision calculated from six calibration curves were between 85 and 115%, while the mean extraction recovery was >90% for all analytes. Several plasma phospholipids eluted after the analytes and thus did not interfere with the assay. Bench-top, freeze-thaw, auto-sampler and short-term stability ranged from 92.7 to 106.8% of nominal values. Application of the method was evaluated by quantification of analytes in human plasma from six subjects. PMID:28192758
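
    A quadratic calibration of this kind fits a second-order polynomial to the standards and inverts the curve for unknowns. A minimal least-squares sketch via the normal equations, using toy numbers rather than assay data:

```python
import math

def fit_quadratic(x, y):
    """Least-squares fit of y = a*x^2 + b*x + c via the 3x3 normal equations,
    solved by Gauss-Jordan elimination (fine for well-behaved calibration data)."""
    sx = [sum(xi ** k for xi in x) for k in range(5)]             # power sums
    sy = [sum((xi ** k) * yi for xi, yi in zip(x, y)) for k in range(3)]
    m = [[sx[4], sx[3], sx[2], sy[2]],
         [sx[3], sx[2], sx[1], sy[1]],
         [sx[2], sx[1], sx[0], sy[0]]]
    for i in range(3):
        m[i] = [v / m[i][i] for v in m[i]]                        # normalize pivot row
        for j in range(3):
            if j != i:
                m[j] = [vj - m[j][i] * vi for vj, vi in zip(m[j], m[i])]
    return m[0][3], m[1][3], m[2][3]                               # a, b, c

def conc_from_response(resp, a, b, c):
    """Invert the calibration curve, keeping the physically meaningful root."""
    return (-b + math.sqrt(b * b - 4.0 * a * (c - resp))) / (2.0 * a)

a, b, c = fit_quadratic([1, 2, 3, 4, 5], [6, 15, 28, 45, 66])      # toy standards
print(round(conc_from_response(28.0, a, b, c), 3))  # -> 3.0
```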

  12. Quantification of febuxostat polymorphs using powder X-ray diffraction technique.

    PubMed

    Qiu, Jing-bo; Li, Gang; Sheng, Yue; Zhu, Mu-rong

    2015-03-25

    Febuxostat is a pharmaceutical compound with more than 20 polymorphs, of which form A is most widely used and usually exists in a mixed polymorphic form with form G. In the present study, a quantification method for polymorphic forms A and G of febuxostat (FEB) has been developed using powder X-ray diffraction (PXRD). Prior to development of the quantification method, pure polymorphic forms A and G were characterized. A continuous scan with a scan rate of 3° min⁻¹ over an angular range of 3-40° 2θ is applied for the construction of the calibration curve using the characteristic peaks of form A at 12.78° 2θ (I/I0 = 100%) and form G at 11.72° 2θ (I/I0 = 100%). The linear regression analysis for the calibration plots shows a good linear relationship, with R² = 0.9985 with respect to peak area in the concentration range 10-60 wt.%. The method is validated for precision, recovery and ruggedness. The limits of detection and quantitation are 1.5% and 4.6%, respectively. The obtained results prove that the method is repeatable, sensitive and accurate. The proposed PXRD method can be applied for the quantitative analysis of mixtures of febuxostat polymorphs (forms A and G). Copyright © 2015 Elsevier B.V. All rights reserved.
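
    Calibration-based detection and quantitation limits like the 1.5% and 4.6% reported here are commonly derived from the regression as 3.3·σ/S and 10·σ/S (ICH-style), with σ the residual standard deviation and S the slope. A minimal sketch with made-up peak-area data:

```python
import math

def linear_fit(x, y):
    """Ordinary least squares; returns slope, intercept and residual std dev."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    intercept = my - slope * mx
    resid = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return slope, intercept, sigma

def lod_loq(slope, sigma):
    """ICH-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# toy calibration: peak area vs form-A content in wt.%
wt = [10.0, 20.0, 30.0, 40.0, 50.0, 60.0]
area = [52.0, 99.0, 153.0, 198.0, 249.0, 301.0]
slope, intercept, sigma = linear_fit(wt, area)
print(lod_loq(slope, sigma))
```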

  13. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  14. Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS

    PubMed Central

    Chitranshi, Priyanka; da Costa, Gonçalo Gamboa

    2016-01-01

    We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
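
    Single-point standard addition reduces to one ratio: spike the sample with a known concentration, measure both signals, and solve for the unknown. A minimal sketch with hypothetical peak areas (the 15 µg/mL FDA limit is the decision threshold, not an input):

```python
def standard_addition(signal_sample, signal_spiked, added_conc):
    """Single-point standard addition: C_x = C_add * S / (S_spiked - S).
    Assumes a linear response and negligible volume change on spiking."""
    return added_conc * signal_sample / (signal_spiked - signal_sample)

# hypothetical selected-ion peak areas for a soft-drink sample
conc = standard_addition(1200.0, 2000.0, 10.0)  # 10 ug/mL BVO added
print(conc)  # -> 15.0 (ug/mL), right at the US FDA limit
```

Standard addition is attractive for "dilute and shoot" methods because the spike experiences the same matrix effects as the analyte, cancelling signal suppression or enhancement.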

  15. A visual approach to efficient analysis and quantification of ductile iron and reinforced sprayed concrete.

    PubMed

    Fritz, Laura; Hadwiger, Markus; Geier, Georg; Pittino, Gerhard; Gröller, M Eduard

    2009-01-01

    This paper describes advanced volume visualization and quantification for applications in non-destructive testing (NDT), which results in novel and highly effective interactive workflows for NDT practitioners. We employ a visual approach to explore and quantify the features of interest, based on transfer functions in the parameter spaces of specific application scenarios. Examples are the orientations of fibres or the roundness of particles. The applicability and effectiveness of our approach is illustrated using two specific scenarios of high practical relevance. First, we discuss the analysis of Steel Fibre Reinforced Sprayed Concrete (SFRSpC). We investigate the orientations of the enclosed steel fibres and their distribution, depending on the concrete's application direction. This is a crucial step in assessing the material's behavior under mechanical stress, which is still in its infancy and therefore a hot topic in the building industry. The second application scenario is the designation of the microstructure of ductile cast irons with respect to the contained graphite. This corresponds to the requirements of the ISO standard 945-1, which deals with 2D metallographic samples. We illustrate how the necessary analysis steps can be carried out much more efficiently using our system for 3D volumes. Overall, we show that a visual approach with custom transfer functions in specific application domains offers significant benefits and has the potential of greatly improving and optimizing the workflows of domain scientists and engineers.

  16. Arkas: Rapid reproducible RNAseq analysis

    PubMed Central

    Colombo, Anthony R.; J. Triche Jr, Timothy; Ramsingh, Giridharan

    2017-01-01

    The recently introduced Kallisto pseudoaligner has radically simplified the quantification of transcripts in RNA-sequencing experiments. We offer the cloud-scale RNAseq pipelines Arkas-Quantification and Arkas-Analysis within Illumina’s BaseSpace cloud application platform, which expedite Kallisto preparatory routines, reliably calculate differential expression, and perform gene-set enrichment of REACTOME pathways. To address inherent inefficiencies of scale, Illumina's BaseSpace computing platform offers a massively parallel distributed environment, improving data management services and data importing. Arkas-Quantification deploys Kallisto for parallel cloud computations and is conveniently integrated downstream from the BaseSpace Sequence Read Archive (SRA) import/conversion application titled SRA Import. Arkas-Analysis annotates the Kallisto results by extracting structured information directly from source FASTA files with per-contig metadata, and calculates differential expression and gene-set enrichment on both coding genes and transcripts. The Arkas cloud pipeline supports ENSEMBL transcriptomes and can be used downstream from SRA Import, facilitating raw sequence import, SRA FASTQ conversion, RNA quantification and analysis steps. PMID:28868134
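
    Abundance estimates from pseudoaligners like Kallisto are conventionally reported in TPM, i.e. length-normalized counts rescaled to sum to one million. A minimal sketch of that normalization with toy numbers:

```python
def tpm(est_counts, eff_lengths):
    """Transcripts-per-million: divide each estimated count by its effective
    transcript length, then rescale so the values sum to 1e6."""
    rates = [c / l for c, l in zip(est_counts, eff_lengths)]
    total = sum(rates)
    return [r / total * 1e6 for r in rates]

# three toy transcripts with equal effective length
print(tpm([100, 200, 700], [1000.0, 1000.0, 1000.0]))
```

Because TPM values always sum to 1e6 within a sample, they are comparable across transcripts within that sample; cross-sample comparison still requires differential-expression modeling as in Arkas-Analysis.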

  17. The principles of quantification applied to in vivo proton MR spectroscopy.

    PubMed

    Helms, Gunther

    2008-08-01

    Following the identification of metabolite signals in the in vivo MR spectrum, quantification is the procedure to estimate numerical values of their concentrations. The two essential steps are discussed in detail: analysis by fitting a model of prior knowledge, that is, the decomposition of the spectrum into the signals of singular metabolites; then, normalization of these signals to yield concentration estimates. Special attention is given to using the in vivo water signal as internal reference.
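
    The water-referenced normalization step can be written in a few lines: scale the fitted metabolite amplitude by the water amplitude, the proton counts, and an assumed MR-visible water concentration. All numeric values below (tissue water content, relaxation correction, amplitudes) are illustrative placeholders, not recommended constants:

```python
def metabolite_concentration(s_met, s_water, protons_met,
                             water_content=0.7, relax_corr=1.0):
    """Internal-water-referenced concentration estimate in mM.
    Assumes pure water = 55510 mM with 2 protons per molecule;
    water_content and relax_corr are placeholder correction factors
    that would come from literature values and relaxation measurements."""
    visible_water_mm = 55510.0 * water_content
    return (s_met / s_water) * (2.0 / protons_met) * visible_water_mm * relax_corr

# e.g. a 3-proton methyl signal at 0.04% of the water amplitude
print(round(metabolite_concentration(4.0, 10000.0, 3), 2))  # -> ~10.36 mM
```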

  18. Quantification of micro stickies

    Treesearch

    Mahendra Doshi; Jeffrey Dyer; Salman Aziz; Kristine Jackson; Said M. Abubakr

    1997-01-01

    The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...

  19. Respiratory Mucosal Proteome Quantification in Human Influenza Infections.

    PubMed

    Marion, Tony; Elbahesh, Husni; Thomas, Paul G; DeVincenzo, John P; Webby, Richard; Schughart, Klaus

    2016-01-01

    Respiratory influenza virus infections represent a serious threat to human health. Underlying medical conditions and genetic make-up predispose some influenza patients to more severe forms of disease. To date, only a few studies have been performed in patients to correlate a selected group of cytokines and chemokines with influenza infection. Therefore, we evaluated the potential of a novel multiplex micro-proteomics technology, SOMAscan, to quantify proteins in the respiratory mucosa of influenza A and B infected individuals. The analysis included but was not limited to quantification of cytokines and chemokines detected in previous studies. SOMAscan quantified more than 1,000 secreted proteins in small nasal wash volumes from infected and healthy individuals. Our results illustrate the utility of micro-proteomic technology for analysis of proteins in small volumes of respiratory mucosal samples. Furthermore, when we compared nasal wash samples from influenza-infected patients with viral load ≥ 2⁸ and increased IL-6 and CXCL10 to healthy controls, we identified 162 differentially-expressed proteins (DEPs) between the two groups. This number greatly exceeds the number of DEPs identified in previous studies in human influenza patients. Most of the identified proteins were associated with the host immune response to infection, and changes in protein levels of 151 of the DEPs were significantly correlated with viral load. Most importantly, SOMAscan identified differentially expressed proteins heretofore not associated with respiratory influenza infection in humans. Our study is the first report of the use of SOMAscan to screen nasal secretions. It establishes a precedent for micro-proteomic quantification of proteins that reflect ongoing response to respiratory infection.
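
    Correlating each protein's level with viral load across patients, as done for the 151 DEPs, is commonly a Spearman rank correlation. A minimal dependency-free sketch (assumes no tied values; real analyses must also handle ties and multiple-testing correction):

```python
def spearman_rho(x, y):
    """Spearman rank correlation via the classic 1 - 6*sum(d^2)/(n(n^2-1))
    formula; assumes no tied values."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, idx in enumerate(order, start=1):
            r[idx] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    n = len(x)
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# hypothetical protein level rising monotonically with viral load
print(spearman_rho([1.2, 0.5, 3.1, 2.0], [8, 2, 256, 64]))  # -> 1.0
```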

  20. Respiratory Mucosal Proteome Quantification in Human Influenza Infections

    PubMed Central

    Marion, Tony; Elbahesh, Husni; Thomas, Paul G.; DeVincenzo, John P.; Webby, Richard; Schughart, Klaus

    2016-01-01

    Respiratory influenza virus infections represent a serious threat to human health. Underlying medical conditions and genetic make-up predispose some influenza patients to more severe forms of disease. To date, only a few studies have been performed in patients to correlate a selected group of cytokines and chemokines with influenza infection. Therefore, we evaluated the potential of a novel multiplex micro-proteomics technology, SOMAscan, to quantify proteins in the respiratory mucosa of influenza A and B infected individuals. The analysis included but was not limited to quantification of cytokines and chemokines detected in previous studies. SOMAscan quantified more than 1,000 secreted proteins in small nasal wash volumes from infected and healthy individuals. Our results illustrate the utility of micro-proteomic technology for analysis of proteins in small volumes of respiratory mucosal samples. Furthermore, when we compared nasal wash samples from influenza-infected patients with viral load ≥ 2⁸ and increased IL-6 and CXCL10 to healthy controls, we identified 162 differentially-expressed proteins (DEPs) between the two groups. This number greatly exceeds the number of DEPs identified in previous studies in human influenza patients. Most of the identified proteins were associated with the host immune response to infection, and changes in protein levels of 151 of the DEPs were significantly correlated with viral load. Most importantly, SOMAscan identified differentially expressed proteins heretofore not associated with respiratory influenza infection in humans. Our study is the first report of the use of SOMAscan to screen nasal secretions. It establishes a precedent for micro-proteomic quantification of proteins that reflect ongoing response to respiratory infection. PMID:27088501

  1. Development of a real-time PCR method for the differential detection and quantification of four solanaceae in GMO analysis: potato (Solanum tuberosum), tomato (Solanum lycopersicum), eggplant (Solanum melongena), and pepper (Capsicum annuum).

    PubMed

    Chaouachi, Maher; El Malki, Redouane; Berard, Aurélie; Romaniuk, Marcel; Laval, Valérie; Brunel, Dominique; Bertheau, Yves

    2008-03-26

    The labeling of products containing genetically modified organisms (GMOs) is linked to their quantification, since a threshold for the fortuitous presence of GMOs in food has been established. This threshold is calculated from a combination of two absolute quantification values: one for the specific GMO target and the second for an endogenous reference gene specific to the taxon. Thus, the development of reliable methods to quantify GMOs using endogenous reference genes in complex matrixes such as food and feed is needed. Plant identification can be difficult in the case of closely related taxa, which moreover are subject to introgression events. Based on the homology of beta-fructosidase sequences obtained from public databases, two pairs of consensus primers were designed for the detection, quantification, and differentiation of four Solanaceae: potato (Solanum tuberosum), tomato (Solanum lycopersicum), pepper (Capsicum annuum), and eggplant (Solanum melongena). Sequence variability was studied first using lines and cultivars (intraspecies sequence variability), then using taxa involved in gene introgressions, and finally using taxonomically close taxa (interspecies sequence variability). This study allowed us to design four highly specific TaqMan-MGB probes. A duplex real-time PCR assay was developed for simultaneous quantification of tomato and potato. For eggplant and pepper, only simplex real-time PCR tests were developed. The results demonstrated the high specificity and sensitivity of the assays. We therefore conclude that beta-fructosidase can be used as an endogenous reference gene for GMO analysis.

  2. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch; Gallati, Sabina, E-mail: sabina.gallati@insel.ch; Schaller, Andre, E-mail: andre.schaller@insel.ch

    2012-07-06

    Highlights: Serial qPCR accurately determines the fragmentation state of any given DNA sample. Serial qPCR demonstrates the different preservation of the nuclear and mitochondrial genomes. Serial qPCR provides a diagnostic tool to validate the integrity of bioptic material. Serial qPCR excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and mostly manifests in a tissue-specific manner. Thus handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure valid abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNaseI digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be specified more precisely, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λ_nDNA) and mtDNA (λ_mtDNA) we present an approach to possibly correct
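    The serial-amplicon idea lends itself to a simple decay-constant fit: amplifiable copies fall off with amplicon length, and the slope of the log-linear fit is the decay constant. A hedged sketch assuming the single-exponential model implied by the decay constants λ_nDNA and λ_mtDNA; the amplicon lengths and copy numbers below are synthetic:

```python
import math

def decay_constant(amplicon_lengths, measured_copies):
    """Estimate the fragmentation decay constant lambda by least-squares
    fit of ln(copies) = ln(N0) - lambda * length (single-exponential model)."""
    xs = list(amplicon_lengths)
    ys = [math.log(c) for c in measured_copies]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return -slope

# Synthetic template fragmented with lambda = 0.005 per base pair
lengths = [100, 200, 400, 800]
copies = [1e6 * math.exp(-0.005 * L) for L in lengths]
lam = decay_constant(lengths, copies)
```

    Fitting λ separately for a nuclear and a mitochondrial locus would expose the disparate degradation of the two genomes that the abstract describes.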

  3. Direct Quantification of Methane Emissions Across the Supply Chain: Identification of Mitigation Targets

    NASA Astrophysics Data System (ADS)

    Darzi, M.; Johnson, D.; Heltzel, R.; Clark, N.

    2017-12-01

    Researchers at West Virginia University's Center for Alternative Fuels, Engines, and Emissions have recently participated in a variety of studies targeted at direct quantification of methane emissions from across the natural gas supply chain. These studies included assessing methane emissions from heavy-duty vehicles and their fuel stations; active unconventional well sites, during both development and production; natural gas compression and storage facilities; natural gas engines, both large and small, two- and four-stroke; and low-throughput equipment associated with coal bed methane wells. Engine emissions were sampled using conventional instruments such as Fourier transform infrared spectrometers and heated flame ionization detection analyzers. However, to accurately quantify a wide range of other sources beyond the tailpipe (both leaks and losses), a full flow sampling system was developed, which included an integrated cavity-enhanced absorption spectrometer. Through these direct quantification efforts and analysis, major sources of methane emissions were identified. Technological solutions and best practices exist or could be developed to reduce methane emissions by focusing on the "lowest-hanging fruit." For example, engine crankcases from across the supply chain should employ vent mitigation systems to reduce methane and other emissions. An overview of the direct quantification system and various campaign measurement results will be presented, along with the identification of other targets for additional mitigation.

  4. Non-random nature of spontaneous mIPSCs in mouse auditory brainstem neurons revealed by recurrence quantification analysis

    PubMed Central

    Leao, Richardson N; Leao, Fabricio N; Walmsley, Bruce

    2005-01-01

    A change in the spontaneous release of neurotransmitter is a useful indicator of processes occurring within presynaptic terminals. Linear techniques (e.g. Fourier transform) have been used to analyse spontaneous synaptic events in previous studies, but such methods are inappropriate if the timing pattern is complex. We have investigated spontaneous glycinergic miniature synaptic currents (mIPSCs) in principal cells of the medial nucleus of the trapezoid body. The random versus deterministic (or periodic) nature of mIPSCs was assessed using recurrence quantification analysis. Nonlinear methods were then used to quantify any detected determinism in spontaneous release, and to test for chaotic or fractal patterns. Modelling demonstrated that this procedure is much more sensitive in detecting periodicities than conventional techniques. mIPSCs were found to exhibit periodicities that were abolished by blockade of internal calcium stores with ryanodine, suggesting calcium oscillations in the presynaptic inhibitory terminals. Analysis indicated that mIPSC occurrences were chaotic in nature. Furthermore, periodicities were less evident in congenitally deaf mice than in normal mice, indicating that appropriate neural activity during development is necessary for the expression of deterministic chaos in mIPSC patterns. We suggest that chaotic oscillations of mIPSC occurrences play a physiological role in signal processing in the auditory brainstem. PMID:16271982
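    Recurrence quantification analysis itself reduces to a few operations on a recurrence matrix: threshold pairwise distances, then count recurrent points and diagonal line structures. A minimal, illustrative sketch of two common RQA measures (recurrence rate and determinism) for a one-dimensional series, omitting the embedding and parameter choices a real mIPSC analysis would need:

```python
import numpy as np

def rqa_measures(x, radius, min_line=2):
    """Minimal recurrence quantification analysis of a 1-D series:
    returns recurrence rate (RR) and determinism (DET)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Recurrence matrix: pairs of points closer than `radius` recur.
    rec = np.abs(x[:, None] - x[None, :]) <= radius
    total = rec.sum()
    rr = total / (n * n)
    # DET: fraction of recurrent points on diagonal lines of length >= min_line.
    det_points = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(rec, offset=k)) + [False]:  # sentinel flushes last run
            if v:
                run += 1
            else:
                if run >= min_line:
                    det_points += run
                run = 0
    return float(rr), det_points / total

# A strongly periodic signal should show high determinism.
t = np.arange(200)
rr, det = rqa_measures(np.sin(2 * np.pi * t / 20), radius=0.1)
```

    Periodic inputs populate long diagonal lines (high DET), while random inter-event patterns leave mostly isolated recurrent points, which is the distinction the study exploits.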

  5. Lesion Quantification in Dual-Modality Mammotomography

    NASA Astrophysics Data System (ADS)

    Li, Heng; Zheng, Yibin; More, Mitali J.; Goodale, Patricia J.; Williams, Mark B.

    2007-02-01

    This paper describes a novel x-ray/SPECT dual-modality breast imaging system that provides 3D structural and functional information. While only a limited number of views on one side of the breast can be acquired due to mechanical and time constraints, we developed a technique to compensate for the limited-angle artifact in reconstructed images and accurately estimate both the lesion size and radioactivity concentration. Various angular sampling strategies were evaluated using both simulated and experimental data. It was demonstrated that quantification of lesion size to an accuracy of 10% and quantification of radioactivity to an accuracy of 20% are feasible from limited-angle data acquired with clinically practical dosage and acquisition time.

  6. Quantification of substrate and cellular strains in stretchable 3D cell cultures: an experimental and computational framework.

    PubMed

    González-Avalos, P; Mürnseer, M; Deeg, J; Bachmann, A; Spatz, J; Dooley, S; Eils, R; Gladilin, E

    2017-05-01

    The mechanical cell environment is a key regulator of biological processes. In living tissues, cells are embedded in the 3D extracellular matrix and permanently exposed to mechanical forces. Quantification of the cellular strain state in a 3D matrix is therefore the first step towards understanding how physical cues determine single-cell and multicellular behaviour. The majority of cell assays are, however, based on 2D cell cultures that lack many essential features of the in vivo cellular environment. Furthermore, nondestructive measurement of substrate and cellular mechanics requires appropriate computational tools for microscopic image analysis and interpretation. Here, we present an experimental and computational framework for generation and quantification of the cellular strain state in 3D cell cultures using a combination of a 3D substrate stretcher, multichannel microscopic imaging and computational image analysis. The 3D substrate stretcher enables deformation of living cells embedded in bead-labelled 3D collagen hydrogels. Local substrate and cell deformations are determined by tracking the displacement of fluorescent beads, with subsequent finite element interpolation of cell strains over a tetrahedral tessellation. In this feasibility study, we discuss diverse aspects of deformable 3D culture construction, quantification and evaluation, and present an example of its application for quantitative analysis of a cellular model system based on primary mouse hepatocytes undergoing transforming growth factor (TGF-β) induced epithelial-to-mesenchymal transition. © 2017 The Authors. Journal of Microscopy published by John Wiley & Sons Ltd on behalf of the Royal Microscopical Society.

  7. Phylogenetic Quantification of Intra-tumour Heterogeneity

    PubMed Central

    Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian

    2014-01-01

    Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184

  8. Laser-induced plasma characterization through self-absorption quantification

    NASA Astrophysics Data System (ADS)

    Hou, JiaJia; Zhang, Lei; Zhao, Yang; Yan, Xingyu; Ma, Weiguang; Dong, Lei; Yin, Wangbao; Xiao, Liantuan; Jia, Suotang

    2018-07-01

    A self-absorption quantification method is proposed to quantify the self-absorption degree of spectral lines, from which plasma characteristics including electron temperature, elemental concentration ratio, and absolute species number density can be deduced directly. Since no spectral intensity is involved in the calculation, the analysis results are independent of self-absorption effects and no additional spectral efficiency calibration is required. To evaluate its practicality, the limitations of application and the precision of this method are also discussed. Experimental results on an aluminum-lithium alloy show that the proposed method is qualified for semi-quantitative measurements and fast plasma characteristics diagnostics.

  9. Validation of a Sulfuric Acid Digestion Method for Inductively Coupled Plasma Mass Spectrometry Quantification of TiO2 Nanoparticles.

    PubMed

    Watkins, Preston S; Castellon, Benjamin T; Tseng, Chiyen; Wright, Moncie V; Matson, Cole W; Cobb, George P

    2018-04-13

    A consistent analytical method incorporating sulfuric acid (H2SO4) digestion and ICP-MS quantification has been developed for TiO2 quantification in biotic and abiotic environmentally relevant matrices. Sample digestion in H2SO4 at 110°C provided consistent results without using hydrofluoric acid or microwave digestion. Analysis of seven replicate samples for four matrices on each of 3 days produced Ti recoveries of 97% ± 2.5%, 91% ± 4.0%, 94% ± 1.8%, and 73% ± 2.6% (mean ± standard deviation) from water, fish tissue, periphyton, and sediment, respectively. The method demonstrated consistent performance in analysis of water samples collected over a 1-month period.

  10. Individuality Normalization when Labeling with Isotopic Glycan Hydrazide Tags (INLIGHT): A Novel Glycan Relative Quantification Strategy

    PubMed Central

    Walker, S. Hunter; Taylor, Amber D.; Muddiman, David C.

    2013-01-01

    The INLIGHT strategy for the sample preparation, data analysis, and relative quantification of N-linked glycans is presented. Glycans are derivatized with either natural (L) or stable-isotope-labeled (H) hydrazide reagents and analyzed using reversed-phase liquid chromatography coupled online to a Q Exactive mass spectrometer. A simple glycan ladder, maltodextrin, is first used to demonstrate the relative quantification strategy in samples with negligible analytical and biological variability. It is shown that after a molecular weight correction for isotopic overlap and a post-acquisition normalization of the data to account for systematic variability, a plot of the experimental H:L ratio vs. the calculated H:L ratio exhibits a correlation of unity for maltodextrin samples mixed in different ratios. We also demonstrate that the INLIGHT approach can quantify species over four orders of magnitude in ion abundance. The INLIGHT strategy is further demonstrated in pooled human plasma, where it is shown that the post-acquisition normalization is more effective than using a single spiked-in internal standard. Finally, changes in glycosylation can be detected in complex biological matrices spiked with a glycoprotein. The ability to spike in a glycoprotein and detect change at the glycan level validates both the sample preparation and the data analysis strategy, making INLIGHT an invaluable relative quantification strategy for the field of glycomics. PMID:23860851

  11. Use of a medication quantification scale for comparison of pain medication usage in patients with complex regional pain syndrome (CRPS).

    PubMed

    Gallizzi, Michael A; Khazai, Ravand S; Gagnon, Christine M; Bruehl, Stephen; Harden, R Norman

    2015-03-01

    To correlate the amount and types of pain medications prescribed to CRPS patients, using the Medication Quantification Scale, with patients' subjective pain levels. An international, multisite, retrospective review. University medical centers in the United States, Israel, Germany, and the Netherlands. A total of 89 subjects were enrolled from four different countries: 27 from the United States, 20 from Germany, 18 from the Netherlands, and 24 from Israel. The main outcome measures used were the Medication Quantification Scale III and a numerical analog pain scale. There was no statistically significant correlation noted between the Medication Quantification Scale and the visual analog scale for any site except for a moderate positive correlation at the German sites. The Medication Quantification Scale mean differences between the United States and Germany, the Netherlands, and Israel were 9.793 (P < 0.002), 10.389 (P < 0.001), and 4.984 (P = 0.303), respectively. There appears to be only a weak correlation between the amount of pain medication prescribed and patients' reported subjective pain intensity within this limited patient population. The Medication Quantification Scale is a viable tool for the analysis of pharmaceutical treatment of CRPS patients and would be useful in further prospective studies of pain medication prescription practices in the CRPS population worldwide. Wiley Periodicals, Inc.

  12. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 3

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  13. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 2

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  14. THE QUANTIFICATION OF AQUEOUS TRACERS IN LABORATORY AQUIFER MODELS USING A LIGHT TRANSMISSION VISUALIZATION METHOD - 1

    EPA Science Inventory

    The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...

  15. A Comparative Analysis of Computational Approaches to Relative Protein Quantification Using Peptide Peak Intensities in Label-free LC-MS Proteomics Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzke, Melissa M.; Brown, Joseph N.; Gritsenko, Marina A.

    2013-02-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) is widely used to identify and quantify peptides in complex biological samples. In particular, label-free shotgun proteomics is highly effective for the identification of peptides and subsequently obtaining a global protein profile of a sample. As a result, this approach is widely used for discovery studies. Typically, the objective of these discovery studies is to identify proteins that are affected by some condition of interest (e.g. disease, exposure). However, for complex biological samples, label-free LC-MS proteomics experiments measure peptides and do not directly yield protein quantities. Thus, protein quantification must be inferred from one or more measured peptides. In recent years, many computational approaches to relative protein quantification of label-free LC-MS data have been published. In this review, we examine the most commonly employed quantification approaches to relative protein abundance from peak intensity values, evaluate their individual merits, and discuss challenges in the use of the various computational approaches.
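    The peptide-to-protein inference step can be illustrated with the two simplest rollup strategies, summing and averaging peptide intensities per protein. A hypothetical sketch, not a specific method from the review:

```python
from collections import defaultdict

def protein_abundance(peptide_rows, method="sum"):
    """Roll peptide peak intensities up to relative protein abundances.
    peptide_rows: iterable of (protein_id, intensity) pairs.
    method: 'sum' (total peptide intensity) or 'mean' (average intensity),
    two of the simplest rollup strategies such reviews compare."""
    groups = defaultdict(list)
    for protein, intensity in peptide_rows:
        groups[protein].append(intensity)
    if method == "sum":
        return {p: sum(v) for p, v in groups.items()}
    return {p: sum(v) / len(v) for p, v in groups.items()}

rows = [("P1", 1.0e6), ("P1", 3.0e6), ("P2", 2.0e6)]
by_sum = protein_abundance(rows)           # P1 -> 4.0e6
by_mean = protein_abundance(rows, "mean")  # P1 -> 2.0e6
```

    The choice of rollup matters: summing rewards proteins with many observed peptides, while averaging is sensitive to low-intensity outlier peptides, which is one of the trade-offs such comparisons evaluate.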

  16. Quantification of phytochelatins in Chlamydomonas reinhardtii using ferrocene-based derivatization.

    PubMed

    Bräutigam, Anja; Bomke, Susanne; Pfeifer, Thorben; Karst, Uwe; Krauss, Gerd-Joachim; Wesenberg, Dirk

    2010-08-01

    A method for the identification and quantification of canonical phytochelatins (PCs) and their isoforms from Chlamydomonas reinhardtii was developed. After disulfide reduction with tris(2-carboxyethyl)phosphine (TCEP), PCs were derivatized with ferrocenecarboxylic acid (2-maleimidoyl)ethylamide (FMEA) in order to avoid oxidation of the free thiol functions during analysis. Liquid chromatography (LC) coupled to electrospray ionisation mass spectrometry (ESI-MS) and inductively coupled plasma mass spectrometry (ICP-MS) was used for rapid and quantitative analysis of the precolumn-derivatized PCs. PC(2-4), CysGSH, CysPC(2-4), CysPC(2)desGly, CysPC(2)Glu and CysPC(2)Ala were determined in the algal samples depending on the exposure of the cells to cadmium ions.

  17. Quantification of protein carbonylation.

    PubMed

    Wehr, Nancy B; Levine, Rodney L

    2013-01-01

    Protein carbonylation is the most commonly used measure of oxidative modification of proteins. It is most often measured spectrophotometrically or immunochemically by derivatizing proteins with the classical carbonyl reagent 2,4-dinitrophenylhydrazine (DNPH). We present protocols for the derivatization and quantification of protein carbonylation with these two methods, including a newly described dot blot with greatly increased sensitivity.

  18. Quantification of fructo-oligosaccharides based on the evaluation of oligomer ratios using an artificial neural network.

    PubMed

    Onofrejová, Lucia; Farková, Marta; Preisler, Jan

    2009-04-13

    The application of an internal standard in quantitative analysis is desirable in order to correct for variations in sample preparation and instrumental response. In mass spectrometry of organic compounds, the internal standard is preferably labelled with a stable isotope, such as (18)O, (15)N or (13)C. In this study, a method for the quantification of fructo-oligosaccharides using matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI TOF MS) was proposed and tested on raftilose, a partially hydrolysed inulin with a degree of polymerization of 2-7. The tetrasaccharide nystose, which is chemically identical to the raftilose tetramer, was used as an internal standard rather than an isotope-labelled analyte. Two mathematical approaches used for data processing, conventional calculations and artificial neural networks (ANN), were compared. The conventional data processing relies on the assumption of a constant oligomer dispersion profile that changes predictably after the addition of the internal standard, followed by some simple numerical calculations. On the other hand, ANN was found to compensate for a non-linear MALDI response and variations in the oligomer dispersion profile with raftilose concentration. As a result, the application of ANN led to lower quantification errors and excellent day-to-day repeatability compared to the conventional data analysis. The developed method is feasible for MS quantification of raftilose in the range of 10-750 pg with errors below 7%. The content of raftilose was determined in dietary cream; the application can be extended to other similar polymers. It should be stressed that no special optimisation of the MALDI process was carried out. A common MALDI matrix and sample preparation were used, and only the basic parameters, such as sampling and laser energy, were optimised prior to quantification.
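    The "conventional calculations" route can be sketched for an internal standard chemically identical to one oligomer: spiking nystose shifts the tetramer's intensity relative to its neighbours, and the analyte amounts follow from that shift. A hypothetical sketch assuming response factors cancel between neighbouring oligomers; the function name and all numbers are invented for illustration:

```python
def quantify_with_identical_standard(ratio_before, ratio_after, spike_amount):
    """Quantify two oligomers when the internal standard is chemically
    identical to one of them (nystose = raftilose tetramer): the spike
    shifts the tetramer/trimer intensity ratio, and the analyte amounts
    follow from that shift, assuming equal response factors.
    Returns (trimer_amount, tetramer_amount) in the spike's units."""
    trimer = spike_amount / (ratio_after - ratio_before)
    tetramer = ratio_before * trimer
    return trimer, tetramer

# Invented numbers: ratio 0.8 before and 1.3 after a 100-pg nystose spike
trimer, tetramer = quantify_with_identical_standard(0.8, 1.3, 100.0)
```

    The ANN approach exists precisely because these cancellation assumptions break down when the MALDI response is non-linear.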

  19. Quantification of the memory imprint effect for a charged particle environment

    NASA Technical Reports Server (NTRS)

    Bhuva, B. L.; Johnson, R. L., Jr.; Gyurcsik, R. S.; Kerns, S. E.; Fernald, K. W.

    1987-01-01

    The effects of total accumulated dose on the single-event vulnerability of NMOS resistive-load SRAMs are investigated. The bias-dependent shifts in device parameters can imprint the memory state present during exposure or erase the imprinted state. Analysis of these effects is presented along with an analytic model developed for the quantification of these effects. The results indicate that the imprint effect is dominated by the difference in the threshold voltage of the n-channel devices.

  20. Quantification of steroid hormones in human serum by liquid chromatography-high resolution tandem mass spectrometry.

    PubMed

    Matysik, Silke; Liebisch, Gerhard

    2017-12-01

    A limited specificity is inherent to immunoassays for steroid hormone analysis. To improve selectivity, mass spectrometric analysis of steroid hormones by liquid chromatography-tandem mass spectrometry (LC-MS/MS) has been introduced in the clinical laboratory over the past years, usually with low-mass-resolution triple-quadrupole instruments or, more recently, by high resolution mass spectrometry (HR-MS). Here we introduce liquid chromatography-high resolution tandem mass spectrometry (LC-MS/HR-MS) to further increase the selectivity of steroid hormone quantification. Application of HR-MS demonstrates an enhanced selectivity compared to low mass resolution. Separation of isobaric interferences reduces background noise and avoids overestimation. Samples were prepared by automated liquid-liquid extraction with MTBE. The LC-MS/HR-MS method using a quadrupole-Orbitrap analyzer includes eight steroid hormones, i.e. androstenedione, corticosterone, cortisol, cortisone, 11-deoxycortisol, 17-hydroxyprogesterone, progesterone, and testosterone. It has a run time of 5.3 min and was validated according to the U.S. Food and Drug Administration (FDA) and the European Medicines Agency (EMA) guidelines. For most of the analytes, coefficients of variation were 10% or lower and LOQs were determined significantly below 1 ng/ml. Full product ion spectra including accurate masses substantiate compound identification by matching their masses and ratios with authentic standards. In summary, quantification of steroid hormones by LC-MS/HR-MS is applicable for clinical diagnostics and also holds promise for highly selective quantification of other small molecules. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.

    PubMed

    Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P

    2016-07-01

    Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and by budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid the challenges of obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the use of the Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied for analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m(-3), respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
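    The standard-addition trick used to sidestep 'zero' air can be sketched as a linear regression: spike known amounts into the already-polluted matrix and extrapolate the fitted line back to zero response. A minimal sketch with invented spike levels and responses; the paper's actual slope-factor workflow may differ:

```python
def standard_addition_concentration(added, responses):
    """Standard-addition calibration: regress response on spiked
    concentration; the magnitude of the x-intercept is the concentration
    already present in the sampled (inherently polluted) air."""
    n = len(added)
    mx = sum(added) / n
    my = sum(responses) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(added, responses)) / \
            sum((x - mx) ** 2 for x in added)
    intercept = my - slope * mx
    return intercept / slope

# Invented benzene spikes (µg/m^3) and instrument responses
c0 = standard_addition_concentration([0, 50, 100, 150], [50, 100, 150, 200])
```

    The fitted slope doubles as the external-calibration slope factor, which is why the authors could compare slope factors from lab and outdoor air.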

  2. Self-optimized construction of transition rate matrices from accelerated atomistic simulations with Bayesian uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Swinburne, Thomas D.; Perez, Danny

    2018-05-01

    A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.
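    The Bayesian residence-time estimate at the heart of such schemes can be illustrated for a single state. A deliberately simplified sketch assuming Poisson escape statistics and a conjugate Gamma prior, not the paper's full Markov-model machinery; the counts and times are invented:

```python
def posterior_escape_rate(n_escapes, observed_time, prior_shape=1.0, prior_rate=0.0):
    """Bayesian escape-rate estimate for a single state: with Poisson
    escape counts and a Gamma(shape, rate) prior, the posterior is
    Gamma(shape + n, rate + t). Returns the posterior mean rate and the
    implied expected residence time."""
    shape = prior_shape + n_escapes
    rate = prior_rate + observed_time
    mean_rate = shape / rate
    return mean_rate, 1.0 / mean_rate

# 4 observed escapes over 2.0 ns of accelerated-MD trajectory (invented)
k, tau = posterior_escape_rate(4, 2.0)
```

    The posterior spread (not shown) is what drives the on-the-fly decisions about where to keep exploring, since poorly sampled states have wide rate posteriors.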

  3. Quantification of sunscreen ethylhexyl triazone in topical skin-care products by normal-phase TLC/densitometry.

    PubMed

    Sobanska, Anna W; Pyzowski, Jaroslaw

    2012-01-01

    Ethylhexyl triazone (ET) was separated from other sunscreens such as avobenzone, octocrylene, octyl methoxycinnamate, and diethylamino hydroxybenzoyl hexyl benzoate, and from parabens, by normal-phase HPTLC on silica gel 60 as the stationary phase. Two mobile phases were particularly effective: (A) cyclohexane-diethyl ether 1 : 1 (v/v) and (B) cyclohexane-diethyl ether-acetone 15 : 1 : 2 (v/v/v), since apart from ET analysis they facilitated separation and quantification of the other sunscreens present in the formulations. Densitometric scanning was performed at 300 nm. Calibration curves for ET were nonlinear (second-degree polynomials), with R > 0.998. For both mobile phases, limits of detection (LOD) were 0.03 μg spot(-1) and limits of quantification (LOQ) were 0.1 μg spot(-1). Both methods were validated.

  4. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improves Skyline's accuracy and precision 2-fold for the lowest abundant dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.

  5. A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.

    PubMed

    Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo

    2008-01-01

    Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4(+) T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within 1 d from blood withdrawal.
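    The assay's final stage, combining cellular DNA quantification with HIV-1 target quantification, amounts to normalizing proviral copies to a cell count derived from a diploid reference gene. A hypothetical sketch of that normalization step (function name and numbers are illustrative, not from the protocol):

```python
def proviral_load_per_million(hiv_copies, reference_gene_copies, copies_per_cell=2):
    """Normalize HIV-1 proviral copies to one million cells, deriving the
    cell count from a diploid cellular reference gene (2 copies/cell)."""
    cells = reference_gene_copies / copies_per_cell
    return hiv_copies * 1e6 / cells

# e.g. 50 proviral copies alongside 200,000 copies of a diploid cellular gene
load = proviral_load_per_million(50, 200000)  # 500 copies per 10^6 cells
```

    Reporting per cell equivalents rather than per reaction is what makes results comparable across crude cellular extracts with varying DNA input.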

  6. Quantification and analysis of color stability based on thermal transient behavior in white LED lamps.

    PubMed

    Nisa Khan, M

    2017-09-20

    We present measurement and analysis of color stability over time for two categories of white LED lamps based on their thermal management scheme, which also affects their transient lumen depreciation. We previously reported that lumen depreciation in LED lamps can be minimized by properly designing the heat sink configuration so that lamps reach a thermal equilibrium condition quickly. Although it is well known that lumen depreciation degrades the color stability of white light, since color coordinates vary with total lumen power by definition, quantification and characterization of color shifts based on thermal transient behavior have not previously been reported in the literature for LED lamps. Here we provide experimental data and analysis of transient color shifts for two categories of household LED lamps (from a total of six lamps in two categories) and demonstrate that reaching thermal equilibrium more quickly provides better stability of color rendering and color temperature, and less deviation of the color coordinates from the Planckian blackbody locus line, all of which are very important color characterization parameters for white light. We report for the first time that a lamp's color degradation from the turn-on time primarily depends on the thermal transient behavior of the semiconductor LED chip, which experiences a wavelength shift as well as a decrease in its dominant wavelength peak value with time, which in turn degrades the phosphor conversion. For the first time, we also provide a comprehensive quantitative analysis that differentiates color degradation due to the heat rise in GaN/GaInN LED chips, and subsequently in the boards these chips are mounted on, from that caused by phosphor heating in a white LED module. Finally, we briefly discuss why there are some inevitable trade-offs between omnidirectionality and color and luminous output stability in current household LED lamps and what will help eliminate these trade-offs in future lamp designs.

  7. Automatic analysis and quantification of fluorescently labeled synapses in microscope images

    NASA Astrophysics Data System (ADS)

    Yona, Shai; Katsman, Alex; Orenbuch, Ayelet; Gitler, Daniel; Yitzhaky, Yitzhak

    2011-09-01

    The purpose of this work is to classify and quantify synapses and their properties in mouse hippocampal cultures, from images acquired by a fluorescence microscope. Quantification features include the number of synapses, their intensity, and their size characteristics. The images obtained by the microscope contain hundreds to several thousands of synapses with various elliptic-like shape features and intensities. These images also include other features, such as glia cells and other biological objects beyond the focus plane; those features reduce the visibility of the synapses and disrupt the segmentation process. The proposed method comprises several steps: background subtraction, identification of suspected centers of synapses as local maxima of small neighborhoods, evaluation of the tendency of objects to be synapses according to intensity properties in their larger neighborhoods, classification of detected synapses into categories as bulks or single synapses and, finally, delimiting the borders of each synapse.
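The "local maxima of small neighborhoods" step can be sketched as follows; the 3x3 window and intensity threshold are illustrative choices, not the authors' parameters:

```python
import numpy as np

def local_maxima(img, threshold):
    """Flag pixels that exceed `threshold` and all 8 neighbours.

    Minimal stand-in for detecting suspected synapse centers as local
    maxima of small neighborhoods; border pixels are ignored.
    """
    c = img[1:-1, 1:-1]
    mask = c > threshold
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            mask &= c >= img[1 + di:img.shape[0] - 1 + di,
                             1 + dj:img.shape[1] - 1 + dj]
    out = np.zeros_like(img, dtype=bool)
    out[1:-1, 1:-1] = mask
    return out

img = np.zeros((5, 5))
img[2, 2] = 10.0   # one bright synapse-like spot
peaks = local_maxima(img, threshold=5.0)
print(np.argwhere(peaks))  # [[2 2]]
```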

  8. Proteomic Analysis of Sauvignon Blanc Grape Skin, Pulp and Seed and Relative Quantification of Pathogenesis-Related Proteins.

    PubMed

    Tian, Bin; Harrison, Roland; Morton, James; Deb-Choudhury, Santanu

    2015-01-01

    Thaumatin-like proteins (TLPs) and chitinases are the main constituents of the so-called protein hazes that can form in finished white wine and are a great concern to winemakers. These soluble pathogenesis-related (PR) proteins are extracted from grape berries. However, their distribution in different grape tissues is not well documented. In this study, proteins were first separately extracted from the skin, pulp and seed of Sauvignon Blanc grapes, followed by trypsin digestion and analysis by liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS). Proteins identified included 75 proteins from Sauvignon Blanc grape skin, 63 from grape pulp and 35 from grape seed, mostly functionally classified as associated with metabolism and energy. Some were present exclusively in specific grape tissues; for example, proteins involved in photosynthesis were only detected in grape skin and proteins found in alcoholic fermentation were only detected in grape pulp. Moreover, proteins identified in grape seed were less diverse than those identified in grape skin and pulp. TLPs and chitinases were identified in both Sauvignon Blanc grape skin and pulp, but not in the seed. To relatively quantify the PR proteins, the protein extracts of grape tissues were first separated by HPLC and then analysed by SDS-PAGE. This analysis showed that the protein fractions eluted at 9.3 min and 19.2 min under the chromatographic conditions of this study corresponded to TLPs and chitinases, respectively. Thus, the relative quantification of TLPs and chitinases in protein extracts was carried out by comparing the area of the corresponding peaks against the area of a thaumatin standard.
The results presented in this study clearly demonstrated the distribution of haze-forming PR proteins in grape berries, and the relative quantification of TLPs and chitinases could be applied in fast tracking of changes in PR proteins during grape growth and determination of PR
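The peak-area comparison against a thaumatin standard reduces to a simple ratio; the numbers below are invented for illustration:

```python
def relative_amount(peak_area, standard_area, standard_ug):
    """Relative quantification against a thaumatin standard:
    amount taken as proportional to the HPLC peak-area ratio.
    Illustrative only; assumes equal detector response."""
    return peak_area / standard_area * standard_ug

# A TLP-containing fraction with twice the standard's peak area
print(relative_amount(2.4e5, 1.2e5, standard_ug=10.0))  # 20.0
```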

  9. Proteomic Analysis of Sauvignon Blanc Grape Skin, Pulp and Seed and Relative Quantification of Pathogenesis-Related Proteins

    PubMed Central

    Tian, Bin; Harrison, Roland; Morton, James; Deb-Choudhury, Santanu

    2015-01-01

    Thaumatin-like proteins (TLPs) and chitinases are the main constituents of so-called protein hazes which can form in finished white wine and which is a great concern of winemakers. These soluble pathogenesis-related (PR) proteins are extracted from grape berries. However, their distribution in different grape tissues is not well documented. In this study, proteins were first separately extracted from the skin, pulp and seed of Sauvignon Blanc grapes, followed by trypsin digestion and analysis by liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS). Proteins identified included 75 proteins from Sauvignon Blanc grape skin, 63 from grape pulp and 35 from grape seed, mostly functionally classified as associated with metabolism and energy. Some were present exclusively in specific grape tissues; for example, proteins involved in photosynthesis were only detected in grape skin and proteins found in alcoholic fermentation were only detected in grape pulp. Moreover, proteins identified in grape seed were less diverse than those identified in grape skin and pulp. TLPs and chitinases were identified in both Sauvignon Blanc grape skin and pulp, but not in the seed. To relatively quantify the PR proteins, the protein extracts of grape tissues were seperated by HPLC first and then analysed by SDS-PAGE. The results showed that the protein fractions eluted at 9.3 min and 19.2 min under the chromatographic conditions of this study confirmed that these corresponded to TLPs and chitinases seperately. Thus, the relative quantification of TLPs and chitinases in protein extracts was carried out by comparing the area of corresponding peaks against the area of a thamautin standard. 
The results presented in this study clearly demonstrated the distribution of haze-forming PR proteins in grape berries, and the relative quantification of TLPs and chitinases could be applied in fast tracking of changes in PR proteins during grape growth and determination of PR

  10. Expedited quantification of mutant ribosomal RNA by binary deoxyribozyme (BiDz) sensors.

    PubMed

    Gerasimova, Yulia V; Yakovchuk, Petro; Dedkova, Larisa M; Hecht, Sidney M; Kolpashchikov, Dmitry M

    2015-10-01

    Mutations in ribosomal RNA (rRNA) have traditionally been detected by the primer extension assay, which is a tedious and multistage procedure. Here, we describe a simple and straightforward fluorescence assay based on binary deoxyribozyme (BiDz) sensors. The assay uses two short DNA oligonucleotides that hybridize specifically to adjacent fragments of rRNA, one of which contains a mutation site. This hybridization results in the formation of a deoxyribozyme catalytic core that produces the fluorescent signal and amplifies it due to multiple rounds of catalytic action. This assay enables us to expedite semi-quantification of mutant rRNA content in cell cultures starting from whole cells, which provides information useful for optimization of culture preparation prior to ribosome isolation. The method requires less than a microliter of a standard Escherichia coli cell culture and decreases analysis time from several days (for primer extension assay) to 1.5 h with hands-on time of ∼10 min. It is sensitive to single-nucleotide mutations. The new assay simplifies the preliminary analysis of RNA samples and cells in molecular biology and cloning experiments and is promising in other applications where fast detection/quantification of specific RNA is required. © 2015 Gerasimova et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  11. Computed tomographic-based quantification of emphysema and correlation to pulmonary function and mechanics.

    PubMed

    Washko, George R; Criner, Gerald J; Mohsenifar, Zab; Sciurba, Frank C; Sharafkhaneh, Amir; Make, Barry J; Hoffman, Eric A; Reilly, John J

    2008-06-01

    Computed tomographic-based indices of emphysematous lung destruction may highlight differences in disease pathogenesis and further enable the classification of subjects with Chronic Obstructive Pulmonary Disease. While there are multiple techniques that can be utilized for such radiographic analysis, there is very little published information comparing the performance of these methods in a clinical case series. Our objective was to examine several quantitative and semi-quantitative methods for the assessment of the burden of emphysema apparent on computed tomographic scans and compare their ability to predict lung mechanics and function. Automated densitometric analysis was performed on 1094 computed tomographic scans collected upon enrollment into the National Emphysema Treatment Trial. Trained radiologists performed an additional visual grading of emphysema on high resolution CT scans. Full pulmonary function test results were available for correlation, with a subset of subjects having additional measurements of lung static recoil. There was a wide range of emphysematous lung destruction apparent on the CT scans and univariate correlations to measures of lung function were of modest strength. No single method of CT scan analysis clearly outperformed the rest of the group. Quantification of the burden of emphysematous lung destruction apparent on CT scan is a weak predictor of lung function and mechanics in severe COPD with no uniformly superior method found to perform this analysis. The CT-based quantification of emphysema may augment pulmonary function testing in the characterization of COPD by providing complementary phenotypic information.
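A typical densitometric index of the kind described is the percentage of low-attenuation lung voxels. A sketch: the -950 HU cutoff is a widely used convention for emphysema (%LAA-950), not necessarily the setting used in the trial:

```python
import numpy as np

def emphysema_index(hu_values, threshold=-950):
    """Percent of lung voxels below a density threshold, the usual
    low-attenuation-area index (%LAA). Illustrative threshold."""
    hu = np.asarray(hu_values)
    return 100.0 * np.mean(hu < threshold)

voxels = np.array([-980, -960, -940, -900, -870, -990])  # toy HU values
print(emphysema_index(voxels))  # 50.0
```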

  12. Droplet Digital Enzyme-Linked Oligonucleotide Hybridization Assay for Absolute RNA Quantification.

    PubMed

    Guan, Weihua; Chen, Liben; Rane, Tushar D; Wang, Tza-Huei

    2015-09-03

    We present a continuous-flow droplet-based digital Enzyme-Linked Oligonucleotide Hybridization Assay (droplet digital ELOHA) for sensitive detection and absolute quantification of RNA molecules. Droplet digital ELOHA incorporates direct hybridization and a single enzyme reaction via the formation of a single probe-RNA-probe (enzyme) complex on magnetic beads. It enables RNA detection without reverse transcription or PCR amplification. The magnetic beads are subsequently encapsulated into a large number of picoliter-sized droplets with enzyme substrates in a continuous-flow device. This device is capable of generating droplets at high throughput. It also integrates in-line enzymatic incubation and detection of fluorescent products. Our droplet digital ELOHA is able to accurately quantify (differentiating a 40% difference) as few as ~600 RNA molecules in a 1 mL sample (equivalent to 1 aM or lower) without molecular replication. The absolute quantification ability of droplet digital ELOHA is demonstrated with the analysis of clinical Neisseria gonorrhoeae 16S rRNA to show its potential value in real complex samples.
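Digital assays of this kind conventionally convert the fraction of positive partitions into an absolute molecule count with a Poisson correction. A minimal sketch; this correction is standard for digital assays in general, not a formula quoted from the paper:

```python
import math

def molecules_per_droplet(positive, total):
    """Poisson-corrected mean occupancy for a digital assay:
    lambda = -ln(1 - p), where p is the positive-droplet fraction."""
    p = positive / total
    return -math.log(1.0 - p)

def total_molecules(positive, total_droplets):
    """Estimated number of molecules in the partitioned sample."""
    return molecules_per_droplet(positive, total_droplets) * total_droplets

# 500 positive droplets out of 10,000
print(round(total_molecules(500, 10_000)))  # ~513
```

The correction accounts for droplets that received more than one molecule, which matters as occupancy rises.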

  13. Droplet Digital Enzyme-Linked Oligonucleotide Hybridization Assay for Absolute RNA Quantification

    PubMed Central

    Guan, Weihua; Chen, Liben; Rane, Tushar D.; Wang, Tza-Huei

    2015-01-01

    We present a continuous-flow droplet-based digital Enzyme-Linked Oligonucleotide Hybridization Assay (droplet digital ELOHA) for sensitive detection and absolute quantification of RNA molecules. Droplet digital ELOHA incorporates direct hybridization and a single enzyme reaction via the formation of a single probe-RNA-probe (enzyme) complex on magnetic beads. It enables RNA detection without reverse transcription or PCR amplification. The magnetic beads are subsequently encapsulated into a large number of picoliter-sized droplets with enzyme substrates in a continuous-flow device. This device is capable of generating droplets at high throughput. It also integrates in-line enzymatic incubation and detection of fluorescent products. Our droplet digital ELOHA is able to accurately quantify (differentiating a 40% difference) as few as ~600 RNA molecules in a 1 mL sample (equivalent to 1 aM or lower) without molecular replication. The absolute quantification ability of droplet digital ELOHA is demonstrated with the analysis of clinical Neisseria gonorrhoeae 16S rRNA to show its potential value in real complex samples. PMID:26333806

  14. Droplet Digital Enzyme-Linked Oligonucleotide Hybridization Assay for Absolute RNA Quantification

    NASA Astrophysics Data System (ADS)

    Guan, Weihua; Chen, Liben; Rane, Tushar D.; Wang, Tza-Huei

    2015-09-01

    We present a continuous-flow droplet-based digital Enzyme-Linked Oligonucleotide Hybridization Assay (droplet digital ELOHA) for sensitive detection and absolute quantification of RNA molecules. Droplet digital ELOHA incorporates direct hybridization and a single enzyme reaction via the formation of a single probe-RNA-probe (enzyme) complex on magnetic beads. It enables RNA detection without reverse transcription or PCR amplification. The magnetic beads are subsequently encapsulated into a large number of picoliter-sized droplets with enzyme substrates in a continuous-flow device. This device is capable of generating droplets at high throughput. It also integrates in-line enzymatic incubation and detection of fluorescent products. Our droplet digital ELOHA is able to accurately quantify (differentiating a 40% difference) as few as ~600 RNA molecules in a 1 mL sample (equivalent to 1 aM or lower) without molecular replication. The absolute quantification ability of droplet digital ELOHA is demonstrated with the analysis of clinical Neisseria gonorrhoeae 16S rRNA to show its potential value in real complex samples.

  15. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    PubMed

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
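Comparative resistome studies typically normalize AR gene read counts for gene length and sequencing depth, since the abstract notes both factors affect detected AR levels. A hedged sketch of one common normalization (RPKM-style; the paper's exact formula may differ):

```python
def ar_gene_abundance(mapped_reads, gene_length_bp, total_reads):
    """Length- and depth-normalized AR gene abundance: reads per kb
    of gene per million metagenome reads. One common convention for
    cross-sample comparison; illustrative, not the paper's method."""
    return mapped_reads / (gene_length_bp / 1000.0) / (total_reads / 1e6)

# 200 reads mapped to a 1 kb AR gene in a 10-million-read metagenome
print(ar_gene_abundance(200, 1000, 10_000_000))  # 20.0
```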

  16. Automated lobar quantification of emphysema in patients with severe COPD.

    PubMed

    Revel, Marie-Pierre; Faivre, Jean-Baptiste; Remy-Jardin, Martine; Deken, Valérie; Duhamel, Alain; Marquette, Charles-Hugo; Tacelli, Nunzia; Bakai, Anne-Marie; Remy, Jacques

    2008-12-01

    Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using a dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p > 0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for the left upper lobe (ICC = 0.94), left lower lobe (ICC = 0.98), and right lower lobe (ICC = 0.80). The agreement was good for the right upper lobe (ICC = 0.68) and moderate for the middle lobe (ICC = 0.53). The Bland and Altman plots confirmed these results. A good agreement was observed between the software-assessed and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring.
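The Bland-Altman comparison mentioned reduces to the mean difference between methods plus 95% limits of agreement. A sketch; the per-lobe emphysema percentages below are invented for illustration:

```python
import statistics

def bland_altman_limits(a, b):
    """Mean difference (bias) and 95% limits of agreement between two
    measurement methods: bias +/- 1.96 * SD of the differences."""
    diffs = [x - y for x, y in zip(a, b)]
    m = statistics.mean(diffs)
    s = statistics.stdev(diffs)
    return m, (m - 1.96 * s, m + 1.96 * s)

auto = [22.1, 35.4, 18.0, 40.2, 27.5]   # automated emphysema %, per lobe
semi = [21.8, 36.0, 17.5, 39.8, 28.1]   # semiautomated values
bias, (lo, hi) = bland_altman_limits(auto, semi)
print(round(bias, 2))
```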

  17. STEM VQ Method, Using Scanning Transmission Electron Microscopy (STEM) for Accurate Virus Quantification

    DTIC Science & Technology

    2017-02-02

    Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron ... provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ) which simplifies

  18. Analysis of Stomata Distribution Patterns for Quantification of the Foliar Plasticity of Tradescantia Zebrina

    NASA Astrophysics Data System (ADS)

    Batista Florindo, Joao; Landini, Gabriel; Almeida Filho, Humberto; Martinez Bruno, Odemir

    2015-09-01

    Here we propose a method for the analysis of the stomata distribution patterns on the surface of plant leaves. We also investigate how light exposure during growth can affect stomata distribution and the plasticity of leaves. Understanding foliar plasticity (the ability of leaves to modify their structural organization to adapt to changing environmental resources) is a fundamental problem in Agricultural and Environmental Sciences. Most published work on quantification of stomata has concentrated on descriptions of their density per unit of leaf area; however, density alone does not provide a complete description of the problem and leaves several unanswered questions (e.g. whether the stomata patterns change across various areas of the leaf, or how the patterns change under varying observational scales). We used two approaches here, namely multiscale fractal dimension and complex networks, as a means to provide a description of the complexity of these distributions. In the experiments, we used 18 samples from the plant Tradescantia zebrina grown under three different conditions (4 hours of artificial light each day, 24 hours of artificial light each day, and sunlight) for a total of 69 days. The network descriptors were capable of correctly discriminating the different conditions in 88% of cases, while the fractal descriptors discriminated 83% of the samples. This is a significant improvement over the correct classification rates achieved when using only stomata density (56% of the samples).
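A toy version of the fractal-descriptor idea is box counting over a 2-D point pattern. This is a minimal sketch of the general technique, not the authors' multiscale fractal dimension implementation:

```python
import numpy as np

def box_counting_dimension(points, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting (fractal) dimension of a 2-D point
    pattern: the slope of log N(occupied boxes) vs log(1 / box size)."""
    points = np.asarray(points, dtype=float)
    counts = []
    for s in sizes:
        # Count distinct grid cells of side s that contain a point
        boxes = {tuple((p // s).astype(int)) for p in points}
        counts.append(len(boxes))
    logs = np.log(1.0 / np.array(sizes, dtype=float))
    slope, _ = np.polyfit(logs, np.log(counts), 1)
    return slope

# Points densely filling a 16x16 grid should look ~2-dimensional
grid = [(i, j) for i in range(16) for j in range(16)]
d = box_counting_dimension(grid)
print(round(d, 1))  # close to 2
```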

  19. Comparison of methods for the quantification of the different carbon fractions in atmospheric aerosol samples

    NASA Astrophysics Data System (ADS)

    Nunes, Teresa; Mirante, Fátima; Almeida, Elza; Pio, Casimiro

    2010-05-01

    Atmospheric carbon consists of: organic carbon (OC, including various organic compounds), elemental carbon (EC, or black carbon [BC]/soot, a non-volatile/light-absorbing carbon), and a small quantity of carbonate carbon. Thermal/optical methods (TOM) have been widely used for quantifying total carbon (TC), OC, and EC in ambient and source particulate samples. Unfortunately, the different thermal evolution protocols in use can result in a wide elemental carbon-to-total carbon variation. Temperature evolution in thermal carbon analysis is critical to the allocation of carbon fractions. Another critical point in OC and EC quantification by TOM is the interference of carbonate carbon (CC) that may be present in particulate samples, mainly in the coarse fraction of atmospheric aerosol. One of the methods used to minimize this interference consists of pre-treating the sample with acid to eliminate CC prior to thermal analysis (Chow et al., 2001; Pio et al., 1994). In Europe, there is currently no standard procedure for determining the carbonaceous aerosol fraction, which implies that data from different laboratories at various sites are of unknown accuracy and cannot be considered comparable. In the framework of the EU project EUSAAR, a comprehensive study has been carried out to identify the causes of differences in the EC measured using different thermal evolution protocols. From this study an optimised protocol, the EUSAAR-2 protocol, was defined (Cavali et al., 2009). During the last two decades thousands of aerosol samples have been collected on quartz filters at urban, industrial, rural and background sites, and also from forest fire plumes and biomass burning in a domestic closed stove. These samples were analysed for OC and EC by a TOM similar to that in use in the IMPROVE network (Pio et al., 2007). More recently we reduced the number of steps in the thermal evolution protocols, without significant repercussions on the OC/EC quantifications.
In order

  20. A rapid Fourier-transform infrared (FTIR) spectroscopic method for direct quantification of paracetamol content in solid pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Mallah, Muhammad Ali; Sherazi, Syed Tufail Hussain; Bhanger, Muhammad Iqbal; Mahesar, Sarfaraz Ahmed; Bajeer, Muhammad Ashraf

    2015-04-01

    A transmission FTIR spectroscopic method was developed for direct, inexpensive and fast quantification of paracetamol content in solid pharmaceutical formulations. In this method paracetamol content is analyzed directly, without solvent extraction. KBr pellets were formulated for the acquisition of FTIR spectra in transmission mode. Two chemometric models, simple Beer's law and partial least squares, employed over the spectral region of 1800-1000 cm-1 for quantification of paracetamol content, each had a regression coefficient (R2) of 0.999. The limits of detection and quantification using FTIR spectroscopy were 0.005 mg g-1 and 0.018 mg g-1, respectively. An interference study was also carried out to check the effect of the excipients; there was no significant interference from the sample matrix. The results clearly showed the sensitivity of the transmission FTIR spectroscopic method for pharmaceutical analysis. This method is green in the sense that it does not require large volumes of hazardous solvents or long run times and avoids prior sample preparation.
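Limits of detection and quantification of the kind reported are conventionally estimated from a calibration line as 3.3σ/slope and 10σ/slope (ICH-style). A sketch with invented calibration data, not the paper's measurements:

```python
import numpy as np

# Calibration: paracetamol content (mg/g) vs FTIR absorbance
# (illustrative values only)
conc   = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
absorb = np.array([0.052, 0.101, 0.199, 0.402, 0.799])

slope, intercept = np.polyfit(conc, absorb, 1)
residuals = absorb - (slope * conc + intercept)
sigma = residuals.std(ddof=2)  # residual SD of the regression

# ICH-style estimates: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
print(lod < loq)  # True
```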

  1. Raman spectroscopy for DNA quantification in cell nucleus.

    PubMed

    Okotrub, K A; Surovtsev, N V; Semeshin, V F; Omelyanchuk, L V

    2015-01-01

    Here we demonstrate the feasibility of a novel approach to quantify DNA in cell nuclei. This approach is based on spectroscopic analysis of Raman light scattering, and avoids the problem of nonstoichiometric binding of dyes to DNA, as it directly measures the signal from DNA. Quantitative analysis of the nuclear DNA contribution to the Raman spectrum could be reliably performed using the intensity of a phosphate mode at 1096 cm(-1). When compared to the known DNA standards from cells of different animals, our results matched those values to within an error of 10%. We therefore suggest that this approach will be useful to expand the list of DNA standards, to properly adjust the duration of hydrolysis in Feulgen staining, to assay the applicability of fuchsines for DNA quantification, as well as to measure DNA content in cells with complex hydrolysis patterns, when Feulgen densitometry is inappropriate. © 2014 International Society for Advancement of Cytometry.

  2. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging

    NASA Astrophysics Data System (ADS)

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-10-01

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 - 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising.
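The bruise ratio index is defined in the abstract as the ratio of bruised to whole-fruit area. A minimal sketch, with toy boolean masks standing in for the hyperspectral classification output:

```python
import numpy as np

def bruise_ratio_index(bruise_mask, fruit_mask):
    """Ratio of bruised area to whole-fruit area (pixel counts).
    The masks here are toy stand-ins for the SVM-classified regions."""
    return bruise_mask.sum() / fruit_mask.sum()

fruit = np.zeros((10, 10), dtype=bool)
fruit[2:8, 2:8] = True           # 36-pixel fruit region
bruise = np.zeros_like(fruit)
bruise[2:5, 2:5] = True          # 9-pixel bruise within the fruit
print(bruise_ratio_index(bruise, fruit))  # 0.25
```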

  3. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging.

    PubMed

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-10-21

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 - 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising.

  4. An alternative method for irones quantification in iris rhizomes using headspace solid-phase microextraction.

    PubMed

    Roger, B; Fernandez, X; Jeannot, V; Chahboun, J

    2010-01-01

    The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones, which are gradually formed by oxidative degradation of iridals during rhizome ageing. The objective was the development of an alternative method allowing irone quantification in iris rhizomes using HS-SPME-GC. The method was developed using the results obtained from a conventional method, i.e. a solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed method using HS-SPME-GC is as accurate and reproducible as the conventional one using SLE. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can thus be achieved using HS-SPME-GC, and the method can be used for the quality control of iris rhizomes. It offers the advantage of combining extraction and analysis with an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.
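Internal calibration, which the abstract reports as least sensitive to matrix effects, can be sketched as follows; the response factor and amounts are illustrative, not values from the paper:

```python
def irone_amount(area_irone, area_istd, rrf, istd_amount_mg):
    """Internal-standard calibration: analyte amount from the
    analyte/internal-standard peak-area ratio and a relative
    response factor (rrf). Illustrative names and values."""
    return area_irone / area_istd / rrf * istd_amount_mg

# Equal peak areas, rrf = 0.8, 2 mg of internal standard added
print(irone_amount(1.0e5, 1.0e5, 0.8, 2.0))  # 2.5
```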

  5. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perko, Z.; Gilli, L.; Lathouwers, D.

    2013-07-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years, however, polynomial chaos expansion has become a popular alternative providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is shown to be advantageous both from an accuracy and a computational point of view. As a demonstration, the uncertainty quantification of a 50% loss of flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large-scale problems. (authors)
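A minimal, non-adaptive 1-D polynomial chaos sketch: least-squares fitting of probabilists' Hermite coefficients to a toy model, then reading the output mean and variance off the coefficients. This illustrates the basic non-intrusive PCE idea only, not the paper's adaptive sparse-grid method:

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H

rng = np.random.default_rng(0)

# Toy "code output" as a function of one standard-normal input xi
def model(xi):
    return 1.0 + 0.5 * xi + 0.1 * xi**2

# Non-intrusive PCE: least-squares fit of He_k(xi) coefficients
xi = rng.standard_normal(200)
y = model(xi)
order = 3
coeffs = H.hermefit(xi, y, order)

# For probabilists' Hermite polynomials under a standard normal input:
# mean = c0, variance = sum_{k>=1} k! * c_k^2
mean = coeffs[0]
var = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))
print(round(mean, 2), round(var, 2))
```

For this toy model the exact answers are mean 1.1 and variance 0.27, which the fitted coefficients reproduce.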

  6. Comparison of machine learning and semi-quantification algorithms for (I123)FP-CIT classification: the beginning of the end for semi-quantification?

    PubMed

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

Semi-quantification methods are well established in the clinic for assisted reporting of (I123) Ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. Machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; (3) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) minimum of age-matched controls; (2) mean minus 1/1.5/2 standard deviations from age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); (4) selection of the optimum operating point on the receiver operator characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 and 0.95 and 0.97 for local and PPMI data respectively. Classification
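The semi-quantification decision rule described above (normal limit set at the control mean minus k standard deviations, with lower SBRs flagged as abnormal) can be sketched as follows; the control SBR values and the test value are hypothetical:

```python
import statistics

def normal_limit(control_sbrs, k=2.0):
    """Lower normal limit: mean minus k sample standard deviations of controls."""
    mean = statistics.mean(control_sbrs)
    sd = statistics.stdev(control_sbrs)
    return mean - k * sd

def classify(sbr, limit):
    """An SBR below the normal limit is flagged as abnormal (Parkinsonian)."""
    return "abnormal" if sbr < limit else "normal"

controls = [2.8, 3.0, 3.1, 2.9, 3.2, 3.0]  # hypothetical age-matched putamen SBRs
limit = normal_limit(controls, k=2.0)
result = classify(1.4, limit)  # markedly reduced uptake
```

The machine learning alternatives replace this single threshold with a decision boundary learned over many features at once.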

  7. Quantification of cell cycle kinetics by EdU (5-ethynyl-2′-deoxyuridine)-coupled-fluorescence-intensity analysis

    PubMed Central

    Cabrita, Marisa; Bekman, Evguenia; Braga, José; Rino, José; Santus, Renè; Filipe, Paulo L.; Sousa, Ana E.; Ferreira, João A.

    2017-01-01

We propose a novel single-deoxynucleoside-based assay that is easy to perform and provides accurate values for the absolute length (in units of time) of each of the cell cycle stages (G1, S and G2/M). This flow-cytometric assay takes advantage of the excellent stoichiometric properties of azide-fluorochrome detection of DNA substituted with 5-ethynyl-2′-deoxyuridine (EdU). We show that, by pulsing cells with EdU for incremental periods of time, maximal EdU-coupled fluorescence is reached when pulsing times match the length of S phase. These pulsing times, allowing labelling for a full S phase of a fraction of cells in asynchronous populations, provide accurate values for the absolute length of S phase. We characterized additional, lower-intensity signals that allowed quantification of the absolute durations of G1 and G2 phases. Importantly, with this novel assay, data on the lengths of G1, S and G2/M phases are obtained in parallel. Therefore, these parameters can be estimated within a time frame that is shorter than a full cell cycle. This method, which we designate EdU-Coupled Fluorescence Intensity (E-CFI) analysis, was successfully applied to cell types with distinctive cell cycle features and shows excellent agreement with established methodologies for analysis of cell cycle kinetics. PMID:28465489

  8. Quantification of pericardial effusions by echocardiography and computed tomography.

    PubMed

    Leibowitz, David; Perlman, Gidon; Planer, David; Gilon, Dan; Berman, Philip; Bogot, Naama

    2011-01-15

Echocardiography is a well-accepted tool for the diagnosis and quantification of pericardial effusion (PEff). Given the increasing use of computed tomographic (CT) scanning, more PEffs are being initially diagnosed by computed tomography. No study has compared quantification of PEff by computed tomography and echocardiography. The objective of this study was to assess the accuracy of quantification of PEff by 2-dimensional echocardiography and computed tomography compared to the amount of pericardial fluid drained at pericardiocentesis. We retrospectively reviewed an institutional database to identify patients who underwent chest computed tomography and echocardiography before percutaneous pericardiocentesis with documentation of the amount of fluid withdrawn. Digital 2-dimensional echocardiographic and CT images were retrieved, and quantification of PEff volume was performed by applying the formula for the volume of a prolate ellipse, π × 4/3 × maximal long-axis dimension/2 × maximal transverse dimension/2 × maximal anteroposterior dimension/2, to the pericardial sac and to the heart. Nineteen patients meeting the study criteria were included. The amount of PEff drained was 200 to 1,700 ml (mean 674 ± 340). Echocardiographically calculated PEff volume correlated relatively well with the drained volume (r = 0.73, p <0.001, mean difference -41 ± 225 ml). There was only moderate correlation between CT volume quantification and the actual volume drained (r = 0.4, p = 0.004, mean difference 158 ± 379 ml). In conclusion, echocardiography appears to be a more accurate imaging technique than computed tomography for quantitative assessment of nonloculated PEffs and should remain the primary imaging modality in these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
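The prolate-ellipse formula quoted in the abstract applies directly: the effusion volume is the ellipsoid volume of the pericardial sac minus that of the heart. A sketch with hypothetical diameters (in cm, so the result is in ml):

```python
import math

def ellipsoid_volume(long_axis, transverse, anteroposterior):
    """Prolate-ellipse volume from three maximal diameters (same units):
    pi * 4/3 * (L/2) * (T/2) * (AP/2)."""
    return math.pi * 4.0 / 3.0 * (long_axis / 2) * (transverse / 2) * (anteroposterior / 2)

def effusion_volume(sac_dims, heart_dims):
    """Effusion = volume of the pericardial sac minus volume of the heart."""
    return ellipsoid_volume(*sac_dims) - ellipsoid_volume(*heart_dims)

# Hypothetical maximal diameters of sac and heart, in cm:
vol = effusion_volume((16.0, 14.0, 12.0), (12.0, 10.0, 8.0))
```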

  9. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

Purpose of review Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  10. Quantification of Sunscreen Ethylhexyl Triazone in Topical Skin-Care Products by Normal-Phase TLC/Densitometry

    PubMed Central

    Sobanska, Anna W.; Pyzowski, Jaroslaw

    2012-01-01

Ethylhexyl triazone (ET) was separated from other sunscreens such as avobenzone, octocrylene, octyl methoxycinnamate, and diethylamino hydroxybenzoyl hexyl benzoate and from parabens by normal-phase HPTLC on silica gel 60 as stationary phase. Two mobile phases were particularly effective: (A) cyclohexane-diethyl ether 1 : 1 (v/v) and (B) cyclohexane-diethyl ether-acetone 15 : 1 : 2 (v/v/v), since apart from ET analysis they facilitated separation and quantification of other sunscreens present in the formulations. Densitometric scanning was performed at 300 nm. Calibration curves for ET were nonlinear (second-degree polynomials), with R > 0.998. For both mobile phases, limits of detection (LOD) were 0.03 μg spot⁻¹ and limits of quantification (LOQ) 0.1 μg spot⁻¹. Both methods were validated. PMID:22629203

  11. LipidMiner: A Software for Automated Identification and Quantification of Lipids from Multiple Liquid Chromatography-Mass Spectrometry Data Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Da; Zhang, Qibin; Gao, Xiaoli

    2014-04-30

We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. It is of note that LipidMiner currently is limited to singly charged ions, although this is adequate for the purpose of lipidomics since lipids are rarely multiply charged,[14] even for the polyphosphoinositides. LipidMiner also only processes file formats generated by Thermo mass spectrometers, i.e. the .RAW format. In the future, we plan to accommodate file formats generated by mass spectrometers from other predominant instrument vendors to make this tool more universal.

  12. Installation Restoration Program. Confirmation/Quantification Stage 1. Phase 2

    DTIC Science & Technology

    1985-03-07

Installation Restoration Program, Phase II - Confirmation/Quantification, Stage 1. Kirtland AFB, Kirtland AFB, New Mexico 87117. Prepared by Science Applications International Corporation, 505 Marquette NW, Suite 1200, Albuquerque, New Mexico 87102. March 1985. Final report from Feb 1983 to Mar 1985. Headquarters Military Airlift Command, Command Surgeon's Office (HQ MAC

  13. Use of multiple competitors for quantification of human immunodeficiency virus type 1 RNA in plasma.

    PubMed

    Vener, T; Nygren, M; Andersson, A; Uhlén, M; Albert, J; Lundeberg, J

    1998-07-01

Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing the same PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions, resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens.
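The internal-calibration idea above, fitting a per-reaction curve from the co-amplified competitor peak areas and then interpolating the viral signal, can be sketched as follows; the competitor copy numbers and peak areas are hypothetical:

```python
import math

def fit_line(xs, ys):
    """Least-squares slope and intercept for the per-reaction calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def quantify(competitor_copies, competitor_areas, target_area):
    """Interpolate target copy number from the competitors' log-log curve."""
    xs = [math.log10(c) for c in competitor_copies]
    ys = [math.log10(a) for a in competitor_areas]
    slope, intercept = fit_line(xs, ys)
    return 10 ** ((math.log10(target_area) - intercept) / slope)

# Hypothetical competitor inputs (copies/reaction) and measured peak areas:
copies = quantify([100, 1000, 10000, 100000], [50, 500, 5000, 50000], 1500)
```

Because the curve is rebuilt inside each tube, reaction-to-reaction efficiency differences cancel out, which is the point of the multiple-competitor design.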

  14. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography.

    PubMed

    Venhuizen, Freerk G; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I

    2018-04-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies.
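The Dice coefficient used above to score segmentation overlap is straightforward to compute; a minimal sketch on toy binary masks (not the paper's OCT data):

```python
def dice(mask_a, mask_b):
    """Dice coefficient between two binary segmentation masks (flat lists):
    2*|A intersect B| / (|A| + |B|)."""
    overlap = sum(a and b for a, b in zip(mask_a, mask_b))
    total = sum(mask_a) + sum(mask_b)
    return 2.0 * overlap / total if total else 1.0

# Toy 1-D masks standing in for flattened B-scan segmentations:
reference = [0, 1, 1, 1, 0, 0, 1, 0]
predicted = [0, 1, 1, 0, 0, 0, 1, 1]
score = dice(reference, predicted)
```

A score of 1.0 means perfect agreement; the paper's reported 0.754 indicates the algorithm recovers roughly three quarters of the overlap a perfect segmentation would.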

  15. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography

    PubMed Central

    Venhuizen, Freerk G.; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2018-01-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies. PMID:29675301

  16. Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay

    PubMed Central

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
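The standard-curve arithmetic that underlies absolute qPCR quantification, and that the conformation-dependent shifts reported above would distort, can be sketched as follows using an idealized dilution series (the copy numbers and Ct values are illustrative, not from the study):

```python
import math

def standard_curve(copies, cts):
    """Fit Ct = slope * log10(N0) + intercept over plasmid standards and
    derive amplification efficiency (1.0 == 100% efficient)."""
    xs = [math.log10(n) for n in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = sum((x - mx) * (c - my) for x, c in zip(xs, cts)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

# Idealized 10-fold dilution series: Ct rises ~3.32 cycles per dilution.
slope, intercept, eff = standard_curve(
    [1e7, 1e6, 1e5, 1e4], [15.0, 18.32, 21.64, 24.96])
```

A supercoiled versus linearized standard shifts this curve's intercept, which is exactly how conformation biases every absolute quantification read off it.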

  17. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    PubMed Central

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC-ICP-MS) and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections, an overview of general aspects of biomolecule quantification, as well as of labelling, will be presented, emphasizing the potential which lies in such methodological approaches. In this context, ICP-MS as detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). Fundamental methodology of elemental labelling will be highlighted and analytical, as well as biomedical, applications will be presented. A special focus will lie on established applications underlining benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research made in this field will be summarized and a perspective for future developments, including sophisticated and innovative applications, will be given. PMID:23062431

  18. GoIFISH: a system for the quantification of single cell heterogeneity from IFISH images.

    PubMed

    Trinh, Anne; Rye, Inga H; Almendro, Vanessa; Helland, Aslaug; Russnes, Hege G; Markowetz, Florian

    2014-08-26

Molecular analysis has revealed extensive intra-tumor heterogeneity in human cancer samples, but cannot identify cell-to-cell variations within the tissue microenvironment. In contrast, in situ analysis can identify genetic aberrations in phenotypically defined cell subpopulations while preserving tissue-context specificity. GoIFISH is a widely applicable, user-friendly system tailored for the objective and semi-automated visualization, detection and quantification of genomic alterations and protein expression obtained from fluorescence in situ analysis. In a sample set of HER2-positive breast cancers, GoIFISH is highly robust in visual analysis and its accuracy compares favorably to other leading image analysis methods. GoIFISH is freely available at www.sourceforge.net/projects/goifish/.

  19. MPQ-cytometry: a magnetism-based method for quantification of nanoparticle-cell interactions

    NASA Astrophysics Data System (ADS)

    Shipunova, V. O.; Nikitin, M. P.; Nikitin, P. I.; Deyev, S. M.

    2016-06-01

Precise quantification of interactions between nanoparticles and living cells is among the imperative tasks for research in nanobiotechnology, nanotoxicology and biomedicine. To meet the challenge, a rapid method called MPQ-cytometry is developed, which measures the integral non-linear response produced by magnetically labeled nanoparticles in a cell sample with an original magnetic particle quantification (MPQ) technique. MPQ-cytometry provides a sensitivity limit of 0.33 ng of nanoparticles and is devoid of the background signal present in many label-based assays. Each measurement takes only a few seconds, and no complicated sample preparation or data processing is required. The capabilities of the method have been demonstrated by quantification of interactions of iron oxide nanoparticles with eukaryotic cells. The total amount of targeted nanoparticles that specifically recognized the HER2/neu oncomarker on the human cancer cell surface was successfully measured, the specificity of interaction permitting the detection of HER2/neu-positive cells in a cell mixture. Moreover, it has been shown that MPQ-cytometry analysis of a HER2/neu-specific iron oxide nanoparticle interaction with six cell lines of different tissue origins quantitatively reflects the HER2/neu status of the cells. High correlation of MPQ-cytometry data with those obtained by three other methods commonly used in molecular and cell biology supports consideration of this method as a prospective alternative for both quantifying cell-bound nanoparticles and estimating the expression level of cell surface antigens. The proposed method does not require expensive sophisticated equipment or highly skilled personnel and it can be easily applied for rapid diagnostics, especially under field conditions.

  20. Subcellular object quantification with Squassh3C and SquasshAnalyst.

    PubMed

    Rizk, Aurélien; Mansouri, Maysam; Ballmer-Hofer, Kurt; Berger, Philipp

    2015-11-01

    Quantitative image analysis plays an important role in contemporary biomedical research. Squassh is a method for automatic detection, segmentation, and quantification of subcellular structures and analysis of their colocalization. Here we present the applications Squassh3C and SquasshAnalyst. Squassh3C extends the functionality of Squassh to three fluorescence channels and live-cell movie analysis. SquasshAnalyst is an interactive web interface for the analysis of Squassh3C object data. It provides segmentation image overview and data exploration, figure generation, object and image filtering, and a statistical significance test in an easy-to-use interface. The overall procedure combines the Squassh3C plug-in for the free biological image processing program ImageJ and a web application working in conjunction with the free statistical environment R, and it is compatible with Linux, MacOS X, or Microsoft Windows. Squassh3C and SquasshAnalyst are available for download at www.psi.ch/lbr/SquasshAnalystEN/SquasshAnalyst.zip.

  1. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (\\url{www.quest-scidac.org}) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a historymore » of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. 
A key software product of the MIT QUEST effort is

  2. Kinetic quantification of plyometric exercise intensity.

    PubMed

    Ebben, William P; Fauth, McKenzie L; Garceau, Luke R; Petushek, Erich J

    2011-12-01

Ebben, WP, Fauth, ML, Garceau, LR, and Petushek, EJ. Kinetic quantification of plyometric exercise intensity. J Strength Cond Res 25(12): 3288-3298, 2011. Quantification of plyometric exercise intensity is necessary to understand the characteristics of these exercises and the proper progression of this mode of exercise. The purpose of this study was to assess the kinetic characteristics of a variety of plyometric exercises. This study also sought to assess gender differences in these variables. Twenty-six men and 23 women with previous experience in performing plyometric training served as subjects. The subjects performed a variety of plyometric exercises including line hops, 15.24-cm cone hops, squat jumps, tuck jumps, countermovement jumps (CMJs), loaded CMJs equal to 30% of 1 repetition maximum squat, depth jumps normalized to the subject's jump height (JH), and single leg jumps. All plyometric exercises were assessed with a force platform. Outcome variables associated with the takeoff, airborne, and landing phase of each plyometric exercise were evaluated. These variables included the peak vertical ground reaction force (GRF) during takeoff, the time to takeoff, flight time, JH, peak power, landing rate of force development, and peak vertical GRF during landing. A 2-way mixed analysis of variance with repeated measures for plyometric exercise type demonstrated main effects for exercise type and all outcome variables (p ≤ 0.05) and for the interaction between gender and peak vertical GRF during takeoff (p ≤ 0.05). Bonferroni-adjusted pairwise comparisons identified a number of differences between the plyometric exercises for the outcome variables assessed (p ≤ 0.05). These findings can be used to guide the progression of plyometric training by incorporating exercises of increasing intensity over the course of a program.
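Two of the kinetic outcome variables above are simple functions of force-plate measurements; a sketch with hypothetical values, computing jump height from flight time (h = g·t²/8, assuming symmetric takeoff and landing posture) and normalizing peak GRF to body weight:

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height(flight_time):
    """Jump height (m) from force-plate flight time (s): h = g * t^2 / 8."""
    return G * flight_time ** 2 / 8.0

def peak_grf_bodyweights(peak_force_n, body_mass_kg):
    """Express peak vertical GRF as multiples of body weight."""
    return peak_force_n / (body_mass_kg * G)

h = jump_height(0.5)                      # hypothetical 0.5 s airborne
bw = peak_grf_bodyweights(3000.0, 75.0)   # hypothetical 3000 N landing peak
```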

  3. Relative quantification of biomarkers using mixed-isotope labeling coupled with MS

    PubMed Central

    Chapman, Heidi M; Schutt, Katherine L; Dieter, Emily M; Lamos, Shane M

    2013-01-01

    The identification and quantification of important biomarkers is a critical first step in the elucidation of biological systems. Biomarkers take many forms as cellular responses to stimuli and can be manifested during transcription, translation, and/or metabolic processing. Increasingly, researchers have relied upon mixed-isotope labeling (MIL) coupled with MS to perform relative quantification of biomarkers between two or more biological samples. MIL effectively tags biomarkers of interest for ease of identification and quantification within the mass spectrometer by using isotopic labels that introduce a heavy and light form of the tag. In addition to MIL coupled with MS, a number of other approaches have been used to quantify biomarkers including protein gel staining, enzymatic labeling, metabolic labeling, and several label-free approaches that generate quantitative data from the MS signal response. This review focuses on MIL techniques coupled with MS for the quantification of protein and small-molecule biomarkers. PMID:23157360

  4. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  5. Biofilm Quantification on Nasolacrimal Silastic Stents After Dacryocystorhinostomy.

    PubMed

    Murphy, Jae; Ali, Mohammed Javed; Psaltis, Alkis James

    2015-01-01

    Biofilms are now recognized as potential factors in the pathogenesis of chronic inflammatory and infective diseases. The aim of this study was to examine the presence of biofilms and quantify their biomass on silastic nasolacrimal duct stents inserted after dacryocystorhinostomy (DCR). A prospective study was performed on a series of patients undergoing DCR with O'Donoghue stent insertion. After removal, the stents were subjected to biofilm analysis using standard protocols of confocal laser scanning microscopy (CLSM) and scanning electron microscopy. These stents were compared against negative controls and positive in vitro ones established using Staphylococcus aureus strain ATCC 25923. Biofilm quantification was performed using the COMSTAT2 software and the total biofilm biomass was calculated. A total of nine consecutive patient samples were included in this prospective study. None of the patients had any evidence of postoperative infection. All the stents demonstrated evidence of biofilm formation using both imaging modalities. The presence of various different sized organisms within a common exopolysaccharide matrix on CLSM suggested the existence of polymicrobial communities. The mean biomass of patient samples was 0.9385 μm³/μm² (range: 0.3901-1.9511 μm³/μm²). This is the first study to report the quantification of biomass on lacrimal stents. The presence of biofilms on lacrimal stents after DCR is a common finding but this need not necessarily translate to postoperative clinical infection.
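The COMSTAT-style biomass measure reported above (biofilm volume per substratum area, μm³/μm²) reduces to counting thresholded voxels in the confocal stack; a sketch with hypothetical stack parameters:

```python
def biomass(n_biomass_voxels, voxel_xy_um, voxel_z_um, n_pixels_per_slice):
    """Biomass = total biofilm volume / substratum area, in um^3/um^2."""
    voxel_volume = voxel_xy_um * voxel_xy_um * voxel_z_um
    substratum_area = n_pixels_per_slice * voxel_xy_um * voxel_xy_um
    return n_biomass_voxels * voxel_volume / substratum_area

# Hypothetical CLSM stack: 500,000 thresholded voxels of 0.2 x 0.2 x 0.5 um,
# with 512 x 512 pixels per slice.
b = biomass(500_000, 0.2, 0.5, 512 * 512)
```

Values near 1 μm³/μm², like the study's mean of 0.9385, correspond to a biofilm layer whose average thickness over the stent surface is on the order of a micron.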

  6. Evaluation of quantification methods for real-time PCR minor groove binding hybridization probe assays.

    PubMed

    Durtschi, Jacob D; Stevenson, Jeffery; Hymas, Weston; Voelkerding, Karl V

    2007-02-01

    Real-time PCR data analysis for quantification has been the subject of many studies aimed at the identification of new and improved quantification methods. Several analysis methods have been proposed as superior alternatives to the common variations of the threshold crossing method. Notably, sigmoidal and exponential curve fit methods have been proposed. However, these studies have primarily analyzed real-time PCR with intercalating dyes such as SYBR Green. Clinical real-time PCR assays, in contrast, often employ fluorescent probes whose real-time amplification fluorescence curves differ from those of intercalating dyes. In the current study, we compared four analysis methods related to recent literature: two versions of the threshold crossing method, a second derivative maximum method, and a sigmoidal curve fit method. These methods were applied to a clinically relevant real-time human herpes virus type 6 (HHV6) PCR assay that used a minor groove binding (MGB) Eclipse hybridization probe as well as an Epstein-Barr virus (EBV) PCR assay that used an MGB Pleiades hybridization probe. We found that the crossing threshold method yielded more precise results when analyzing the HHV6 assay, which was characterized by lower signal/noise and less developed amplification curve plateaus. In contrast, the EBV assay, characterized by greater signal/noise and amplification curves with plateau regions similar to those observed with intercalating dyes, gave results with statistically similar precision by all four analysis methods.

  7. Surface-EMG analysis for the quantification of thigh muscle dynamic co-contractions during normal gait.

    PubMed

    Strazza, Annachiara; Mengarelli, Alessandro; Fioretti, Sandro; Burattini, Laura; Agostini, Valentina; Knaflitz, Marco; Di Nardo, Francesco

    2017-01-01

    The research purpose was to quantify the co-contraction patterns of quadriceps femoris (QF) vs. hamstring muscles during free walking, in terms of onset-offset muscular activation, excitation intensity, and occurrence frequency. Statistical gait analysis was performed on surface-EMG signals from vastus lateralis (VL), rectus femoris (RF), and medial hamstrings (MH), in 16315 strides walked by 30 healthy young adults. Results showed full superimpositions of MH with both VL and RF activity from terminal swing, 80 to 100% of gait cycle (GC), to the successive loading response (≈0-15% of GC), in around 90% of the considered strides. A further superimposition was detected during the push-off phase both between VL and MH activation intervals (38.6±12.8% to 44.1±9.6% of GC) in 21.9±13.6% of strides, and between RF and MH activation intervals (45.9±5.3% to 50.7±9.7% of GC) in 32.7±15.1% of strides. These findings led to the identification of three different co-contractions between QF and hamstring muscles during able-bodied walking: in early stance (in ≈90% of strides), in push-off (in 25-30% of strides), and in terminal swing (in ≈90% of strides). The co-contraction in terminal swing is the one with the highest levels of muscle excitation intensity. To our knowledge, this analysis represents the first attempt at quantification of QF/hamstring muscle co-contraction in young healthy subjects during normal gait that accounts for the physiological variability of the phenomenon. Copyright © 2016 Elsevier B.V. All rights reserved.
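Co-contraction as onset-offset superimposition reduces to intersecting activation intervals within the gait cycle; a minimal sketch with illustrative activation windows (the numbers below are made up, not the study's means):

```python
def overlap(a, b):
    """Overlap of two (onset, offset) activation intervals, in % of gait cycle."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def co_contraction(muscle1, muscle2):
    """Total superimposition of two muscles' activation intervals
    (each a list of (onset%, offset%) within one non-wrapping gait cycle)."""
    return sum(overlap(a, b) for a in muscle1 for b in muscle2)

vl = [(0.0, 15.0), (38.6, 44.1), (80.0, 100.0)]   # vastus lateralis (illustrative)
mh = [(0.0, 15.0), (40.0, 50.0), (85.0, 100.0)]   # medial hamstrings (illustrative)
print(co_contraction(vl, mh))  # 15 + 4.1 + 15 = 34.1 %GC of co-contraction
```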

  8. Characterization and Quantification of Intact 26S Proteasome Proteins by Real-Time Measurement of Intrinsic Fluorescence Prior to Top-down Mass Spectrometry

    PubMed Central

    Russell, Jason D.; Scalf, Mark; Book, Adam J.; Ladror, Daniel T.; Vierstra, Richard D.; Smith, Lloyd M.; Coon, Joshua J.

    2013-01-01

    Quantification of gas-phase intact protein ions by mass spectrometry (MS) is impeded by highly-variable ionization, ion transmission, and ion detection efficiencies. Therefore, quantification of proteins using MS-associated techniques is almost exclusively done after proteolysis where peptides serve as proxies for estimating protein abundance. Advances in instrumentation, protein separations, and informatics have made large-scale sequencing of intact proteins using top-down proteomics accessible to the proteomics community; yet quantification of proteins using a top-down workflow has largely been unaddressed. Here we describe a label-free approach to determine the abundance of intact proteins separated by nanoflow liquid chromatography prior to MS analysis by using solution-phase measurements of ultraviolet light-induced intrinsic fluorescence (UV-IF). UV-IF is measured directly at the electrospray interface just prior to the capillary exit where proteins containing at least one tryptophan residue are readily detected. UV-IF quantification was demonstrated using commercially available protein standards and provided more accurate and precise protein quantification than MS ion current. We evaluated the parallel use of UV-IF and top-down tandem MS for quantification and identification of protein subunits and associated proteins from an affinity-purified 26S proteasome sample from Arabidopsis thaliana. We identified 26 unique proteins and quantified 13 tryptophan-containing species. Our analyses discovered previously unidentified N-terminal processing of the β6 (PBF1) and β7 (PBG1) subunits; such processing of PBG1 may generate a heretofore unknown additional protease active site upon cleavage. In addition, our approach permitted the unambiguous identification and quantification of both isoforms of the proteasome-associated protein DSS1. PMID:23536786

  9. Characterization and quantification of intact 26S proteasome proteins by real-time measurement of intrinsic fluorescence prior to top-down mass spectrometry.

    PubMed

    Russell, Jason D; Scalf, Mark; Book, Adam J; Ladror, Daniel T; Vierstra, Richard D; Smith, Lloyd M; Coon, Joshua J

    2013-01-01

    Quantification of gas-phase intact protein ions by mass spectrometry (MS) is impeded by highly-variable ionization, ion transmission, and ion detection efficiencies. Therefore, quantification of proteins using MS-associated techniques is almost exclusively done after proteolysis where peptides serve as proxies for estimating protein abundance. Advances in instrumentation, protein separations, and informatics have made large-scale sequencing of intact proteins using top-down proteomics accessible to the proteomics community; yet quantification of proteins using a top-down workflow has largely been unaddressed. Here we describe a label-free approach to determine the abundance of intact proteins separated by nanoflow liquid chromatography prior to MS analysis by using solution-phase measurements of ultraviolet light-induced intrinsic fluorescence (UV-IF). UV-IF is measured directly at the electrospray interface just prior to the capillary exit where proteins containing at least one tryptophan residue are readily detected. UV-IF quantification was demonstrated using commercially available protein standards and provided more accurate and precise protein quantification than MS ion current. We evaluated the parallel use of UV-IF and top-down tandem MS for quantification and identification of protein subunits and associated proteins from an affinity-purified 26S proteasome sample from Arabidopsis thaliana. We identified 26 unique proteins and quantified 13 tryptophan-containing species. Our analyses discovered previously unidentified N-terminal processing of the β6 (PBF1) and β7 (PBG1) subunits; such processing of PBG1 may generate a heretofore unknown additional protease active site upon cleavage. In addition, our approach permitted the unambiguous identification and quantification of both isoforms of the proteasome-associated protein DSS1.

  10. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    PubMed

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main

  11. Three-Dimensional Echocardiographic Assessment of Left Heart Chamber Size and Function with Fully Automated Quantification Software in Patients with Atrial Fibrillation.

    PubMed

    Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki

    2016-10-01

    Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.

  12. Simultaneous Quantification of Multiple Alternatively Spliced mRNA Transcripts Using Droplet Digital PCR.

    PubMed

    Sun, Bing; Zheng, Yun-Ling

    2018-01-01

    Currently there is no sensitive, precise, and reproducible method to quantitate alternative splicing of mRNA transcripts. Droplet digital™ PCR (ddPCR™) analysis allows for accurate digital counting for quantification of gene expression. Human telomerase reverse transcriptase (hTERT) is one of the essential components required for telomerase activity and for the maintenance of telomeres. Several alternatively spliced forms of hTERT mRNA in human primary and tumor cells have been reported in the literature. Using one pair of primers and two probes for hTERT, four alternatively spliced forms of hTERT (α-/β+, α+/β- single deletions, α-/β- double deletion, and nondeletion α+/β+) were accurately quantified through a novel analysis method via data collected from a single ddPCR reaction. In this chapter, we describe this ddPCR method that enables direct quantitative comparison of four alternatively spliced forms of the hTERT messenger RNA without the need for internal standards or multiple pairs of primers specific for each variant, eliminating the technical variation due to differential PCR amplification efficiency for different amplicons and the challenges of quantification using standard curves. This simple and straightforward method should have general utility for quantifying alternatively spliced gene transcripts.
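ddPCR converts digital droplet counts into an absolute concentration through Poisson statistics, which is what makes the standard-curve-free quantification above possible; a minimal sketch of that correction, assuming a nominal ~0.85 nL droplet volume (an assumption for illustration, not a value from this chapter):

```python
import math

def ddpcr_concentration(positives, total, droplet_volume_ul=0.00085):
    """Target concentration (copies/uL) from droplet counts via Poisson
    correction: lambda = -ln(fraction of negative droplets)."""
    negatives = total - positives
    lam = -math.log(negatives / total)   # mean copies per droplet
    return lam / droplet_volume_ul

# 4000 positive droplets out of 15000 accepted droplets:
print(round(ddpcr_concentration(4000, 15000), 1))  # 364.9 copies/uL
```

The Poisson correction accounts for droplets that contain more than one target copy, so the estimate stays linear even at high positive fractions.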

  13. Rapid capillary electrophoresis approach for the quantification of ewe milk adulteration with cow milk.

    PubMed

    Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico

    2017-10-13

    The substitution of ewe milk with more economic cow milk is a common fraud. Here we present a capillary electrophoresis method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine proteins analysis, which fulfilled the separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on the ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum amount of detectable fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Variation compensation and analysis on diaphragm curvature analysis for emphysema quantification on whole lung CT scans

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Barr, R. Graham; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    CT scans allow for the quantitative evaluation of the anatomical bases of emphysema. Recently, a non-density based geometric measurement of lung diaphragm curvature has been proposed as a method for the quantification of emphysema from CT. This work analyzes variability of diaphragm curvature and evaluates the effectiveness of a compensation methodology for the reduction of this variability as compared to emphysema index. Using a dataset of 43 scan-pairs with less than a 100 day time-interval between scans, we find that the diaphragm curvature showed a trend toward lower overall variability than emphysema index (95% CI: -9.7 to +14.7 vs. -15.8 to +12.0), and that the variation of both measures was reduced after compensation. We conclude that the variation of the new measure can be considered comparable to the established measure and that the compensation can successfully reduce the apparent variation of quantitative measures.

  15. Segmentation and quantification of subcellular structures in fluorescence microscopy images using Squassh.

    PubMed

    Rizk, Aurélien; Paul, Grégory; Incardona, Pietro; Bugarski, Milica; Mansouri, Maysam; Niemann, Axel; Ziegler, Urs; Berger, Philipp; Sbalzarini, Ivo F

    2014-03-01

    Detection and quantification of fluorescently labeled molecules in subcellular compartments is a key step in the analysis of many cell biological processes. Pixel-wise colocalization analyses, however, are not always suitable, because they do not provide object-specific information, and they are vulnerable to noise and background fluorescence. Here we present a versatile protocol for a method named 'Squassh' (segmentation and quantification of subcellular shapes), which is used for detecting, delineating and quantifying subcellular structures in fluorescence microscopy images. The workflow is implemented in freely available, user-friendly software. It works on both 2D and 3D images, accounts for the microscope optics and for uneven image background, computes cell masks and provides subpixel accuracy. The Squassh software enables both colocalization and shape analyses. The protocol can be applied in batch, on desktop computers or computer clusters, and it usually requires <1 min and <5 min for 2D and 3D images, respectively. Basic computer-user skills and some experience with fluorescence microscopy are recommended to successfully use the protocol.

  16. Quantification of Fibrosis and Osteosclerosis in Myeloproliferative Neoplasms: A Computer-Assisted Image Study

    PubMed Central

    Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.

    2010-01-01

    Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin-fiber quantification. Patients with all Philadelphia negative MPN subtypes had higher trabecular volume than controls (p ≤0.0015). Results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis highly correlated with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification for osteosclerosis and fibrosis. PMID:20122729

  17. Uncertainty quantification in flux balance analysis of spatially lumped and distributed models of neuron-astrocyte metabolism.

    PubMed

    Calvetti, Daniela; Cheng, Yougan; Somersalo, Erkki

    2016-12-01

    Identifying feasible steady state solutions of a brain energy metabolism model is an inverse problem that allows infinitely many solutions. The characterization of the non-uniqueness, or the uncertainty quantification of the flux balance analysis, is tantamount to identifying the degrees of freedom of the solution. The degrees of freedom of multi-compartment mathematical models for energy metabolism of a neuron-astrocyte complex may offer a key to understand the different ways in which the energetic needs of the brain are met. In this paper we study the uncertainty in the solution, using techniques of linear algebra to identify the degrees of freedom in a lumped model, and Markov chain Monte Carlo methods in its extension to a spatially distributed case. The interpretation of the degrees of freedom in metabolic terms, more specifically, glucose and oxygen partitioning, is then leveraged to derive constraints on the free parameters to guarantee that the model is energetically feasible. We demonstrate how the model can be used to estimate the stoichiometric energy needs of the cells as well as the household energy based on the measured oxidative cerebral metabolic rate of glucose and glutamate cycling. Moreover, our analysis shows that in the lumped model the net direction of lactate dehydrogenase (LDH) in the cells can be deduced from the glucose partitioning between the compartments. The extension of the lumped model to a spatially distributed multi-compartment setting that includes diffusion fluxes from capillary to tissue increases the number of degrees of freedom, requiring the use of statistical sampling techniques. The analysis of the distributed model reveals that some of the conclusions valid for the spatially lumped model, e.g., concerning the LDH activity and glucose partitioning, may no longer hold.
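The degrees of freedom discussed above are, in linear-algebra terms, the dimension of the null space of the stoichiometric matrix under the steady-state constraint S v = 0; a minimal sketch on a toy network (not the neuron-astrocyte model), using a plain Gaussian-elimination rank:

```python
def rank(matrix, tol=1e-9):
    """Matrix rank via Gaussian elimination with partial pivoting."""
    m = [row[:] for row in matrix]
    rows, cols = len(m), len(m[0])
    r = 0
    for c in range(cols):
        pivot = max(range(r, rows), key=lambda i: abs(m[i][c]), default=None)
        if pivot is None or abs(m[pivot][c]) < tol:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(r + 1, rows):
            f = m[i][c] / m[r][c]
            m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Toy steady-state stoichiometry S v = 0: 2 metabolites, 4 fluxes
S = [[1, -1, -1,  0],
     [0,  1,  0, -1]]
dof = len(S[0]) - rank(S)
print(dof)  # 2 free fluxes parameterize the steady-state solution set
```

In the paper's setting, interpreting such free parameters metabolically (e.g., glucose and oxygen partitioning) is what turns the degrees of freedom into constraints on energetic feasibility.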

  18. Single cell genomic quantification by non-fluorescence nonlinear microscopy

    NASA Astrophysics Data System (ADS)

    Kota, Divya; Liu, Jing

    2017-02-01

    Human epidermal growth factor receptor 2 (Her2) is a gene which plays a major role in breast cancer development. The quantification of Her2 expression in single cells is limited by several drawbacks in existing fluorescence-based single molecule techniques, such as low signal-to-noise ratio (SNR), strong autofluorescence, and background signals from biological components. For rigorous genomic quantification, a robust method of orthogonal detection is highly desirable, and we demonstrated it by two non-fluorescent imaging techniques: transient absorption microscopy (TAM) and second harmonic generation (SHG). In TAM, gold nanoparticles (AuNPs) are chosen as orthogonal probes for detection of single molecules, which gives background-free quantification of single mRNA transcripts. In SHG, emission from barium titanium oxide (BTO) nanoprobes was demonstrated, which allows a stable signal beyond the autofluorescence window. Her2 mRNA was specifically labeled with nanoprobes conjugated with antibodies or oligonucleotides and quantified at single-copy sensitivity in cancer cells and tissues. Furthermore, a non-fluorescent super-resolution concept, named second harmonic super-resolution microscopy (SHaSM), was proposed to quantify individual Her2 transcripts in cancer cells beyond the diffraction limit. These non-fluorescent imaging modalities will provide new dimensions in biomarker quantification at single molecule sensitivity in turbid biological samples, offering a strong cross-platform strategy for clinical monitoring at single cell resolution.

  19. Quantification by SEM-EDS in uncoated non-conducting samples

    NASA Astrophysics Data System (ADS)

    Galván Josa, V.; Castellano, G.; Bertolino, S. R.

    2013-07-01

    An approach to perform elemental quantitative analysis in a conventional scanning electron microscope with an energy dispersive spectrometer has been developed for non-conductive samples in which the conductive coating should be avoided. Charge accumulation effects, which basically decrease the energy of the primary beam, were taken into account by means of the Duane-Hunt limit. This value represents the maximum energy of the continuum X-ray spectrum, and is related to the effective energy of the incident electron beam. To validate the results obtained by this procedure, a non-conductive sample of known composition was quantified without conductive coating. Complementarily, changes in the X-ray spectrum due to charge accumulation effects were studied by Monte Carlo simulations, comparing relative characteristic intensities as a function of the incident energy. This methodology is exemplified here to obtain the chemical composition of white and reddish archaeological pigments belonging to the Ambato style of "Aguada" culture (Catamarca, Argentina 500-1100 AD). The results obtained in this work show that the quantification procedure taking into account the Duane-Hunt limit is suitable for this kind of samples. This approach may be recommended for the quantification of samples for which coating is not desirable, such as ancient artwork, forensic or archaeological samples, or when the coating element is also present in the sample.
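The Duane-Hunt limit used above is simply the highest energy at which the continuum still carries counts; a minimal sketch of estimating it from a recorded spectrum (the channel grid, count model, and background cutoff are all hypothetical, and real spectra would need noise-robust extrapolation of the continuum tail):

```python
def duane_hunt_limit(energies_kev, counts, background=2):
    """Highest energy at which the continuum still rises above background;
    under charging, this effective beam energy falls below the nominal kV."""
    for e, c in zip(reversed(energies_kev), reversed(counts)):
        if c > background:
            return e
    return None

# Spectrum acquired at a nominal 15 kV on a charging, uncoated sample:
energies = [0.5 * i for i in range(1, 31)]                      # 0.5-15 keV channels
counts = [900 // i if 0.5 * i <= 11.5 else 0 for i in range(1, 31)]
print(duane_hunt_limit(energies, counts))  # 11.5 -> effective ~11.5 keV beam
```

The quantification procedure then uses this effective energy, rather than the nominal accelerating voltage, as the incident beam energy.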

  20. Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.

    PubMed

    Mujika, Iñigo

    2017-04-01

    Training quantification is basic to evaluate an endurance athlete's responses to training loads, ensure adequate stress/recovery balance, and determine the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, rating of perceived exertion stands out because of its wide use. Concurrent assessments of the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
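Among the subjective methods mentioned, session-RPE load (RPE multiplied by session duration, with weekly monotony as mean/SD of daily loads, following Foster's approach) is a common internal-workload calculation; a minimal sketch with made-up training data:

```python
def session_rpe_load(rpe, minutes):
    """Internal training load of one session, in arbitrary units (AU)."""
    return rpe * minutes

def weekly_summary(daily_loads):
    """Total weekly load plus training monotony (mean/SD of daily loads)."""
    n = len(daily_loads)
    mean = sum(daily_loads) / n
    sd = (sum((x - mean) ** 2 for x in daily_loads) / (n - 1)) ** 0.5
    return sum(daily_loads), mean / sd

# One hypothetical week: (RPE, minutes) per day, with one rest day
week = [session_rpe_load(r, m) for r, m in
        [(6, 60), (4, 45), (7, 90), (0, 0), (5, 60), (8, 120), (3, 30)]]
total, monotony = weekly_summary(week)
print(total, round(monotony, 2))  # 2520 AU, monotony ~1.08
```

Concurrent tracking of such internal loads against external loads (speed, power output) supports the stress/recovery adjustments described above.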

  1. Motion-aware stroke volume quantification in 4D PC-MRI data of the human aorta.

    PubMed

    Köhler, Benjamin; Preim, Uta; Grothoff, Matthias; Gutberlet, Matthias; Fischbach, Katharina; Preim, Bernhard

    2016-02-01

    4D PC-MRI enables the noninvasive measurement of time-resolved, three-dimensional blood flow data that allow quantification of the hemodynamics. Stroke volumes are essential to assess the cardiac function and evolution of different cardiovascular diseases. The calculation depends on the wall position and vessel orientation, which both change during the cardiac cycle due to the heart muscle contraction and the pumped blood. However, current systems for the quantitative 4D PC-MRI data analysis neglect the dynamic character and instead employ a static 3D vessel approximation. We quantify differences between stroke volumes in the aorta obtained with and without consideration of its dynamics. We describe a method that uses the approximating 3D segmentation to automatically initialize segmentation algorithms that require regions inside and outside the vessel for each temporal position. This enables the use of graph cuts to obtain 4D segmentations, extract vessel surfaces including centerlines for each temporal position and derive motion information. The stroke volume quantification is compared using measuring planes in static (3D) vessels, planes with fixed angulation inside dynamic vessels (this corresponds to the common 2D PC-MRI) and moving planes inside dynamic vessels. Seven datasets with different pathologies such as aneurysms and coarctations were evaluated in close collaboration with radiologists. Compared to the experts' manual stroke volume estimations, motion-aware quantification performs, on average, 1.57% better than calculations without motion consideration. The mean difference between stroke volumes obtained with the different methods is 7.82%. Automatically obtained 4D segmentations overlap by 85.75% with manually generated ones. Incorporating motion information in the stroke volume quantification yields slight but not statistically significant improvements. The presented method is feasible for the clinical routine, since computation times are low and
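Whatever plane is used (static, fixed-angulation, or moving), the stroke volume itself is the time integral of net through-plane flow over one cardiac cycle; a minimal sketch using trapezoidal integration on a hypothetical flow curve:

```python
def stroke_volume(flow_ml_per_s, dt_s):
    """Stroke volume (mL): trapezoidal integral of net through-plane flow
    over one cardiac cycle, sampled at uniform temporal positions."""
    return sum((a + b) / 2.0 * dt_s
               for a, b in zip(flow_ml_per_s, flow_ml_per_s[1:]))

# Hypothetical net flow (mL/s) at 20 temporal positions, 50 ms apart;
# small negative values model brief diastolic backflow:
flow = [0, 80, 250, 420, 480, 430, 300, 160, 60, 10,
        -20, -10, 0, 5, 5, 0, 0, 0, 0, 0]
print(round(stroke_volume(flow, 0.05), 1))  # 108.5 mL
```

In a motion-aware pipeline, the flow samples themselves come from a plane that follows the segmented vessel wall and orientation at each temporal position.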

  2. Quantification of hand synovitis in rheumatoid arthritis: Arterial mask subtraction reinforced with mutual information can improve accuracy of pixel-by-pixel time-intensity curve shape analysis in dynamic MRI.

    PubMed

    Kobayashi, Yuto; Kamishima, Tamotsu; Sugimori, Hiroyuki; Ichikawa, Shota; Noguchi, Atsushi; Kono, Michihito; Iiyama, Toshitake; Sutherland, Kenneth; Atsumi, Tatsuya

    2018-03-01

    Synovitis, which is a hallmark of rheumatoid arthritis (RA), needs to be precisely quantified to determine the treatment plan. Time-intensity curve (TIC) shape analysis is an objective assessment method for characterizing the pixels as artery, inflamed synovium, or other tissues using dynamic contrast-enhanced MRI (DCE-MRI). To assess the feasibility of our original arterial mask subtraction method (AMSM) with mutual information (MI) for quantification of synovitis in RA. Prospective study. Ten RA patients (nine women and one man; mean age, 56.8 years; range, 38-67 years). 3T/DCE-MRI. After optimization of TIC shape analysis to the hand region, a combination of TIC shape analysis and AMSM was applied to synovial quantification. The MI between pre- and postcontrast images was utilized to determine the arterial mask phase objectively, which was compared with human subjective selection. The volume of objectively measured synovitis by software was compared with that of manual outlining by an experienced radiologist. Simple TIC shape analysis and TIC shape analysis combined with AMSM were compared in slices without synovitis according to subjective evaluation. Pearson's correlation coefficient, paired t-test and intraclass correlation coefficient (ICC). TIC shape analysis was successfully optimized in the hand region with a correlation coefficient of 0.725 (P < 0.01) with the results of manual assessment regarded as ground truth. Objective selection utilizing MI had substantial agreement (ICC = 0.734) with subjective selection. Correlation of synovial volumetry in combination with TIC shape analysis and AMSM with manual assessment was excellent (r = 0.922, P < 0.01). In addition, negative predictive ability in slices without synovitis pixels was significantly increased (P < 0.01). The combination of TIC shape analysis and image subtraction reinforced with MI can accurately quantify synovitis of RA in the hand by eliminating arterial pixels.

  3. Uncertainty Quantification and Certification Prediction of Low-Boom Supersonic Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Reuter, Bryan W.; Walker, Eric L.; Kleb, Bil; Park, Michael A.

    2014-01-01

    The primary objective of this work was to develop and demonstrate a process for accurate and efficient uncertainty quantification and certification prediction of low-boom, supersonic, transport aircraft. High-fidelity computational fluid dynamics models of multiple low-boom configurations were investigated including the Lockheed Martin SEEB-ALR body of revolution, the NASA 69 Delta Wing, and the Lockheed Martin 1021-01 configuration. A nonintrusive polynomial chaos surrogate modeling approach was used for reduced computational cost of propagating mixed, inherent (aleatory) and model-form (epistemic) uncertainty from both the computational fluid dynamics model and the near-field to ground level propagation model. A methodology has also been introduced to quantify the plausibility of a design to pass a certification under uncertainty. Results of this study include the analysis of each of the three configurations of interest under inviscid and fully turbulent flow assumptions. A comparison of the uncertainty outputs and sensitivity analyses between the configurations is also given. The results of this study illustrate the flexibility and robustness of the developed framework as a tool for uncertainty quantification and certification prediction of low-boom, supersonic aircraft.

  4. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging

    PubMed Central

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-01-01

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 − 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising. PMID:27767050

  5. Near-infrared spectroscopy for the detection and quantification of bacterial contaminations in pharmaceutical products.

    PubMed

    Quintelas, Cristina; Mesquita, Daniela P; Lopes, João A; Ferreira, Eugénio C; Sousa, Clara

    2015-08-15

    Accurate detection and quantification of microbiological contaminations remains an issue, mainly due to the lack of rapid and precise analytical techniques. Standard methods are expensive and time-consuming, and are associated with high economic losses and public health threats. In the context of the pharmaceutical industry, the development of fast analytical techniques able to overcome these limitations is crucial, and spectroscopic techniques might constitute a reliable alternative. In this work we proved the ability of Fourier transform near-infrared spectroscopy (FT-NIRS) to detect and quantify bacteria (Bacillus subtilis, Escherichia coli, Pseudomonas fluorescens, Salmonella enterica, Staphylococcus epidermidis) from 10 to 10⁸ CFU/mL in sterile saline solutions (NaCl 0.9%). Partial least squares discriminant analysis (PLSDA) models showed that FT-NIRS was able to discriminate between sterile and contaminated solutions for all bacteria, as well as to identify the contaminant bacteria. Partial least squares (PLS) models allowed bacterial quantification with limits of detection ranging from 5.1 to 9 CFU/mL for E. coli and B. subtilis, respectively. This methodology was successfully validated in three pharmaceutical preparations (contact lens solution, cough syrup and topical anti-inflammatory solution), proving that this technique possesses a high potential to be routinely used for the detection and quantification of bacterial contaminations. Copyright © 2015 Elsevier B.V. All rights reserved.
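
    The limit-of-detection figures quoted above come from calibration statistics. The paper fits multivariate PLS models; as a minimal univariate stand-in (synthetic numbers, not the paper's data), the common LOD = 3.3·σ/slope rule looks like:

```python
import numpy as np

# Univariate stand-in for a spectroscopic calibration: fit a response
# against log10(CFU/mL) and apply the LOD = 3.3 * sigma / slope rule.
# All numbers are synthetic and for illustration only.
rng = np.random.default_rng(1)
log_cfu = np.arange(1.0, 9.0)                 # 10^1 .. 10^8 CFU/mL
response = 0.05 * log_cfu + 0.02 + rng.normal(0.0, 0.002, log_cfu.size)

slope, intercept = np.polyfit(log_cfu, response, 1)
residuals = response - (slope * log_cfu + intercept)
sigma = residuals.std(ddof=2)                 # residual standard error
lod_log = 3.3 * sigma / slope                 # in log10(CFU/mL) units
print(slope, lod_log)
```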

  6. Rapid quantification of plant-powdery mildew interactions by qPCR and conidiospore counts.

    PubMed

    Weßling, Ralf; Panstruga, Ralph

    2012-08-31

    The powdery mildew disease represents a valuable patho-system to study the interaction between plant hosts and obligate biotrophic fungal pathogens. Numerous discoveries have been made on the basis of the quantitative evaluation of plant-powdery mildew interactions, especially in the context of hyper-susceptible and/or resistant plant mutants. However, the presently available methods to score the pathogenic success of powdery mildew fungi are laborious and thus not well suited for medium- to high-throughput analysis. Here we present two new protocols that allow the rapid quantitative assessment of powdery mildew disease development. One procedure depends on quantitative polymerase chain reaction (qPCR)-based evaluation of fungal biomass, while the other relies on the quantification of fungal conidiospores. We validated both techniques using the powdery mildew pathogen Golovinomyces orontii on a set of hyper-susceptible and resistant Arabidopsis thaliana mutants and found that both cover a wide dynamic range of one to two (qPCR) and four to five (quantification of conidia) orders of magnitude, respectively. The two approaches yield reproducible results and are easy to perform without specialized equipment. The qPCR and spore count assays rapidly and reproducibly quantify powdery mildew pathogenesis. Our methods are performed at later stages of infection and discern mutant phenotypes accurately. The assays therefore complement currently used procedures of powdery mildew quantification and can overcome some of their limitations. In addition, they can easily be adapted to other plant-powdery mildew patho-systems.
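
    The qPCR readout described above reduces to the standard relative-quantification arithmetic: normalize the fungal signal to a host reference, then compare a test genotype to the reference genotype. A hypothetical sketch (Ct values are invented, and ~100% amplification efficiency is assumed):

```python
# Hedged sketch of qPCR-based relative fungal biomass via the standard
# 2^-ddCt method. Ct values are illustrative, not from the paper.
def relative_biomass(ct_fungus, ct_plant, ct_fungus_ref, ct_plant_ref):
    """Fold change of the fungal/plant DNA ratio in a test genotype
    relative to a reference genotype (e.g. wild type)."""
    d_ct_sample = ct_fungus - ct_plant      # normalize to host DNA
    d_ct_ref = ct_fungus_ref - ct_plant_ref
    return 2.0 ** -(d_ct_sample - d_ct_ref)

# Hyper-susceptible mutant: the fungal target crosses threshold 2 cycles
# earlier than in wild type at equal host signal -> ~4-fold more fungus.
fold = relative_biomass(20.0, 18.0, 22.0, 18.0)
print(fold)  # 4.0
```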

  7. In-Gel Stable-Isotope Labeling (ISIL): a strategy for mass spectrometry-based relative quantification.

    PubMed

    Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C

    2006-01-01

    Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
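
    The final LC-MS step described above reduces each peptide to a light/heavy intensity ratio; a toy sketch of that summarization (intensities are illustrative):

```python
import numpy as np

# Hedged sketch of the ISIL readout: each peptide yields a light/heavy
# isotope peak pair, and relative abundance is summarized by the median
# log2 ratio across pairs. Intensities below are illustrative.
light = np.array([1.2e6, 8.0e5, 2.4e6])  # condition A (light label)
heavy = np.array([6.0e5, 4.0e5, 1.2e6])  # condition B (heavy label)

log2_ratio = np.median(np.log2(light / heavy))
print(log2_ratio)  # 1.0 -> two-fold higher in condition A
```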

  8. (99m)Tc-Annexin A5 quantification of apoptotic tumor response: a systematic review and meta-analysis of clinical imaging trials.

    PubMed

    Belhocine, Tarik Z; Blankenberg, Francis G; Kartachova, Marina S; Stitt, Larry W; Vanderheyden, Jean-Luc; Hoebers, Frank J P; Van de Wiele, Christophe

    2015-12-01

    (99m)Tc-Annexin A5 has been used as a molecular imaging probe for the visualization, characterization and measurement of apoptosis. In an effort to define the quantitative (99m)Tc-annexin A5 uptake criteria that best predict tumor response to treatment, we performed a systematic review and meta-analysis of the results of all clinical imaging trials found in the literature or publicly available databases. Included in this review were 17 clinical trials investigating quantitative (99m)Tc-annexin A5 (qAnx5) imaging using different parameters in cancer patients before and after the first course of chemotherapy and/or radiation therapy. Qualitative assessment of the clinical studies for diagnostic accuracy was performed using the QUADAS-2 criteria. Of these studies, five prospective single-center clinical trials (92 patients in total) were included in the meta-analysis after exclusion of one multicenter clinical trial due to heterogeneity. Pooled positive predictive values (PPV) and pooled negative predictive values (NPV) (with 95% CI) were calculated using Meta-Disc software version 1.4. Absolute quantification and/or relative quantification of (99m)Tc-annexin A5 uptake were performed at baseline and after the start of treatment. Various quantitative parameters have been used for the calculation of (99m)Tc-annexin A5 tumor uptake and delta (Δ) tumor changes post-treatment compared to baseline including: tumor-to-background ratio (TBR), ΔTBR, tumor-to-noise ratio, relative tumor ratio (TR), ΔTR, standardized tumor uptake ratio (STU), ΔSTU, maximum count per pixel within the tumor volume (Cmax), Cmax%, absolute ΔU and percentage (ΔU%), maximum ΔU counts, semiquantitative visual scoring, percent injected dose (%ID) and %ID/cm(3). Clinical trials investigating qAnx5 imaging have included patients with lung cancer, lymphoma, breast cancer, head and neck cancer and other less common tumor types. In two phase I/II single-center clinical trials, an increase of ≥25% in

  9. The use of self-quantification systems for personal health information: big data management activities and prospects.

    PubMed

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance

  10. Statistical evaluation of vibration analysis techniques

    NASA Technical Reports Server (NTRS)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
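
    The detection-performance quantification described, probability of detection versus probability of false alarm, can be sketched by sweeping a threshold over detector scores from healthy and faulted trials (the Gaussian scores are synthetic, not the study's data):

```python
import numpy as np

# Hedged sketch: sweep a decision threshold over detector scores and
# compute probability of detection (Pd) vs. probability of false alarm
# (Pfa), i.e. an empirical ROC curve.
def roc(scores_fault, scores_healthy, thresholds):
    pd = np.array([(scores_fault >= t).mean() for t in thresholds])
    pfa = np.array([(scores_healthy >= t).mean() for t in thresholds])
    return pd, pfa

rng = np.random.default_rng(2)
healthy = rng.normal(0.0, 1.0, 1000)  # vibration metric, no fault
faulted = rng.normal(2.0, 1.0, 1000)  # same metric with a seeded fault

thresholds = np.linspace(-3.0, 5.0, 81)
pd, pfa = roc(faulted, healthy, thresholds)
# At a mid-range threshold, Pd should be high while Pfa stays low.
print(pd[40], pfa[40])
```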

  11. Clinical applications of MS-based protein quantification.

    PubMed

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and nowadays are already commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is rather rare up to now despite excellent analytical specificity and good sensitivity. Here, we want to give an overview over current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach will be discussed with focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Quantification and Formalization of Security

    DTIC Science & Technology

    2010-02-01

    Quantification of Information Flow … 30; 2.4 Language Semantics … 46 … system behavior observed by users holding low clearances. This policy, or a variant of it, is enforced by many programming language-based mechanisms … illustrates with a particular programming language (while-programs plus probabilistic choice). The model is extended in §2.5 to programs in which …

  13. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
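
    A toy version of the Poisson ingredient in such a framework: treat summed peptide ion counts under two conditions as Poisson, place conjugate Gamma priors on the rates, and report a credible interval for the log2 abundance ratio. This is a hedged illustration only; the full framework additionally models noise, assignment priors, and outlier reweighting, and the counts below are invented.

```python
import numpy as np

# Conjugate toy model: Poisson counts with Gamma(1, 1) priors give
# Gamma(n + 1, 1) posteriors for the rates; sample them to get a
# credible interval for the log2 ratio.
rng = np.random.default_rng(3)
counts_a, counts_b = 1200, 600  # summed ion counts (toy numbers)

rate_a = rng.gamma(counts_a + 1, 1.0, 20000)
rate_b = rng.gamma(counts_b + 1, 1.0, 20000)
log2_ratio = np.log2(rate_a / rate_b)

lo, hi = np.percentile(log2_ratio, [2.5, 97.5])
print(lo, hi)  # 95% credible interval, brackets log2(1200/600) = 1
```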

  14. Identification and quantification analysis of nonlinear dynamics properties of combustion instability in a diesel engine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Li-Ping, E-mail: yangliping302@hrbeu.edu.cn; Ding, Shun-Liang; Song, En-Zhe

    The cycling combustion instabilities in a diesel engine have been analyzed based on chaos theory. The objective was to investigate the dynamical characteristics of combustion in a diesel engine. In this study, experiments were performed over the entire operating range of a diesel engine (the engine speed was varied from 600 to 1400 rpm and the engine load rate from 0% to 100%), and real-time series of in-cylinder combustion pressure were acquired using a piezoelectric transducer installed on the cylinder head. Several methods were applied to identify and quantitatively analyze the complexity of the combustion process in the diesel engine, including delay-coordinate embedding, recurrence plot (RP), Recurrence Quantification Analysis, correlation dimension (CD), and largest Lyapunov exponent (LLE) estimation. The results show that the combustion process exhibits some determinism: the LLE is positive, and the system has a fractal dimension, with CD no more than 1.6 within the diesel engine operating range. We have concluded that the combustion system of a diesel engine is a low-dimensional chaotic system and that the maximum values of CD and LLE occur at the lowest engine speed and load. This means that the combustion system is then most complex and most sensitive to initial conditions, and that poor combustion quality leads to decreased fuel economy and increased exhaust emissions.

  15. Identification and quantification analysis of nonlinear dynamics properties of combustion instability in a diesel engine.

    PubMed

    Yang, Li-Ping; Ding, Shun-Liang; Litak, Grzegorz; Song, En-Zhe; Ma, Xiu-Zhen

    2015-01-01

    The cycling combustion instabilities in a diesel engine have been analyzed based on chaos theory. The objective was to investigate the dynamical characteristics of combustion in a diesel engine. In this study, experiments were performed over the entire operating range of a diesel engine (the engine speed was varied from 600 to 1400 rpm and the engine load rate from 0% to 100%), and real-time series of in-cylinder combustion pressure were acquired using a piezoelectric transducer installed on the cylinder head. Several methods were applied to identify and quantitatively analyze the complexity of the combustion process in the diesel engine, including delay-coordinate embedding, recurrence plot (RP), Recurrence Quantification Analysis, correlation dimension (CD), and largest Lyapunov exponent (LLE) estimation. The results show that the combustion process exhibits some determinism: the LLE is positive, and the system has a fractal dimension, with CD no more than 1.6 within the diesel engine operating range. We have concluded that the combustion system of a diesel engine is a low-dimensional chaotic system and that the maximum values of CD and LLE occur at the lowest engine speed and load. This means that the combustion system is then most complex and most sensitive to initial conditions, and that poor combustion quality leads to decreased fuel economy and increased exhaust emissions.
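
    Since Recurrence Quantification Analysis is the thread running through these records, a compact sketch of the pipeline named above may be useful: delay-coordinate embedding, a thresholded recurrence matrix, and two standard RQA measures, recurrence rate (RR) and determinism (DET). The embedding parameters and radius are illustrative choices, not those used in the engine study.

```python
import numpy as np

# Hedged RQA sketch: embed, threshold pairwise distances into a
# recurrence matrix, then compute RR and DET.
def embed(x, dim, delay):
    n = len(x) - (dim - 1) * delay
    return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

def rqa(x, dim=3, delay=2, radius=0.3, lmin=2):
    v = embed(np.asarray(x, dtype=float), dim, delay)
    dist = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=2)
    rp = dist <= radius                 # recurrence matrix
    n = len(v)
    rr = rp.sum() / n**2                # recurrence rate
    # DET: fraction of recurrent points on diagonal lines of length >= lmin.
    diag_points = 0
    for k in range(-(n - 1), n):
        run = 0
        for hit in list(np.diagonal(rp, k)) + [False]:
            if hit:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    det = diag_points / rp.sum() if rp.sum() else 0.0
    return rr, det

# A noise-free sine is strongly deterministic, so DET should be near 1.
t = np.linspace(0.0, 8.0 * np.pi, 200)
rr, det = rqa(np.sin(t))
print(rr, det)
```

    For a periodic signal, nearly all recurrent points fall on diagonal line structures, so DET approaches 1; for white noise it drops sharply, which is what makes DET a determinism measure.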

  16. Quantification of susceptibility change at high-concentrated SPIO-labeled target by characteristic phase gradient recognition.

    PubMed

    Zhu, Haitao; Nie, Binbin; Liu, Hua; Guo, Hua; Demachi, Kazuyuki; Sekino, Masaki; Shan, Baoci

    2016-05-01

    Phase map cross-correlation detection and quantification may produce highlighted signal at superparamagnetic iron oxide nanoparticles, and distinguish them from other hypointensities. The method may quantify susceptibility change by performing least squares analysis between a theoretically generated magnetic field template and an experimentally scanned phase image. Because characteristic phase recognition requires the removal of phase wrap and phase background, the additional steps of phase unwrapping and filtering may increase the chance of computational error and enlarge the inconsistency among algorithms. To solve this problem, a phase gradient cross-correlation and quantification method was developed that recognizes the characteristic phase gradient pattern instead of the phase image, because the phase gradient operation inherently includes unwrapping and filtering functions. However, few studies have mentioned the detectable limit of currently used phase gradient calculation algorithms. The limit may lead to an underestimation of large magnetic susceptibility changes caused by high-concentrated iron accumulation. In this study, mathematical derivation points out the value of the maximum detectable phase gradient calculated by the differential chain algorithm in both the spatial and Fourier domains. To break through the limit, a modified quantification method is proposed that uses unwrapped forward differentiation for phase gradient generation. The method enlarges the detectable range of phase gradient measurement and avoids the underestimation of magnetic susceptibility. Simulation and phantom experiments were used to quantitatively compare the different methods. The in vivo application performed MRI scanning on nude mice implanted with iron-labeled human cancer cells. The results validate the limit of detectable phase gradient and the consequent susceptibility underestimation. The results also demonstrate the advantage of unwrapped forward differentiation compared with differential chain algorithms for susceptibility
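
    The detectable limit discussed above can be demonstrated numerically: a differential-chain gradient, which re-wraps each forward difference into (-π, π], aliases any true per-voxel gradient larger than π, which is what drives the susceptibility underestimation. The sketch below shows only this aliasing; the paper's remedy (unwrapping before forward differentiation) is not reproduced, and the numbers are illustrative:

```python
import numpy as np

# Illustrative demonstration (not the paper's code) of the differential
# chain limit: a true per-voxel phase gradient above pi is aliased into
# (-pi, pi] when computed from the wrapped phase.
true_grad = 3.5                                 # rad/voxel, exceeds pi
true_phase = np.cumsum(np.full(50, true_grad))  # steep linear phase ramp
wrapped = np.angle(np.exp(1j * true_phase))     # phase as measured

# Differential chain: difference the wrapped phase, re-wrap the result.
chain_grad = np.angle(np.exp(1j * np.diff(wrapped)))
print(chain_grad[0])  # 3.5 - 2*pi ~ -2.78, not the true 3.5
```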

  17. Use of Multiple Competitors for Quantification of Human Immunodeficiency Virus Type 1 RNA in Plasma

    PubMed Central

    Vener, Tanya; Nygren, Malin; Andersson, AnnaLena; Uhlén, Mathias; Albert, Jan; Lundeberg, Joakim

    1998-01-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing identical PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens. PMID:9650926
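
    The internal calibration described above, known competitor copy numbers against their measured peak areas, amounts to a per-reaction standard curve from which the viral peak is read off. A hypothetical numeric sketch (all values are illustrative, not from the assay):

```python
import numpy as np

# Hedged sketch of a per-tube internal standard curve: log copies vs.
# log peak area for the co-amplified competitors, then interpolation of
# the viral peak. Numbers are invented for illustration.
competitor_copies = np.array([1e2, 1e3, 1e4, 1e5])
competitor_areas = np.array([80.0, 800.0, 8000.0, 80000.0])  # proportional

slope, intercept = np.polyfit(np.log10(competitor_areas),
                              np.log10(competitor_copies), 1)
viral_area = 2400.0
viral_copies = 10 ** (slope * np.log10(viral_area) + intercept)
print(round(viral_copies))  # 3000 copies/reaction
```

    Because the calibration is rebuilt inside every reaction, tube-to-tube amplification differences cancel, which is the point of co-amplifying the competitors.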

  18. Development of a Low-Cost Stem-Loop Real-Time Quantification PCR Technique for EBV miRNA Expression Analysis.

    PubMed

    Bergallo, Massimiliano; Merlino, Chiara; Montin, Davide; Galliano, Ilaria; Gambarino, Stefano; Mareschi, Katia; Fagioli, Franca; Montanari, Paola; Martino, Silvana; Tovo, Pier-Angelo

    2016-09-01

    MicroRNAs (miRNAs) are short, single-stranded, non-coding RNA molecules. They are produced by many different species and are key regulators of several physiological processes. miRNAs are also encoded by the genomes of multiple virus families, such as the herpesvirus family. In particular, miRNAs from Epstein–Barr virus (EBV) were found at high concentrations in different associated pathologies, such as Burkitt's lymphoma, Hodgkin disease, and nasopharyngeal carcinoma. Thanks to their stability, these molecules could possibly serve as biomarkers for EBV-associated diseases. In this study, a stem-loop real-time PCR assay for the detection and quantification of the EBV miRNAs miR-BART2-5p, miR-BART15, and miR-BART22 has been developed. Evaluation of these miRNAs in 31 serum samples (12 from patients affected by primary immunodeficiency, 9 from X-linked agammaglobulinemia, and 10 from healthy subjects) has been carried out. The amplification performance showed a wide dynamic range (10⁸–10² copies/reaction) and a sensitivity of 10² copies/reaction for all the targets tested. Serum sample analysis, on the other hand, showed a statistically significant higher level of miR-BART22 in primary immunodeficiency patients (P = 0.0001) compared to the other groups and targets. The results confirmed the potential use of this assay as a tool for monitoring EBV-associated disease and for miRNA expression profile analysis.
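
    Absolute quantification over the dynamic range reported above rests on a standard curve in which Ct is linear in log10(copies); unknowns are then interpolated from the fit. A hypothetical sketch (the ideal slope of about -3.32 corresponds to ~100% efficiency; values are illustrative):

```python
import numpy as np

# Hedged sketch of a real-time PCR standard curve: Ct vs. log10(copies)
# over 10^2..10^8 copies/reaction, with unknowns read off the line.
std_copies = 10.0 ** np.arange(2, 9)            # 10^2 .. 10^8
std_ct = -3.32 * np.log10(std_copies) + 40.0    # ideal ~100% efficiency

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

def copies_from_ct(ct):
    return 10 ** ((ct - intercept) / slope)

print(copies_from_ct(26.72))  # ~10^4 copies/reaction
```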

  19. Quantification of fossil fuel CO2 at the building/street level for large US cities

    NASA Astrophysics Data System (ADS)

    Gurney, K. R.; Razlivanov, I. N.; Song, Y.

    2012-12-01

    Quantification of fossil fuel CO2 emissions from the bottom-up perspective is a critical element in emerging plans for a global, integrated, carbon monitoring system (CMS). A space/time-explicit emissions data product can act as both a verification and a planning system: it can verify atmospheric CO2 measurements (in situ and remote) and offer detailed mitigation information to management authorities in order to optimize the mix of mitigation efforts. Here, we present the Hestia Project, an effort aimed at building a high-resolution (e.g., building- and road-link-specific, hourly) fossil fuel CO2 emissions data product for the urban domain as a pilot effort for a CMS. A complete data product has been built for the city of Indianapolis, and preliminary quantification has been completed for Los Angeles and Phoenix (see figure). The effort in Indianapolis is now part of a larger effort aimed at a convergent top-down/bottom-up assessment of greenhouse gas emissions, called INFLUX. Our urban-level quantification relies on a mixture of data and modeling structures. We start with the sector-specific Vulcan Project estimate at the mix of geocoded and county-wide levels. The Hestia aim is to distribute the Vulcan result in space and time. Two components take the majority of effort: buildings and onroad emissions. In collaboration with our INFLUX colleagues, we are transporting these high-resolution emissions through an atmospheric transport model for a forward comparison of the Hestia data product with atmospheric measurements collected on aircraft and cell towers. In preparation for a formal urban-scale inversion, these forward comparisons offer insights into improving both our emissions data product and measurement strategies. A key benefit of the approach taken in this study is the tracking and archiving of fuel- and process-level detail (e.g., combustion process, other pollutants), allowing for a more thorough understanding and analysis of energy throughputs in the urban

  20. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    PubMed

    Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji

    2012-01-01

    In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) PAM were employed to detect, map, and quantify the SWCNTs in a variety of histological specimens, and the results were compared with other optical techniques (bright-field optical microscopy, Raman microscopy, and near-infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  1. Aerospace reliability applied to biomedicine.

    NASA Technical Reports Server (NTRS)

    Lalli, V. R.; Vargo, D. J.

    1972-01-01

    An analysis is presented that indicates that the reliability and quality assurance methodology selected by NASA to minimize failures in aerospace equipment can be applied directly to biomedical devices to improve hospital equipment reliability. The Space Electric Rocket Test project is used as an example of NASA application of reliability and quality assurance (R&QA) methods. By analogy a comparison is made to show how these same methods can be used in the development of transducers, instrumentation, and complex systems for use in medicine.

  2. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  3. Recent advances in stable isotope labeling based techniques for proteome relative quantification.

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2014-10-24

    The large scale relative quantification of all proteins expressed in biological samples under different states is of great importance for discovering proteins with important biological functions, as well as for screening disease-related biomarkers and drug targets. Therefore, the accurate quantification of proteins at the proteome level has become one of the key issues in protein science. Herein, the recent advances in stable isotope labeling based techniques for proteome relative quantification are reviewed, from the aspects of metabolic labeling, chemical labeling and enzyme-catalyzed labeling. Furthermore, future research directions in this field are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, S. B.; Garcia, K. M.; Sargsyan, A. E.; Hamilton, D. R.; Berggren, M. D.; Antonsen, E.; Ebert, D.

    2011-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts, and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in several astronauts. This phenomenon is known to affect some terrestrial patient populations, and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT), or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semi-quantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology for posterior globe flattening.

  5. Quantification of Posterior Globe Flattening: Methodology Development and Validation

    NASA Technical Reports Server (NTRS)

    Lumpkins, Sarah B.; Garcia, Kathleen M.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Berggren, Michael D.; Ebert, Douglas

    2012-01-01

    Microgravity exposure affects visual acuity in a subset of astronauts and mechanisms may include structural changes in the posterior globe and orbit. Particularly, posterior globe flattening has been implicated in the eyes of several astronauts. This phenomenon is known to affect some terrestrial patient populations and has been shown to be associated with intracranial hypertension. It is commonly assessed by magnetic resonance imaging (MRI), computed tomography (CT) or B-mode ultrasound (US), without consistent objective criteria. NASA uses a semiquantitative scale of 0-3 as part of eye/orbit MRI and US analysis for occupational monitoring purposes. The goal of this study was to initiate development of an objective quantification methodology to monitor small changes in posterior globe flattening.

  6. Quantification of DNA using the luminescent oxygen channeling assay.

    PubMed

    Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S

    2000-09-01

    Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI(TM)) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
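
    The real-time format above relies on the standard linear relationship between threshold cycle and the log of the starting target number. A minimal sketch of that calibration step, using a hypothetical six-log dilution series (illustrative values, not data from the study):

```python
import math

def fit_standard_curve(copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(cts) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, cts))
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

def copies_from_ct(ct, slope, intercept):
    """Invert the calibration line to estimate the starting copy number."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical dilution series: target copies vs. observed threshold cycle
copies = [10 ** k for k in range(2, 8)]
cts = [35.1, 31.8, 28.4, 25.0, 21.7, 18.3]
slope, intercept = fit_standard_curve(copies, cts)
estimate = copies_from_ct(28.4, slope, intercept)  # on the order of 1e4 copies
```

    A slope near -3.3 cycles per decade corresponds to near-perfect doubling per cycle, which is why a six-log linear range is achievable in this format.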

  7. Setting Standards for Reporting and Quantification in Fluorescence-Guided Surgery.

    PubMed

    Hoogstins, Charlotte; Burggraaf, Jan Jaap; Koller, Marjory; Handgraaf, Henricus; Boogerd, Leonora; van Dam, Gooitzen; Vahrmeijer, Alexander; Burggraaf, Jacobus

    2018-05-29

    Intraoperative fluorescence imaging (FI) is a promising technique that could potentially guide oncologic surgeons toward more radical resections and thus improve clinical outcome. Despite the increase in the number of clinical trials, fluorescent agents and imaging systems for intraoperative FI, a standardized approach for imaging system performance assessment and post-acquisition image analysis is currently unavailable. We conducted a systematic, controlled comparison between two commercially available imaging systems using a novel calibration device for FI systems and various fluorescent agents. In addition, we analyzed fluorescence images from previous studies to evaluate signal-to-background ratio (SBR) and determinants of SBR. Using the calibration device, imaging system performance could be quantified and compared, exposing relevant differences in sensitivity. Image analysis demonstrated a profound influence of background noise and the selection of the background on SBR. In this article, we suggest clear approaches for the quantification of imaging system performance assessment and post-acquisition image analysis, attempting to set new standards in the field of FI.
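
    The influence of background selection on SBR is easy to see numerically: the same signal region yields very different ratios depending on which background region is chosen. A toy illustration (all pixel values hypothetical):

```python
def signal_to_background_ratio(signal_pixels, background_pixels):
    """SBR = mean signal intensity / mean background intensity."""
    mean_signal = sum(signal_pixels) / len(signal_pixels)
    mean_background = sum(background_pixels) / len(background_pixels)
    return mean_signal / mean_background

tumor_roi = [200.0, 200.0, 200.0, 200.0, 220.0, 220.0, 220.0, 220.0]
background_adjacent = [70.0, 72.0, 68.0, 70.0]   # tissue next to the lesion
background_distant = [30.0, 32.0, 28.0, 30.0]    # darker region far away
sbr_adjacent = signal_to_background_ratio(tumor_roi, background_adjacent)
sbr_distant = signal_to_background_ratio(tumor_roi, background_distant)
```

    The identical lesion reports an SBR of 3 or 7 depending purely on the background choice, which is the standardization problem the article raises.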

  8. Breast density quantification using magnetic resonance imaging (MRI) with bias field correction: A postmortem study

    PubMed Central

    Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q.; Ducote, Justin L.; Su, Min-Ying; Molloi, Sabee

    2013-01-01

    Purpose: Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, the field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. Methods: T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left–right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. The breast densities measured with the three methods were then compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of the bias field. Results: The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left–right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left–right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, Pearson's r increased from 0.86 to 0.92 with the bias field correction.
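
    Left–right agreement here is summarized with Pearson's r. As a reminder of what that computation entails, a minimal sketch on hypothetical paired %FGV values (not the study's data):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical paired left/right %FGV measurements for four breast pairs
left_fgv = [10.0, 20.0, 30.0, 40.0]
right_fgv = [12.0, 19.0, 31.0, 42.0]
r = pearson_r(left_fgv, right_fgv)
```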

  9. Breast density quantification using magnetic resonance imaging (MRI) with bias field correction: a postmortem study.

    PubMed

    Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q; Ducote, Justin L; Su, Min-Ying; Molloi, Sabee

    2013-12-01

    Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, the field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left-right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. The breast densities measured with the three methods were then compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of the bias field. The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left-right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left-right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, Pearson's r increased from 0.86 to 0.92 with the bias field correction.

  10. Quantification of Human Fecal Bifidobacterium Species by Use of Quantitative Real-Time PCR Analysis Targeting the groEL Gene

    PubMed Central

    Junick, Jana

    2012-01-01

    Quantitative real-time PCR assays targeting the groEL gene for the specific enumeration of 12 human fecal Bifidobacterium species were developed. The housekeeping gene groEL (HSP60 in eukaryotes) was used as a discriminative marker for the differentiation of Bifidobacterium adolescentis, B. angulatum, B. animalis, B. bifidum, B. breve, B. catenulatum, B. dentium, B. gallicum, B. longum, B. pseudocatenulatum, B. pseudolongum, and B. thermophilum. The bifidobacterial chromosome contains a single copy of the groEL gene, allowing the determination of the cell number by quantification of the groEL copy number. Real-time PCR assays were validated by comparing fecal samples spiked with known numbers of a given Bifidobacterium species. Independent of the Bifidobacterium species tested, the proportion of groEL copies recovered from fecal samples spiked with 5 to 9 log10 cells/g feces was approximately 50%. The quantification limit was 5 to 6 log10 groEL copies/g feces. The interassay variability was less than 10%, and variability between different DNA extractions was less than 23%. The method developed was applied to fecal samples from healthy adults and full-term breast-fed infants. Bifidobacterial diversity in both adults and infants was low, with mostly ≤3 Bifidobacterium species and B. longum frequently detected. The predominant species in infant and adult fecal samples were B. breve and B. adolescentis, respectively. It was possible to distinguish B. catenulatum and B. pseudocatenulatum. We conclude that the groEL gene is a suitable molecular marker for the specific and accurate quantification of human fecal Bifidobacterium species by real-time PCR. PMID:22307308
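
    Because groEL is single-copy on the bifidobacterial chromosome, a measured copy number maps directly to a cell count once sample mass and the roughly 50% recovery reported for spiked samples are accounted for. A sketch of that conversion with hypothetical measurement values:

```python
import math

def cells_per_gram(groel_copies, sample_grams, recovery=0.5):
    """Single-copy groEL: copies ~ cells. Correct for the ~50% recovery
    reported for spiked fecal samples (inputs below are hypothetical)."""
    return groel_copies / sample_grams / recovery

estimate = cells_per_gram(groel_copies=2.5e8, sample_grams=0.1)
log10_estimate = math.log10(estimate)  # compare against the 5-6 log10 limit
```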

  11. PCR technology for screening and quantification of genetically modified organisms (GMOs).

    PubMed

    Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G

    2003-04-01

    Although PCR technology has obvious limitations, the potentially high degree of sensitivity and specificity explains why it has been the first choice of most analytical laboratories interested in detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of target analyte (e.g. protein or DNA) frequently challenges the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable methods for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of the PCR method. The target sequence is normally a part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wildtype organisms, they may be present in more than one GMO, and their copy number may also vary from one GMO to another. They may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries including those in the European Union. The success of the labelling schemes is dependent upon the efficiency with which GM-derived material can be detected. We will present an overview of currently available PCR methods for screening and quantification of GM-derived DNA, and discuss their applicability and limitations. In addition, we will discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.
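
    One common convention for the LOD/LOQ determination mentioned above (e.g., the ICH-style approach based on calibration data) estimates both from the standard deviation of the blank response and the calibration slope. This is an illustration of that generic convention, not a procedure prescribed by the article:

```python
def lod_loq(sigma_blank, slope):
    """ICH-style estimates from calibration data:
    LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S."""
    return 3.3 * sigma_blank / slope, 10.0 * sigma_blank / slope

# Hypothetical values: SD of the blank response and calibration slope
# (signal units per % GM content)
lod, loq = lod_loq(sigma_blank=0.02, slope=0.5)
```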

  12. Quantification of penicillin G during labor and delivery by capillary electrophoresis.

    PubMed

    Thomas, Andrea; Ukpoma, Omon K; Inman, Jennifer A; Kaul, Anil K; Beeson, James H; Roberts, Kenneth P

    2008-04-24

    In this study, a capillary electrophoresis (CE) method was developed as a means to measure levels of penicillin G (PCN G) in Group B Streptococcus (GBS) positive pregnant women during labor and delivery. Volunteers for this developmental study were administered five million units of PCN G at the onset of labor. Urine, blood, and amniotic fluid samples were collected during labor and post delivery. Samples were semi-purified by solid-phase extraction (SPE) using Waters tC18 SepPak 3cc cartridges with a sodium phosphate/methanol step gradient for elution. Capillary electrophoresis or reversed-phase high-performance liquid chromatography (RP-HPLC) with diode-array absorbance detection were used to separate the samples in less than 30 min. Quantification was accomplished by establishing a calibration curve with a linear dynamic range. The tC18 SPE methodology provided substantial sample clean-up with high recovery yields of PCN G ( approximately 90%). It was found that SPE was critical for maintaining the integrity of the separation column when using RP-HPLC, but was not necessary for sample analysis by CE where no stationary phase is present. Quantification results ranged from millimolar concentrations of PCN G in maternal urine to micromolar concentrations in amniotic fluid. Serum and cord blood levels of PCN G were below quantification limits, which is likely due to the prolonged delay in sample collection after antibiotic administration. These results show that CE can serve as a simple and effective means to characterize the pharmacokinetic distribution of PCN G from mother to unborn fetus during labor and delivery. It is anticipated that similar methodologies have the potential to provide a quick, simple, and cost-effective means of monitoring the clinical efficacy of PCN G and other drugs during pregnancy.

  13. On the complex quantification of risk: systems-based perspective on terrorism.

    PubMed

    Haimes, Yacov Y

    2011-08-01

    This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.

  14. a New Approach for Sediment Balance Quantification and Wind Erosion Monitoring

    NASA Astrophysics Data System (ADS)

    Ouerchefani, Dalel; Callot, Yann; Delaitre, Eric; Abdeljaouad, Saadi

    2014-05-01

    Studies on the spatio-temporal heterogeneity of land surfaces in arid and semi-arid regions in relation to wind erosion are very few. Existing work consists of ad hoc, instantaneous measurements of physical parameters that take little account of aeolian landforms as markers of a changing environment. This is a handicap in the analysis of these spaces, in particular their sedimentary dynamics. Designing methods that capture the specific organization of aeolian landforms and allow their spatio-temporal monitoring is therefore essential: it makes it possible to quantify the annual and seasonal sedimentary budgets of poorly instrumented sites that lack automatic recordings of meteorological variables. In this work, we propose a method for multi-temporal quantification of the sediment balance across a transect, applied and validated at the Oglet Merteba study site. The method has the advantage of linking the amount of sand deposited or eroded to changing surface conditions: areas that appear to be accumulation or deflation zones are delineated and compared with those having a truly positive or negative sedimentary budget. To do this, the linear analysis techniques 'point quadrat' and 'profile leveling' were applied to a 500 m transect, and variables related to aeolian landforms, soil, and vegetation were measured over two years. The results show that the overall balance of Oglet Merteba is positive but subject to important seasonal fluctuations. Areas mapped as accumulation zones may actually be deflation zones, despite indicators suggesting the contrary, and vice versa. This work contributes to the quantification of sedimentary budgets at the site level. When integrated into an observatory approach, it helps harmonize data collection and analysis methods so that a synthesis of the local environment can be produced regularly, in a format that enables comparisons across both space and time.
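
    In its simplest form, the 'profile leveling' idea reduces to integrating the elevation change along the transect between two surveys. A minimal sketch with hypothetical station elevations (not the Oglet Merteba data):

```python
def sediment_budget(stations_m, z_before_m, z_after_m):
    """Net volume change per unit transect width (m^3 per m of width),
    by trapezoidal integration of the elevation difference between
    two leveling surveys; positive means net accumulation."""
    total = 0.0
    for i in range(len(stations_m) - 1):
        dz0 = z_after_m[i] - z_before_m[i]
        dz1 = z_after_m[i + 1] - z_before_m[i + 1]
        total += 0.5 * (dz0 + dz1) * (stations_m[i + 1] - stations_m[i])
    return total

# Hypothetical leveling data for three stations along a transect
stations = [0.0, 100.0, 200.0]
z_before = [10.00, 10.00, 10.00]
z_after = [10.05, 9.98, 10.02]  # local accumulation and deflation
budget = sediment_budget(stations, z_before, z_after)
```

    Note how a transect can show a positive overall budget even though individual segments are deflating, which mirrors the article's point that mapped accumulation and deflation zones can disagree with the measured budget.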

  15. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid-sorbent-based carbon capture system.

  16. A simple dilute and shoot methodology for the identification and quantification of illegal insulin.

    PubMed

    Vanhee, Celine; Janvier, Steven; Moens, Goedele; Deconinck, Eric; Courselle, Patricia

    2016-10-01

    The occurrence of illegal medicines is a well-established global problem and concerns mostly small molecules. However, due to advances in genomics and recombinant expression technologies, the development of polypeptide therapeutics has increased. Insulin is one of the best-known polypeptide drugs, and illegal versions of this medicine have led to lethal incidents in the past. Therefore, it is crucial for the public health sector to develop reliable, efficient, cheap, unbiased, and easily applicable active pharmaceutical ingredient (API) identification and quantification strategies for routine analysis of suspected illegal insulins. Here we demonstrate that our combined label-free full-scan approach is able not only to distinguish between the different versions of insulin and insulins originating from different species, but also to chromatographically separate human insulin and insulin lispro under conditions that are compatible with mass spectrometry (MS). Additionally, we were able to selectively quantify the different insulins, including human insulin and insulin lispro, according to the validation criteria put forward by the United Nations (UN) for the analysis of seized illicit drugs. The proposed identification and quantification method is currently being used in our official medicines control laboratory to analyze insulins retrieved from the illegal market.

  17. The use of self-quantification systems for personal health information: big data management activities and prospects

    PubMed Central

    2015-01-01

    Background: Self-quantification is seen as an emerging paradigm for health-care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying mental, emotional, physical, and social aspects of health in order to gain self-knowledge. However, there has been no systematic approach for conceptualising and mapping the essential activities undertaken by individuals who use SQS to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). The PHI-SQS model describes two types of activities that individuals go through during their journey of self-managed health practice: 'self-quantification' and 'self-activation'. Objectives: In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS, 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and to provide a systematic approach for conceptualising and mapping these processes to individuals' activities. Method: We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife) that collect three key health data types (environmental exposure, physiological patterns, and genetic traits). We investigated the interaction taking place at different data-flow stages between the individual user and the self-quantification technology used. Findings: We found that these eleven tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, individuals experience different processes and activities, which are substantially influenced by the technologies' data management capabilities.

  18. Detection and quantification of genetically modified organisms using very short, locked nucleic acid TaqMan probes.

    PubMed

    Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio

    2008-06-25

    Many countries have introduced mandatory labeling requirements on foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based upon the TaqMan probe chemistry has become the method mostly used to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide long), locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct as well as for the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. Application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.
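
    For quantification, the event-specific and endogenous-reference signals are combined into a GM percentage. A common simplification (an illustration only; the study itself used plasmid standard curves) assumes equal, near-perfect amplification efficiency for both assays, so the percentage follows directly from the Ct difference:

```python
def gmo_percent(ct_event, ct_reference, efficiency=2.0):
    """GM content as event copies relative to endogenous-gene copies,
    assuming both assays share the given amplification efficiency
    (2.0 = perfect doubling per cycle)."""
    return 100.0 * efficiency ** (ct_reference - ct_event)

# Hypothetical Ct values for an event-specific junction assay (e.g. MON810)
# and the maize endogenous reference gene hmga
share = gmo_percent(ct_event=29.0, ct_reference=24.0)  # percent GM content
```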

  19. Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite

    PubMed Central

    Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.

    2012-01-01

    Abstract Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance in obtained protein expression data. For demonstrating the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347
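
    The outlier screening described above can be approximated in spirit by flagging peptide ratios whose log transforms deviate strongly from the mean. This z-score sketch on hypothetical 14N/15N ratios illustrates the idea; it is not FindPairs' actual statistical procedure:

```python
import math
import statistics

def flag_ratio_outliers(ratios, z_cut=2.5):
    """Flag heavy/light abundance ratios whose log2 values lie far
    from the mean, in units of the sample standard deviation."""
    logs = [math.log2(r) for r in ratios]
    mu = statistics.mean(logs)
    sd = statistics.stdev(logs)
    return [abs((v - mu) / sd) > z_cut for v in logs]

# Hypothetical 14N/15N peptide ratios for one protein; the last is aberrant
ratios = [1.00, 1.10, 0.90, 1.05, 0.95, 1.02, 0.98, 1.08, 4.00]
flags = flag_ratio_outliers(ratios)
```

    Working in log space makes up- and down-regulation symmetric, which is why quantification packages typically model log ratios rather than raw ratios.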

  20. qFlow Cytometry-Based Receptoromic Screening: A High-Throughput Quantification Approach Informing Biomarker Selection and Nanosensor Development.

    PubMed

    Chen, Si; Weddell, Jared; Gupta, Pavan; Conard, Grace; Parkin, James; Imoukhuede, Princess I

    2017-01-01

    Nanosensor-based detection of biomarkers can improve medical diagnosis; however, a critical factor in nanosensor development is deciding which biomarker to target, as most diseases present several biomarkers. Biomarker-targeting decisions can be informed via an understanding of biomarker expression. Currently, immunohistochemistry (IHC) is the accepted standard for profiling biomarker expression. While IHC provides a relative mapping of biomarker expression, it does not provide cell-by-cell readouts of biomarker expression or absolute biomarker quantification. Flow cytometry overcomes both these IHC challenges by offering biomarker expression on a cell-by-cell basis, and when combined with calibration standards, providing quantitation of biomarker concentrations: this is known as qFlow cytometry. Here, we outline the key components for applying qFlow cytometry to detect biomarkers within the angiogenic vascular endothelial growth factor receptor family. The key aspects of the qFlow cytometry methodology include: antibody specificity testing, immunofluorescent cell labeling, saturation analysis, fluorescent microsphere calibration, and quantitative analysis of both ensemble and cell-by-cell data. Together, these methods enable high-throughput quantification of biomarker expression.
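
    The microsphere calibration step amounts to regressing known fluorophore numbers on measured bead intensities, then applying the fit to cell fluorescence. A sketch with hypothetical, exactly linear bead data (real calibrations would carry noise and a nonzero background):

```python
def linear_fit(xs, ys):
    """Ordinary least squares: ys ~ slope * xs + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical calibration beads: measured intensity vs. known fluorophore count
bead_intensity = [100.0, 500.0, 2500.0, 12500.0]
bead_molecules = [700.0, 3500.0, 17500.0, 87500.0]
slope, intercept = linear_fit(bead_intensity, bead_molecules)

# Convert a cell's mean fluorescence intensity (hypothetical MFI of 3000)
# into an absolute receptor count per cell
receptors_per_cell = slope * 3000.0 + intercept
```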

  1. Satellite Re-entry Modeling and Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Horsley, M.

    2012-09-01

    LEO trajectory modeling is a fundamental aerospace capability and has applications in many areas of aerospace, such as maneuver planning, sensor scheduling, re-entry prediction, collision avoidance, risk analysis, and formation flying. Somewhat surprisingly, modeling the trajectory of an object in low Earth orbit is still a challenging task. This is primarily due to the large uncertainty in the upper atmospheric density, about 15-20% (1-sigma) for most thermosphere models. Other contributions come from our inability to precisely model future solar and geomagnetic activities, the potentially unknown shape, material construction and attitude history of the satellite, and intermittent, noisy tracking data. Current methods to predict a satellite's re-entry trajectory typically involve making a single prediction, with the uncertainty dealt with in an ad-hoc manner, usually based on past experience. However, due to the extreme speed of a LEO satellite, even small uncertainties in the re-entry time translate into a very large uncertainty in the location of the re-entry event. Currently, most methods simply update the re-entry estimate on a regular basis. This results in a wide range of estimates that are literally spread over the entire globe. With no understanding of the underlying distribution of potential impact points, the sequence of impact points predicted by the current methodology are largely useless until just a few hours before re-entry. This paper will discuss the development of a set of the High Performance Computing (HPC)-based capabilities to support near real-time quantification of the uncertainty inherent in uncontrolled satellite re-entries. An appropriate management of the uncertainties is essential for a rigorous treatment of the re-entry/LEO trajectory problem. The development of HPC-based tools for re-entry analysis is important as it will allow a rigorous and robust approach to risk assessment by decision makers in an operational setting. 
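
    The claim that predicted impact points spread over the entire globe follows from simple geometry: each orbital period sweeps a full circuit of ground track, so even a modest re-entry-time uncertainty smears the prediction across many orbits. A Monte Carlo sketch of this effect (all numbers hypothetical; Earth rotation and inclination are deliberately ignored):

```python
import random

def impact_longitudes(t_nominal_hr, sigma_hr, n=1000, period_min=90.0, seed=1):
    """Map Gaussian re-entry-time samples to a ground-track angle in degrees
    (measured from an arbitrary reference): each orbital period sweeps 360."""
    rng = random.Random(seed)
    longitudes = []
    for _ in range(n):
        t_hr = rng.gauss(t_nominal_hr, sigma_hr)
        orbit_fraction = (t_hr * 60.0 / period_min) % 1.0
        longitudes.append(orbit_fraction * 360.0)
    return longitudes

# Hypothetical case: re-entry predicted 48 h out with a 2 h (1-sigma) error;
# 2 h is more than one full 90-minute orbit, so samples wrap the globe
longs = impact_longitudes(48.0, 2.0)
spread = max(longs) - min(longs)
```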

  2. Quantification of ventricular resynchronization reserve by radionuclide phase analysis in heart failure patients: a prospective long-term study.

    PubMed

    Dauphin, Raphael; Nonin, Emilie; Bontemps, Laurence; Vincent, Madeleine; Pinel, Alain; Bonijoly, Serge; Barborier, Denis; Ribier, Arnaud; Fernandes, Christine Mestre; Bert-Marcaz, Patrick; Itti, Roland; Chevalier, Philippe

    2011-03-01

    Phase analysis, developed to assess dyssynchrony from ECG-gated radionuclide ventriculography, has shown promising results. We hypothesized that quantifying the cardiac resynchronization reserve, that is, the extent of response to cardiac resynchronization therapy (CRT), by radionuclide imaging could potentially identify patients who are best suited for CRT. Seventy-four patients aged 64.8±10.1 years were prospectively studied from July 2004 to July 2006, of whom 62.2% and 37.8%, respectively, were in New York Heart Association class 3 and 4. Mean QRS width was 173±25 ms. ECG-gated radionuclide ventriculography to quantify interventricular and intraventricular dyssynchrony was performed at baseline with and without CRT and at the 3-month follow-up visit. Amino-terminal-pro-brain natriuretic peptide (NT-pro-BNP) levels were also determined at baseline and at 3 months. During a mean follow-up of 10.1±7.6 months, there were 37 (50%) clinical events that defined the nonresponder group, including cardiac death or readmission for worsening heart failure. In multivariate Cox model analysis, higher NT-pro-BNP blood levels were associated with a significant increase in the risk for an event (hazard ratio=1.085 for a 100 pg/L increase in NT-pro-BNP; 95% confidence interval, 1.014 to 1.161). Each 10° elevation in intraventricular dyssynchrony was associated with a decrease in the risk of events (hazard ratio=0.456, 95% confidence interval, 0.304 to 0.683). Receiver operating characteristic curve analysis demonstrated that an interventricular dyssynchrony cutoff value of 25.5° for intraventricular synchrony yielded 91.4% sensitivity and 84.4% specificity for predicting a good response to CRT. The quantification of interventricular dyssynchrony with radionuclide phase analysis suggests that early postimplantation interventricular dyssynchrony may provide identification of CRT responders.
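
    Cutoff selection of the kind reported (a dyssynchrony threshold trading off sensitivity and specificity) is typically done by scanning the ROC curve for the maximum Youden index. A sketch on hypothetical phase-analysis values (not the study's cohort):

```python
def youden_cutoff(values, labels):
    """Scan candidate cutoffs (higher value predicts label 1) and return
    the cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    best_cut, best_j = None, -1.0
    for cut in sorted(set(values)):
        tp = sum(1 for v, l in zip(values, labels) if v >= cut and l == 1)
        fn = sum(1 for v, l in zip(values, labels) if v < cut and l == 1)
        tn = sum(1 for v, l in zip(values, labels) if v < cut and l == 0)
        fp = sum(1 for v, l in zip(values, labels) if v >= cut and l == 0)
        j = tp / (tp + fn) + tn / (tn + fp) - 1.0
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut, best_j

# Hypothetical dyssynchrony angles (degrees) and responder labels (1 = responder)
angles = [10, 15, 20, 28, 30, 35, 40]
responders = [0, 0, 0, 1, 1, 1, 1]
cutoff, j = youden_cutoff(angles, responders)
```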

  3. [DNA quantification of blood samples pre-treated with pyramidon].

    PubMed

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated in EDTA. Bloodstains were made on filter paper. Samples were divided into six groups according to storage time after pyramidon pre-treatment: 30 min, 1 h, 3 h, 6 h, 12 h, and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method, and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR. STR typing was detected by PCR-STR fluorescent technology. For a given extraction method, the DNA yield decreased gradually with storage time after pyramidon pre-treatment. At a given storage time, DNA yields differed significantly among the extraction methods. Complete 16-locus STR profiles were obtained in 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction proved the most effective method for DNA extraction and STR profiling.

  4. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap-joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
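The GA-tuned LS-SVM regression the abstract describes can be sketched in a few lines. The following is a minimal illustration only, not the authors' implementation: the three features, the synthetic crack sizes, the GA scheme (elitism plus Gaussian mutation), and all parameter ranges are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, sigma):
    # Gaussian RBF kernel matrix between row-vector sets A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    # Solve the standard LS-SVM linear system:
    # [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    return sol[0], sol[1:]                      # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, sigma, X_new):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def ga_tune(X, y, generations=20, pop=16):
    # Toy GA over (log10 gamma, log10 sigma): keep the better half,
    # mutate it, and score by validation MSE on the last 3 samples.
    def fitness(g):
        gamma, sigma = 10 ** g[0], 10 ** g[1]
        b, alpha = lssvm_fit(X[:-3], y[:-3], gamma, sigma)
        pred = lssvm_predict(X[:-3], b, alpha, sigma, X[-3:])
        return -np.mean((pred - y[-3:]) ** 2)
    genomes = rng.uniform(-2, 2, size=(pop, 2))
    for _ in range(generations):
        scores = np.array([fitness(g) for g in genomes])
        elite = genomes[np.argsort(scores)[-pop // 2:]]
        genomes = np.vstack([elite, elite + rng.normal(0, 0.3, size=elite.shape)])
    best = genomes[np.argmax([fitness(g) for g in genomes])]
    return 10 ** best[0], 10 ** best[1]

# Hypothetical features: [normalized amplitude, phase change, correlation coeff.]
X = rng.uniform(0, 1, size=(30, 3))
crack = 2.0 * X[:, 0] + 1.0 * X[:, 1] - 0.5 * X[:, 2]   # synthetic crack size (mm)
gamma, sigma = ga_tune(X, crack)
b, alpha = lssvm_fit(X, crack, gamma, sigma)
pred = lssvm_predict(X, b, alpha, sigma, X)
```

The closed-form linear system is what distinguishes LS-SVM from standard SVM regression; the GA only searches the two kernel/regularization hyperparameters.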

  5. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage-sensitive features was implemented to evaluate the crack size. The GA was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap-joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  6. Consistency of flow quantifications in tridirectional phase-contrast MRI

    NASA Astrophysics Data System (ADS)

    Unterhinninghofen, R.; Ley, S.; Dillmann, R.

    2009-02-01

    Tridirectionally encoded phase-contrast MRI is a technique to non-invasively acquire time-resolved velocity vector fields of blood flow. These may not only be used to analyze pathological flow patterns, but also to quantify flow at arbitrary positions within the acquired volume. In this paper we examine the validity of this approach by analyzing the consistency of related quantifications instead of comparing them with an external reference measurement. Datasets of the thoracic aorta were acquired from 6 pigs, 1 healthy volunteer, and 3 patients with artificial aortic valves. Using in-house software, an elliptical flow quantification plane was placed manually at 6 positions along the descending aorta, where it was rotated to 5 different angles. For each configuration, flow was computed based on the original data and on data that had been corrected for phase offsets. Results reveal that quantifications depend more on changes in position than on changes in angle. Phase-offset correction considerably reduces this dependency. Overall consistency is good, with a maximum coefficient of variation of 9.9% and a mean coefficient of variation of 7.2%.
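The consistency metric used above (coefficient of variation across plane positions and angles) reduces to simple arithmetic. A sketch with hypothetical flow values, not the study's data:

```python
import numpy as np

# Hypothetical flow measurements: 6 plane positions x 5 plane angles
rng = np.random.default_rng(1)
flow = 60 + rng.normal(0, 3, size=(6, 5))     # e.g. ml/cycle

def cv_percent(x):
    # coefficient of variation = sample std / mean, in percent
    return 100 * np.std(x, ddof=1) / np.mean(x)

# Consistency across angles (per position) vs across positions (per angle)
cv_over_angles = [cv_percent(flow[i, :]) for i in range(6)]
cv_over_positions = [cv_percent(flow[:, j]) for j in range(5)]
overall_cv = cv_percent(flow.ravel())
```

Comparing `cv_over_positions` against `cv_over_angles` is what supports the paper's claim that quantifications depend more on position than on angle.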

  7. A critical view on microplastic quantification in aquatic organisms.

    PubMed

    Vandermeersch, Griet; Van Cauwenberghe, Lisbeth; Janssen, Colin R; Marques, Antonio; Granby, Kit; Fait, Gabriella; Kotterman, Michiel J J; Diogène, Jorge; Bekaert, Karen; Robbens, Johan; Devriese, Lisa

    2015-11-01

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different "hotspot" locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18±0.14 total microplastics g(-1) w.w. for the Acid mix Method and 0.12±0.04 total microplastics g(-1) w.w. for the Nitric acid Method was established. Additionally, in a pilot study an average load of 0.13±0.14 total microplastics g(-1) w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  8. Magnetic Particle Testing, RQA/M1-5330.16.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Huntsville, AL. George C. Marshall Space Flight Center.

    As one in the series of classroom training handbooks, prepared by the U.S. space program, instructional material is presented in this volume concerning familiarization and orientation on magnetic particle testing. The subject is divided under the following headings: Introduction, Principles of Magnetic Particle Testing, Magnetic Particle Test…

  9. Eddy Current Testing, RQA/M1-5330.17.

    ERIC Educational Resources Information Center

    National Aeronautics and Space Administration, Huntsville, AL. George C. Marshall Space Flight Center.

    As one in the series of classroom training handbooks, prepared by the U.S. space program, instructional material is presented in this volume concerning familiarization and orientation on eddy current testing. The subject is presented under the following headings: Introduction, Eddy Current Principles, Eddy Current Equipment, Eddy Current Methods,…

  10. Clarity™ digital PCR system: a novel platform for absolute quantification of nucleic acids.

    PubMed

    Low, Huiyu; Chan, Shun-Jie; Soo, Guo-Hao; Ling, Belinda; Tan, Eng-Lee

    2017-03-01

    In recent years, digital polymerase chain reaction (dPCR) has gained recognition in biomedical research as it provides a platform for precise and accurate quantification of nucleic acids without the need for a standard curve. However, this technology has not yet been as widely adopted as real-time quantitative PCR due to its more cumbersome workflow, which arises from the need to sub-divide a PCR sample into a large number of smaller partitions prior to thermal cycling so that each partition holds zero or at least one copy of the target RNA/DNA. A recently launched platform, the Clarity™ system from JN Medsys, simplifies the dPCR workflow through the use of a novel chip-in-a-tube technology for sample partitioning. In this study, the performance of Clarity™ was evaluated through quantification of the single-copy human RNase P gene. The system demonstrated high precision and accuracy, as well as excellent linearity across a range of over 4 orders of magnitude for the absolute quantification of the target gene. Moreover, consistent DNA copy measurements were also attained using a panel of different probe- and dye-based master mixes, demonstrating the system's compatibility with commercial master mixes. Clarity™ was then compared to the QX100™ droplet dPCR system from Bio-Rad using a set of DNA reference materials, and the copy number concentrations derived from both systems were found to be closely associated. Collectively, the results showed that Clarity™ is a reliable, robust and flexible platform for next-generation genetic analysis.

  11. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
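The standard-curve arithmetic that such absolute quantification approaches rely on can be sketched as follows. The Cq values and dilution series below are hypothetical; the efficiency formula E = 10^(−1/slope) − 1 is the conventional one for qPCR calibration lines.

```python
import numpy as np

# Hypothetical standard curve: Cq measured for a 10-fold dilution series
log10_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
cq = np.array([15.1, 18.4, 21.7, 25.0, 28.3])     # hypothetical Cq values

# Fit Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1 / slope) - 1               # 1.0 means 100% efficiency

def quantify(cq_unknown):
    # Invert the calibration line: log10(copies) = (Cq - intercept) / slope
    return 10 ** ((cq_unknown - intercept) / slope)
```

A slope near −3.32 corresponds to doubling each cycle; departures from it flag poor amplification efficiency in the standards.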

  12. Critical assessment of digital PCR for the detection and quantification of genetically modified organisms.

    PubMed

    Demeke, Tigst; Dobnik, David

    2018-07-01

    The number of genetically modified organisms (GMOs) on the market is steadily increasing. Because of regulation of cultivation and trade of GMOs in several countries, there is pressure for their accurate detection and quantification. Today, DNA-based approaches are more popular for this purpose than protein-based methods, and real-time quantitative PCR (qPCR) is still the gold standard in GMO analytics. However, digital PCR (dPCR) offers several advantages over qPCR, making this new technique appealing also for GMO analysis. This critical review focuses on the use of dPCR for the purpose of GMO quantification and addresses parameters which are important for achieving accurate and reliable results, such as the quality and purity of DNA and reaction optimization. Three critical factors are explored and discussed in more depth: correct classification of partitions as positive, correctly determined partition volume, and dilution factor. This review could serve as a guide for all laboratories implementing dPCR. Most of the parameters discussed are applicable to fields other than purely GMO testing. Graphical abstract: There are generally three different options for absolute quantification of genetically modified organisms (GMOs) using digital PCR: droplet-based, chamber-based, and droplets in chambers. All have in common the distribution of reaction mixture into several partitions, which are all subjected to PCR and scored at the end-point as positive or negative. Based on these results, GMO content can be calculated.
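The three critical factors named above (partition classification, partition volume, dilution factor) all enter the standard Poisson estimate used in dPCR. A minimal sketch; the droplet counts and the 0.85 nL partition volume are hypothetical example values, not figures from the review:

```python
import math

def dpcr_copies_per_ul(n_positive, n_total, partition_volume_ul, dilution=1.0):
    """Poisson estimate of target concentration from dPCR end-point counts.

    lambda = -ln(1 - p) corrects for partitions that hold more than one
    copy; the result is scaled by partition volume and any pre-PCR dilution.
    """
    p = n_positive / n_total
    lam = -math.log(1.0 - p)          # mean copies per partition
    return lam / partition_volume_ul * dilution

# Hypothetical droplet run: 2,000 of 20,000 droplets positive, 0.85 nL droplets
conc = dpcr_copies_per_ul(2000, 20000, partition_volume_ul=0.00085)
```

Misclassified partitions shift `p`, and a biased partition volume or dilution factor scales the result linearly, which is why the review singles these out.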

  13. Simultaneous quantification of the viral antigens hemagglutinin and neuraminidase in influenza vaccines by LC-MSE.

    PubMed

    Creskey, Marybeth C; Li, Changgui; Wang, Junzhi; Girard, Michel; Lorbetskie, Barry; Gravel, Caroline; Farnsworth, Aaron; Li, Xuguang; Smith, Daryl G S; Cyr, Terry D

    2012-07-06

    Current methods for quality control of inactivated influenza vaccines prior to regulatory approval include determining the hemagglutinin (HA) content by single radial immunodiffusion (SRID), verifying neuraminidase (NA) enzymatic activity, and demonstrating that the levels of the contaminant protein ovalbumin are below a set threshold of 1 μg/dose. The SRID assays require the availability of strain-specific reference HA antigens and antibodies, the production of which is a potential rate-limiting step in vaccine development and release, particularly during a pandemic. Immune responses induced by neuraminidase also contribute to protection from infection; however, the amounts of NA antigen in influenza vaccines are currently not quantified or standardized. Here, we report a method for vaccine analysis that yields simultaneous quantification of HA and NA levels much more rapidly than conventional HA quantification techniques, while providing additional valuable information on the total protein content. Enzymatically digested vaccine proteins were analyzed by LC-MS(E), a mass spectrometric technology that allows absolute quantification of analytes, including the HA and NA antigens, other structural influenza proteins and chicken egg proteins associated with the manufacturing process. This method has potential application for increasing the accuracy of reference antigen standards and for validating label claims for HA content in formulated vaccines. It can also be used to monitor NA and chicken egg protein content in order to monitor manufacturing consistency. While this is a useful methodology with potential for broad application, we also discuss herein some of the inherent limitations of this approach and the care and caution that must be taken in its use as a tool for absolute protein quantification. The variations in HA, NA and chicken egg protein concentrations in the vaccines analyzed in this study are indicative of the challenges associated with the current

  14. Residual transglutaminase in collagen - effects, detection, quantification, and removal.

    PubMed

    Schloegl, W; Klein, A; Fürst, R; Leicht, U; Volkmer, E; Schieker, M; Jus, S; Guebitz, G M; Stachel, I; Meyer, M; Wiggenhorn, M; Friess, W

    2012-02-01

    In the present study, we developed an enzyme-linked immunosorbent assay (ELISA) for microbial transglutaminase (mTG) from Streptomyces mobaraensis to overcome the lack of a quantification method for mTG. We further performed a detailed follow-on analysis of insoluble porcine collagen type I enzymatically modified with mTG, primarily focusing on residual mTG. Repeated washing (4×) reduced mTG levels in the washing fluids but did not quantitatively remove mTG from the material (p < 0.000001). Substantial amounts of up to 40% of the enzyme utilized in the crosslinking mixture remained associated with the modified collagen. Binding was non-covalent, as demonstrated by Western blot analysis. Acidic and alkaline dialysis of mTG-treated collagen material enabled complete removal of the enzyme. Treatment with guanidinium chloride, urea, or sodium chloride was less effective in reducing the mTG content.

  15. Large scale systematic proteomic quantification from non-metastatic to metastatic colorectal cancer

    NASA Astrophysics Data System (ADS)

    Yin, Xuefei; Zhang, Yang; Guo, Shaowen; Jin, Hong; Wang, Wenhai; Yang, Pengyuan

    2015-07-01

    A systematic proteomic quantification of formalin-fixed, paraffin-embedded (FFPE) colorectal cancer tissues from stage I to stage IIIC was performed on a large scale. 1017 proteins were identified by a label-free method, 338 of them showing quantitative changes, while 341 of the 6294 proteins quantified by the iTRAQ method showed significant expression changes. We found that, during colorectal cancer development, the expression of migration-related proteins increased while that of binding- and adhesion-related proteins decreased, according to gene ontology (GO) annotation and ingenuity pathway analysis (IPA). We focused on integrin alpha 5 (ITA5) in the integrin family, which was consistent with the metastasis-related pathway. The expression level of ITA5 decreased in metastatic tissues, and this result was further verified by Western blotting. Another two cell-migration-related proteins, vitronectin (VTN) and actin-related protein (ARP3), were also shown to be up-regulated, both by mass spectrometry (MS)-based quantification and by Western blotting. To date, our result constitutes one of the largest datasets in colorectal cancer proteomics research. Our strategy reveals a disease-driven omics pattern for metastatic colorectal cancer.

  16. Comparison of the Chiron Quantiplex branched DNA (bDNA) assay and the Abbott Genostics solution hybridization assay for quantification of hepatitis B viral DNA.

    PubMed

    Kapke, G E; Watson, G; Sheffler, S; Hunt, D; Frederick, C

    1997-01-01

    Several assays for quantification of DNA have been developed and are currently used in research and clinical laboratories. However, comparison of assay results has been difficult owing to the use of different standards and units of measurement as well as differences between assays in dynamic range and quantification limits. Although a few studies have compared results generated by different assays, there has been no consensus on conversion factors, and thorough analysis has been precluded by small sample sizes and the limited dynamic ranges studied. In this study, we have compared the Chiron branched DNA (bDNA) and Abbott liquid hybridization assays for quantification of hepatitis B virus (HBV) DNA in clinical specimens and have derived conversion factors to facilitate comparison of assay results. Additivity and variance stabilization (AVAS) regression, a form of non-linear regression analysis, was performed on assay results for specimens from HBV clinical trials. Our results show that there is a strong linear relationship (R2 = 0.96) between log Chiron and log Abbott assay results. Conversion factors derived from regression analyses were found to be non-constant and ranged from 6 to 40. Analysis of paired assay results below and above each assay's limit of quantification (LOQ) indicated that a significantly (P < 0.01) larger proportion of observations were below the Abbott assay LOQ but above the Chiron assay LOQ, indicating that the Chiron assay is significantly more sensitive than the Abbott assay. Testing of replicate specimens showed that the Chiron assay consistently yielded lower percent coefficients of variation (%CVs) than the Abbott assay, indicating that the Chiron assay provides superior precision.
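A non-constant conversion factor of the kind reported above falls out naturally when the log-log relationship has a slope other than 1. A sketch with entirely hypothetical paired values (chosen so the factor spans roughly 6 to 40, echoing the reported range, but not the study's data):

```python
import numpy as np

# Hypothetical paired HBV DNA results (copies/mL) from the two assays
abbott = np.array([1e5, 1e6, 1e7, 1e8, 1e9])
chiron = np.array([4.0e6, 2.5e7, 1.6e8, 1.0e9, 6.0e9])   # hypothetical

# Fit log10(Chiron) = slope * log10(Abbott) + intercept
slope, intercept = np.polyfit(np.log10(abbott), np.log10(chiron), 1)

def conversion_factor(abbott_value):
    # Non-constant factor: predicted Chiron value / Abbott value
    chiron_est = 10 ** (slope * np.log10(abbott_value) + intercept)
    return chiron_est / abbott_value
```

With slope = 1 the factor would be a single constant (10^intercept); a slope below 1 makes it shrink as titers rise, which is one simple way a 6-to-40 range can arise.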

  17. LC-MS/MS quantification of next-generation biotherapeutics: a case study for an IgE binding Nanobody in cynomolgus monkey plasma.

    PubMed

    Sandra, Koen; Mortier, Kjell; Jorge, Lucie; Perez, Luis C; Sandra, Pat; Priem, Sofie; Poelmans, Sofie; Bouche, Marie-Paule

    2014-05-01

    Nanobodies(®) are therapeutic proteins derived from the smallest functional fragments of heavy chain-only antibodies. The development and validation of an LC-MS/MS-based method for the quantification of an IgE binding Nanobody in cynomolgus monkey plasma is presented. Nanobody quantification was performed making use of a proteotypic tryptic peptide chromatographically enriched prior to LC-MS/MS analysis. The validated LLOQ at 36 ng/ml was measured with an intra- and inter-assay precision and accuracy <20%. The required sensitivity could be obtained based on the selectivity of 2D LC combined with MS/MS. No analyte specific tools for affinity purification were used. Plasma samples originating from a PK/PD study were analyzed and compared with the results obtained with a traditional ligand-binding assay. Excellent correlations between the two techniques were obtained, and similar PK parameters were estimated. A 2D LC-MS/MS method was successfully developed and validated for the quantification of a next generation biotherapeutic.

  18. Quantification of underivatised amino acids on dry blood spot, plasma, and urine by HPLC-ESI-MS/MS.

    PubMed

    Giordano, Giuseppe; Di Gangi, Iole Maria; Gucciardi, Antonina; Naturale, Mauro

    2012-01-01

    Enzyme deficiencies in amino acid (AA) metabolism affect the levels of amino acids and their derivatives in physiological fluids, which may serve as diagnostically significant biomarkers for one or a group of metabolic disorders. It is therefore important to monitor a wide range of free amino acids simultaneously and to quantify them. This is time-consuming with classical methods, all the more so now that many laboratories have introduced Newborn Screening Programs, in which the semiquantitative analysis, detection, and quantification of some amino acids must be performed in a short time to reduce the rate of false positives. We have modified the stable isotope dilution HPLC-electrospray ionization (ESI)-MS/MS method previously described by Qu et al. (Anal Chem 74: 2034-2040, 2002) for more rapid, robust, sensitive, and specific detection and quantification of underivatized amino acids. The modified method reduces the time of analysis to 10 min, with very good reproducibility of retention times and better separation of the metabolites and their isomers. Omitting the derivatization step provides some important advantages: fast and simple sample preparation and exclusion of artefacts and interferences. The technique is highly sensitive and specific, and allows monitoring of 40 underivatized amino acids, including the key isomers, and quantification of some of them, in order to cover many diagnostically important intermediates of metabolic pathways. We propose this HPLC-ESI-MS/MS method for underivatized amino acids as a support for Newborn Screening, as a secondary test using the same dried blood spots for a more accurate and specific examination in cases of suspected metabolic disease. In this way, plasma collection from the patient is avoided, reducing anxiety for the parents and further costs of analysis. The same method was validated and applied also to plasma and urine samples with good reproducibility.

  19. Quantification of taurine in energy drinks using ¹H NMR.

    PubMed

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is related to a magnitude of various physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed the separation of a distinct taurine signal in ¹H NMR spectra, which was applied for integration and quantification. Quantification was performed using external calibration (R²>0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed both by ¹H NMR and by LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and at worst 10.4%. Due to the high accordance with LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR measurement presents a suitable method to quantify taurine in energy drinks.
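The external-calibration route mentioned above is plain linear algebra: fit signal integral against standard concentration, then invert the line for an unknown. A sketch with hypothetical standards and integrals, not the paper's calibration data:

```python
import numpy as np

# Hypothetical external calibration: taurine standards (g/L) vs NMR integrals
conc_std = np.array([0.5, 1.0, 2.0, 3.0, 4.0])
integral_std = np.array([0.52, 1.01, 2.03, 3.02, 4.05])   # hypothetical

slope, intercept = np.polyfit(conc_std, integral_std, 1)

def taurine_concentration(integral_sample):
    # Invert the calibration line to quantify an unknown energy drink
    return (integral_sample - intercept) / slope
```

PULCON, the paper's second route, replaces the external curve with a single reference spectrum scaled by pulse-length and receiver-gain corrections; the inversion step above applies only to the external-calibration variant.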

  20. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    PubMed

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, largely based on nonspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. To this end, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R² > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin.
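The internal-standard arithmetic behind such a 13C-IS approach reduces to a peak-area ratio scaled by the known IS mass and an RRF. A sketch with hypothetical SIM areas; note the RRF convention here (12C response relative to the 13C analogue) is an assumption, and the paper applies product-specific RRFs rather than a single value:

```python
def analyte_from_internal_standard(area_12c, area_13c, mass_13c_is_ug, rrf=1.0):
    """Quantify a 12C pyrolysis product against its 13C-labeled analogue.

    With a co-analyzed 13C internal standard of known mass, the 12C amount
    follows from the peak-area ratio; rrf corrects for any residual response
    difference (here: response of the 12C product relative to its 13C analogue).
    """
    return (area_12c / area_13c) * mass_13c_is_ug / rrf

# Hypothetical SIM peak areas for one lignin pyrolysis product
lignin_ug = analyte_from_internal_standard(area_12c=5.0e5, area_13c=2.5e5,
                                           mass_13c_is_ug=10.0)
```

Because the IS is chemically identical to the analyte apart from the isotope label, matrix effects and pyrolysis yield largely cancel in the ratio, which is the appeal of a polymeric 13C lignin IS.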

  1. Impact Induced Delamination Detection and Quantification With Guided Wavefield Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Leckey, Cara A. C.; Yu, Lingyu; Seebo, Jeffrey P.

    2015-01-01

    This paper studies impact-induced delamination detection and quantification using guided wavefield data and spatial wavenumber imaging. A complex-geometry, impact-like delamination was created through quasi-static indentation on a CFRP plate. To detect and quantify the impact delamination in the CFRP plate, PZT-SLDV sensing and spatial wavenumber imaging were performed. In the PZT-SLDV sensing, guided waves are generated by the PZT, and high-spatial-resolution guided wavefields are measured by the SLDV. The guided wavefield data acquired from the PZT-SLDV sensing represent guided wave propagation in the composite laminate and include guided wave interaction with the delamination damage. The measured guided wavefields were analyzed with the spatial wavenumber imaging method, which generates an image containing the dominant local wavenumber at each spatial location. The spatial wavenumber imaging result for a simple single-layer Teflon-insert delamination provided quantitative information on delamination size and location; the location of delamination damage is indicated by the area with larger wavenumbers in the spatial wavenumber image. For the impact-like delamination, the results only partially agreed with the actual damage size and shape, and also showed a dependence on excitation frequency. Future work will further investigate the accuracy of the wavenumber imaging method for real composite damage and its dependence on excitation frequency.
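The core idea of spatial wavenumber imaging, that a delaminated region carries a locally higher wavenumber which a windowed spatial spectrum can pick out, can be illustrated in one dimension. This is a toy sketch, not the authors' algorithm: the wavefield, wavenumbers, and window sizes are all hypothetical.

```python
import numpy as np

# Hypothetical 1-D snapshot of a guided wavefield: the "delaminated" region
# (0.4 m < x < 0.6 m) carries a higher local wavenumber.
x = np.linspace(0.0, 1.0, 2000)               # m
k_pristine, k_damage = 200.0, 500.0           # rad/m, hypothetical
k_local = np.where((x > 0.4) & (x < 0.6), k_damage, k_pristine)
field = np.sin(np.cumsum(k_local) * (x[1] - x[0]))   # phase = integral of k(x)

def dominant_wavenumber(signal, dx, window=256):
    """Short-space FFT: dominant local wavenumber at each window center."""
    k_axis = 2 * np.pi * np.fft.rfftfreq(window, dx)   # rad per meter
    centers, k_dom = [], []
    for start in range(0, len(signal) - window, window // 4):
        seg = signal[start:start + window] * np.hanning(window)
        spectrum = np.abs(np.fft.rfft(seg))
        centers.append(start + window // 2)
        k_dom.append(k_axis[np.argmax(spectrum)])
    return np.array(centers), np.array(k_dom)

centers, k_dom = dominant_wavenumber(field, x[1] - x[0])
```

Plotting `k_dom` against `x[centers]` yields the 1-D analogue of the wavenumber image: the damage region stands out as the span of elevated dominant wavenumber, with spatial resolution limited by the window length.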

  2. Air Contamination Quantification by FTIR with Gas Cell

    NASA Technical Reports Server (NTRS)

    Freischlag, Jason

    2017-01-01

    Air quality is of utmost importance in environmental studies and has many industrial applications, such as aviator's breathing oxygen (ABO) for pilots and breathing air for firefighters. Contamination is a major concern for these industries, as identified in MIL-PRF-27210, CGA G-4.3, CGA G-7.1, and NFPA 1989. Fourier Transform Infrared Spectroscopy (FTIR) is a powerful tool that, when combined with a gas cell, has tremendous potential for gas contamination analysis. Current procedures focus mostly on GC-MS for contamination quantification. The topic will be introduced through a comparison of the currently used methods for gas contamination determination with FTIR gas analysis. Certification to the mentioned standards through the ISO/IEC 17065 certifying body A2LA will be addressed, followed by an evaluation of quality information such as determinations of linearity and the limits of detection and quantitation. Major interferences and issues arising from the use of FTIR for accredited work with ABO and breathing air will also be covered.
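The linearity, detection-limit, and quantitation-limit figures mentioned above are commonly estimated from a calibration line, e.g. in the ICH Q2-style form LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope and σ the residual standard deviation. A sketch with hypothetical FTIR calibration data (the contaminant and values are illustrative, not from this work):

```python
import numpy as np

# Hypothetical FTIR calibration for one contaminant (e.g. CO in breathing air):
# absorbance of standards at known concentrations (ppm)
conc = np.array([0.0, 2.0, 4.0, 8.0, 16.0])
absorbance = np.array([0.001, 0.021, 0.040, 0.082, 0.161])

slope, intercept = np.polyfit(conc, absorbance, 1)
residuals = absorbance - (slope * conc + intercept)
sigma = np.std(residuals, ddof=2)     # residual standard deviation of the fit

# Detection and quantitation limits from slope and residual noise
lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
```

The fixed 10/3.3 ratio between LOQ and LOD is built into this convention; other labs instead estimate σ from blank replicates, which changes the numbers but not the formulas.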

  3. Improvement of High-throughput Genotype Analysis After Implementation of a Dual-curve Sybr Green I-based Quantification and Normalization Procedure

    USDA-ARS?s Scientific Manuscript database

    The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...

  4. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefiting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
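The final step of the strategy above, combining an MRM-derived absolute total with iTRAQ reporter ratios, is simple apportionment arithmetic. A simplified sketch under that reading (peptide amounts and reporter areas are hypothetical, and the real workflow also involves reporter-ion corrections not shown here):

```python
import numpy as np

def hyperplex_amounts(total_fmol, itraq_reporter_areas):
    """Apportion an MRM-derived total across the 4 iTRAQ-labeled samples.

    total_fmol: absolute amount of the 4-plex mixture for one target peptide,
    read from the MRM trace against the mTRAQ delta0/delta8 references.
    itraq_reporter_areas: MS/MS reporter-ion areas (e.g. 114/115/116/117).
    """
    r = np.asarray(itraq_reporter_areas, dtype=float)
    return total_fmol * r / r.sum()

# Hypothetical target peptide: 120 fmol total, reporter areas for 4 specimens
amounts = hyperplex_amounts(120.0, [1.0, 2.0, 3.0, 6.0])
```

The two references bracket the unknown channel, so drift in MRM response affects the total estimate less than a single-reference scheme would.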

  5. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    PubMed

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest in reliable techniques for the quantification of genetically modified components of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy-to-use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are available for only a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single-copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system, because a second independent real-time PCR-based measurement
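The standard-curve step common to plasmid-calibrated real-time PCR can be sketched generically: the cycle threshold Ct is linear in log10 of the starting copy number, so inverting a fitted curve recovers copies. The slope and intercept below are hypothetical, not calibration values from this study.

```python
# Minimal sketch of absolute quantification against a plasmid standard curve.
# Ct = slope * log10(N) + intercept, hence N = 10**((Ct - intercept) / slope).

def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert a hypothetical standard curve; slope -3.32 ~ 100% PCR efficiency."""
    return 10 ** ((ct - intercept) / slope)

# A GMO content estimate would follow from two such curves (transgene vs.
# total DNA); here we invert a single hypothetical curve.
n = copies_from_ct(24.72)   # starting template copies for this Ct
```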

  6. Simultaneous Quantification of Syringic Acid and Kaempferol in Extracts of Bergenia Species Using Validated High-Performance Thin-Layer Chromatographic-Densitometric Method.

    PubMed

    Srivastava, Nishi; Srivastava, Amit; Srivastava, Sharad; Rawat, Ajay Kumar Singh; Khan, Abdul Rahman

    2016-03-01

    A rapid, sensitive, selective and robust quantitative densitometric high-performance thin-layer chromatographic method was developed and validated for separation and quantification of syringic acid (SYA) and kaempferol (KML) in the hydrolyzed extracts of Bergenia ciliata and Bergenia stracheyi. The separation was performed on silica gel 60F254 high-performance thin-layer chromatography plates using toluene:ethyl acetate:formic acid (5:4:1, v/v/v) as the mobile phase. The quantification of SYA and KML was carried out using a densitometric reflection/absorption mode at 290 nm. Dense spots of SYA and KML appeared on the developed plate at retention factor values of 0.61 ± 0.02 and 0.70 ± 0.01, respectively. A precise and accurate quantification was performed using linear regression analysis by plotting peak area vs. concentration over 100-600 ng/band (correlation coefficient: r = 0.997, regression coefficient: R² = 0.996) for SYA and 100-600 ng/band (correlation coefficient: r = 0.995, regression coefficient: R² = 0.991) for KML. The developed method was validated in terms of accuracy, recovery and inter- and intraday study as per International Conference on Harmonisation guidelines. The limits of detection and quantification were 91.63 and 277.67 ng for SYA and 142.26 and 431.09 ng for KML, respectively. The statistical data analysis showed that the method is reproducible and selective for the estimation of SYA and KML in extracts of B. ciliata and B. stracheyi. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
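The calibration arithmetic behind validations of this kind can be sketched as follows, using the ICH formulas LOD = 3.3σ/S and LOQ = 10σ/S (σ = residual standard deviation of the fit, S = slope). The peak areas below are hypothetical, not the paper's data.

```python
# Sketch of a calibration fit with LOD/LOQ per ICH guidance (illustrative data).

def linfit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
            sum((a - mx) ** 2 for a in x)
    return slope, my - slope * mx

conc = [100, 200, 300, 400, 500, 600]      # ng/band
area = [210, 395, 610, 790, 1010, 1190]    # hypothetical densitometric peak areas
slope, intercept = linfit(conc, area)
residual_sd = (sum((y - (slope * x + intercept)) ** 2
                   for x, y in zip(conc, area)) / (len(conc) - 2)) ** 0.5
lod = 3.3 * residual_sd / slope            # limit of detection, ng/band
loq = 10.0 * residual_sd / slope           # limit of quantification, ng/band
```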

  7. Recommendations for the Generation, Quantification, Storage, and Handling of Peptides Used for Mass Spectrometry-Based Assays.

    PubMed

    Hoofnagle, Andrew N; Whiteaker, Jeffrey R; Carr, Steven A; Kuhn, Eric; Liu, Tao; Massoni, Sam A; Thomas, Stefani N; Townsend, R Reid; Zimmerman, Lisa J; Boja, Emily; Chen, Jing; Crimmins, Daniel L; Davies, Sherri R; Gao, Yuqian; Hiltke, Tara R; Ketchum, Karen A; Kinsinger, Christopher R; Mesri, Mehdi; Meyer, Matthew R; Qian, Wei-Jun; Schoenherr, Regine M; Scott, Mitchell G; Shi, Tujin; Whiteley, Gordon R; Wrobel, John A; Wu, Chaochao; Ackermann, Brad L; Aebersold, Ruedi; Barnidge, David R; Bunk, David M; Clarke, Nigel; Fishman, Jordan B; Grant, Russ P; Kusebauch, Ulrike; Kushnir, Mark M; Lowenthal, Mark S; Moritz, Robert L; Neubert, Hendrik; Patterson, Scott D; Rockwood, Alan L; Rogers, John; Singh, Ravinder J; Van Eyk, Jennifer E; Wong, Steven H; Zhang, Shucha; Chan, Daniel W; Chen, Xian; Ellis, Matthew J; Liebler, Daniel C; Rodland, Karin D; Rodriguez, Henry; Smith, Richard D; Zhang, Zhen; Zhang, Hui; Paulovich, Amanda G

    2016-01-01

    For many years, basic and clinical researchers have taken advantage of the analytical sensitivity and specificity afforded by mass spectrometry in the measurement of proteins. Clinical laboratories are now beginning to deploy these work flows as well. For assays that use proteolysis to generate peptides for protein quantification and characterization, synthetic stable isotope-labeled internal standard peptides are of central importance. No general recommendations are currently available surrounding the use of peptides in protein mass spectrometric assays. The Clinical Proteomic Tumor Analysis Consortium of the National Cancer Institute has collaborated with clinical laboratorians, peptide manufacturers, metrologists, representatives of the pharmaceutical industry, and other professionals to develop a consensus set of recommendations for peptide procurement, characterization, storage, and handling, as well as approaches to the interpretation of the data generated by mass spectrometric protein assays. Additionally, the importance of carefully characterized reference materials-in particular, peptide standards for the improved concordance of amino acid analysis methods across the industry-is highlighted. The alignment of practices around the use of peptides and the transparency of sample preparation protocols should allow for the harmonization of peptide and protein quantification in research and clinical care. © 2015 American Association for Clinical Chemistry.

  8. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the efficiency of converting heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.

  9. Quantification of Bacterial Twitching Motility in Dense Colonies Using Transmitted Light Microscopy and Computational Image Analysis.

    PubMed

    Smith, Benjamin; Li, Jianfang; Metruccio, Matteo; Wan, Stephanie; Evans, David; Fleiszig, Suzanne

    2018-04-20

    A method was developed to allow the quantification and mapping of relative bacterial twitching motility in dense samples, where tracking of individual bacteria was not feasible. In this approach, movies of bacterial films were acquired using differential interference contrast microscopy (DIC), and bacterial motility was then indirectly quantified by the degree to which the bacteria modulated the intensity of light in the field-of-view over time. This allowed the mapping of areas of relatively high and low motility within a single field-of-view, and comparison of the total distribution of motility between samples.
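The indirect readout described above can be sketched minimally: for a stack of frames, motility at a pixel is proxied by how strongly that pixel's intensity fluctuates over time. The frames and the choice of temporal standard deviation as the modulation statistic are illustrative assumptions, not the paper's exact pipeline.

```python
# Sketch of a per-pixel temporal-modulation map from a frame stack
# (synthetic 2x2 frames; real DIC movies would be large arrays).

def motility_map(frames):
    """Per-pixel temporal standard deviation over a list of 2D frames."""
    t = len(frames)
    h, w = len(frames[0]), len(frames[0][0])
    out = []
    for i in range(h):
        row = []
        for j in range(w):
            vals = [f[i][j] for f in frames]
            m = sum(vals) / t
            row.append((sum((v - m) ** 2 for v in vals) / t) ** 0.5)
        out.append(row)
    return out

# Only pixel (0, 1) fluctuates between frames, so only it maps to high motility.
frames = [[[0, 10], [5, 5]], [[0, 30], [5, 5]], [[0, 20], [5, 5]]]
mmap = motility_map(frames)
```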

  10. Quantification of EVI1 transcript levels in acute myeloid leukemia by RT-qPCR analysis: A study by the ALFA Group.

    PubMed

    Smol, Thomas; Nibourel, Olivier; Marceau-Renaut, Alice; Celli-Lebras, Karine; Berthon, Céline; Quesnel, Bruno; Boissel, Nicolas; Terré, Christine; Thomas, Xavier; Castaigne, Sylvie; Dombret, Hervé; Preudhomme, Claude; Renneville, Aline

    2015-12-01

    EVI1 overexpression confers poor prognosis in acute myeloid leukemia (AML). Quantification of EVI1 expression has mainly been assessed by real-time quantitative PCR (RT-qPCR) based on relative quantification of the EVI1-1D splice variant. In this study, we developed an RT-qPCR assay to quantify EVI1 expression covering the different splice variants. A sequence localized in EVI1 exons 14 and 15 was cloned into plasmids that were used to establish RT-qPCR standard curves. Threshold values to define EVI1 overexpression were determined using 17 bone marrow (BM) and 31 peripheral blood (PB) control samples and were set at 1% in BM and 0.5% in PB. Samples from 64 AML patients overexpressing EVI1 included in the ALFA-0701 or -0702 trials were collected at diagnosis and during follow-up (n=152). Median EVI1 expression at AML diagnosis was 23.3% in BM and 3.6% in PB. EVI1 expression levels significantly decreased between diagnosis and post-induction samples, with an average variation from 21.6% to 3.56% in BM and from 4.0% to 0.22% in PB, but did not exceed a 1 log10 reduction. Our study demonstrates that the magnitude of reduction in EVI1 expression levels between AML diagnosis and follow-up is not sufficient to allow sensitive detection of minimal residual disease. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. Bile acid profiling and quantification in biofluids using ultra-performance liquid chromatography tandem mass spectrometry.

    PubMed

    Sarafian, Magali H; Lewis, Matthew R; Pechlivanis, Alexandros; Ralphs, Simon; McPhail, Mark J W; Patel, Vishal C; Dumas, Marc-Emmanuel; Holmes, Elaine; Nicholson, Jeremy K

    2015-10-06

    Bile acids are important end products of cholesterol metabolism. While they have been identified as key factors in lipid emulsification and absorption due to their detergent properties, bile acids have also been shown to act as signaling molecules and intermediates between the host and the gut microbiota. To further the investigation of bile acid functions in humans, an advanced platform for high throughput analysis is essential. Herein, we describe the development and application of a 15 min UPLC procedure for the separation of bile acid species from human biofluid samples requiring minimal sample preparation. High-resolution time-of-flight mass spectrometry was applied for profiling applications, elucidating rich bile acid profiles in both normal and disease state plasma. In parallel, a second mode of detection was developed utilizing tandem mass spectrometry for sensitive and quantitative targeted analysis of 145 bile acid (BA) species including primary, secondary, and tertiary bile acids. The latter system was validated by testing the linearity (lower limit of quantification, LLOQ, 0.25-10 nM and upper limit of quantification, ULOQ, 2.5-5 μM), precision (≈6.5%), and accuracy (81.2-118.9%) on inter- and intraday analysis, achieving good recovery of bile acids (serum/plasma 88% and urine 93%). The ultra-performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) targeted method was successfully applied to plasma, serum, and urine samples in order to compare the compositional difference in the bile acid pool between preprandial and postprandial states, demonstrating the utility of such analysis on human biofluids.
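The two validation statistics quoted above reduce to simple arithmetic: accuracy as percent recovery of a spiked nominal concentration, and precision as the relative standard deviation (RSD) of replicates. The replicate values below are synthetic, purely to show the computation.

```python
# Sketch of recovery (accuracy) and RSD (precision) from replicate measurements.

def recovery_pct(measured_mean, nominal):
    """Accuracy as measured/nominal, in percent."""
    return 100.0 * measured_mean / nominal

def rsd_pct(values):
    """Precision as sample standard deviation over mean, in percent."""
    m = sum(values) / len(values)
    sd = (sum((v - m) ** 2 for v in values) / (len(values) - 1)) ** 0.5
    return 100.0 * sd / m

replicates = [0.95, 1.02, 0.99, 1.04, 0.97]   # hypothetical µM readings
nominal = 1.0                                  # µM spiked
acc = recovery_pct(sum(replicates) / len(replicates), nominal)
prec = rsd_pct(replicates)
```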

  12. Modeling transport phenomena and uncertainty quantification in solidification processes

    NASA Astrophysics Data System (ADS)

    Fezi, Kyle S.

    Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water cooled mold followed by secondary cooling with a water jet spray and free falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys for various conditions. This model is capable of solving mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed outs. Numerical models of metal alloy solidification, like the one previously mentioned, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs cause uncertainty in results and those insights. The analysis of model assumptions and probable input variability on the level of uncertainty in model predictions has not been calculated in solidification modeling as yet. As a step towards understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification
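The uncertainty-propagation step described above can be sketched generically (this is not the PRISM/PUQ framework itself): sample the uncertain inputs, evaluate the model, and summarize output spread plus a simple correlation-based sensitivity. The stand-in model and input distributions are assumptions for illustration.

```python
import random

# Generic Monte Carlo uncertainty quantification with correlation sensitivity.
random.seed(0)

def model(heat_transfer_coeff, casting_speed):
    # Cheap stand-in for an expensive solidification model; the weights make
    # the first input deliberately more influential.
    return 0.8 * heat_transfer_coeff + 0.2 * casting_speed

n = 2000
h = [random.gauss(1.0, 0.1) for _ in range(n)]   # uncertain input 1
v = [random.gauss(2.0, 0.3) for _ in range(n)]   # uncertain input 2
y = [model(a, b) for a, b in zip(h, v)]

def corr(a, b):
    """Pearson correlation, used here as a crude sensitivity index."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (z - mb) for x, z in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((z - mb) ** 2 for z in b) ** 0.5
    return cov / (sa * sb)

mean_y = sum(y) / n
sens_h, sens_v = corr(h, y), corr(v, y)   # input 1 should dominate
```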

  13. Quantification of multiple gene expression in individual cells.

    PubMed

    Peixoto, António; Monteiro, Marta; Rocha, Benedita; Veiga-Fernandes, Henrique

    2004-10-01

    Quantitative gene expression analysis aims to define the gene expression patterns determining cell behavior. So far, these assessments can only be performed at the population level. Therefore, they determine the average gene expression within a population, overlooking possible cell-to-cell heterogeneity that could lead to different cell behaviors/cell fates. Understanding individual cell behavior requires multiple gene expression analyses of single cells, and may be fundamental for the understanding of all types of biological events and/or differentiation processes. Here we describe a new reverse transcription-polymerase chain reaction (RT-PCR) approach allowing the simultaneous quantification of the expression of 20 genes in the same single cell. This method has broad application, in different species and with any type of gene combination. RT efficiency is evaluated. Uniform and maximized amplification conditions for all genes are provided. Abundance relationships are maintained, allowing the precise quantification of the absolute number of mRNA molecules per cell, ranging from 2 to 1.28 × 10⁹ for each individual gene. We evaluated the impact of this approach on functional genetic read-outs by studying an apparently homogeneous population (monoclonal T cells recovered 4 d after antigen stimulation), using either this method or conventional real-time RT-PCR. Single-cell studies revealed considerable cell-to-cell variation: All T cells did not express all individual genes. Gene coexpression patterns were very heterogeneous. mRNA copy numbers varied between different transcripts and in different cells. As a consequence, this single-cell assay introduces new and fundamental information regarding functional genomic read-outs. By comparison, we also show that conventional quantitative assays determining population averages supply insufficient information, and may even be highly misleading.

  14. Nonlinear parameters of surface EMG in schizophrenia patients depend on kind of antipsychotic therapy.

    PubMed

    Meigal, Alexander Yu; Miroshnichenko, German G; Kuzmina, Anna P; Rissanen, Saara M; Georgiadis, Stefanos D; Karjalainen, Pasi A

    2015-01-01

    We compared a set of surface EMG (sEMG) parameters in several groups of schizophrenia (SZ, n = 74) patients and healthy controls (n = 11) and coupled them with the clinical data. sEMG records were quantified with spectral, mutual information (MI) based and recurrence quantification analysis (RQA) parameters, and with approximate and sample entropies (ApEn and SampEn). Psychotic deterioration was estimated with the Positive and Negative Syndrome Scale (PANSS) and with the positive subscale of PANSS. Neuroleptic-induced parkinsonism (NIP) motor symptoms were estimated with the Simpson-Angus Scale (SAS). Dyskinesia was measured with the Abnormal Involuntary Movement Scale (AIMS). We found that there was no difference in values of sEMG parameters between healthy controls and drug-naïve SZ patients. The most specific group was formed of SZ patients who were administered both typical and atypical antipsychotics (AP). Their sEMG parameters were significantly different from those of SZ patients taking either typical or atypical AP or taking no AP. This may represent a kind of synergistic effect of these two classes of AP. For the clinical data, we found that PANSS, SAS, and AIMS were not correlated with any of the sEMG parameters. With nonlinear parameters of sEMG it is possible to reveal NIP in SZ patients, and this may help to discriminate between different clinical groups of SZ patients. Combined typical and atypical AP therapy has a stronger effect on sEMG than therapy with AP of only one class.
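Two of the RQA parameters mentioned above, recurrence rate (RR) and determinism (DET), can be sketched on a toy series: threshold pairwise distances to build a recurrence matrix, then count recurrent points overall and those lying on diagonal line structures. This is a minimal unembedded sketch, not the study's analysis settings.

```python
# Minimal RQA sketch: recurrence rate and a simple determinism estimate.

def rqa(series, radius):
    n = len(series)
    # Recurrence matrix: 1 where two samples are within `radius` of each other.
    rec = [[1 if abs(series[i] - series[j]) <= radius else 0
            for j in range(n)] for i in range(n)]
    total = sum(rec[i][j] for i in range(n) for j in range(n) if i != j)
    rr = total / (n * (n - 1))           # recurrence rate off the main diagonal
    # DET: fraction of recurrent points on diagonal lines of length >= 2.
    on_lines = 0
    for i in range(n):
        for j in range(n):
            if i != j and rec[i][j] and (
                (i + 1 < n and j + 1 < n and rec[i + 1][j + 1]) or
                (i > 0 and j > 0 and rec[i - 1][j - 1])):
                on_lines += 1
    det = on_lines / total if total else 0.0
    return rr, det

# A strictly periodic toy series: all recurrences sit on diagonal lines.
rr, det = rqa([0.1, 0.5, 0.1, 0.5, 0.1, 0.5], radius=0.05)
```

For a perfectly periodic signal DET reaches 1; noisy sEMG yields lower DET, which is what makes it a useful discriminative parameter.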

  15. Reliable quantification of phthalates in environmental matrices (air, water, sludge, sediment and soil): a review.

    PubMed

    Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad

    2015-05-15

    Because of their widespread application, phthalates or phthalic acid esters (PAEs) are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques are found to be suitable for reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment, and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Therefore, accurate PAE analysis in environmental matrices is a challenging task. This paper reviews the extensive literature data on techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) have been reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and the procedures of quality control and quality assurance is presented to overcome the problems of sample contamination and those encountered due to matrix effects, in order to avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. Detection, Mapping, and Quantification of Single Walled Carbon Nanotubes in Histological Specimens with Photoacoustic Microscopy

    PubMed Central

    Mikos, Antonios G.; Jansen, John A.; Shroyer, Kenneth R.; Wang, Lihong V.; Sitharaman, Balaji

    2012-01-01

    Aims: In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated to detect, map, and quantify trace amounts [nanograms (ng) to micrograms (µg)] of single-walled carbon nanotubes (SWCNTs) in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Materials and Methods: Optical-resolution (OR) and acoustic-resolution (AR) PAM were employed to detect, map, and quantify the SWCNTs in a variety of histological tissue specimens, and the results were compared with other optical techniques (bright-field optical microscopy, Raman microscopy, near-infrared (NIR) fluorescence microscopy). Results: Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. Conclusions: The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs. PMID:22496892

  17. An experimental design for quantification of cardiovascular responses to music stimuli in humans.

    PubMed

    Chang, S-H; Luo, C-H; Yeh, T-L

    2004-01-01

    There has been considerable research on the relationship between music and human physiological or psychological responses. However, some cardiovascular index factors have not been explored quantitatively due to the qualitative nature of acoustic stimuli. This study proposes and demonstrates an experimental design for quantification of cardiovascular responses to music stimuli in humans. The system comprises two components: a unit for generating and monitoring quantitative acoustic stimuli and a portable autonomic nervous system (ANS) analysis unit for quantitative recording and analysis of the cardiovascular responses. The experimental results indicate that the proposed system achieves full control and measurement of the music stimuli, and effectively supports many quantitative indices of cardiovascular response in humans. In addition, the analysis results are discussed in view of future clinical research.

  18. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although superposition has occupied a central and pivotal place ever since the inception of quantum mechanics a century ago, rigorous and systematic studies of the quantification issue have attracted significant interest only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by the Science Challenge Project under Grant No. TZ2016002; the Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing; and the Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, under Grant No. 2008DP173182.
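The paper's specific figure of merit is not reproduced here; as a hedged illustration of the kind of coherence measure it connects to, the l1-norm of coherence sums the absolute off-diagonal elements of a density matrix in a fixed basis, vanishing exactly for incoherent (diagonal) states.

```python
# l1-norm of coherence: sum of |rho_ij| over off-diagonal entries.

def l1_coherence(rho):
    n = len(rho)
    return sum(abs(rho[i][j]) for i in range(n) for j in range(n) if i != j)

# Equal superposition (|0> + |1>)/sqrt(2) vs. the incoherent 50/50 mixture:
plus = [[0.5, 0.5], [0.5, 0.5]]    # pure superposition state
mixed = [[0.5, 0.0], [0.0, 0.5]]   # classical mixture, no superposition
c_plus, c_mixed = l1_coherence(plus), l1_coherence(mixed)
```

The two states have identical measurement statistics in the computational basis, yet the measure separates them, which is the basic job any superposition quantifier must do.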

  19. Chemical derivatization for enhancing sensitivity during LC/ESI-MS/MS quantification of steroids in biological samples: a review.

    PubMed

    Higashi, Tatsuya; Ogawa, Shoujiro

    2016-09-01

    Sensitive and specific methods for the detection, characterization and quantification of endogenous steroids in body fluids or tissues are necessary for the diagnosis, pathological analysis and treatment of many diseases. Recently, liquid chromatography/electrospray ionization-tandem mass spectrometry (LC/ESI-MS/MS) has been widely used for these purposes due to its specificity and versatility. However, the ESI efficiency and fragmentation behavior of some steroids are poor, which leads to low sensitivity. Chemical derivatization is one of the most effective ways to improve the detection characteristics of steroids in ESI-MS/MS. Against this background, this article reviews recent advances in chemical derivatization for the trace quantification of steroids in biological samples by LC/ESI-MS/MS. Derivatization for ESI-MS/MS is based on tagging a proton-affinitive or permanently charged moiety onto the target steroid. Introduction/formation of a fragmentable moiety suitable for selected reaction monitoring by derivatization also enhances sensitivity. Stable isotope-coded derivatization procedures for steroid analysis are also described. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Meningococcal X polysaccharide quantification by high-performance anion-exchange chromatography using synthetic N-acetylglucosamine-4-phosphate as standard.

    PubMed

    Micoli, F; Adamo, R; Proietti, D; Gavini, M; Romano, M R; MacLennan, C A; Costantino, P; Berti, F

    2013-11-15

    A method for meningococcal X (MenX) polysaccharide quantification by high-performance anion-exchange chromatography with pulsed amperometric detection (HPAEC-PAD) is described. The polysaccharide is hydrolyzed by strong acidic treatment, and the peak of glucosamine-4-phosphate (4P-GlcN) is detected and measured after chromatography. Under the selected hydrolysis conditions, 4P-GlcN is the prevalent species formed, with GlcN accounting for less than 5% on a molar basis. As the standard for the analysis, the monomeric unit of MenX polysaccharide, N-acetylglucosamine-4-phosphate (4P-GlcNAc), was used. This method for MenX quantification is highly selective and sensitive, and it constitutes an important analytical tool for the development of a conjugate vaccine against MenX. Copyright © 2013 Elsevier Inc. All rights reserved.

  1. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  2. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery.

    PubMed

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W; Chen, Zhuo Georgia; Fei, Baowei

    2015-01-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification comprising image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450 to 900 nm. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.
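Two of the per-pixel features named above can be sketched directly: normalized reflectance (the spectrum scaled by its total) and the first spectral derivative (finite differences across wavelength bands). The four-band spectrum and band spacing below are hypothetical simplifications of a real 450-900 nm cube.

```python
# Sketch of two hyperspectral features on a synthetic per-pixel spectrum.

def normalized_reflectance(spectrum):
    """Scale a spectrum so its values sum to 1 (illumination-robust shape)."""
    s = sum(spectrum)
    return [v / s for v in spectrum]

def spectral_derivative(spectrum, step_nm):
    """First derivative across adjacent bands by finite differences."""
    return [(spectrum[i + 1] - spectrum[i]) / step_nm
            for i in range(len(spectrum) - 1)]

spectrum = [0.20, 0.25, 0.35, 0.30]        # hypothetical reflectance values
norm = normalized_reflectance(spectrum)    # sums to 1
deriv = spectral_derivative(spectrum, step_nm=150.0)
```

In a full pipeline these features would be computed per pixel and stacked into the vector fed to the classifier.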

  3. Framework for hyperspectral image processing and quantification for cancer detection during animal tumor surgery

    NASA Astrophysics Data System (ADS)

    Lu, Guolan; Wang, Dongsheng; Qin, Xulei; Halig, Luma; Muller, Susan; Zhang, Hongzheng; Chen, Amy; Pogue, Brian W.; Chen, Zhuo Georgia; Fei, Baowei

    2015-12-01

    Hyperspectral imaging (HSI) is an imaging modality that holds strong potential for rapid cancer detection during image-guided surgery. However, HSI data often need to be processed appropriately in order to extract the maximum useful information that differentiates cancer from normal tissue. We proposed a framework for hyperspectral image processing and quantification comprising image preprocessing, glare removal, feature extraction, and ultimately image classification. The framework has been tested on images from mice with head and neck cancer, using spectra from 450 to 900 nm. The image analysis computed Fourier coefficients, normalized reflectance, mean, and spectral derivatives for improved accuracy. The experimental results demonstrated the feasibility of the hyperspectral image processing and quantification framework for cancer detection during animal tumor surgery, in a challenging setting where sensitivity can be low due to a modest number of features present, but potential for fast image classification can be high. This HSI approach may have potential application in tumor margin assessment during image-guided surgery, where speed of assessment may be the dominant factor.

  4. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS

    PubMed Central

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, largely based on nonspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. To this end, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R² > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples and simultaneously provides direct insight into the structural features of lignin. PMID:28926698
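The internal-standard arithmetic described above can be sketched as follows: each pyrolysis product's 12C/13C peak-area ratio, corrected by its relative response factor, scales the known amount of 13C lignin IS added. The peak areas, RRFs, and averaging across products are hypothetical illustrations, not the paper's calibration.

```python
# Sketch of 13C-internal-standard quantification with RRF correction
# (illustrative values only).

def lignin_amount(products, is_amount_mg):
    """products: list of (area_12C, area_13C, rrf) per pyrolysis product.

    Each product gives an independent estimate of the 12C lignin amount;
    here the estimates are simply averaged.
    """
    total = 0.0
    for a12, a13, rrf in products:
        total += (a12 / a13) / rrf * is_amount_mg
    return total / len(products)

# Two hypothetical pyrolysis products from the same sample:
products = [(1500.0, 1000.0, 1.0), (900.0, 1000.0, 0.6)]
amount = lignin_amount(products, is_amount_mg=2.0)   # mg of 12C lignin
```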

  5. Matrix suppression as a guideline for reliable quantification of peptides by matrix-assisted laser desorption ionization.

    PubMed

    Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo

    2013-09-17

    We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts: normal and anomalous. In peptide quantification, the normal effect can be accounted for by constructing the calibration curve as peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect precludes reliable quantification and becomes noticeable when matrix suppression exceeds 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression stays below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high-concentration analytes has also been developed.
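    The ratio-based calibration and the 70% guideline can be sketched in a few lines. This is a hedged illustration of the scheme described above, not the authors' implementation; all names and numbers are hypothetical:

```python
# Illustrative sketch: calibration on the peptide-to-matrix ion abundance
# ratio, with the 70% matrix-suppression rule as a validity gate.
def matrix_suppression(matrix_signal_with_analyte, matrix_signal_blank):
    """Fractional loss of matrix ion abundance caused by the analyte load."""
    return 1.0 - matrix_signal_with_analyte / matrix_signal_blank

def quantify_peptide(peptide_signal, matrix_signal, slope, intercept,
                     suppression, limit=0.70):
    """Read a concentration off a (ratio vs. concentration) calibration line,
    refusing to report a value once suppression exceeds the 70% guideline."""
    if suppression > limit:
        raise ValueError("matrix suppression above 70%: quantification unreliable")
    ratio = peptide_signal / matrix_signal
    return (ratio - intercept) / slope
```

    A sample whose matrix signal drops by 80% relative to the blank is thus flagged rather than quantified.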

  6. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR.

    PubMed

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen

    2017-01-01

    Meat products often consist of meat from multiple animal species, and adulteration and mislabeling of food products can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.
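    The abstract does not give the authors' formulas, but droplet digital PCR readouts are conventionally computed with a Poisson correction on the fraction of droplets that remain negative. A hedged sketch of that standard calculation, applied to a two-channel duplex assay:

```python
import math

# Standard ddPCR arithmetic (an assumption, not the authors' exact pipeline):
# with random partitioning, the mean target copies per droplet is
# lambda = -ln(negative droplets / total droplets).
def copies_per_droplet(negative_droplets, total_droplets):
    """Mean target copies per droplet, assuming Poisson-distributed loading."""
    return -math.log(negative_droplets / total_droplets)

def beef_fraction(neg_beef, neg_pork, total_droplets):
    """Beef share of total target copies in a duplex (two-channel) assay."""
    beef = copies_per_droplet(neg_beef, total_droplets)
    pork = copies_per_droplet(neg_pork, total_droplets)
    return beef / (beef + pork)
```

    For example, if half of 10,000 droplets are negative in a channel, that channel carries ln 2 ≈ 0.69 copies per droplet.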

  7. Development and validation of a bioanalytical LC-MS method for the quantification of GHRP-6 in human plasma.

    PubMed

    Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier

    2012-02-23

    Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW=872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs such as the small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide in high yield. Quantification was achieved by LC-MS in positive full scan mode in a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a range for the calibration curve from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to analyze samples at higher concentrations of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and its internal standard (IS) was demonstrated under all conditions that the samples would experience during real sample analysis. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.

  8. Quantification of the degree of reaction of fly ash

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ben Haha, M., E-mail: mohsen.ben-haha@empa.c; De Weerdt, K., E-mail: klaartje.de.weerdt@sintef.n; Lothenbach, B.

    2010-11-15

    The quantification of fly ash (FA) in FA blended cements is an important parameter for understanding the effect of the fly ash on the hydration of OPC and on the microstructural development. The FA reaction in two different blended OPC-FA systems was studied using a selective dissolution technique based on EDTA/NaOH, dissolution in diluted NaOH solution, the portlandite content, and backscattered electron image analysis. The amount of FA determined by selective dissolution using EDTA/NaOH is found to be associated with a significant possible error, as different assumptions lead to large differences in the estimate of FA reacted. In addition, at longer hydration times, the reaction of the FA is underestimated by this method due to the presence of non-dissolved hydrates and MgO-rich particles. The dissolution of FA in diluted NaOH solution agreed well during the first days with the dissolution observed by image analysis. At 28 days and longer, the formation of hydrates in the diluted solutions leads to an underestimation. Image analysis appears to give consistent results and to be the most reliable technique studied.

  9. Performance comparison of an active matrix flat panel imager, computed radiography system, and a screen-film system at four standard radiation qualities.

    PubMed

    Monnin, P; Gutierrez, D; Bulling, S; Lepori, D; Valley, J F; Verdun, F R

    2005-02-01

    Four standard radiation qualities (from RQA 3 to RQA 9) were used to compare the imaging performance of a computed radiography (CR) system (general purpose and high resolution phosphor plates of a Kodak CR 9000 system), a selenium-based direct flat panel detector (Kodak Direct View DR 9000), and a conventional screen-film system (Kodak T-MAT L/RA film with a 3M Trimax Regular screen of speed 400) in conventional radiography. Reference exposure levels were chosen according to the manufacturer's recommendations to be representative of clinical practice (exposure index of 1700 for digital systems and a film optical density of 1.4). With the exception of the RQA 3 beam quality, the exposure levels needed to produce a mean digital signal of 1700 were higher than those needed to obtain a mean film optical density of 1.4. In spite of intense developments in the field of digital detectors, screen-film systems are still very efficient detectors for most of the beam qualities used in radiology. An important outcome of this study is the behavior of the detective quantum efficiency of the digital radiography (DR) system as a function of beam energy. The practice of users to increase beam energy when switching from a screen-film system to a CR system, in order to improve the compromise between patient dose and image quality, might not be appropriate when switching from screen-film to selenium-based DR systems.

  10. Quantification of Pulmonary Inflammatory Processes Using Chest Radiography: Tuberculosis as the Motivating Application

    PubMed Central

    Giacomini, Guilherme; Miranda, José R.A.; Pavan, Ana Luiza M.; Duarte, Sérgio B.; Ribeiro, Sérgio M.; Pereira, Paulo C.M.; Alves, Allan F.F.; de Oliveira, Marcela; Pina, Diana R.

    2015-01-01

    The purpose of this work was to develop a quantitative method for evaluating the pulmonary inflammatory process (PIP) through the computational analysis of chest radiography exams in posteroanterior (PA) and lateral views. The quantification procedure was applied to patients with tuberculosis (TB) as the motivating application. A study of high-resolution computed tomography (HRCT) examinations of patients with TB was developed to establish a relation between the inflammatory process and the signal difference-to-noise ratio (SDNR) measured in the PA projection. A phantom essay was used to validate this relation, which was implemented using an algorithm that is able to estimate the volume of the inflammatory region based solely on SDNR values in the chest radiographs of patients. The PIP volumes that were quantified for 30 patients with TB were used for comparisons with direct HRCT analysis of the same patients. The Bland-Altman statistical analyses showed no significant differences between the 2 quantification methods. The linear regression line had a correlation coefficient of R2 = 0.97 and P < 0.001, showing a strong association between the volume that was determined by our evaluation method and the results obtained by direct HRCT scan analysis. Since the diagnosis and follow-up of patients with TB is commonly performed using X-ray exams, the method developed herein can be considered an adequate tool for quantifying the PIP with a lower patient radiation dose and lower institutional cost. Although we used patients with TB for the application of the method, this method may be used for other pulmonary diseases characterized by a PIP. PMID:26131814
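    The SDNR that anchors the method above is a simple contrast-to-noise statistic. A minimal sketch, assuming pixel samples drawn from a lesion ROI and from a nearby background region of the radiograph (sampling strategy is hypothetical, not the paper's):

```python
from statistics import mean, stdev

# Minimal sketch of a signal difference-to-noise ratio (SDNR) measurement:
# mean signal of the ROI minus mean background, divided by background noise.
def sdnr(roi_pixels, background_pixels):
    """Contrast of the ROI relative to the background noise level."""
    return (mean(roi_pixels) - mean(background_pixels)) / stdev(background_pixels)
```

    An ROI averaging 16 over a background of 10 with noise 2 yields an SDNR of 3.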

  11. A preliminary study for fully automated quantification of psoriasis severity using image mapping

    NASA Astrophysics Data System (ADS)

    Mukai, Kazuhiro; Iyatomi, Hitoshi

    2014-03-01

    Psoriasis is a common chronic skin disease that seriously detracts from patients' quality of life (QoL). Since there is no known permanent cure so far, controlling the disease condition appropriately is necessary, and quantification of its severity is therefore important. In clinical practice, the psoriasis area and severity index (PASI) is commonly used for this purpose; however, it is often subjective and laborious. A fully automatic computer-assisted area and severity index (CASI) was proposed to provide an objective quantification of skin disease. It investigates the size and density of erythema based on digital image analysis, but it does not account for various confounding effects caused by different geometrical conditions during clinical follow-up (i.e., variability in direction and distance between camera and patient). In this study, we proposed an image alignment method for clinical images and investigated quantifying the severity of psoriasis under clinical follow-up combined with the idea of CASI. The proposed method finds geometrically corresponding points on the patient's body (ROI) between images with the Scale Invariant Feature Transform (SIFT) and performs an affine transform to map pixel values from one image to the other. In this study, clinical images from 7 patients with psoriasis lesions on their trunks under clinical follow-up were used. In each series, our image alignment algorithm aligns images to the geometry of the first image. Our proposed method aligned images appropriately on visual assessment and confirmed that psoriasis areas were properly extracted using the CASI approach. Although we cannot compare PASI and CASI directly due to their different definitions of the ROI, we confirmed a large correlation between those scores with our image quantification method.
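    The alignment step described above (matched keypoints, then an affine mapping) can be sketched independently of the feature detector. Assuming matched keypoint pairs are already available (e.g. from SIFT), a least-squares affine fit looks like this; all names are illustrative:

```python
import numpy as np

# Hedged sketch of the alignment step: given matched keypoint coordinates
# from two clinical images, fit a 2x3 affine transform by least squares and
# use it to map coordinates from one image onto the other.
def fit_affine(src, dst):
    """Least-squares affine transform mapping src points onto dst points."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    X = np.hstack([src, np.ones((len(src), 1))])  # homogeneous coordinates
    A, *_ = np.linalg.lstsq(X, dst, rcond=None)
    return A.T  # 2x3 matrix [linear part | translation]

def warp_points(A, pts):
    """Apply the fitted affine transform to an (N, 2) array of points."""
    pts = np.asarray(pts, float)
    return pts @ A[:, :2].T + A[:, 2]
```

    In practice the fitted transform would drive an image resampling step (and outlier-robust estimation such as RANSAC would typically precede the fit).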

  12. A critical view on microplastic quantification in aquatic organisms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vandermeersch, Griet, E-mail: griet.vandermeersch@ilvo.vlaanderen.be; Van Cauwenberghe, Lisbeth; Janssen, Colin R.

    Microplastics, plastic particles and fragments smaller than 5 mm, are ubiquitous in the marine environment. Ingestion and accumulation of microplastics have previously been demonstrated for diverse marine species ranging from zooplankton to bivalves and fish, implying the potential for microplastics to accumulate in the marine food web. In this way, microplastics can potentially impact food safety and human health. Although a few methods to quantify microplastics in biota have been described, no comparison and/or intercalibration of these techniques have been performed. Here we conducted a literature review on all available extraction and quantification methods. Two of these methods, involving wet acid destruction, were used to evaluate the presence of microplastics in field-collected mussels (Mytilus galloprovincialis) from three different “hotspot” locations in Europe (Po estuary, Italy; Tagus estuary, Portugal; Ebro estuary, Spain). An average of 0.18 ± 0.14 total microplastics g⁻¹ w.w. for the Acid Mix Method and 0.12 ± 0.04 total microplastics g⁻¹ w.w. for the Nitric Acid Method was established. Additionally, in a pilot study, an average load of 0.13 ± 0.14 total microplastics g⁻¹ w.w. was recorded in commercial mussels (Mytilus edulis and M. galloprovincialis) from five European countries (France, Italy, Denmark, Spain and The Netherlands). A detailed analysis and comparison of methods indicated the need for further research to develop a standardised operating protocol for microplastic quantification and monitoring.

  13. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    PubMed

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean), using normal phase liquid chromatography and applying chemometric tools, was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It has to be highlighted that, with the newly proposed analytical method, the chromatographic analysis takes only eight minutes; the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
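    The three validation metrics named above are standard and easy to state precisely. A minimal sketch, computed from paired reference and predicted values (variable names are illustrative):

```python
import math
from statistics import median

# Sketch of the three validation metrics named in the abstract, applied to
# paired reference (y_true) and predicted (y_pred) concentrations.
def rmsev(y_true, y_pred):
    """Root Mean Square Error of Validation."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def maev(y_true, y_pred):
    """Mean Absolute Error of Validation."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def mdaev(y_true, y_pred):
    """Median Absolute Error of Validation (robust to outlier samples)."""
    return median(abs(t - p) for t, p in zip(y_true, y_pred))
```

    The median-based MdAEV is less sensitive than RMSEV to a single badly predicted blend, which is why reporting all three gives a fuller picture of model quality.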

  14. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  15. Quantification of trace metals in water using complexation and filter concentration.

    PubMed

    Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel

    2010-06-15

    Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow for quantification in the ppm range. However, a proper pre-concentration of the colored complex on a paper filter lowers the quantification limit to the low-ppb range. In this study, several pre-concentration techniques have been examined and compared: filtering the already-complexed mixture, complexation on the filter, and dipping a dye-covered filter into the solution. The best quantification has been based on the ratio of the filter reflectance at a certain wavelength to that at zero metal concentration. The complex formations studied (Ni ions with TAN and Cd ions with PAN) involve the production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN has been investigated, and an optimum timing could be found. Kinetic optimization with regard to some interferences has also been suggested.

  16. Nitric Oxide Analyzer Quantification of Plant S-Nitrosothiols.

    PubMed

    Hussain, Adil; Yun, Byung-Wook; Loake, Gary J

    2018-01-01

    Nitric oxide (NO) is a small diatomic molecule that regulates multiple physiological processes in animals, plants, and microorganisms. In animals, it is involved in vasodilation and neurotransmission and is present in exhaled breath. In plants, it regulates both plant immune function and numerous developmental programs. The high reactivity and short half-life of NO and the cross-reactivity of its various derivatives make its quantification difficult. Different methods based on colorimetric, fluorometric, and chemiluminescent detection of NO and its derivatives are available, but all of them have significant limitations. Here we describe a method for quantification of NO in plants using ozone-chemiluminescence technology. This approach provides a sensitive, robust, and flexible means of determining the levels of NO and its signaling products, protein S-nitrosothiols.

  17. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation, that is, an ensemble of independent MD simulations, which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
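    The ensemble approach advocated above reduces, in its simplest form, to running N independent replicas of the same calculation and reporting the spread across them. A minimal sketch (illustrative only; the paper's protocol involves more than this):

```python
import math
from statistics import mean, stdev

# Minimal sketch of ensemble-based error estimation: run N independent MD
# replicas of the same free energy calculation, then report the ensemble
# mean and its standard error across replicas.
def ensemble_estimate(replica_free_energies):
    """Return (mean, standard error of the mean) over independent replicas."""
    n = len(replica_free_energies)
    return mean(replica_free_energies), stdev(replica_free_energies) / math.sqrt(n)
```

    The key point is that the error bar comes from replica-to-replica variation, not from the fluctuations within a single trajectory, which can badly underestimate the true uncertainty.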

  18. Sequence optimization to reduce velocity offsets in cardiovascular magnetic resonance volume flow quantification - A multi-vendor study

    PubMed Central

    2011-01-01

    Purpose: Eddy current induced velocity offsets are of concern for accuracy in cardiovascular magnetic resonance (CMR) volume flow quantification. However, currently known theoretical aspects of eddy current behavior have not led to effective guidelines for the optimization of flow quantification sequences. This study is aimed at identifying correlations between protocol parameters and the resulting velocity error in clinical CMR flow measurements in a multi-vendor study.

    Methods: Nine 1.5T scanners of three different types/vendors were studied. Measurements were performed on a large stationary phantom. Starting from a clinical breath-hold flow protocol, several protocol parameters were varied. Acquisitions were made in three clinically relevant orientations. Additionally, a time delay between the bipolar gradient and read-out, asymmetric versus symmetric velocity encoding, and gradient amplitude and slew rate were studied in adapted sequences as exploratory measurements beyond the protocol. Image analysis determined the worst-case offset for a typical great-vessel flow measurement.

    Results: The results showed great variation in offset behavior among scanners (standard deviation among samples of 0.3, 0.4, and 0.9 cm/s for the three different scanner types), even for small changes in the protocol. Considering the absolute values, none of the tested protocol settings consistently reduced the velocity offsets below the critical level of 0.6 cm/s, either for all three orientations or for all three scanner types. Using multilevel linear model analysis, oblique aortic and pulmonary slices showed systematically higher offsets than the transverse aortic slices (oblique aortic 0.6 cm/s, and pulmonary 1.8 cm/s, higher than transverse aortic). The exploratory measurements beyond the protocol yielded some new leads for further sequence development towards reduction of velocity offsets; however, those protocols were not always compatible with the time constraints of breath-hold acquisitions.

  19. Reference tissue quantification of DCE-MRI data without a contrast agent calibration

    NASA Astrophysics Data System (ADS)

    Walker-Samuel, Simon; Leach, Martin O.; Collins, David J.

    2007-02-01

    The quantification of dynamic contrast-enhanced (DCE) MRI data conventionally requires a conversion from signal intensity to contrast agent concentration by measuring a change in the tissue longitudinal relaxation rate, R1. In this paper, it is shown that the use of a spoiled gradient-echo acquisition sequence (optimized so that signal intensity scales linearly with contrast agent concentration) in conjunction with a reference tissue-derived vascular input function (VIF), avoids the need for the conversion to Gd-DTPA concentration. This study evaluates how to optimize such sequences and which dynamic time-series parameters are most suitable for this type of analysis. It is shown that signal difference and relative enhancement provide useful alternatives when full contrast agent quantification cannot be achieved, but that pharmacokinetic parameters derived from both contain sources of error (such as those caused by differences between reference tissue and region of interest proton density and native T1 values). It is shown in a rectal cancer study that these sources of uncertainty are smaller when using signal difference, compared with relative enhancement (15 ± 4% compared with 33 ± 4%). Both of these uncertainties are of the order of those associated with the conversion to Gd-DTPA concentration, according to literature estimates.
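    The two dynamic time-series parameters compared above are simple functions of the baseline and post-contrast signal. A minimal sketch, assuming s0 is the mean pre-contrast baseline signal and s_t a post-contrast sample (names are illustrative):

```python
# Sketch of the two alternative DCE-MRI time-series parameters discussed
# above, usable when full conversion to contrast agent concentration is not
# performed. s0: mean pre-contrast baseline signal; s_t: post-contrast signal.
def signal_difference(s_t, s0):
    """Absolute enhancement, in signal units."""
    return s_t - s0

def relative_enhancement(s_t, s0):
    """Enhancement normalized to the baseline signal."""
    return (s_t - s0) / s0
```

    Relative enhancement divides by the baseline, so errors in s0 (e.g. proton density or native T1 differences between reference tissue and ROI) propagate more strongly, consistent with the larger uncertainty reported for it above.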

  20. Quantification of localized vertebral deformities using a sparse wavelet-based shape model.

    PubMed

    Zewail, R; Elsafi, A; Durdle, N

    2008-01-01

    Medical experts often examine hundreds of spine x-ray images to determine the existence of various pathologies. Common pathologies of interest are anterior osteophytes, disc space narrowing, and wedging. By careful inspection of the outline shapes of the vertebral bodies, experts are able to identify and assess vertebral abnormalities with respect to the pathology under investigation. In this paper, we present a novel method for quantification of vertebral deformation using a sparse shape model. Using wavelets and independent component analysis (ICA), we construct a sparse shape model that benefits from the approximation power of wavelets and the capability of ICA to capture higher order statistics in wavelet space. The new model is able to capture localized pathology-related shape deformations, hence it allows for quantification of vertebral shape variations. We investigate the capability of the model to predict localized pathology-related deformations. Next, using support vector machines, we demonstrate the diagnostic capabilities of the method through the discrimination of anterior osteophytes in lumbar vertebrae. Experiments were conducted using a set of 150 contours from digital x-ray images of the lumbar spine. Each vertebra is labeled as normal or abnormal. Results reported in this work focus on anterior osteophytes as the pathology of interest.