[Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].
Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie
2013-11-01
In order to improve the accuracy of AES quantitative analysis, we combined XPS with AES and studied how to reduce the error of AES quantification. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS results were used to correct the AES quantification by adjusting the Auger relative sensitivity factors until the two techniques gave consistent compositions. The accuracy of AES quantification with the revised sensitivity factors was then verified on other samples with different composition ratios, and the results showed that the corrected relative sensitivity factors reduce the error of AES quantitative analysis to less than 10%. Peak definition is difficult when AES data are analyzed in the integral form, because the choice of the starting and ending points used to determine the characteristic Auger peak area carries great uncertainty. To simplify the analysis, we also processed the data in the differential form, performed the quantification on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and again verified the accuracy on other samples with different composition ratios. The analytical error of AES quantification was reduced to less than 9%. These results show that the accuracy of AES quantitative analysis can be greatly improved by using XPS to correct the Auger sensitivity factors, since matrix effects are thereby taken into account. The good consistency obtained demonstrates the feasibility of this method.
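As a minimal sketch of the sensitivity-factor arithmetic behind this kind of correction (not the paper's data; the intensities and factors below are invented), atomic fractions follow from peak heights divided by relative sensitivity factors and normalized:

# Relative-sensitivity-factor quantification for a binary alloy (illustrative values only).
def atomic_fractions(intensities, sensitivity):
    """C_i = (I_i / S_i) / sum_j (I_j / S_j)"""
    ratios = [i / s for i, s in zip(intensities, sensitivity)]
    total = sum(ratios)
    return [r / total for r in ratios]

I = [12500.0, 8300.0]          # peak-to-peak heights for, e.g., Pt and Co (arbitrary units)
S_handbook = [1.00, 0.76]      # handbook relative sensitivity factors (hypothetical)
S_corrected = [1.00, 0.85]     # factors corrected against XPS results (hypothetical)

print(atomic_fractions(I, S_handbook))   # composition with uncorrected factors
print(atomic_fractions(I, S_corrected))  # composition after XPS-based correction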
Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie
2015-01-01
To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-poly-ethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also to investigate the incremental role of the semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and by semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of the semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery: 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as the cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to the ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the difference was not statistically significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, visual analysis combined with semi-quantitative analysis gave higher sensitivity, specificity and accuracy in diagnosing primary breast cancer: 87.5%, 82.9%, and 85.4%, respectively, with an area under the curve of 0.891. The results of the present study suggest that semi-quantitative and visual analysis give statistically similar results. Semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer; in particular, when the tumor was located in the medial part of the breast, semi-quantitative analysis gave better diagnostic results.
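A short sketch of how sensitivity, specificity and the ROC area are obtained from a T/N cut-off; the ratios and labels below are synthetic stand-ins, not the study's patient data:

import numpy as np
from sklearn.metrics import roc_auc_score

tn_ratio  = np.array([1.2, 1.8, 2.2, 1.5, 1.1, 1.9, 2.6, 3.1, 2.8, 3.4])  # invented T/N ratios
malignant = np.array([0,   0,   0,   0,   0,   1,   1,   1,   1,   1])     # 1 = malignant (invented)

cutoff = 2.01                                  # cut-off reported in the abstract
pred = (tn_ratio >= cutoff).astype(int)
sens = ((pred == 1) & (malignant == 1)).sum() / (malignant == 1).sum()
spec = ((pred == 0) & (malignant == 0)).sum() / (malignant == 0).sum()
auc = roc_auc_score(malignant, tn_ratio)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} AUC={auc:.2f}")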
Ko, Dae-Hyun; Ji, Misuk; Kim, Sollip; Cho, Eun-Jung; Lee, Woochang; Yun, Yeo-Min; Chun, Sail; Min, Won-Ki
2016-01-01
The results of urine sediment analysis have traditionally been reported semiquantitatively. However, as recent guidelines recommend quantitative reporting of urine sediment, and with the development of automated urine sediment analyzers, there is an increasing need for quantitative analysis of urine sediment. Here, we developed a protocol for urine sediment analysis and quantified the results. Based on questionnaires, various reports, guidelines, and experimental results, we developed a protocol for urine sediment analysis. The results of this new protocol were compared with those obtained with a standardized chamber and an automated sediment analyzer. Reference intervals were also estimated using the new protocol. We developed a protocol with centrifugation at 400 g for 5 min and an average concentration factor of 30. The correlations between the quantitative results of urine sediment analysis, the standardized chamber, and the automated sediment analyzer were generally good. The conversion factor derived from the new protocol showed a better fit with the results of manual counting than the default conversion factor in the automated sediment analyzer. We developed a protocol for manual urine sediment analysis to report the results quantitatively. This protocol may provide a means for standardization of urine sediment analysis.
Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana
2014-01-01
Objective: To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods: TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results: Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions: The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between the TLC-densitometric and TLC-image analysis methods. As both methods were found to be equivalent, they can therefore be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
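A minimal sketch of the paired t-test used to compare two quantification methods applied to the same samples; the γ-oryzanol values below are invented for illustration:

from scipy import stats

densitometry   = [1.62, 1.48, 1.55, 1.70, 1.59]  # %w/w by TLC-densitometry (hypothetical)
image_analysis = [1.60, 1.50, 1.53, 1.72, 1.58]  # %w/w by TLC-image analysis (hypothetical)

t_stat, p_value = stats.ttest_rel(densitometry, image_analysis)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")    # p > 0.05 -> no significant difference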
ERIC Educational Resources Information Center
Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.
2017-01-01
The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…
NASA Astrophysics Data System (ADS)
Oh, Won Jin; Jang, Jong Shik; Lee, Youn Seoung; Kim, Ansoon; Kim, Kyung Joong
2018-02-01
Quantitative analysis methods for multi-element alloy films were compared. The atomic fractions of Si1-xGex alloy films were measured by depth profiling analysis with secondary ion mass spectrometry (SIMS) and X-ray photoelectron spectroscopy (XPS). An intensity-to-composition conversion factor (ICF) was used as a means of converting the intensities to compositions instead of the relative sensitivity factors. The ICFs were determined from a reference Si1-xGex alloy film by the conventional method, the average intensity (AI) method and the total number counting (TNC) method. In the case of SIMS, although the atomic fractions measured with oxygen ion beams were not quantitative due to severe matrix effects, the results obtained with the cesium ion beam were very quantitative. The quantitative analysis results by SIMS using MCs2+ ions are comparable to the results by XPS. In the case of XPS, the measurement uncertainty was greatly improved by the AI method and the TNC method.
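A sketch of how an intensity-to-composition conversion factor determined from a reference film can be applied to an unknown film; the counts and the reference composition are illustrative, not measured values from this work:

import numpy as np

# Depth-profile intensities (counts) for Si and Ge in a reference film of known x = 0.30.
I_si_ref = np.array([9800., 9750., 9820., 9790.])
I_ge_ref = np.array([4100., 4150., 4080., 4120.])
x_ref = 0.30

# Average-intensity style determination of the Ge/Si conversion factor from the reference film.
icf_ratio = (I_ge_ref.mean() / I_si_ref.mean()) * ((1 - x_ref) / x_ref)

# Apply to an unknown film: convert measured intensities to a Ge atomic fraction.
I_si_unk, I_ge_unk = 9500., 6200.
ratio = (I_ge_unk / I_si_unk) / icf_ratio    # equals x / (1 - x) for the unknown film
x_unknown = ratio / (1 + ratio)
print(f"estimated Ge fraction x = {x_unknown:.3f}")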
An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise
ERIC Educational Resources Information Center
Parker, Richard H.
2011-01-01
An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…
Bhaduri, Anirban; Ghosh, Dipak
2016-01-01
The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also produce a preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation. PMID:26909045
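A minimal sketch of one of the two techniques named above, the construction of a natural visibility graph from a heart-rate series, whose degree statistics serve as quantitative network parameters; the series is a toy example, not PhysioNet data:

# Natural visibility graph: samples a and b are linked if every intermediate sample lies
# below the straight line joining them.
def visibility_edges(y):
    n = len(y)
    edges = []
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.append((a, b))
    return edges

heart_rate = [72, 75, 71, 80, 78, 74, 82, 76]   # beats/minute, invented values
edges = visibility_edges(heart_rate)
degree = [sum(1 for e in edges if i in e) for i in range(len(heart_rate))]
print(edges)
print("mean degree:", sum(degree) / len(degree))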
[A new method of processing quantitative PCR data].
Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun
2003-05-01
Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After extensive kinetic studies, PE found that there is a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too large to satisfy the needs of biotechnology development and clinical research, and a better quantitative PCR technique is needed. The mathematical model presented here draws on results from related fields and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number and the other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this functional relation: the initial template number can be recovered from the accumulated PCR product quantity. When this model is used for quantitative PCR analysis, the error of the result depends only on the accuracy of the fluorescence intensity measurement, i.e., on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template number is between 100 and 1,000,000, the accuracy of the quantitative result exceeds 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing the data with the proposed quantitative analysis system gives results about 80 times more accurate than the CT method.
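The abstract does not reproduce the model itself, so as a generic illustration of quantifying the initial template from fluorescence data, the sketch below fits the simple exponential amplification relation N_n = N0 * (1 + E)^n to simulated readings (values and noise level are invented):

import numpy as np

cycles = np.arange(10, 21)                    # cycles in the exponential phase
true_N0, efficiency = 5.0e3, 0.95
signal = true_N0 * (1 + efficiency) ** cycles
signal *= np.random.default_rng(0).normal(1.0, 0.01, size=cycles.size)  # 1% noise

# Linear fit in log space: log(signal) = log(N0) + n * log(1 + E)
slope, intercept = np.polyfit(cycles, np.log(signal), 1)
est_E = np.exp(slope) - 1
est_N0 = np.exp(intercept)
print(f"estimated efficiency = {est_E:.3f}, estimated N0 = {est_N0:.3e}")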
Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi
2016-03-01
This paper aims to explore a comprehensive assessment method combining traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected on the market and at producing areas. Traditional commercial specifications were described and assigned, and 10 chemical components and the volatile oils were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to establish the relationship between the traditional qualitative commercial specifications and the quantitative chemical indices, in order to comprehensively evaluate the quality of the medicinal materials and to classify commercial grade and quality grade quantitatively. A herb quality index (HQI) combining traditional commercial specifications and chemical components was established for quantitative grade classification, and the corresponding discriminant functions were derived for precise determination of the quality grade and sub-grade of Notopterygii Rhizoma et Radix. The results showed that notopterol, isoimperatorin and volatile oil were the major components for determination of chemical quality, and their dividing values were specified for every grade and sub-grade of the commercial materials of Notopterygii Rhizoma et Radix. These results show that the essential relationship between traditional medicinal indicators, qualitative commercial specifications, and quantitative chemical composition indicators can be examined by k-means clustering, Fisher discriminant analysis and correspondence analysis, which provides a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality that integrates traditional commodity specifications and modern quantitative chemical indices. Copyright© by the Chinese Pharmaceutical Association.
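A sketch of the cluster-then-discriminate workflow described above: k-means groups samples by their chemical indices and a Fisher (linear) discriminant then assigns grades to new samples. The three-column data (standing in for notopterol, isoimperatorin, volatile oil) are invented:

import numpy as np
from sklearn.cluster import KMeans
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
grade_a = rng.normal([1.8, 0.9, 3.0], 0.1, size=(20, 3))   # synthetic high-grade samples
grade_b = rng.normal([1.2, 0.6, 2.2], 0.1, size=(20, 3))   # synthetic low-grade samples
X = np.vstack([grade_a, grade_b])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
lda = LinearDiscriminantAnalysis().fit(X, clusters)        # Fisher discriminant on cluster labels
new_sample = [[1.7, 0.85, 2.9]]
print("predicted grade cluster:", lda.predict(new_sample))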
ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra
NASA Astrophysics Data System (ADS)
Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.
2011-08-01
Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to the traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, many processing packages are available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or a few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use and is provided with source code upon request.
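A minimal sketch of the batch-integration-to-CSV idea described above (this is not ImatraNMR itself, which is written in Java; the region names, ppm ranges and synthetic spectra are assumptions for illustration):

import csv
import numpy as np

regions = {"signal_A": (3.9, 4.1), "signal_B": (1.1, 1.3)}   # ppm ranges (hypothetical)
ppm = np.linspace(0, 10, 2000)

def integrate(ppm, intensity, lo, hi):
    mask = (ppm >= lo) & (ppm <= hi)
    return np.trapz(intensity[mask], ppm[mask])

with open("integrals.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["spectrum"] + list(regions))
    for k in range(3):                                        # stand-in for a large batch
        intensity = (k + 1) * np.exp(-((ppm - 4.0) ** 2) / 0.002)   # synthetic peak at 4.0 ppm
        row = [f"{integrate(ppm, intensity, lo, hi):.4f}" for lo, hi in regions.values()]
        writer.writerow([f"spectrum_{k}"] + row)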
2017-05-10
Quantitative analysis of repertoire-scale immunoglobulin properties in vaccine-induced B-cell responses. Khavrutskii, Ilja V.; Chaudhury, Sidhartha. Through the use of appropriate statistical analyses, the repertoire profiles can be quantitatively compared; this was used to characterize the B-cell response to eVLP and to quantitatively compare GC B-cell repertoires from different immunization conditions, partitioning the resulting clonotypes.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-20
... determined that the quantitative analysis of the energy consumption of buildings built to Standard 90.1-2007 ... (table-of-contents fragments: Public Comments Regarding the Preliminary Determination; Summary of the Comparative Analysis; Quantitative Analysis; Discussion of Whole Building Energy Analysis; Results of Whole Building Energy Analysis)
NASA Astrophysics Data System (ADS)
Chen, Yi; Ma, Yong; Lu, Zheng; Peng, Bei; Chen, Qin
2011-08-01
In the field of anti-illicit drug applications, many suspicious mixture samples may consist of various drug components—for example, a mixture of methamphetamine, heroin, and amoxicillin—which makes spectral identification very difficult. A terahertz spectroscopic quantitative analysis method using an adaptive range micro-genetic algorithm with a variable internal population (ARVIPɛμGA) has been proposed. Five mixture cases are discussed using ARVIPɛμGA-driven quantitative terahertz spectroscopic analysis in this paper. The simulation results agree with previous experimental results and with results obtained using other experimental and numerical techniques, suggesting that the proposed technique has potential applications for terahertz spectral identification of drug mixture components.
Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C
2012-04-01
The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.
USDA-ARS's Scientific Manuscript database
A quantitative answer cannot exist in an analysis without a qualitative component to give enough confidence that the result meets the analytical needs for the analysis (i.e. the result relates to the analyte and not something else). Just as a quantitative method must typically undergo an empirical ...
Economic analysis of light brown apple moth using GIS and quantitative modeling
Glenn Fowler; Lynn Garrett; Alison Neeley; Roger Magarey; Dan Borchert; Brian Spears
2011-01-01
We conducted an economic analysis of the light brown apple moth (LBAM), Epiphyas postvittana (Walker), whose presence in California has resulted in a regulatory program. Our objective was to quantitatively characterize the economic costs to apple, grape, orange, and pear crops that would result from LBAM's introduction into the continental...
Analysis of airborne MAIS imaging spectrometric data for mineral exploration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Jinnian; Zheng Lanfen; Tong Qingxi
1996-11-01
The high spectral resolution imaging spectrometric system makes quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach for analysis of surface parameters from imaging spectrometer data. This paper describes the methods and the stages of quantitative analysis. (1) Extracting surface reflectance from the imaging spectrometer image: laboratory and in-flight field measurements are conducted for calibration of the imaging spectrometer data, and atmospheric correction has also been used to obtain ground reflectance by using the empirical line method and radiative transfer modeling. (2) Determining the quantitative relationship between absorption band parameters from the imaging spectrometer data and the chemical composition of minerals. (3) Spectral comparison between the spectra of a spectral library and the spectra derived from the imagery. Wavelet-analysis-based spectrum-matching techniques for quantitative analysis of imaging spectrometer data have been developed. Airborne MAIS imaging spectrometer data were used for the analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.
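A minimal sketch of the empirical line method mentioned in stage (1): a per-band gain and offset are fitted from calibration targets of known reflectance and then applied to image radiance. The target values below are invented for illustration:

import numpy as np

# Field-measured reflectance of two calibration targets and their image radiance (one band).
target_reflectance = np.array([0.05, 0.45])      # dark and bright targets (hypothetical)
target_radiance = np.array([12.0, 83.0])         # sensor values (arbitrary units)

gain, offset = np.polyfit(target_radiance, target_reflectance, 1)

image_radiance = np.array([20.0, 35.0, 60.0])    # pixels to convert
reflectance = gain * image_radiance + offset
print(reflectance)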
Testa, Maria; Livingston, Jennifer A; VanZile-Tamsen, Carol
2011-02-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women's sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided.
Sub-band denoising and spline curve fitting method for hemodynamic measurement in perfusion MRI
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Huang, Hsiao-Ling; Hsu, Yuan-Yu; Chen, Chi-Chen; Chen, Ing-Yi; Wu, Liang-Chi; Liu, Ren-Shyan; Lin, Kang-Ping
2003-05-01
In clinical research, non-invasive MR perfusion imaging is capable of investigating brain perfusion phenomena via various hemodynamic measurements, such as cerebral blood volume (CBV), cerebral blood flow (CBF), and mean transit time (MTT). These hemodynamic parameters are useful in diagnosing brain disorders such as stroke, infarction and peri-infarct ischemia by further semi-quantitative analysis. However, the accuracy of quantitative analysis is usually limited by poor signal-to-noise image quality. In this paper, we propose a hemodynamic measurement method based upon sub-band denoising and spline curve fitting to improve image quality and thereby obtain better hemodynamic quantitative analysis results. Ten sets of perfusion MRI data and corresponding PET images were used to validate the performance. For quantitative comparison, we evaluated the gray/white matter CBF ratio. The resulting semi-quantitative mean gray-to-white-matter CBF ratio was 2.10 +/- 0.34, comparable to that obtained with the PET technique, with less than 1% difference on average. Furthermore, the method features excellent noise reduction and boundary preservation in image processing, and a short hemodynamic measurement time.
ERIC Educational Resources Information Center
Ling, Chris D.; Bridgeman, Adam J.
2011-01-01
Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…
Monakhova, Yulia B; Mushtakova, Svetlana P
2017-05-01
A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
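A sketch of the ICA-plus-calibration idea described above, applied to simulated overlapping spectra (the component bands, mixing concentrations and the calibration target are all invented; the matching of ICA components to analytes is done by correlation, a common heuristic):

import numpy as np
from sklearn.decomposition import FastICA

wavelengths = np.linspace(200, 400, 500)
s1 = np.exp(-((wavelengths - 260) ** 2) / 200)            # pure component spectra (synthetic)
s2 = np.exp(-((wavelengths - 300) ** 2) / 400)
conc = np.array([[1.0, 0.5], [0.6, 1.2], [1.5, 0.8], [0.3, 1.6], [1.1, 1.1]])
mixtures = conc @ np.vstack([s1, s2])                     # Beer-Lambert style mixing

ica = FastICA(n_components=2, random_state=0)
scores = ica.fit_transform(mixtures)                      # per-mixture component weights

# Match the ICA component to analyte 1 by correlation, then build a calibration line.
corr = [abs(np.corrcoef(scores[:, j], conc[:, 0])[0, 1]) for j in range(2)]
j = int(np.argmax(corr))
slope, intercept = np.polyfit(scores[:, j], conc[:, 0], 1)
print("predicted concentrations:", slope * scores[:, j] + intercept)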
Huang, An-Min; Fei, Ben-Hua; Jiang, Ze-Hui; Hse, Chung-Yun
2007-09-01
Near infrared spectroscopy is widely used as a quantitative method, and the main multivariate techniques are regression methods used to build prediction models; however, the accuracy of the analysis results is affected by many factors. In the present paper, the influence of different sample roughness on the mathematical model for NIR quantitative analysis of wood density was studied. The experiments showed that when the roughness of the prediction samples was consistent with that of the calibration samples the results were good, otherwise the error was much higher. The roughness-mixed model was more flexible and adaptable to different sample roughness, and its prediction ability was much better than that of the single-roughness models.
Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li
2013-01-21
A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As the region-based shape features of a grayscale image, Zernike moments with inherently invariance property were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R(2)) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of established models. The analytical results suggest that the Zernike moment selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
An experimental design for quantification of cardiovascular responses to music stimuli in humans.
Chang, S-H; Luo, C-H; Yeh, T-L
2004-01-01
There has been considerable research on the relationship between music and human physiological or psychological responses. However, some cardiovascular index factors have not been explored quantitatively because of the qualitative nature of acoustic stimuli. This study proposes and demonstrates an experimental design for the quantification of cardiovascular responses to music stimuli in humans. The system comprises two components: a unit for generating and monitoring quantitative acoustic stimuli, and a portable autonomic nervous system (ANS) analysis unit for quantitative recording and analysis of the cardiovascular responses. The experimental results indicate that the proposed system achieves full control and measurement of the music stimuli and effectively supports many quantitative indices of cardiovascular response in humans. In addition, the analysis results and their implications for future clinical research are discussed.
Ulgen, Ayse; Han, Zhihua; Li, Wentian
2003-12-31
We address the question of whether statistical correlations among quantitative traits lead to correlation of linkage results of these traits. Five measured quantitative traits (total cholesterol, fasting glucose, HDL cholesterol, blood pressure, and triglycerides), and one derived quantitative trait (total cholesterol divided by the HDL cholesterol) are used for phenotype correlation studies. Four of them are used for linkage analysis. We show that although correlation among phenotypes partially reflects the correlation among linkage analysis results, the LOD-score correlations are on average low. The most significant peaks found by using different traits do not often overlap. Studying covariances at specific locations in LOD scores may provide clues for further bivariate linkage analyses.
Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C
2015-02-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis method, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or not possible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance with technical, radiological, and statistical experts developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Wang, Li-Li; Zhang, Yun-Bin; Sun, Xiao-Ya; Chen, Sui-Qing
2016-05-08
To establish a quantitative analysis of multi-components by the single marker (QAMS) method for quality evaluation and to validate its feasibility by the simultaneous quantitative assay of four main components in Linderae Reflexae Radix. Four main components, pinostrobin, pinosylvin, pinocembrin, and 3,5-dihydroxy-2-(1-p-menthenyl)-trans-stilbene, were selected as analytes to evaluate the quality by RP-HPLC coupled with a UV detector. The method was evaluated by comparing the quantitative results of the external standard method and QAMS on different HPLC systems. The results showed no significant differences between the contents of the four components of Linderae Reflexae Radix determined by the external standard method and by QAMS (RSD <3%). The contents of the four analytes (pinosylvin, pinocembrin, pinostrobin, and Reflexanbene I) in Linderae Reflexae Radix were determined using the single marker pinosylvin. The fingerprints were determined on Shimadzu LC-20AT and Waters e2695 HPLC systems equipped with three different columns.
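A short sketch of the QAMS arithmetic: a relative correction factor links each analyte's detector response to that of the single marker, so one reference standard can quantify all components. The peak areas and concentrations below are invented for illustration:

# Calibration with standards of the marker (s) and one analyte (k).
A_s, C_s = 1520.0, 40.0          # marker peak area and concentration (ug/mL, hypothetical)
A_k, C_k = 980.0, 25.0           # analyte peak area and concentration (ug/mL, hypothetical)
f_k = (A_s / C_s) / (A_k / C_k)  # relative correction factor

# Sample run: only the marker standard is injected alongside the sample.
A_s_std, C_s_std = 1490.0, 40.0  # marker standard in the sample run
A_k_sample = 1103.0              # analyte peak area in the sample
C_k_sample = A_k_sample * f_k * C_s_std / A_s_std
print(f"f_k = {f_k:.3f}, analyte concentration = {C_k_sample:.2f} ug/mL")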
Multi-frequency local wavenumber analysis and ply correlation of delamination damage.
Juarez, Peter D; Leckey, Cara A C
2015-09-01
Wavenumber domain analysis through use of scanning laser Doppler vibrometry has been shown to be effective for non-contact inspection of damage in composites. Qualitative and semi-quantitative local wavenumber analysis of realistic delamination damage and quantitative analysis of idealized damage scenarios (Teflon inserts) have been performed previously in the literature. This paper presents a new methodology based on multi-frequency local wavenumber analysis for quantitative assessment of multi-ply delamination damage in carbon fiber reinforced polymer (CFRP) composite specimens. The methodology is presented and applied to a real world damage scenario (impact damage in an aerospace CFRP composite). The methodology yields delamination size and also correlates local wavenumber results from multiple excitation frequencies to theoretical dispersion curves in order to robustly determine the delamination ply depth. Results from the wavenumber based technique are validated against a traditional nondestructive evaluation method. Published by Elsevier B.V.
Chen, Jing; Wang, Shu-Mei; Meng, Jiang; Sun, Fei; Liang, Sheng-Wang
2013-05-01
To establish a new method for quality evaluation and validate its feasibility by the simultaneous quantitative assay of five alkaloids in Sophora flavescens. The new quality evaluation method, quantitative analysis of multi-components by single marker (QAMS), was established and validated with S. flavescens. Five main alkaloids, oxymatrine, sophocarpine, matrine, oxysophocarpine and sophoridine, were selected as analytes to evaluate the quality of the rhizome of S. flavescens, and the relative correction factors showed good repeatability. Their contents in 21 batches of samples, collected from different areas, were determined by both the external standard method and QAMS. The method was evaluated by comparing the quantitative results of the external standard method and QAMS. No significant differences were found between the contents of the five alkaloids in the 21 batches of S. flavescens determined by the external standard method and by QAMS. It is feasible and suitable to evaluate the quality of the rhizome of S. flavescens by QAMS.
Developing Sampling Frame for Case Study: Challenges and Conditions
ERIC Educational Resources Information Center
Ishak, Noriah Mohd; Abu Bakar, Abu Yazid
2014-01-01
Because of the statistical analysis involved, the issue of random sampling is pertinent to any quantitative study. Unlike in quantitative studies, the absence of inferential statistical analysis allows qualitative researchers to be more creative in dealing with sampling issues. Since results from a qualitative study cannot be generalized to the bigger population,…
Seniors' Online Communities: A Quantitative Content Analysis
ERIC Educational Resources Information Center
Nimrod, Galit
2010-01-01
Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…
NASA Astrophysics Data System (ADS)
Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua
2017-10-01
Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of using short-wave NIRS to analyze wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by the spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method; 17 wet-gluten-sensitive variables were selected by the GA, and the GA model performed better than the all-variable model, with R2V = 0.88 and RMSEV = 1.47. For qualitative analysis, the automatic weighted least squares baseline was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates for the three classes of <24%, 24-30%, and >30% wet gluten content were 95.45%, 84.52%, and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian
2018-01-01
To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
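A minimal sketch of the two agreement statistics reported above: Kendall's tau between the quantitative ordering and one reviewer, and Kendall's coefficient of concordance W across several reviewers. The rankings (6 images, 3 reviewers) are invented:

import numpy as np
from scipy.stats import kendalltau

quantitative = [1, 2, 3, 4, 5, 6]                       # image-analysis ranking (hypothetical)
reviewer_ranks = np.array([[1, 2, 3, 5, 4, 6],
                           [2, 1, 3, 4, 5, 6],
                           [1, 2, 4, 3, 5, 6]])         # reviewer rankings (hypothetical)

tau, _ = kendalltau(quantitative, reviewer_ranks[0])
print(f"Kendall's tau (quantitative vs reviewer 1) = {tau:.2f}")

# Kendall's W across the m reviewers (no ties): W = 12*S / (m^2 * (n^3 - n)).
m, n = reviewer_ranks.shape
rank_sums = reviewer_ranks.sum(axis=0)
S = ((rank_sums - rank_sums.mean()) ** 2).sum()
W = 12 * S / (m ** 2 * (n ** 3 - n))
print(f"Kendall's W = {W:.2f}")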
Implementing a Quantitative Analysis Design Tool for Future Generation Interfaces
2012-03-01
...with Remotely Piloted Aircraft (RPA) has resulted in the need for a platform to evaluate interface design. The Vigilant Spirit Control Station (VSCS) ... Spirit interface. A modified version of the HCI Index was successfully applied to perform a quantitative analysis of the baseline VSCS interface and ... time of the original VSCS interface. These results revealed the effectiveness of the tool and demonstrated its use in the design of future-generation interfaces.
A Qualitative-Quantitative H-NMR Experiment for the Instrumental Analysis Laboratory.
ERIC Educational Resources Information Center
Phillips, John S.; Leary, James J.
1986-01-01
Describes an experiment combining qualitative and quantitative information from hydrogen nuclear magnetic resonance spectra. Reviews theory, discusses the experimental approach, and provides sample results. (JM)
Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan
2015-06-01
Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and the stability analysis of the estimator is given. Theoretical analysis and simulation experimental results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and the marzipan are used to build the calibration model using partial least squares (PLS) modeling. The results show that the PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
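A sketch of the baseline against which the proposed estimator is compared, Savitzky-Golay derivative estimation from a noisy spectrum (the SPSE itself is not reproduced here; the spectrum, window length and polynomial order are assumptions):

import numpy as np
from scipy.signal import savgol_filter

x = np.linspace(1100, 2500, 700)                       # wavelength axis (nm), synthetic
spectrum = (np.exp(-((x - 1700) ** 2) / 5000)
            + 0.01 * np.random.default_rng(0).normal(size=x.size))   # noisy band

dx = x[1] - x[0]
first_derivative = savgol_filter(spectrum, window_length=15, polyorder=3,
                                 deriv=1, delta=dx)
print(first_derivative[:5])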
Haiyang, Yu; Tian, Luo
2016-06-01
Target restoration space (TRS) is the precise space required for designing an optimal prosthesis. TRS consists of an internal or external tooth space that determines the esthetics and function of the final restoration. Assisted by quantitative analysis and transfer, TRS quantitative analysis is therefore a significant improvement for minimal tooth preparation. This article presents TRS quantity-related measurement, analysis and transfer, and the internal relevance of the three TRS classifications. The results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.
Quantitative 13C NMR characterization of fast pyrolysis oils
Happs, Renee M.; Lisa, Kristina; Ferrell, III, Jack R.
2016-10-20
Quantitative 13C NMR analysis of model catalytic fast pyrolysis (CFP) oils following literature procedures showed poor agreement for aromatic hydrocarbons between NMR measured concentrations and actual composition. Furthermore, modifying integration regions based on DEPT analysis for aromatic carbons resulted in better agreement. Solvent effects were also investigated for hydrotreated CFP oil.
Approaches to quantitating the results of differentially dyed cottons
USDA-ARS's Scientific Manuscript database
The differential dyeing (DD) method has served as a subjective method for visually determining immature cotton fibers. In an attempt to quantitate the results of the differential dyeing method, and thus offer an efficient means of elucidating cotton maturity without visual discretion, image analysi...
Reciprocal contribution analysis of the left and right hips while walking
NASA Astrophysics Data System (ADS)
Tsuruoka, Yuriko; Tamura, Yoshiyasu; Shibasaki, Ryosuke
2007-10-01
The physical posture of even healthy university students readily deteriorates when they walk with textbooks and other heavy loads while attending university, and consequently they may experience lower-back or knee pain. However, the resulting burden of this stress on the left and right lower back has not previously been quantitatively analyzed. In this study, we employed a Relative Power Contribution (RPC) analysis approach to quantitatively investigate and compare the reciprocal contribution between the left and right lower back while walking with and without a bag. Quantitative data were collected by two accelerometers attached to the subjects. The results for the subjects walking with and without a bag indicated that the contribution of the left and right lower back decreased by up to 21% (p<0.05). Some disorder occurs in the feedback relations of the movement of both sides of the lower back and, as a result, was understood to cause much of the discomfort in these areas. This analysis reveals the quantitative relations between the left and right lower back, which are difficult to discern from the original data. The results can be useful for preventive healthcare for lower-back and knee pain.
Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan
2008-09-01
In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system was demonstrated for the qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function gene and two reference genes of RRS were amplified simultaneously by multiplex PCR. The multiplex PCR products were then labeled with acridinium ester at the 5'-terminal through an amino modification and analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis was determined to be better than 0.91% and 3.07% (RSD, n=15), respectively, over three consecutive days. This method could accurately and qualitatively detect RRS standards and simulated samples. The quantitative analysis of RRS by this new method was confirmed by comparing our assay results with those of standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dye, and the results showed good coherence between the two methods. This approach demonstrates the possibility of accurate qualitative and quantitative detection of GM plants in a single run.
Design and analysis issues in quantitative proteomics studies.
Karp, Natasha A; Lilley, Kathryn S
2007-09-01
Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.
Ayad, Essam; Mansy, Mina; Elwi, Dalal; Salem, Mostafa; Salama, Mohamed; Kayser, Klaus
2015-01-01
Optimization of the workflow for breast cancer samples with equivocal human epidermal growth factor receptor 2 (HER2)/neu score 2(+) results in routine practice remains a central focus of the ongoing efforts to assess HER2 status. According to the College of American Pathologists/American Society of Clinical Oncology guidelines, equivocal HER2/neu score 2(+) cases are subject to further testing, usually by fluorescence in situ hybridization (FISH). It remains an open question whether quantitative digital image analysis of HER2 immunohistochemistry (IHC) stained slides can assist in further refining the HER2 score 2(+). To assess the utility of quantitative digital analysis of IHC stained slides and compare its performance to FISH in cases of breast cancer with equivocal HER2 score 2(+), fifteen specimens (previously diagnosed as breast cancer and evaluated as HER2 score 2(+)) represented the study population. New cuts were prepared for re-evaluation by HER2 immunohistochemistry and FISH examination. All the cases were digitally scanned by iScan (produced by BioImagene [now Roche-Ventana]). The IHC signals of HER2 were measured using an automated image analysis system (MECES, www.Diagnomx.eu/meces). Finally, a comparative study was done between the results of FISH and the quantitative analysis of the virtual slides. Three of the 15 cases with equivocal HER2 score 2(+) turned out to be positive (3(+)) by quantitative digital analysis, and the remaining 12 were negative by both digital analysis and FISH. Two of the three positive cases proved to be positive with FISH, and only one was negative. Quantitative digital analysis is highly sensitive and relatively specific when compared to FISH in detecting HER2/neu overexpression. Therefore, it represents a potentially reliable substitute for FISH in breast cancer cases that require further refinement of equivocal IHC results.
[Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].
Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun
2015-07-01
Many factors influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis shows that the background spectrum and the characteristic spectral lines follow approximately the same trend as the temperature changes, so signal-to-background ratio (S/B) measurement combined with regression analysis can compensate for the changes in spectral line intensity caused by system parameters such as laser power and the spectral efficiency of the receiving optics. Because the measurement data were limited and nonlinear, support vector machine (SVM) regression was used. The experimental results showed that the method can improve the stability and the accuracy of LIBS quantitative analysis; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. The data fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, the background spectrum and other factors, and provides a data-processing reference for real-time online LIBS quantitative analysis.
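A sketch of the S/B-plus-SVM-regression idea: the signal-to-background ratio of an analyte line is the input feature of a support vector regression model that predicts concentration. The calibration intensities and concentrations below are synthetic stand-ins:

import numpy as np
from sklearn.svm import SVR

peak = np.array([310., 520., 745., 980., 1190., 1420.])     # analyte line intensities (invented)
background = np.array([100., 104., 98., 102., 101., 99.])   # nearby background (invented)
s_over_b = (peak / background).reshape(-1, 1)
concentration = np.array([5., 10., 15., 20., 25., 30.])     # mg/L of the calibration standards

model = SVR(kernel="rbf", C=100.0, epsilon=0.1).fit(s_over_b, concentration)
unknown_sb = np.array([[8.6]])                               # S/B of an unknown sample
print("predicted concentration:", model.predict(unknown_sb))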
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Bahn, G. S.
1977-01-01
Statistical analysis techniques were applied to develop quantitative relationships between in situ river measurements and the remotely sensed data that were obtained over the James River in Virginia on 28 May 1974. The remotely sensed data were collected with a multispectral scanner and with photographs taken from an aircraft platform. Concentration differences among water quality parameters such as suspended sediment, chlorophyll a, and nutrients indicated significant spectral variations. Calibrated equations from the multiple regression analysis were used to develop maps that indicated the quantitative distributions of water quality parameters and the dispersion characteristics of a pollutant plume entering the turbid river system. Results from further analyses that use only three preselected multispectral scanner bands of data indicated that regression coefficients and standard errors of estimate were not appreciably degraded compared with results from the 10-band analysis.
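A minimal sketch of the calibration step described above, multiple linear regression of an in situ water-quality parameter on several scanner band radiances; the band values and sediment concentrations are invented for illustration:

import numpy as np

bands = np.array([[42., 61., 33.],        # radiance in three scanner bands per station (invented)
                  [55., 70., 41.],
                  [38., 58., 30.],
                  [60., 75., 45.],
                  [47., 66., 36.]])
suspended_sediment = np.array([18., 31., 14., 37., 24.])   # mg/L from boat samples (invented)

X = np.column_stack([np.ones(len(bands)), bands])          # add intercept column
coef, *_ = np.linalg.lstsq(X, suspended_sediment, rcond=None)
predicted = X @ coef
residual_df = len(bands) - X.shape[1]
print("coefficients:", coef)
print("standard error of estimate:",
      np.sqrt(((suspended_sediment - predicted) ** 2).sum() / residual_df))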
Aims: To determine the performance of a rapid, real-time polymerase chain reaction (PCR) method for the detection and quantitative analysis of Helicobacter pylori at low concentrations in drinking water.
Methods and Results: A rapid DNA extraction and quantitative PCR (QPCR)...
Zhang, Yin; Wang, Lei; Diao, Tianxi
2013-12-01
The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has a significant referential meaning for the decision making of global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivities of the CTSA program had a stable increase since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits from the CTSA program were assisting its members to build a robust academic home for the Clinical and Translational Science and to attract other financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.
Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis
ERIC Educational Resources Information Center
Rubin, Samuel J.; Abrams, Binyomin
2015-01-01
Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…
QUANTITATIVE PCR ANALYSIS OF MOLDS IN THE DUST FROM HOMES OF ASTHMATIC CHILDREN IN NORTH CAROLINA
The vacuum bag (VB) dust was analyzed by mold specific quantitative PCR. These results were compared to the analysis survey calculated for each of the homes. The mean and standard deviation (SD) of the ERMI values in the homes of the NC asthmatic children was 16.4 (6.77), compa...
Uncertainty of quantitative microbiological methods of pharmaceutical analysis.
Gunar, O V; Sakhno, N G
2015-12-30
The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in these methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis of such data was ANOVA, which enabled not only the effect of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
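A minimal sketch of the kind of factorial ANOVA described above, assuming the counts sit in a small data frame with factors like those named in the abstract; the numbers and the use of statsmodels are illustrative assumptions, not the authors' data set:

```python
# Sketch: two-way ANOVA with interaction for sources of variability in plate-count
# results (type of microorganism x pharmaceutical product).
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

counts = pd.DataFrame({
    "log_cfu":   [2.1, 2.0, 2.3, 2.2, 1.8, 1.9, 2.4, 2.5],
    "organism":  ["A", "A", "A", "A", "B", "B", "B", "B"],
    "product":   ["P1", "P1", "P2", "P2", "P1", "P1", "P2", "P2"],
})

model = ols("log_cfu ~ C(organism) * C(product)", data=counts).fit()
anova_table = sm.stats.anova_lm(model, typ=2)   # main effects and their interaction
print(anova_table)
```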
ERIC Educational Resources Information Center
Badar, Lawrence J.
This report, in the form of a teacher's guide, presents materials for a ninth grade introductory course on Introduction to Quantitative Science (IQS). It is intended to replace a traditional ninth grade general science with a process oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science…
Ceol, M; Forino, M; Gambaro, G; Sauer, U; Schleicher, E D; D'Angelo, A; Anglani, F
2001-01-01
Gene expression can be examined with different techniques including ribonuclease protection assay (RPA), in situ hybridisation (ISH), and quantitative reverse transcription-polymerase chain reaction (RT/PCR). These methods differ considerably in their sensitivity and precision in detecting and quantifying low abundance mRNA. Although there is evidence that RT/PCR can be performed in a quantitative manner, the quantitative capacity of this method is generally underestimated. To demonstrate that the comparative kinetic RT/PCR strategy (which uses a housekeeping gene as internal standard) is a quantitative method to detect significant differences in mRNA levels between different samples, the inhibitory effect of heparin on phorbol 12-myristate 13-acetate (PMA)-induced TGF-beta1 mRNA expression was evaluated by RT/PCR and RPA, the standard method of mRNA quantification, and the results were compared. The reproducibility of RT/PCR amplification was calculated by comparing the quantity of G3PDH and TGF-beta1 PCR products, generated during the exponential phases, estimated from two different RT/PCR runs (G3PDH, r = 0.968, P = 0.0000; TGF-beta1, r = 0.966, P = 0.0000). The quantitative capacity of comparative kinetic RT/PCR was demonstrated by comparing the results obtained from RPA and RT/PCR using linear regression analysis. Starting from the same RNA extraction, but using only 1% of the RNA for the RT/PCR compared to RPA, significant correlation was observed (r = 0.984, P = 0.0004). Moreover the morphometric analysis of the ISH signal was applied for the semi-quantitative evaluation of the expression and localisation of TGF-beta1 mRNA in the entire cell population. Our results demonstrate the close similarity of the RT/PCR and RPA methods in giving quantitative information on mRNA expression and indicate the possibility of adopting comparative kinetic RT/PCR as a reliable quantitative method of mRNA analysis. Copyright 2001 Wiley-Liss, Inc.
Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper
2018-03-01
Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidates' research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear to be ideal for pharmaceutical quantitative MSI analysis. However, this is more challenging, as it involves almost no sample preparation and is more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization of the signal extinction coefficient (SEC) using the target compound itself. The method utilized a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI analysis results were cross-validated by LC-MS/MS analysis data of the same tissues. The consistency suggests that the approach is able to quickly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji
2018-04-01
We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.
Quantitative Hydrocarbon Surface Analysis
NASA Technical Reports Server (NTRS)
Douglas, Vonnie M.
2000-01-01
The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.
The analysis of morphometric data on rocky mountain wolves and arctic wolves using statistical methods
NASA Astrophysics Data System (ADS)
Ammar Shafi, Muhammad; Saifullah Rusiman, Mohd; Hamzah, Nor Shamsidah Amir; Nor, Maria Elena; Ahmad, Noor’ani; Azia Hazida Mohamad Azmi, Nur; Latip, Muhammad Faez Ab; Hilmi Azman, Ahmad
2018-04-01
Morphometrics is quantitative analysis based on the shape and size of specimens. Morphometric quantitative analyses are commonly used to analyse the fossil record, the shape and size of specimens, and similar data. The aim of the study is to find the differences between rocky mountain wolves and arctic wolves based on gender. The sample utilised secondary data which included seven independent variables and two dependent variables. Statistical modelling was used in the analysis, such as the analysis of variance (ANOVA) and multivariate analysis of variance (MANOVA). The results showed significant differences between arctic wolves and rocky mountain wolves based on the independent factors and gender.
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang
2018-01-01
Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We presented a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (by proton exchange), in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as the suitable internal standard. Experimental conditions, including relaxation delay time, the number of scans, and pulse angle, were optimized first. Then method validation was carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high performance liquid chromatography, as provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and nonparametric test at the 95% confidence level indicate that there was no significant difference between these two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has the potential for the quantitative analysis of other peptide drugs and for the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
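For orientation, the sketch below implements the standard internal-standard qNMR assay formula that this kind of study relies on; the function name and all numeric inputs are hypothetical, and the molar masses are only approximate placeholders rather than values from the study:

```python
# Sketch: the conventional internal-standard qNMR assay calculation.
def qnmr_assay(I_analyte, I_std, N_analyte, N_std,
               M_analyte, M_std, m_analyte, m_std, purity_std):
    """Return the assay (mass fraction) of the analyte.

    I: integrated signal areas, N: number of protons behind each signal,
    M: molar masses (g/mol), m: weighed masses (mg), purity_std: internal-standard purity.
    """
    return (I_analyte / I_std) * (N_std / N_analyte) \
         * (M_analyte / M_std) * (m_std / m_analyte) * purity_std

# Illustrative call with hypothetical values:
print("assay: %.3f" % qnmr_assay(I_analyte=1.00, I_std=0.95, N_analyte=2, N_std=1,
                                 M_analyte=1019.2, M_std=299.7,
                                 m_analyte=10.1, m_std=5.2, purity_std=0.998))
```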
ERIC Educational Resources Information Center
Kottich, Sarah
2017-01-01
This study analyzed tuition reductions in the private not-for-profit sector of higher education, utilizing a quantitative descriptive and correlational approach with secondary data analysis. It resulted in a listing of 45 institutions with verified tuition reductions from 2007 to 2017, more than previously thought. It found that the…
Guo, L B; Hao, Z Q; Shen, M; Xiong, W; He, X N; Xie, Z Q; Gao, M; Li, X Y; Zeng, X Y; Lu, Y F
2013-07-29
To improve the accuracy of quantitative analysis in laser-induced breakdown spectroscopy, the plasma produced by a Nd:YAG laser from steel targets was confined by a cavity. A number of elements with low concentrations, such as vanadium (V), chromium (Cr), and manganese (Mn), in the steel samples were investigated. After the optimization of the cavity dimension and laser fluence, significant enhancement factors of 4.2, 3.1, and 2.87 in the emission intensity of V, Cr, and Mn lines, respectively, were achieved at a laser fluence of 42.9 J/cm(2) using a hemispherical cavity (diameter: 5 mm). More importantly, the correlation coefficient of the V I 440.85/Fe I 438.35 nm was increased from 0.946 (without the cavity) to 0.981 (with the cavity); and similar results for Cr I 425.43/Fe I 425.08 nm and Mn I 476.64/Fe I 492.05 nm were also obtained. Therefore, it was demonstrated that the accuracy of quantitative analysis with low concentration elements in steel samples was improved, because the plasma became uniform with spatial confinement. The results of this study provide a new pathway for improving the accuracy of quantitative analysis of LIBS.
SSBD: a database of quantitative data of spatiotemporal dynamics of biological phenomena
Tohsato, Yukako; Ho, Kenneth H. L.; Kyoda, Koji; Onami, Shuichi
2016-01-01
Motivation: Rapid advances in live-cell imaging analysis and mathematical modeling have produced a large amount of quantitative data on spatiotemporal dynamics of biological objects ranging from molecules to organisms. There is now a crucial need to bring these large amounts of quantitative biological dynamics data together centrally in a coherent and systematic manner. This will facilitate the reuse of this data for further analysis. Results: We have developed the Systems Science of Biological Dynamics database (SSBD) to store and share quantitative biological dynamics data. SSBD currently provides 311 sets of quantitative data for single molecules, nuclei and whole organisms in a wide variety of model organisms from Escherichia coli to Mus musculus. The data are provided in Biological Dynamics Markup Language format and also through a REST API. In addition, SSBD provides 188 sets of time-lapse microscopy images from which the quantitative data were obtained and software tools for data visualization and analysis. Availability and Implementation: SSBD is accessible at http://ssbd.qbic.riken.jp. Contact: sonami@riken.jp PMID:27412095
Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli
2015-01-01
In the development of new drug products and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator for the quantitative estimation of the efficiency and biopharmaceutical characteristics of drug substances. This compels scientists in the field to develop more powerful analytical methods that give more reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were applied for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly encourage their use for the quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs.
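A compact sketch of a PLS calibration for two co-formulated drugs, in the spirit of the ZID/LAM application above; the synthetic spectra, concentration ranges, and use of scikit-learn are assumptions for illustration, not the authors' chemometric setup:

```python
# Sketch: PLS calibration for simultaneous quantitation of two drugs from overlapping spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_mixtures, n_wavelengths = 20, 50
conc = rng.uniform(5, 50, size=(n_mixtures, 2))             # [drug 1, drug 2] in mg/L
pure = rng.random((2, n_wavelengths))                        # pure-component "spectra"
spectra = conc @ pure + rng.normal(0, 0.02, (n_mixtures, n_wavelengths))  # additive mixture

pls = PLSRegression(n_components=2).fit(spectra, conc)
predicted = pls.predict(spectra[:3])
print(np.round(predicted, 1), np.round(conc[:3], 1))
```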
Quantitative trait nucleotide analysis using Bayesian model selection.
Blangero, John; Goring, Harald H H; Kent, Jack W; Williams, Jeff T; Peterson, Charles P; Almasy, Laura; Dyer, Thomas D
2005-10-01
Although much attention has been given to statistical genetic methods for the initial localization and fine mapping of quantitative trait loci (QTLs), little methodological work has been done to date on the problem of statistically identifying the most likely functional polymorphisms using sequence data. In this paper we provide a general statistical genetic framework, called Bayesian quantitative trait nucleotide (BQTN) analysis, for assessing the likely functional status of genetic variants. The approach requires the initial enumeration of all genetic variants in a set of resequenced individuals. These polymorphisms are then typed in a large number of individuals (potentially in families), and marker variation is related to quantitative phenotypic variation using Bayesian model selection and averaging. For each sequence variant a posterior probability of effect is obtained and can be used to prioritize additional molecular functional experiments. An example of this quantitative nucleotide analysis is provided using the GAW12 simulated data. The results show that the BQTN method may be useful for choosing the most likely functional variants within a gene (or set of genes). We also include instructions on how to use our computer program, SOLAR, for association analysis and BQTN analysis.
Classification of cassava genotypes based on qualitative and quantitative data.
Oliveira, E J; Oliveira Filho, O S; Santos, V S
2015-02-02
We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative traits (continuous). We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura; we evaluated these accessions for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five and four groups based on quantitative trait and joint analysis, respectively. The smaller number of groups identified based on joint analysis may be related to the nature of the data. On the other hand, quantitative data are more subject to environmental effects in the phenotype expression; this results in the absence of genetic differences, thereby contributing to greater differentiation among accessions. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data implied that analysis of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when there are several phenotypic traits, such as in the case of genetic resources and breeding programs.
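The Ward-MLM procedure itself combines qualitative and quantitative traits in two stages; the simplified sketch below covers only a Ward clustering of standardized quantitative traits, with invented trait values, and is not a substitute for the full method:

```python
# Sketch: Ward hierarchical clustering on standardized continuous traits (SciPy assumed).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

traits = np.array([                 # hypothetical accessions x continuous traits
    [1.2, 35.0, 0.8],
    [1.4, 38.2, 0.9],
    [2.8, 61.5, 1.7],
    [2.6, 58.9, 1.6],
    [1.9, 47.3, 1.2],
])
Z = linkage(zscore(traits, axis=0), method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")   # cut the tree into two groups
print(groups)
```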
Biglands, John D; Ibraheem, Montasir; Magee, Derek R; Radjenovic, Aleksandra; Plein, Sven; Greenwood, John P
2018-05-01
This study sought to compare the diagnostic accuracy of visual and quantitative analyses of myocardial perfusion cardiovascular magnetic resonance against a reference standard of quantitative coronary angiography. Visual analysis of perfusion cardiovascular magnetic resonance studies for assessing myocardial perfusion has been shown to have high diagnostic accuracy for coronary artery disease. However, only a few small studies have assessed the diagnostic accuracy of quantitative myocardial perfusion. This retrospective study included 128 patients randomly selected from the CE-MARC (Clinical Evaluation of Magnetic Resonance Imaging in Coronary Heart Disease) study population such that the distribution of risk factors and disease status was proportionate to the full population. Visual analysis results of cardiovascular magnetic resonance perfusion images, by consensus of 2 expert readers, were taken from the original study reports. Quantitative myocardial blood flow estimates were obtained using Fermi-constrained deconvolution. The reference standard for myocardial ischemia was a quantitative coronary x-ray angiogram stenosis severity of ≥70% diameter in any coronary artery of >2 mm diameter, or ≥50% in the left main stem. Diagnostic performance was calculated using receiver-operating characteristic curve analysis. The area under the curve for visual analysis was 0.88 (95% confidence interval: 0.81 to 0.95) with a sensitivity of 81.0% (95% confidence interval: 69.1% to 92.8%) and specificity of 86.0% (95% confidence interval: 78.7% to 93.4%). For quantitative stress myocardial blood flow the area under the curve was 0.89 (95% confidence interval: 0.83 to 0.96) with a sensitivity of 87.5% (95% confidence interval: 77.3% to 97.7%) and specificity of 84.5% (95% confidence interval: 76.8% to 92.3%). There was no statistically significant difference between the diagnostic performance of quantitative and visual analyses (p = 0.72). Incorporating rest myocardial blood flow values to generate a myocardial perfusion reserve did not significantly increase the quantitative analysis area under the curve (p = 0.79). Quantitative perfusion has a high diagnostic accuracy for detecting coronary artery disease but is not superior to visual analysis. The incorporation of rest perfusion imaging does not improve diagnostic accuracy in quantitative perfusion analysis. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
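A minimal sketch of the receiver-operating characteristic step described above, assuming per-patient stress myocardial blood flow values and angiographic labels are available as arrays; the values and the use of scikit-learn are illustrative assumptions:

```python
# Sketch: ROC analysis of a quantitative perfusion measure against an angiographic reference.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

disease = np.array([1, 1, 1, 1, 0, 0, 0, 0, 1, 0])          # 1 = significant stenosis
stress_mbf = np.array([1.1, 1.4, 0.9, 1.6, 2.8, 3.1, 2.5, 2.9, 1.3, 2.2])  # mL/g/min

# Lower flow indicates disease, so score by the negative of stress MBF.
auc = roc_auc_score(disease, -stress_mbf)
fpr, tpr, thresholds = roc_curve(disease, -stress_mbf)
best = np.argmax(tpr - fpr)                                  # Youden index
print("AUC: %.2f, best cutoff: %.2f mL/g/min" % (auc, -thresholds[best]))
```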
Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.
Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse
2018-05-01
Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of power spectral Capon estimator in quantitative Doppler analysis using data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations using in vivo and string phantom acquisitions. In vivo results demonstrated that the Capon estimator can provide spectral estimates with sufficient quality for quantitative analysis using packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated using spectrograms estimated on a commercial ultrasound scanner.
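A bare-bones sketch of a Capon (minimum-variance) spectral estimate applied to a short slow-time ensemble, the estimator discussed above; the subvector length, diagonal loading, and synthetic Doppler tone are illustrative choices rather than the authors' processing chain:

```python
# Sketch: Capon power spectrum of a 1-D complex slow-time Doppler signal.
import numpy as np

def capon_spectrum(x, n_sub, freqs):
    """Capon spectrum of x evaluated at normalized frequencies (cycles/sample)."""
    # Covariance matrix from overlapping subvectors (forward averaging).
    segs = np.array([x[i:i + n_sub] for i in range(len(x) - n_sub + 1)])
    R = (segs.conj().T @ segs) / segs.shape[0]
    R += 1e-3 * np.trace(R).real / n_sub * np.eye(n_sub)     # diagonal loading
    R_inv = np.linalg.inv(R)
    n = np.arange(n_sub)
    power = []
    for f in freqs:
        a = np.exp(2j * np.pi * f * n)                        # steering vector
        power.append(1.0 / np.real(a.conj() @ R_inv @ a))
    return np.array(power)

# Synthetic packet: a single Doppler tone at 0.2 cycles/sample plus noise.
rng = np.random.default_rng(1)
n = np.arange(16)
x = np.exp(2j * np.pi * 0.2 * n) + 0.1 * (rng.standard_normal(16) + 1j * rng.standard_normal(16))
freqs = np.linspace(-0.5, 0.5, 101)
spec = capon_spectrum(x, n_sub=8, freqs=freqs)
print("peak at f = %.2f" % freqs[np.argmax(spec)])
```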
Rigbolt, Kristoffer T G; Vanselow, Jens T; Blagoev, Blagoy
2011-08-01
Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiment. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)(1). The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net.
Rigbolt, Kristoffer T. G.; Vanselow, Jens T.; Blagoev, Blagoy
2011-01-01
Recent technological advances have made it possible to identify and quantify thousands of proteins in a single proteomics experiment. As a result of these developments, the analysis of data has become the bottleneck of proteomics experiment. To provide the proteomics community with a user-friendly platform for comprehensive analysis, inspection and visualization of quantitative proteomics data we developed the Graphical Proteomics Data Explorer (GProX)1. The program requires no special bioinformatics training, as all functions of GProX are accessible within its graphical user-friendly interface which will be intuitive to most users. Basic features facilitate the uncomplicated management and organization of large data sets and complex experimental setups as well as the inspection and graphical plotting of quantitative data. These are complemented by readily available high-level analysis options such as database querying, clustering based on abundance ratios, feature enrichment tests for e.g. GO terms and pathway analysis tools. A number of plotting options for visualization of quantitative proteomics data is available and most analysis functions in GProX create customizable high quality graphical displays in both vector and bitmap formats. The generic import requirements allow data originating from essentially all mass spectrometry platforms, quantitation strategies and software to be analyzed in the program. GProX represents a powerful approach to proteomics data analysis providing proteomics experimenters with a toolbox for bioinformatics analysis of quantitative proteomics data. The program is released as open-source and can be freely downloaded from the project webpage at http://gprox.sourceforge.net. PMID:21602510
Rapid Quantitative Determination of Squalene in Shark Liver Oils by Raman and IR Spectroscopy.
Hall, David W; Marshall, Susan N; Gordon, Keith C; Killeen, Daniel P
2016-01-01
Squalene is sourced predominantly from shark liver oils and to a lesser extent from plants such as olives. It is used for the production of surfactants, dyes, sunscreen, and cosmetics. The economic value of shark liver oil is directly related to the squalene content, which in turn is highly variable and species-dependent. Presented here is a validated gas chromatography-mass spectrometry analysis method for the quantitation of squalene in shark liver oils, with an accuracy of 99.0%, precision of 0.23% (standard deviation), and linearity of >0.999. The method has been used to measure the squalene concentration of 16 commercial shark liver oils. These reference squalene concentrations were related to infrared (IR) and Raman spectra of the same oils using partial least squares regression. The resultant models were suitable for the rapid quantitation of squalene in shark liver oils, with cross-validation r2 values of >0.98 and root mean square errors of validation of ≤4.3% w/w. Independent test set validation of these models found mean absolute deviations of 4.9 and 1.0% w/w for the IR and Raman models, respectively. Both techniques were more accurate than results obtained by an industrial refractive index analysis method, which is used for rapid, cheap quantitation of squalene in shark liver oils. In particular, the Raman partial least squares regression was well suited to quantitative squalene analysis. The intense and highly characteristic Raman bands of squalene made quantitative analysis possible irrespective of the lipid matrix.
77 FR 41109 - Margin Requirements for Uncleared Swaps for Swap Dealers and Major Swap Participants
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-12
... and specifically requests quantitative data and analysis on the comparative costs and benefits of the... quantitative impact study ("QIS") to assess the costs and benefits of margin requirements. The results of the...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Xiaoyu; Hao, Zhenqi; Wu, Di
Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS 2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.
Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; ...
2018-04-01
Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS 2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.
Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros
2018-02-01
Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm 3 , 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
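As an illustration of the per-segment regression described above, the sketch below fits a logistic model of qualitative high-risk status on two quantitative plaque measures and converts the coefficients to odds ratios; the ten-segment data set and the use of statsmodels are assumptions, not the ROMICAT II data:

```python
# Sketch: logistic regression of high-risk plaque status on quantitative plaque measures.
import numpy as np
import pandas as pd
import statsmodels.api as sm

segments = pd.DataFrame({
    "high_risk":        [1, 0, 1, 0, 1, 1, 0, 0, 1, 0],
    "low_hu_volume":    [14.0, 6.0, 20.0, 1.0, 3.0, 16.0, 9.0, 0.0, 12.0, 4.0],   # mm^3
    "remodeling_index": [1.25, 1.10, 1.30, 1.02, 0.98, 1.22, 1.15, 0.95, 1.27, 1.00],
})

X = sm.add_constant(segments[["low_hu_volume", "remodeling_index"]])
fit = sm.Logit(segments["high_risk"], X).fit(disp=0)
odds_ratios = np.exp(fit.params)          # per 1 mm^3 and per 1.0 remodeling-index unit
print(odds_ratios)
```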
Enríquez-Navas, Pedro M; Guzzi, Cinzia; Muñoz-García, Juan C; Nieto, Pedro M; Angulo, Jesús
2015-01-01
Glycan-receptor interactions are of fundamental relevance for a large number of biological processes, and their kinetic properties (medium/weak binding affinities) make them appropriate for study by ligand-observed NMR techniques, among which saturation transfer difference (STD) NMR spectroscopy has been shown to be a very robust and powerful approach. The quantitative analysis of the results from an STD NMR study of a glycan-receptor interaction is essential to be able to translate the resulting spectral intensities into a 3D molecular model of the complex. This chapter describes how to carry out such a quantitative analysis by means of the Complete Relaxation and Conformational Exchange Matrix Approach for STD NMR (CORCEMA-ST), in general terms, and an example from a previous work on an antibody-glycan interaction is also shown.
Rota, Cristina; Biondi, Marco; Trenti, Tommaso
2011-09-26
Aution Max AX-4030, a test strip analyzer recently introduced to the market, represents an upgrade of the Aution Max AX-4280 widely employed for urinalysis. This new instrument model can allocate two different test strips at the same time. In the present study the two instruments were compared using Uriflet 9UB and the recently produced Aution Sticks 10PA urine strips, the latter presenting an additional test area for the measurement of urinary creatinine. Imprecision and correlation between instruments and strips were evaluated for chemical-physical parameters. Accuracy was evaluated for protein, glucose and creatinine by comparing the semi-quantitative results to those obtained by quantitative methods. The well-known interference effect of high ascorbic acid levels on urine glucose test strip determination was evaluated; the influence of ascorbic acid on protein and creatinine determination was also evaluated. The two instruments demonstrated comparable performance: precision and correlation between instruments and strips, evaluated for chemical-physical parameters, were always good. Furthermore, accuracy was always very good: the semi-quantitative measurements of protein and glucose proved to be highly correlated with those obtained by quantitative methods. Moreover, the semi-quantitative measurements of creatinine, employing Aution Sticks 10PA urine strips, were highly comparable with quantitative results. The 10PA urine strips are suitable for urine creatinine determination, with the possibility of correcting urinalysis results for urinary creatinine concentration whenever necessary and of calculating the protein/creatinine ratio. Further studies should be carried out to evaluate the effectiveness and appropriateness of the use of semi-quantitative creatinine analysis.
Graberski Matasović, M; Matasović, T; Markovac, Z
1997-06-01
The frequency of femoral quadriceps muscle hypotrophy has become a significant therapeutic problem. Efforts are being made to improve the standard scheme of kinesitherapeutic treatment by using additional, more effective therapeutic methods. Besides kinesitherapy, the authors used magnetotherapy in 30 of the 60 patients. A total of 60 patients of both sexes, of similar age and intensity of hypotrophy, were included in the study. They were divided into groups A and B, the experimental and the control group (30 patients each). The treatment was scheduled for the usual 5-6 weeks. Electromyographic quantitative analysis was used to check the treatment results achieved after 5 and 6 weeks of treatment. Analysis of the results confirmed the assumption that magnetotherapy may yield better and faster treatment results, disappearance of pain and a decreased risk of complications: the same results were obtained in the experimental group, only one week earlier than in the control group. The quantitative EMG analysis did not prove to be a sufficiently reliable and objective method for assessing the real condition of the muscle and the effects of treatment.
Low-dose CT for quantitative analysis in acute respiratory distress syndrome
2013-01-01
Introduction The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods In 14 sheep chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and possibly monitor time-course of ARDS with a lower risk of exposure to ionizing radiation. A further radiation dose reduction is associated with lower accuracy in quantitative results. PMID:24004842
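A minimal sketch of the Bland-Altman agreement analysis used above, assuming paired quantitative results from two mAs settings are available as arrays; the tissue percentages are invented for illustration, not the sheep data:

```python
# Sketch: Bland-Altman comparison of a quantitative CT measure at two tube current-time settings.
import numpy as np

tissue_60mas = np.array([12.1, 25.3, 40.2, 8.7, 33.5, 19.8, 27.4, 15.2])  # % nonaerated tissue
tissue_15mas = np.array([12.8, 24.6, 41.0, 9.1, 32.7, 20.5, 26.8, 15.9])

diff = tissue_15mas - tissue_60mas
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                 # limits of agreement around the bias
print("bias: %.2f%%, limits of agreement: %.2f%% to %.2f%%" % (bias, bias - loa, bias + loa))
```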
Computerized image analysis for quantitative neuronal phenotyping in zebrafish.
Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C
2006-06-15
An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.
Using a normalization 3D model for automatic clinical brain quantitative analysis and evaluation
NASA Astrophysics Data System (ADS)
Lin, Hong-Dun; Yao, Wei-Jen; Hwang, Wen-Ju; Chung, Being-Tau; Lin, Kang-Ping
2003-05-01
Functional medical imaging, such as PET or SPECT, is capable of revealing physiological functions of the brain, and has been broadly used in diagnosing brain disorders by clinically quantitative analysis for many years. In routine procedures, physicians manually select desired ROIs from structural MR images and then obtain physiological information from corresponding functional PET or SPECT images. The accuracy of quantitative analysis thus relies on that of the subjectively selected ROIs. Therefore, standardizing the analysis procedure is fundamental and important in improving the analysis outcome. In this paper, we propose and evaluate a normalization procedure with a standard 3D-brain model to achieve precise quantitative analysis. In the normalization process, the mutual information registration technique was applied for realigning functional medical images to standard structural medical images. Then, the standard 3D-brain model that shows well-defined brain regions was used, replacing the manual ROIs in the objective clinical analysis. To validate the performance, twenty cases of I-123 IBZM SPECT images were used in practical clinical evaluation. The results show that the quantitative analysis outcomes obtained from this automated method are in agreement with the clinical diagnosis evaluation score, with less than 3% error on average. To sum up, the method obtains precise VOI information automatically from the well-defined standard 3D brain model, sparing the manual drawing of ROIs slice by slice on structural medical images that the traditional procedure requires. That is, the method not only provides precise analysis results, but also improves the processing rate for large numbers of medical images in clinical use.
Bayes' theorem and quantitative risk assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaplan, S.
1994-12-31
This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, it argues that one should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
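For reference, the update rule the author appeals to is Bayes' theorem; writing H_i for the candidate hypotheses (e.g., alternative frequency or probability models) and E for the body of evidence, the posterior follows as:

```latex
P(H_i \mid E) \;=\; \frac{P(E \mid H_i)\,P(H_i)}{\sum_{j} P(E \mid H_j)\,P(H_j)}
```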
Quantitative analysis of regional myocardial performance in coronary artery disease
NASA Technical Reports Server (NTRS)
Stewart, D. K.; Dodge, H. T.; Frimer, M.
1975-01-01
Findings are presented from a group of subjects with significant coronary artery stenosis and from a group of controls, determined by use of a quantitative method for the study of regional myocardial performance based on frame-by-frame analysis of biplane left ventricular angiograms. Particular emphasis was placed upon the analysis of wall motion in terms of normalized segment dimensions and the timing and velocity of contraction. The results were compared with the method of subjective assessment used clinically.
NASA Astrophysics Data System (ADS)
Han, Young-Soo; Mao, Xiaodong; Jang, Jinsung; Kim, Tae-Kyu
2015-04-01
The ferritic ODS steel was manufactured by hot isostatic pressing and heat treatment. The nano-sized microstructures such as yttrium oxides and Cr oxides were quantitatively analyzed by small-angle neutron scattering (SANS). The effects of the fabrication conditions on the nano-sized microstructure were investigated in relation to the quantitative analysis results obtained by SANS. The ratio between magnetic and nuclear scattering components was calculated, and the characteristics of the nano-sized yttrium oxides are discussed based on the SANS analysis results.
Hou, Zhifei; Sun, Guoxiang; Guo, Yong
2016-01-01
The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to be able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by direct detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard.
Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides
Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...
Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura
2018-06-01
There are several techniques used to analyze microplastics. These are often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers with environmental concern: low and high density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) dependence of particle size on DSC signal, and (iv) quantitation of microplastics mass based on DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size demonstrated to have particular incidence in the qualitative and quantitative performance of DSC signals. Both, identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) showed to be affected by particle size. As a result, a proper sample treatment which includes sieving of suspended particles is particularly required for this analytical approach.
Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A
2016-07-01
Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
NASA Technical Reports Server (NTRS)
1986-01-01
Digital Imaging is the computer processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications are CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology, and is accurate and cost efficient.
A quantitative analysis of coupled oscillations using mobile accelerometer sensors
NASA Astrophysics Data System (ADS)
Castro-Palacio, Juan Carlos; Velázquez-Abad, Luisberis; Giménez, Fernando; Monsoriu, Juan A.
2013-05-01
In this paper, smartphone acceleration sensors were used to perform a quantitative analysis of mechanical coupled oscillations. Symmetric and asymmetric normal modes were studied separately in the first two experiments. In the third, a coupled oscillation was studied as a combination of the normal modes. Results indicate that acceleration sensors of smartphones, which are very familiar to students, represent valuable measurement instruments for introductory and first-year physics courses.
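For context, in the standard two-mass, three-spring configuration usually used for such experiments (an assumption here, since the abstract does not specify the setup), the symmetric and antisymmetric normal-mode frequencies and the resulting coupled motion are:

```latex
\omega_s = \sqrt{\frac{k}{m}}, \qquad
\omega_a = \sqrt{\frac{k + 2k_c}{m}}, \qquad
x_{1,2}(t) = A\cos(\omega_s t + \phi_s) \pm B\cos(\omega_a t + \phi_a),
```

so the acceleration recorded by each phone is a superposition of the two normal modes, which is what the third experiment resolves.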
Quantitative analysis of voids in percolating structures in two-dimensional N-body simulations
NASA Technical Reports Server (NTRS)
Harrington, Patrick M.; Melott, Adrian L.; Shandarin, Sergei F.
1993-01-01
We present in this paper a quantitative method for defining void size in large-scale structure based on percolation threshold density. Beginning with two-dimensional gravitational clustering simulations smoothed to the threshold of nonlinearity, we perform percolation analysis to determine the large scale structure. The resulting objective definition of voids has a natural scaling property, is topologically interesting, and can be applied immediately to redshift surveys.
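A toy sketch of the void-identification idea, labeling connected underdense regions at a chosen threshold on a smoothed two-dimensional field; the random field, the 30th-percentile cut, and the use of SciPy stand in for the simulation data and percolation-threshold criterion of the paper:

```python
# Sketch: identifying voids as connected underdense regions below a density threshold.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
density = ndimage.gaussian_filter(rng.random((128, 128)), sigma=4)   # smoothed mock field

threshold = np.percentile(density, 30)        # density level playing the role of the
underdense = density < threshold              # percolation threshold for voids

labels, n_voids = ndimage.label(underdense)   # connected underdense regions = voids
sizes = ndimage.sum(underdense, labels, index=np.arange(1, n_voids + 1))
print("voids found: %d, largest void: %d pixels" % (n_voids, int(sizes.max())))
```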
Goeminne, Ludger J E; Gevaert, Kris; Clement, Lieven
2018-01-16
Label-free shotgun proteomics is routinely used to assess proteomes. However, extracting relevant information from the massive amounts of generated data remains difficult. This tutorial provides a strong foundation on analysis of quantitative proteomics data. We provide key statistical concepts that help researchers to design proteomics experiments and we showcase how to analyze quantitative proteomics data using our recent free and open-source R package MSqRob, which was developed to implement the peptide-level robust ridge regression method for relative protein quantification described by Goeminne et al. MSqRob can handle virtually any experimental proteomics design and outputs proteins ordered by statistical significance. Moreover, its graphical user interface and interactive diagnostic plots provide easy inspection and also detection of anomalies in the data and flaws in the data analysis, allowing deeper assessment of the validity of results and a critical review of the experimental design. Our tutorial discusses interactive preprocessing, data analysis and visualization of label-free MS-based quantitative proteomics experiments with simple and more complex designs. We provide well-documented scripts to run analyses in bash mode on GitHub, enabling the integration of MSqRob in automated pipelines on cluster environments (https://github.com/statOmics/MSqRob). The concepts outlined in this tutorial aid in designing better experiments and analyzing the resulting data more appropriately. The two case studies using the MSqRob graphical user interface will contribute to a wider adaptation of advanced peptide-based models, resulting in higher quality data analysis workflows and more reproducible results in the proteomics community. We also provide well-documented scripts for experienced users that aim at automating MSqRob on cluster environments. Copyright © 2017 Elsevier B.V. All rights reserved.
Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana
2014-09-01
Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant to allow identification and to quantify the main constituents within the plant. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the C. officinalis extract composition from different parts of the plant. The results found HPTLC to be effective for qualitative analysis; however, HPLC was found to be more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting, as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. Copyright © 2014 Elsevier B.V. All rights reserved.
[Development and application of morphological analysis method in Aspergillus niger fermentation].
Tang, Wenjun; Xia, Jianye; Chu, Ju; Zhuang, Yingping; Zhang, Siliang
2015-02-01
Filamentous fungi are widely used in industrial fermentation, and the particular fungal morphology is a critical index for a successful fermentation. To break the bottleneck of morphological analysis, we have developed a reliable method for fungal morphological analysis. With this method, we can prepare hundreds of pellet samples simultaneously and quickly obtain quantitative morphological information at large scale, which greatly increases the accuracy and reliability of the morphological analysis results. On this basis, studies of Aspergillus niger morphology under different oxygen supply and shear rate conditions were carried out. The responses of A. niger morphology to these conditions were quantitatively demonstrated, laying a solid foundation for further scale-up.
Temporal Lobe Epilepsy: Quantitative MR Volumetry in Detection of Hippocampal Atrophy
Farid, Nikdokht; Girard, Holly M.; Kemmotsu, Nobuko; Smith, Michael E.; Magda, Sebastian W.; Lim, Wei Y.; Lee, Roland R.
2012-01-01
Purpose: To determine the ability of fully automated volumetric magnetic resonance (MR) imaging to depict hippocampal atrophy (HA) and to help correctly lateralize the seizure focus in patients with temporal lobe epilepsy (TLE). Materials and Methods: This study was conducted with institutional review board approval and in compliance with HIPAA regulations. Volumetric MR imaging data were analyzed for 34 patients with TLE and 116 control subjects. Structural volumes were calculated by using U.S. Food and Drug Administration–cleared software for automated quantitative MR imaging analysis (NeuroQuant). Results of quantitative MR imaging were compared with visual detection of atrophy, and, when available, with histologic specimens. Receiver operating characteristic analyses were performed to determine the optimal sensitivity and specificity of quantitative MR imaging for detecting HA and asymmetry. A linear classifier with cross validation was used to estimate the ability of quantitative MR imaging to help lateralize the seizure focus. Results: Quantitative MR imaging–derived hippocampal asymmetries discriminated patients with TLE from control subjects with high sensitivity (86.7%–89.5%) and specificity (92.2%–94.1%). When a linear classifier was used to discriminate left versus right TLE, hippocampal asymmetry achieved 94% classification accuracy. Volumetric asymmetries of other subcortical structures did not improve classification. Compared with invasive video electroencephalographic recordings, lateralization accuracy was 88% with quantitative MR imaging and 85% with visual inspection of volumetric MR imaging studies but only 76% with visual inspection of clinical MR imaging studies. Conclusion: Quantitative MR imaging can depict the presence and laterality of HA in TLE with accuracy rates that may exceed those achieved with visual inspection of clinical MR imaging studies. Thus, quantitative MR imaging may enhance standard visual analysis, providing a useful and viable means for translating volumetric analysis into clinical practice. © RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12112638/-/DC1 PMID:22723496
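To make the asymmetry-based ROC step concrete, here is a small sketch with synthetic left/right hippocampal volumes (all numbers are invented, not the study data); it computes an asymmetry index and its ROC performance with scikit-learn:

```python
# Minimal sketch: hippocampal asymmetry index and ROC analysis on toy data.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(1)
# hypothetical left/right hippocampal volumes (mm^3) for controls and TLE patients
left_ctrl, right_ctrl = rng.normal(4200, 200, 100), rng.normal(4150, 200, 100)
left_tle,  right_tle  = rng.normal(3400, 300, 30),  rng.normal(4100, 250, 30)

def asymmetry(l, r):
    return np.abs(l - r) / ((l + r) / 2)   # absolute asymmetry index

scores = np.concatenate([asymmetry(left_ctrl, right_ctrl), asymmetry(left_tle, right_tle)])
labels = np.concatenate([np.zeros(100), np.ones(30)])

auc = roc_auc_score(labels, scores)
fpr, tpr, _ = roc_curve(labels, scores)
best = np.argmax(tpr - fpr)               # Youden index as one possible operating point
print(f"AUC={auc:.2f}  sensitivity={tpr[best]:.2f}  specificity={1 - fpr[best]:.2f}")
```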
Automated detection of arterial input function in DSC perfusion MRI in a stroke rat model
NASA Astrophysics Data System (ADS)
Yeh, M.-Y.; Lee, T.-H.; Yang, S.-T.; Kuo, H.-H.; Chyi, T.-K.; Liu, H.-L.
2009-05-01
Quantitative cerebral blood flow (CBF) estimation requires deconvolution of the tissue concentration time curves with an arterial input function (AIF). However, image-based determination of the AIF in rodents is challenging due to limited spatial resolution. We evaluated the feasibility of quantitative analysis using automated AIF detection and compared the results with the commonly applied semi-quantitative analysis. Permanent occlusion of the bilateral or unilateral common carotid artery was used to induce cerebral ischemia in rats. Imaging with the dynamic susceptibility contrast method was performed on a 3-T magnetic resonance scanner with a spin-echo echo-planar-imaging sequence (TR/TE = 700/80 ms, FOV = 41 mm, matrix = 64, 3 slices, SW = 2 mm), starting 7 s prior to contrast injection (1.2 ml/kg), at four different time points. For quantitative analysis, CBF was calculated by deconvolution with the AIF obtained from the 10 voxels with the greatest contrast enhancement. For semi-quantitative analysis, relative CBF was estimated as the integral divided by the first moment of the relaxivity time curve. We observed that if the AIFs obtained in the three different ROIs (whole brain, hemisphere without lesion, and hemisphere with lesion) were similar, the CBF ratios (lesion/normal) from the quantitative and semi-quantitative analyses showed a similar trend at the different operative time points; if the AIFs differed, the CBF ratios could differ as well. We concluded that, using local maxima, one can define a proper AIF without knowing the anatomical location of arteries in a stroke rat model.
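The semi-quantitative step quoted above (relative CBF as the integral of the relaxivity-time curve divided by its first moment) reduces to a few lines; the curve below is a synthetic stand-in, not measured data:

```python
# Minimal sketch: semi-quantitative relative CBF = area / first moment of the relaxivity curve.
import numpy as np

t = np.linspace(0.0, 60.0, 600)                    # time, s
delta_r2 = (t / 8) ** 3 * np.exp(-(t - 8) / 4)     # toy relaxivity (delta-R2*) time curve
dt = t[1] - t[0]

area = delta_r2.sum() * dt                          # integral of the curve
first_moment = (t * delta_r2).sum() * dt / area     # mean transit-time-like quantity
relative_cbf = area / first_moment
print(f"relative CBF (arbitrary units) = {relative_cbf:.3f}")
```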
2017-01-01
Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods and metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarkers Alliance (QIBA), together with technical, radiological and statistical experts, developed a set of technical performance analysis methods, metrics and study designs that provide terminology, metrics and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831
A Dimensionally Reduced Clustering Methodology for Heterogeneous Occupational Medicine Data Mining.
Saâdaoui, Foued; Bertrand, Pierre R; Boudet, Gil; Rouffiac, Karine; Dutheil, Frédéric; Chamoux, Alain
2015-10-01
Clustering is a set of statistical learning techniques aimed at finding structure in heterogeneous data by partitioning it into homogeneous groups called clusters. Clustering has been successfully applied in several fields, such as medicine, biology, finance and economics. In this paper, we introduce the notion of clustering in multifactorial data analysis problems. A case study is conducted for an occupational medicine problem with the purpose of analyzing patterns in a population of 813 individuals. To reduce the dimensionality of the data set, we base our approach on Principal Component Analysis (PCA), the statistical tool most commonly used in factorial analysis. However, problems in practice, especially in medicine, often involve heterogeneous qualitative-quantitative measurements, whereas PCA only processes quantitative ones. Besides, qualitative data are originally unobservable quantitative responses that are usually binary-coded. Hence, we propose a new set of strategies allowing quantitative and qualitative data to be handled simultaneously. The principle of this approach is to project the qualitative variables onto the subspaces spanned by the quantitative ones. Subsequently, an optimal model is allocated to the resulting PCA-regressed subspaces.
Xie, Weilong; Yu, Kangfu; Pauls, K Peter; Navabi, Alireza
2012-04-01
The effectiveness of image analysis (IA) compared with an ordinal visual scale, for quantitative measurement of disease severity, its application in quantitative genetic studies, and its effect on the estimates of genetic parameters were investigated. Studies were performed using eight backcross-derived families of common bean (Phaseolus vulgaris) (n = 172) segregating for the molecular marker SU91, known to be associated with a quantitative trait locus (QTL) for resistance to common bacterial blight (CBB), caused by Xanthomonas campestris pv. phaseoli and X. fuscans subsp. fuscans. Even though both IA and visual assessments were highly repeatable, IA was more sensitive in detecting quantitative differences between bean genotypes. The CBB phenotypic difference between the two SU91 genotypic groups was consistently more than fivefold for IA assessments but generally only two- to threefold for visual assessments. Results suggest that the visual assessment results in overestimation of the effect of QTL in genetic studies. This may have been caused by lack of additivity and uneven intervals of the visual scale. Although visual assessment of disease severity is a useful tool for general selection in breeding programs, assessments using IA may be more suitable for phenotypic evaluations in quantitative genetic studies involving CBB resistance as well as other foliar diseases.
van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M
2017-11-27
Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. PubMed, Web of Science, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the Quality Assessment of Diagnostic Accuracy Studies tool (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance, with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76, and AUC of 0.90, 0.84, and 0.87, respectively. In the per-territory analysis, our results show similar diagnostic accuracy for anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory analysis sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176.
Kim, Sungjune; Hong, Seokpyo; Ahn, Kilsoo; Gong, Sungyong
2015-01-01
Objectives: This study presents the indicators and proxy variables for the quantitative assessment of green chemistry technologies and evaluates the relative importance of each assessment element by consulting experts from the fields of ecology, chemistry, safety, and public health. Methods: The results collected were subjected to an analytic hierarchy process to obtain the weights of the indicators and the proxy variables. Results: These weights may prove useful in avoiding having to resort to qualitative means in the absence of weights between indicators when integrating the results of quantitative assessment by indicator. Conclusions: This study points to the limitations of current quantitative assessment techniques for green chemistry technologies and seeks to present the future direction for quantitative assessment of green chemistry technologies. PMID:26206364
Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.
2010-01-01
Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. We were able to identify by mass spectrometry, using just two LC-MS/MS runs, more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment, as close to 80% of the identified peptides were tyrosine phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine phosphorylated peptides with large scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis as samples can be combined early on. Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167
Multispectral analysis of ocean dumped materials
NASA Technical Reports Server (NTRS)
Johnson, R. W.
1977-01-01
Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.
Quantitative transmission Raman spectroscopy of pharmaceutical tablets and capsules.
Johansson, Jonas; Sparén, Anders; Svensson, Olof; Folestad, Staffan; Claybourn, Mike
2007-11-01
Quantitative analysis of pharmaceutical formulations using the new approach of transmission Raman spectroscopy has been investigated. For comparison, measurements were also made in conventional backscatter mode. The experimental setup consisted of a Raman probe-based spectrometer with 785 nm excitation for measurements in backscatter mode. In transmission mode the same system was used to detect the Raman scattered light, while an external diode laser of the same type was used as excitation source. Quantitative partial least squares models were developed for both measurement modes. The results for tablets show that the prediction error for an independent test set was lower for the transmission measurements, with a relative root mean square error of about 2.2% as compared with 2.9% for the backscatter mode. Furthermore, the models were simpler in the transmission case, for which only a single partial least squares (PLS) component was required to explain the variation. The main reason for the improvement using the transmission mode is a more representative sampling of the tablets compared with the backscatter mode. Capsules containing mixtures of pharmaceutical powders were also assessed by transmission only. The quantitative results for the capsules' contents were good, with a prediction error of 3.6% w/w for an independent test set. The advantage of transmission Raman over backscatter Raman spectroscopy has been demonstrated for quantitative analysis of pharmaceutical formulations, and the prospects for reliable, lean calibrations for pharmaceutical analysis are discussed.
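For readers unfamiliar with the chemometrics, a minimal PLS sketch on synthetic spectra (not the paper's data; the band shape, concentrations and noise level are assumptions) shows how the quoted relative prediction error would be computed:

```python
# Minimal sketch: one-component PLS calibration and its relative RMSEP on toy spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(2)
n, p = 120, 400
conc = rng.uniform(80, 120, n)                         # API content, % of label claim
pure = np.exp(-((np.arange(p) - 180) / 12) ** 2)       # toy Raman band of the API
X = np.outer(conc, pure) + rng.normal(0, 0.5, (n, p))  # spectra = concentration * band + noise
X_tr, X_te, y_tr, y_te = train_test_split(X, conc, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=1).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
rmsep = np.sqrt(np.mean((pred - y_te) ** 2))
print(f"relative RMSEP = {100 * rmsep / y_te.mean():.1f}%")
```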
Retinal status analysis method based on feature extraction and quantitative grading in OCT images.
Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri
2016-07-22
Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired with an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was established. Subsequently, two kinds of quantitative methods, based on geometric features and on morphological features, were proposed. A retinal abnormality grading decision-making method was then put forward and applied in the analysis and evaluation of multiple OCT images. The analysis process is illustrated in detail using four retinal OCT images with different degrees of abnormality, and the final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on the 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. This paper focuses on an automatic retinal status analysis method based on feature extraction and quantitative grading in OCT images. The proposed method can obtain the parameters and features associated with retinal morphology; quantitative analysis and evaluation of these features, combined with the reference model, enables abnormality judgment of the target image and provides a reference for disease diagnosis.
A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi
Hodgkinson, A.
1971-01-01
A better understanding of the physico-chemical principles underlying the formation of calculus has led to a need for more precise information on the chemical composition of stones. A combined qualitative and quantitative procedure for the chemical analysis of urinary calculi which is suitable for routine use is presented. The procedure involves five simple qualitative tests followed by the quantitative determination of calcium, magnesium, inorganic phosphate, and oxalate. These data are used to calculate the composition of the stone in terms of calcium oxalate, apatite, and magnesium ammonium phosphate. Analytical results and derived values for five representative types of calculi are presented. PMID:5551382
Defining the Pathophysiological Role of Tau in Experimental TBI
2017-10-01
clinically a blood test for improving the diagnosis of TBI-induced chronic neurodegenerative disease in the long-term post-injury time period. The...we will complete the quantitative analysis of perforant pathway synapse integrity in all 63 long-term post-injury cases. Our results thus far support...substantiated by quantitative analysis of NeuN-positive neuronal density in lateral entorhinal cortex layer II at 4 months post-injury (Table 1). At
Hao, Yong; Sun, Xu-Dong; Yang, Qiang
2012-12-01
A variable selection strategy combined with local linear embedding (LLE) was introduced for the analysis of complex samples by near infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE combined with SPA, were used to eliminate redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used to model the complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of the models. Compared with PLSR models, LLE-PLSR models achieved more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
40 CFR 799.9780 - TSCA immunotoxicity.
Code of Federal Regulations, 2012 CFR
2012-07-01
... quantitative analysis of the effects of a chemical on the numbers of cells in major lymphocyte populations and..., and the research sample shall be stored under conditions that maintain its purity and stability. Prior... type of effect. (ii) All observed results, quantitative and incidental, shall be evaluated by an...
40 CFR 799.9780 - TSCA immunotoxicity.
Code of Federal Regulations, 2013 CFR
2013-07-01
... quantitative analysis of the effects of a chemical on the numbers of cells in major lymphocyte populations and..., and the research sample shall be stored under conditions that maintain its purity and stability. Prior... type of effect. (ii) All observed results, quantitative and incidental, shall be evaluated by an...
40 CFR 799.9780 - TSCA immunotoxicity.
Code of Federal Regulations, 2010 CFR
2010-07-01
... quantitative analysis of the effects of a chemical on the numbers of cells in major lymphocyte populations and..., and the research sample shall be stored under conditions that maintain its purity and stability. Prior... type of effect. (ii) All observed results, quantitative and incidental, shall be evaluated by an...
40 CFR 799.9780 - TSCA immunotoxicity.
Code of Federal Regulations, 2014 CFR
2014-07-01
... quantitative analysis of the effects of a chemical on the numbers of cells in major lymphocyte populations and..., and the research sample shall be stored under conditions that maintain its purity and stability. Prior... type of effect. (ii) All observed results, quantitative and incidental, shall be evaluated by an...
40 CFR 799.9780 - TSCA immunotoxicity.
Code of Federal Regulations, 2011 CFR
2011-07-01
... quantitative analysis of the effects of a chemical on the numbers of cells in major lymphocyte populations and..., and the research sample shall be stored under conditions that maintain its purity and stability. Prior... type of effect. (ii) All observed results, quantitative and incidental, shall be evaluated by an...
Zhang, Haibo; Yang, Litao; Guo, Jinchao; Li, Xiang; Jiang, Lingxi; Zhang, Dabing
2008-07-23
To enforce the labeling regulations of genetically modified organisms (GMOs), the application of reference molecules as calibrators is becoming essential for the practical quantification of GMOs. However, reported reference molecules carrying multiple targets in tandem have proved unsuitable for duplex PCR analysis. In this study, we developed a unique plasmid molecule based on a pMD-18T vector carrying three exogenous target DNA fragments of Roundup Ready soybean GTS 40-3-2 (RRS), that is, the CaMV35S, NOS, and RRS event fragments, plus one fragment of the endogenous soybean Lectin gene. The Lectin fragment was separated from the three exogenous RRS target fragments by inserting a 2.6 kb DNA fragment unrelated to the RRS detection targets into the resultant plasmid. We then showed that this design allows quantification of RRS in three duplex real-time PCR assays targeting the CaMV35S, NOS, and RRS events, with the reference molecule serving as calibrator. In these duplex PCR assays, the limits of detection (LOD) and quantification (LOQ) were 10 and 50 copies, respectively. For quantitative analysis of practical RRS samples, accuracy and precision were similar to those of the simplex PCR assays; for instance, for samples at the 1% level the mean bias of the simplex and duplex PCR assays was 4.0% and 4.6%, respectively, and statistical analysis (t-test) showed no significant discrepancy between the duplex and simplex quantitative data for any soybean sample. Duplex PCR analysis clearly has the advantages of reducing PCR reagent costs and the experimental errors of simplex PCR testing. The strategy reported in the present study will be helpful for the development of new reference molecules suitable for duplex PCR quantitative assays of GMOs.
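A simplified sketch of how event and endogenous copy numbers from calibrator-based standard curves combine into a GMO percentage is shown below; the Ct values and curve parameters are illustrative assumptions, not taken from the study:

```python
# Minimal sketch: standard-curve quantitation of a GM event relative to an endogenous gene.
def copies_from_ct(ct: float, slope: float, intercept: float) -> float:
    # standard curve built from the plasmid calibrator: Ct = slope * log10(copies) + intercept
    return 10 ** ((ct - intercept) / slope)

event_copies = copies_from_ct(ct=29.1, slope=-3.35, intercept=40.0)    # e.g. RRS event target
lectin_copies = copies_from_ct(ct=22.6, slope=-3.32, intercept=39.5)   # endogenous Lectin target

gmo_percent = 100 * event_copies / lectin_copies
print(f"GMO content ≈ {gmo_percent:.2f}%")
```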
Li, Yuanpeng; Li, Fucui; Yang, Xinhao; Guo, Liu; Huang, Furong; Chen, Zhenqiang; Chen, Xingdan; Zheng, Shifu
2018-08-05
A rapid quantitative analysis model for determining the glycated albumin (GA) content, based on attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopy combined with linear SiPLS and nonlinear SVM, has been developed. First, the reference GA content in human serum was determined by the GA enzymatic method, and the ATR-FTIR spectra of serum samples from a health-examination population were acquired. The spectral data of the whole mid-infrared region (4000-600 cm-1) and of GA's characteristic region (1800-800 cm-1) were used for the quantitative analysis. Second, several preprocessing steps, including first derivative, second derivative, variable standardization and spectral normalization, were performed. Finally, quantitative regression models were established using SiPLS and SVM. The SiPLS modeling results were: root mean square error of cross-validation (RMSECVT) = 0.523 g/L, calibration coefficient (RC) = 0.937, root mean square error of prediction (RMSEPT) = 0.787 g/L, and prediction coefficient (RP) = 0.938. The SVM modeling results were: RMSECVT = 0.0048 g/L, RC = 0.998, RMSEPT = 0.442 g/L, and RP = 0.916. The results indicated that model performance improved significantly after preprocessing and optimization of the characteristic regions, and that the modeling performance of the nonlinear SVM was considerably better than that of the linear SiPLS. Hence, the quantitative analysis model for GA in human serum based on ATR-FTIR combined with SiPLS and SVM is effective. It requires no sample pretreatment and is characterized by simple operation and high time efficiency, providing a rapid and accurate method for GA content determination. Copyright © 2018 Elsevier B.V. All rights reserved.
Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia
2015-11-03
Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100%, 96.6%, 89.5%, 100%, and 97.4%, respectively. Reproducibility of the test was 99.2%, and interobserver variation was 8%, with a false positive rate of 3.4%. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
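The reported figures of merit follow directly from a 2x2 confusion table; the sketch below uses illustrative counts (not the raw study data) chosen to reproduce values close to those quoted:

```python
# Minimal sketch: diagnostic performance metrics from a 2x2 confusion table.
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int):
    sens = tp / (tp + fn)                    # sensitivity
    spec = tn / (tn + fp)                    # specificity
    ppv = tp / (tp + fp)                     # positive predictive value
    npv = tn / (tn + fn)                     # negative predictive value
    acc = (tp + tn) / (tp + fp + tn + fn)    # overall accuracy
    return sens, spec, ppv, npv, acc

# hypothetical counts, roughly consistent with the percentages reported above
print([round(100 * m, 1) for m in diagnostic_metrics(tp=85, fp=10, tn=280, fn=0)])
```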
Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M
2010-11-01
Quantitative ECG Analysis. Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity, and were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but correctly by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved through use of the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in an average cost-savings of $1,303 and 0.007 quality-adjusted-life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.
ERIC Educational Resources Information Center
Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa
2011-01-01
This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…
A Multidimensional Analysis Tool for Visualizing Online Interactions
ERIC Educational Resources Information Center
Kim, Minjeong; Lee, Eunchul
2012-01-01
This study proposes and verifies the performance of an analysis tool for visualizing online interactions. A review of the most widely used methods for analyzing online interactions, including quantitative analysis, content analysis, and social network analysis methods, indicates these analysis methods have some limitations resulting from their…
Quantitative analysis of peel-off degree for printed electronics
NASA Astrophysics Data System (ADS)
Park, Janghoon; Lee, Jongsu; Sung, Ki-Hak; Shin, Kee-Hyun; Kang, Hyunkyoo
2018-02-01
We suggest a facile methodology for evaluating the peel-off degree of printed electronics by image processing. The quantification of peeled and printed areas was performed using open-source programs. To verify the accuracy of the method, we manually removed areas from a printed circuit and measured them, obtaining 96.3% accuracy. The sintered patterns showed a decreasing tendency with increasing energy density of the infrared lamp, while the peel-off degree increased; a comparison between the two results is presented. Finally, the correlation between performance characteristics was determined by quantitative analysis.
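A minimal sketch of this kind of image-based quantification (toy image and hypothetical threshold; not the authors' open-source workflow) computes the peel-off degree as the fraction of bright pixels inside the printed region:

```python
# Minimal sketch: peel-off degree as peeled pixels / printed pixels via thresholding.
import numpy as np

rng = np.random.default_rng(3)
pattern = np.zeros((200, 200), dtype=bool)
pattern[50:150, 50:150] = True                              # nominal printed region (design mask)

img = np.where(pattern, 0.2, 0.9) + rng.normal(0, 0.05, (200, 200))  # dark ink on bright substrate
img[80:100, 80:120] = 0.85                                  # simulate a peeled patch (substrate showing)

peeled = pattern & (img > 0.5)                              # bright pixels inside the printed region
peel_off_degree = 100 * peeled.sum() / pattern.sum()
print(f"peel-off degree = {peel_off_degree:.1f}%")
```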
NASA Technical Reports Server (NTRS)
Kuehner, S. M.; Laughlin, J. R.; Grossman, L.; Johnson, M. L.; Burnett, D. S.
1989-01-01
The applicability of ion microprobe (IMP) for quantitative analysis of minor elements (Sr, Y, Zr, La, Sm, and Yb) in the major phases present in natural Ca-, Al-rich inclusions (CAIs) was investigated by comparing IMP results with those of an electron microprobe (EMP). Results on three trace-element-doped glasses indicated that it is not possible to obtain precise quantitative analysis by using IMP if there are large differences in SiO2 content between the standards used to derive the ion yields and the unknowns.
Quality Assessments of Long-Term Quantitative Proteomic Analysis of Breast Cancer Xenograft Tissues
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Jian-Ying; Chen, Lijun; Zhang, Bai
The identification of protein biomarkers requires large-scale analysis of human specimens to achieve statistical significance. In this study, we evaluated the long-term reproducibility of an iTRAQ (isobaric tags for relative and absolute quantification) based quantitative proteomics strategy using one channel for universal normalization across all samples. A total of 307 liquid chromatography tandem mass spectrometric (LC-MS/MS) analyses were completed, generating 107 one-dimensional (1D) LC-MS/MS datasets and 8 offline two-dimensional (2D) LC-MS/MS datasets (25 fractions for each set) for human-in-mouse breast cancer xenograft tissues representative of basal and luminal subtypes. Such large-scale studies require the implementation of robust metrics to assess the contributions of technical and biological variability in the qualitative and quantitative data. Accordingly, we developed a quantification confidence score based on the quality of each peptide-spectrum match (PSM) to remove quantification outliers from each analysis. After combining confidence score filtering and statistical analysis, reproducible protein identification and quantitative results were achieved from LC-MS/MS datasets collected over a 16 month period.
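The universal-normalization idea (one reporter channel shared across all runs) can be illustrated with made-up reporter intensities; this is a sketch of the general approach, not the study's actual pipeline:

```python
# Minimal sketch: normalizing iTRAQ reporter channels to a universal reference channel.
import numpy as np

# rows = peptide-spectrum matches, columns = iTRAQ reporter channels;
# assume (hypothetically) that the last column is the universal reference pooled across samples
reporters = np.array([[1200.,  950., 1800., 1000.],
                      [ 430.,  520.,  300.,  450.],
                      [8000., 7600., 9100., 7900.]])

ratios = reporters[:, :-1] / reporters[:, [-1]]     # each sample channel vs. the reference
log_ratios = np.log2(ratios)
log_ratios -= np.median(log_ratios, axis=0)         # simple per-channel median centering
print(np.round(log_ratios, 2))
```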
Hou, Zhifei; Sun, Guoxiang; Guo, Yong
2016-01-01
The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted by the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation measured immediately beforehand and the composition similarities calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
ERIC Educational Resources Information Center
Fenk, Christopher J.; Hickman, Nicole M.; Fincke, Melissa A.; Motry, Douglas H.; Lavine, Barry
2010-01-01
An undergraduate LC-MS experiment is described for the identification and quantitative determination of acetaminophen, acetylsalicylic acid, and caffeine in commercial analgesic tablets. This inquiry-based experimental procedure requires minimal sample preparation and provides good analytical results. Students are provided sufficient background…
Behavioral Changes Based on a Course in Agroecology: A Mixed Methods Study
ERIC Educational Resources Information Center
Harms, Kristyn; King, James; Francis, Charles
2009-01-01
This study evaluated and described student perceptions of a course in agroecology to determine if participants experienced changed perceptions and behaviors resulting from the Agroecosystems Analysis course. A triangulation validating quantitative data mixed methods approach included a written survey comprised of both quantitative and open-ended…
ERIC Educational Resources Information Center
Currier, Joseph M.; Neimeyer, Robert A.; Berman, Jeffrey S.
2008-01-01
Previous quantitative reviews of research on psychotherapeutic interventions for bereaved persons have yielded divergent findings and have not included many of the available controlled outcome studies. This meta-analysis summarizes results from 61 controlled studies to offer a more comprehensive integration of this literature. This review examined…
Are Teacher Course Evaluations Biased against Faculty That Teach Quantitative Methods Courses?
ERIC Educational Resources Information Center
Royal, Kenneth D.; Stockdale, Myrah R.
2015-01-01
The present study investigated graduate students' responses to teacher/course evaluations (TCE) to determine if students' responses were inherently biased against faculty who teach quantitative methods courses. Item response theory (IRT) and Differential Item Functioning (DIF) techniques were utilized for data analysis. Results indicate students…
Improving Student Retention and Performance in Quantitative Courses Using Clickers
ERIC Educational Resources Information Center
Liu, Wallace C.; Stengel, Donald N.
2011-01-01
Clickers offer instructors of mathematics-related courses an opportunity to involve students actively in class sessions while diminishing the embarrassment of being wrong. This paper reports on the use of clickers in two university-level courses in quantitative analysis and business statistics. Results for student retention and examination…
Engelberg, Jesse A.; Giberson, Richard T.; Young, Lawrence J.T.; Hubbard, Neil E.
2014-01-01
Microwave methods of fixation can dramatically shorten fixation times while preserving tissue structure; however, it remains unclear if adequate tissue antigenicity is preserved. To assess and validate antigenicity, robust quantitative methods and animal disease models are needed. We used two mouse mammary models of human breast cancer to evaluate microwave-assisted and standard 24-hr formalin fixation. The mouse models expressed four antigens prognostic for breast cancer outcome: estrogen receptor, progesterone receptor, Ki67, and human epidermal growth factor receptor 2. Using pathologist evaluation and novel methods of quantitative image analysis, we measured and compared the quality of antigen preservation, percentage of positive cells, and line plots of cell intensity. Visual evaluations by pathologists established that the amounts and patterns of staining were similar in tissues fixed by the different methods. The results of the quantitative image analysis provided a fine-grained evaluation, demonstrating that tissue antigenicity is preserved in tissues fixed using microwave methods. Evaluation of the results demonstrated that a 1-hr, 150-W fixation is better than a 45-min, 150-W fixation followed by a 15-min, 650-W fixation. The results demonstrated that microwave-assisted formalin fixation can standardize fixation times to 1 hr and produce immunohistochemistry that is in every way commensurate with longer conventional fixation methods. PMID:24682322
Quantitative Appearance Inspection for Film Coated Tablets.
Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru
2016-01-01
The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.
Feared consequences of panic attacks in panic disorder: a qualitative and quantitative analysis.
Raffa, Susan D; White, Kamila S; Barlow, David H
2004-01-01
Cognitions are hypothesized to play a central role in panic disorder (PD). Previous studies have used questionnaires to assess cognitive content, focusing on prototypical cognitions associated with PD; however, few studies have qualitatively examined cognitions associated with the feared consequences of panic attacks. The purpose of this study was to conduct a qualitative and quantitative analysis of feared consequences of panic attacks. The initial, qualitative analysis resulted in the development of 32 categories of feared consequences. The categories were derived from participant responses to a standardized, semi-structured question (n = 207). Five expert-derived categories were then utilized to quantitatively examine the relationship between cognitions and indicators of PD severity. Cognitions did not predict PD severity; however, correlational analyses indicated some predictive validity to the expert-derived categories. The qualitative analysis identified additional areas of patient-reported concern not included in previous research that may be important in the assessment and treatment of PD.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhandari, Deepak; Kertesz, Vilmos; Van Berkel, Gary J
RATIONALE: Ascorbic acid (AA) and folic acid (FA) are water-soluble vitamins and are usually fortified in food and dietary supplements. For the safety of human health, proper intake of these vitamins is recommended. Improvement in the analysis time required for the quantitative determination of these vitamins in food and nutritional formulations is desired. METHODS: A simple and fast (~5 min) in-tube sample preparation was performed, independently for FA and AA, by mixing extraction solvent with a powdered sample aliquot followed by agitation, centrifugation, and filtration to recover an extract for analysis. Quantitative detection was achieved by flow-injection (1 μL injection volume) electrospray ionization tandem mass spectrometry (ESI-MS/MS) in negative ion mode using the method of standard addition. RESULTS: The method of standard addition was employed for the quantitative estimation of each vitamin in a sample extract. At least 2 spiked and 1 non-spiked sample extracts were injected in triplicate for each quantitative analysis. Given an injection-to-injection interval of approximately 2 min, about 18 min was required to complete the quantitative estimation of each vitamin. The concentration values obtained for the respective vitamins in the standard reference material (SRM) 3280 using this approach were within the statistical range of the certified values provided in the NIST Certificate of Analysis. The estimated limits of detection of FA and AA were 13 and 5.9 ng/g, respectively. CONCLUSIONS: Flow-injection ESI-MS/MS was successfully applied for the rapid quantitation of FA and AA in SRM 3280 multivitamin/multielement tablets.
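The method of standard addition used here amounts to fitting signal against spiked amount and extrapolating to the x-intercept; the sketch below uses illustrative numbers only:

```python
# Minimal sketch: quantitation by the method of standard addition.
import numpy as np

added = np.array([0.0, 50.0, 100.0])        # ng/g of analyte spiked into the extract
signal = np.array([1.00, 1.95, 2.88])       # MS/MS response (arbitrary units)

slope, intercept = np.polyfit(added, signal, 1)
native = intercept / slope                  # magnitude of the x-intercept = native concentration
print(f"estimated native concentration ≈ {native:.1f} ng/g")
```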
Yang, Jianhong; Li, Xiaomeng; Xu, Jinwu; Ma, Xianghong
2018-01-01
The quantitative analysis accuracy of calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is severely affected by the self-absorption effect and estimation of plasma temperature. Herein, a CF-LIBS quantitative analysis method based on the auto-selection of internal reference line and the optimized estimation of plasma temperature is proposed. The internal reference line of each species is automatically selected from analytical lines by a programmable procedure through easily accessible parameters. Furthermore, the self-absorption effect of the internal reference line is considered during the correction procedure. To improve the analysis accuracy of CF-LIBS, the particle swarm optimization (PSO) algorithm is introduced to estimate the plasma temperature based on the calculation results from the Boltzmann plot. Thereafter, the species concentrations of a sample can be calculated according to the classical CF-LIBS method. A total of 15 certified alloy steel standard samples of known compositions and elemental weight percentages were used in the experiment. Using the proposed method, the average relative errors of Cr, Ni, and Fe calculated concentrations were 4.40%, 6.81%, and 2.29%, respectively. The quantitative results demonstrated an improvement compared with the classical CF-LIBS method and the promising potential of in situ and real-time application.
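The Boltzmann-plot step that precedes the PSO refinement can be sketched as a simple linear fit of ln(I*lambda/gA) against upper-level energy; the line data below are illustrative (not a real Fe/Cr/Ni line list) and the PSO optimization described in the paper is omitted:

```python
# Minimal sketch: plasma temperature from the slope of a Boltzmann plot.
import numpy as np

k_B = 8.617e-5                                   # Boltzmann constant, eV/K
# per emission line: intensity I, wavelength lam (nm), upper-level energy E (eV),
# and the product of statistical weight and transition probability gA (s^-1)
I   = np.array([1.2e4, 6.0e3, 2.5e3, 9.0e2])
lam = np.array([380.0, 390.0, 400.0, 410.0])
E   = np.array([3.0, 3.5, 4.0, 4.5])
gA  = np.array([2.0e8, 1.3e8, 7.0e7, 3.2e7])

y = np.log(I * lam / gA)                         # Boltzmann-plot ordinate
slope, _ = np.polyfit(E, y, 1)                   # slope = -1 / (k_B * T)
T = -1.0 / (k_B * slope)
print(f"plasma temperature ≈ {T:.0f} K")
```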
Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes
2007-06-01
Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS) whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS is caused by mosaic imprinting defects which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation as well as copy number of up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate, reliable and less time consuming.
NASA Technical Reports Server (NTRS)
Kruse, Fred A.; Dwyer, John L.
1993-01-01
The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two techniques for calibrating AVIRIS data to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
The APOSTEL recommendations for reporting quantitative optical coherence tomography studies.
Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm; Saidha, Shiv; Martinez-Lapiscina, Elena H; Lagreze, Wolf A; Schuman, Joel S; Villoslada, Pablo; Calabresi, Peter; Balcer, Laura; Petzold, Axel; Green, Ari J; Paul, Friedemann; Brandt, Alexander U; Albrecht, Philipp
2016-06-14
To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. We provide a 9-point checklist encompassing aspects deemed relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. The Advised Protocol for OCT Study Terminology and Elements recommendations include core items to standardize and improve quality of reporting in quantitative OCT studies. The recommendations will make reporting of quantitative OCT studies more consistent and in line with existing standards for reporting research in other biomedical areas. The recommendations originated from expert consensus and thus represent Class IV evidence. They will need to be regularly adjusted according to new insights and practices. © 2016 American Academy of Neurology.
Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.
Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf
2008-09-01
Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative analytical methods are needed as alternatives for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r = 0.997). By additionally staining all proteins fluorescently immediately after their transfer to the blot membrane, it is possible to visualise the antibody binding and the total protein profile simultaneously, allowing accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting reproduces a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest applications that go far beyond.
Quantitative analysis of the mixtures of illicit drugs using terahertz time-domain spectroscopy
NASA Astrophysics Data System (ADS)
Jiang, Dejun; Zhao, Shusen; Shen, Jingling
2008-03-01
A method was proposed to quantitatively analyse mixtures of illicit drugs with the terahertz time-domain spectroscopy technique. The mass percentages of all components in a mixture can be obtained by linear regression analysis, on the assumption that all components in the mixture and their absorption features are known. Because illicit drugs are scarce and expensive, we first used common chemicals, benzophenone, anthraquinone, pyridoxine hydrochloride and L-ascorbic acid, in the experiment. Then an illicit drug and a common adulterant, methamphetamine and flour, were selected for our experiment. Experimental results were in significant agreement with the actual content, suggesting that this could be an effective method for quantitative identification of illicit drugs.
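The linear-regression step described above can be sketched as a non-negative least-squares fit of the mixture spectrum to the known component spectra; all spectra below are synthetic stand-ins, not measured THz data:

```python
# Minimal sketch: mixture composition by non-negative least squares on component spectra.
import numpy as np
from scipy.optimize import nnls

freq = np.linspace(0.2, 2.6, 300)                       # frequency axis, THz
def band(center, width):                                # toy absorption feature
    return np.exp(-((freq - center) / width) ** 2)

components = np.column_stack([band(1.2, 0.10), band(1.8, 0.15), band(0.8, 0.20)])
true_frac = np.array([0.5, 0.3, 0.2])
mixture = components @ true_frac + np.random.default_rng(4).normal(0, 0.01, freq.size)

coeffs, _ = nnls(components, mixture)                   # non-negative least-squares coefficients
mass_pct = 100 * coeffs / coeffs.sum()
print(np.round(mass_pct, 1))
```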
Wu, Shulian; Huang, Yudian; Li, Hui; Wang, Yunxia; Zhang, Xiaoman
2015-01-01
Dermatofibrosarcoma protuberans (DFSP) is a skin cancer usually mistaken for benign tumors. Inadequate DFSP resection results in tumor recurrence. Quantitative characterization of collagen alteration in the skin tumor is essential for developing a diagnostic technique. In this study, second harmonic generation (SHG) microscopy was performed to obtain images of human DFSP skin and normal skin. Subsequently, structure and texture analysis methods were applied to determine the differences in skin texture characteristics between the two skin types, and the link between collagen alteration and the tumor was established. The results suggest that combining SHG microscopy and texture analysis methods is a feasible and effective way to describe the characteristics of skin tumors such as DFSP. © Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai
2016-02-01
A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr09171c
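As an illustration of how a calculated limit of detection from such a strip-reader calibration might be obtained, the sketch below assumes a 3 x SD-of-blank criterion and invented reader data; the authors' exact procedure may differ:

```python
# Minimal sketch: calculated LOD from a strip-reader calibration (3 * SD_blank / slope).
import numpy as np

blank_readings = np.array([0.021, 0.018, 0.024, 0.020, 0.019])   # reader signal for blank strips
conc = np.array([0.0, 0.5, 1.0, 2.5, 5.0])                        # spiking levels, ug/kg
signal = np.array([0.020, 0.110, 0.205, 0.495, 0.980])            # mean reader signal per level

slope, intercept = np.polyfit(conc, signal, 1)                     # linear calibration curve
lod = 3 * blank_readings.std(ddof=1) / slope
print(f"LOD ≈ {lod:.3f} ug/kg")
```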
Quantitation of glycerophosphorylcholine by flow injection analysis using immobilized enzymes.
Mancini, A; Del Rosso, F; Roberti, R; Caligiana, P; Vecchini, A; Binaglia, L
1996-09-20
A method for quantitating glycerophosphorylcholine by flow injection analysis is reported in the present paper. Glycerophosphorylcholine phosphodiesterase and choline oxidase, immobilized on controlled-porosity glass beads, are packed in a small reactor inserted in a flow injection manifold. When samples containing glycerophosphorylcholine are injected, glycerophosphorylcholine is hydrolyzed into choline and sn-glycerol-3-phosphate. The free choline produced in this reaction is oxidized to betaine and hydrogen peroxide. Hydrogen peroxide is detected amperometrically. Quantitation of glycerophosphorylcholine in samples containing choline and phosphorylcholine is obtained by inserting, ahead of the reactor, a small column packed with a mixed-bed ion exchange resin. The time needed for each determination does not exceed one minute. The present method, applied to quantitate glycerophosphorylcholine in samples of seminal plasma, gave results comparable to those obtained using the standard enzymatic-spectrophotometric procedure. An alternative procedure, making use of co-immobilized glycerophosphorylcholine phosphodiesterase and glycerol-3-phosphate oxidase for quantitating glycerophosphorylcholine, glycerophosphorylethanolamine and glycerophosphorylserine, is also described.
Google glass based immunochromatographic diagnostic test analysis
NASA Astrophysics Data System (ADS)
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2015-03-01
Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
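A minimal sketch of the calibration step described above: a straight-line fit of RDT test-line intensity against known PSA concentration, inverted to read off an unknown sample. The concentration and intensity values, and the choice of a linear model, are illustrative assumptions rather than the authors' data or fitting procedure.

```python
import numpy as np

# Hypothetical calibration data: known PSA concentrations (ng/mL) and the
# corresponding normalized test-line intensities read from RDT images.
psa_ng_ml = np.array([0.0, 12.5, 25.0, 50.0, 100.0, 200.0])
line_intensity = np.array([0.02, 0.10, 0.19, 0.35, 0.62, 0.95])

# Fit a straight-line calibration: intensity = a * concentration + b.
a, b = np.polyfit(psa_ng_ml, line_intensity, deg=1)

def psa_from_intensity(intensity: float) -> float:
    """Invert the calibration curve to estimate PSA concentration."""
    return (intensity - b) / a

print(f"slope={a:.4f}, intercept={b:.4f}")
print(f"estimated PSA for intensity 0.40: {psa_from_intensity(0.40):.1f} ng/mL")
```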
Wu, Yi-Hsuan; Hu, Chia-Wei; Chien, Chih-Wei; Chen, Yu-Ju; Huang, Hsuan-Cheng; Juan, Hsueh-Fen
2013-01-01
ATP synthase is present on the plasma membrane of several types of cancer cells. Citreoviridin, an ATP synthase inhibitor, selectively suppresses the proliferation and growth of lung cancer without affecting normal cells. However, the global effects of targeting ectopic ATP synthase in vivo have not been well defined. In this study, we performed quantitative proteomic analysis using isobaric tags for relative and absolute quantitation (iTRAQ) and provided a comprehensive insight into the complicated regulation by citreoviridin in a lung cancer xenograft model. With high reproducibility of the quantitation, we obtained quantitative proteomic profiling with 2,659 proteins identified. Bioinformatics analysis of the 141 differentially expressed proteins selected by their relative abundance revealed that citreoviridin induces alterations in the expression of glucose metabolism-related enzymes in lung cancer. The up-regulation of enzymes involved in gluconeogenesis and storage of glucose indicated that citreoviridin may reduce the glycolytic intermediates for macromolecule synthesis and inhibit cell proliferation. Using comprehensive proteomics, the results identify metabolic aspects that help explain the antitumorigenic effect of citreoviridin in lung cancer, which may lead to a better understanding of the links between metabolism and tumorigenesis in cancer therapy. PMID:23990911
A benchmark for comparison of dental radiography analysis algorithms.
Wang, Ching-Wei; Huang, Cheng-Ta; Lee, Jia-Hong; Li, Chung-Hsing; Chang, Sheng-Wei; Siao, Ming-Jhih; Lai, Tat-Ming; Ibragimov, Bulat; Vrtovec, Tomaž; Ronneberger, Olaf; Fischer, Philipp; Cootes, Tim F; Lindner, Claudia
2016-07-01
Dental radiography plays an important role in clinical diagnosis, treatment and surgery. In recent years, efforts have been made to develop computerized dental X-ray image analysis systems for clinical use. A novel framework for objective evaluation of automatic dental radiography analysis algorithms has been established under the auspices of the IEEE International Symposium on Biomedical Imaging 2015 Bitewing Radiography Caries Detection Challenge and Cephalometric X-ray Image Analysis Challenge. In this article, we present the datasets, methods and results of the challenge and lay down the principles for future uses of this benchmark. The main contributions of the challenge include the creation of the dental anatomy data repository of bitewing radiographs, the creation of the anatomical abnormality classification data repository of cephalometric radiographs, and the definition of objective quantitative evaluation for comparison and ranking of the algorithms. With this benchmark, seven automatic methods for analysing cephalometric X-ray images and two automatic methods for detecting bitewing radiography caries have been compared, and detailed quantitative evaluation results are presented in this paper. Based on the quantitative evaluation results, we believe automatic dental radiography analysis is still a challenging and unsolved problem. The datasets and the evaluation software will be made available to the research community, further encouraging future developments in this field. (http://www-o.ntust.edu.tw/~cweiwang/ISBI2015/). Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Visual Aggregate Analysis of Eligibility Features of Clinical Trials
He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua
2015-01-01
Objective To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Methods Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. Results We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions “hypertension” and “Type 2 diabetes”, respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. Conclusions We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. PMID:25615940
Caballero, Julio; Fernández, Michael; Coll, Deysma
2010-12-01
Three-dimensional quantitative structure-activity relationship studies were carried out on a series of 28 organosulphur compounds as 15-lipoxygenase inhibitors using comparative molecular field analysis and comparative molecular similarity indices analysis. Quantitative information on structure-activity relationships is provided for further rational development and direction of selective synthesis. All models were built over a training set of 22 compounds. The best comparative molecular field analysis model included only the steric field and had a good Q² = 0.789. Comparative molecular similarity indices analysis outperformed the comparative molecular field analysis results: the best comparative molecular similarity indices analysis model also included only the steric field and had a Q² = 0.894. In addition, this model adequately predicted the compounds in the test set. Furthermore, plots of the steric comparative molecular similarity indices analysis field allowed conclusions to be drawn for the choice of suitable inhibitors. In this sense, our model should prove useful in future 15-lipoxygenase inhibitor design studies. © 2010 John Wiley & Sons A/S.
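For readers unfamiliar with the Q² statistic quoted above, the sketch below computes a leave-one-out cross-validated Q² (1 − PRESS/SS) on synthetic data, using an ordinary least-squares model as a stand-in for the partial least-squares models actually used in comparative molecular field/similarity indices analysis.

```python
import numpy as np

def loo_q2(X: np.ndarray, y: np.ndarray) -> float:
    """Leave-one-out cross-validated Q^2 = 1 - PRESS / SS_total.

    Uses an ordinary least-squares fit as a stand-in for PLS; the Q^2
    statistic itself is computed in the usual way.
    """
    n = len(y)
    press = 0.0
    for i in range(n):
        mask = np.arange(n) != i
        # Fit on all compounds except i (add an intercept column).
        A = np.column_stack([X[mask], np.ones(mask.sum())])
        coef, *_ = np.linalg.lstsq(A, y[mask], rcond=None)
        y_pred = np.append(X[i], 1.0) @ coef
        press += (y[i] - y_pred) ** 2
    ss_total = np.sum((y - y.mean()) ** 2)
    return 1.0 - press / ss_total

# Synthetic example: 22 training compounds, 3 steric descriptors.
rng = np.random.default_rng(0)
X = rng.normal(size=(22, 3))
y = X @ np.array([1.2, -0.7, 0.4]) + rng.normal(scale=0.3, size=22)
print(f"LOO Q^2 = {loo_q2(X, y):.3f}")
```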
Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang
2017-01-01
Objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images is significant for the treatment monitoring of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performance of the CAA-CRM approach in treatment monitoring is evaluated by computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than the system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that the CAA-CRM approach localizes recovered regions with 93.4% accuracy. Moreover, the quantitative indexes of recovered regions derived from CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive quantitative indexes from longitudinal SPECT brain images for treatment monitoring.
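A simplified illustration of the change-rate map idea: the voxel-wise percentage change between two co-registered longitudinal volumes. The registration, normalization and index-extraction steps of the CAA-CRM method are omitted, and the toy volumes are assumptions.

```python
import numpy as np

def change_rate_map(baseline: np.ndarray, follow_up: np.ndarray,
                    brain_mask: np.ndarray) -> np.ndarray:
    """Voxel-wise change rate (%) between two co-registered SPECT volumes."""
    crm = np.zeros_like(baseline, dtype=float)
    valid = brain_mask & (baseline > 0)
    crm[valid] = 100.0 * (follow_up[valid] - baseline[valid]) / baseline[valid]
    return crm

# Toy 3D volumes standing in for longitudinal rCBF SPECT images.
base = np.full((4, 4, 4), 100.0)
post = base.copy()
post[1:3, 1:3, 1:3] = 130.0          # a "recovered" region, +30%
mask = np.ones_like(base, dtype=bool)
crm = change_rate_map(base, post, mask)
print("max change rate:", crm.max(), "%")
```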
On sweat analysis for quantitative estimation of dehydration during physical exercise.
Ring, Matthias; Lohmueller, Clemens; Rauh, Manfred; Eskofier, Bjoern M
2015-08-01
Quantitative estimation of water loss during physical exercise is of importance because dehydration can impair both muscular strength and aerobic endurance. A physiological indicator for deficit of total body water (TBW) might be the concentration of electrolytes in sweat. It has been shown that concentrations differ after physical exercise depending on whether water loss was replaced by fluid intake or not. However, to the best of our knowledge, this fact has not been examined for its potential to quantitatively estimate TBW loss. Therefore, we conducted a study in which sweat samples were collected continuously during two hours of physical exercise without fluid intake. A statistical analysis of these sweat samples revealed significant correlations between chloride concentration in sweat and TBW loss (r = 0.41, p < 0.01), and between sweat osmolality and TBW loss (r = 0.43, p < 0.01). A quantitative estimation of TBW loss resulted in a mean absolute error of 0.49 l per estimation. Although the precision has to be improved for practical applications, the present results suggest that TBW loss estimation could be realizable using sweat samples.
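The correlation and estimation error reported above can be reproduced in outline as follows; the paired chloride and TBW-loss values are invented for illustration and do not come from the study.

```python
import numpy as np

# Hypothetical paired observations: sweat chloride concentration (mmol/L)
# and measured total-body-water loss (litres); values are illustrative only.
chloride = np.array([22, 31, 27, 40, 35, 29, 45, 38, 33, 26], dtype=float)
tbw_loss = np.array([0.9, 1.3, 1.1, 1.8, 1.5, 1.2, 2.0, 1.7, 1.4, 1.0])

r = np.corrcoef(chloride, tbw_loss)[0, 1]          # Pearson correlation
slope, intercept = np.polyfit(chloride, tbw_loss, 1)
predicted = slope * chloride + intercept
mae = np.mean(np.abs(predicted - tbw_loss))        # mean absolute error (L)

print(f"r = {r:.2f}, MAE = {mae:.2f} L")
```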
Visualization and Quantitative Analysis of Crack-Tip Plastic Zone in Pure Nickel
NASA Astrophysics Data System (ADS)
Kelton, Randall; Sola, Jalal Fathi; Meletis, Efstathios I.; Huang, Haiying
2018-05-01
Changes in surface morphology have long been thought to be associated with crack propagation in metallic materials. We have studied areal surface texture changes around crack tips in an attempt to understand the correlations between surface texture changes and crack growth behavior. Detailed profiling of the fatigue sample surface was carried out at short fatigue intervals. An image processing algorithm was developed to calculate the surface texture changes. Quantitative analysis of the crack-tip plastic zone, crack-arrested sites near triple points, and large surface texture changes associated with crack release from arrested locations was carried out. The results indicate that surface texture imaging enables visualization of the development of plastic deformation around a crack tip. Quantitative analysis of the surface texture changes reveals the effects of local microstructures on the crack growth behavior.
NASA Astrophysics Data System (ADS)
Neubert, M.; Jurisch, M.
2015-06-01
The paper analyzes experimental compositional profiles in Vertical Bridgman (VB, VGF) grown (Cd,Zn)Te crystals reported in the literature. The origin of the observed axial ZnTe-distribution profiles is attributed to dendritic growth after initial nucleation from supercooled melts. The analysis was done by utilizing a boundary layer model providing a very good approximation of the experimental data. In addition to the discussion of the qualitative results, a quantitative analysis of the fitted model parameters is presented, to the extent permitted by the utilized model.
Temporal lobe epilepsy: quantitative MR volumetry in detection of hippocampal atrophy.
Farid, Nikdokht; Girard, Holly M; Kemmotsu, Nobuko; Smith, Michael E; Magda, Sebastian W; Lim, Wei Y; Lee, Roland R; McDonald, Carrie R
2012-08-01
To determine the ability of fully automated volumetric magnetic resonance (MR) imaging to depict hippocampal atrophy (HA) and to help correctly lateralize the seizure focus in patients with temporal lobe epilepsy (TLE). This study was conducted with institutional review board approval and in compliance with HIPAA regulations. Volumetric MR imaging data were analyzed for 34 patients with TLE and 116 control subjects. Structural volumes were calculated by using U.S. Food and Drug Administration-cleared software for automated quantitative MR imaging analysis (NeuroQuant). Results of quantitative MR imaging were compared with visual detection of atrophy, and, when available, with histologic specimens. Receiver operating characteristic analyses were performed to determine the optimal sensitivity and specificity of quantitative MR imaging for detecting HA and asymmetry. A linear classifier with cross validation was used to estimate the ability of quantitative MR imaging to help lateralize the seizure focus. Quantitative MR imaging-derived hippocampal asymmetries discriminated patients with TLE from control subjects with high sensitivity (86.7%-89.5%) and specificity (92.2%-94.1%). When a linear classifier was used to discriminate left versus right TLE, hippocampal asymmetry achieved 94% classification accuracy. Volumetric asymmetries of other subcortical structures did not improve classification. Compared with invasive video electroencephalographic recordings, lateralization accuracy was 88% with quantitative MR imaging and 85% with visual inspection of volumetric MR imaging studies but only 76% with visual inspection of clinical MR imaging studies. Quantitative MR imaging can depict the presence and laterality of HA in TLE with accuracy rates that may exceed those achieved with visual inspection of clinical MR imaging studies. Thus, quantitative MR imaging may enhance standard visual analysis, providing a useful and viable means for translating volumetric analysis into clinical practice.
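One common way to express the hippocampal asymmetry used for lateralization is a normalized left-right difference; the exact index computed by the automated software in the study may differ, and the volumes below are hypothetical.

```python
def asymmetry_index(left_volume: float, right_volume: float) -> float:
    """Normalized left-right asymmetry, (L - R) / ((L + R) / 2)."""
    return (left_volume - right_volume) / ((left_volume + right_volume) / 2.0)

# Hypothetical hippocampal volumes (cm^3) from automated segmentation.
patients = {"TLE_left_focus": (2.4, 3.1), "control": (3.0, 3.05)}
for label, (left, right) in patients.items():
    print(f"{label}: asymmetry index = {asymmetry_index(left, right):+.3f}")
```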
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast
Pang, Wei; Coghill, George M.
2015-01-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377
Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh
2011-03-01
It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a topic of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.
Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He
2017-07-01
This study attempts to evaluate the quality of Chinese formula granules by combined use of multi-component simultaneous quantitative analysis and bioassay. The rhubarb dispensing granules were used as the model drug for demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of the 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; blood activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood activating biopotencies. The results of multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.
Comprehensive Quantitative Analysis on Privacy Leak Behavior
Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan
2013-01-01
Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046
MilQuant: a free, generic software tool for isobaric tagging-based quantitation.
Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo
2012-09-18
Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to diversity in the choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya
2015-01-07
To expand the application scope of nuclear magnetic resonance (NMR) technology in quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG). Ciprofloxacin (Cipro) has been used as the internal standard (IS). Influential factors, including the relaxation delay time (d1) and pulse angle, impacting the accuracy and precision of spectral data are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
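The quantitation itself follows the standard internal-standard qNMR relation; a sketch is shown below. The molecular weights, fluorine counts, and integral values are stated as assumptions for illustration, not values reported in the paper.

```python
def qnmr_amount(I_analyte: float, I_std: float,
                n_f_analyte: int, n_f_std: int,
                mw_analyte: float, mw_std: float,
                mass_std_mg: float, purity_std: float = 1.0) -> float:
    """Amount of analyte (mg) from integrals of analyte and internal standard.

    Standard qNMR relation: m_a = (I_a/I_s) * (N_s/N_a) * (MW_a/MW_s) * m_s * P_s.
    """
    return (I_analyte / I_std) * (n_f_std / n_f_analyte) \
        * (mw_analyte / mw_std) * mass_std_mg * purity_std

# Example with assumed values: sitagliptin phosphate monohydrate
# (assumed 6 F per molecule, MW ~523) vs ciprofloxacin (1 F, MW ~331).
amount = qnmr_amount(I_analyte=3.20, I_std=1.00,
                     n_f_analyte=6, n_f_std=1,
                     mw_analyte=523.3, mw_std=331.3,
                     mass_std_mg=10.0)
print(f"estimated analyte amount: {amount:.2f} mg")
```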
A Quantitative Approach to Scar Analysis
Khorasani, Hooman; Zheng, Zhong; Nguyen, Calvin; Zara, Janette; Zhang, Xinli; Wang, Joyce; Ting, Kang; Soo, Chia
2011-01-01
Analysis of collagen architecture is essential to wound healing research. However, to date no consistent methodologies exist for quantitatively assessing dermal collagen architecture in scars. In this study, we developed a standardized approach for quantitative analysis of scar collagen morphology by confocal microscopy using fractal dimension and lacunarity analysis. Full-thickness wounds were created on adult mice, closed by primary intention, and harvested at 14 days after wounding for morphometrics and standard Fourier transform-based scar analysis as well as fractal dimension and lacunarity analysis. In addition, transmission electron microscopy was used to evaluate collagen ultrastructure. We demonstrated that fractal dimension and lacunarity analysis were superior to Fourier transform analysis in discriminating scar versus unwounded tissue in a wild-type mouse model. To fully test the robustness of this scar analysis approach, a fibromodulin-null mouse model that heals with increased scar was also used. Fractal dimension and lacunarity analysis effectively discriminated unwounded fibromodulin-null versus wild-type skin as well as healing fibromodulin-null versus wild-type wounds, whereas Fourier transform analysis failed to do so. Furthermore, fractal dimension and lacunarity data also correlated well with transmission electron microscopy collagen ultrastructure analysis, adding to their validity. These results demonstrate that fractal dimension and lacunarity are more sensitive than Fourier transform analysis for quantification of scar morphology. PMID:21281794
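As an illustration of one half of the scar metric pair, a box-counting estimate of fractal dimension for a binary collagen mask might look like the following; lacunarity and the confocal preprocessing used in the study are not reproduced, and the toy mask is an assumption.

```python
import numpy as np

def box_counting_dimension(binary_image: np.ndarray) -> float:
    """Estimate the fractal dimension of a 2D binary mask by box counting."""
    size = binary_image.shape[0]
    # Box sizes: powers of two smaller than the (square) image side.
    box_sizes = [2 ** k for k in range(1, int(np.log2(size)))]
    counts = []
    for b in box_sizes:
        s = size - size % b
        trimmed = binary_image[:s, :s]
        # Count boxes of side b containing at least one foreground pixel.
        blocks = trimmed.reshape(s // b, b, s // b, b).any(axis=(1, 3))
        counts.append(blocks.sum())
    # Slope of log(count) vs log(1/box size) gives the dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)),
                          np.log(np.array(counts)), 1)
    return slope

# Toy "collagen" mask: a filled square should give a dimension close to 2.
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(f"box-counting dimension ~ {box_counting_dimension(img):.2f}")
```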
Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.
Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S
2016-07-01
To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.
NASA Technical Reports Server (NTRS)
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
2010-01-01
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability and maintainability analysis, and present findings and observations based on analysis leading to the Ground Systems Preliminary Design Review milestone.
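A generic reliability relation that underlies this kind of availability estimate is inherent availability, A = MTBF / (MTBF + MTTR); the sketch below is illustrative only, and the Ground Operations analysis used subsystem-specific models and data.

```python
def inherent_availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state (inherent) availability, A = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# Hypothetical ground subsystem: fails every 2000 h, takes 8 h to repair.
print(f"availability = {inherent_availability(2000.0, 8.0):.4f}")
```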
Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad
2018-02-01
Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm² and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm²). Based on visual inspections of "markup" images, the CD3 quantitation algorithms produced adequate accuracy. Additionally, the CD3 quantitation algorithms correlated with each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
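A rough analogue of the custom thresholding/positive-pixel-counting measure might look like the sketch below; the stain-channel separation, threshold value and tissue mask are assumptions, and whole-slide pipelines additionally tile and stitch the image.

```python
import numpy as np

def positive_pixel_percentage(stain_channel: np.ndarray,
                              tissue_mask: np.ndarray,
                              threshold: float = 0.3) -> float:
    """Percentage of tissue-area pixels whose staining intensity exceeds a
    threshold, analogous in spirit to the custom CD3+% measure."""
    positive = (stain_channel >= threshold) & tissue_mask
    return 100.0 * positive.sum() / max(tissue_mask.sum(), 1)

# Toy example: a synthetic 0-1 "stain intensity" image and a tissue mask.
rng = np.random.default_rng(1)
stain = rng.random((512, 512))
mask = np.ones_like(stain, dtype=bool)
print(f"CD3+ pixel fraction: {positive_pixel_percentage(stain, mask):.1f}%")
```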
Chen, Song; Li, Xuena; Chen, Meijie; Yin, Yafu; Li, Na; Li, Yaming
2016-10-01
This study aimed to compare the diagnostic power of quantitative analysis and visual analysis with single time point imaging (STPI) PET/CT and dual time point imaging (DTPI) PET/CT for the classification of solitary pulmonary nodule (SPN) lesions in granuloma-endemic regions. SPN patients who received early and delayed (18)F-FDG PET/CT at 60 min and 180 min post-injection were retrospectively reviewed. Diagnoses were confirmed by pathological results or follow-up. Three quantitative metrics, early SUVmax, delayed SUVmax and retention index (the percentage change between the early SUVmax and delayed SUVmax), were measured for each lesion. Three 5-point scale scores were given by blinded interpretations performed by physicians based on STPI PET/CT images, DTPI PET/CT images and CT images, respectively. ROC analysis was performed on the three quantitative metrics and the three visual interpretation scores. One hundred and forty-nine patients were retrospectively included. The areas under the curve (AUC) of the ROC curves of early SUVmax, delayed SUVmax, RI, the STPI PET/CT score, the DTPI PET/CT score and the CT score were 0.73, 0.74, 0.61, 0.77, 0.75 and 0.76, respectively. There were no significant differences between the AUCs in visual interpretation of STPI PET/CT images and DTPI PET/CT images, nor between early SUVmax and delayed SUVmax. The differences in sensitivity, specificity and accuracy between STPI PET/CT and DTPI PET/CT were not significant in either quantitative analysis or visual interpretation. In granuloma-endemic regions, DTPI PET/CT did not offer significant improvement over STPI PET/CT in differentiating malignant SPNs in either quantitative analysis or visual interpretation. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
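The retention index used above is defined in the abstract as the percentage change between early and delayed SUVmax, which can be computed directly:

```python
def retention_index(early_suvmax: float, delayed_suvmax: float) -> float:
    """Retention index (%) between early and delayed FDG PET acquisitions,
    defined here as the percentage change of SUVmax."""
    return 100.0 * (delayed_suvmax - early_suvmax) / early_suvmax

# Hypothetical SPN lesion measured at 60 min and 180 min post-injection.
print(f"RI = {retention_index(early_suvmax=3.2, delayed_suvmax=4.0):.1f}%")
```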
Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S
2013-01-01
RATIONALE: Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS: The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS: An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm² of additive on spot, with relative standard deviations in the range 3–14%. CONCLUSIONS: The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398
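A sketch of the internal-standard calibration described in the RESULTS section: the analyte-to-internal-standard intensity ratio is regressed against the amount deposited per spot and the fit is inverted for unknowns. The numerical values are invented for illustration.

```python
import numpy as np

# Hypothetical calibration: additive amount per spot (ug) against the ratio
# of analyte to internal-standard ion intensities.
amount_ug = np.array([1, 5, 10, 20, 40, 60, 80], dtype=float)
intensity_ratio = np.array([0.05, 0.24, 0.49, 0.98, 1.95, 2.91, 3.88])

slope, intercept = np.polyfit(amount_ug, intensity_ratio, 1)

def amount_from_ratio(ratio: float) -> float:
    """Estimate additive loading (ug/spot) from a measured intensity ratio.

    Normalizing to the internal standard compensates for spot-to-spot
    variability in DESI sampling before the calibration is applied.
    """
    return (ratio - intercept) / slope

print(f"calibration: ratio = {slope:.4f}*amount + {intercept:.4f}")
print(f"unknown spot with ratio 1.50 -> {amount_from_ratio(1.50):.1f} ug")
```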
NASA Astrophysics Data System (ADS)
Monesi, C.; Meneghini, C.; Bardelli, F.; Benfatto, M.; Mobilio, S.; Manju, U.; Sarma, D. D.
2005-11-01
Hole-doped perovskites such as La1-xCaxMnO3 present special magnetic and magnetotransport properties, and it is commonly accepted that the local atomic structure around Mn ions plays a crucial role in determining these peculiar features. Therefore experimental techniques directly probing the local atomic structure, like x-ray absorption spectroscopy (XAS), have been widely exploited to deeply understand the physics of these compounds. Quantitative XAS analysis usually concerns the extended region [extended x-ray absorption fine structure (EXAFS)] of the absorption spectra. The near-edge region [x-ray absorption near-edge spectroscopy (XANES)] of XAS spectra can provide detailed complementary information on the electronic structure and local atomic topology around the absorber. However, the complexity of the XANES analysis usually prevents a quantitative understanding of the data. This work exploits the recently developed MXAN code to achieve a quantitative structural refinement of the Mn K-edge XANES of LaMnO3 and CaMnO3 compounds; they are the end compounds of the doped manganite series La1-xCaxMnO3. The results derived from the EXAFS and XANES analyses are in good agreement, demonstrating that a quantitative picture of the local structure can be obtained from XANES in these crystalline compounds. Moreover, the quantitative XANES analysis provides topological information not directly achievable from EXAFS data analysis. This work demonstrates that combining the analysis of the extended and near-edge regions of Mn K-edge XAS spectra could provide a complete and accurate description of the Mn local atomic environment in these compounds.
Good practices for quantitative bias analysis.
Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander
2014-12-01
Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage more widespread use of bias analysis to estimate the potential magnitude and direction of biases, as well as the uncertainty in estimates potentially influenced by the biases. © The Author 2014; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.
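As a concrete, if deliberately simple, example of the kind of calculation the authors advocate, the sketch below applies a point correction for non-differential exposure misclassification to a 2x2 table under assumed sensitivity and specificity; a full quantitative bias analysis would also place distributions on the bias parameters, which is omitted here.

```python
def correct_exposure_misclassification(a: int, b: int, c: int, d: int,
                                       se: float, sp: float):
    """Point correction for non-differential exposure misclassification in a
    2x2 table (a, b = exposed/unexposed cases; c, d = exposed/unexposed
    controls), given assumed classification sensitivity and specificity."""
    def correct(exposed, unexposed):
        total = exposed + unexposed
        # Observed exposed = Se*E + (1-Sp)*(N-E)  =>  solve for true E.
        true_exposed = (exposed - total * (1 - sp)) / (se + sp - 1)
        return true_exposed, total - true_exposed

    a_t, b_t = correct(a, b)
    c_t, d_t = correct(c, d)
    observed_or = (a * d) / (b * c)
    corrected_or = (a_t * d_t) / (b_t * c_t)
    return observed_or, corrected_or

obs, corr = correct_exposure_misclassification(a=45, b=55, c=30, d=70,
                                               se=0.85, sp=0.95)
print(f"observed OR = {obs:.2f}, bias-corrected OR = {corr:.2f}")
```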
Quantitative risk analysis of oil storage facilities in seismic areas.
Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto
2005-08-31
Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative study case regarding an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non building-like industrial components, have been crossed with outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in south Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by the loss of containment following seismic action. Results are combined by means of a specific developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injures at any point (x, y) in the analysed area. Finally, a comparison with QRA obtained by considering only process-related top events is reported for reference.
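The core step of crossing fragility curves with PSHA output can be illustrated as a discrete sum over ground-motion bins; the hazard probabilities and fragility parameters below are assumptions, not values from the study.

```python
import numpy as np
from math import erf

# Discretized seismic hazard: annual probability that peak ground
# acceleration (PGA) falls in each bin (illustrative values, not a real PSHA).
pga_bins_g = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
annual_prob = np.array([2e-2, 8e-3, 3e-3, 1e-3, 4e-4])

# Assumed lognormal fragility of an atmospheric steel tank: P(failure | PGA).
median_g, beta = 0.35, 0.45

def fragility(pga):
    z = np.log(pga / median_g) / beta
    return np.array([0.5 * (1.0 + erf(v / np.sqrt(2.0))) for v in z])

# Annual probability of seismically induced loss of containment:
# sum over bins of P(PGA in bin) * P(failure | PGA).
p_failure = float(np.sum(annual_prob * fragility(pga_bins_g)))
print(f"annual probability of seismic tank failure ~ {p_failure:.2e}")
```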
IB-LBM simulation of the haemocyte dynamics in a stenotic capillary.
Yuan-Qing, Xu; Xiao-Ying, Tang; Fang-Bao, Tian; Yu-Hua, Peng; Yong, Xu; Yan-Jun, Zeng
2014-01-01
To study the behaviour of a haemocyte when crossing a stenotic capillary, the immersed boundary-lattice Boltzmann method was used to establish a quantitative analysis model. The haemocyte was assumed to be spherical and to have an elastic cell membrane, which can be driven by blood flow to adopt a highly deformable character. In the stenotic capillary, the spherical blood cell was stressed both by the flow and the wall dimension, and the cell shape was forced to stretch to cross the stenosis. Our simulation investigated the haemocyte crossing process in detail. The velocity and pressure were analysed to obtain information on how blood flows through a capillary and to estimate the degree of cell damage caused by excessive pressure. Quantitative velocity analysis results demonstrated that a large haemocyte crossing a small stenosis would have a noticeable effect on blood flow, while quantitative pressure distribution analysis results indicated that the crossing process would produce a special pressure distribution in the cell interior and, to some extent, a sudden change between the cell interior and the surrounding plasma.
Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun
2015-02-01
Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes. © The Author(s) 2014. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
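One standard pooling procedure that such a meta-analysis might start from is DerSimonian-Laird random-effects estimation; the sketch below uses made-up study-level estimates and variances, and the paper discusses alternatives that behave better when studies are very small.

```python
import numpy as np

def dersimonian_laird(estimates: np.ndarray, variances: np.ndarray):
    """Random-effects (DerSimonian-Laird) pooling of study-level estimates,
    e.g. log-transformed repeatability coefficients from test-retest studies."""
    w = 1.0 / variances                          # fixed-effect weights
    fixed = np.sum(w * estimates) / np.sum(w)
    q = np.sum(w * (estimates - fixed) ** 2)     # Cochran's Q
    k = len(estimates)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (variances + tau2)            # random-effects weights
    pooled = np.sum(w_star * estimates) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

est = np.array([0.18, 0.25, 0.21, 0.30])         # hypothetical study estimates
var = np.array([0.004, 0.009, 0.006, 0.012])     # their within-study variances
pooled, se, tau2 = dersimonian_laird(est, var)
print(f"pooled = {pooled:.3f} (SE {se:.3f}), tau^2 = {tau2:.4f}")
```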
The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...
Chemical analysis and quantitation of the tapetum lucidum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gee, N.A.; Fisher, G.L.; Nash, C.P.
1975-06-01
A study was conducted to provide a basis for the evaluation of the biochemical nature of the (226)Ra alterations of the beagle tapetum. Results indicated that zinc and/or melanin determinations in the tapetum nigrum and tapetum lucidum may allow quantitation of tapetum lucidum tissue without the need for physical separation of the tapetal layers.
Examining the Inclusion of Quantitative Research in a Meta-Ethnographic Review
ERIC Educational Resources Information Center
Booker, Rhae-Ann Richardson
2010-01-01
This study explored how one might extend meta-ethnography to quantitative research for the advancement of interpretive review methods. Using the same population of 139 studies on racial-ethnic matching as data, my investigation entailed an extended meta-ethnography (EME) and comparison of its results to a published meta-analysis (PMA). Adhering to…
Risk Assessment and Integration Team (RAIT) Portfolio Risk Analysis Strategy
NASA Technical Reports Server (NTRS)
Edwards, Michelle
2010-01-01
Impact at management level: Qualitative assessment of risk criticality in conjunction with risk consequence, likelihood, and severity enable development of an "investment policy" towards managing a portfolio of risks. Impact at research level: Quantitative risk assessments enable researchers to develop risk mitigation strategies with meaningful risk reduction results. Quantitative assessment approach provides useful risk mitigation information.
Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia
2016-01-01
Introduction/Background: Fuhrman nuclear grade is the most important histological parameter to predict prognosis in a patient of renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is the measure of aggressiveness of a tumour and it is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. Material and Methods: n=50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, Mouse monoclonal antibody, Dako) and MCM-2 (Mouse monoclonal antibody, Thermo) was performed on the paraffin-embedded blocks in the department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskall-Wallis test was used to determine the correlation of the proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. Results: The labeling index of MCM-2 (median=24.29%) was found to be much higher than that of Ki-67 (median=13.05%). Both markers were significantly related with grade (p=0.00; Kruskall-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of semi-quantitative analysis correlated well with quantitative analysis. Conclusion: Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they can act as surrogate markers for grade in a manner that is more objective and reproducible. PMID:27532114
Zhang, Yin; Diao, Tianxi; Wang, Lei
2014-12-01
Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated in the relevant research in the future. © 2014 Wiley Periodicals, Inc.
Development of iPad application "Postima" for quantitative analysis of the effects of manual therapy
NASA Astrophysics Data System (ADS)
Sugiyama, Naruhisa; Shirakawa, Tomohiro
2017-07-01
The technical difficulty of diagnosing joint misalignment and/or dysfunction by quantitative evaluation is commonly acknowledged among manual therapists. Usually, manual therapists make a diagnosis based on a combination of observing patient symptoms and performing physical examinations, both of which rely on subjective criteria and thus contain some uncertainty. We thus sought to investigate the correlations among posture, skeletal misalignment, and pain severity over the course of manual therapy treatment, and to explore the possibility of establishing objective criteria for diagnosis. For this purpose, we developed an iPad application that realizes the measurement of patients' postures and analyzes them quantitatively. We also discuss the results and effectiveness of the measurement and analysis.
A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.
Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput because they require tedious steps to concentrate the reaction products prior to analysis by standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent, and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentration and drying, have been eliminated to aid the consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties representing hardwoods, softwoods, and grasses.
Henshall, John M; Dierens, Leanne; Sellars, Melony J
2014-09-02
While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.
Zhang, Zhen; Shang, Haihong; Shi, Yuzhen; Huang, Long; Li, Junwen; Ge, Qun; Gong, Juwu; Liu, Aiying; Chen, Tingting; Wang, Dan; Wang, Yanling; Palanga, Koffi Kibalou; Muhammad, Jamshed; Li, Weijie; Lu, Quanwei; Deng, Xiaoying; Tan, Yunna; Song, Weiwu; Cai, Juan; Li, Pengtao; Rashid, Harun or; Gong, Wankui; Yuan, Youlu
2016-04-11
Upland Cotton (Gossypium hirsutum) is one of the most important crops worldwide; it provides natural, high-quality fiber for industrial production and everyday use. Next-generation sequencing is a powerful method for identifying single nucleotide polymorphism markers on a large scale for the construction of a high-density genetic map for quantitative trait loci mapping. In this research, a recombinant inbred line population developed from two upland cotton cultivars, 0-153 and sGK9708, was used to construct a high-density genetic map through the specific locus amplified fragment sequencing method. The high-density genetic map harbored 5521 single nucleotide polymorphism markers which covered a total distance of 3259.37 cM with an average marker interval of 0.78 cM and no gaps larger than 10 cM. In total, 18 quantitative trait loci for boll weight were identified as stable quantitative trait loci, detected in at least three out of 11 environments, and explained 4.15-16.70 % of the observed phenotypic variation. In total, 344 candidate genes were identified within the confidence intervals of these stable quantitative trait loci based on the cotton genome sequence. These genes were categorized based on their function through gene ontology analysis, Kyoto Encyclopedia of Genes and Genomes analysis and eukaryotic orthologous groups analysis. This research reports the first high-density genetic map for Upland Cotton (Gossypium hirsutum) constructed with a recombinant inbred line population using single nucleotide polymorphism markers developed by specific locus amplified fragment sequencing. We also identified quantitative trait loci for boll weight across 11 environments and identified candidate genes within the quantitative trait loci confidence intervals. The results of this research will provide useful information for subsequent work, including fine mapping, gene functional analysis, pyramiding breeding of functional genes, and marker-assisted selection.
Qualitative and quantitative interpretation of SEM image using digital image processing.
Saladra, Dawid; Kopernik, Magdalena
2016-10-01
The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscope micrographs through the development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of the qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of known image processing techniques and combinations of the selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program for digital image processing with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
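A minimal sketch (not the authors' program) of how such crack quantification can be approached: Otsu binarization of an SEM micrograph followed by skeleton-based estimation of total crack length per unit area. The file name and pixel size are illustrative assumptions.

```python
# Otsu binarization of an SEM micrograph and a rough estimate of total crack
# length per unit area via skeletonization; dark pixels are taken as cracks.
import numpy as np
from skimage import io, filters, morphology

PIXEL_SIZE_UM = 0.05                           # assumed pixel size in micrometres

img = io.imread("sem_micrograph.png", as_gray=True)   # placeholder file name
thresh = filters.threshold_otsu(img)           # global Otsu threshold
cracks = img < thresh                          # assumption: cracks are dark

skeleton = morphology.skeletonize(cracks)      # 1-pixel-wide crack traces
crack_length_um = skeleton.sum() * PIXEL_SIZE_UM   # crude length: pixel count x size
area_um2 = img.size * PIXEL_SIZE_UM ** 2
print(f"Total crack length per unit area: {crack_length_um / area_um2:.4f} 1/um")
```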
Benefit-risk analysis : a brief review and proposed quantitative approaches.
Holden, William L
2003-01-01
Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
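As a rough illustration of the decision rule described above (NNT compared against a relative-value adjusted number-needed-to-harm), the sketch below uses hypothetical event rates and one plausible reading of how the relative value scales the NNH; it is not the paper's worked example.

```python
# Illustrative arithmetic only (hypothetical numbers): compare NNT against a
# relative-value adjusted number-needed-to-harm.
def nnt(control_event_rate, treated_event_rate):
    """Number needed to treat = 1 / absolute risk reduction."""
    return 1.0 / (control_event_rate - treated_event_rate)

def rv_adjusted_nnh(adverse_rate_treated, adverse_rate_control, relative_value):
    """NNH = 1 / absolute risk increase, scaled by an assumed relative-value
    (utility) weight expressing patient preferences; this scaling is one
    plausible reading, not the paper's exact formula."""
    nnh = 1.0 / (adverse_rate_treated - adverse_rate_control)
    return nnh * relative_value

nnt_value = nnt(0.40, 0.25)                      # benefit: 15% absolute risk reduction
rv_nnh_value = rv_adjusted_nnh(0.08, 0.03, 0.5)  # harm: 5% risk increase, RV = 0.5
print(nnt_value, rv_nnh_value, "treat" if nnt_value < rv_nnh_value else "do not treat")
```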
Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.
Pang, Wei; Coghill, George M
2015-05-01
In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd.. All rights reserved.
NASA Astrophysics Data System (ADS)
Qi, Pan; Shao, Wenbin; Liao, Shusheng
2016-02-01
For research on the quantitative detection of defects in heat transfer tubes in nuclear power plants (NPP), two lines of work were carried out with cracks as the main research object. (1) Optimization of calibration tube production. First, ASME, RSEM and in-house crack calibration tubes were applied to quantitatively analyze defect depths on other designed crack test tubes, and the calibration tube yielding the more accurate quantitative results was identified. Based on that, a weighting analysis of the factors influencing quantitative crack-depth testing, such as crack orientation, length and volume, can be undertaken, which will optimize the manufacturing technology of calibration tubes. (2) Quantitative optimization of crack depth. A neural network model with multiple calibration curves, adopted to refine the measured depths of natural cracks in in-service tubes, shows a preliminary ability to improve quantitative accuracy.
Qualitative and Quantitative Analyses of Glycogen in Human Milk.
Matsui-Yatsuhashi, Hiroko; Furuyashiki, Takashi; Takata, Hiroki; Ishida, Miyuki; Takumi, Hiroko; Kakutani, Ryo; Kamasaka, Hiroshi; Nagao, Saeko; Hirose, Junko; Kuriki, Takashi
2017-02-22
Identification and detailed analysis of glycogen in human milk have not previously been reported. The present study confirmed that glycogen is contained in human milk by qualitative and quantitative analyses. High-performance anion exchange chromatography (HPAEC) and high-performance size exclusion chromatography with a multiangle laser light scattering detector (HPSEC-MALLS) were used for qualitative analysis of glycogen in human milk. Quantitative analysis was carried out using samples obtained from individual milks. The results revealed that the concentration of glycogen in human milk varied depending on the mother's condition, such as the period postpartum and inflammation. The amounts of glycogen in human milk collected at 0 and 1-2 months postpartum were higher than in milk collected at 3-14 months postpartum. In milk from mothers with severe mastitis, the concentration of glycogen was about 40 times higher than that in normal milk.
NASA Astrophysics Data System (ADS)
Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.
2016-01-01
In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
Dulohery, Kate; Papavdi, Asteria; Michalodimitrakis, Manolis; Kranioti, Elena F
2012-11-01
Coronary artery atherosclerosis is a hugely prevalent condition in the Western world and is often encountered during autopsy. Atherosclerotic plaques can cause luminal stenosis which, if above a significant level (75%), is considered to contribute to the cause of death. Stenosis can be estimated macroscopically by the forensic pathologist at the time of autopsy or by microscopic examination. This study compares macroscopic estimation with quantitative microscopic image analysis, with a particular focus on the assessment of significant stenosis (>75%). A total of 131 individuals were analysed. The sample consists of an atherosclerotic group (n=122) and a control group (n=9). The results of the two methods were significantly different from each other (p=0.001), and the macroscopic method gave a greater percentage stenosis by an average of 3.5%. Also, histological examination of coronary artery stenosis yielded a difference in significant stenosis in 11.5% of cases. The differences were attributed to underestimation by histological quantitative image analysis, overestimation by gross examination, or a combination of both. The underestimation may have come from tissue shrinkage during tissue processing for histological specimens. The overestimation in the macroscopic assessment can be attributed to the lumen shape, to examiner observer error, or to a possible bias towards diagnosing coronary disease when no other cause of death is apparent. The results indicate that the macroscopic estimation is open to more biases and that histological quantitative image analysis only gives a precise assessment of stenosis ex vivo. Once tissue shrinkage, if any, is accounted for, histological quantitative image analysis will yield a more accurate assessment of in vivo stenosis. It may then be considered a complementary tool for the examination of coronary stenosis. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
Remote sensing and spectral analysis of plumes from ocean dumping in the New York Bight Apex
NASA Technical Reports Server (NTRS)
Johnson, R. W.
1980-01-01
The application of the remote sensing techniques of aerial photography and multispectral scanning in the qualitative and quantitative analysis of plumes from ocean dumping of waste materials is investigated in the New York Bight Apex. Plumes resulting from the dumping of acid waste and sewage sludge were observed by Ocean Color Scanner at an altitude of 19.7 km and by Modular Multispectral Scanner and mapping camera at an altitude of 3.0 km. Results of the qualitative analysis of multispectral and photographic data for the mapping, location, and identification of pollution features without concurrent sea truth measurements are presented which demonstrate the usefulness of in-scene calibration. Quantitative distributions of the suspended solids in sewage sludge released in spot and line dumps are also determined by a multiple regression analysis of multispectral and sea truth data.
ERIC Educational Resources Information Center
Anderson, James L.; And Others
1980-01-01
Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive and comparatively simple relative to other titration experiments. (CS)
NASA Technical Reports Server (NTRS)
Carpenter, Paul; Curreri, Peter A. (Technical Monitor)
2002-01-01
This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.
Wang, Tong; Wu, Hai-Long; Xie, Li-Xia; Zhu, Li; Liu, Zhi; Sun, Xiao-Dong; Xiao, Rong; Yu, Ru-Qin
2017-04-01
In this work, a smart chemometrics-enhanced strategy, high-performance liquid chromatography with diode array detection coupled with a second-order calibration method based on the alternating trilinear decomposition algorithm, was proposed to simultaneously quantify 12 polyphenols in different kinds of apple peel and pulp samples. The proposed strategy proved to be a powerful tool for solving the problems of coelution, unknown interferences, and chromatographic shifts in high-performance liquid chromatography analysis, making it possible to determine the 12 polyphenols in complex apple matrices within 10 min under simple elution conditions. The average recoveries with standard deviations, and figures of merit including sensitivity, selectivity, limit of detection, and limit of quantitation, were calculated to validate the accuracy of the proposed method. Compared with the quantitative analysis results from the classic high-performance liquid chromatography method, statistical and graphical analysis showed that our proposed strategy obtained more reliable results. All results indicated that our proposed method for the quantitative analysis of apple polyphenols is accurate, fast, universal, simple, and green, and it is expected to be developed as an attractive alternative method for simultaneous determination of multitargeted analytes in complex matrices. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Using the iPhone as a device for a rapid quantitative analysis of trinitrotoluene in soil.
Choodum, Aree; Kanatharana, Proespichaya; Wongniramaikul, Worawit; Daeid, Niamh Nic
2013-10-15
Mobile 'smart' phones have become almost ubiquitous in society and are typically equipped with a high-resolution digital camera which can be used to produce an image very conveniently. In this study, the built-in digital camera of a smart phone (iPhone) was used to capture the results from a rapid quantitative colorimetric test for trinitrotoluene (TNT) in soil. The results were compared to those from a digital single-lens reflex (DSLR) camera. The colored product from the selective test for TNT was quantified using an innovative application of photography where the relationships between the Red Green Blue (RGB) values and the concentrations of colorimetric product were exploited. The iPhone showed itself to be capable of being used more conveniently than the DSLR while providing similar analytical results with increased sensitivity. The wide linear range and low detection limits achieved were comparable with those from spectrophotometric quantification methods. Low relative errors in the range of 0.4 to 6.3% were achieved in the analysis of control samples and 0.4-6.2% for spiked soil extracts with good precision (2.09-7.43% RSD) for the analysis over 4 days. The results demonstrate that the iPhone provides the potential to be used as an ideal novel platform for the development of a rapid on site semi quantitative field test for the analysis of explosives. © 2013 Elsevier B.V. All rights reserved.
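A hedged sketch of the general idea, not the published calibration: fit a straight line between a mean color-channel value extracted from the camera image and known TNT standard concentrations, then predict an unknown sample. All numbers are made up for illustration.

```python
# Linear calibration between a mean green-channel intensity and TNT
# concentration; the channel choice and data values are illustrative only.
import numpy as np

conc_mg_l = np.array([0.0, 2.0, 4.0, 8.0, 16.0])          # standard concentrations
green_mean = np.array([212.0, 198.0, 183.0, 155.0, 99.0])  # hypothetical channel means

slope, intercept = np.polyfit(green_mean, conc_mg_l, 1)    # fit calibration line
unknown_green = 170.0                                      # measured unknown sample
print("Estimated TNT:", slope * unknown_green + intercept, "mg/L")
```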
Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M
2018-06-05
Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one unit BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was on average higher in cases compared to controls (1.4 versus 1.2, p < 0.007 for both operators). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69, p < 0.0001, for each paired combination of two operators and two radiologists). Interoperator and intraoperator agreement in the quantitative BPU measure, assessed by intraclass correlation, was 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. In a model adjusted for body mass index and percent density, higher quantitative BPU was associated with increased risk of breast cancer for both operators (OR = 4.0, 95% confidence interval (CI) 1.6-10.1, and 2.4, 95% CI 1.2-4.7). Quantitative measurement of BPU, defined as the ratio of average counts in fibroglandular tissue relative to that in fat, can be reliably performed by nonradiologist operators with a simple region-of-interest analysis tool. Similar to results obtained with subjective BPU categories, quantitative BPU is a functional imaging biomarker of breast cancer risk, independent of mammographic density and hormonal factors.
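A minimal sketch of the quantitative BPU definition given above: the ratio of mean counts per pixel in a fibroglandular-tissue ROI to that in a fat ROI on the MBI image. The ROI masks and file names are placeholders, not part of the published tool.

```python
# Quantitative BPU = mean counts/pixel in fibroglandular ROI / mean in fat ROI.
import numpy as np

mbi_image = np.load("mbi_counts.npy")            # counts per pixel (assumed file)
fibroglandular_mask = np.load("fg_mask.npy")     # boolean ROI mask (assumed)
fat_mask = np.load("fat_mask.npy")               # boolean ROI mask (assumed)

bpu = mbi_image[fibroglandular_mask].mean() / mbi_image[fat_mask].mean()
print(f"Quantitative BPU = {bpu:.2f}")           # reported values above ranged 0.4-3.2
```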
LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.
Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei
2012-12-01
Database searching based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram based on the identification information, which can limit the search space and thus make the data processing much faster. The random effect of the MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on the replication data set and the UPS1 standard data set. The results show that LFQuant performs better than them in terms of both precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
[A comparison of convenience sampling and purposive sampling].
Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien
2014-06-01
Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation not by statistical power analysis.
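A small numeric illustration of the point that increasing sample size increases statistical power; the effect size and alpha below are arbitrary assumptions for the example.

```python
# Power of a two-sample t-test for several per-group sample sizes.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
for n_per_group in (20, 50, 100, 200):
    power = analysis.power(effect_size=0.5, nobs1=n_per_group, alpha=0.05)
    print(f"n = {n_per_group:>3} per group -> power = {power:.2f}")
```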
Zhang, Xiaoyu; Mei, Xueran; Wang, Zhanguo; Wu, Jing; Liu, Gang; Hu, Huiling; Li, Qijuan
2018-05-24
Docynia dcne leaf, from the genus Docynia Dcne (comprising the three species Docynia delavayi, Docynia indica and Docynia longiunguis), is an important raw material for local ethnic minority tea, ethnomedicines and food supplements in southwestern areas of China. However, D. dcne leaves from these three species are often confused and used interchangeably, which could influence their therapeutic effect. A rapid and effective method combining chemical fingerprinting and quantitative analysis was established to evaluate the quality of D. dcne leaves. Chemometric methods, including similarity analysis, hierarchical cluster analysis and partial least-squares discrimination analysis, were applied to distinguish 30 batches of D. dcne leaf samples from these three species. The results of these analyses corroborated one another and successfully grouped the samples into three categories closely related to the species of the D. dcne leaves. Moreover, isoquercitrin and phlorizin were screened as the chemical markers to evaluate the quality of D. dcne leaves from different species. The contents of isoquercitrin and phlorizin varied remarkably among the samples, with ranges of 6.41-38.84 and 95.73-217.76 mg/g, respectively. All the results indicated that an integrated method of chemical fingerprinting coupled with chemometric analysis and quantitative assessment is a powerful and beneficial tool for quality control of D. dcne leaves, and could also be applied to the differentiation and quality control of other herbal preparations.
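A hedged sketch of the hierarchical-cluster-analysis step mentioned above: grouping batches by the similarity of their chromatographic fingerprints. The fingerprint matrix and the choice of three clusters are assumptions for illustration.

```python
# Agglomerative clustering of batch fingerprints (rows = batches, columns = peak areas).
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

fingerprints = np.load("fingerprints.npy")        # assumed (n_batches, n_peaks) array
Z = linkage(fingerprints, method="ward")          # hierarchical cluster analysis
labels = fcluster(Z, t=3, criterion="maxclust")   # cut the tree into three groups
print(labels)                                     # batch-to-group assignments
```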
Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board
2017-03-01
This Naval Postgraduate School thesis (Monterey, California; approved for public release, distribution unlimited) uses quantitative variables to analyze high-quality officer selection by the Commandant's Career-Level Education Board. Noted limitations include that Marines are evaluated before the end of their initial service commitment, that the data do not provide detailed information on why, and that the photograph analysis in the research is strictly limited to a quantitative analysis.
Accuracy of a remote quantitative image analysis in the whole slide images.
Słodkowska, Janina; Markiewicz, Tomasz; Grala, Bartłomiej; Kozłowski, Wojciech; Papierz, Wielisław; Pleskacz, Katarzyna; Murawski, Piotr
2011-03-30
The rationale for choosing a remote quantitative method to support a diagnostic decision requires empirical studies and knowledge of scenarios including valid telepathology standards. Tumours of the central nervous system [CNS] are graded on the basis of morphological features and the Ki-67 labelling index [Ki-67 LI]. Various methods have been applied for Ki-67 LI estimation. Recently we introduced the Computerized Analysis of Medical Images [CAMI] software for automated Ki-67 LI counting in digital images. The aim of our study was to explore the accuracy and reliability of remote assessment of the Ki-67 LI with CAMI software applied to whole slide images [WSI]. The WSI representing CNS tumours, 18 meningiomas and 10 oligodendrogliomas, were stored on the server of the Warsaw University of Technology. The digital copies of entire glass slides were created automatically by the Aperio ScanScope CS with a 20x or 40x objective. Aperio's ImageScope software provided the functionality for remote viewing of the WSI. The Ki-67 LI assessment was carried out within 2 out of 20 selected fields of view (40x objective) representing the highest labelling areas in each WSI. The Ki-67 LI counting was performed by three methods: 1) manual reading in the light microscope (LM), 2) automated counting with CAMI software on the digital images (DI), and 3) remote quantitation on the WSI (WSI method). The quality of the WSI and the technical efficiency of the on-line system were analysed. A comparative statistical analysis was performed for the results obtained by the three methods of Ki-67 LI counting. The preliminary analysis showed that in 18% of the WSI the results of the Ki-67 LI differed from those obtained by the other two methods of counting when the quality of the glass slides was below the standard range. The results of our investigations indicate that remote automated Ki-67 LI analysis performed with the CAMI algorithm on whole slide images of meningiomas and oligodendrogliomas can be successfully used as an alternative to manual reading as well as to digital image quantitation with CAMI software. According to our observations, remote supervision/consultation and training are necessary for the effective use of remote quantitative analysis of WSI.
A method for the extraction and quantitation of phycoerythrin from algae
NASA Technical Reports Server (NTRS)
Stewart, D. E.
1982-01-01
A new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is summarized. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship between concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
TV News Analysis Project Motivates Broadcast Class.
ERIC Educational Resources Information Center
Smith, James R.
1980-01-01
Describes the use of content analysis by a journalism class in studying television news. Indicates that the method is flexible, generates familiarity with quantitative approaches to the analysis of broadcast journalism, can result in increased awareness of the complexity of the broadcast news medium, and increases student motivation. (TJ)
NASA Technical Reports Server (NTRS)
Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.
2003-01-01
BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.
Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis.
Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L; Hwang, Jae Youn
2016-12-01
We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis.
Analysis of facility needs level in architecture studio for students’ studio grades
NASA Astrophysics Data System (ADS)
Lubis, A. S.; Hamid, B.; Pane, I. F.; Marpaung, B. O. Y.
2018-03-01
Architects must be able to play an active role and contribute to the realization of a sustainable environment, and architectural education has inherited much of this responsibility. This research used qualitative and quantitative methods. The data were gathered by conducting (a) observation, (b) interviews, (c) documentation, (d) literature study, and (e) a questionnaire. The gathered data were analyzed qualitatively to find out what equipment is needed in the learning process in the Architecture Studio, USU. Questionnaires and Microsoft Excel were used for the quantitative analysis. The tabulation of quantitative data was correlated with the students' studio grades. The results of the research showed that the equipment with the highest level of need was (1) drawing tables, (2) a special room for each student, (3) an internet network, (4) air conditioning, and (5) sufficient lighting.
Lee, Dong-Yup; Yun, Hongsoek; Park, Sunwon; Lee, Sang Yup
2003-11-01
MetaFluxNet is a program package for managing information on metabolic reaction networks and for quantitatively analyzing metabolic fluxes in an interactive and customized way. It allows users to interpret and examine metabolic behavior in response to genetic and/or environmental modifications. As a result, quantitative in silico simulations of metabolic pathways can be carried out to understand the metabolic status and to design metabolic engineering strategies. The main features of the program include a well-developed model construction environment, a user-friendly interface for metabolic flux analysis (MFA), comparative MFA of strains having different genotypes under various environmental conditions, and automated pathway layout creation. MetaFluxNet is available at http://mbel.kaist.ac.kr/, and a manual is available as a PDF file.
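A toy illustration of the metabolic flux analysis idea that such tools support, not MetaFluxNet itself: at steady state S·v = 0, and with some fluxes measured the remaining fluxes can be estimated by least squares. The network and measured values are invented.

```python
# Toy MFA: two balanced metabolites (A, B), four fluxes; v1 and v4 "measured".
import numpy as np

# Columns: v1 (uptake), v2, v3 (branch), v4 (secretion); rows: metabolites A, B.
S = np.array([[1, -1, -1,  0],
              [0,  1,  0, -1]], dtype=float)

measured = {0: 10.0, 3: 4.0}                    # measured fluxes (arbitrary units)
free_idx = [i for i in range(S.shape[1]) if i not in measured]

# Move measured columns to the right-hand side: S_free @ v_free = -S_meas @ v_meas
rhs = -S[:, list(measured)] @ np.array(list(measured.values()))
v_free, *_ = np.linalg.lstsq(S[:, free_idx], rhs, rcond=None)
print(dict(zip(free_idx, np.round(v_free, 3))))  # expected: v2 = 4.0, v3 = 6.0
```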
NASA Astrophysics Data System (ADS)
Shiina, Tsuyoshi; Maki, Tomonori; Yamakawa, Makoto; Mitake, Tsuyoshi; Kudo, Masatoshi; Fujimoto, Kenji
2012-07-01
Precise evaluation of the stage of chronic hepatitis C with respect to fibrosis has become an important issue in preventing the occurrence of cirrhosis and initiating appropriate therapeutic intervention such as viral eradication using interferon. Ultrasound tissue elasticity imaging, i.e., elastography, can visualize tissue hardness/softness, and its clinical usefulness has been studied for detecting and evaluating tumors. We have recently reported that the texture of the elasticity image changes as fibrosis progresses. To evaluate fibrosis progression quantitatively on the basis of ultrasound tissue elasticity imaging, we introduced a mechanical model of fibrosis progression, simulated the process by which hepatic fibrosis affects elasticity images, and compared the results with those of clinical data analysis. As a result, it was confirmed that even in diffuse diseases like chronic hepatitis, the patterns of elasticity images are related to fibrous structural changes caused by hepatic disease and can be used to derive features for quantitative evaluation of the fibrosis stage.
Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D
2016-01-01
Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into diseases pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
Broadband external cavity quantum cascade laser based sensor for gasoline detection
NASA Astrophysics Data System (ADS)
Ding, Junya; He, Tianbo; Zhou, Sheng; Li, Jinsong
2018-02-01
A new type of tunable diode spectroscopy sensor, based on an external cavity quantum cascade laser (ECQCL) and a quartz crystal tuning fork (QCTF), was used for quantitative analysis of volatile organic compounds. In this work, the sensor system was tested on the analysis of different gasoline samples. For signal processing, a self-developed interpolation algorithm and a multiple linear regression model were used for quantitative analysis of the major volatile organic compounds in the gasoline samples. The results were very consistent with the standard spectra taken from the Pacific Northwest National Laboratory (PNNL) database. In the future, the ECQCL sensor will be used for trace explosive, chemical warfare agent, and toxic industrial chemical detection and spectroscopic analysis.
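A hedged sketch of the multiple-linear-regression step: model the measured gasoline absorbance spectrum as a linear combination of reference spectra of individual components (for example from a database such as PNNL) and estimate the coefficients by least squares. File names and component names are assumptions.

```python
# Classical least-squares fit of a measured spectrum to reference component spectra.
import numpy as np

measured = np.load("gasoline_spectrum.npy")       # absorbance vs. wavenumber (assumed)
references = np.load("reference_spectra.npy")     # assumed shape (n_points, n_components)

coeffs, residual, *_ = np.linalg.lstsq(references, measured, rcond=None)
for name, c in zip(["toluene", "xylene", "ethylbenzene"], coeffs):   # hypothetical names
    print(f"{name}: relative contribution {c:.3f}")
```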
NASA Astrophysics Data System (ADS)
Melelli, Laura; Liucci, Luisa; Vergari, Francesca; Ciccacci, Sirio; Del Monte, Maurizio
2014-05-01
Drainage basins are primary landscape units for geomorphological investigations. Both hillslopes and the river drainage system are fundamental components in drainage basin analysis. Like other geomorphological systems, drainage basins tend toward an equilibrium condition in which the sequence of erosion, transport and sedimentation approaches a state of minimum energy expenditure. This state is revealed by a typical geometry of landforms and of the drainage net. Several morphometric indexes can measure how far a drainage basin is from the theoretical equilibrium configuration, revealing possible external disturbance. In tectonically active areas, drainage basins are of primary importance in highlighting the style, amount and rate of tectonic impulses, and morphometric indexes allow the tectonic activity classes of different sectors in a study area to be estimated. Moreover, drainage networks are characterized by a self-similar structure; this motivates the use of fractal theory to investigate the system. In this study, fractal techniques are employed together with quantitative geomorphological analysis to study the Upper Tiber Valley (UTV), a tectonic intermontane basin located in the northern Apennines (Umbria, central Italy). The area is the result of different tectonic phases. From the Late Pliocene to the present, the UTV has been strongly controlled by a regional uplift and by an extensional phase, with different sets of normal faults playing a fundamental role in basin morphology. Thirty-four basins are taken into account for the quantitative analysis, twenty on the left side of the basin and the others on the right side. Using the fractal dimension of the drainage networks, results from Horton's laws, concavity and steepness indexes, and hypsometric curves, this study aims to obtain an evolutionary model of the UTV in which the uplift is compared to local subsidence induced by normal fault activity. The results highlight a well defined difference between the western and eastern tributary basins, suggesting a greater disequilibrium in the latter. The quantitative analysis points out the segments of the basin boundaries where fault activity is more efficient, and the resulting geomorphological implications.
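A minimal box-counting sketch for the fractal dimension of a drainage network raster, a standard technique rather than the authors' exact procedure; the input file is a placeholder.

```python
# Box-counting fractal dimension of a boolean channel-network raster.
import numpy as np

network = np.load("drainage_network.npy").astype(bool)   # assumed channel mask

sizes, counts = [], []
for box in (2, 4, 8, 16, 32, 64):
    # count box x box cells containing at least one channel pixel
    h = network.shape[0] // box * box
    w = network.shape[1] // box * box
    blocks = network[:h, :w].reshape(h // box, box, w // box, box)
    occupied = blocks.any(axis=(1, 3)).sum()
    sizes.append(box)
    counts.append(occupied)

# Fractal dimension = negative slope of log(count) versus log(box size)
slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
print("Estimated fractal dimension:", -slope)
```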
Gao, Meng; Wang, Yuesheng; Wei, Huizhen; Ouyang, Hui; He, Mingzhen; Zeng, Lianqing; Shen, Fengyun; Guo, Qiang; Rao, Yi
2014-06-01
A method was developed for the determination of amygdalin and its metabolite prunasin in rat plasma after intragastric administration of Maxing shigan decoction. The analytes were identified by ultra-high performance liquid chromatography-tandem quadrupole time-of-flight mass spectrometry and quantitatively determined by ultra-high performance liquid chromatography-tandem triple quadrupole mass spectrometry. After purification by liquid-liquid extraction, the qualitative analysis of amygdalin and prunasin in the plasma sample was performed on a Shim-pack XR-ODS III HPLC column (75 mm x 2.0 mm, 1.6 microm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on a Triple TOF 5600 quadrupole time-of-flight mass spectrometer. The quantitative analysis of amygdalin and prunasin in the plasma sample was performed by separation on an Agilent C18 HPLC column (50 mm x 2.1 mm, 1.7 microm), using acetonitrile-0.1% (v/v) formic acid aqueous solution. The detection was performed on an AB Q-TRAP 4500 triple quadrupole mass spectrometer utilizing an electrospray ionization (ESI) interface operated in negative ion mode and multiple-reaction monitoring (MRM) mode. The qualitative analysis results showed that amygdalin and its metabolite prunasin were detected in the plasma sample. The quantitative analysis results showed that the linear range of amygdalin was 1.05-4200 ng/mL with a correlation coefficient of 0.9990, and the linear range of prunasin was 1.25-2490 ng/mL with a correlation coefficient of 0.9970. The method had good precision, with relative standard deviations (RSDs) lower than 9.20%, and the overall recoveries varied from 82.33% to 95.25%. The limits of detection (LODs) of amygdalin and prunasin were 0.50 ng/mL. With good reproducibility, the method is simple, fast and effective for the qualitative and quantitative analysis of amygdalin and prunasin in plasma samples of rats administered Maxing shigan decoction.
NASA Technical Reports Server (NTRS)
Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.
2010-01-01
Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparative components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study will discuss the subsystem requirements allocation process, describe the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and present findings and observations based on analysis leading to the Ground Operations Project Preliminary Design Review milestone.
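An illustrative sketch of how steady-state availability estimates of this kind are commonly built from MTBF and MTTR, with independent subsystems combined in series; the subsystem names and numbers are hypothetical, not Ground Operations Project data.

```python
# Steady-state availability A = MTBF / (MTBF + MTTR), combined in series.
def availability(mtbf_hours, mttr_hours):
    return mtbf_hours / (mtbf_hours + mttr_hours)

subsystems = {                      # hypothetical ground subsystems and figures
    "propellant_loading": availability(2000.0, 8.0),
    "environmental_control": availability(5000.0, 4.0),
    "power": availability(10000.0, 2.0),
}

series_availability = 1.0
for name, a in subsystems.items():
    print(f"{name}: A = {a:.4f}")
    series_availability *= a        # all subsystems must be up to support launch
print(f"Combined series availability: {series_availability:.4f}")
```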
Automatic registration of ICG images using mutual information and perfusion analysis
NASA Astrophysics Data System (ADS)
Kim, Namkug; Seo, Jong-Mo; Lee, June-goo; Kim, Jong Hyo; Park, Kwangsuk; Yu, Hyeong-Gon; Yu, Young Suk; Chung, Hum
2005-04-01
Introduction: Indocyanine green fundus angiography (ICGA) of the eye is a useful method for detecting and characterizing choroidal neovascularization (CNV), which is the major cause of blindness over 65 years of age. To investigate the quantitative analysis of blood flow on ICGA, a systematic approach to automatic registration using mutual information, together with a quantitative analysis, was developed. Methods: Intermittent sequential images of indocyanine green angiography were acquired by Heidelberg retinal angiography, which uses a laser scanning system for image acquisition. Misalignment of each image, generated by minute eye movements of the patients, was corrected by the mutual information method, because the distribution of the contrast medium in the image changes throughout the time sequence. Several regions of interest (ROIs) were selected by a physician and the intensities of the selected regions were plotted over the time sequence. Results: The registration of ICGA time-sequential images required not only a translational transform but also a rotational transform. Signal intensities showed variation based on a gamma-variate function depending on the ROI, and capillary vessels showed more variance in signal intensity than major vessels. CNV showed intermediate variance in signal intensity and a prolonged transit time. Conclusion: The resulting registered images can be used not only for quantitative analysis, but also for perfusion analysis. Various investigative approaches to CNV using this method will be helpful in the characterization of the lesion and in follow-up.
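A minimal sketch of the similarity metric named above: mutual information of two image frames computed from their joint grey-level histogram. A registration routine would search translations and rotations that maximize this value; the frames below are placeholders.

```python
# Mutual information of two images from their joint histogram.
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                       # joint probability
    px = pxy.sum(axis=1, keepdims=True)             # marginals
    py = pxy.sum(axis=0, keepdims=True)
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))

# Placeholder frames standing in for two ICGA time points loaded elsewhere.
frame_ref = np.random.rand(256, 256)
frame_moving = np.random.rand(256, 256)
print("MI =", mutual_information(frame_ref, frame_moving))
```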
The simultaneous quantitation of ten amino acids in soil extracts by mass fragmentography
NASA Technical Reports Server (NTRS)
Pereira, W. E.; Hoyano, Y.; Reynolds, W. E.; Summons, R. E.; Duffield, A. M.
1972-01-01
A specific and sensitive method for the identification and simultaneous quantitation by mass fragmentography of ten of the amino acids present in soil was developed. The technique uses a computer-driven quadrupole mass spectrometer, and a commercial preparation of deuterated amino acids is used as the internal standard for purposes of quantitation. The results obtained are comparable with those from an amino acid analyzer. In the quadrupole mass spectrometer-computer system, up to 25 pre-selected ions may be monitored sequentially. This allows a maximum of 12 different amino acids (one specific ion in each of the undeuterated and deuterated amino acid spectra) to be quantitated. The method is relatively rapid (analysis time of approximately one hour) and is capable of the quantitation of nanogram quantities of amino acids.
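Illustrative arithmetic for the deuterated-internal-standard approach described above; the ion areas, spike amount, and response factor are hypothetical, not the paper's data.

```python
# Analyte amount from the ratio of analyte to deuterated-standard ion areas.
def amount_ng(area_analyte, area_d_standard, amount_d_standard_ng, response_factor=1.0):
    """Isotope-dilution style quantitation against a deuterated internal standard."""
    return (area_analyte / area_d_standard) * amount_d_standard_ng / response_factor

# Hypothetical example: an amino acid quantified against its deuterated analog
# spiked at 50 ng, with an assumed relative response factor of 0.95.
print(amount_ng(area_analyte=12500.0, area_d_standard=20000.0,
                amount_d_standard_ng=50.0, response_factor=0.95), "ng")
```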
NASA Astrophysics Data System (ADS)
Wardhani, D. K.; Azmi, D. S.; Purnamasari, W. D.
2017-06-01
RW 3 Sukun, Malang, is one of the kampongs that won the kampong environment competition and has managed to maintain the preservation of the kampong. The community of RW 3 Sukun undertakes various activities to manage the environment by optimizing the use of kampong space. Although RW 3 Sukun has conducted environmental management activities, there are several locations in the kampong space that are less well maintained. The purpose of this research was to determine the relation of environmental management to the quality of kampong space in RW 3 Sukun. This research used qualitative and quantitative research approaches. Quantitative research was conducted using descriptive statistical analysis to assess the quality of kampong space with weighting, scoring, and overlay maps. Quantitative research was also conducted on the relation between environmental management and the quality of kampong space, using typology analysis and Pearson correlation analysis. Qualitative research was conducted on the analysis of environmental management and on the relation between environmental management and the quality of kampong space. The results of this research indicate that environmental management in RW 3 Sukun is related to the quality of kampong space.
Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia
2013-11-01
Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS followed by PCA of the ESI-MS data was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of the LC-ESI-MS data were performed. This approach made it possible to promptly highlight metabolite similarities and differences among the various Astragalus spp. The PCA results from the LC-ESI-MS data of the Astragalus samples were in reasonable agreement with both the PCA results of the ESI-MS data and the quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.
Quantitative aspects of inductively coupled plasma mass spectrometry
NASA Astrophysics Data System (ADS)
Bulska, Ewa; Wagner, Barbara
2016-10-01
Accurate determination of elements in various kinds of samples is essential in many areas, including environmental science, medicine and industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of the different calibration approaches used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
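One of the calibration strategies relevant to this kind of quantitation, standard addition, can be sketched as follows; the spike levels and signals are hypothetical.

```python
# Standard-addition sketch: aliquots of the sample are spiked with known
# analyte additions, the signal is regressed against added concentration,
# and the original concentration is the magnitude of the x-intercept.
# Numbers below are hypothetical, not instrument data.
import numpy as np

added_conc = np.array([0.0, 5.0, 10.0, 20.0])      # ng/mL added to aliquots
signal     = np.array([1520, 2490, 3470, 5410])    # detector counts

slope, intercept = np.polyfit(added_conc, signal, 1)
original_conc = intercept / slope                  # x-intercept magnitude
print(f"estimated sample concentration: {original_conc:.1f} ng/mL")
```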
Buenrostro, Jason D.; Chircus, Lauren M.; Araya, Carlos L.; Layton, Curtis J.; Chang, Howard Y.; Snyder, Michael P.; Greenleaf, William J.
2015-01-01
RNA-protein interactions drive fundamental biological processes and are targets for molecular engineering, yet quantitative and comprehensive understanding of the sequence determinants of affinity remains limited. Here we repurpose a high-throughput sequencing instrument to quantitatively measure binding and dissociation of MS2 coat protein to >10(7) RNA targets generated on a flow-cell surface by in situ transcription and inter-molecular tethering of RNA to DNA. We decompose the binding energy contributions from primary and secondary RNA structure, finding that differences in affinity are often driven by sequence-specific changes in association rates. By analyzing the biophysical constraints and modeling mutational paths describing the molecular evolution of MS2 from low- to high-affinity hairpins, we quantify widespread molecular epistasis and a long-hypothesized structure-dependent preference for G:U base pairs over C:A intermediates in evolutionary trajectories. Our results establish quantitative analysis of RNA on a massively parallel array (RNA-MaP) as a means of dissecting sequence-affinity relationships across molecular variants. PMID:24727714
Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis
NASA Technical Reports Server (NTRS)
Shortle, J. F.; Allocco, M.
2005-01-01
Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using qualitative hazard analysis to prioritize the hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity of modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
Cui, Meng; McCooeye, Margaret A; Fraser, Catharine; Mester, Zoltán
2004-12-01
A quantitative method was developed for the analysis of lysergic acid diethylamide (LSD) in urine using atmospheric pressure matrix-assisted laser desorption/ionization ion trap mass spectrometry (AP MALDI-ITMS). Following solid-phase extraction of LSD from urine samples, extracts were analyzed by AP MALDI-ITMS. The identity of LSD was confirmed by fragmentation of the [M + H](+) ion using tandem mass spectrometry. The quantification of LSD was achieved using stable-isotope-labeled LSD (LSD-d(3)) as the internal standard. The [M + H](+) ion fragmented to produce a dominant fragment ion, which was used in a selected reaction monitoring (SRM) method for quantitative analysis of LSD. SRM was compared with selected ion monitoring and produced a wider linear range and a lower limit of quantification. For SRM analysis of samples of LSD spiked into urine, the calibration curve was linear in the range of 1-100 ng/mL with a coefficient of determination, r(2), of 0.9917. This assay was used to determine LSD in urine samples, and the AP MALDI-MS results were comparable to the HPLC/ESI-MS results.
NASA Astrophysics Data System (ADS)
Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin
2016-03-01
How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients with or without bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset of CT images from 59 advanced EOC patients was included. Among them, 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that for all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while there was no significant association for either PFS or OS in the group of patients not receiving maintenance bevacizumab. Therefore, this study demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.
Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.
Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun
2016-12-01
To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.
Interstate waste transport -- Emotions, energy, and environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elcock, D.
1993-12-31
This report applies quantitative analysis to the debate over waste transport and disposal. Moving from emotions and politics back to numbers, this report estimates the potential energy, employment and environmental impacts associated with disposing of a ton of municipal solid waste under three different disposal scenarios that reflect interstate and intrastate options. The results help provide a less emotional, more quantitative look at interstate waste transport restrictions.
Quantitative morphometric analysis for the tectonic characterisation of northern Tunisia.
NASA Astrophysics Data System (ADS)
Camafort, Miquel; Pérez-Peña, José Vicente; Booth-Rea, Guillermo; Ranero, César R.; Gràcia, Eulàlia; Azañón, José Miguel; Melki, Fetheddine; Ouadday, Mohamed
2016-04-01
Northern Tunisia is characterized by low deformation rates and low to moderate seismicity. Although instrumental seismicity reaches maximum magnitudes of Mw 5.5, some historical earthquakes have had catastrophic consequences in this region. Aiming to improve our knowledge of active tectonics in Tunisia, we carried out both a quantitative morphometric analysis and a field study in the north-western region. We applied different morphometric tools, such as river profiles, knickpoint analysis, hypsometric curves and integrals, and drainage pattern anomalies, in order to differentiate between zones with high and low recent tectonic activity. This analysis helps identify uplift and subsidence zones, which we relate to fault activity. Several active faults in a sparse distribution were identified. A selected sector was studied in a field campaign to test the results obtained with the quantitative analysis. During the fieldwork we identified geological evidence of recent activity and a considerable seismogenic potential along the El Alia-Teboursouk (ETF) and Dkhila (DF) faults. The ETF fault could be responsible for one of the most devastating historical earthquakes in northern Tunisia, which destroyed Utique in 412 A.D. The geological evidence includes fluvial terraces folded by faults, striated and cracked pebbles, clastic dikes, sand volcanoes, coseismic cracks, etc. Although not reflected in the instrumental seismicity, our results support an important seismic hazard, evidenced by the several active tectonic structures identified and the two seismogenic faults described. After establishing the current active tectonic framework of Tunisia, we discuss our results within the western Mediterranean, aiming to contribute to the understanding of its tectonic context. We suggest that the main reason for the sparse and scarce seismicity of the area, in contrast with adjacent parts of the Nubia-Eurasia boundary, is its extended continental platform and its lack of proto-oceanic crust to the north.
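One of the morphometric tools mentioned above, the hypsometric integral, has a simple elevation-relief approximation; the sketch below uses hypothetical DEM elevations for a single drainage basin, not data from this study.

```python
# Hypsometric integral (elevation-relief ratio approximation) for a drainage
# basin: HI = (mean - min) / (max - min). Higher values suggest a youthful,
# recently uplifted basin; lower values a more eroded one. The elevations
# below are hypothetical DEM samples.
import numpy as np

elevations = np.array([212., 240., 265., 300., 330., 365., 410., 455., 520.])
hi = (elevations.mean() - elevations.min()) / (elevations.max() - elevations.min())
print(f"hypsometric integral = {hi:.2f}")
```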
A further component analysis for illicit drugs mixtures with THz-TDS
NASA Astrophysics Data System (ADS)
Xiong, Wei; Shen, Jingling; He, Ting; Pan, Rui
2009-07-01
A new method for quantitative analysis of mixtures of illicit drugs with THz time-domain spectroscopy was proposed and verified experimentally. The traditional method requires fingerprints of all the pure chemical components. In practice, often only the target components in a mixture and their absorption features are known, so a more practical technique for detection and identification is needed. Our new method for quantitative inspection of illicit drug mixtures is based on the derivative spectrum. In this method, the ratio of the target components in a mixture can be obtained on the assumption that the target components and their absorption features are known; the unknown components need not be characterized. Methamphetamine and flour, an illicit drug and a common adulterant, were selected for the experiment. The experimental results verified the effectiveness of the method, suggesting that it could be an effective approach for the quantitative identification of illicit drugs. This THz spectroscopy technique is of great significance for real-world quantitative analysis of illicit drugs and could be an effective method in the fields of security and pharmaceutical inspection.
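The derivative-spectrum idea above can be illustrated as a least-squares fit of the mixture's derivative spectrum to the derivative spectra of the known target components; the spectra below are synthetic placeholders and this is a hedged sketch, not the authors' exact algorithm.

```python
# Sketch: estimate the fraction of a known target component (e.g. the drug)
# in a mixture by fitting the mixture's first-derivative spectrum with the
# derivative spectra of the known components. Synthetic data only.
import numpy as np

freq = np.linspace(0.2, 2.6, 500)                       # THz axis
drug = np.exp(-((freq - 1.2) / 0.05) ** 2)              # synthetic drug feature
adulterant = np.exp(-((freq - 1.9) / 0.10) ** 2)        # synthetic adulterant feature
mixture = 0.3 * drug + 0.7 * adulterant + 0.01 * np.random.randn(freq.size)

# Working on derivative spectra suppresses slowly varying baseline offsets.
d = lambda y: np.gradient(y, freq)
A = np.column_stack([d(drug), d(adulterant)])
coeffs, *_ = np.linalg.lstsq(A, d(mixture), rcond=None)
print("estimated fractions:", coeffs / coeffs.sum())
```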
Jia, Weina; Wang, Chunhua; Wang, Yuefei; Pan, Guixiang; Jiang, Miaomiao; Li, Zheng; Zhu, Yan
2015-01-01
Lianhua-Qingwen capsule (LQC) is a commonly used Chinese medical preparation to treat viral influenza and especially played a very important role in the fight against severe acute respiratory syndrome (SARS) in 2002-2003 in China. In this paper, a rapid ultraperformance liquid chromatography coupled with diode-array detector and quadrupole time-of-flight mass spectrometry (UPLC-DAD-QTOF-MS) method was established for qualitative and quantitative analysis of the major constituents of LQC. A total of 61 compounds including flavonoids, phenylpropanoids, anthraquinones, triterpenoids, iridoids, and other types of compounds were unambiguously or tentatively identified by comparing the retention times and accurate mass measurement with reference compounds or literature data. Among them, twelve representative compounds were further quantified as chemical markers in quantitative analysis, including salidroside, chlorogenic acid, forsythoside E, cryptochlorogenic acid, amygdalin, sweroside, hyperin, rutin, forsythoside A, phillyrin, rhein, and glycyrrhizic acid. The UPLC-DAD method was evaluated with linearity, limit of detection (LOD), limit of quantification (LOQ), precision, stability, repeatability, and recovery tests. The results showed that the developed quantitative method was linear, sensitive, and precise for the quality control of LQC. PMID:25654135
Wang, Peng; Liu, Donghui; Gu, Xu; Jiang, Shuren; Zhou, Zhiqiang
2008-01-01
Methods for the enantiomeric quantitative determination of 3 chiral pesticides, paclobutrazol, myclobutanil, and uniconazole, and their residues in soil and water are reported. An effective chiral high-performance liquid chromatographic (HPLC)-UV method using an amylose-tris(3,5-dimethylphenylcarbamate) (AD) column was developed for resolving the enantiomers and quantitative determination. The enantiomers were identified by a circular dichroism detector. Validation involved complete resolution of each of the 2 enantiomers, plus determination of linearity, precision, and limit of detection (LOD). The pesticide enantiomers were isolated by solvent extraction from soil and C18 solid-phase extraction from water. The 2 enantiomers of the 3 pesticides could be completely separated on the AD column using an n-hexane/isopropanol mobile phase. The linearity and precision results indicated that the method was reliable for the quantitative analysis of the enantiomers. LODs were 0.025, 0.05, and 0.05 mg/kg for each enantiomer of paclobutrazol, myclobutanil, and uniconazole, respectively. Recovery and precision data showed that the pretreatment procedures were satisfactory for enantiomer extraction and cleanup. This method can be used for optical purity determination of technical material and analysis of environmental residues.
Shenoy, Shailesh M
2016-07-01
A challenge in any imaging laboratory, especially one that uses modern techniques, is to achieve a sustainable and productive balance between using open source and commercial software to perform quantitative image acquisition, analysis and visualization. In addition to considering the expense of software licensing, one must consider factors such as the quality and usefulness of the software's support, training and documentation. One must also consider the reproducibility with which multiple people generate results using the same software to perform the same analysis, how methods can be distributed to the community using the software, and the potential for achieving automation to improve productivity.
Lin, Yunfeng
2015-01-01
Bacteria such as Salmonella and E. coli present a great challenge to public health care in today's society. Protection of public safety against bacterial contamination and rapid diagnosis of infection require simple and fast assays for the detection and elimination of bacterial pathogens. Using Salmonella DT104 as an example bacterial strain for our investigation, we report a rapid and sensitive assay for the qualitative and quantitative detection of bacteria by using antibody affinity binding, popcorn-shaped gold nanoparticle (GNPOPs) labeling, surface-enhanced Raman spectroscopy (SERS), and inductively coupled plasma mass spectrometry (ICP-MS) detection. For qualitative analysis, our assay can detect Salmonella within 10 min by Raman spectroscopy; for quantitative analysis, our assay can measure as few as 100 Salmonella DT104 in a 1 mL sample (100 CFU/mL) within 40 min. Building on the quantitative detection, we investigated the quantitative destruction of Salmonella DT104 and the assay's photothermal efficiency, with the aim of reducing the amount of GNPOPs in the assay and ultimately eliminating any potential side effects/toxicity to the surrounding cells in vivo. The results suggest that our assay may serve as a promising candidate for the qualitative and quantitative detection and elimination of a variety of bacterial pathogens. PMID:26417447
Nicolotti, Luca; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Sgorbini, Barbara; Rubiolo, Patrizia; Bicchi, Carlo
2013-10-10
The study proposes an investigation strategy that simultaneously provides detailed profiling and quantitative fingerprinting of food volatiles, through a "comprehensive" analytical platform that includes sample preparation by Headspace Solid Phase Microextraction (HS-SPME), separation by two-dimensional comprehensive gas chromatography coupled with mass spectrometry detection (GC×GC-MS) and data processing using advanced fingerprinting approaches. Experiments were carried out on roasted hazelnuts and on Gianduja pastes (sugar, vegetable oil, hazelnuts, cocoa, nonfat dried milk, vanilla flavorings) and demonstrated that the information potential of each analysis can better be exploited if suitable quantitation methods are applied. Quantitation approaches through Multiple Headspace Extraction and Standard Addition were compared in terms of performance parameters (linearity, precision, accuracy, Limit of Detection and Limit of Quantitation) under headspace linearity conditions. The results on 19 key analytes, potent odorants, and technological markers, and more than 300 fingerprint components, were used for further processing to obtain information concerning the effect of the matrix on volatile release, and to produce an informative chemical blueprint for use in sensomics and flavoromics. The importance of quantitation approaches in headspace analysis of solid matrices of complex composition, and the advantages of MHE, are also critically discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
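Multiple Headspace Extraction, one of the quantitation approaches compared above, estimates the total analyte amount from the exponential decay of peak areas over successive extractions of the same vial; the peak areas below are hypothetical and the calculation is a generic sketch of the MHE principle.

```python
# Multiple Headspace Extraction (MHE) sketch: successive extractions give
# exponentially decreasing peak areas A_i = A_1 * q**(i-1), so the total
# (exhaustive) area is the geometric-series sum A_1 / (1 - q), which is then
# converted to amount via calibration. Peak areas below are hypothetical.
import numpy as np

areas = np.array([10500., 7400., 5200., 3650.])     # consecutive extractions
i = np.arange(areas.size)
slope, intercept = np.polyfit(i, np.log(areas), 1)  # ln A_i = ln A_1 + slope*i
q = np.exp(slope)                                   # decay ratio per extraction
total_area = np.exp(intercept) / (1.0 - q)
print(f"q = {q:.3f}, extrapolated total area = {total_area:.0f}")
```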
Hadfield, J D; Nakagawa, S
2010-03-01
Although many of the statistical techniques used in comparative biology were originally developed in quantitative genetics, subsequent development of comparative techniques has progressed in relative isolation. Consequently, many of the new and planned developments in comparative analysis already have well-tested solutions in quantitative genetics. In this paper, we take three recent publications that develop phylogenetic meta-analysis, either implicitly or explicitly, and show how they can be considered as quantitative genetic models. We highlight some of the difficulties with the proposed solutions, and demonstrate that standard quantitative genetic theory and software offer solutions. We also show how results from Bayesian quantitative genetics can be used to create efficient Markov chain Monte Carlo algorithms for phylogenetic mixed models, thereby extending their generality to non-Gaussian data. Of particular utility is the development of multinomial models for analysing the evolution of discrete traits, and the development of multi-trait models in which traits can follow different distributions. Meta-analyses often include a nonrandom collection of species for which the full phylogenetic tree has only been partly resolved. Using missing data theory, we show how the presented models can be used to correct for nonrandom sampling and show how taxonomies and phylogenies can be combined to give a flexible framework with which to model dependence.
Simplex and duplex event-specific analytical methods for functional biotech maize.
Lee, Seong-Hun; Kim, Su-Jeong; Yi, Bu-Young
2009-08-26
Analytical methods are very important in the control of genetically modified organism (GMO) labeling systems and living modified organism (LMO) management for biotech crops. Event-specific primers and probes were developed for qualitative and quantitative analysis of the biotech maize events 3272 and LY 038 on the basis of their 3' flanking regions. The qualitative primers showed specificity, yielding a single PCR product, and a sensitivity of 0.05% as the limit of detection (LOD). Simplex and duplex quantitative methods were also developed using TaqMan real-time PCR. One synthetic plasmid was constructed from two taxon-specific DNA sequences of maize and the two event-specific 3' flanking DNA sequences of events 3272 and LY 038 as reference molecules. In-house validation of the quantitative methods was performed using six levels of mixing samples, from 0.1 to 10.0%. As a result, the biases from the true value and the relative deviations were all within the range of ±30%. Limits of quantitation (LOQs) of the quantitative methods were 0.1% for the simplex real-time PCRs of events 3272 and LY 038 and 0.5% for the duplex real-time PCR of LY 038. This study shows that the event-specific analytical methods are applicable for qualitative and quantitative analysis of the biotech maize events 3272 and LY 038.
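TaqMan real-time PCR quantitation of this kind typically reads unknowns off a standard curve of Ct versus the log of reference-plasmid copy number; the sketch below is a generic standard-curve calculation with hypothetical Ct values, not the validated method itself.

```python
# Generic real-time PCR standard-curve sketch: Ct is linear in log10(quantity),
# so unknowns are read off the fitted line. Ct values are hypothetical.
import numpy as np

log_copies = np.array([5.0, 4.0, 3.0, 2.0])     # log10 plasmid copies per reaction
ct         = np.array([18.2, 21.6, 25.1, 28.4]) # measured Ct per standard

slope, intercept = np.polyfit(log_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # amplification efficiency

ct_unknown = 23.0
log_q = (ct_unknown - intercept) / slope
print(f"efficiency = {efficiency:.2f}, unknown ≈ 10^{log_q:.2f} copies")
```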
Hamamura, Kensuke; Yanagida, Mitsuaki; Ishikawa, Hitoshi; Banzai, Michio; Yoshitake, Hiroshi; Nonaka, Daisuke; Tanaka, Kenji; Sakuraba, Mayumi; Miyakuni, Yasuka; Takamori, Kenji; Nojima, Michio; Yoshida, Koyo; Fujiwara, Hiroshi; Takeda, Satoru; Araki, Yoshihiko
2018-03-01
Purpose: We previously attempted to develop quantitative enzyme-linked immunosorbent assay (ELISA) systems for the PDA039/044/071 peptides, potential serum disease biomarkers (DBMs) of pregnancy-induced hypertension (PIH), primarily identified by a peptidomic approach (BLOTCHIP®-mass spectrometry (MS)). However, our methodology did not extend to PDA071 (cysteinyl α2-HS-glycoprotein(341-367)), due to difficulty in producing a specific antibody against the peptide. The aim of the present study was to establish an alternative PDA071 quantitation system using liquid chromatography-multiple reaction monitoring (LC-MRM)/MS, to explore the potential utility of PDA071 as a DBM for PIH. Methods: We tested heat/acid denaturation methods in efforts to purify serum PDA071 and developed an LC-MRM/MS method allowing for specific quantitation thereof. We measured serum PDA071 concentrations, and these results were validated, including by three-dimensional (3D) plotting against PDA039 (kininogen-1(439-456))/044 (kininogen-1(438-456)) concentrations, followed by discriminant analysis. Results: PDA071 was successfully extracted from serum using a heat denaturation method. Optimum conditions for quantitation via LC-MRM/MS were developed; the assayed serum PDA071 correlated well with the BLOTCHIP® assay values. Although PDA071 alone did not significantly differ between patients and controls, 3D plotting of PDA039/044/071 peptide concentrations and construction of a Jackknife classification matrix were satisfactory in terms of PIH diagnostic precision. Conclusions: Combination analysis using both PDA071 and PDA039/044 concentrations allowed PIH diagnostic accuracy to be attained, and our method will be valuable in future pathophysiological studies of hypertensive disorders of pregnancy.
Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Anwar, Hoda; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2017-01-01
Renewed interest has recently developed in the highly sensitive bone-seeking radiopharmaceutical 18F-NaF. The aim of the present study was to evaluate the potential utility of quantitative analysis of 18F-NaF dynamic PET/CT data in differentiating malignant from benign degenerative lesions in multiple myeloma (MM). 80 MM patients underwent whole-body PET/CT and dynamic PET/CT scanning of the pelvis with 18F-NaF. PET/CT data evaluation was based on visual (qualitative) assessment, semi-quantitative (SUV) calculations, and absolute quantitative estimations after application of a 2-tissue compartment model and a non-compartmental approach leading to the extraction of the fractal dimension (FD). In total, 263 MM lesions were demonstrated on 18F-NaF PET/CT. Semi-quantitative and quantitative evaluations were performed for 25 MM lesions as well as for 25 benign, degenerative and traumatic lesions. Mean SUVaverage for MM lesions was 11.9 and mean SUVmax was 23.2. Respectively, SUVaverage and SUVmax for degenerative lesions were 13.5 and 20.2. Kinetic analysis of 18F-NaF revealed the following mean values for MM lesions: K1 = 0.248 (1/min), k3 = 0.359 (1/min), influx (Ki) = 0.107 (1/min), FD = 1.382, while the respective values for degenerative lesions were: K1 = 0.169 (1/min), k3 = 0.422 (1/min), influx (Ki) = 0.095 (1/min), FD = 1.411. No statistically significant differences between MM and benign degenerative disease regarding SUVaverage, SUVmax, K1, k3 and influx (Ki) were demonstrated. FD was significantly higher in degenerative than in malignant lesions. The present findings show that quantitative analysis of 18F-NaF PET data cannot differentiate malignant from benign degenerative lesions in MM patients, supporting previously published results, which reflect the limited role of 18F-NaF PET/CT in the diagnostic workup of MM. PMID:28913153
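For an irreversible two-tissue compartment model of the kind applied above, the influx constant is commonly computed as Ki = K1·k3/(k2 + k3); the snippet below illustrates that relation with hypothetical rate constants (k2 is not reported in the abstract, so the example value is an assumption).

```python
# Influx constant for an irreversible two-tissue compartment model:
# Ki = K1 * k3 / (k2 + k3). The k2 value below is a hypothetical example,
# chosen only to illustrate the relation (it is not given in the abstract).
def influx_constant(k1, k2, k3):
    return k1 * k3 / (k2 + k3)

print(f"Ki = {influx_constant(0.248, 0.45, 0.359):.3f} 1/min")
```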
Hong, Quan Nha; Pluye, Pierre; Bujold, Mathieu; Wassef, Maggy
2017-03-23
Systematic reviews of qualitative and quantitative evidence can provide a rich understanding of complex phenomena. This type of review is increasingly popular, has been used to provide a landscape of existing knowledge, and addresses the types of questions not usually covered in reviews relying solely on either quantitative or qualitative evidence. Although several typologies of synthesis designs have been developed, none have been tested on a large sample of reviews. The aim of this review of reviews was to identify and develop a typology of synthesis designs and methods that have been used and to propose strategies for synthesizing qualitative and quantitative evidence. A review of systematic reviews combining qualitative and quantitative evidence was performed. Six databases were searched from inception to December 2014. Reviews were included if they were systematic reviews combining qualitative and quantitative evidence. The included reviews were analyzed according to three concepts of synthesis processes: (a) synthesis methods, (b) sequence of data synthesis, and (c) integration of data and synthesis results. A total of 459 reviews were included. The analysis of this literature highlighted a lack of transparency in reporting how evidence was synthesized and a lack of consistency in the terminology used. Two main types of synthesis designs were identified: convergent and sequential synthesis designs. Within the convergent synthesis design, three subtypes were found: (a) data-based convergent synthesis design, where qualitative and quantitative evidence is analyzed together using the same synthesis method, (b) results-based convergent synthesis design, where qualitative and quantitative evidence is analyzed separately using different synthesis methods and results of both syntheses are integrated during a final synthesis, and (c) parallel-results convergent synthesis design consisting of independent syntheses of qualitative and quantitative evidence and an interpretation of the results in the discussion. Performing systematic reviews of qualitative and quantitative evidence is challenging because of the multiple synthesis options. The findings provide guidance on how to combine qualitative and quantitative evidence. Also, recommendations are made to improve the conducting and reporting of this type of review.
Cheng, Wing-Chi; Yau, Tsan-Sang; Wong, Ming-Kei; Chan, Lai-Ping; Mok, Vincent King-Kuen
2006-10-16
A rapid urinalysis system based on SPE-LC-MS/MS with an in-house post-analysis data management system has been developed for the simultaneous identification and semi-quantitation of opiates (morphine, codeine), methadone, amphetamines (amphetamine, methylamphetamine (MA), 3,4-methylenedioxyamphetamine (MDA) and 3,4-methylenedioxymethamphetamine (MDMA)), 11 benzodiazepines or their metabolites, and ketamine. The urine samples are subjected to automated solid-phase extraction prior to analysis by LC-MS (Finnigan Surveyor LC connected to a Finnigan LCQ Advantage) fitted with an Alltech Rocket Platinum EPS C-18 column. With a single-point calibration at the cut-off concentration for each analyte, simultaneous identification and semi-quantitation of the above-mentioned drugs can be achieved in a 10 min run per urine sample. A computer macro-program package was developed to automatically retrieve appropriate data from the analytical data files, compare results with preset values (such as cut-off concentrations and MS matching scores) for each drug being analyzed, and generate user-defined Excel reports that indicate all positive and negative results in a batch-wise manner for ease of checking. The final analytical results are automatically copied into an Access database for report generation purposes. Through the use of automation in sample preparation, simultaneous identification and semi-quantitation by LC-MS/MS, and a tailor-made post-analysis data management system, this new urinalysis system significantly improves the quality of results, reduces post-analysis data treatment time and data-transfer errors, and is suitable for high-throughput laboratories operating batch-wise.
Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.
2016-01-01
Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
Krishnamurthy, Krish
2013-12-01
The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize the resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion, thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled into several sub-FIDs; second, these sub-FIDs are modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of the modulation of chemical quantities in a biological study (metabolomics), a process study (reaction monitoring), or quality assurance/control. The basic principles behind this approach as well as results evaluating its effectiveness in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
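As a simplified illustration of the sub-FID modeling step, the sketch below fits a single exponentially decaying sinusoid to a synthetic FID segment by nonlinear least squares; CRAFT itself uses Bayesian estimation of sums of such sinusoids, so this is only a stand-in for the underlying signal model.

```python
# Simplified stand-in for CRAFT's sub-FID modeling: fit one exponentially
# decaying sinusoid to a synthetic FID segment by nonlinear least squares.
# (CRAFT itself uses Bayesian estimation of sums of such sinusoids.)
import numpy as np
from scipy.optimize import curve_fit

def decaying_sinusoid(t, amp, freq, decay, phase):
    return amp * np.exp(-t / decay) * np.cos(2 * np.pi * freq * t + phase)

t = np.linspace(0.0, 1.0, 2000)                      # seconds
fid = decaying_sinusoid(t, 1.0, 35.0, 0.25, 0.4) \
      + 0.02 * np.random.randn(t.size)               # noisy synthetic signal

p0 = [0.8, 34.0, 0.2, 0.0]                           # rough initial guess
popt, _ = curve_fit(decaying_sinusoid, t, fid, p0=p0)
print("amplitude, frequency (Hz), decay (s), phase:", np.round(popt, 3))
```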
Li, Junjie; Zhang, Weixia; Chung, Ting-Fung; Slipchenko, Mikhail N; Chen, Yong P; Cheng, Ji-Xin; Yang, Chen
2015-07-23
We report a transient absorption (TA) imaging method for fast visualization and quantitative layer analysis of graphene and graphene oxide (GO). Forward and backward imaging of graphene on various substrates under ambient conditions was performed at a speed of 2 μs per pixel. The TA intensity increased linearly with the layer number of graphene. Real-time TA imaging of GO in vitro, with the capability of quantitative analysis of intracellular concentration, and ex vivo in circulating blood was demonstrated. These results suggest that TA microscopy is a valid tool for the study of graphene-based materials.
Takach, Edward; O'Shea, Thomas; Liu, Hanlan
2014-08-01
Quantifying amino acids in biological matrices is typically performed using liquid chromatography (LC) coupled with fluorescent detection (FLD), requiring both derivatization and complete baseline separation of all amino acids. Due to its high specificity and sensitivity, the use of UPLC-MS/MS eliminates the derivatization step and allows for overlapping amino acid retention times, thereby shortening the analysis time. Furthermore, combining UPLC-MS/MS with stable isotope labeling (e.g., isobaric tag for relative and absolute quantitation, i.e., iTRAQ) of amino acids enables quantitation while maintaining sensitivity, selectivity and speed of analysis. In this study, we report combining UPLC-MS/MS analysis with iTRAQ labeling of amino acids, resulting in the elution and quantitation of 44 amino acids within 5 min and demonstrating the speed and convenience of this assay over established approaches. This chromatographic analysis time represented a 5-fold improvement over the conventional HPLC-MS/MS method developed in our laboratory. In addition, the UPLC-MS/MS method demonstrated improvements in both specificity and sensitivity without loss of precision. In comparing UPLC-MS/MS and HPLC-MS/MS results for 32 detected amino acids, only 2 amino acids exhibited imprecision (RSD) >15% using UPLC-MS/MS, while 9 amino acids exhibited RSD >15% using HPLC-MS/MS. Evaluating intra- and inter-assay precision over 3 days, the quantitation range for 32 detected amino acids in rat plasma was 0.90-497 μM, with an overall mean intra-day precision of less than 15% and a mean inter-day precision of 12%. This UPLC-MS/MS assay was successfully implemented for the quantitative analysis of amino acids in rat and mouse plasma, along with mouse urine and tissue samples, resulting in the following concentration ranges: 0.98-431 μM in mouse plasma for 32 detected amino acids; 0.62-443 μM in rat plasma for 32 detected amino acids; 0.44-8590 μM in mouse liver for 33 detected amino acids; 0.61-1241 μM in mouse kidney for 37 detected amino acids; and 1.39-1681 μM in rat urine for 34 detected amino acids. The utility of the assay was further demonstrated by measuring and comparing plasma amino acid levels between pre-diabetic Zucker diabetic fatty rats (ZDF/Gmi fa/fa) and their lean littermates (ZDF/Gmi fa/?). Significant differences (P<0.001) in 9 amino acid concentrations were observed, with the majority ranging from a 2- to 5-fold increase in pre-diabetic ZDF rats in comparison with ZDF lean rats, consistent with previous literature reports. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Safi, A.; Campanella, B.; Grifoni, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Poggialini, F.; Ripoll-Seguer, L.; Hidalgo, M.; Palleschi, V.
2018-06-01
The introduction of the multivariate calibration curve approach in Laser-Induced Breakdown Spectroscopy (LIBS) quantitative analysis has led to a general improvement in LIBS analytical performance, since a multivariate approach exploits the redundancy of elemental information typically present in a LIBS spectrum. Software packages implementing multivariate methods are available in the most widely used commercial and open source analytical programs; in most cases, the multivariate algorithms are robust against noise and operate in unsupervised mode. The other side of the coin of the availability and ease of use of such packages is the (perceived) difficulty of assessing the reliability of the results obtained, which often leads to multivariate algorithms being regarded as 'black boxes' whose inner mechanism is supposed to remain hidden from the user. In this paper, we discuss the dangers of a 'black box' approach in LIBS multivariate analysis and how to overcome them using the chemical-physical knowledge that underlies any LIBS quantitative analysis.
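A common multivariate calibration of the kind discussed above is partial least-squares regression of spectra against reference concentrations; the sketch below uses synthetic spectra as placeholders for LIBS calibration standards and is not the authors' procedure.

```python
# Multivariate calibration sketch: partial least-squares (PLS) regression of
# spectra against known concentrations, then prediction for a new spectrum.
# Spectra here are synthetic placeholders for LIBS calibration standards.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_standards, n_channels = 20, 500
concentrations = rng.uniform(0.1, 10.0, n_standards)   # wt% of the analyte
basis = rng.normal(size=n_channels)                     # spectral signature
spectra = np.outer(concentrations, basis) \
          + rng.normal(scale=0.5, size=(n_standards, n_channels))

pls = PLSRegression(n_components=3)
pls.fit(spectra, concentrations)

unknown = 4.2 * basis + rng.normal(scale=0.5, size=n_channels)
print("predicted concentration:", pls.predict(unknown[None, :]).ravel()[0])
```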
Jović, Ozren; Smolić, Tomislav; Primožič, Ines; Hrenar, Tomica
2016-04-19
The aim of this study was to investigate the feasibility of FTIR-ATR spectroscopy coupled with the multivariate numerical methodology for qualitative and quantitative analysis of binary and ternary edible oil mixtures. Four pure oils (extra virgin olive oil, high oleic sunflower oil, rapeseed oil, and sunflower oil), as well as their 54 binary and 108 ternary mixtures, were analyzed using FTIR-ATR spectroscopy in combination with principal component and discriminant analysis, partial least-squares, and principal component regression. It was found that the composition of all 166 samples can be excellently represented using only the first three principal components describing 98.29% of total variance in the selected spectral range (3035-2989, 1170-1140, 1120-1100, 1093-1047, and 930-890 cm(-1)). Factor scores in 3D space spanned by these three principal components form a tetrahedral-like arrangement: pure oils being at the vertices, binary mixtures at the edges, and ternary mixtures on the faces of a tetrahedron. To confirm the validity of results, we applied several cross-validation methods. Quantitative analysis was performed by minimization of root-mean-square error of cross-validation values regarding the spectral range, derivative order, and choice of method (partial least-squares or principal component regression), which resulted in excellent predictions for test sets (R(2) > 0.99 in all cases). Additionally, experimentally more demanding gas chromatography analysis of fatty acid content was carried out for all specimens, confirming the results obtained by FTIR-ATR coupled with principal component analysis. However, FTIR-ATR provided a considerably better model for prediction of mixture composition than gas chromatography, especially for high oleic sunflower oil.
Pleiotropy Analysis of Quantitative Traits at Gene Level by Multivariate Functional Linear Models
Wang, Yifan; Liu, Aiyi; Mills, James L.; Boehnke, Michael; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao; Wu, Colin O.; Fan, Ruzong
2015-01-01
In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks’s Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. PMID:25809955
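One of the multivariate test statistics named above, the Pillai-Bartlett trace, can be computed from the hypothesis and error cross-product matrices; the sketch below does so for a toy multi-trait regression on a genotype score and is a generic MANOVA-style calculation, not the paper's functional linear models.

```python
# Toy multivariate association sketch: Pillai-Bartlett trace for the effect
# of a genotype score on several quantitative traits at once. Generic
# MANOVA-style calculation on simulated data, not the authors' models.
import numpy as np

rng = np.random.default_rng(1)
n = 200
genotype = rng.integers(0, 3, n).astype(float)            # 0/1/2 allele counts
traits = np.column_stack([0.3 * genotype + rng.normal(size=n),
                          0.1 * genotype + rng.normal(size=n),
                          rng.normal(size=n)])             # three traits

def fitted(X, Y):
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return X @ beta

X_full = np.column_stack([np.ones(n), genotype])
X_null = np.ones((n, 1))
resid = traits - fitted(X_full, traits)
E = resid.T @ resid                                        # error SSCP
diff = fitted(X_full, traits) - fitted(X_null, traits)
H = diff.T @ diff                                          # hypothesis SSCP

pillai = np.trace(H @ np.linalg.inv(H + E))
print(f"Pillai-Bartlett trace = {pillai:.3f}")
```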
2013-06-30
QUANTITATIVE RISK ANALYSIS. The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004) ... assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and ... Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC
García-Florentino, Cristina; Maguregui, Maite; Romera-Fernández, Miriam; Queralt, Ignasi; Margui, Eva; Madariaga, Juan Manuel
2018-05-01
Wavelength dispersive X-ray fluorescence (WD-XRF) spectrometry has been widely used for elemental quantification of mortars and cements. In this kind of instrument, samples are usually prepared as pellets or fused beads and the whole volume of sample is measured at once. In this work, the usefulness of a dual energy dispersive X-ray fluorescence (ED-XRF) spectrometer, working at two lateral resolutions (1 mm and 25 μm) for macro- and microanalysis respectively, to develop quantitative methods for the elemental characterization of mortars and concretes is demonstrated. A crucial step before developing any quantitative method with this kind of spectrometer is to verify the homogeneity of the standards at these two lateral resolutions. The new ED-XRF quantitative method also demonstrated the importance of matrix effects on the accuracy of the results, making it necessary to use Certified Reference Materials as standards. The results obtained with the ED-XRF quantitative method were compared with those obtained with two WD-XRF quantitative methods employing two different sample preparation strategies (pellets and fused beads). The selected ED-XRF and both WD-XRF quantitative methods were applied to the analysis of real mortars. The accuracy of the ED-XRF results turned out to be similar to that achieved by WD-XRF, except for the lightest elements (Na and Mg). The results described in this work prove that μ-ED-XRF spectrometers can be used not only for acquiring high-resolution elemental distribution maps, but also to perform accurate quantitative studies, avoiding the use of more sophisticated WD-XRF systems or the acid extraction/alkaline fusion required as destructive pretreatment in inductively coupled plasma mass spectrometry-based procedures.
Mari, João Fernando; Saito, José Hiroki; Neves, Amanda Ferreira; Lotufo, Celina Monteiro da Cruz; Destro-Filho, João-Batista; Nicoletti, Maria do Carmo
2015-12-01
Microelectrode Arrays (MEAs) are devices for long-term electrophysiological recording of extracellular spontaneous or evoked activity in in vitro neuron cultures. This work proposes and develops a framework for quantitative and morphological analysis of neuron cultures on MEAs by processing their corresponding images, acquired by fluorescence microscopy. The neurons are segmented from the fluorescence channel images using a combination of segmentation by thresholding, the watershed transform, and object classification. The positions of the microelectrodes are obtained from the transmitted light channel images using the circular Hough transform. The proposed method was applied to images of dissociated cultures of rat dorsal root ganglion (DRG) neuronal cells. The morphological and topological quantitative analysis carried out produced information regarding the state of the culture, such as population count, neuron-to-neuron and neuron-to-microelectrode distances, soma morphologies, neuron sizes, and neuron and microelectrode spatial distributions. Most analyses of microscopy images taken from neuronal cultures on MEAs involve only simple qualitative assessment. The proposed framework also aims to standardize the image processing and to compute quantitative measures useful for integrated image-signal studies and further computational simulations. As the results show, the implemented microelectrode identification method is robust, as are the implemented neuron segmentation and classification methods (with a correct segmentation rate of up to 84%). The quantitative information retrieved by the method is highly relevant for the integrated study of recorded electrophysiological signals and the physical aspects of the neuron culture on the MEA. Although the experiments deal with DRG cell images, cortical and hippocampal cell images could also be processed with small adjustments in the image processing parameter estimation.
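The two image-processing steps named above can be sketched with scikit-image: thresholding plus watershed for neuron segmentation, and the circular Hough transform for electrode localization. Both input images below are synthetic placeholders, and the parameters are illustrative, not the authors' settings.

```python
# Sketch of (i) neuron segmentation by Otsu thresholding + watershed on a
# fluorescence channel, and (ii) microelectrode localization by circular
# Hough transform on a transmitted-light channel. Inputs are synthetic.
import numpy as np
from scipy import ndimage as ndi
from skimage.draw import disk
from skimage.feature import canny, peak_local_max
from skimage.filters import gaussian, threshold_otsu
from skimage.segmentation import watershed
from skimage.transform import hough_circle, hough_circle_peaks

# --- synthetic "fluorescence" image with two overlapping cell bodies ---
fluo = np.zeros((200, 200))
for center in [(80, 80), (95, 110)]:
    rr, cc = disk(center, 18, shape=fluo.shape)
    fluo[rr, cc] += 1.0
fluo = gaussian(fluo, sigma=2)

binary = fluo > threshold_otsu(fluo)
distance = ndi.distance_transform_edt(binary)
coords = peak_local_max(distance, min_distance=10, labels=binary)
mask = np.zeros(distance.shape, dtype=bool)
mask[tuple(coords.T)] = True
markers, _ = ndi.label(mask)
neurons = watershed(-distance, markers, mask=binary)
print("segmented neurons:", neurons.max())

# --- synthetic "transmitted light" image with one circular electrode ---
trans = np.zeros((200, 200))
rr, cc = disk((150, 60), 12, shape=trans.shape)
trans[rr, cc] = 1.0
edges = canny(trans, sigma=1.5)
radii = np.arange(8, 16)
hspace = hough_circle(edges, radii)
_, cx, cy, r = hough_circle_peaks(hspace, radii, total_num_peaks=1)
print("electrode center (x, y):", (int(cx[0]), int(cy[0])), "radius:", int(r[0]))
```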
Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results
NASA Astrophysics Data System (ADS)
Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.
2014-03-01
Rib movement during respiration is one of the diagnostic criteria for pulmonary impairment. In general, rib movement is assessed by fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique for quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, forming a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. The limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: limited rib movements were indicated by reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
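Local velocity vectors of the kind mapped above can be estimated between consecutive frames with dense optical flow; the sketch below uses OpenCV's Farnebäck algorithm on synthetic frames and is an illustration, not the authors' measurement procedure.

```python
# Sketch: dense optical flow between two consecutive frames of a dynamic
# image sequence, averaged over local blocks to form a velocity-vector map.
# Frames here are synthetic; this is not the authors' exact procedure.
import numpy as np
import cv2

frame0 = np.zeros((256, 256), dtype=np.uint8)
cv2.rectangle(frame0, (60, 100), (200, 110), 255, -1)   # a bright "rib" band
frame1 = np.roll(frame0, 3, axis=0)                      # shifted 3 px downward

flow = cv2.calcOpticalFlowFarneback(frame0, frame1, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2,
                                    flags=0)

block = 32                                               # local-area size (px)
vy = flow[..., 1].reshape(256 // block, block, 256 // block, block).mean(axis=(1, 3))
print("mean vertical displacement per block (px):")
print(np.round(vy, 1))
```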
NASA Astrophysics Data System (ADS)
Kanari, M.; Ketter, T.; Tibor, G.; Schattner, U.
2017-12-01
We aim to characterize the seafloor morphology and its shallow sub-surface structures and deformations in the deep part of the Levant basin (eastern Mediterranean) using recently acquired high-resolution shallow seismic reflection data and multibeam bathymetry, which allow quantitative analysis of morphology and structure. The Levant basin in the eastern Mediterranean is considered a passive continental margin, where most of the recent geological processes have been related in the literature to salt tectonics rooted in the Messinian deposits from 6 Ma. We analyzed two sets of recently acquired high-resolution data from multibeam bathymetry and 3.5 kHz Chirp sub-bottom seismic reflection in the deep basin beyond the continental shelf offshore Israel (water depths up to 2100 m). Semi-automatic mapping of seafloor features and seismic data interpretation resulted in quantitative morphological analysis of the seafloor and its underlying sediment with penetration depths up to 60 m. The quantitative analysis and its interpretation are still in progress. Preliminary results reveal distinct morphologies of four major elements: channels, faults, folds and sediment waves, validated by the seismic data. From the spatial distribution and orientation analyses of these phenomena, we identify two primary process types which dominate the formation of the seafloor in the Levant basin: structural and sedimentary. Characterization of the geological and geomorphological processes forming the seafloor helps to better understand the transport mechanisms and the relations between sediment transport and deposition in deep water and the shallower parts of the shelf and slope.
Wang, Chao-Qun; Jia, Xiu-Hong; Zhu, Shu; Komatsu, Katsuko; Wang, Xuan; Cai, Shao-Qing
2015-03-01
A new quantitative analysis of multi-components with a single marker (QAMS) method for 11 saponins (ginsenosides Rg1, Rb1, Rg2, Rh1, Rf, Re and Rd; notoginsenosides R1, R4, Fa and K) in notoginseng was established, in which 6 of these saponins were individually used as internal reference substances to investigate the influence of chemical structure, the concentrations of the quantified components, and the purities of the standard substances on the accuracy of the QAMS method. The results showed that the concentration of the analyte in the sample solution was the major influencing parameter, whereas the other parameters had minimal influence on the accuracy of the QAMS method. A new method for calculating the relative correction factors by linear regression was established (linear regression method), which was shown to decrease the deviations of the QAMS method from the external standard method from 1.20%±0.02%-23.29%±3.23% to 0.10%±0.09%-8.84%±2.85% in comparison with the previous method. The differences between the external standard method and the QAMS method using relative correction factors calculated by the linear regression method were below 5% in the quantitative determination of Rg1, Re, R1, Rd and Fa in 24 notoginseng samples and of Rb1 in 21 notoginseng samples. The differences were mostly below 10% in the quantitative determination of Rf, Rg2, R4 and N-K in all 24 notoginseng samples (the differences for these 4 constituents were larger because their contents were lower). The results indicated that the contents assayed by the new QAMS method could be considered as accurate as those assayed by the external standard method. In addition, a method for determining the applicable concentration ranges of the components quantified by the QAMS method was established for the first time, which ensures its high accuracy and can be applied to QAMS methods for other TCMs. The present study demonstrated the practicability of the QAMS method for multi-component quantitative analysis and the quality control of TCMs and TCM prescriptions. Copyright © 2014 Elsevier B.V. All rights reserved.
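The "linear regression method" for relative correction factors can be sketched as a ratio of calibration slopes; the concentrations, peak areas, and sample value below are hypothetical, and the exact factor definition used by the authors may differ in detail.

```python
# Sketch of relative correction factors (RCFs) in QAMS via linear regression:
# fit area-vs-concentration lines for the marker and for each analyte from
# standard solutions, take the slope ratio as the RCF, then quantify analytes
# in samples using only the marker's calibration. Hypothetical numbers.
import numpy as np

def slope(conc, area):
    return np.polyfit(conc, area, 1)[0]

conc = np.array([10., 25., 50., 100., 200.])               # microgram/mL standards
area_marker  = np.array([152., 371., 748., 1495., 2980.])  # marker peak areas
area_analyte = np.array([118., 290., 583., 1170., 2335.])  # analyte peak areas

f = slope(conc, area_marker) / slope(conc, area_analyte)   # relative correction factor

sample_analyte_area = 860.
analyte_conc = f * sample_analyte_area / slope(conc, area_marker)
print(f"RCF = {f:.3f}, analyte concentration ~ {analyte_conc:.1f} microgram/mL")
```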
Miao, Qing; Kong, Weijun; Zhao, Xiangsheng; Yang, Shihai; Yang, Meihua
2015-01-01
Analytical methods for quantitative analysis and chemical fingerprinting of volatile oils from Alpinia oxyphylla were established. The volatile oils were prepared by hydrodistillation, and the yields were between 0.82% and 1.33%. The developed gas chromatography-flame ionization detection (GC-FID) method showed good specificity, linearity, reproducibility, stability and recovery, and could be used satisfactorily for quantitative analysis. The results showed that the volatile oils contained 2.31-77.30 μL/mL p-cymene and 12.38-99.34 mg/mL nootkatone. A GC-FID fingerprinting method was established, and the profiles were analyzed using chemometrics. GC-MS was used to identify the principal compounds in the GC-FID profiles. The profiles of almost all the samples were consistent and stable. The harvesting time and source were major factors that affected the profile, while the volatile oil yield and the nootkatone content had minor secondary effects. Copyright © 2014 Elsevier B.V. All rights reserved.
High-coverage quantitative proteomics using amine-specific isotopic labeling.
Melanson, Jeremy E; Avery, Steven L; Pinto, Devanand M
2006-08-01
Peptide dimethylation with isotopically coded formaldehydes was evaluated as a potential alternative to techniques such as the iTRAQ method for comparative proteomics. The isotopic labeling strategy and custom-designed protein quantitation software were tested using protein standards and then applied to measure protein levels associated with Alzheimer's disease (AD). The method provided high accuracy (10% error), precision (14% RSD) and coverage (70%) when applied to the analysis of a standard solution of BSA by LC-MS/MS. The technique was then applied to measure protein abundance levels in brain tissue afflicted with AD relative to normal brain tissue. 2-D LC-MS analysis identified 548 unique proteins (p<0.05). Of these, 349 were quantified with two or more peptides that met the statistical criteria used in this study. Several classes of proteins exhibited significant changes in abundance. For example, elevated levels of antioxidant proteins and decreased levels of mitochondrial electron transport proteins were observed. The results demonstrate the utility of the labeling method for high-throughput quantitative analysis.
Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method
Nuzzo, Genoveffa; Gallo, Carmela; d’Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo
2013-01-01
Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third-generation biofuels. The procedure consists of extraction of the biological matrix by a modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, degree of saturation and class distribution in both high-throughput screening of algal collections and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina), which differ drastically in the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790
Frikha, Youssef; Fellner, Johann; Zairi, Moncef
2017-09-01
Despite initiatives for enhanced recycling and waste utilization, landfill still represents the dominant disposal path for municipal solid waste (MSW). The environmental impacts of landfills depend on several factors, including waste composition, technical barriers, landfill operation and climatic conditions. A profound evaluation of all factors and their impact is necessary in order to evaluate the environmental hazards emanating from landfills. The present paper investigates a sanitary landfill located in a semi-arid climate (Tunisia) and highlights major differences in quantitative and qualitative leachate characteristics compared to landfills situated in moderate climates. Besides the qualitative analysis of leachate samples, a quantitative analysis including the simulation of leachate generation (using the HELP model) has been conducted. The results of the analysis indicate a high load of salts (Cl, Na, inorganic nitrogen) in the leachate compared to other landfills. Furthermore, the simulations with the HELP model highlight that a major part of the leachate generated originates from the water content of the waste.
Shivanandan, Arun; Unnikrishnan, Jayakrishnan; Radenovic, Aleksandra
2015-01-01
Single Molecule Localization Microscopy techniques like PhotoActivated Localization Microscopy, with their sub-diffraction limit spatial resolution, have been popularly used to characterize the spatial organization of membrane proteins, by means of quantitative cluster analysis. However, such quantitative studies remain challenged by the techniques' inherent sources of error, such as a limited detection efficiency of less than 60%, due to incomplete photo-conversion, and a limited localization precision in the range of 10–30 nm that varies across the detected molecules, depending mainly on the number of photons collected from each molecule. We provide analytical methods to estimate the effect of these errors in cluster analysis and to correct for them. These methods, based on Ripley's L(r) − r or the pair correlation function popularly used by the community, can facilitate potentially breakthrough results in quantitative biology by providing a more accurate and precise quantification of protein spatial organization. PMID:25794150
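For readers unfamiliar with the Ripley statistic named above, the following minimal sketch computes a naive Ripley's K and L(r) − r for a simulated 2-D point pattern. It omits the edge corrections and the detection-efficiency/localization-error corrections that are the subject of the cited paper; it only illustrates the basic quantity on hypothetical data.

```python
import numpy as np

def ripley_L_minus_r(points, radii, area):
    """
    Naive Ripley's K / L(r) - r estimator for a 2-D point pattern
    (no edge correction; for illustration only).
    points : (n, 2) array of localisations; area : study-region area.
    """
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)            # exclude self-pairs
    lam = n / area                          # point density
    K = np.array([(d < r).sum() / (n * lam) for r in radii])
    L = np.sqrt(K / np.pi)
    return L - radii                        # values > 0 suggest clustering at scale r

rng = np.random.default_rng(0)
pts = rng.uniform(0, 1000, size=(500, 2))   # 500 points in a 1000 x 1000 nm field
radii = np.linspace(10, 200, 20)
print(ripley_L_minus_r(pts, radii, area=1000.0 ** 2))
```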
What Are We Doing When We Translate from Quantitative Models?
Critchfield, Thomas S; Reed, Derek D
2009-01-01
Although quantitative analysis (in which behavior principles are defined in terms of equations) has become common in basic behavior analysis, translational efforts often examine everyday events through the lens of narrative versions of laboratory-derived principles. This approach to translation, although useful, is incomplete because equations may convey concepts that are difficult to capture in words. To support this point, we provide a nontechnical introduction to selected aspects of quantitative analysis; consider some issues that translational investigators (and, potentially, practitioners) confront when attempting to translate from quantitative models; and discuss examples of relevant translational studies. We conclude that, where behavior-science translation is concerned, the quantitative features of quantitative models cannot be ignored without sacrificing conceptual precision, scientific and practical insights, and the capacity of the basic and applied wings of behavior analysis to communicate effectively. PMID:22478533
NASA Astrophysics Data System (ADS)
Mustafaoglu, Mustafa Sinan
Some of the main energy issues in developing countries are high dependence on non-renewable energy sources, low energy efficiency levels and, as a result, high CO2 emissions. In addition, a common problem of many countries, including developing countries, is economic inequality. In this study, the solar photovoltaic policies of Germany, Japan and the USA are analyzed quantitatively, and a new renewable energy support mechanism called the Socio Feed-in Tariff Mechanism (SocioFIT) is formulated based on the analysis results to address these issues in developing countries, as well as economic inequality, by using energy savings as a funding source for renewable energy systems. The applicability of the mechanism is demonstrated by calculations for an implementation of the mechanism in Turkey.
The quantitative analysis of silicon carbide surface smoothing by Ar and Xe cluster ions
NASA Astrophysics Data System (ADS)
Ieshkin, A. E.; Kireev, D. S.; Ermakov, Yu. A.; Trifonov, A. S.; Presnov, D. E.; Garshev, A. V.; Anufriev, Yu. V.; Prokhorova, I. G.; Krupenin, V. A.; Chernysh, V. S.
2018-04-01
The gas cluster ion beam (GCIB) technique was used for smoothing the surface of a silicon carbide crystal. The effects of processing with two inert cluster ion species, argon and xenon, were quantitatively compared. While argon is a standard element for GCIB, results for xenon clusters had not been reported previously. Scanning probe microscopy and high-resolution transmission electron microscopy were used to analyze the surface roughness and the quality of the surface crystal layer. Gas cluster ion beam processing smooths the surface relief down to an average roughness of about 1 nm for both elements. Xenon was shown to be the more effective working gas: the sputtering rate for xenon clusters is 2.5 times higher than for argon at the same beam energy. High-resolution transmission electron microscopy analysis of the surface defect layer gives thicknesses of 7 ± 2 nm and 8 ± 2 nm for treatment with argon and xenon clusters, respectively.
Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary
2014-12-05
Data-dependent acquisition (DDA) and data-independent acquisition strategies (DIA) have both resulted in improved understanding of proteomics samples. Both strategies have advantages and disadvantages that are well-published, where DDA is typically applied for deep discovery and DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.
The cutting edge - Micro-CT for quantitative toolmark analysis of sharp force trauma to bone.
Norman, D G; Watson, D G; Burnett, B; Fenne, P M; Williams, M A
2018-02-01
Toolmark analysis involves examining marks created on an object to identify the likely tool responsible for creating those marks (e.g., a knife). Although a potentially powerful forensic tool, knife mark analysis is still in its infancy and the validation of imaging techniques as well as quantitative approaches is ongoing. This study builds on previous work by simulating real-world stabbings experimentally and statistically exploring quantitative toolmark properties, such as cut mark angle captured by micro-CT imaging, to predict the knife responsible. In Experiment 1 a mechanical stab rig and two knives were used to create 14 knife cut marks on dry pig ribs. The toolmarks were laser and micro-CT scanned to allow for quantitative measurements of numerous toolmark properties. The findings from Experiment 1 demonstrated that both knives produced statistically different cut mark widths, wall angles and shapes. Experiment 2 examined knife marks created on fleshed pig torsos with conditions designed to better simulate real-world stabbings. Eight knives were used to generate 64 incision cut marks that were also micro-CT scanned. Statistical exploration of these cut marks suggested that knife type, serrated or plain, can be predicted from cut mark width and wall angle. Preliminary results suggest that knife type can be predicted from cut mark width, and that knife edge thickness correlates with cut mark width. An additional 16 cut mark walls were imaged for striation marks using scanning electron microscopy, with results suggesting that this approach might not be useful for knife mark analysis. Results also indicated that observer judgements of cut mark shape were more consistent when rated from micro-CT images than light microscopy images. The potential to combine micro-CT data, medical-grade CT data and photographs to develop highly realistic virtual models for visualisation and 3D printing is also demonstrated. This is the first study to statistically explore simulated real-world knife marks imaged by micro-CT to demonstrate the potential of quantitative approaches in knife mark analysis. Findings and methods presented in this study are relevant to both forensic toolmark researchers and practitioners. Limitations of the experimental methodologies and imaging techniques are discussed, and further work is recommended. Copyright © 2017 Elsevier B.V. All rights reserved.
Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H
2015-12-01
An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.
Smartphone-based multispectral imaging: system development and potential for mobile skin diagnosis
Kim, Sewoong; Cho, Dongrae; Kim, Jihun; Kim, Manjae; Youn, Sangyeon; Jang, Jae Eun; Je, Minkyu; Lee, Dong Hun; Lee, Boreom; Farkas, Daniel L.; Hwang, Jae Youn
2016-01-01
We investigate the potential of mobile smartphone-based multispectral imaging for the quantitative diagnosis and management of skin lesions. Recently, various mobile devices such as a smartphone have emerged as healthcare tools. They have been applied for the early diagnosis of nonmalignant and malignant skin diseases. Particularly, when they are combined with an advanced optical imaging technique such as multispectral imaging and analysis, it would be beneficial for the early diagnosis of such skin diseases and for further quantitative prognosis monitoring after treatment at home. Thus, we demonstrate here the development of a smartphone-based multispectral imaging system with high portability and its potential for mobile skin diagnosis. The results suggest that smartphone-based multispectral imaging and analysis has great potential as a healthcare tool for quantitative mobile skin diagnosis. PMID:28018743
Zhao, Wen-Wen; Wu, Zhi-Min; Wu, Xia; Zhao, Hai-Yu; Chen, Xiao-Qing
2016-10-01
This study aimed to determine five naphthoquinones (acetylshikonin, β-acetoxyisovalerylalkannin, isobutylshikonin, β,β'-dimethylacrylalkannin and α-methyl-n-butylshikonin) by quantitative analysis of multi-components with a single marker (QAMS). β,β'-Dimethylacrylalkannin was selected as the internal reference substance, and the relative correction factors (RCFs) of acetylshikonin, β-acetoxyisovalerylalkannin, isobutylshikonin and α-methyl-n-butylshikonin were calculated. Then the ruggedness of the relative correction factors was tested on different instruments and columns. Meanwhile, 16 batches of Arnebia euchroma were analyzed by the external standard method (ESM) and QAMS, respectively. The peaks were identified by LC-MS. The ruggedness of the relative correction factors was good, and the analytical results calculated by ESM and QAMS showed no difference. The quantitative method established is feasible and suitable for the quality evaluation of A. euchroma. Copyright© by the Chinese Pharmaceutical Association.
Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.
Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro
2015-01-01
Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
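The handbook abstract above does not name the specific reliability models or JSC tools; purely as an illustration of how a reliability-growth model turns failure data into failure rates and failure-free-operation probabilities, here is a minimal sketch using the Goel-Okumoto model with made-up weekly failure counts (Python/SciPy). The model choice and all numbers are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit

# Goel-Okumoto NHPP mean-value function: expected cumulative failures by time t
def mu(t, a, b):
    return a * (1.0 - np.exp(-b * t))

# Hypothetical cumulative failure counts observed at weekly test intervals
t_obs = np.arange(1, 11, dtype=float)                  # weeks of testing
n_obs = np.array([5, 9, 13, 16, 18, 20, 21, 22, 22, 23], dtype=float)

(a_hat, b_hat), _ = curve_fit(mu, t_obs, n_obs, p0=[30.0, 0.1])

# Expected new failures over the next week and probability of failure-free operation
t_now, dt = 10.0, 1.0
expected_new = mu(t_now + dt, a_hat, b_hat) - mu(t_now, a_hat, b_hat)
reliability = np.exp(-expected_new)                    # P(no failure in (t_now, t_now + dt])
print(f"a = {a_hat:.1f}, b = {b_hat:.3f}, R(1 week) = {reliability:.3f}")
```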
Jaferzadeh, Keyvan; Moon, Inkyu
2015-11-01
Quantitative phase information obtained by digital holographic microscopy (DHM) can provide new insight into the functions and morphology of single red blood cells (RBCs). Since the functionality of a RBC is related to its three-dimensional (3-D) shape, quantitative 3-D geometric changes induced by storage time can help hematologists realize its optimal functionality period. We quantitatively investigate RBC 3-D geometric changes in the storage lesion using DHM. Our experimental results show that the substantial geometric transformation of the biconcave-shaped RBCs to the spherocyte occurs due to RBC storage lesion. This transformation leads to progressive loss of cell surface area, surface-to-volume ratio, and functionality of RBCs. Furthermore, our quantitative analysis shows that there are significant correlations between chemical and morphological properties of RBCs.
Rao, Rohit T; Scherholz, Megerle L; Hartmanshenn, Clara; Bae, Seul-A; Androulakis, Ioannis P
2017-12-05
The use of models in biology has become particularly relevant as it enables investigators to develop a mechanistic framework for understanding the operating principles of living systems as well as in quantitatively predicting their response to both pathological perturbations and pharmacological interventions. This application has resulted in a synergistic convergence of systems biology and pharmacokinetic-pharmacodynamic modeling techniques that has led to the emergence of quantitative systems pharmacology (QSP). In this review, we discuss how the foundational principles of chemical process systems engineering inform the progressive development of more physiologically-based systems biology models.
Proteomics wants cRacker: automated standardized data analysis of LC-MS derived proteomic data.
Zauber, Henrik; Schulze, Waltraud X
2012-11-02
The large-scale analysis of thousands of proteins under various experimental conditions or in mutant lines has gained more and more importance in hypothesis-driven scientific research and systems biology in the past years. Quantitative analysis by large-scale proteomics using modern mass spectrometry usually results in long lists of peptide ion intensities. The main interest for most researchers, however, is to draw conclusions on the protein level. Postprocessing and combining peptide intensities of a proteomic data set requires expert knowledge, and the often repetitive and standardized manual calculations can be time-consuming. The analysis of complex samples can result in very large data sets (lists with several thousand to 100,000 entries for different peptides) that cannot easily be analyzed using standard spreadsheet programs. To improve the speed and consistency of the data analysis of LC-MS derived proteomic data, we developed cRacker. cRacker is an R-based program for automated downstream proteomic data analysis including data normalization strategies for metabolic labeling and label-free quantitation. In addition, cRacker includes basic statistical analysis, such as clustering of data, and ANOVA and t tests for comparisons between treatments. Results are presented in editable graphic formats and in list files.
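cRacker itself is an R program and its exact normalization options are not reproduced here; as a language-neutral sketch of the kind of downstream step it automates, the following hypothetical Python/pandas example median-normalizes peptide ion intensities per sample, aggregates them to protein-level values, and forms protein ratios between two conditions. Column names and intensities are invented.

```python
import numpy as np
import pandas as pd

# Hypothetical peptide-level intensities (rows: peptide ions, columns: samples)
peptides = pd.DataFrame(
    {"protein": ["P1", "P1", "P1", "P2", "P2"],
     "control": [1.2e6, 8.0e5, 9.5e5, 2.1e6, 1.9e6],
     "treated": [2.3e6, 1.7e6, 2.0e6, 2.2e6, 1.8e6]})

intensity_cols = ["control", "treated"]

# Per-sample median normalization (one simple label-free strategy)
norm = peptides.copy()
norm[intensity_cols] = norm[intensity_cols] / norm[intensity_cols].median()

# Aggregate peptides to proteins (median of normalized peptide intensities)
proteins = norm.groupby("protein")[intensity_cols].median()

# Protein-level log2 ratios between the two conditions
proteins["log2_ratio"] = np.log2(proteins["treated"] / proteins["control"])
print(proteins)
```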
Birnbrauer, Kristina; Frohlich, Dennis Owen; Treise, Debbie
2017-09-01
West Nile Virus (WNV) has been reported as one of the worst epidemics in US history. This study sought to understand how WNV news stories were framed and how risk information was portrayed from its 1999 arrival in the US through the year 2012. The authors conducted a quantitative content analysis of online news articles obtained through Google News ( N = 428). The results of this analysis were compared to the CDC's ArboNET surveillance system. The following story frames were identified in this study: action, conflict, consequence, new evidence, reassurance and uncertainty, with the action frame appearing most frequently. Risk was communicated quantitatively without context in the majority of articles, and only in 2006, the year with the third-highest reported deaths, was risk reported with statistical accuracy. The results from the analysis indicated that at-risk communities were potentially under-informed as accurate risks were not communicated. This study offers evidence about how disease outbreaks are covered in relation to actual disease surveillance data.
How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.
Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A
2018-05-01
A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
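The review above describes a 12-step workflow rather than code; purely as an illustration of the statistical pooling step of a meta-analysis, the sketch below combines hypothetical per-study log diagnostic odds ratios with fixed-effect and DerSimonian-Laird random-effects inverse-variance weighting. This is a simplification; diagnostic accuracy meta-analyses often use bivariate or hierarchical models instead, and all numbers here are invented.

```python
import numpy as np

# Hypothetical per-study effect sizes (log diagnostic odds ratios) and their variances
y = np.array([1.8, 2.3, 1.5, 2.9, 2.1])
v = np.array([0.10, 0.25, 0.15, 0.30, 0.20])

# Fixed-effect pooling (inverse-variance weights)
w = 1.0 / v
theta_fe = np.sum(w * y) / np.sum(w)

# DerSimonian-Laird estimate of between-study variance tau^2
k = len(y)
Q = np.sum(w * (y - theta_fe) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooling
w_re = 1.0 / (v + tau2)
theta_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))
ci = (theta_re - 1.96 * se_re, theta_re + 1.96 * se_re)
print(f"pooled log-DOR = {theta_re:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f}), tau^2 = {tau2:.3f}")
```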
SARGENT, DANIEL J.; GEIBEL, M.; HAWKINS, J. A.; WILKINSON, M. J.; BATTEY, N. H.; SIMPSON, D. W.
2004-01-01
• Background and Aims The aims of this investigation were to highlight the qualitative and quantitative diversity apparent between nine diploid Fragaria species and produce interspecific populations segregating for a large number of morphological characters suitable for quantitative trait loci analysis. • Methods A qualitative comparison of eight described diploid Fragaria species was performed and measurements were taken of 23 morphological traits from 19 accessions including eight described species and one previously undescribed species. A principal components analysis was performed on 14 mathematically unrelated traits from these accessions, which partitioned the species accessions into distinct morphological groups. Interspecific crosses were performed with accessions of species that displayed significant quantitative divergence and, from these, populations that should segregate for a range of quantitative traits were raised. • Key Results Significant differences between species were observed for all 23 morphological traits quantified and three distinct groups of species accessions were observed after the principal components analysis. Interspecific crosses were performed between these groups, and F2 and backcross populations were raised that should segregate for a range of morphological characters. In addition, the study highlighted a number of distinctive morphological characters in many of the species studied. • Conclusions Diploid Fragaria species are morphologically diverse, yet remain highly interfertile, making the group an ideal model for the study of the genetic basis of phenotypic differences between species through map-based investigation using quantitative trait loci. The segregating interspecific populations raised will be ideal for such investigations and could also provide insights into the nature and extent of genome evolution within this group. PMID:15469944
Benharash, Peyman; Buch, Eric; Frank, Paul; Share, Michael; Tung, Roderick; Shivkumar, Kalyanam; Mandapati, Ravi
2015-01-01
Background New approaches to ablation of atrial fibrillation (AF) include focal impulse and rotor modulation (FIRM) mapping, and initial results reported with this technique have been favorable. We sought to independently evaluate the approach by analyzing quantitative characteristics of atrial electrograms used to identify rotors and describe acute procedural outcomes of FIRM-guided ablation. Methods and Results All FIRM-guided ablation procedures (n=24; 50% paroxysmal) at University of California, Los Angeles Medical Center were included for analysis. During AF, unipolar atrial electrograms collected from a 64-pole basket catheter were used to construct phase maps and identify putative AF sources. These sites were targeted for ablation, in conjunction with pulmonary vein isolation in most patients (n=19; 79%). All patients had rotors identified (mean, 2.3±0.9 per patient; 72% in left atrium). Prespecified acute procedural end point was achieved in 12 of 24 (50%) patients: AF termination (n=1), organization (n=3), or >10% slowing of AF cycle length (n=8). Basket electrodes were within 1 cm of 54% of left atrial surface area, and a mean of 31 electrodes per patient showed interpretable atrial electrograms. Offline analysis revealed no differences between rotor and distant sites in dominant frequency or Shannon entropy. Electroanatomic mapping showed no rotational activation at FIRM-identified rotor sites in 23 of 24 patients (96%). Conclusions FIRM-identified rotor sites did not exhibit quantitative atrial electrogram characteristics expected from rotors and did not differ quantitatively from surrounding tissue. Catheter ablation at these sites, in conjunction with pulmonary vein isolation, resulted in AF termination or organization in a minority of patients (4/24; 17%). Further validation of this approach is necessary. PMID:25873718
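The study above compares rotor and distant sites using dominant frequency and Shannon entropy; the exact pipeline is not given in the abstract. As a minimal sketch of what those two electrogram metrics typically involve, the code below computes the dominant frequency from the power spectrum within an assumed 3-15 Hz band and the Shannon entropy of the amplitude histogram for a synthetic signal; the band, bin count, and signal are illustrative assumptions only.

```python
import numpy as np

fs = 1000.0                                    # sampling rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(1)
# Synthetic AF-like electrogram: ~6 Hz activity plus noise (illustrative only)
egm = np.sin(2 * np.pi * 6.0 * t) + 0.3 * rng.standard_normal(t.size)

# Dominant frequency: peak of the power spectrum within a physiological band
freqs = np.fft.rfftfreq(egm.size, d=1.0 / fs)
power = np.abs(np.fft.rfft(egm - egm.mean())) ** 2
band = (freqs >= 3.0) & (freqs <= 15.0)
dominant_freq = freqs[band][np.argmax(power[band])]

# Shannon entropy of the signal's amplitude histogram
hist, _ = np.histogram(egm, bins=50)
p = hist / hist.sum()
p = p[p > 0]
shannon_entropy = -np.sum(p * np.log2(p))

print(f"DF = {dominant_freq:.2f} Hz, ShEn = {shannon_entropy:.2f} bits")
```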
Subsurface imaging and cell refractometry using quantitative phase/ shear-force feedback microscopy
NASA Astrophysics Data System (ADS)
Edward, Kert; Farahi, Faramarz
2009-10-01
Over the last few years, several novel quantitative phase imaging techniques have been developed for the study of biological cells. However, many of these techniques are encumbered by inherent limitations including 2π phase ambiguities and diffraction-limited spatial resolution. In addition, subsurface information in the phase data is not exploited. We hereby present a novel quantitative phase imaging system without 2π ambiguities, which also allows for subsurface imaging and cell refractometry studies. This is accomplished by utilizing simultaneously obtained shear-force topography information. We will demonstrate how the quantitative phase and topography data can be used for subsurface and cell refractometry analysis and will present results for a fabricated structure and a malaria-infected red blood cell.
[Reconstituting evaluation methods based on both qualitative and quantitative paradigms].
Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro
2011-01-01
Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis regarding evaluation methods for qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability), and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it might not be useful to consider evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.
Quantitative aspects of inductively coupled plasma mass spectrometry
Wagner, Barbara
2016-01-01
Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971
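Of the calibration approaches the review discusses, external calibration with an internal standard is the simplest to show in code. The sketch below fits a calibration line to hypothetical analyte/internal-standard intensity ratios and inverts it for an unknown sample, including a dilution factor; the element, ratios, and dilution are all invented for illustration.

```python
import numpy as np

# Hypothetical external calibration: Pb standard concentrations (ng/mL) and
# measured intensity ratios of the analyte isotope to an internal standard (e.g. Rh)
conc_std = np.array([0.0, 1.0, 5.0, 10.0, 50.0])
ratio_std = np.array([0.002, 0.021, 0.102, 0.205, 1.010])

slope, intercept = np.polyfit(conc_std, ratio_std, 1)

# Quantify an unknown sample from its analyte/internal-standard ratio,
# accounting for a dilution factor applied during sample preparation
ratio_sample = 0.163
dilution_factor = 20.0
conc_sample = (ratio_sample - intercept) / slope * dilution_factor
print(f"Pb in sample: {conc_sample:.1f} ng/mL")
```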
Quantitative Data Analysis--In the Graduate Curriculum
ERIC Educational Resources Information Center
Albers, Michael J.
2017-01-01
A quantitative research study collects numerical data that must be analyzed to help draw the study's conclusions. Teaching quantitative data analysis is not teaching number crunching, but teaching a way of critical thinking for how to analyze the data. The goal of data analysis is to reveal the underlying patterns, trends, and relationships of a…
NASA Astrophysics Data System (ADS)
Li, Jiangui; Wang, Junhua; Zhigang, Zhao; Yan, Weili
2012-04-01
In this paper, an analytical analysis of the permanent magnet vernier (PMV) machine is presented. The key is to analytically solve the governing Laplacian/quasi-Poissonian field equations in the motor regions. The analytical method is verified using the time-stepping finite element method, and the machine performance predicted by the finite element method is quantitatively compared with the analytical results. The analytical results agree well with the finite element results. Finally, experimental results are given to further demonstrate the validity of the analysis.
Kuhn, Felix P; Spinner, Georg; Del Grande, Filippo; Wyss, Michael; Piccirelli, Marco; Erni, Stefan; Pfister, Pascal; Ho, Michael; Sah, Bert-Ram; Filli, Lukas; Ettlin, Dominik A; Gallo, Luigi M; Andreisek, Gustav
2017-01-01
Objectives: To qualitatively and quantitatively compare MRI of the temporomandibular joint (TMJ) at 7.0 T using high-permittivity dielectric pads and 3.0 T using a clinical high-resolution protocol. Methods: Institutional review board-approved study with written informed consent. 12 asymptomatic volunteers were imaged at 7.0 and 3.0 T using 32-channel head coils. High-permittivity dielectric pads consisting of barium titanate in deuterated suspension were used for imaging at 7.0 T. Imaging protocol consisted of oblique sagittal proton density weighted turbo spin echo sequences. For quantitative analysis, pixelwise signal-to-noise ratio maps of the TMJ were calculated. For qualitative analysis, images were evaluated by two independent readers using 5-point Likert scales. Quantitative and qualitative results were compared using t-tests and Wilcoxon signed-rank tests, respectively. Results: TMJ imaging at 7.0 T using high-permittivity dielectric pads was feasible in all volunteers. Quantitative analysis showed similar signal-to-noise ratio for both field strengths (mean ± SD; 7.0 T, 13.02 ± 3.92; 3.0 T, 14.02 ± 3.41; two-sample t-tests, p = 0.188). At 7.0 T, qualitative analysis yielded better visibility of all anatomical subregions of the temporomandibular disc (anterior band, intermediate zone and posterior band) than 3.0 T (Wilcoxon signed-rank tests, p < 0.05, corrected for multiple comparisons). Conclusions: MRI of the TMJ at 7.0 T using high-permittivity dielectric pads yields superior visibility of the temporomandibular disc compared with 3.0 T. PMID:27704872
Barlow, Pepita; McKee, Martin; Basu, Sanjay; Stuckler, David
2017-03-08
Regional trade agreements are major international policy instruments that shape macro-economic and political systems. There is widespread debate as to whether and how these agreements pose risks to public health. Here we perform a comprehensive systematic review of quantitative studies of the health impact of trade and investment agreements. We identified studies from searches in PubMed, Web of Science, EMBASE, and Global Health Online. Research articles were eligible for inclusion if they were quantitative studies of the health impacts of trade and investment agreements or policy. We systematically reviewed study findings, evaluated quality using the Quality Assessment Tool from the Effective Public Health Practice Project, and performed network citation analysis to study disciplinary siloes. Seventeen quantitative studies met our inclusion criteria. There was consistent evidence that implementing trade agreements was associated with increased consumption of processed foods and sugar-sweetened beverages. Granting import licenses for patented drugs was associated with increased access to pharmaceuticals. Implementing trade agreements and associated policies was also correlated with higher cardiovascular disease incidence and higher Body Mass Index (BMI), whilst correlations with tobacco consumption, under-five mortality, maternal mortality, and life expectancy were inconclusive. Overall, the quality of studies is weak or moderately weak, and co-citation analysis revealed a relative isolation of public health from economics. We identified limitations in existing studies which preclude definitive conclusions of the health impacts of regional trade and investment agreements. Few address unobserved confounding, and many possible consequences and mechanisms linking trade and investment agreements to health remain poorly understood. Results from our co-citation analysis suggest scope for greater interdisciplinary collaboration. Notwithstanding these limitations, our results find evidence that trade agreements pose some significant health risks. Health protections in trade and investment treaties may mitigate these impacts.
A Quantitative Analysis Method Of Trabecular Pattern In A Bone
NASA Astrophysics Data System (ADS)
Idesawa, Masanor; Yatagai, Toyohiko
1982-11-01
The orientation and density of the trabecular pattern observed in a bone are closely related to its mechanical properties, and bone diseases appear as changes in the orientation and/or density distribution of the trabecular pattern. These properties have so far been treated from a qualitative point of view because a quantitative analysis method had not been established. In this paper, the authors propose and investigate several quantitative methods for analyzing the density and orientation of trabecular patterns observed in a bone. These methods give an index for quantitatively evaluating the orientation of a trabecular pattern; they have been applied to trabecular patterns observed in the femoral head and their usefulness is confirmed. Key Words: Index of pattern orientation, Trabecular pattern, Pattern density, Quantitative analysis
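The 1982 paper's own orientation indices are not specified in the abstract; as one modern illustration of how an orientation index can be computed for a trabecular image, the sketch below uses the image structure tensor (Python/NumPy/SciPy) to return a dominant orientation and an anisotropy index on a synthetic striped image. This is an assumed, illustrative technique, not the authors' method.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def orientation_index(image, sigma=3.0):
    """
    Dominant orientation (radians) and an anisotropy index in [0, 1] for a
    2-D grayscale trabecular image, from the smoothed structure tensor.
    """
    gy, gx = np.gradient(image.astype(float))
    Jxx = gaussian_filter(gx * gx, sigma)
    Jxy = gaussian_filter(gx * gy, sigma)
    Jyy = gaussian_filter(gy * gy, sigma)
    jxx, jxy, jyy = Jxx.mean(), Jxy.mean(), Jyy.mean()   # average over the region of interest
    theta = 0.5 * np.arctan2(2 * jxy, jxx - jyy)          # dominant orientation
    tmp = np.sqrt((jxx - jyy) ** 2 + 4 * jxy ** 2)
    lam1, lam2 = (jxx + jyy + tmp) / 2, (jxx + jyy - tmp) / 2
    anisotropy = (lam1 - lam2) / (lam1 + lam2 + 1e-12)    # 0: isotropic, 1: strongly oriented
    return theta, anisotropy

# Synthetic test: an oriented stripe pattern should give high anisotropy
y, x = np.mgrid[0:256, 0:256]
stripes = np.sin(2 * np.pi * (x * np.cos(0.5) + y * np.sin(0.5)) / 16.0)
print(orientation_index(stripes))
```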
Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard.
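The abstract above lists SUV normalization as one processing step; as a reminder of what that computation involves (not the authors' toolkit code), here is a minimal sketch of the standard body-weight SUV with decay correction of the injected dose. The patient values are hypothetical; the default half-life assumes an 18F tracer.

```python
import numpy as np

def suv_bw(voxel_bq_per_ml, injected_dose_bq, body_weight_g,
           minutes_since_injection, half_life_min=109.77):
    """Body-weight SUV with decay correction of the injected dose (18F default)."""
    decayed_dose = injected_dose_bq * np.exp(-np.log(2) * minutes_since_injection / half_life_min)
    return voxel_bq_per_ml / (decayed_dose / body_weight_g)

# Hypothetical example: 5 kBq/mL voxel, 370 MBq injected, 75 kg patient, 60 min uptake
print(suv_bw(5.0e3, 370.0e6, 75.0e3, 60.0))
```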
Smooth muscle cells of penis in the rat: noninvasive quantification with shear wave elastography.
Zhang, Jia-Jie; Qiao, Xiao-Hui; Gao, Feng; Bai, Ming; Li, Fan; Du, Lian-Fang; Xing, Jin-Fang
2015-01-01
Smooth muscle cells (SMCs) of the cavernosum play an important role in erection, so it is of great significance to quantitatively analyze the level of SMCs in the penis. In this study, we investigated the feasibility of shear wave elastography (SWE) for quantitatively evaluating the level of SMCs in the penis. Twenty healthy male rats were selected. SWE imaging of the penis was carried out, and immunohistochemical analysis was then performed to assess the expression of alpha smooth muscle actin. The measurement index of the SWE examination was tissue stiffness (TS); the measurement index of the immunohistochemical analysis was the positive area percentage of alpha smooth muscle actin (AP). Sixty paired measurements of TS and AP were obtained. TS was significantly correlated with AP, with a correlation coefficient of -0.618 (p < 0.001). TS was plotted against the AP measurements and the relation between the two was fitted with a quadratic curve; the goodness-of-fit index was 0.364 (p < 0.001). The level of SMCs in the penis was successfully quantified in vivo with SWE, suggesting that SWE can be used clinically for quantitatively evaluating the level of SMCs in the penis.
Quantitative characterisation of sedimentary grains
NASA Astrophysics Data System (ADS)
Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.
2016-04-01
Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments from the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till from County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
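To illustrate the kind of boundary-based shape parameters the abstract lists (without reproducing the authors' Mathematica package or their Fourier-descriptor parameter), the sketch below computes area, perimeter, and circularity from a traced grain boundary polygon; the test polygon is a synthetic circle.

```python
import numpy as np

def shape_parameters(boundary):
    """
    Basic shape descriptors for a closed grain boundary given as an (n, 2)
    array of (x, y) vertices: area, perimeter, circularity = 4*pi*A/P^2.
    """
    x, y = boundary[:, 0], boundary[:, 1]
    x2, y2 = np.roll(x, -1), np.roll(y, -1)
    area = 0.5 * abs(np.sum(x * y2 - x2 * y))            # shoelace formula
    perimeter = np.sum(np.hypot(x2 - x, y2 - y))
    circularity = 4.0 * np.pi * area / perimeter ** 2    # 1 for a perfect circle
    return area, perimeter, circularity

# Sanity check with a circle approximated by a 200-vertex polygon
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
circle = np.column_stack([np.cos(theta), np.sin(theta)])
print(shape_parameters(circle))   # circularity close to 1
```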
Cheng, Keding; Sloan, Angela; McCorrister, Stuart; Peterson, Lorea; Chui, Huixia; Drebot, Mike; Nadon, Celine; Knox, J David; Wang, Gehua
2014-12-01
The need for rapid and accurate H typing is evident during Escherichia coli outbreak situations. This study explores the transition of MS-H, a method originally developed for rapid H antigen typing of E. coli using LC-MS/MS of flagella digests of reference strains and some clinical strains, to E. coli isolates in a clinical scenario through quantitative analysis and method validation. Motile and nonmotile strains were examined in batches to simulate a clinical sample scenario. Various LC-MS/MS batch run procedures and MS-H typing rules were compared and summarized through quantitative analysis of the MS-H data output for the development of a standard method. Label-free quantitative data analysis of MS-H typing proved very useful for examining the quality of MS-H results and the effects of sample carryover from motile E. coli isolates. Based on this, a refined procedure and protein identification rule specific for clinical MS-H typing was established and validated. With an LC-MS/MS batch run procedure and database search parameters unique to E. coli MS-H typing, the standard procedure maintained high accuracy and specificity in clinical situations, and its potential to be used in a clinical setting was clearly established. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby
2017-01-01
Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
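The sub-model blending idea can be illustrated with a toy example: train a full-range PLS model plus two restricted-range sub-models, use the full-range prediction to weight the sub-models, and blend. The sketch below (Python/scikit-learn) uses synthetic "spectra" and an assumed linear blending ramp; the real ChemCam calibration uses far more data, components, and carefully tuned blending ranges.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic "spectra": 200 samples x 50 channels whose signal depends on composition y
y = rng.uniform(0, 100, 200)                       # e.g. a wt.% abundance
X = np.outer(y, rng.random(50)) + rng.normal(0, 5, (200, 50))

# Train a full-range model and two restricted-range sub-models (with overlap)
full = PLSRegression(n_components=5).fit(X, y)
low = PLSRegression(n_components=5).fit(X[y < 60], y[y < 60])
high = PLSRegression(n_components=5).fit(X[y > 40], y[y > 40])

def blended_predict(x_new, lo=40.0, hi=60.0):
    """Blend low/high sub-model predictions using the full-model estimate."""
    x_new = np.atleast_2d(x_new)
    ref = full.predict(x_new).item()               # full-range first guess
    w_high = np.clip((ref - lo) / (hi - lo), 0.0, 1.0)   # linear ramp across the overlap
    return (1.0 - w_high) * low.predict(x_new).item() + w_high * high.predict(x_new).item()

print(blended_predict(X[0]))
```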
Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong
2017-12-01
Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
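As a minimal illustration of using differential evolution for wavelength (frequency-point) selection, the sketch below encodes each frequency as a continuous gene in [0, 1], thresholds it to a binary selection, and minimizes the validation RMSE of a least-squares concentration model on synthetic spectra (Python/SciPy). The objective function, encoding, and data are assumptions for illustration and may differ from the paper's implementation.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
n_samples, n_freqs = 60, 40

# Synthetic THz absorbance spectra: only 10 frequency points carry concentration information
true_coef = np.zeros(n_freqs)
true_coef[rng.choice(n_freqs, 10, replace=False)] = rng.uniform(0.5, 1.5, 10)
conc = rng.uniform(0, 1, n_samples)
spectra = np.outer(conc, true_coef) + rng.normal(0, 0.05, (n_samples, n_freqs))

train, test = slice(0, 40), slice(40, 60)

def rmse_for_selection(x):
    """Fitness: validation RMSE of a least-squares model on the selected frequencies."""
    mask = x > 0.5                                     # continuous genes -> binary selection
    if mask.sum() < 2:
        return 1e3
    A = np.column_stack([spectra[train][:, mask], np.ones(40)])
    coef, *_ = np.linalg.lstsq(A, conc[train], rcond=None)
    A_test = np.column_stack([spectra[test][:, mask], np.ones(20)])
    pred = A_test @ coef
    return float(np.sqrt(np.mean((pred - conc[test]) ** 2)))

result = differential_evolution(rmse_for_selection, bounds=[(0, 1)] * n_freqs,
                                maxiter=50, seed=0, polish=False)
print("validation RMSE:", result.fun, "selected points:", int((result.x > 0.5).sum()))
```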
Schneider, Marilyn J
2008-08-01
A simple, rapid fluorescence screening assay was applied to the analysis of beef muscle for danofloxacin at the U.S. tolerance level of 200 ng/g. Muscle samples were homogenized in acetic acid-acetonitrile, the resultant mixture centrifuged, and fluorescence of the supernatants was then measured. The significant difference between the fluorescence of control muscle sample extracts and extracts of samples fortified at 200 ng/g allowed for successful discrimination between the samples. Setting a threshold level at the mean fluorescence of the 200 ng/g fortified sample extracts minus 3σ allowed for identification of potentially violative samples. Successful analysis of a group of blind fortified samples over a range of concentrations was accomplished in this manner, without any false-negative results. The limits of quantitation for danofloxacin, as well as enrofloxacin, using this assay were determined in three types of beef muscle (hanging tenderloin, neck, and eye round steak), as well as in serum. Significant differences in limits of quantitation were found among the three different muscle types examined, with hanging tenderloin muscle providing the lowest value. This work not only shows the potential for use of the fluorescence screening assay as an alternative to currently used microbial or antibody-based assays for the analysis of danofloxacin in beef muscle, but also suggests that assays using beef muscle may vary in performance depending on the specific muscle selected for analysis.
Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan
2018-01-01
Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
Forkert, N D; Cheng, B; Kemmling, A; Thomalla, G; Fiehler, J
2014-01-01
The objective of this work is to present the software tool ANTONIA, which has been developed to facilitate a quantitative analysis of perfusion-weighted MRI (PWI) datasets in general as well as the subsequent multi-parametric analysis of additional datasets for the specific purpose of acute ischemic stroke patient dataset evaluation. Three different methods for the analysis of DSC or DCE PWI datasets are currently implemented in ANTONIA, which can be case-specifically selected based on the study protocol. These methods comprise a curve fitting method as well as a deconvolution-based and deconvolution-free method integrating a previously defined arterial input function. The perfusion analysis is extended for the purpose of acute ischemic stroke analysis by additional methods that enable an automatic atlas-based selection of the arterial input function, an analysis of the perfusion-diffusion and DWI-FLAIR mismatch as well as segmentation-based volumetric analyses. For reliability evaluation, the described software tool was used by two observers for quantitative analysis of 15 datasets from acute ischemic stroke patients to extract the acute lesion core volume, FLAIR ratio, perfusion-diffusion mismatch volume with manually as well as automatically selected arterial input functions, and follow-up lesion volume. The results of this evaluation revealed that the described software tool leads to highly reproducible results for all parameters if the automatic arterial input function selection method is used. Due to the broad selection of processing methods that are available in the software tool, ANTONIA is especially helpful to support image-based perfusion and acute ischemic stroke research projects.
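ANTONIA's own deconvolution implementation is not reproduced here; as a sketch of the deconvolution-based family of DSC perfusion methods the abstract refers to, the code below recovers a scaled residue function from a synthetic tissue curve and a synthetic arterial input function using truncated-SVD deconvolution. All curves, the truncation threshold, and the parameter values are illustrative assumptions.

```python
import numpy as np

dt = 1.0                                        # s, sampling interval
t = np.arange(0, 60, dt)

# Synthetic gamma-variate AIF and an exponential residue function R(t)
aif = (t / 6.0) ** 3 * np.exp(-t / 1.5)
cbf_true, mtt = 0.01, 4.0                       # arbitrary flow units, mean transit time in s
residue = np.exp(-t / mtt)
tissue = cbf_true * dt * np.convolve(aif, residue)[: t.size]   # C(t) = CBF * (AIF * R)(t)

# Build the lower-triangular convolution (Toeplitz) matrix from the AIF
A = np.zeros((t.size, t.size))
for i in range(t.size):
    A[i, : i + 1] = aif[i::-1] * dt

# Truncated-SVD deconvolution (discard small singular values to stabilise the inverse)
U, s, Vt = np.linalg.svd(A)
s_inv = np.zeros_like(s)
keep = s > 0.2 * s.max()
s_inv[keep] = 1.0 / s[keep]
r_scaled = Vt.T @ (s_inv * (U.T @ tissue))      # approximates CBF * R(t)

cbf_est = r_scaled.max()                        # CBF estimated as the peak of the scaled residue
print(f"true CBF = {cbf_true}, estimated CBF = {cbf_est:.4f}")
```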
DOT National Transportation Integrated Search
1975-07-01
The volume presents the results of the quantitative analyses of the O'Hare ASTC System operations. The operations environments for the periods selected for detailed analysis of the ASDE films and controller communications recording are described. Fol...
The Role of Recurrence Plots in Characterizing the Output-Unemployment Relationship: An Analysis
Caraiani, Petre; Haven, Emmanuel
2013-01-01
We analyse the output-unemployment relationship using an approach based on cross-recurrence plots and quantitative recurrence analysis. We use post-war period quarterly U.S. data. The results obtained show the emergence of a complex and interesting relationship. PMID:23460814
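For readers unfamiliar with the technique named above, the sketch below builds a cross-recurrence matrix for two scalar series and reports the recurrence rate, the simplest recurrence quantification measure. It omits time-delay embedding and the paper's specific parameter choices, and the two series are synthetic stand-ins rather than the U.S. data.

```python
import numpy as np

def cross_recurrence(x, y, eps):
    """Cross-recurrence matrix: 1 where |x_i - y_j| < eps (scalar series, no embedding)."""
    return (np.abs(x[:, None] - y[None, :]) < eps).astype(int)

rng = np.random.default_rng(0)
n = 200
output = np.cumsum(rng.normal(0, 1, n))               # synthetic "output" series
unemployment = -0.5 * output + rng.normal(0, 1, n)    # synthetic, loosely related series

# Standardise before comparing, then compute the recurrence rate (fraction of recurrent points)
z = lambda s: (s - s.mean()) / s.std()
crp = cross_recurrence(z(output), z(unemployment), eps=0.5)
recurrence_rate = crp.mean()
print(f"cross-recurrence rate = {recurrence_rate:.3f}")
```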
COMPARATIVE ANALYSIS OF HEALTH RISK ASSESSMENTS FOR MUNICIPAL WASTE COMBUSTORS
Quantitative health risk assessments have been performed for a number of proposed municipal waste combustor (MWC) facilities over the past several years. his article presents the results of a comparative analysis of a total of 21 risk assessments, focusing on seven of the most co...
An Analysis of Students' Mistakes on Routine Slope Tasks
ERIC Educational Resources Information Center
Cho, Peter; Nagle, Courtney
2017-01-01
This study extends past research on students' understanding of slope by analyzing college students' mistakes on routine tasks involving slope. We conduct quantitative and qualitative analysis of students' mistakes to extract information regarding slope conceptualizations described in prior research. Results delineate procedural proficiencies and…
The Quantitative Preparation of Future Geoscience Graduate Students
NASA Astrophysics Data System (ADS)
Manduca, C. A.; Hancock, G. S.
2006-12-01
Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways. Calculus, calculus-based physics, chemistry, statistics, programming and linear algebra were viewed as important course preparation for a successful graduate experience. A set of recommendations for departments and for new community resources includes ideas for infusing quantitative reasoning throughout the undergraduate experience and mechanisms for learning from successful experiments in both geoscience and mathematics. A full list of participants, summaries of the meeting discussion and recommendations are available at http://serc.carleton.edu/quantskills/winter06/index.html. These documents, crafted by a small but diverse group, can serve as a starting point for broader community discussion of the quantitative preparation of future geoscience graduate students.
Putting the "But" Back in Meta-Analysis: Issues Affecting the Validity of Quantitative Reviews.
ERIC Educational Resources Information Center
L'Hommedieu, Randi; And Others
Some of the frustrations inherent in trying to incorporate qualifications of statistical results into meta-analysis are reviewed, and some solutions are proposed to prevent the loss of information in meta-analytic reports. The validity of a meta-analysis depends on several factors, including the thoroughness of the literature search; the selection of…
A quantitative study of nanoparticle skin penetration with interactive segmentation.
Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook
2016-10-01
In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown. Moreover, the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to the human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing in addition to statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased with time. Statistical significance was reached after 2 days in the negative-charge group but only after 4 days in the positive-charge group, indicating a difference in penetration kinetics between the groups. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative-charge group. Although this quantitative result agrees with the qualitative assessment, it is meaningful in that it was established by statistical analysis of quantities derived from image processing. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs. In addition, these results provide important guidance for the design of NPs for biomedical applications.
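For readers who want to reproduce this style of first-order image analysis, the sketch below (Python/NumPy, with an illustrative intensity threshold and synthetic data, not the study's images or segmentation pipeline) computes the parameters named above: mean, integrated density, skewness, kurtosis, and area fraction.

```python
import numpy as np
from scipy.stats import skew, kurtosis

def fluorescence_metrics(image, threshold=10):
    """First-order intensity metrics for a fluorescence image.

    image     : 2D array of pixel intensities
    threshold : illustrative cutoff separating silica signal from background
    """
    pixels = image.astype(float).ravel()
    signal = pixels > threshold                 # crude stand-in for segmentation
    return {
        "mean": pixels.mean(),
        "integrated_density": pixels.sum(),     # total intensity over the image
        "skewness": skew(pixels),
        "kurtosis": kurtosis(pixels),
        "area_fraction": signal.mean(),         # fraction of pixels above threshold
    }

# Example with a synthetic image
demo = (np.random.rand(256, 256) * 50).astype(np.uint8)
print(fluorescence_metrics(demo))
```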
Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD.
Mansur, Sanawar; Abdulla, Rahima; Ayupbec, Amatjan; Aisa, Haji Akbar
2016-12-21
A method based on ultra-performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified through similarity evaluation, systematically comparing chromatograms with the professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In the quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%-103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were all greater than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
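Fingerprint similarity of this kind is commonly scored with a correlation or congruence coefficient between aligned chromatograms; the sketch below is a minimal illustration of that idea with synthetic arrays, not the SFDA software's algorithm.

```python
import numpy as np

def fingerprint_similarity(sample, reference):
    """Cosine (congruence) coefficient between two aligned chromatograms."""
    s = np.asarray(sample, dtype=float)
    r = np.asarray(reference, dtype=float)
    return float(np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r)))

# Illustrative: a reference fingerprint vs. a batch with small random variation
rng = np.random.default_rng(0)
reference = np.abs(np.sin(np.linspace(0, 6, 500))) * 100
batch = reference * (1 + 0.02 * rng.standard_normal(500))
print(round(fingerprint_similarity(batch, reference), 3))  # close to 1 for similar batches
```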
Tataw, David Besong; Ekúndayò, Olúgbémiga T
2017-01-01
This article reports on the use of sequential and integrated mixed-methods approach in a focused population and small-area analysis. The study framework integrates focus groups, survey research, and community engagement strategies in a search for evidence related to prostate cancer screening services utilization as a component of cancer prevention planning in a marginalized African American community in the United States. Research and data analysis methods are synthesized by aggregation, configuration, and interpretive analysis. The results of synthesis show that qualitative and quantitative data validate and complement each other in advancing our knowledge of population characteristics, variable associations, the complex context in which variables exist, and the best options for prevention and service planning. Synthesis of findings and interpretive analysis provided two important explanations which seemed inexplicable in regression outputs: (a) Focus group data on the limitations of the church as an educational source explain the negative association between preferred educational channels and screening behavior found in quantitative analysis. (b) Focus group data on unwelcoming provider environments explain the inconsistent relationship between knowledge of local sites and screening services utilization found in quantitative analysis. The findings suggest that planners, evaluators, and scientists should grow their planning and evaluation evidence from the community they serve.
Dynamics of land - use change in urban area in West Jakarta
NASA Astrophysics Data System (ADS)
Pangaribowo, R. L.
2018-01-01
The aim of this research is to determine how land use changed in West Jakarta over the period 2000-2010. The research method is descriptive with a quantitative approach. Data from the research instruments were analyzed to identify the drivers of land-use change, and the changes themselves were analyzed using GIS (Geographic Information System) in the Arc View GIS 3.3 program together with the quantitative Location Quotient (LQ) and Shift-Share Analysis (SSA) models. The research instruments used were observation and documentation. Based on the analysis, land-use change in West Jakarta over the 10-year period from 2000 to 2010 was driven by several interrelated aspects: political, economic, demographic, and cultural. Land-use change occurred in an area that decreased by 367.79 hectares (2.87%); the open space area decreased by 103.36 hectares (0.8%), the built-up area increased by 201.13 hectares (1.57%), and the settlement area was 27.14 hectares (0.21%).
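As a minimal illustration of the Location Quotient model used in the study (with made-up figures, not the West Jakarta data), LQ compares a sector's local share with its share in a reference region:

```python
def location_quotient(local_sector, local_total, reference_sector, reference_total):
    """LQ = (local sector share) / (reference-region sector share).
    LQ > 1 suggests the sector is more concentrated locally than in the reference region."""
    return (local_sector / local_total) / (reference_sector / reference_total)

# Illustrative figures in hectares (not data from the study):
print(location_quotient(local_sector=200.0, local_total=12_000.0,
                        reference_sector=5_000.0, reference_total=650_000.0))
```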
Valdés, Pablo A.; Leblond, Frederic; Kim, Anthony; Harris, Brent T.; Wilson, Brian C.; Fan, Xiaoyao; Tosteson, Tor D.; Hartov, Alex; Ji, Songbai; Erkmen, Kadir; Simmons, Nathan E.; Paulsen, Keith D.; Roberts, David W.
2011-01-01
Object: Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative fluorescence of protoporphyrin IX (PpIX), synthesized endogenously following δ-aminolevulinic acid (ALA) administration, has been used for this purpose in high-grade glioma (HGG). The authors show that diagnostically significant but visually imperceptible concentrations of PpIX can be quantitatively measured in vivo and used to discriminate normal from neoplastic brain tissue across a range of tumor histologies. Methods: The authors studied 14 patients with diagnoses of low-grade glioma (LGG), HGG, meningioma, and metastasis under an institutional review board–approved protocol for fluorescence-guided resection. The primary aim of the study was to compare the diagnostic capabilities of a highly sensitive, spectrally resolved quantitative fluorescence approach to conventional fluorescence imaging for detection of neoplastic tissue in vivo. Results: A significant difference in the quantitative measurements of PpIX concentration occurred in all tumor groups compared with normal brain tissue. Receiver operating characteristic (ROC) curve analysis of PpIX concentration as a diagnostic variable for detection of neoplastic tissue yielded a classification efficiency of 87% (AUC = 0.95, specificity = 92%, sensitivity = 84%) compared with 66% (AUC = 0.73, specificity = 100%, sensitivity = 47%) for conventional fluorescence imaging (p < 0.0001). More than 81% (57 of 70) of the quantitative fluorescence measurements that were below the threshold of the surgeon's visual perception were classified correctly in an analysis of all tumors. Conclusions: These findings are clinically profound because they demonstrate that ALA-induced PpIX is a targeting biomarker for a variety of intracranial tumors beyond HGGs. This study is the first to measure quantitative ALA-induced PpIX concentrations in vivo, and the results have broad implications for guidance during resection of intracranial tumors. PMID:21438658
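An ROC analysis of a single continuous diagnostic variable, as performed here for PpIX concentration, can be sketched as follows (scikit-learn, with synthetic concentrations, not the patient data):

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Synthetic PpIX concentrations; 1 = tumor, 0 = normal brain (illustrative, not study data)
rng = np.random.default_rng(1)
labels = np.r_[np.ones(60), np.zeros(60)]
cppix = np.r_[rng.lognormal(mean=0.0, sigma=1.0, size=60),    # tumor: higher on average
              rng.lognormal(mean=-2.0, sigma=1.0, size=60)]   # normal tissue

auc = roc_auc_score(labels, cppix)
fpr, tpr, thresholds = roc_curve(labels, cppix)
best = np.argmax(tpr - fpr)                 # Youden index picks an operating point
print(f"AUC={auc:.2f}  threshold={thresholds[best]:.3f}  "
      f"sensitivity={tpr[best]:.2f}  specificity={1 - fpr[best]:.2f}")
```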
Yu, Huan; Ni, Shi-Jun; Kong, Bo; He, Zheng-Wei; Zhang, Cheng-Jiang; Zhang, Shu-Qing; Pan, Xin; Xia, Chao-Xu; Li, Xuan-Qiong
2013-01-01
Land-use planning has triggered debates on social and environmental values, in which two key questions are faced: one is how to see different planning simulation results instantaneously and apply the results back to interactively assist planning work; the other is how to ensure that the planning simulation result is scientific and accurate. To answer these questions, the objective of this paper is to analyze whether and how a bridge can be built between qualitative and quantitative approaches for land-use planning work and to find a way to overcome the gap that exists between the ability to construct computer simulation models to aid integrated land-use plan making and the demand for them by planning professionals. The study presented a theoretical framework of land-use planning based on the integration of scenario analysis (SA) and multiagent system (MAS) simulation, and selected freshwater wetlands in the Sanjiang Plain of China as a case study area. Study results showed that the MAS simulation technique, emphasizing the quantitative process, effectively compensated for the SA method, emphasizing the qualitative process, achieving an organic combination of qualitative and quantitative land-use planning and providing a new idea and method for land-use planning and the sustainable management of land resources.
Model-Based Linkage Analysis of a Quantitative Trait.
Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H
2017-01-01
Linkage analysis is a family-based method used to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but it has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint methods (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
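As a toy illustration of the underlying statistic (not the S.A.G.E./LODLINK implementation, and simplified to phase-known, fully informative meioses with a binomial likelihood), a two-point LOD score can be computed as:

```python
import numpy as np

def lod_score(recombinants, meioses, theta):
    """Two-point LOD score: LOD(theta) = log10[ L(theta) / L(theta=0.5) ].
    Assumes phase-known, fully informative meioses; theta must lie in (0, 1)."""
    r, n = recombinants, meioses
    log_l = r * np.log10(theta) + (n - r) * np.log10(1 - theta)
    return log_l - n * np.log10(0.5)

# Illustrative: 2 recombinants in 20 meioses, scanned over recombination fractions
thetas = np.linspace(0.01, 0.4, 40)
scores = lod_score(2, 20, thetas)
# A maximum LOD of about 3 is the conventional threshold for evidence of linkage
print(f"max LOD = {scores.max():.2f} at theta = {thetas[scores.argmax()]:.2f}")
```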
Quantitative analysis of the correlations in the Boltzmann-Grad limit for hard spheres
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulvirenti, M.
2014-12-09
In this contribution I consider the problem of the validity of the Boltzmann equation for a system of hard spheres in the Boltzmann-Grad limit. I briefly review the results available nowadays, with particular emphasis on Lanford's celebrated validity theorem. Finally I present some recent results, obtained in collaboration with S. Simonella, concerning a quantitative analysis of the propagation of chaos. More precisely, we introduce a quantity (the correlation error) measuring how far a j-particle rescaled correlation function at time t (sufficiently small) is from full statistical independence. Roughly speaking, a correlation error of order k measures (in the context of the BBGKY hierarchy) the event in which k tagged particles form a recolliding group.
Quantitative EEG analysis in minimally conscious state patients during postural changes.
Greco, A; Carboncini, M C; Virgillito, A; Lanata, A; Valenza, G; Scilingo, E P
2013-01-01
Mobilization and postural changes of patients with cognitive impairment are standard clinical practices useful for both the psychic and physical rehabilitation process. During this process, several physiological signals, such as the Electroencephalogram (EEG), Electrocardiogram (ECG), Photoplethysmography (PPG), Respiration activity (RESP), and Electrodermal activity (EDA), are monitored and processed. In this paper we investigated how quantitative EEG (qEEG) changes with postural modifications in minimally conscious state patients. This study is quite novel and no similar experimental data can be found in the current literature; therefore, although the results are very encouraging, a quantitative analysis of the cortical areas activated during such postural changes still needs to be investigated in depth. More specifically, this paper shows EEG power spectra and brain symmetry index modifications during a verticalization procedure, from 0 to 60 degrees, of three patients in a Minimally Conscious State (MCS) with focal regions of impairment. Experimental results show a significant increase of the power in the β band (12-30 Hz), commonly associated with human alertness, thus suggesting that mobilization and postural changes can have beneficial effects in MCS patients.
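A minimal sketch of the kind of qEEG quantities reported here, band power via Welch's method and one common formulation of a brain symmetry index, computed on synthetic signals (the paper's exact definitions, montage, and sampling rate are not assumed):

```python
import numpy as np
from scipy.signal import welch

fs = 256  # Hz, assumed sampling rate

def band_power(x, fs, fmin, fmax):
    """Average Welch power of signal x within the [fmin, fmax] Hz band."""
    f, pxx = welch(x, fs=fs, nperseg=fs * 2)
    band = (f >= fmin) & (f <= fmax)
    return pxx[band].mean()

def brain_symmetry_index(left, right, fs, fmin=1, fmax=25):
    """One common BSI formulation: mean relative left/right spectral difference."""
    f, pl = welch(left, fs=fs, nperseg=fs * 2)
    _, pr = welch(right, fs=fs, nperseg=fs * 2)
    band = (f >= fmin) & (f <= fmax)
    return np.mean(np.abs((pr[band] - pl[band]) / (pr[band] + pl[band])))

# Synthetic 30-second traces standing in for homologous left/right EEG channels
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(2)
left = np.sin(2 * np.pi * 20 * t) + rng.standard_normal(t.size)
right = 0.7 * np.sin(2 * np.pi * 20 * t) + rng.standard_normal(t.size)
print("beta-band power (left):", round(band_power(left, fs, 12, 30), 3))
print("BSI:", round(brain_symmetry_index(left, right, fs), 3))
```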
Disintegration of endodontic cements in water.
Kaplan, A E; Goldberg, F; Artaza, L P; de Silvio, A; Macchi, R L
1997-07-01
The disintegration of three endodontic cements in water was determined quantitatively and qualitatively. The materials studied were Ketac-Endo (KE), Tubli Seal (TS), and AH26 (AH). Specimens were immersed in water for 48 h (GI), 7 days (GII), and 45 days (GIII). The solid residue was then determined. For the qualitative analysis three groups of tubes were filled with the materials and stored in water for the same periods. The exposed surface was photographed. Results of the quantitative analysis, expressed as the percentage of original mass lost due to dissolution, were: GI = KE 2.39 (0.70); TS 3.56 (0.37); AH 4.94 (2.83); GII = KE 2.84 (0.30); TS 2.50 (0.50); AH 0.66 (0.26); GIII = KE 1.60 (0.84); TS 1.03 (0.42); AH 1.22 (0.54). Tukey's least significant difference (0.05) was 2.94. In the qualitative experiment, KE disintegration was far more evident than that of the other materials. The quantitative results showed no correlation with the qualitative observations, probably because of differences in the moment at which the materials were immersed.
NASA Astrophysics Data System (ADS)
Poveromo, Scott; Malcolm, Doug; Earthman, James
Conventional nondestructive testing (NDT) techniques used to detect defects in composites are not able to determine intact bond integrity within a composite structure and are costly to use on large and complex-shaped surfaces. To overcome current NDT limitations, a new technology was adopted based on quantitative percussion diagnostics (QPD) to better quantify bond quality in fiber-reinforced composite materials. Results indicate that this technology is capable of detecting weak ('kiss') bonds between flat composite laminates. Specifically, the local value of the probe force determined from quantitative percussion testing was predicted to be significantly lower for a laminate that contained a 'kiss' bond compared to that for a well-bonded sample, which is in agreement with experimental findings. Experimental results were compared to a finite element analysis (FEA) using MSC PATRAN/NASTRAN to understand the visco-elastic behavior of the laminates during percussion testing. The dynamic FEA models were used to directly predict changes in the probe force, as well as effective stress distributions across the bonded panels as a function of time.
Code of Federal Regulations, 2014 CFR
2014-07-01
... PM2.5 violations”) must be based on quantitative analysis using the applicable air quality models... either: (i) Quantitative methods that represent reasonable and common professional practice; or (ii) A...) The hot-spot demonstration required by § 93.116 must be based on quantitative analysis methods for the...
2007-01-05
positive/false negatives. The quantitative on-site methods were evaluated using linear regression analysis and relative percent difference (RPD) comparison. [Remaining fragments are table-of-contents entries from the report: Conclusion; Quantitative Analysis Using CRREL; Quantitative Analysis for NG by GC/TID; Introduction.]
Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.
Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo
2018-05-01
This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and established risk factors, potentially representing an important step forward in the translation of quantitative CMR perfusion analysis to the clinical setting. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
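Net reclassification improvement can be illustrated in its simple categorical form at a single risk threshold; the sketch below uses synthetic risks and is only a conceptual illustration, not the cross-validated survival modelling used in the study.

```python
import numpy as np

def categorical_nri(risk_old, risk_new, events, threshold=0.1):
    """Categorical net reclassification improvement at one risk threshold.
    (A simplification of the survival-based NRI used in prognostic studies.)"""
    old_hi = np.asarray(risk_old) >= threshold
    new_hi = np.asarray(risk_new) >= threshold
    events = np.asarray(events).astype(bool)

    up, down = new_hi & ~old_hi, ~new_hi & old_hi
    nri_events = (up[events].sum() - down[events].sum()) / events.sum()
    nri_nonevents = (down[~events].sum() - up[~events].sum()) / (~events).sum()
    return nri_events + nri_nonevents

# Illustrative risks before/after adding an ischemic-burden covariate (made-up data)
rng = np.random.default_rng(3)
events = rng.random(395) < 0.13
base = np.clip(0.10 + 0.05 * rng.standard_normal(395), 0, 1)
with_burden = np.clip(base + np.where(events, 0.04, -0.02), 0, 1)
print(f"NRI = {categorical_nri(base, with_burden, events):.3f}")
```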
3D/4D multiscale imaging in acute lymphoblastic leukemia cells: visualizing dynamics of cell death
NASA Astrophysics Data System (ADS)
Sarangapani, Sreelatha; Mohan, Rosmin Elsa; Patil, Ajeetkumar; Lang, Matthew J.; Asundi, Anand
2017-06-01
Quantitative phase detection is a new methodology that provides quantitative information on cellular morphology to monitor cell status, drug response, and toxicity. In this paper the morphological changes in acute leukemia cells treated with chitosan were detected using d'Bioimager, a robust imaging system. A quantitative phase image of the cells was obtained and analyzed numerically. Results show that the average area and optical volume of the chitosan-treated cells are significantly reduced compared with the control cells, which reveals the effect of chitosan on the cancer cells. These results indicate that d'Bioimager can be used as a non-invasive imaging alternative to measure the morphological changes of living cells in real time.
NASA Astrophysics Data System (ADS)
Goacher, Robyn Elizabeth
Secondary Ion Mass Spectrometry (SIMS) is an established method for the quantitative analysis of dopants in semiconductors. The quasi-parallel mass acquisition of Time-of-Flight SIMS, along with the development of polyatomic primary ions, has rapidly increased the use of SIMS for analysis of organic and biological specimens. However, the advantages and disadvantages of using cluster primary ions for quantitative analysis of inorganic materials are not clear. The research described in this dissertation investigates the consequences of using polyatomic primary ions for the analysis of inorganic compounds in ToF-SIMS. Furthermore, the diffusion of Mn in GaAs, which is important in Spintronic material applications such as spin injection, is also studied by quantitative ToF-SIMS depth profiling. In the first portion of this work, it was discovered that primary ion bombardment of pre-sputtered compound semiconductors GaAs and InP for the purpose of spectral analysis resulted in the formation of cluster secondary ions, as well as atomic secondary ions (Chapter 2). In particular, bombardment using a cluster primary ion such as Bi3(q+) or C60(q+) resulted in higher yields of high-mass cluster secondary ions. These cluster secondary ions did not have bulk stoichiometry ("non-stoichiometric"), in contrast to the paradigm of stoichiometric cluster ions generated from salts. This is attributed to the covalent bonding of the compound semiconductors, as well as to preferential sputtering. The utility of high-mass cluster secondary ions in depth profiling is also discussed. Relative sensitivity factors (RSFs) calculated for ion-implanted Fe and Mn samples in GaAs also exhibit differences based on whether monatomic or polyatomic primary ions are utilized (Chapter 3). These RSFs are important for the quantitative conversion of intensity to concentration. When Bi3(2+) primary ions are used for analysis instead of Bi+ primary ions, there is a significantly higher proportion of Mn and Fe ions present in the spectra, as referenced to the matrix species. The magnitude of this effect differs depending on the sputtering ion, Cs or C60. The use of C60 cluster primary ions for depth profiling of GaAs is also investigated (Chapter 4). In particular, for quantitative depth profiling, parameters such as depth resolution, ion and sputter yields, and relative sensitivity factors are pertinent to profiling thin layered structures quantitatively and quickly. C60 sputtering is compared to Cs sputtering in all of these aspects. It is found that 10 keV C60+ is advantageous for the analysis of metals (such as Au contacts on Si) but that previously reported roughness problems prohibit successful analysis in Si. For Al delta layers and quantum wells in GaAs, C60(q+) sputtering induced very little roughness in the sample, and resulted in high ion yields and excellent signal-to-noise as compared to Cs+ sputtering. However, the depth resolution of C60 is at best equivalent to 1 keV Cs+ and does not extend into the sub 2-nm range. Furthermore, C60 sputtering results in significant carbon implantation. In the second portion of this work, quantitative ToF-SIMS depth profiling was used to evaluate the diffusion of Mn into GaAs. Samples were prepared by Molecular Beam Epitaxy in the department of Physics. Mn diffusion from MnAs was investigated first, and Mn diffusion from layered epitaxial structures of GaAs / Ga1-xMnxAs / GaAs was investigated second.
Diffusion experiments were conducted by annealing portions of the samples in sealed glass ampoules at low temperatures (200-400°C). Different sputtering rates were measured for MnAs and GaAs and the measured depth profiles were corrected for these effects. RSFs measured for Mn ion-implanted standards were used to calibrate the intensity scale. For diffusion from MnAs, thin MnAs layers resulted in no measurable changes except in the surface transient. For thick MnAs layers, it was determined that substantial loss of As occurred at 400°C, resulting in severe sample roughening, which inhibited proper SIMS analysis. Results for the diffusion of Mn out of a thick buried layer of Ga1-xMnxAs show that annealing induces diffusion of Mn species from the Ga1-xMnxAs layer into the neighboring GaAs with an activation energy of 0.69+/-0.09 eV. This results in doping of the GaAs layer, which is detrimental to spin injection for Spintronics devices.
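The RSF-based conversion from intensity to concentration mentioned above is, in its basic form, a linear scaling of the analyte-to-matrix intensity ratio; the sketch below uses hypothetical numbers, not the dissertation's measured RSFs or profiles.

```python
def sims_concentration(i_analyte, i_matrix, rsf):
    """Convert a SIMS intensity ratio to concentration with a relative sensitivity factor.

    i_analyte : secondary-ion intensity of the analyte (e.g. an Mn signal) at a given depth
    i_matrix  : intensity of the chosen matrix reference species (e.g. a Ga signal)
    rsf       : relative sensitivity factor (atoms/cm^3) from an ion-implanted standard
    """
    return rsf * i_analyte / i_matrix

# Illustrative counts only (hypothetical, not values from the dissertation):
profile_counts = [(1200, 5.0e5), (800, 5.1e5), (150, 4.9e5)]
rsf_mn_in_gaas = 3.0e22  # atoms/cm^3, hypothetical
for i_mn, i_ga in profile_counts:
    print(f"{sims_concentration(i_mn, i_ga, rsf_mn_in_gaas):.2e} atoms/cm^3")
```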
NASA Astrophysics Data System (ADS)
Zivkovic, Sanja; Momcilovic, Milos; Staicu, Angela; Mutic, Jelena; Trtica, Milan; Savovic, Jelena
2017-02-01
The aim of this study was to develop a simple laser induced breakdown spectroscopy (LIBS) method for quantitative elemental analysis of powdered biological materials based on laboratory prepared calibration samples. The analysis was done using ungated single pulse LIBS in ambient air at atmospheric pressure. A Transversely-Excited Atmospheric pressure (TEA) CO2 laser was used as an energy source for plasma generation on samples. The material used for the analysis was the blue-green alga Spirulina, widely used in the food and pharmaceutical industries and also in a few biotechnological applications. To demonstrate the analytical potential of this particular LIBS system the obtained spectra were compared to the spectra obtained using a commercial LIBS system based on a pulsed Nd:YAG laser. A single sample of known concentration was used to estimate detection limits for Ba, Ca, Fe, Mg, Mn, Si and Sr and compare the detection power of these two LIBS systems. TEA CO2 laser based LIBS was also applied for quantitative analysis of the elements in powder Spirulina samples. Analytical curves for Ba, Fe, Mg, Mn and Sr were constructed using laboratory produced matrix-matched calibration samples. Inductively coupled plasma optical emission spectroscopy (ICP-OES) was used as the reference technique for elemental quantification, and reasonably good agreement between ICP and LIBS data was obtained. Results confirm that, with respect to its sensitivity and precision, TEA CO2 laser based LIBS can be successfully applied for quantitative analysis of macro and micro-elements in algal samples. The fact that nearly all classes of materials can be prepared as powders implies that the proposed method could be easily extended to a quantitative analysis of different kinds of materials, organic, biological or inorganic.
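A matrix-matched calibration curve and a 3-sigma detection-limit estimate of the kind described can be sketched as follows (all numbers are illustrative, not the paper's data):

```python
import numpy as np

# Illustrative matrix-matched calibration data (not the paper's values):
conc = np.array([5.0, 10.0, 25.0, 50.0, 100.0])      # mg/kg of an analyte in pressed pellets
intensity = np.array([0.9, 2.1, 5.2, 10.3, 20.8])    # background-corrected line intensity (a.u.)

slope, intercept = np.polyfit(conc, intensity, 1)     # linear calibration curve
r = np.corrcoef(conc, intensity)[0, 1]

sigma_blank = 0.12                                    # std. dev. of the blank signal (assumed)
lod = 3 * sigma_blank / slope                         # common 3-sigma detection-limit estimate

unknown_intensity = 7.4
predicted = (unknown_intensity - intercept) / slope
print(f"R^2 = {r**2:.4f}, LOD = {lod:.2f} mg/kg, predicted concentration = {predicted:.1f} mg/kg")
```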
Visual aggregate analysis of eligibility features of clinical trials.
He, Zhe; Carini, Simona; Sim, Ida; Weng, Chunhua
2015-04-01
To develop a method for profiling the collective populations targeted for recruitment by multiple clinical studies addressing the same medical condition using one eligibility feature each time. Using a previously published database COMPACT as the backend, we designed a scalable method for visual aggregate analysis of clinical trial eligibility features. This method consists of four modules for eligibility feature frequency analysis, query builder, distribution analysis, and visualization, respectively. This method is capable of analyzing (1) frequently used qualitative and quantitative features for recruiting subjects for a selected medical condition, (2) distribution of study enrollment on consecutive value points or value intervals of each quantitative feature, and (3) distribution of studies on the boundary values, permissible value ranges, and value range widths of each feature. All analysis results were visualized using Google Charts API. Five recruited potential users assessed the usefulness of this method for identifying common patterns in any selected eligibility feature for clinical trial participant selection. We implemented this method as a Web-based analytical system called VITTA (Visual Analysis Tool of Clinical Study Target Populations). We illustrated the functionality of VITTA using two sample queries involving quantitative features BMI and HbA1c for conditions "hypertension" and "Type 2 diabetes", respectively. The recruited potential users rated the user-perceived usefulness of VITTA with an average score of 86.4/100. We contributed a novel aggregate analysis method to enable the interrogation of common patterns in quantitative eligibility criteria and the collective target populations of multiple related clinical studies. A larger-scale study is warranted to formally assess the usefulness of VITTA among clinical investigators and sponsors in various therapeutic areas. Copyright © 2015 Elsevier Inc. All rights reserved.
ASPECTS: an automation-assisted SPE method development system.
Li, Ming; Chou, Judy; King, Kristopher W; Yang, Liyu
2013-07-01
A typical conventional SPE method development (MD) process usually involves deciding the chemistry of the sorbent and eluent based on information about the analyte; experimentally preparing and trying out various combinations of adsorption chemistry and elution conditions; quantitatively evaluating the various conditions; and comparing quantitative results from all combination of conditions to select the best condition for method qualification. The second and fourth steps have mostly been performed manually until now. We developed an automation-assisted system that expedites the conventional SPE MD process by automating 99% of the second step, and expedites the fourth step by automatically processing the results data and presenting it to the analyst in a user-friendly format. The automation-assisted SPE MD system greatly saves the manual labor in SPE MD work, prevents analyst errors from causing misinterpretation of quantitative results, and shortens data analysis and interpretation time.
Application of remote sensing to monitoring and studying dispersion in ocean dumping
NASA Technical Reports Server (NTRS)
Johnson, R. W.; Ohlhorst, C. W.
1981-01-01
Remotely sensed wide area synoptic data provides information on ocean dumping that is not readily available by other means. A qualitative approach has been used to map features, such as river plumes. Results of quantitative analyses have been used to develop maps showing quantitative distributions of one or more water quality parameters, such as suspended solids or chlorophyll a. Joint NASA/NOAA experiments have been conducted at designated dump areas in the U.S. coastal zones to determine the applicability of aircraft remote sensing systems to map plumes resulting from ocean dumping of sewage sludge and industrial wastes. A second objective is related to the evaluation of previously developed quantitative analysis techniques for studying dispersion of materials in these plumes. It was found that plumes resulting from dumping of four waste materials have distinctive spectral characteristics. The development of a technology for use in a routine monitoring system, based on remote sensing techniques, is discussed.
A quantitative analysis of IRAS maps of molecular clouds
NASA Technical Reports Server (NTRS)
Wiseman, Jennifer J.; Adams, Fred C.
1994-01-01
We present an analysis of IRAS maps of five molecular clouds: Orion, Ophiuchus, Perseus, Taurus, and Lupus. For the classification and description of these astrophysical maps, we use a newly developed technique which considers all maps of a given type to be elements of a pseudometric space. For each physical characteristic of interest, this formal system assigns a distance function (a pseudometric) to the space of all maps: this procedure allows us to measure quantitatively the difference between any two maps and to order the space of all maps. We thus obtain a quantitative classification scheme for molecular clouds. In this present study we use the IRAS continuum maps at 100 and 60 micrometer(s) to produce column density (or optical depth) maps for the five molecular cloud regions given above. For this sample of clouds, we compute the 'output' functions which measure the distribution of density, the distribution of topological components, the self-gravity, and the filamentary nature of the clouds. The results of this work provide a quantitative description of the structure in these molecular cloud regions. We then order the clouds according to the overall environmental 'complexity' of these star-forming regions. Finally, we compare our results with the observed populations of young stellar objects in these clouds and discuss the possible environmental effects on the star-formation process. Our results are consistent with the recently stated conjecture that more massive stars tend to form in more 'complex' environments.
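One simple way to realize a pseudometric between maps, in the spirit of the output functions described, is to compare normalized column-density distributions; the sketch below (synthetic maps, L1 distance) illustrates the idea rather than the authors' exact formalism.

```python
import numpy as np

def density_distribution(column_density, bins):
    """Normalized histogram of column-density values over a fixed set of bins."""
    hist, _ = np.histogram(column_density.ravel(), bins=bins)
    return hist / hist.sum()

def map_distance(map_a, map_b, bins=np.linspace(0, 1, 51)):
    """A simple pseudometric between two maps: the L1 distance between their
    normalized column-density distributions (zero distance does not imply
    identical maps, hence 'pseudo')."""
    return np.abs(density_distribution(map_a, bins) -
                  density_distribution(map_b, bins)).sum()

# Synthetic stand-ins for two normalized column-density maps
rng = np.random.default_rng(4)
cloud_a = rng.lognormal(-2.0, 0.6, (128, 128)); cloud_a /= cloud_a.max()
cloud_b = rng.lognormal(-2.5, 0.4, (128, 128)); cloud_b /= cloud_b.max()
print(f"distance = {map_distance(cloud_a, cloud_b):.3f}")
```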
NASA Astrophysics Data System (ADS)
Xianliang, Lei; Hongying, Yu
Using questionnaire, mathematical-statistical, and entropy measurement methods, the quantitative relationship between the individual characteristics of urban residents and their sports consumption motivations is studied. The results show that the main sports consumption motivations of urban residents are fitness motivation and social motivation. Urban residents of different gender, age, education, and income levels differ in psychological-regulation motivation, rational consumption motivation, and seeking-common motivation.
Practical considerations of image analysis and quantification of signal transduction IHC staining.
Grunkin, Michael; Raundahl, Jakob; Foged, Niels T
2011-01-01
The dramatic increase in computer processing power in combination with the availability of high-quality digital cameras during the last 10 years has fertilized the grounds for quantitative microscopy based on digital image analysis. With the present introduction of robust scanners for whole slide imaging in both research and routine, the benefits of automation and objectivity in the analysis of tissue sections will be even more obvious. For in situ studies of signal transduction, the combination of tissue microarrays, immunohistochemistry, digital imaging, and quantitative image analysis will be central operations. However, immunohistochemistry is a multistep procedure including a lot of technical pitfalls leading to intra- and interlaboratory variability of its outcome. The resulting variations in staining intensity and disruption of original morphology are an extra challenge for the image analysis software, which therefore preferably should be dedicated to the detection and quantification of histomorphometrical end points.
Ghosh, Debasree; Chattopadhyay, Parimal
2012-06-01
The objective of the work was to use the method of quantitative descriptive analysis (QDA) to describe the sensory attributes of fermented food products prepared with the incorporation of lactic cultures. Panellists were selected and trained to evaluate various attributes, especially color and appearance, body texture, flavor, overall acceptability, and acidity, of fermented food products such as cow milk curd and soymilk curd, idli, sauerkraut, and probiotic ice cream. Principal component analysis (PCA) identified six significant principal components that accounted for more than 90% of the variance in the sensory attribute data. Overall product quality was modelled as a function of the principal components using multiple least squares regression (R² = 0.8). The results from PCA were statistically analyzed by analysis of variance (ANOVA). These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the fermented food product attributes that are important for consumer acceptability.
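A minimal PCA-plus-regression workflow of the kind described (scikit-learn, with random panel scores standing in for the study's sensory data) looks like:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Illustrative panel scores (rows = samples, columns = sensory attributes); not the study's data
rng = np.random.default_rng(5)
attributes = ["color/appearance", "body texture", "flavor", "acidity", "overall acceptability"]
scores = rng.uniform(3, 9, size=(30, len(attributes)))

# PCA on the descriptive attributes (all but the overall score)
pca = PCA(n_components=3)
components = pca.fit_transform(scores[:, :-1])
print("cumulative variance explained:", np.round(pca.explained_variance_ratio_.cumsum(), 2))

# Model overall quality as a function of the principal components by least squares
overall = scores[:, -1]
model = LinearRegression().fit(components, overall)
print("R^2 =", round(model.score(components, overall), 2))
```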
Reinventing the ames test as a quantitative lab that connects classical and molecular genetics.
Goodson-Gregg, Nathan; De Stasio, Elizabeth A
2009-01-01
While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.
Quantitative proteomics in biological research.
Wilm, Matthias
2009-10-01
Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.
Multiscale Modeling for the Analysis for Grain-Scale Fracture Within Aluminum Microstructures
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Phillips, Dawn R.; Yamakov, Vesselin; Saether, Erik
2005-01-01
Multiscale modeling methods for the analysis of metallic microstructures are discussed. Both molecular dynamics and the finite element method are used to analyze crack propagation and stress distribution in a nanoscale aluminum bicrystal model subjected to hydrostatic loading. Quantitative similarity is observed between the results from the two very different analysis methods. A bilinear traction-displacement relationship that may be embedded into cohesive zone finite elements is extracted from the nanoscale molecular dynamics results.
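A bilinear traction-displacement law of the type extracted here can be written as a simple piecewise function; the peak traction and critical openings below are illustrative placeholders, not the values obtained from the molecular dynamics results.

```python
def bilinear_traction(delta, delta_peak=1.0e-10, delta_crit=1.0e-9, t_max=6.0e9):
    """Bilinear cohesive law: traction rises linearly to t_max at delta_peak,
    then degrades linearly to zero at delta_crit (illustrative units: m, Pa)."""
    if delta <= 0:
        return 0.0
    if delta <= delta_peak:
        return t_max * delta / delta_peak                              # elastic loading branch
    if delta <= delta_crit:
        return t_max * (delta_crit - delta) / (delta_crit - delta_peak)  # softening branch
    return 0.0                                                          # fully separated

for opening in (0.0, 5e-11, 1e-10, 5e-10, 1e-9, 2e-9):
    print(f"opening {opening:.1e} m -> traction {bilinear_traction(opening):.2e} Pa")
```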
Benner, W.H.
1984-05-08
An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and for quantitating organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.
Benner, William H.
1986-01-01
An oxygen analyzer which identifies and classifies microgram quantities of oxygen in ambient particulate matter and for quantitating organic oxygen in solvent extracts of ambient particulate matter. A sample is pyrolyzed in oxygen-free nitrogen gas (N2), and the resulting oxygen quantitatively converted to carbon monoxide (CO) by contact with hot granular carbon (C). Two analysis modes are made possible: (1) rapid determination of total pyrolyzable oxygen obtained by decomposing the sample at 1135 °C, or (2) temperature-programmed oxygen thermal analysis obtained by heating the sample from room temperature to 1135 °C as a function of time. The analyzer basically comprises a pyrolysis tube containing a bed of granular carbon under N2, ovens used to heat the carbon and/or decompose the sample, and a non-dispersive infrared CO detector coupled to a mini-computer to quantitate oxygen in the decomposition products and control oven heating.
Fu, Hongbo; Wang, Huadong; Jia, Junwei; Ni, Zhibo; Dong, Fengzhong
2018-01-01
Due to the influence of major elements' self-absorption, scarce observable spectral lines of trace elements, and relative efficiency correction of experimental system, accurate quantitative analysis with calibration-free laser-induced breakdown spectroscopy (CF-LIBS) is in fact not easy. In order to overcome these difficulties, standard reference line (SRL) combined with one-point calibration (OPC) is used to analyze six elements in three stainless-steel and five heat-resistant steel samples. The Stark broadening and Saha–Boltzmann plot of Fe are used to calculate the electron density and the plasma temperature, respectively. In the present work, we tested the original SRL method, the SRL with the OPC method, and the intercept with the OPC method. The final calculation results show that the latter two methods can effectively improve the overall accuracy of quantitative analysis and the detection limits of trace elements.
Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells
Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro
2015-01-01
Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept parallel orientations to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484
The Brain Network for Deductive Reasoning: A Quantitative Meta-analysis of 28 Neuroimaging Studies
Prado, Jérôme; Chadha, Angad; Booth, James R.
2011-01-01
Over the course of the past decade, contradictory claims have been made regarding the neural bases of deductive reasoning. Researchers have been puzzled by apparent inconsistencies in the literature. Some have even questioned the effectiveness of the methodology used to study the neural bases of deductive reasoning. However, the idea that neuroimaging findings are inconsistent is not based on any quantitative evidence. Here, we report the results of a quantitative meta-analysis of 28 neuroimaging studies of deductive reasoning published between 1997 and 2010, combining 382 participants. Consistent areas of activation across studies were identified using the multilevel kernel density analysis method. We found that results from neuroimaging studies are more consistent than what has been previously assumed. Overall, studies consistently report activations in specific regions of a left fronto-parietal system, as well as in the left basal ganglia. This brain system can be decomposed into three subsystems that are specific to particular types of deductive arguments: relational, categorical, and propositional. These dissociations explain inconsistencies in the literature. However, they are incompatible with the notion that deductive reasoning is supported by a single cognitive system relying either on visuospatial or rule-based mechanisms. Our findings provide critical insight into the cognitive organization of deductive reasoning and need to be accounted for by cognitive theories. PMID:21568632
Doshi, Ankur M; Ream, Justin M; Kierans, Andrea S; Bilbily, Matthew; Rusinek, Henry; Huang, William C; Chandarana, Hersh
2016-03-01
The purpose of this study was to determine whether qualitative and quantitative MRI feature analysis is useful for differentiating type 1 from type 2 papillary renal cell carcinoma (PRCC). This retrospective study included 21 type 1 and 17 type 2 PRCCs evaluated with preoperative MRI. Two radiologists independently evaluated various qualitative features, including signal intensity, heterogeneity, and margin. For the quantitative analysis, a radiology fellow and a medical student independently drew 3D volumes of interest over the entire tumor on T2-weighted HASTE images, apparent diffusion coefficient parametric maps, and nephrographic phase contrast-enhanced MR images to derive first-order texture metrics. Qualitative and quantitative features were compared between the groups. For both readers, qualitative features with greater frequency in type 2 PRCC included heterogeneous enhancement, indistinct margin, and T2 heterogeneity (all, p < 0.035). Indistinct margins and heterogeneous enhancement were independent predictors (AUC, 0.822). Quantitative analysis revealed that apparent diffusion coefficient, HASTE, and contrast-enhanced entropy were greater in type 2 PRCC (p < 0.05; AUC, 0.682-0.716). A combined quantitative and qualitative model had an AUC of 0.859. Qualitative features within the model had interreader concordance of 84-95%, and the quantitative data had intraclass coefficients of 0.873-0.961. Qualitative and quantitative features can help discriminate between type 1 and type 2 PRCC. Quantitative analysis may capture useful information that complements the qualitative appearance while benefiting from high interobserver agreement.
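First-order entropy, one of the texture metrics used above, is the Shannon entropy of the intensity histogram within the volume of interest; a minimal sketch on synthetic ROI values:

```python
import numpy as np

def first_order_entropy(roi_values, bins=64):
    """Shannon entropy of the intensity histogram inside a volume of interest.
    Higher entropy reflects a more heterogeneous intensity distribution."""
    hist, _ = np.histogram(roi_values, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Illustrative: a homogeneous vs. a heterogeneous synthetic tumor VOI
rng = np.random.default_rng(6)
homogeneous = rng.normal(100, 5, 5000)                                   # narrow intensity spread
heterogeneous = np.r_[rng.normal(80, 20, 2500), rng.normal(140, 25, 2500)]  # bimodal, wide spread
print(round(first_order_entropy(homogeneous), 2),
      round(first_order_entropy(heterogeneous), 2))
```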
On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment
NASA Astrophysics Data System (ADS)
Guterres, Rui M.
The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is qualitatively and quantitatively studied. Its impact on the accuracy of the computations as well as the wall drag incompatibility with the theoretical model followed are discussed. The newly developed quantitative analysis provides significantly increased accuracy. The aerodynamic drag coefficient is computed within one percent of the balance-measured value for the best cases.
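The core of a quantitative wake analysis is integration of the momentum deficit over the survey plane; the sketch below shows a simplified incompressible form on a synthetic wake and omits the pressure and cross-flow terms that the full derivation in the dissertation would include.

```python
import numpy as np

def wake_drag(u, u_inf, rho, dy, dz):
    """Profile drag from a wake-survey momentum deficit (simplified, incompressible):
    D = rho * sum( u * (u_inf - u) ) * dy * dz over the survey grid.
    The static-pressure and cross-flow terms of a full wake analysis are omitted."""
    return rho * np.sum(u * (u_inf - u)) * dy * dz

# Synthetic wake plane: a Gaussian velocity deficit behind a bluff body (illustrative)
y, z = np.meshgrid(np.linspace(-0.5, 0.5, 101), np.linspace(-0.5, 0.5, 101))
u_inf, rho = 30.0, 1.2                         # freestream speed (m/s), air density (kg/m^3)
u = u_inf * (1 - 0.3 * np.exp(-(y**2 + z**2) / 0.02))
dy = dz = 1.0 / 100                            # grid spacing (m)

drag = wake_drag(u, u_inf, rho, dy, dz)
cd = drag / (0.5 * rho * u_inf**2 * 0.05)      # assumed 0.05 m^2 reference area
print(f"D = {drag:.1f} N, Cd = {cd:.3f}")
```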
Boyle, Rebecca R; McLean, Stuart; Brandon, Sue; Pass, Georgia J; Davies, Noel W
2002-11-25
We have developed two solid-phase microextraction (SPME) methods, coupled with gas chromatography, for quantitatively analysing the major Eucalyptus leaf terpene, 1,8-cineole, in both expired air and blood from the common brushtail possum (Trichosurus vulpecula). In-line SPME sampling (5 min at 20 degrees C room temperature) of excurrent air from an expiratory chamber containing a possum dosed orally with 1,8-cineole (50 mg/kg) allowed real-time semi-quantitative measurements reflecting 1,8-cineole blood concentrations. Headspace SPME using 50 microl whole blood collected from possums dosed orally with 1,8-cineole (30 mg/kg) resulted in excellent sensitivity (quantitation limit 1 ng/ml) and reproducibility. Blood concentrations ranged between 1 and 1380 ng/ml. Calibration curves were prepared for two concentration ranges (0.05-10 and 10-400 ng/50 microl) for the analysis of blood concentrations. Both calibration curves were linear (r(2)=0.999 and 0.994, respectively) and the equations for the two concentration ranges were consistent. Copyright 2002 Elsevier Science B.V.
Zhang, Qinnan; Zhong, Liyun; Tang, Ping; Yuan, Yingjie; Liu, Shengde; Tian, Jindong; Lu, Xiaoxu
2017-05-31
The cell refractive index, an intrinsic optical parameter, is closely correlated with intracellular mass and concentration. By combining optical phase-shifting interferometry (PSI) and atomic force microscope (AFM) imaging, we constructed a label-free, non-invasive, quantitative single-cell refractive index measurement system, in which the accurate phase map of a single cell is retrieved with the PSI technique and the cell morphology with nanoscale resolution is obtained with AFM imaging. Based on the proposed AFM/PSI system, we obtained quantitative refractive index distributions of a single red blood cell and a Jurkat cell, respectively. Further, the quantitative change of the refractive index distribution during Daunorubicin (DNR)-induced Jurkat cell apoptosis was presented, and the content changes of intracellular biochemical components were derived. Importantly, these results were consistent with Raman spectral analysis, indicating that the proposed PSI/AFM-based refractive index system is likely to become a useful tool for the analysis of intracellular biochemical components, which will facilitate its application in revealing cell structure and pathological state from a new perspective.
Wang, Du; Zhang, Zhaowei; Li, Peiwu; Zhang, Qi; Zhang, Wen
2016-07-14
Rapid and quantitative sensing of aflatoxin B1 with high sensitivity and specificity has drawn increasing attention in studies of soybean sauce. A sensitive and rapid quantitative immunochromatographic sensing method was developed for the detection of aflatoxin B1 based on time-resolved fluorescence. It combines the advantages of time-resolved fluorescent sensing and immunochromatography. The dynamic range of the competitive and portable immunoassay was 0.3-10.0 µg·kg⁻¹, with a limit of detection (LOD) of 0.1 µg·kg⁻¹ and recoveries of 87.2%-114.3%, within 10 min. The results showed good correlation (R² > 0.99) between the time-resolved fluorescent immunochromatographic strip test and high-performance liquid chromatography (HPLC). Soybean sauce samples analyzed using the time-resolved fluorescent immunochromatographic strip test revealed that 64.2% of samples contained aflatoxin B1 at levels ranging from 0.31 to 12.5 µg·kg⁻¹. The strip test is a rapid, sensitive, quantitative, and cost-effective on-site screening technique in food safety analysis.
NASA Astrophysics Data System (ADS)
Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent
2017-03-01
Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging features extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model(GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging features extraction tools allow the user to collect imaging features and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
In silico quantitative structure-toxicity relationship study of aromatic nitro compounds.
Pasha, Farhan Ahmad; Neaz, Mohammad Morshed; Cho, Seung Joo; Ansari, Mohiuddin; Mishra, Sunil Kumar; Tiwari, Sharvan
2009-05-01
Small molecules often have toxicities that are a function of molecular structural features, and minor variations in these features can make a large difference in toxicity. Consequently, in silico techniques may be used to correlate such molecular toxicities with structural features. For nine different sets of aromatic nitro compounds with known observed toxicities against different targets, we developed ligand-based 2D quantitative structure-toxicity relationship models using 20 selected topological descriptors. Topological descriptors have several advantages, such as conformational independence and facile, less time-consuming computation, while still yielding good results. Multiple linear regression analysis was used to correlate variations in toxicity with molecular properties. The information index on molecular size, the lopping centric index and the Kier flexibility index were identified as fundamental descriptors for different kinds of toxicity, which further showed that molecular size, branching and molecular flexibility might be particularly important factors in quantitative structure-toxicity relationship analysis. This study revealed that topological descriptor-guided quantitative structure-toxicity relationship modeling provides a useful, cost- and time-efficient in silico tool for describing small-molecule toxicities.
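A minimal sketch of the regression step described above, assuming a hypothetical descriptor table rather than the paper's data set:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# hypothetical table: rows = nitroaromatic compounds, columns = topological descriptors
# (e.g. information index on molecular size, lopping centric index, Kier flexibility index)
X = np.array([
    [3.1, 0.42, 2.8],
    [3.6, 0.55, 3.1],
    [4.0, 0.61, 3.9],
    [4.4, 0.70, 4.2],
    [4.9, 0.77, 5.0],
    [5.3, 0.85, 5.6],
])
y = np.array([2.10, 2.45, 2.90, 3.15, 3.60, 3.95])  # observed toxicity, e.g. -log(LC50)

# ordinary multiple linear regression of toxicity on the descriptors
model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)
print("r^2:", model.score(X, y))
```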
[Scanning electron microscope observation and image quantitative analysis of Hippocampi].
Zhang, Z; Pu, Z; Xu, L; Xu, G; Wang, Q; Xu, G; Wu, L; Chen, J
1998-12-01
The "scale-like projects" on the derma of 3 species of Hippocampi, H. kuda Bleerer, H. trimaculatus Leach and H. japonicus Kaup were observed by scanning electron microscope (SEM). Results showed that some characteristics such us size, shape and type of arrangement of the "scale-like projects" can be considered as the evidence for microanalysis. Image quantitative analysis of the "scale-like project" was carried out on 45 pieces of photograph using area, long diameter, short diameter and shape factor as parameters. No difference among the different parts of the same species was observed, but significant differences were found among the above 3 species.
Li, Junjie; Zhang, Weixia; Chung, Ting-Fung; Slipchenko, Mikhail N.; Chen, Yong P.; Cheng, Ji-Xin; Yang, Chen
2015-01-01
We report a transient absorption (TA) imaging method for fast visualization and quantitative layer analysis of graphene and graphene oxide (GO). Forward and backward imaging of graphene on various substrates under ambient conditions was performed at a speed of 2 μs per pixel. The TA intensity increased linearly with the layer number of graphene. Real-time TA imaging of GO was demonstrated in vitro, with the capability of quantitative analysis of intracellular concentration, and ex vivo in circulating blood. These results suggest that TA microscopy is a valid tool for the study of graphene-based materials. PMID:26202216
Investment appraisal using quantitative risk analysis.
Johansson, Henrik
2002-07-01
Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of the Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed in terms of a Risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
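The core of a risk-adjusted NPV calculation of this kind can be sketched as follows; all figures (investment cost, expected annual losses, discount rate, horizon) are illustrative assumptions, not values from the case study:

```python
def risk_adjusted_npv(investment, annual_risk_before, annual_risk_after,
                      discount_rate, horizon_years):
    """NPV of investing in a fire safety system: discounted sum of the yearly
    reduction in expected fire loss, minus the up-front investment cost."""
    annual_benefit = annual_risk_before - annual_risk_after  # expected loss reduction per year
    pv_benefits = sum(annual_benefit / (1 + discount_rate) ** t
                      for t in range(1, horizon_years + 1))
    return pv_benefits - investment

# illustrative numbers only
print(risk_adjusted_npv(investment=250_000,
                        annual_risk_before=90_000,   # expected annual fire loss without the system
                        annual_risk_after=40_000,    # expected annual fire loss with the system
                        discount_rate=0.05,
                        horizon_years=15))
```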
Comparison and evaluation of fusion methods used for GF-2 satellite image in coastal mangrove area
NASA Astrophysics Data System (ADS)
Ling, Chengxing; Ju, Hongbo; Liu, Hua; Zhang, Huaiqing; Sun, Hua
2018-04-01
The GF-2 satellite has the highest spatial resolution of any remote sensing satellite in the history of China's satellite development. In this study, three traditional fusion methods, Brovey, Gram-Schmidt and Color Normalized (CN), were compared with a newer fusion method, NNDiffuse, using qualitative assessment and quantitative fusion quality indices, including information entropy, variance, mean gradient, deviation index and spectral correlation coefficient. The analysis results show that the NNDiffuse method performed best in both the qualitative and quantitative analyses, making it more effective for subsequent remote sensing information extraction and for forest and wetland resource monitoring applications.
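For reference, a few of the listed quality indices can be computed directly from the image arrays; a hedged numpy sketch with placeholder data (not the GF-2 imagery):

```python
import numpy as np

def information_entropy(img, bins=256):
    """Shannon entropy of an 8-bit image band (bits per pixel)."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def spectral_correlation(fused, reference):
    """Correlation coefficient between a fused band and the reference band."""
    return np.corrcoef(fused.ravel(), reference.ravel())[0, 1]

def mean_gradient(img):
    """Average gradient magnitude, a common sharpness measure for fused images."""
    gy, gx = np.gradient(img.astype(float))
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))

rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(128, 128)).astype(float)
fused = np.clip(reference + rng.normal(0, 10, size=reference.shape), 0, 255)
print(information_entropy(fused), spectral_correlation(fused, reference), mean_gradient(fused))
```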
Quantitative proteomic analysis of intact plastids.
Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki
2014-01-01
Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.
Smile line assessment comparing quantitative measurement and visual estimation.
Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie
2011-02-01
Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
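Agreement statistics of the kind reported here are straightforward to reproduce; a small sketch with invented grade vectors (scikit-learn's Cohen's kappa, with a weighted variant that is often preferred for ordered grades):

```python
from sklearn.metrics import cohen_kappa_score

# hypothetical smile-line grades from two raters on a 3-grade scale (1=low, 2=average, 3=high)
rater_a = [1, 2, 2, 3, 3, 2, 1, 3, 2, 2, 3, 1]
rater_b = [1, 2, 3, 3, 3, 2, 1, 2, 2, 2, 3, 1]

print("unweighted kappa:", cohen_kappa_score(rater_a, rater_b))
# weighted kappa accounts for the ordering of the grades
print("linear-weighted kappa:", cohen_kappa_score(rater_a, rater_b, weights="linear"))
```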
Real-time quantitative analysis of H2, He, O2, and Ar by quadrupole ion trap mass spectrometry.
Ottens, Andrew K; Harrison, W W; Griffin, Timothy P; Helms, William R
2002-09-01
The use of a quadrupole ion trap mass spectrometer (QITMS) for quantitative analysis of hydrogen and helium as well as of other permanent gases is demonstrated. Like commercial instruments, the customized QITMS uses mass selective instability; however, this instrument operates at a greater trapping frequency and without a buffer gas. Thus, a useable mass range from 2 to over 50 daltons (Da) is achieved. The performance of the ion trap is evaluated using part-per-million (ppm) concentrations of hydrogen, helium, oxygen, and argon mixed into a nitrogen gas stream, as outlined by the National Aeronautics and Space Administration (NASA), which is interested in monitoring for cryogenic fuel leaks within the Space Shuttle during launch preparations. When quantitating the four analytes, relative accuracy and precision were better than the NASA-required minimum of 10% error and 5% deviation, respectively. Limits of detection were below the NASA requirement of 25-ppm hydrogen and 100-ppm helium; those for oxygen and argon were within the same order of magnitude as the requirements. These results were achieved at a fast data recording rate, and demonstrate the utility of the QITMS as a real-time quantitative monitoring device for permanent gas analysis. c. 2002 American Society for Mass Spectrometry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiefel, Denis, E-mail: Denis.Kiefel@airbus.com; Stoessel, Rainer, E-mail: Rainer.Stoessel@airbus.com; Grosse, Christian, E-mail: Grosse@tum.de
2015-03-31
In recent years, an increasing number of safety-relevant structures are designed and manufactured from carbon fiber reinforced polymers (CFRP) in order to reduce the weight of airplanes by taking advantage of their specific strength. Non-destructive testing (NDT) methods for quantitative defect analysis of damage include liquid- or air-coupled ultrasonic testing (UT), phased array ultrasonic techniques, and active thermography (IR). The advantage of these testing methods is their applicability to large areas. However, their quantitative information is often limited to impact localization and size. In addition to these techniques, Airbus Group Innovations operates a micro x-ray computed tomography (μ-XCT) system, which was developed for CFRP characterization. It is an open system which allows different kinds of acquisition, reconstruction, and data evaluation. One main advantage of this μ-XCT system is its high resolution together with 3-dimensional analysis and visualization opportunities, which makes it possible to gain important quantitative information for composite part design and stress analysis. Within this study, different NDT methods are compared on CFRP samples with specified artificial impact damages. The results can be used to select the most suitable NDT method for specific application cases. Furthermore, novel evaluation and visualization methods for impact analyses are developed and presented.
78 FR 43838 - Airworthiness Directives; Hamilton Sundstrand Corporation Propellers
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-22
... qualitative risk assessment. The data gathered were then used for a more representative quantitative risk analysis. The results from the bond strength tests predict a significantly lower fleet risk than the prior qualitative analysis. Accordingly, we withdraw the...
ERIC Educational Resources Information Center
Misra, Anjali; Schloss, Patrick J.
1989-01-01
The critical analysis of 23 studies using respondent techniques for the reduction of excessive emotional reactions in school children focuses on research design, dependent variables, independent variables, component analysis, and demonstrations of generalization and maintenance. Results indicate widespread methodological flaws that limit the…
Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.
2015-01-01
PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895
Testicular Dysgenesis Syndrome and the Estrogen Hypothesis: A Quantitative Meta-Analysis
Martin, Olwenn V.; Shialis, Tassos; Lester, John N.; Scrimshaw, Mark D.; Boobis, Alan R.; Voulvoulis, Nikolaos
2008-01-01
Background Male reproductive tract abnormalities such as hypospadias and cryptorchidism, and testicular cancer have been proposed to comprise a common syndrome together with impaired spermatogenesis with a common etiology resulting from the disruption of gonadal development during fetal life, the testicular dysgenesis syndrome (TDS). The hypothesis that in utero exposure to estrogenic agents could induce these disorders was first proposed in 1993. The only quantitative summary estimate of the association between prenatal exposure to estrogenic agents and testicular cancer was published over 10 years ago, and other systematic reviews of the association between estrogenic compounds, other than the potent pharmaceutical estrogen diethylstilbestrol (DES), and TDS end points have remained inconclusive. Objectives We conducted a quantitative meta-analysis of the association between the end points related to TDS and prenatal exposure to estrogenic agents. Inclusion in this analysis was based on mechanistic criteria, and the plausibility of an estrogen receptor (ER)-α–mediated mode of action was specifically explored. Results We included in this meta-analysis eight studies investigating the etiology of hypospadias and/or cryptorchidism that had not been identified in previous systematic reviews. Four additional studies of pharmaceutical estrogens yielded a statistically significant updated summary estimate for testicular cancer. Conclusions The doubling of the risk ratios for all three end points investigated after DES exposure is consistent with a shared etiology and the TDS hypothesis but does not constitute evidence of an estrogenic mode of action. Results of the subset analyses point to the existence of unidentified sources of heterogeneity between studies or within the study population. PMID:18288311
Quantitative learning strategies based on word networks
NASA Astrophysics Data System (ADS)
Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng
2018-02-01
Learning English requires considerable effort, but the way vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and further investigate strategies in English learning. In this paper, quantitative English learning strategies based on a word network and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected, already-learned words, the learning weights of the unlearned words and the dynamic updating of the network are studied and analyzed. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and the segmented strategies outperform the other strategies. The results provide opportunities for researchers and practitioners to reconsider English teaching and to design vocabularies quantitatively by balancing efficiency and learning costs based on the word network.
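A minimal sketch of such a strategy (networkx; the weighting rule below is a plausible simplification of the description, not the paper's exact formula): each unlearned word is scored by its own frequency plus the frequency of already-learned neighbours, and the ranking is refreshed after every newly learned word.

```python
import networkx as nx

# toy word co-occurrence network with corpus frequencies as node attributes
freq = {"the": 100, "cat": 20, "sat": 15, "mat": 8, "quantum": 2}
G = nx.Graph()
G.add_nodes_from((w, {"freq": f}) for w, f in freq.items())
G.add_edges_from([("the", "cat"), ("cat", "sat"), ("sat", "mat"), ("the", "mat")])

learned = {"the"}

def learning_weight(word):
    """Priority of an unlearned word: its frequency plus the frequency of
    already-learned neighbours (an assumed, simplified weighting rule)."""
    neighbour_bonus = sum(G.nodes[n]["freq"] for n in G.neighbors(word) if n in learned)
    return G.nodes[word]["freq"] + neighbour_bonus

while len(learned) < G.number_of_nodes():
    # dynamically re-rank the remaining words after each word is learned
    next_word = max((w for w in G if w not in learned), key=learning_weight)
    learned.add(next_word)
    print("learn:", next_word)
```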
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sciancalepore, Corrado, E-mail: corrado.sciancalepore@unimore.it; Bondioli, Federica; INSTM Consortium, Via G. Giusti 9, 51121 Firenze
2015-02-15
An innovative preparation procedure, based on microwave-assisted non-hydrolytic sol–gel synthesis, to obtain spherical magnetite nanoparticles is reported together with a detailed quantitative phase analysis and microstructure characterization of the synthetic products. The nanoparticle growth was analyzed as a function of the synthesis time and was described in terms of crystallization degree, employing the Rietveld method on the magnetic nanostructured system for the determination of the amorphous content, using hematite as internal standard. Product crystallinity increases as the microwave thermal treatment is extended and reaches very high percentages for synthesis times longer than 1 h. The microstructural evolution of the nanocrystals was followed by integral breadth methods to obtain information on the crystallite size-strain distribution. The results of the diffraction line profile analysis were compared with the nanoparticle grain distribution estimated by dimensional analysis of transmission electron microscopy (TEM) images. A variation both in the average grain size and in the distribution of the coherently diffracting domains is evidenced, suggesting a relationship between the two quantities. The traditional integral breadth methods have proven valid for a rapid assessment of diffraction line broadening effects in the above-mentioned nanostructured systems, and the basic assumptions for the correct use of these methods are discussed as well. - Highlights: • Fe3O4 nanocrystals were obtained by MW-assisted non-hydrolytic sol–gel synthesis. • Quantitative phase analysis revealed that crystallinity up to 95% was reached. • The strategy of Rietveld refinements was discussed in detail. • Dimensional analysis showed nanoparticles ranging from 4 to 8 nm. • Results of integral breadth methods were compared with microscopic analysis.
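The integral breadth route mentioned in the highlights commonly reduces to a Scherrer-type relation, D = Kλ/(β cos θ), for the volume-weighted domain size; a hedged sketch with illustrative numbers (not the paper's measurements):

```python
import numpy as np

def crystallite_size_nm(beta_deg, two_theta_deg, wavelength_nm=0.15406, K=1.0):
    """Volume-weighted crystallite size from the integral breadth beta (in degrees 2-theta)
    via D = K * lambda / (beta * cos(theta)); the wavelength defaults to Cu K-alpha."""
    beta = np.radians(beta_deg)             # integral breadth in radians
    theta = np.radians(two_theta_deg / 2)   # Bragg angle
    return K * wavelength_nm / (beta * np.cos(theta))

# illustrative magnetite (311) reflection near 2-theta ~ 35.5 deg with 1.2 deg integral breadth
print(f"D = {crystallite_size_nm(1.2, 35.5):.1f} nm")  # of the order of a few nanometres
```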
NASA Astrophysics Data System (ADS)
Mehta, Shalin B.; Sheppard, Colin J. R.
2010-05-01
Various methods that use a large illumination aperture (i.e. partially coherent illumination) have been developed for making transparent (i.e. phase) specimens visible. These methods were developed to provide qualitative contrast rather than quantitative measurement; coherent illumination has been relied upon for quantitative phase analysis. Partially coherent illumination has some important advantages over coherent illumination and can be used for measurement of the specimen's phase distribution. However, quantitative analysis and image computation in partially coherent systems have not been explored fully due to the lack of a general, physically insightful and computationally efficient model of image formation. We have developed a phase-space model that satisfies these requirements. In this paper, we employ this model (called the phase-space imager) to elucidate five different partially coherent systems mentioned in the title. We compute images of an optical fiber under these systems and verify some of them with experimental images. These results and simulated images of a general phase profile are used to compare the contrast and the resolution of the imaging systems. We show that, for quantitative phase imaging of a thin specimen with matched illumination, differential phase contrast offers linear transfer of specimen information to the image. We also show that the edge enhancement properties of spiral phase contrast are compromised significantly as the coherence of illumination is reduced. The results demonstrate that the phase-space imager model provides a useful framework for analysis, calibration, and design of partially coherent imaging methods.
Park, Eun-Ah; Goo, Jin Mo; Park, Sang Joon; Lee, Hyun Ju; Lee, Chang Hyun; Park, Chang Min; Yoo, Chul-Gyu; Kim, Jong Hyo
2010-09-01
To evaluate the potential of xenon ventilation computed tomography (CT) in the quantitative and visual analysis of chronic obstructive pulmonary disease (COPD). This study was approved by the institutional review board. After informed consent was obtained, 32 patients with COPD underwent CT performed before the administration of xenon, two-phase xenon ventilation CT with wash-in (WI) and wash-out (WO) periods, and pulmonary function testing (PFT). For quantitative analysis, results of PFT were compared with attenuation parameters from prexenon images and xenon parameters from xenon-enhanced images in the following three areas at each phase: whole lung, lung with normal attenuation, and low-attenuating lung (LAL). For visual analysis, ventilation patterns were categorized according to the pattern of xenon attenuation in the area of structural abnormalities compared with that in the normal-looking background on a per-lobe basis: pattern A consisted of isoattenuation or high attenuation in the WI period and isoattenuation in the WO period; pattern B, isoattenuation or high attenuation in the WI period and high attenuation in the WO period; pattern C, low attenuation in both the WI and WO periods; and pattern D, low attenuation in the WI period and isoattenuation or high attenuation in the WO period. Among various attenuation and xenon parameters, xenon parameters of the LAL in the WO period showed the best inverse correlation with results of PFT (P < .0001). At visual analysis, while emphysema (which affected 99 lobes) commonly showed pattern A or B, airway diseases such as obstructive bronchiolitis (n = 5) and bronchiectasis (n = 2) and areas with a mucus plug (n = 1) or centrilobular nodules (n = 5) showed pattern D or C. WI and WO xenon ventilation CT is feasible for the simultaneous regional evaluation of structural and ventilation abnormalities both quantitatively and qualitatively in patients with COPD. (c) RSNA, 2010.
Xu, Y.; Xia, J.; Miller, R.D.
2006-01-01
Multichannel analysis of surface waves is a developing method widely used in shallow subsurface investigations. The field procedures and related parameters are very important for successful applications. Among these parameters, the source-receiver offset range is seldom discussed in theory and is normally determined by empirical or semi-quantitative methods in current practice. This paper discusses the problem from a theoretical perspective. A formula for quantitative evaluation based on a layered homogeneous elastic model was developed. The analytical results based on simple models and experimental data demonstrate that the formula is correct for surface wave surveys in near-surface applications. © 2005 Elsevier B.V. All rights reserved.
A color prediction model for imagery analysis
NASA Technical Reports Server (NTRS)
Skaley, J. E.; Fisher, J. R.; Hardy, E. E.
1977-01-01
A simple model has been devised to selectively construct several points within a scene using multispectral imagery. The model correlates black-and-white density values to color components of diazo film so as to maximize the color contrast of two or three points per composite. The CIE (Commission Internationale de l'Eclairage) color coordinate system is used as a quantitative reference to locate these points in color space. Superimposed on this quantitative reference is a perceptional framework which functionally contrasts color values in a psychophysical sense. This methodology permits a more quantitative approach to the manual interpretation of multispectral imagery while resulting in improved accuracy and lower costs.
Dynamic calibration approach for determining catechins and gallic acid in green tea using LC-ESI/MS.
Bedner, Mary; Duewer, David L
2011-08-15
Catechins and gallic acid are antioxidant constituents of Camellia sinensis, or green tea. Liquid chromatography with both ultraviolet (UV) absorbance and electrospray ionization mass spectrometric (ESI/MS) detection was used to determine catechins and gallic acid in three green tea matrix materials that are commonly used as dietary supplements. The results from both detection modes were evaluated with 14 quantitation models, all of which were based on the analyte response relative to an internal standard. Half of the models were static, where quantitation was achieved with calibration factors that were constant over an analysis set. The other half were dynamic, with calibration factors calculated from interpolated response factor data at each time a sample was injected to correct for potential variations in analyte response over time. For all analytes, the relatively nonselective UV responses were found to be very stable over time and independent of the calibrant concentration; comparable results with low variability were obtained regardless of the quantitation model used. Conversely, the highly selective MS responses were found to vary both with time and as a function of the calibrant concentration. A dynamic quantitation model based on polynomial data-fitting was used to reduce the variability in the quantitative results using the MS data.
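The dynamic scheme described, interpolating the internal-standard-relative response factor over the injection sequence, might be sketched as follows (numpy; the injection times and response factors are invented):

```python
import numpy as np

# hypothetical calibrant injections: time (min into the sequence) and measured
# response factor = (analyte area / internal standard area) / calibrant concentration
cal_times = np.array([0.0, 60.0, 120.0, 180.0, 240.0])
cal_rf    = np.array([0.95, 0.90, 0.84, 0.80, 0.78])  # MS response drifting downward

# dynamic calibration: fit a low-order polynomial to the response factor vs. time
rf_poly = np.polynomial.Polynomial.fit(cal_times, cal_rf, deg=2)

def quantify(sample_time, area_ratio):
    """Concentration of a sample injected at sample_time, using the response
    factor interpolated (here: polynomial-fitted) at that injection time."""
    return area_ratio / rf_poly(sample_time)

# sample injected at 150 min with analyte/internal-standard area ratio of 0.41
print(f"concentration ~ {quantify(150.0, 0.41):.3f} (in calibrant concentration units)")
```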
2014-01-01
Background Inflammatory mediators can serve as biomarkers for the monitoring of the disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the level of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR (R) detection system (fluorescence based) rather than chemiluminescence and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity up to 17 pg/ml and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily to the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797
Sajnóg, Adam; Hanć, Anetta; Barałkiewicz, Danuta
2018-05-15
Analysis of clinical specimens by imaging techniques allows the content and distribution of trace elements on the surface of the examined sample to be determined. In order to obtain reliable results, the developed procedure should be based not only on properly prepared samples and properly performed calibration; it is also necessary to carry out all phases of the procedure in accordance with the principles of chemical metrology, whose main pillars are the use of validated analytical methods, establishing the traceability of the measurement results, and the estimation of uncertainty. This review paper discusses aspects related to the sampling, preparation and analysis of clinical samples by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), with emphasis on metrological aspects, i.e. selected validation parameters of the analytical method, the traceability of the measurement result and the uncertainty of the result. This work promotes the introduction of metrology principles into chemical measurement, with emphasis on LA-ICP-MS, a comparative method that requires a studious approach to the development of the analytical procedure in order to acquire reliable quantitative results. Copyright © 2018 Elsevier B.V. All rights reserved.
Namkoong, Sun; Hong, Seung Phil; Kim, Myung Hwa; Park, Byung Cheol
2013-02-01
Nowadays, although its clinical value remains controversial, institutions utilize hair mineral analysis. Arguments about the reliability of hair mineral analysis persist, and there have been evaluations of commercial laboratories performing it. The objective of this study was to assess the reliability of intra-laboratory and inter-laboratory data at three commercial laboratories conducting hair mineral analysis, compared to serum mineral analysis. Two divided hair samples taken from near the scalp of one healthy volunteer were submitted to all laboratories for analysis at the same time. Each laboratory sent a report consisting of quantitative results and their interpretation of the health implications. Differences among intra-laboratory and inter-laboratory data were analyzed using SPSS version 12.0 (SPSS Inc., USA). All the laboratories used identical methods for quantitative analysis, and they generated consistent numerical results according to Friedman analysis of variance. However, the normal reference ranges of the laboratories varied; as such, each laboratory interpreted the patient's health differently. On intra-laboratory data, Wilcoxon analysis suggested that the laboratories generated relatively coherent data, except that laboratory B failed to do so for one element, so its reliability was doubtful. In comparison with the blood test, laboratory C generated identical results, but laboratories A and B did not. Hair mineral analysis has its limitations, considering the reliability of inter- and intra-laboratory analysis compared with blood analysis. As such, clinicians should be cautious when applying hair mineral analysis as an ancillary tool, and each laboratory included in this study requires continuous refinement toward standardized normal reference levels.
A Study of Cognitive Load for Enhancing Student’s Quantitative Literacy in Inquiry Lab Learning
NASA Astrophysics Data System (ADS)
Nuraeni, E.; Rahman, T.; Alifiani, D. P.; Khoerunnisa, R. S.
2017-09-01
Students often find it difficult to appreciate the relevance of quantitative analysis and concept attainment in the science class. This study measured student cognitive load during an inquiry lab on the respiratory system designed to improve quantitative literacy. The participants were 40 11th graders from a senior high school in Indonesia. After the students learned, their feelings about the degree of mental effort it took to complete the learning tasks were measured by a 28-item self-report on a 4-point Likert scale. A Task Complexity Worksheet was used to assess the processing of quantitative information, and a paper-based test was applied to assess the participants' concept achievement. The results showed that the inquiry instruction induced relatively low mental effort, high information processing and high concept achievement.
Chardon, Jurgen; Swart, Arno
2016-07-01
In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
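The statistical step described, selecting a distribution for a survey variable and bootstrapping its uncertainty, might look like the following sketch (scipy; the storage-time data are synthetic, not survey responses):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# hypothetical survey responses: reported home storage time of a product, in days
storage_days = rng.gamma(shape=2.0, scale=1.5, size=300)

# fit a candidate distribution (here gamma, location fixed at 0) by maximum likelihood
shape, loc, scale = stats.gamma.fit(storage_days, floc=0)
print(f"point estimate: shape={shape:.2f}, scale={scale:.2f}")

# nonparametric bootstrap to describe parameter uncertainty
boot = []
for _ in range(500):
    resample = rng.choice(storage_days, size=storage_days.size, replace=True)
    s, _, sc = stats.gamma.fit(resample, floc=0)
    boot.append((s, sc))
boot = np.array(boot)
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
print(f"95% interval shape: {lo[0]:.2f}-{hi[0]:.2f}, scale: {lo[1]:.2f}-{hi[1]:.2f}")
```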
Feasibility of high-resolution quantitative perfusion analysis in patients with heart failure.
Sammut, Eva; Zarinabad, Niloufar; Wesolowski, Roman; Morton, Geraint; Chen, Zhong; Sohal, Manav; Carr-White, Gerry; Razavi, Reza; Chiribiri, Amedeo
2015-02-12
Cardiac magnetic resonance (CMR) is playing an expanding role in the assessment of patients with heart failure (HF). The assessment of myocardial perfusion status in HF can be challenging due to left ventricular (LV) remodelling and wall thinning, coexistent scar and respiratory artefacts. The aim of this study was to assess the feasibility of quantitative CMR myocardial perfusion analysis in patients with HF. A group of 58 patients with heart failure (HF; left ventricular ejection fraction, LVEF ≤ 50%) and 33 patients with normal LVEF (LVEF >50%), referred for suspected coronary artery disease, were studied. All subjects underwent quantitative first-pass stress perfusion imaging using adenosine according to standard acquisition protocols. The feasibility of quantitative perfusion analysis was then assessed using high-resolution, 3 T kt perfusion and voxel-wise Fermi deconvolution. 30/58 (52%) subjects in the HF group had underlying ischaemic aetiology. Perfusion abnormalities were seen amongst patients with ischaemic HF and patients with normal LV function. No regional perfusion defect was observed in the non-ischaemic HF group. Good agreement was found between visual and quantitative analysis across all groups. Absolute stress perfusion rate, myocardial perfusion reserve (MPR) and endocardial-epicardial MPR ratio identified areas with abnormal perfusion in the ischaemic HF group (p = 0.02; p = 0.04; p = 0.02, respectively). In the Normal LV group, MPR and endocardial-epicardial MPR ratio were able to distinguish between normal and abnormal segments (p = 0.04; p = 0.02 respectively). No significant differences of absolute stress perfusion rate or MPR were observed comparing visually normal segments amongst groups. Our results demonstrate the feasibility of high-resolution voxel-wise perfusion assessment in patients with HF.
Sayet, G; Sinegre, M; Ben Reguiga, M
2014-01-01
The antibiotic lock technique maintains catheter sterility in high-risk patients on long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. In order to ensure patient safety and to comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these 4 antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare the results to HPLC or immunochemistry assays. Prevalidation studies permitted assessment of the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115 cm(-1) and 208-224 nm), vancomycin (1222-1240 cm(-1) and 276-280 nm), and teicoplanin (1226-1230 cm(-1) and 278-282 nm). Gentamicin was quantified with FTIR only (1045-1169 cm(-1) and 2715-2850 cm(-1)) due to interference in the UV domain from parabens, preservatives present in the commercial brand used to prepare the locks. For all antibiotic locks, the method was linear (R(2)=0.996 to 0.999), accurate, repeatable (intraday RSD%: 2.9 to 7.1%; inter-day RSD%: 2.9 to 5.1%) and precise. Compared to the reference methods, the FTIR/UV method was tightly correlated (Pearson factor: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple and reliable analysis technique for antibiotic quantitation in locks using an original association of FTIR and UV analysis, allowing a short analysis time to identify and quantify the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
Wen, Meiling; Jin, Ya; Manabe, Takashi; Chen, Shumin; Tan, Wen
2017-12-01
MS identification has long been used for PAGE-separated protein bands, but global and systematic quantitation utilizing MS after PAGE has remained rare and not been reported for native PAGE. Here we reported on a new method combining native PAGE, whole-gel slicing and quantitative LC-MS/MS, aiming at comparative analysis on not only abundance, but also structures and interactions of proteins. A pair of human plasma and serum samples were used as test samples and separated on a native PAGE gel. Six lanes of each sample were cut, each lane was further sliced into thirty-five 1.1 mm × 1.1 mm squares and all the squares were subjected to standardized procedures of in-gel digestion and quantitative LC-MS/MS. The results comprised 958 data rows that each contained abundance values of a protein detected in one square in eleven gel lanes (one plasma lane excluded). The data were evaluated to have satisfactory reproducibility of assignment and quantitation. Totally 315 proteins were assigned, with each protein assigned in 1-28 squares. The abundance distributions in the plasma and serum gel lanes were reconstructed for each protein, named as "native MS-electropherograms". Comparison of the electropherograms revealed significant plasma-versus-serum differences on 33 proteins in 87 squares (fold difference > 2 or < 0.5, p < 0.05). Many of the differences matched with accumulated knowledge on protein interactions and proteolysis involved in blood coagulation, complement and wound healing processes. We expect this method would be useful to provide more comprehensive information in comparative proteomic analysis, on both quantities and structures/interactions. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
3D Material Response Analysis of PICA Pyrolysis Experiments
NASA Technical Reports Server (NTRS)
Oliver, A. Brandon
2017-01-01
The PICA decomposition experiments of Bessire and Minton are investigated using 3D material response analysis. The steady thermoelectric equations have been added to the CHAR code to enable analysis of the Joule-heated experiments and the DAKOTA optimization code is used to define the voltage boundary condition that yields the experimentally observed temperature response. This analysis has identified a potential spatial non-uniformity in the PICA sample temperature driven by the cooled copper electrodes and thermal radiation from the surface of the test article (Figure 1). The non-uniformity leads to a variable heating rate throughout the sample volume that has an effect on the quantitative results of the experiment. Averaging the results of integrating a kinetic reaction mechanism with the heating rates seen across the sample volume yield a shift of peak species production to lower temperatures that is more significant for higher heating rates (Figure 2) when compared to integrating the same mechanism at the reported heating rate. The analysis supporting these conclusions will be presented along with a proposed analysis procedure that permits quantitative use of the existing data. Time permitting, a status on the in-development kinetic decomposition mechanism based on this data will be presented as well.
Hara, Risa; Ishigaki, Mika; Kitahama, Yasutaka; Ozaki, Yukihiro; Genkawa, Takuma
2018-08-30
The difference in Raman spectra for different excitation wavelengths (532 nm, 785 nm, and 1064 nm) was investigated to identify an appropriate wavelength for the quantitative analysis of carotenoids in tomatoes. For the 532 nm-excited Raman spectra, the intensity of the peak assigned to the carotenoid has no correlation with carotenoid concentration, and the peak shift reflects carotenoid composition changing from lycopene to β-carotene and lutein. Thus, 532 nm-excited Raman spectra are useful for the qualitative analysis of carotenoids. For the 785 nm- and 1064 nm-excited Raman spectra, the peak intensity of the carotenoid showed good correlation with carotenoid concentration; thus, regression models for carotenoid concentration were developed using these Raman spectra and partial least squares regression. A regression model designed using the 785 nm-excited Raman spectra showed a better result than the 532 nm- and 1064 nm-excited Raman spectra. Therefore, it can be concluded that 785 nm is the most suitable excitation wavelength for the quantitative analysis of carotenoid concentration in tomatoes. Copyright © 2018 Elsevier Ltd. All rights reserved.
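The regression step can be sketched with scikit-learn's PLSRegression; the spectra below are synthetic placeholders rather than the tomato data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 40, 200

# synthetic Raman spectra: a carotenoid-like band scaled by concentration plus noise
concentration = rng.uniform(1.0, 20.0, n_samples)  # e.g. mg per 100 g
band = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 100) / 6.0) ** 2)
spectra = concentration[:, None] * band[None, :] + rng.normal(0, 0.3, (n_samples, n_wavenumbers))

# partial least squares regression with cross-validated prediction
pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, concentration, cv=5).ravel()
rmsecv = np.sqrt(np.mean((predicted - concentration) ** 2))
print(f"RMSECV ~ {rmsecv:.2f}")
```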
COMPASS: a suite of pre- and post-search proteomics software tools for OMSSA
Wenger, Craig D.; Phanstiel, Douglas H.; Lee, M. Violet; Bailey, Derek J.; Coon, Joshua J.
2011-01-01
Here we present the Coon OMSSA Proteomic Analysis Software Suite (COMPASS): a free and open-source software pipeline for high-throughput analysis of proteomics data, designed around the Open Mass Spectrometry Search Algorithm. We detail a synergistic set of tools for protein database generation, spectral reduction, peptide false discovery rate analysis, peptide quantitation via isobaric labeling, protein parsimony and protein false discovery rate analysis, and protein quantitation. We strive for maximum ease of use, utilizing graphical user interfaces and working with data files in the original instrument vendor format. Results are stored in plain text comma-separated values files, which are easy to view and manipulate with a text editor or spreadsheet program. We illustrate the operation and efficacy of COMPASS through the use of two LC–MS/MS datasets. The first is a dataset of a highly annotated mixture of standard proteins and manually validated contaminants that exhibits the identification workflow. The second is a dataset of yeast peptides, labeled with isobaric stable isotope tags and mixed in known ratios, to demonstrate the quantitative workflow. For these two datasets, COMPASS performs equivalently or better than the current de facto standard, the Trans-Proteomic Pipeline. PMID:21298793
GUIDOS: tools for the assessment of pattern, connectivity, and fragmentation
NASA Astrophysics Data System (ADS)
Vogt, Peter
2013-04-01
Pattern, connectivity, and fragmentation can be considered as pillars for a quantitative analysis of digital landscape images. The free software toolbox GUIDOS (http://forest.jrc.ec.europa.eu/download/software/guidos) includes a variety of dedicated methodologies for the quantitative assessment of these features. Amongst others, Morphological Spatial Pattern Analysis (MSPA) is used for an intuitive description of image pattern structures and the automatic detection of connectivity pathways. GUIDOS includes tools for the detection and quantitative assessment of key nodes and links as well as to define connectedness in raster images and to setup appropriate input files for an enhanced network analysis using Conefor Sensinode. Finally, fragmentation is usually defined from a species point of view but a generic and quantifiable indicator is needed to measure fragmentation and its changes. Some preliminary results for different conceptual approaches will be shown for a sample dataset. Complemented by pre- and post-processing routines and a complete GIS environment the portable GUIDOS Toolbox may facilitate a holistic assessment in risk assessment studies, landscape planning, and conservation/restoration policies. Alternatively, individual analysis components may contribute to or enhance studies conducted with other software packages in landscape ecology.
NASA Astrophysics Data System (ADS)
Weinerová, Hedvika; Hron, Karel; Bábek, Ondřej; Šimíček, Daniel; Hladil, Jindřich
2017-06-01
Quantitative allochem compositional trends across the Lochkovian-Pragian boundary Event were examined at three sections recording the proximal to more distal carbonate ramp environment of the Prague Basin. Multivariate statistical methods (principal component analysis, correspondence analysis, cluster analysis) of point-counted thin section data were used to reconstruct facies stacking patterns and sea-level history. Both the closed-nature allochem percentages and their centred log-ratio (clr) coordinates were used. Both these approaches allow for distinguishing of lowstand, transgressive and highstand system tracts within the Praha Formation, which show gradual transition from crinoid-dominated facies deposited above the storm wave base to dacryoconarid-dominated facies of deep-water environment below the storm wave base. Quantitative compositional data also indicate progradative-retrogradative trends in the macrolithologically monotonous shallow-water succession and enable its stratigraphic correlation with successions from deeper-water environments. Generally, the stratigraphic trends of the clr data are more sensitive to subtle changes in allochem composition in comparison to the results based on raw data. A heterozoan-dominated allochem association in shallow-water environments of the Praha Formation supports the carbonate ramp environment assumed by previous authors.
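The centred log-ratio step used here is a few lines of numpy; a sketch with invented point-count percentages, followed by PCA on the clr coordinates:

```python
import numpy as np

def clr(compositions):
    """Centred log-ratio transform of closed compositional data (rows sum to 100%)."""
    x = np.asarray(compositions, dtype=float)
    logx = np.log(x)
    return logx - logx.mean(axis=1, keepdims=True)

# hypothetical point-count percentages per thin section: crinoids, dacryoconarids, other
counts = np.array([
    [62.0, 10.0, 28.0],
    [41.0, 27.0, 32.0],
    [15.0, 55.0, 30.0],
])
clr_coords = clr(counts)

# PCA on the clr coordinates (the usual route for compositional data), via SVD
centered = clr_coords - clr_coords.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ vt.T
print("explained variance ratio:", (s**2 / np.sum(s**2)).round(3))
```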
Rebouças, Camila Tavares; Kogawa, Ana Carolina; Salgado, Hérida Regina Nunes
2018-05-18
Background: A green analytical chemistry method was developed for quantification of enrofloxacin in tablets. The drug, a second-generation fluoroquinolone, was first introduced in veterinary medicine for the treatment of various bacterial species. Objective: This study proposed to develop, validate, and apply a reliable, low-cost, fast, and simple IR spectroscopy method for quantitative routine determination of enrofloxacin in tablets. Methods: The method was completely validated according to the International Conference on Harmonisation guidelines, showing accuracy, precision, selectivity, robustness, and linearity. Results: It was linear over the concentration range of 1.0-3.0 mg with correlation coefficients >0.9999 and LOD and LOQ of 0.12 and 0.36 mg, respectively. Conclusions: Now that this IR method has met performance qualifications, it can be adopted and applied for the analysis of enrofloxacin tablets for production process control. The validated method can also be utilized to quantify enrofloxacin in tablets and thus is an environmentally friendly alternative for the routine analysis of enrofloxacin in quality control. Highlights: A new green method for the quantitative analysis of enrofloxacin by Fourier-Transform Infrared spectroscopy was validated. It is a fast, clean and low-cost alternative for the evaluation of enrofloxacin tablets.
NASA Technical Reports Server (NTRS)
Shortle, John F.; Allocco, Michael
2005-01-01
This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.
Liu, Jian; Gao, Yun-Hua; Li, Ding-Dong; Gao, Yan-Chun; Hou, Ling-Mi; Xie, Ting
2014-01-01
To compare the value of contrast-enhanced ultrasound (CEUS) qualitative and quantitative analysis in the identification of breast tumor lumps. Qualitative and quantitative indicators of CEUS for 73 cases of breast tumor lumps were retrospectively analyzed by univariate and multivariate approaches. Logistic regression was applied and ROC curves were drawn for evaluation and comparison. The CEUS qualitative indicator-generated regression equation contained three indicators, namely enhanced homogeneity, diameter line expansion and peak intensity grading, which demonstrated prediction accuracy for benign and malignant breast tumor lumps of 91.8%; the quantitative indicator-generated regression equation only contained one indicator, namely the relative peak intensity, and its prediction accuracy was 61.5%. The corresponding areas under the ROC curve for qualitative and quantitative analyses were 91.3% and 75.7%, respectively, which exhibited a statistically significant difference by the Z test (P<0.05). The ability of CEUS qualitative analysis to identify breast tumor lumps is better than with quantitative analysis.
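The modeling route described, logistic regression on CEUS indicators followed by ROC evaluation, can be sketched as follows (scikit-learn; the indicator values and outcomes are simulated placeholders, not the study data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(7)
n = 73
# hypothetical qualitative CEUS indicators (binary or ordinal grades) and pathology outcome
enhanced_homogeneity = rng.integers(0, 2, n)
diameter_expansion   = rng.integers(0, 2, n)
peak_intensity_grade = rng.integers(1, 4, n)
malignant = (enhanced_homogeneity + diameter_expansion
             + (peak_intensity_grade > 2) + rng.normal(0, 0.8, n) > 1.5).astype(int)

X = np.column_stack([enhanced_homogeneity, diameter_expansion, peak_intensity_grade])
model = LogisticRegression().fit(X, malignant)
prob = model.predict_proba(X)[:, 1]

print("prediction accuracy:", model.score(X, malignant))
print("area under ROC curve:", roc_auc_score(malignant, prob))
```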
Behboudi, S; Morein, B; Rönnberg, B
1995-12-01
In the iscom, multiple copies of antigen are attached by hydrophobic interaction to a matrix which is built up by Quillaja triterpenoid saponins and lipids. Thus, the iscom presents antigen in multimeric form in a small particle with a built-in adjuvant resulting in a highly immunogenic antigen formulation. We have designed a chloroform-methanol-water extraction procedure to isolate the triterpenoid saponins and lipids incorporated into iscom-matrix and iscoms. The triterpenoids in the triterpenoid phase were quantitated using orcinol sulfuric acid detecting their carbohydrate chains and by HPLC. The cholesterol and phosphatidylcholine in the lipid phase were quantitated by HPLC and a commercial colorimetric method for the cholesterol. The quantitative methods showed an almost total separation and recovery of triterpenoids and lipids in their respective phases, while protein was detected in all phases after extraction. The protein content was determined by the method of Lowry and by amino acid analysis. Amino acid analysis was shown to be the reliable method of the two to quantitate proteins in iscoms. In conclusion, simple, reproducible and efficient procedures have been designed to isolate and quantitate the triterpenoids and lipids added for preparation of iscom-matrix and iscoms. The procedures described should also be useful to adequately define constituents in prospective vaccines.
Quantitative 3D investigation of Neuronal network in mouse spinal cord model
NASA Astrophysics Data System (ADS)
Bukreeva, I.; Campi, G.; Fratini, M.; Spanò, R.; Bucci, D.; Battaglia, G.; Giove, F.; Bravin, A.; Uccelli, A.; Venturi, C.; Mastrogiacomo, M.; Cedola, A.
2017-01-01
The investigation of the neuronal network in mouse spinal cord models represents the basis for the research on neurodegenerative diseases. In this framework, the quantitative analysis of the single elements in different districts is a crucial task. However, conventional 3D imaging techniques do not have enough spatial resolution and contrast to allow for a quantitative investigation of the neuronal network. Exploiting the high coherence and the high flux of synchrotron sources, X-ray Phase-Contrast multiscale-Tomography allows for the 3D investigation of the neuronal microanatomy without any aggressive sample preparation or sectioning. We investigated healthy-mouse neuronal architecture by imaging the 3D distribution of the neuronal-network with a spatial resolution of 640 nm. The high quality of the obtained images enables a quantitative study of the neuronal structure on a subject-by-subject basis. We developed and applied a spatial statistical analysis on the motor neurons to obtain quantitative information on their 3D arrangement in the healthy-mice spinal cord. Then, we compared the obtained results with a mouse model of multiple sclerosis. Our approach paves the way to the creation of a “database” for the characterization of the neuronal network main features for a comparative investigation of neurodegenerative diseases and therapies.
Davatzikos, Christos; Rathore, Saima; Bakas, Spyridon; Pati, Sarthak; Bergman, Mark; Kalarot, Ratheesh; Sridharan, Patmaa; Gastounioti, Aimilia; Jahani, Nariman; Cohen, Eric; Akbari, Hamed; Tunc, Birkan; Doshi, Jimit; Parker, Drew; Hsieh, Michael; Sotiras, Aristeidis; Li, Hongming; Ou, Yangming; Doot, Robert K; Bilello, Michel; Fan, Yong; Shinohara, Russell T; Yushkevich, Paul; Verma, Ragini; Kontos, Despina
2018-01-01
The growth of multiparametric imaging protocols has paved the way for quantitative imaging phenotypes that predict treatment response and clinical outcome, reflect underlying cancer molecular characteristics and spatiotemporal heterogeneity, and can guide personalized treatment planning. This growth has underlined the need for efficient quantitative analytics to derive high-dimensional imaging signatures of diagnostic and predictive value in this emerging era of integrated precision diagnostics. This paper presents cancer imaging phenomics toolkit (CaPTk), a new and dynamically growing software platform for analysis of radiographic images of cancer, currently focusing on brain, breast, and lung cancer. CaPTk leverages the value of quantitative imaging analytics along with machine learning to derive phenotypic imaging signatures, based on two-level functionality. First, image analysis algorithms are used to extract comprehensive panels of diverse and complementary features, such as multiparametric intensity histogram distributions, texture, shape, kinetics, connectomics, and spatial patterns. At the second level, these quantitative imaging signatures are fed into multivariate machine learning models to produce diagnostic, prognostic, and predictive biomarkers. Results from clinical studies in three areas are shown: (i) computational neuro-oncology of brain gliomas for precision diagnostics, prediction of outcome, and treatment planning; (ii) prediction of treatment response for breast and lung cancer, and (iii) risk assessment for breast cancer.
Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.
Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro
2011-02-01
The solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by the measurement of the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219) present in the collagen-α-5(IV) chain precursor, differently expressed in urines from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis also in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.
Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Anwar, Hoda; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2017-01-01
A renewed interest has recently developed in the highly sensitive bone-seeking radiopharmaceutical 18F-NaF. The aim of the present study is to evaluate the potential utility of quantitative analysis of 18F-NaF dynamic PET/CT data in differentiating malignant from benign degenerative lesions in multiple myeloma (MM). 80 MM patients underwent whole-body PET/CT and dynamic PET/CT scanning of the pelvis with 18F-NaF. PET/CT data evaluation was based on visual (qualitative) assessment, semi-quantitative (SUV) calculations, and absolute quantitative estimations after application of a 2-tissue compartment model and a non-compartmental approach leading to the extraction of fractal dimension (FD). In total, 263 MM lesions were demonstrated on 18F-NaF PET/CT. Semi-quantitative and quantitative evaluations were performed for 25 MM lesions as well as for 25 benign, degenerative and traumatic lesions. Mean SUVaverage for MM lesions was 11.9 and mean SUVmax was 23.2. Respectively, SUVaverage and SUVmax for degenerative lesions were 13.5 and 20.2. Kinetic analysis of 18F-NaF revealed the following mean values for MM lesions: K1 = 0.248 (1/min), k3 = 0.359 (1/min), influx (Ki) = 0.107 (1/min), FD = 1.382, while the respective values for degenerative lesions were: K1 = 0.169 (1/min), k3 = 0.422 (1/min), influx (Ki) = 0.095 (1/min), FD = 1.411. No statistically significant differences between MM and benign degenerative disease regarding SUVaverage, SUVmax, K1, k3 and influx (Ki) were demonstrated. FD was significantly higher in degenerative than in malignant lesions. The present findings show that quantitative analysis of 18F-NaF PET data cannot differentiate malignant from benign degenerative lesions in MM patients, supporting previously published results, which reflect the limited role of 18F-NaF PET/CT in the diagnostic workup of MM.
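For an irreversible two-tissue compartment model of the kind fitted here, the influx constant follows from the rate constants as Ki = K1·k3/(k2 + k3); a small check using the reported K1 and k3 (k2 is not given in the abstract, so the value below is purely an assumption):

```python
def influx_constant(K1, k2, k3):
    """Influx (Ki) of an irreversible two-tissue compartment model: Ki = K1*k3/(k2+k3)."""
    return K1 * k3 / (k2 + k3)

# K1 and k3 as reported for the MM lesions; k2 is assumed here for illustration only
K1, k3 = 0.248, 0.359   # 1/min
k2_assumed = 0.50       # 1/min (not reported in the abstract)
print(f"Ki ~ {influx_constant(K1, k2_assumed, k3):.3f} 1/min")
```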
Silicon solar cell process development, fabrication and analysis
NASA Technical Reports Server (NTRS)
Leung, D. C.; Iles, P. A.
1983-01-01
Measurements of minority-carrier diffusion lengths were made on small mesa diodes from HEM Si and SILSO Si. The results were consistent with previous Voc and Isc measurements. Only the medium-grain SILSO had a distinct advantage for the non-grain-boundary diodes. Substantial variations were observed for the HEM ingot 4141C. A quantitatively scaled light-spot scan was also being developed for localized diffusion-length measurements in polycrystalline silicon solar cells. A change to a more monochromatic input for the light-spot scan results in greater sensitivity and, in principle, quantitative measurement of local material qualities is now possible.
Low-frequency quantitative ultrasound imaging of cell death in vivo
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadeghi-Naini, Ali; Falou, Omar; Czarnota, Gregory J.
Purpose: Currently, no clinical imaging modality is used routinely to assess tumor response to cancer therapies within hours to days of the delivery of treatment. Here, the authors demonstrate the efficacy of ultrasound at a clinically relevant frequency to quantitatively detect changes in tumors in response to cancer therapies using preclinical mouse models. Methods: Conventional low-frequency and corresponding high-frequency ultrasound (ranging from 4 to 28 MHz) were used along with quantitative spectroscopic and signal envelope statistical analyses on data obtained from xenograft tumors treated with chemotherapy, x-ray radiation, as well as a novel vascular targeting microbubble therapy. Results: Ultrasound-based spectroscopic biomarkers indicated significant changes in cell-death associated parameters in responsive tumors. Specifically, changes in the midband fit, spectral slope, and 0-MHz intercept biomarkers were investigated for different types of treatment and demonstrated cell-death related changes. The midband fit and 0-MHz intercept biomarkers derived from low-frequency data demonstrated increases ranging approximately from 0 to 6 dBr and 0 to 8 dBr, respectively, depending on treatments administered. These data paralleled results observed for high-frequency ultrasound data. Statistical analysis of the ultrasound signal envelope was performed as an alternative method to obtain histogram-based biomarkers and provided confirmatory results. Histological analysis of tumor specimens indicated up to 61% cell death present in the tumors depending on treatments administered, consistent with quantitative ultrasound findings indicating cell death. Ultrasound-based spectroscopic biomarkers demonstrated a good correlation with histological morphological findings indicative of cell death (r² = 0.71, 0.82; p < 0.001). Conclusions: In summary, the results provide preclinical evidence, for the first time, that quantitative ultrasound used at a clinically relevant frequency, in addition to high-frequency ultrasound, can detect tissue changes associated with cell death in vivo in response to cancer treatments.
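In the usual Lizzi-Feleppa parameterization, the midband fit, spectral slope and 0-MHz intercept named above come from a straight-line fit to the calibrated backscatter power spectrum over the analysis bandwidth. A minimal sketch of that computation on synthetic data (the frequency band and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def spectral_biomarkers(freq_mhz, power_db, band=(4.0, 28.0)):
    """Linear fit of the calibrated power spectrum (dB) versus frequency (MHz).

    Returns (midband_fit, slope, intercept): the regression value at band
    centre, the regression slope, and the extrapolated 0-MHz intercept.
    """
    mask = (freq_mhz >= band[0]) & (freq_mhz <= band[1])
    slope, intercept = np.polyfit(freq_mhz[mask], power_db[mask], deg=1)
    midband_fit = slope * (band[0] + band[1]) / 2.0 + intercept
    return midband_fit, slope, intercept

# Illustrative use on a toy spectrum
f = np.linspace(1.0, 30.0, 300)                               # MHz
spectrum = -40.0 + 0.8 * f + np.random.normal(0, 1, f.size)   # dB, synthetic
print(spectral_biomarkers(f, spectrum))
```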
Comparative study of standard space and real space analysis of quantitative MR brain data.
Aribisala, Benjamin S; He, Jiabao; Blamire, Andrew M
2011-06-01
To compare the robustness of region of interest (ROI) analysis of magnetic resonance imaging (MRI) brain data in real space with analysis in standard space and to test the hypothesis that standard space image analysis introduces more partial volume effect errors compared to analysis of the same dataset in real space. Twenty healthy adults with no history or evidence of neurological diseases were recruited; high-resolution T(1)-weighted, quantitative T(1), and B(0) field-map measurements were collected. Algorithms were implemented to perform analysis in real and standard space and used to apply a simple standard ROI template to quantitative T(1) datasets. Regional relaxation values and histograms for both gray and white matter tissues classes were then extracted and compared. Regional mean T(1) values for both gray and white matter were significantly lower using real space compared to standard space analysis. Additionally, regional T(1) histograms were more compact in real space, with smaller right-sided tails indicating lower partial volume errors compared to standard space analysis. Standard space analysis of quantitative MRI brain data introduces more partial volume effect errors biasing the analysis of quantitative data compared to analysis of the same dataset in real space. Copyright © 2011 Wiley-Liss, Inc.
Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.
Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan
2017-01-01
Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
Savasoglu, Kaan; Payzin, Kadriye Bahriye; Ozdemirkiran, Fusun; Berber, Belgin
2015-08-01
To determine the utility of the quantitative real-time PCR (RQ-PCR) assay in the follow-up of chronic myeloid leukemia (CML) patients. Cross-sectional observational study. Izmir Ataturk Education and Research Hospital, Izmir, Turkey, from 2009 to 2013. Cytogenetic, FISH and RQ-PCR test results from the materials of 177 CML patients selected between 2009 and 2013 were set up for comparison analysis. Statistical analysis was performed to compare the FISH, karyotype and RQ-PCR results of the patients. Karyotyping and FISH specificity and sensitivity rates were determined by ROC analysis and compared with the RQ-PCR results. The Chi-square test was used to compare test failure rates. Sensitivity and specificity values were 17.6% and 98% for karyotyping (p=0.118, p > 0.05) and 22.5% and 96% for FISH (p=0.064, p > 0.05), respectively. FISH sensitivity was slightly higher than that of karyotyping, but a strong correlation was found between them (p < 0.001). The RQ-PCR test failure rate did not correlate with the other two tests (p > 0.05); however, the correlation between the karyotyping and FISH test failure rates was statistically significant (p < 0.001). Except for situations in which karyotype analysis is needed, the RQ-PCR assay can be used alone in the follow-up of CML disease.
2009-06-01
... simulation is the campaign-level Peace Support Operations Model (PSOM). This thesis provides a quantitative analysis of PSOM. The results are based ... multiple potential outcomes; further development and analysis is required before the model is used for large-scale analysis.
Abdul-Razzak, Amane; Sherifali, Diana; You, John; Simon, Jessica; Brazil, Kevin
2016-08-01
Despite the recognized importance of end-of-life (EOL) communication between patients and physicians, the extent and quality of such communication is lacking. We sought to understand patient perspectives on physician behaviours during EOL communication. In this mixed methods study, we conducted quantitative and qualitative strands and then merged data sets during a mixed methods analysis phase. In the quantitative strand, we used the quality of communication tool (QOC) to measure physician behaviours that predict global rating of satisfaction in EOL communication skills, while in the qualitative strand we conducted semi-structured interviews. During the mixed methods analysis, we compared and contrasted qualitative and quantitative data. Seriously ill inpatients at three tertiary care hospitals in Canada. We found convergence between qualitative and quantitative strands: patients desire candid information from their physician and a sense of familiarity. The quantitative results (n = 132) suggest a paucity of certain EOL communication behaviours in this seriously ill population with a limited prognosis. The qualitative findings (n = 16) suggest that at times, physicians did not engage in EOL communication despite patient readiness, while sometimes this may represent an appropriate deferral after assessment of a patient's lack of readiness. Avoidance of certain EOL topics may not always be a failure if it is a result of an assessment of lack of patient readiness. This has implications for future tool development: a measure could be built in to assess whether physician behaviours align with patient readiness. © 2015 The Authors. Health Expectations Published by John Wiley & Sons Ltd.
Quantitative analysis of wet-heat inactivation in bovine spongiform encephalopathy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsuura, Yuichi; Ishikawa, Yukiko; Bo, Xiao
2013-03-01
Highlights: ► We quantitatively analyzed wet-heat inactivation of the BSE agent. ► Infectivity of the BSE macerate did not survive 155 °C wet-heat treatment. ► Once the sample was dehydrated, infectivity was observed even at 170 °C. ► A quantitative PMCA assay was used to evaluate the degree of BSE inactivation. Abstract: The bovine spongiform encephalopathy (BSE) agent is resistant to conventional microbial inactivation procedures and thus threatens the safety of cattle products and by-products. To obtain information necessary to assess BSE inactivation, we performed quantitative analysis of wet-heat inactivation of infectivity in BSE-infected cattle spinal cords. Using a highly sensitive bioassay, we found that infectivity in BSE cattle macerates fell with increase in temperatures from 133 °C to 150 °C and was not detected in the samples subjected to temperatures above 155 °C. In dry cattle tissues, infectivity was detected even at 170 °C. Thus, BSE infectivity reduces with increase in wet-heat temperatures but is less affected when tissues are dehydrated prior to the wet-heat treatment. The results of the quantitative protein misfolding cyclic amplification assay also demonstrated that the level of the protease-resistant prion protein fell below the bioassay detection limit by wet-heat at 155 °C and higher and could help assess BSE inactivation. Our results show that BSE infectivity is strongly resistant to wet-heat inactivation and that it is necessary to pay attention to BSE decontamination in recycled cattle by-products.
Mycotoxin analysis: an update.
Krska, Rudolf; Schubert-Ullrich, Patricia; Molinelli, Alexandra; Sulyok, Michael; MacDonald, Susan; Crews, Colin
2008-02-01
Mycotoxin contamination of cereals and related products used for feed can cause intoxication, especially in farm animals. Therefore, efficient analytical tools for the qualitative and quantitative analysis of toxic fungal metabolites in feed are required. Current methods usually include an extraction step, a clean-up step to reduce or eliminate unwanted co-extracted matrix components and a separation step with suitably specific detection ability. Quantitative methods of analysis for most mycotoxins use immunoaffinity clean-up with high-performance liquid chromatography (HPLC) separation in combination with UV and/or fluorescence detection. Screening of samples contaminated with mycotoxins is frequently performed by thin layer chromatography (TLC), which yields qualitative or semi-quantitative results. Nowadays, enzyme-linked immunosorbent assays (ELISA) are often used for rapid screening. A number of promising methods, such as fluorescence polarization immunoassays, dipsticks, and even newer methods such as biosensors and non-invasive techniques based on infrared spectroscopy, have shown great potential for mycotoxin analysis. Currently, there is a strong trend towards the use of multi-mycotoxin methods for the simultaneous analysis of several of the important Fusarium mycotoxins, which is best achieved by LC-MS/MS (liquid chromatography with tandem mass spectrometry). This review focuses on recent developments in the determination of mycotoxins with a special emphasis on LC-MS/MS and emerging rapid methods.
NASA Astrophysics Data System (ADS)
Bestwick, Jordan; Unwin, David; Butler, Richard; Henderson, Don; Purnell, Mark
2017-04-01
Pterosaurs (Pterosauria) were a successful group of Mesozoic flying reptiles. For 150 million years they were integral components of terrestrial and coastal ecosystems, yet their feeding ecology remains poorly constrained. Postulated pterosaur diets include insectivory, piscivory and/or carnivory, but many dietary hypotheses are speculative and/or based on little evidence, highlighting the need for alternative approaches to provide robust data. One method involves quantitative analysis of the micron-scale 3D textures of worn pterosaur tooth surfaces - dental microwear texture analysis. Microwear is produced as scratches and chips generated by food items create characteristic tooth surface textures. Microwear analysis has never been applied to pterosaurs, but we might expect microwear textures to differ between pterosaurs with different diets. An important step in investigating pterosaur microwear is to examine microwear from extant organisms with known diets to provide a comparative data set. This has been achieved through analysis of non-occlusal microwear textures in extant bats, crocodilians and monitor lizards, clades within which species exhibit insectivorous, piscivorous and carnivorous diets. The results - the first test of the hypothesis that non-occlusal microwear textures in these extant clades vary with diet - provide the context for the first robust quantitative tests of pterosaur diets.
Do Deregulated Cas Proteins Induce Genomic Instability in Early-Stage Ovarian Cancer
2006-12-01
... use Western blot analysis of tumor lysates to correlate expression of HEF1, p130Cas, Aurora A, and phospho-Aurora A. This analysis is in progress. ... and, importantly, evaluated a number of different detection/image analysis systems to ensure reproducible quantitative results. We have used a pilot ... Interestingly, preliminary statistical analysis using Spearman and Pearson correlation indicates at least one striking correlation ...
1983-10-13
Abstract not available: only OCR fragments of the report's reference list and table of contents remain, mentioning tannin and lignin in natural waters, a short course in quantitative analysis, before-and-after membrane filtration analysis, the SDI, and a permanganate demand test for predicting fouling.
Unice, Kenneth M; Kreider, Marisa L; Panko, Julie M
2012-11-08
Pyrolysis (pyr)-GC/MS analysis of characteristic thermal decomposition fragments has been previously used for qualitative fingerprinting of organic sources in environmental samples. A quantitative pyr-GC/MS method based on characteristic tire polymer pyrolysis products was developed for tread particle quantification in environmental matrices including soil, sediment, and air. The feasibility of quantitative pyr-GC/MS analysis of tread was confirmed in a method evaluation study using artificial soil spiked with known amounts of cryogenically generated tread. Tread concentration determined by blinded analyses was highly correlated (r² ≥ 0.88) with the known tread spike concentration. Two critical refinements to the initial pyrolysis protocol were identified including use of an internal standard and quantification by the dimeric markers vinylcyclohexene and dipentene, which have good specificity for rubber polymer with no other appreciable environmental sources. A novel use of deuterated internal standards of similar polymeric structure was developed to correct the variable analyte recovery caused by sample size, matrix effects, and ion source variability. The resultant quantitative pyr-GC/MS protocol is reliable and transferable between laboratories.
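The deuterated internal-standard correction described above amounts to quantifying each marker from the ratio of its peak area to that of the co-analyzed internal standard, so recovery losses and ion-source drift largely cancel. A minimal calibration sketch under that assumption (the concentrations and area ratios are invented, not the authors' data):

```python
import numpy as np

# Calibration: known tread concentrations (mg/g) versus measured area ratios
# (dimeric marker area / deuterated internal-standard area); toy numbers.
conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
area_ratio = np.array([0.01, 0.26, 0.52, 1.01, 2.55, 5.10])

slope, intercept = np.polyfit(conc, area_ratio, deg=1)

def tread_concentration(sample_area_ratio):
    """Invert the internal-standard calibration line for an unknown sample."""
    return (sample_area_ratio - intercept) / slope

print(tread_concentration(1.8))  # mg/g, illustrative
```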
Isola, A A; Schmitt, H; van Stevendaal, U; Begemann, P G; Coulon, P; Boussel, L; Grass, M
2011-09-21
Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as it is introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.
A Systematic Approach for Quantitative Analysis of Multidisciplinary Design Optimization Framework
NASA Astrophysics Data System (ADS)
Kim, Sangho; Park, Jungkeun; Lee, Jeong-Oog; Lee, Jae-Woo
An efficient Multidisciplinary Design and Optimization (MDO) framework for an aerospace engineering system should use and integrate distributed resources such as various analysis codes, optimization codes, Computer Aided Design (CAD) tools, Data Base Management Systems (DBMS), etc., in a heterogeneous environment, and needs to provide user-friendly graphical user interfaces. In this paper, we propose a systematic approach for determining a reference MDO framework and for evaluating MDO frameworks. The proposed approach incorporates two well-known methods, the Analytic Hierarchy Process (AHP) and Quality Function Deployment (QFD), in order to provide a quantitative analysis of the qualitative criteria of MDO frameworks. The identification and hierarchy of the framework requirements and the corresponding solutions for the reference MDO frameworks (a general one and an aircraft-oriented one) were carefully investigated. The reference frameworks were also quantitatively identified using AHP and QFD. An assessment of three in-house frameworks was then performed. The results produced clear and useful guidelines for improvement of the in-house MDO frameworks and showed the feasibility of the proposed approach for evaluating an MDO framework without human interference.
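The AHP step referred to above reduces to extracting a priority vector from a reciprocal pairwise comparison matrix, usually via its principal eigenvector, and checking judgment consistency. A minimal sketch of that calculation; the three criteria and the judgments are invented for illustration, not taken from the paper:

```python
import numpy as np

# Reciprocal pairwise comparison matrix over three framework criteria
# (e.g. integration, usability, extensibility); judgments are illustrative.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                     # AHP priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)         # consistency index
cr = ci / 0.58                               # Saaty's random index for n = 3
print(weights, cr)                           # CR < 0.1 is conventionally acceptable
```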
Artificial neural networks applied to quantitative elemental analysis of organic material using PIXE
NASA Astrophysics Data System (ADS)
Correa, R.; Chesta, M. A.; Morales, J. R.; Dinator, M. I.; Requena, I.; Vila, I.
2006-08-01
An artificial neural network (ANN) has been trained with real-sample PIXE (particle-induced X-ray emission) spectra of organic substances. Following the training stage, the ANN was applied to a subset of similar samples, thus obtaining the elemental concentrations in muscle, liver and gills of Cyprinus carpio. Concentrations obtained with the ANN method are in full agreement with results from a standard analytical procedure, showing the high potential of ANNs in PIXE quantitative analyses.
Spalenza, Veronica; Girolami, Flavia; Bevilacqua, Claudia; Riondato, Fulvio; Rasero, Roberto; Nebbia, Carlo; Sacchi, Paola; Martin, Patrice
2011-09-01
Gene expression studies in blood cells, particularly lymphocytes, are useful for monitoring potential exposure to toxicants or environmental pollutants in humans and livestock species. Quantitative PCR is the method of choice for obtaining accurate quantification of mRNA transcripts, although variations in the amount of starting material, enzymatic efficiency, and the presence of inhibitors can lead to evaluation errors. As a result, normalization of data is of crucial importance. The most common approach is the use of endogenous reference genes as an internal control, whose expression should ideally not vary among individuals and under different experimental conditions. The accurate selection of reference genes is therefore an important step in interpreting quantitative PCR studies. Since no systematic investigation in bovine lymphocytes has been performed, the aim of the present study was to assess the expression stability of seven candidate reference genes in circulating lymphocytes collected from 15 dairy cows. Following the characterization by flow cytometric analysis of the cell populations obtained from blood through a density gradient procedure, three popular software packages were used to evaluate the gene expression data. The results showed that two genes are sufficient for normalization of quantitative PCR studies in cattle lymphocytes and that YWHAZ, S24 and PPIA are the most stable genes. Copyright © 2010 Elsevier Ltd. All rights reserved.
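Reference-gene selection of the kind described above is commonly based on stability measures such as geNorm's M value, the average pairwise variation of a candidate gene against all other candidates on log-transformed expression ratios. A compact sketch of that calculation on a synthetic expression matrix; this is not necessarily the exact algorithm of the three software packages used in the study:

```python
import numpy as np

def genorm_m_values(expr):
    """geNorm-style stability measure M for each candidate gene.

    expr: (n_samples, n_genes) array of relative expression quantities.
    Lower M means more stable expression across samples.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.zeros(n_genes)
    for j in range(n_genes):
        variations = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                      for k in range(n_genes) if k != j]
        m[j] = np.mean(variations)
    return m

rng = np.random.default_rng(0)
expr = rng.lognormal(mean=0.0, sigma=0.3, size=(15, 7))  # 15 cows, 7 candidates
print(genorm_m_values(expr))
```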
Analysis and Derivation of Allocations for Fiber Contaminants in Liquid Bipropellant Systems
NASA Technical Reports Server (NTRS)
Lowrey, N. M.; Ibrahim, K. Y.
2012-01-01
An analysis was performed to identify the engineering rationale for the existing particulate limits in MSFC-SPEC-164, Cleanliness of Components for Use in Oxygen, Fuel, and Pneumatic Systems, determine the applicability of this rationale to fibers, identify potential risks that may result from fiber contamination in liquid oxygen/fuel bipropellant systems, and bound each of these risks. The objective of this analysis was to determine whether fiber contamination exceeding the established quantitative limits for particulate can be tolerated in these systems and, if so, to derive and recommend quantitative allocations for fibers beyond the limits established for other particulate. Knowledge gaps were identified that limit a complete understanding of the risk of promoted ignition from an accumulation of fibers in a gaseous oxygen system.
Wrobel, Tomasz P; Mateuszuk, Lukasz; Kostogrys, Renata B; Chlopicki, Stefan; Baranska, Malgorzata
2013-11-07
In this work the quantitative determination of atherosclerotic lesion area (ApoE/LDLR(-/-) mice) by FT-IR imaging is presented and validated by comparison with atherosclerotic lesion area determination by classic Oil Red O staining. Cluster analysis of FT-IR-based measurements in the 2800-3025 cm⁻¹ range allowed for quantitative analysis of the atherosclerosis plaque area, the results of which were highly correlated with those of Oil Red O histological staining (R² = 0.935). Moreover, a specific class obtained from a second cluster analysis of the aortic cross-section samples at different stages of disease progression (3, 4 and 6 months old) seemed to represent the macrophages (CD68) area within the atherosclerotic plaque.
Van Berkel, Gary J; Kertesz, Vilmos; Boeltz, Harry
2017-11-01
The aim of this work was to demonstrate and evaluate the analytical performance of coupling the immediate drop on demand technology to a mass spectrometer via the recently introduced open port sampling interface and ESI. Methodology & results: A maximum sample analysis throughput of 5 s per sample was demonstrated. Signal reproducibility was 10% or better as demonstrated by the quantitative analysis of propranolol and its stable isotope-labeled internal standard propranolol-d7. The ability of the system to multiply charge and analyze macromolecules was demonstrated using the protein cytochrome c. This immediate drop on demand technology/open port sampling interface/ESI-MS combination allowed for the quantitative analysis of relatively small mass analytes and was used for the identification of macromolecules like proteins.
Maddalena, Damian; Hoffman, Forrest; Kumar, Jitendra; Hargrove, William
2014-08-01
Sampling networks rarely conform to spatial and temporal ideals, often comprised of network sampling points which are unevenly distributed and located in less than ideal locations due to access constraints, budget limitations, or political conflict. Quantifying the global, regional, and temporal representativeness of these networks by quantifying the coverage of network infrastructure highlights the capabilities and limitations of the data collected, facilitates upscaling and downscaling for modeling purposes, and improves the planning efforts for future infrastructure investment under current conditions and future modeled scenarios. The work presented here utilizes multivariate spatiotemporal clustering analysis and representativeness analysis for quantitative landscape characterization and assessment of the Fluxnet, RAINFOR, and ForestGEO networks. Results include ecoregions that highlight patterns of bioclimatic, topographic, and edaphic variables and quantitative representativeness maps of individual and combined networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa
Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol using 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% with liver thickness increasing from 8 μm to 24 μm. Randomly selecting half of the samples as standards, precision and accuracy of propranolol concentrations obtained for the other half of samples as quality control metrics were determined. Resulting precision (±15%) and accuracy (±3%) values, respectively, were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Furthermore, this means that once the extraction efficiency was calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.
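The extraction efficiency compared here is, in essence, the ratio of the propranolol concentration recovered by droplet-based surface sampling to the concentration established by bulk punch extraction of an adjacent section; written out (a restatement of the comparison, not a formula quoted from the paper):

\mathrm{Extraction\ efficiency}\ (\%) = \frac{C_{\text{droplet sampling}}}{C_{\text{bulk punch}}} \times 100

On this reading, the drop from ~65% to ~36% as liver sections thicken from 8 to 24 μm plausibly reflects the surface droplet extracting mainly the upper portion of increasingly thick sections.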
White, Paul A; Johnson, George E
2016-05-01
Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the relationships between genetic damage and disease, and the concomitant ability to use genetic toxicity results per se. © Her Majesty the Queen in Right of Canada 2016. Reproduced with the permission of the Minister of Health.
Mehdi, Muhammad Zain; Nagi, Abdul Hanan; Naseem, Nadia
2016-01-01
Fuhrman nuclear grade is the most important histological parameter to predict prognosis in a patient of renal cell carcinoma (RCC). However, it suffers from inter-observer and intra-observer variation, giving rise to the need for a parameter that not only correlates with nuclear grade but is also objective and reproducible. Proliferation is the measure of aggressiveness of a tumour and it is strongly correlated with Fuhrman nuclear grade, clinical survival and recurrence in RCC. Ki-67 is conventionally used to assess proliferation. Mini-chromosome maintenance 2 (MCM-2) is a lesser known marker of proliferation and identifies a greater proliferation fraction. This study was designed to assess the prognostic significance of MCM-2 by comparing it with Fuhrman nuclear grade and Ki-67. A total of 50 cases of various ages, stages, histological subtypes and grades of RCC were selected for this study. Immunohistochemical staining using Ki-67 (MIB-1, mouse monoclonal antibody, Dako) and MCM-2 (mouse monoclonal antibody, Thermo) was performed on the paraffin embedded blocks in the Department of Morbid Anatomy and Histopathology, University of Health Sciences, Lahore. Labeling indices (LI) were determined by two pathologists independently using quantitative and semi-quantitative analysis. Statistical analysis was carried out using SPSS 20.0. The Kruskal-Wallis test was used to determine the correlation of proliferation markers with grade, and Pearson's correlation was used to determine the correlation between the two proliferation markers. The labeling index of MCM-2 (median = 24.29%) was found to be much higher than that of Ki-67 (median = 13.05%). Both markers were significantly related with grade (p=0.00; Kruskal-Wallis test). The LI of MCM-2 was found to correlate significantly with the LI of Ki-67 (r=0.0934; p=0.01, Pearson's correlation). Results of semi-quantitative analysis correlated well with quantitative analysis. Both Ki-67 and MCM-2 are markers of proliferation which are closely linked to grade. Therefore, they can act as surrogate markers for grade in a manner that is more objective and reproducible. Copyright® by the International Brazilian Journal of Urology.
2013-06-01
... measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the ... quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost ... Reference: Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review.
A convenient method for X-ray analysis in TEM that measures mass thickness and composition
NASA Astrophysics Data System (ADS)
Statham, P.; Sagar, J.; Holland, J.; Pinard, P.; Lozano-Perez, S.
2018-01-01
We consider a new approach for quantitative analysis in transmission electron microscopy (TEM) that offers the same convenience as single-standard quantitative analysis in scanning electron microscopy (SEM). Instead of a bulk standard, a thin film with known mass thickness is used as a reference. The procedure involves recording an X-ray spectrum from the reference film for each session of acquisitions on real specimens. There is no need to measure the beam current; the current only needs to be stable for the duration of the session. A new reference standard with a large (1 mm × 1 mm) area of uniform thickness of 100 nm silicon nitride is used to reveal regions of X-ray detector occlusion that would give misleading results for any X-ray method that measures thickness. Unlike previous methods, the new X-ray method does not require an accurate beam current monitor but delivers equivalent accuracy in mass thickness measurement. Quantitative compositional results are also automatically corrected for specimen self-absorption. The new method is tested using a wedge specimen of Inconel 600 that is used to calibrate the high-angle annular dark field (HAADF) signal to provide a thickness reference, and results are compared with electron energy-loss spectrometry (EELS) measurements. For the new X-ray method, element composition results are consistent with the expected composition for the alloy and the mass thickness measurement is shown to provide an accurate alternative to EELS for thickness determination in TEM without the uncertainty associated with mean free path estimates.
Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens
We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
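One plausible reading of the sub-model scheme described above, sketched with scikit-learn: train separate PLS regressions on low-, mid- and high-concentration subsets, use a full-range PLS model to judge which composition range a new spectrum most likely falls in, and blend the sub-model predictions accordingly. The composition ranges, component counts and blending rule below are illustrative assumptions, not the ChemCam team's exact settings:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 100))          # stand-in for LIBS spectra
y = rng.uniform(0, 60, size=300)         # stand-in for wt% of one element

full = PLSRegression(n_components=5).fit(X, y)

# Sub-models trained on overlapping composition ranges (illustrative cut-offs)
ranges = [(0, 20), (10, 40), (30, 60)]
subs = [PLSRegression(n_components=5).fit(X[(y >= lo) & (y <= hi)],
                                          y[(y >= lo) & (y <= hi)])
        for lo, hi in ranges]

def blended_predict(x):
    """Weight each sub-model by how close the full-model estimate is to its range centre."""
    x = x.reshape(1, -1)
    first_pass = full.predict(x).item()
    centres = np.array([(lo + hi) / 2 for lo, hi in ranges])
    w = 1.0 / (np.abs(centres - first_pass) + 1e-6)
    w /= w.sum()
    return float(sum(wi * m.predict(x).item() for wi, m in zip(w, subs)))

print(blended_predict(X[0]))
```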
Li, Lin; Xu, Shuo; An, Xin; Zhang, Lu-Da
2011-10-01
In near-infrared spectral quantitative analysis, the precision of the measured samples' chemical values is the theoretical limit of that of quantitative analysis with mathematical models. However, the number of samples whose chemical values can be measured accurately is small. Many models exclude the samples without chemical values and consider only those samples with chemical values when modeling the contents of sample compositions. To address this problem, a semi-supervised LS-SVR (S2 LS-SVR) model is proposed on the basis of LS-SVR, which can utilize samples without chemical values as well as those with chemical values. As with the LS-SVR, training this model is equivalent to solving a linear system. Finally, samples of flue-cured tobacco were taken as experimental material, and corresponding quantitative analysis models were constructed for the contents of four sample compositions (total sugar, reducing sugar, total nitrogen and nicotine) with PLS regression, LS-SVR and S2 LS-SVR. For the S2 LS-SVR model, the average relative errors between actual values and predicted ones for the four sample compositions' contents are 6.62%, 7.56%, 6.11% and 8.20%, respectively, and the correlation coefficients are 0.9741, 0.9733, 0.9230 and 0.9486, respectively. Experimental results show the S2 LS-SVR model outperforms the other two, which verifies the feasibility and efficiency of the S2 LS-SVR model.
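For context, training a standard LS-SVR (the supervised core on which the S2 LS-SVR above builds) is indeed equivalent to solving one linear system in the bias b and the dual coefficients alpha:

\begin{bmatrix} 0 & \mathbf{1}^{\top} \\ \mathbf{1} & \Omega + \gamma^{-1} I \end{bmatrix} \begin{bmatrix} b \\ \boldsymbol{\alpha} \end{bmatrix} = \begin{bmatrix} 0 \\ \mathbf{y} \end{bmatrix}, \qquad f(\mathbf{x}) = \sum_{i=1}^{n} \alpha_i K(\mathbf{x}, \mathbf{x}_i) + b,

where \Omega_{ij} = K(\mathbf{x}_i, \mathbf{x}_j) is the kernel matrix over the labelled samples and \gamma is the regularization parameter. How the unlabelled samples enter the S2 LS-SVR system is not specified in the abstract; a graph- or manifold-regularization term added to \Omega is one common construction, but that detail is an assumption here.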
Quadrant photodetector sensitivity.
Manojlović, Lazo M
2011-07-10
A quantitative theoretical analysis of the quadrant photodetector (QPD) sensitivity in position measurement is presented. The Gaussian light spot irradiance distribution on the QPD surface was assumed to meet most of the real-life applications of this sensor. As the result of the mathematical treatment of the problem, we obtained, in a closed form, the sensitivity function versus the ratio of the light spot 1/e radius and the QPD radius. The obtained result is valid for the full range of the ratios. To check the influence of the finite light spot radius on the interaxis cross talk and linearity, we also performed a mathematical analysis to quantitatively measure these types of errors. An optimal range of the ratio of light spot radius and QPD radius has been found to simultaneously achieve low interaxis cross talk and high linearity of the sensor. © 2011 Optical Society of America
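The closed-form sensitivity derived in the paper is not reproduced here, but the underlying quantity is straightforward to evaluate numerically: integrate the Gaussian irradiance over the four quadrants of a circular detector, form the normalized difference signal, and differentiate it at zero displacement. A rough numerical sketch of that calculation (the grid resolution and the 1/e definition of the spot radius are assumptions):

```python
import numpy as np

def qpd_x_signal(dx, spot_radius, det_radius, n=801):
    """Normalized x-difference signal of a quadrant photodetector.

    Gaussian spot with 1/e radius `spot_radius` centred at (dx, 0) on a
    circular detector of radius `det_radius`.
    """
    x = np.linspace(-det_radius, det_radius, n)
    xx, yy = np.meshgrid(x, x)
    irr = np.exp(-((xx - dx) ** 2 + yy ** 2) / spot_radius ** 2)
    irr[xx ** 2 + yy ** 2 > det_radius ** 2] = 0.0        # outside active area
    right, left = irr[:, x > 0].sum(), irr[:, x < 0].sum()
    return (right - left) / irr.sum()

# Sensitivity (signal change per unit displacement) near the detector centre
w, R = 0.3, 1.0                    # 1/e spot radius and detector radius (a.u.)
eps = 1e-3
sensitivity = (qpd_x_signal(eps, w, R) - qpd_x_signal(-eps, w, R)) / (2 * eps)
print(sensitivity)
```

Sweeping the ratio of spot radius to detector radius in such a sketch traces out the sensitivity-versus-ratio curve that the paper derives in closed form.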
ERIC Educational Resources Information Center
Binglin, Zhong
2016-01-01
The article presents a quantitative analysis of the evaluation results for 41 newly built undergraduate schools that submitted to the qualification evaluation of undergraduate work by Ministry of Education in 2013. It shows that newly built undergraduate schools should place great emphasis on connotation construction and quality promotion and on…
Acoustic Facies Analysis of Side-Scan Sonar Data
NASA Astrophysics Data System (ADS)
Dwan, Fa Shu
Acoustic facies analysis methods have allowed the generation of system-independent values for the quantitative seafloor acoustic parameter, backscattering strength, from GLORIA and (TAMU)² side-scan sonar data. The resulting acoustic facies parameters enable quantitative comparisons of data collected by different sonar systems, data from different environments, and measurements made with different survey geometries. Backscattering strength values were extracted from the sonar amplitude data by inversion based on the sonar equation. Image processing products reveal seafloor features and patterns of relative intensity. To quantitatively compare data collected at different times or by different systems, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made on any given data set for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area contribution, and grazing angle effects. In the sonar equation, backscattering strength is the sonar parameter which is directly related to seafloor properties. The GLORIA data used in this study are from the edge of a distal lobe of the Monterey Fan. An interfingered region of strong and weak seafloor signal returns from a flat seafloor region provides an ideal data set for this study. Inversion of imagery data from the region allows the quantitative definition of different acoustic facies. The (TAMU)² data used are from a calibration site near the Green Canyon area of the Gulf of Mexico. Acoustic facies analysis techniques were implemented to generate statistical information for acoustic facies based on the estimates of backscattering strength. The backscattering strength values have been compared with Lambert's Law and other functions to parameterize the description of the acoustic facies. The resulting Lambertian constant values range from -26 dB to -36 dB. A modified Lambert relationship, which consists of both intercept and slope terms, appears to represent the BSS versus grazing angle profiles better based on χ² testing and error ellipse generation. Different regression functions, composed of trigonometric functions, were analyzed for different segments of the BSS profiles. A cotangent or sine/cosine function shows promising results for representing the entire grazing angle span of the BSS profiles.
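The Lambert's Law comparison mentioned above can be written, for grazing angle θ, as a fit of backscattering strength to BS(θ) = μ + 10 log10(sin²θ); the modified Lambert relationship then frees both intercept and slope. A small least-squares sketch under that reading (the exact functional form used in the dissertation is not restated in the abstract, so this is an assumption, and the profile values are invented):

```python
import numpy as np

theta_deg = np.array([10, 20, 30, 40, 50, 60])                  # grazing angles
bss_db = np.array([-42.0, -36.5, -33.0, -31.0, -29.5, -28.5])   # toy BSS profile

x = 10.0 * np.log10(np.sin(np.radians(theta_deg)) ** 2)

# Classic Lambert's Law: BS = mu + x (angular slope fixed at 1), solve for mu
mu = np.mean(bss_db - x)

# Modified Lambert relationship: BS = a + b * x (intercept and slope both free)
b, a = np.polyfit(x, bss_db, deg=1)

print(f"Lambertian constant mu = {mu:.1f} dB; modified fit: a = {a:.1f} dB, b = {b:.2f}")
```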
Houssaye, Alexandra; Taverne, Maxime; Cornette, Raphaël
2018-05-01
Long bone inner structure and cross-sectional geometry display a strong functional signal, leading to convergences, and are widely analyzed in comparative anatomy at small and large taxonomic scales. Long bone microanatomical studies have essentially been conducted on transverse sections but also on a few longitudinal ones. Recent studies highlighted the interest in analyzing variations of the inner structure along the diaphysis using a qualitative as well as a quantitative approach. With the development of microtomography, it has become possible to study three-dimensional (3D) bone microanatomy and, in more detail, the form-function relationships of these features. This study focused on the selection of quantitative parameters to describe in detail the cross-sectional shape changes and distribution of the osseous tissue along the diaphysis. Two-dimensional (2D) virtual transverse sections were also performed in the two usual reference planes and results were compared with those obtained based on the whole diaphysis analysis. The sample consisted of 14 humeri and 14 femora of various mammalian taxa that are essentially terrestrial. Comparative quantitative analyses between different datasets made it possible to highlight the parameters that are strongly impacted by size and phylogeny and the redundant ones, and thus to estimate their relevance for use in form-function analyses. The analysis illustrated that results based on 2D transverse sections are similar for both sectional planes; thus, even if a strong bias exists when mixing sections from the two reference planes in the same analysis, it would not be problematic to use either one plane or the other in comparative studies. However, this may no longer hold for taxa showing a much stronger variation in bone microstructure along the diaphysis. Finally, the analysis demonstrated the significant contribution of the parameters describing variations along the diaphysis, and thus the interest in performing 3D analyses; this should be even more fruitful for heterogeneous diaphyses. In addition, covariation analyses showed that there is a strong interest in removing the size effect to access the differences in the microstructure of the humerus and femur. This methodological study provides a reference for future quantitative analyses on long bone inner structure and should make it possible, through a detailed knowledge of each descriptive parameter, to better interpret results from the multivariate analyses associated with these studies. This will have direct implications for studies in vertebrate anatomy, but also in paleontology and anthropology. © 2018 Anatomical Society.
Sahiner, Ilgin; Akdemir, Umit O; Kocaman, Sinan A; Sahinarslan, Asife; Timurkaynak, Timur; Unlu, Mustafa
2013-02-01
Myocardial perfusion SPECT (MPS) is a noninvasive method commonly used for assessment of the hemodynamic significance of intermediate coronary stenoses. Fractional flow reserve (FFR) measurement is a well-validated invasive method used for the evaluation of intermediate stenoses. We aimed to determine the association between MPS and FFR findings in intermediate degree stenoses and evaluate the added value of quantification in MPS. Fifty-eight patients who underwent intracoronary pressure measurement in the catheterization laboratory to assess the physiological significance of intermediate (40-70%) left anterior descending (LAD) artery lesions, and who also underwent stress myocardial perfusion SPECT either for the assessment of an intermediate stenosis or for suspected coronary artery disease were analyzed retrospectively in the study. Quantitative analysis was performed using the 4DMSPECT program, with visual assessment performed by two experienced nuclear medicine physicians blinded to the angiographic findings. Summed stress scores (SSS) and summed difference scores (SDS) in the LAD artery territory according to the 20 segment model were calculated. A summed stress score of ≥ 3 and an SDS of ≥ 2 were assumed as pathologic, indicating significance of the lesion; a cutoff value of 0.75 was used to define abnormal FFR. Both visual and quantitative assessment results were compared with FFR using Chi-square (χ²) test. The mean time interval between two studies was 13 ± 11 days. FFR was normal in 45 and abnormal in 13 patients. Considering the FFR results as the gold standard method for assessing the significance of the lesion, the sensitivity and specificity of quantitative analysis determining the abnormal flow reserve were 85 and 84%, respectively, while visual analysis had a sensitivity of 77% and a specificity of 51%. There was a good agreement between the observers (κ = 0.856). Summed stress and difference scores demonstrated moderate inverse correlations with FFR values (r = -0.542, p < 0.001 and r = -0.506, p < 0.001, respectively). Quantitative analysis of the myocardial perfusion SPECT increases the specificity in evaluating the significance of intermediate degree coronary lesions.
2012-01-01
Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind. PMID:22369037
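A minimal sketch of the time-series clustering step described above: represent each parasite's phenotypic response as a time-series, compute a pairwise distance matrix (plain Euclidean distance on z-normalized series here, though an elastic measure such as dynamic time warping could be substituted), and group the responses by hierarchical clustering. The series lengths, distance choice and cluster count are illustrative assumptions rather than the paper's settings:

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# 30 parasites x 48 time points of a motion-based phenotype score (synthetic)
series = np.vstack([np.cumsum(rng.normal(drift, 1.0, 48))
                    for drift in rng.choice([-0.3, 0.0, 0.3], size=30)])

# z-normalize each series so clustering reflects response shape, not scale
z = (series - series.mean(axis=1, keepdims=True)) / series.std(axis=1, keepdims=True)

dist = pdist(z, metric="euclidean")         # condensed pairwise distance matrix
tree = linkage(dist, method="average")      # agglomerative clustering
labels = fcluster(tree, t=3, criterion="maxclust")
print(labels)
```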
Graphical Interaction Analysis Impact on Groups Collaborating through Blogs
ERIC Educational Resources Information Center
Fessakis, Georgios; Dimitracopoulou, Angelique; Palaiodimos, Aggelos
2013-01-01
This paper presents empirical research results regarding the impact of Interaction Analysis (IA) graphs on groups of students collaborating through online blogging according to a "learning by design" scenario. The IA graphs used are of two categories; the first category summarizes quantitatively the activity of the users for each blog,…
Teaching Students with Visual Impairments in an Inclusive Educational Setting: A Case from Nepal
ERIC Educational Resources Information Center
Lamichhane, Kamal
2017-01-01
Using the data set from teachers and students and utilising both qualitative and quantitative techniques for analysis, I discuss teaching style considerations in Nepal's mainstream schools for students with visual impairments. Results of the econometric analysis show that teachers' years of schooling, teaching experience, and using blackboard were…
E-Books in Academic Libraries: Results of a Survey Carried out in Sweden and Lithuania
ERIC Educational Resources Information Center
Maceviciute, Elena; Wilson, T. D.; Gudinavicius, Arunas; Šuminas, Andrius
2017-01-01
Introduction: This paper reports on a study of e-books issues in academic libraries in two European countries representative of small language markets--Sweden and Lithuania. Method: Questionnaire surveys, using the same instrument, were carried out in Swedish and Lithuanian academic libraries. Analysis: Quantitative analysis was performed using…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1976-08-05
During the thirteen year duration of this contract the goal has been to develop and apply computer based analysis of radionuclide scan data so as to make available improved diagnostic information based on a knowledge of localized quantitative estimates of radionuclide concentration. Results are summarized. (CH)
Validation of quantitative method for azoxystrobin residues in green beans and peas.
Abdelraheem, Ehab M H; Hassan, Sayed M; Arief, Mohamed M H; Mohammad, Somaia G
2015-09-01
This study presents a method validation for extraction and quantitative analysis of azoxystrobin residues in green beans and peas using HPLC-UV, with the results confirmed by GC-MS. The employed method involved initial extraction with acetonitrile after the addition of salts (magnesium sulfate and sodium chloride), followed by a cleanup step with activated neutral carbon. The validation parameters (linearity, matrix effect, LOQ, specificity, trueness and repeatability precision) were assessed. The spiking levels for the trueness and precision experiments were 0.1, 0.5 and 3 mg/kg. For HPLC-UV analysis, mean recoveries ranged from 83.69% to 91.58% and from 81.99% to 107.85% for green beans and peas, respectively. For GC-MS analysis, mean recoveries ranged from 76.29% to 94.56% and from 80.77% to 100.91% for green beans and peas, respectively. According to these results, the method has been proven to be efficient for the extraction and determination of azoxystrobin residues in green beans and peas. Copyright © 2015 Elsevier Ltd. All rights reserved.
Wang, Ye; Williams, Cheri
2014-01-01
In a qualitative meta-analysis, the researchers systematically reviewed qualitative and quantitative meta-analyses on reading research with PK-12 students published after the 2000 National Reading Panel (NRP) report. Eleven qualitative and 39 quantitative meta-analyses were reviewed examining reading research with typically developing hearing students, special education hearing students (including English Language Learners), and d/Deaf or hard of hearing (d/Dhh) students. Generally, the meta-analysis yielded findings similar to and corroborative of the NRP's. Contradictory results (e.g., regarding the role of rhyme awareness in reading outcomes) most often resulted from differing definitions of interventions and their measurements. The analysis provided evidence of several instructional approaches that support reading development. On the basis of the qualitative similarity hypothesis (Paul, 2010, 2012; Paul & Lee, 2010; Paul & Wang, 2012; Paul, Wang, & Williams, 2013), the researchers argue that these instructional strategies also should effectively support d/Dhh children's reading development.
Carranco, Núria; Farrés-Cebrián, Mireia; Saurina, Javier
2018-01-01
A high-performance liquid chromatography method with ultraviolet detection (HPLC-UV) fingerprinting was applied for the analysis and characterization of olive oils, using a Zorbax Eclipse XDB-C8 reversed-phase column under gradient elution with 0.1% formic acid aqueous solution and methanol as the mobile phase. More than 130 edible oils, including monovarietal extra-virgin olive oils (EVOOs) and other vegetable oils, were analyzed. Principal component analysis results showed a noticeable discrimination between olive oils and other vegetable oils using raw HPLC-UV chromatographic profiles as data descriptors. However, selected HPLC-UV chromatographic time-window segments were necessary to achieve discrimination among monovarietal EVOOs. Partial least squares (PLS) regression was employed to tackle authentication of Arbequina EVOO adulterated with Picual EVOO, a refined olive oil, and sunflower oil. Highly satisfactory results were obtained after PLS analysis, with overall errors in the quantitation of adulteration in the Arbequina EVOO (minimum 2.5% adulterant) below 2.9%. PMID:29561820
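As a rough illustration of the chemometric step described above, the sketch below fits a PLS regression model to synthetic chromatographic fingerprints and reports a cross-validated quantitation error; the array shapes, component count and data are placeholder assumptions, not the authors' setup.

```python
# Minimal sketch: PLS regression to quantify adulterant level from
# HPLC-UV fingerprints (hypothetical arrays; not the authors' data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.random((40, 300))          # 40 chromatographic fingerprints, 300 time points
y = rng.uniform(2.5, 50, 40)       # adulterant content (%), e.g. Picual in Arbequina

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()   # cross-validated predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))          # overall quantitation error
print(f"RMSECV = {rmsecv:.2f} % adulterant")
```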
Watanabe, Eiki; Miyake, Shiro
2013-01-15
This work presents the analytical performance of a kit-based direct competitive enzyme-linked immunosorbent assay (dc-ELISA) for azoxystrobin detection in agricultural products. The dc-ELISA was sufficiently sensitive for analysis of residue levels close to the maximum residue limits and did not show cross-reactivity to other strobilurin analogues. Absorbance decreased as the methanol concentration in the sample solution increased from 2% to 40%, while the standard curve was most linear when the sample solution contained 10% methanol. Agricultural samples were therefore extracted with methanol and the extracts diluted with water to 10% methanol. No significant matrix interference was observed. Satisfactory recoveries were found for all spiked samples, and the results agreed well with those obtained by liquid chromatography. These results clearly indicate that the kit-based dc-ELISA is suitable for rapid, simple, quantitative and reliable determination of the fungicide. Copyright © 2012 Elsevier Ltd. All rights reserved.
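Competitive ELISA readouts such as the one described are commonly interpolated from a sigmoidal standard curve; the sketch below fits a four-parameter logistic model to hypothetical absorbance/concentration pairs and inverts it for an unknown. The numbers and the 4PL form are illustrative assumptions, not the kit's documented procedure.

```python
# Minimal sketch: fit a four-parameter logistic standard curve for a
# competitive ELISA and interpolate an unknown (hypothetical numbers).
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    # a: upper asymptote, d: lower asymptote, c: midpoint (IC50), b: slope
    return d + (a - d) / (1.0 + (x / c) ** b)

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # ng/mL standards
absorb = np.array([1.65, 1.40, 1.02, 0.62, 0.33, 0.18])  # absorbance readings

popt, _ = curve_fit(four_pl, conc, absorb, p0=[1.7, 1.0, 1.0, 0.1])
a, b, c, d = popt

def conc_from_abs(y):
    # invert the 4PL model to recover concentration from absorbance
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

print(conc_from_abs(0.8))   # unknown sample absorbance -> ng/mL
```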
[Urban ecological land in Changsha City: its quantitative analysis and optimization].
Li, Xiao-Li; Zeng, Guang-Ming; Shi, Lin; Liang, Jie; Cai, Qing
2010-02-01
In this paper, a hierarchy index system suitable for the catastrophe progression method was constructed to comprehensively analyze and evaluate the status of ecological land construction in Changsha City in 2007. Based on the evaluation results, the irrationalities of the distribution pattern of Changsha urban ecological land were discussed. With the support of a geographic information system (GIS), ecological corridors for the urban ecological land were constructed using 'least-cost' modeling and, in combination with conflict analysis, an optimization plan for the urban ecological land was put forward, forming an integrated evaluation system. The results indicated that the ecological efficiency of urban ecological land in Changsha in 2007 was at a medium level, with an evaluation value of 0.9416; the quantitative index was relatively high but the coordination index relatively low. Analysis and verification with the software Fragstats showed that the ecological efficiency of the urban ecological land after optimization was higher, with an evaluation value of 0.9618, and the SHDI, CONTAG, and other indices were also improved.
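'Least-cost' corridor modeling of the kind mentioned above is typically a shortest-path search over a raster cost surface. The sketch below runs a plain Dijkstra search on a toy 5x5 cost grid; the grid values and patch locations are invented for illustration and do not reproduce the GIS workflow used in the study.

```python
# Minimal sketch of 'least-cost' corridor modelling: Dijkstra's algorithm on a
# raster cost surface, returning the cheapest 4-connected path between two
# ecological patches (toy cost grid, not the study data).
import heapq
import numpy as np

cost = np.array([[1, 1, 5, 5, 5],
                 [5, 1, 5, 1, 1],
                 [5, 1, 1, 1, 5],
                 [5, 5, 5, 1, 1],
                 [5, 5, 5, 5, 1]], dtype=float)

def least_cost_path(cost, start, goal):
    rows, cols = cost.shape
    dist = {start: cost[start]}
    prev, heap = {}, [(cost[start], start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nb = (node[0] + dr, node[1] + dc)
            if 0 <= nb[0] < rows and 0 <= nb[1] < cols:
                nd = d + cost[nb]
                if nd < dist.get(nb, np.inf):
                    dist[nb], prev[nb] = nd, node
                    heapq.heappush(heap, (nd, nb))
    path, node = [goal], goal
    while node != start:             # walk back from goal to start
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

print(least_cost_path(cost, (0, 0), (4, 4)))
```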
A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.
2015-12-07
Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
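A minimal sketch of the image-analysis idea, assuming a grayscale photograph in which stained biomass is brighter than the substrate: threshold the image automatically and report the stained area fraction. The file name and the Otsu threshold choice are illustrative assumptions, not the authors' algorithm.

```python
# Minimal sketch: estimate stained biofilm coverage as the fraction of
# image pixels above an automatic (Otsu) threshold.
from skimage import io, filters

img = io.imread("stained_coupon.png", as_gray=True)   # hypothetical photograph
thresh = filters.threshold_otsu(img)
fouled = img > thresh                                  # stained (fouled) pixels
coverage = fouled.mean() * 100.0                       # % of imaged surface fouled
print(f"Surface coverage: {coverage:.1f} %")
```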
Dinç, Erdal; Ozdemir, Abdil
2005-01-01
A multivariate chromatographic calibration technique was developed for the quantitative analysis of binary mixtures of enalapril maleate (EA) and hydrochlorothiazide (HCT) in tablets in the presence of losartan potassium (LST). The mathematical algorithm of the multivariate chromatographic calibration technique is based on linear regression equations constructed from the relationship between concentration and peak area at a set of five wavelengths. The algorithm of this mathematically simple calibration model is briefly described. The approach is a powerful mathematical tool for optimal chromatographic multivariate calibration and for eliminating fluctuations arising from instrumental and experimental conditions. The multivariate chromatographic calibration reduces the multivariate linear regression functions to a univariate data set. The model was validated by analyzing various synthetic binary mixtures and by using the standard addition technique. The developed calibration technique was applied to the analysis of real pharmaceutical tablets containing EA and HCT. The obtained results were compared with those obtained by a classical HPLC method, and the proposed multivariate chromatographic calibration was observed to give better results than classical HPLC.
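A minimal sketch of the calibration idea, assuming peak areas recorded for a set of standards at five wavelengths: a univariate concentration-area line is fitted at each wavelength and the five predictions for an unknown are averaged. The standards, responses and noise are invented placeholders.

```python
# Minimal sketch: reduce a multi-wavelength calibration to five univariate
# regressions and average their predictions (hypothetical data).
import numpy as np

conc = np.array([5.0, 10.0, 20.0, 40.0, 80.0])             # standards, ug/mL
# peak areas for the standards at 5 wavelengths (rows: standards, cols: wavelengths)
areas = np.outer(conc, [1.2, 1.0, 0.8, 0.6, 0.4]) \
        + np.random.default_rng(1).normal(0, 0.5, (5, 5))

# fit concentration vs. peak area separately at each wavelength
coeffs = [np.polyfit(areas[:, j], conc, 1) for j in range(areas.shape[1])]

unknown_areas = np.array([30.0, 25.2, 20.1, 15.3, 10.2])
estimates = [np.polyval(c, a) for c, a in zip(coeffs, unknown_areas)]
print(np.mean(estimates))   # averaged concentration estimate for the unknown
```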
Sastre Toraño, J; van Hattum, S H
2001-10-01
A new method is presented for the quantitative analysis of compounds in pharmaceutical preparations using Fourier transform (FT) mid-infrared (MIR) spectroscopy with an attenuated total reflection (ATR) module. Reduction of the number of overlapping absorption bands, by interaction of the compound of interest with an appropriate solvent, and the employment of an internal standard (IS) make MIR suitable for quantitative analysis. Vigabatrin, as the active compound in vigabatrin 100-mg capsules, was used as a model compound for the development of the method. Vigabatrin was extracted from the capsule content with water after addition of a sodium thiosulfate IS solution. The extract was concentrated by volume reduction and applied to the FTMIR-ATR module. Concentrations of unknown samples were calculated from the ratio of the vigabatrin band area (1321-1610 cm(-1)) and the IS band area (883-1215 cm(-1)) using a calibration standard. The ratio of the area of the vigabatrin peak to that of the IS was linear with the concentration in the range of interest (90-110 mg, in twofold; n=2). The accuracy of the method in this range was 99.7-100.5% (n=5) with a variability of 0.4-1.3% (n=5). Comparison of the presented method with an HPLC assay showed similar results; the analysis of five vigabatrin 100-mg capsules resulted in a mean concentration of 102 mg with a variation of 2% with both methods.
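The quantitation step reduces to a band-area ratio referenced to a single calibration standard; a minimal sketch with hypothetical band areas:

```python
# Minimal sketch: single-point quantitation from the analyte/internal-standard
# band-area ratio, referenced to one calibration standard (hypothetical values).
area_vigabatrin, area_is = 154.2, 98.7       # sample band areas (a.u.)
cal_ratio, cal_amount_mg = 1.50, 100.0       # ratio and content of the calibration standard

sample_ratio = area_vigabatrin / area_is
amount_mg = sample_ratio / cal_ratio * cal_amount_mg
print(f"{amount_mg:.1f} mg per capsule")
```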
Taira, Chiaki; Matsuda, Kazuyuki; Yamaguchi, Akemi; Uehara, Masayuki; Sugano, Mitsutoshi; Okumura, Nobuo; Honda, Takayuki
2015-05-20
Chimerism analysis is important for the evaluation of engraftment and predicting relapse following hematopoietic stem cell transplantation (HSCT). We developed a chimerism analysis for single nucleotide polymorphisms (SNPs), including rapid screening of discriminable donor/recipient alleles using droplet allele-specific PCR (droplet-AS-PCR) pre-HSCT and quantitation of recipient DNA using AS-quantitative PCR (AS-qPCR) following HSCT. SNP genotyping of 20 donor/recipient pairs via droplet-AS-PCR and evaluation of the informativity of 5 SNP markers for chimerism analysis were performed. Samples from six follow-up patients were analyzed to assess chimerism via AS-qPCR. These results were compared with those determined by short tandem repeat PCR (STR-PCR). Droplet-AS-PCR could determine genotypes within 8 min. The total informativity using all 5 loci was 95% (19/20). AS-qPCR provided the percentage of recipient DNA in all 6 follow-up patients without influence of the stutter peak or the amplification efficiency, which affected the STR-PCR results. The droplet-AS-PCR had an advantage over STR-PCR in terms of rapidity and simplicity for screening before HSCT. Furthermore, AS-qPCR had better accuracy than STR-PCR for quantification of recipient DNA following HSCT. The present chimerism assay compensates for the disadvantages of STR-PCR and is readily performable in clinical laboratories. Copyright © 2015 Elsevier B.V. All rights reserved.
Rohawi, Nur Syakila; Ramasamy, Kalavathy; Agatonovic-Kustrin, Snezana; Lim, Siong Meng
2018-06-05
A quantitative assay using high-performance thin-layer chromatography (HPTLC) was developed to investigate bile salt hydrolase (BSH) activity in Pediococcus pentosaceus LAB6 and Lactobacillus plantarum LAB12 probiotic bacteria isolated from Malaysian fermented food. Lactic acid bacteria (LAB) were cultured in de Man Rogosa and Sharpe (MRS) broth containing 1 mmol/L of sodium-based glyco- and tauro-conjugated bile salts for 24 h. The cultures were centrifuged and the resultant cell-free supernatant was subjected to chromatographic separation on an HPTLC plate. Conjugated bile salts were quantified by densitometric scans at 550 nm, and the results were compared to digital image analysis of the chromatographic plates after derivatisation with anisaldehyde/sulfuric acid. Standard curves for bile salt determination with both methods showed good linearity, with high coefficients of determination (R2) between 0.97 and 0.99. Method validation indicated good sensitivity with low relative standard deviation (RSD) (<10%), low limits of detection (LOD) of 0.4 versus 0.2 μg and limits of quantification (LOQ) of 1.4 versus 0.7 μg for the densitometric versus digital image analysis methods, respectively. Bile salt hydrolase activity was found to be higher against glyco- than tauro-conjugated bile salts (LAB6: 100% vs >38%; LAB12: 100% vs >75%). The present findings strongly show that digitally-enhanced HPTLC offers a rapid quantitative analysis of bile salt deconjugation by probiotics. Copyright © 2018. Published by Elsevier B.V.
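A minimal sketch of how linearity, LOD and LOQ can be derived from a calibration curve using the common 3.3·s/slope and 10·s/slope convention; the peak-area values are hypothetical and the convention is an assumption, not necessarily the exact procedure used by the authors.

```python
# Minimal sketch: linearity, LOD and LOQ from a densitometric calibration curve
# (hypothetical peak-area data).
import numpy as np
from scipy import stats

amount_ug = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
peak_area = np.array([410.0, 830.0, 1650.0, 3340.0, 6600.0])

res = stats.linregress(amount_ug, peak_area)
residuals = peak_area - (res.slope * amount_ug + res.intercept)
resid_sd = np.std(residuals, ddof=2)            # residual standard deviation

lod = 3.3 * resid_sd / res.slope
loq = 10.0 * resid_sd / res.slope
print(f"R^2={res.rvalue**2:.3f}  LOD={lod:.2f} ug  LOQ={loq:.2f} ug")
```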
iTRAQ-Based Quantitative Proteomic Analysis of the Initiation of Head Regeneration in Planarians.
Geng, Xiaofang; Wang, Gaiping; Qin, Yanli; Zang, Xiayan; Li, Pengfei; Geng, Zhi; Xue, Deming; Dong, Zimei; Ma, Kexue; Chen, Guangwen; Xu, Cunshuan
2015-01-01
The planarian Dugesia japonica has an amazing ability to regenerate a head from the anterior end of the amputated stump with maintenance of the original anterior-posterior polarity. Although planarians present an attractive system for molecular investigation of regeneration and research has focused on clarifying the molecular mechanism of regeneration initiation at the transcriptional level, the initiation mechanism of planarian head regeneration (PHR) remains unclear at the protein level. Here, a global analysis of proteome dynamics during the early stage of PHR was performed using an isobaric tags for relative and absolute quantitation (iTRAQ)-based quantitative proteomics strategy, and our data are available via ProteomeXchange with identifier PXD002100. The results showed that 162 proteins were differentially expressed at 2 h and 6 h following amputation. Furthermore, analysis of the expression patterns and functional enrichment of the differentially expressed proteins showed that proteins involved in muscle contraction, oxidation reduction and protein synthesis were up-regulated in the initiation of PHR. Moreover, ingenuity pathway analysis showed that predominant signaling pathways such as ILK, calcium, EIF2 and mTOR signaling, which are associated with cell migration, cell proliferation and protein synthesis, were likely to be involved in the initiation of PHR. The results demonstrated for the first time, at the global protein level, that muscle contraction and ILK signaling might play important roles in the initiation of PHR. The findings of this research provide a molecular basis for further unraveling the mechanism of head regeneration initiation in planarians.
Automated classification of cell morphology by coherence-controlled holographic microscopy
NASA Astrophysics Data System (ADS)
Strbkova, Lenka; Zicha, Daniel; Vesely, Pavel; Chmelik, Radim
2017-08-01
In the last few years, classification of cells by machine learning has become frequently used in biology. However, most of the approaches are based on morphometric (MO) features, which are not quantitative in terms of cell mass. This may result in poor classification accuracy. Here, we study the potential contribution of coherence-controlled holographic microscopy enabling quantitative phase imaging for the classification of cell morphologies. We compare our approach with the commonly used method based on MO features. We tested both classification approaches in an experiment with nutritionally deprived cancer tissue cells, while employing several supervised machine learning algorithms. Most of the classifiers provided higher performance when quantitative phase features were employed. Based on the results, it can be concluded that the quantitative phase features played an important role in improving the performance of the classification. The methodology could be valuable help in refining the monitoring of live cells in an automated fashion. We believe that coherence-controlled holographic microscopy, as a tool for quantitative phase imaging, offers all preconditions for the accurate automated analysis of live cell behavior while enabling noninvasive label-free imaging with sufficient contrast and high-spatiotemporal phase sensitivity.
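A minimal sketch of the comparison strategy, assuming per-cell feature tables: the same classifier is cross-validated on morphometric features alone and on morphometric plus quantitative-phase features. The synthetic data and the random-forest choice are placeholders, not the classifiers or features used in the study.

```python
# Minimal sketch: compare classification accuracy with morphometric (MO)
# features alone vs. MO + quantitative-phase features (synthetic data).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
mo = rng.normal(size=(n, 6))     # morphometric features (area, perimeter, ...)
qpi = rng.normal(size=(n, 4))    # phase features (dry mass, phase variance, ...)
labels = (qpi[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
acc_mo = cross_val_score(clf, mo, labels, cv=5).mean()
acc_all = cross_val_score(clf, np.hstack([mo, qpi]), labels, cv=5).mean()
print(f"MO only: {acc_mo:.2f}   MO + phase: {acc_all:.2f}")
```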
Teste, Marie-Ange; Duquenne, Manon; François, Jean M; Parrou, Jean-Luc
2009-01-01
Background Real-time RT-PCR is the recommended method for quantitative gene expression analysis. A compulsory step is the selection of good reference genes for normalization. A few genes often referred to as HouseKeeping Genes (HSK), such as ACT1, RDN18 or PDA1 are among the most commonly used, as their expression is assumed to remain unchanged over a wide range of conditions. Since this assumption is very unlikely, a geometric averaging of multiple, carefully selected internal control genes is now strongly recommended for normalization to avoid this problem of expression variation of single reference genes. The aim of this work was to search for a set of reference genes for reliable gene expression analysis in Saccharomyces cerevisiae. Results From public microarray datasets, we selected potential reference genes whose expression remained apparently invariable during long-term growth on glucose. Using the algorithm geNorm, ALG9, TAF10, TFC1 and UBC6 turned out to be genes whose expression remained stable, independent of the growth conditions and the strain backgrounds tested in this study. We then showed that the geometric averaging of any subset of three genes among the six most stable genes resulted in very similar normalized data, which contrasted with inconsistent results among various biological samples when the normalization was performed with ACT1. Normalization with multiple selected genes was therefore applied to transcriptional analysis of genes involved in glycogen metabolism. We determined an induction ratio of 100-fold for GPH1 and 20-fold for GSY2 between the exponential phase and the diauxic shift on glucose. There was no induction of these two genes at this transition phase on galactose, although in both cases, the kinetics of glycogen accumulation was similar. In contrast, SGA1 expression was independent of the carbon source and increased by 3-fold in stationary phase. Conclusion In this work, we provided a set of genes that are suitable reference genes for quantitative gene expression analysis by real-time RT-PCR in yeast biological samples covering a large panel of physiological states. In contrast, we invalidated and discourage the use of ACT1 as well as other commonly used reference genes (PDA1, TDH3, RDN18, etc) as internal controls for quantitative gene expression analysis in yeast. PMID:19874630
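A minimal sketch of the multiple-reference-gene normalization idea: each sample's target quantity is divided by the geometric mean of several stable reference genes. The relative quantities below are hypothetical.

```python
# Minimal sketch: normalize a target gene's qPCR quantity by the geometric
# mean of three stable reference genes (e.g. ALG9, TAF10, UBC6).
import numpy as np

# relative quantities per sample (hypothetical, efficiency-corrected)
refs = np.array([
    [1.00, 0.95, 1.10],    # sample 1: reference genes 1-3
    [0.52, 0.48, 0.55],    # sample 2
])
target = np.array([2.4, 12.6])    # e.g. GPH1 relative quantity per sample

norm_factor = np.exp(np.mean(np.log(refs), axis=1))   # per-sample geometric mean
normalized = target / norm_factor
print(normalized, normalized[1] / normalized[0])       # fold induction
```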
Sarrigiannis, Ptolemaios G; Zhao, Yifan; Wei, Hua-Liang; Billings, Stephen A; Fotheringham, Jayne; Hadjivassiliou, Marios
2014-01-01
To introduce a new method of quantitative EEG analysis in the time domain, the error reduction ratio (ERR)-causality test. To compare performance against cross-correlation and coherence with phase measures. A simulation example was used as a gold standard to assess the performance of ERR-causality, against cross-correlation and coherence. The methods were then applied to real EEG data. Analysis of both simulated and real EEG data demonstrates that ERR-causality successfully detects dynamically evolving changes between two signals, with very high time resolution, dependent on the sampling rate of the data. Our method can properly detect both linear and non-linear effects, encountered during analysis of focal and generalised seizures. We introduce a new quantitative EEG method of analysis. It detects real time levels of synchronisation in the linear and non-linear domains. It computes directionality of information flow with corresponding time lags. This novel dynamic real time EEG signal analysis unveils hidden neural network interactions with a very high time resolution. These interactions cannot be adequately resolved by the traditional methods of coherence and cross-correlation, which provide limited results in the presence of non-linear effects and lack fidelity for changes appearing over small periods of time. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
Choël, Marie; Deboudt, Karine; Osán, János; Flament, Pascal; Van Grieken, René
2005-09-01
Atmospheric aerosols consist of a complex heterogeneous mixture of particles. Single-particle analysis techniques are known to provide unique information on the size-resolved chemical composition of aerosols. A scanning electron microscope (SEM) combined with a thin-window energy-dispersive X-ray (EDX) detector enables the morphological and elemental analysis of single particles down to 0.1 microm with a detection limit of 1-10 wt %, low-Z elements included. To obtain data statistically representative of the air masses sampled, a computer-controlled procedure can be implemented in order to run hundreds of single-particle analyses (typically 1000-2000) automatically in a relatively short period of time (generally 4-8 h, depending on the setup and on the particle loading). However, automated particle analysis by SEM-EDX raises two practical challenges: the accuracy of the particle recognition and the reliability of the quantitative analysis, especially for micrometer-sized particles with low atomic number contents. Since low-Z analysis is hampered by the use of traditional polycarbonate membranes, an alternate choice of substrate is a prerequisite. In this work, boron is being studied as a promising material for particle microanalysis. As EDX is generally said to probe a volume of approximately 1 microm3, geometry effects arise from the finite size of microparticles. These particle geometry effects must be corrected by means of a robust concentration calculation procedure. Conventional quantitative methods developed for bulk samples generate elemental concentrations considerably in error when applied to microparticles. A new methodology for particle microanalysis, combining the use of boron as the substrate material and a reverse Monte Carlo quantitative program, was tested on standard particles ranging from 0.25 to 10 microm. We demonstrate that the quantitative determination of low-Z elements in microparticles is achievable and that highly accurate results can be obtained using the automatic data processing described here compared to conventional methods.
Shi, Ximin; Li, Nan; Ding, Haiyan; Dang, Yonghong; Hu, Guilan; Liu, Shuai; Cui, Jie; Zhang, Yue; Li, Fang; Zhang, Hui; Huo, Li
2018-01-01
Kinetic modeling of dynamic 11C-acetate PET imaging provides quantitative information for myocardium assessment. The quality and quantitation of PET images are known to be dependent on PET reconstruction methods. This study aims to investigate the impacts of reconstruction algorithms on the quantitative analysis of dynamic 11C-acetate cardiac PET imaging. Suspected alcoholic cardiomyopathy patients (N = 24) underwent 11C-acetate dynamic PET imaging after a low-dose CT scan. PET images were reconstructed using four algorithms: filtered backprojection (FBP), ordered subsets expectation maximization (OSEM), OSEM with time-of-flight (TOF), and OSEM with both time-of-flight and point-spread-function (TPSF). Standardized uptake values (SUVs) at different time points were compared among images reconstructed using the four algorithms. Time-activity curves (TACs) in myocardium and blood pools of ventricles were generated from the dynamic image series. Kinetic parameters K1 and k2 were derived using a 1-tissue-compartment model for kinetic modeling of cardiac flow from 11C-acetate PET images. Significant image quality improvement was found in the images reconstructed using iterative OSEM-type algorithms (OSEM, TOF, and TPSF) compared with FBP. However, no statistical differences in SUVs were observed among the four reconstruction methods at the selected time points. Kinetic parameters K1 and k2 also exhibited no statistical difference among the four reconstruction algorithms in terms of mean value and standard deviation. However, in the correlation analysis, OSEM reconstruction presented a relatively higher residual in correlation with FBP reconstruction compared with TOF and TPSF reconstruction, and TOF and TPSF reconstruction were highly correlated with each other. All the tested reconstruction algorithms performed similarly for quantitative analysis of 11C-acetate cardiac PET imaging. TOF and TPSF yielded highly consistent kinetic parameter results with superior image quality compared with FBP, while OSEM was relatively less reliable. Both TOF and TPSF are recommended for cardiac 11C-acetate kinetic analysis.
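A minimal sketch of 1-tissue-compartment kinetic modeling, assuming a uniformly sampled image-derived input function: the tissue curve is modeled as K1 times the input convolved with exp(-k2 t), and K1, k2 are recovered by least squares. The curves are synthetic, not patient data.

```python
# Minimal sketch: fit K1 and k2 of a 1-tissue-compartment model to a
# myocardial time-activity curve given a blood-pool input function.
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(0, 20.0, 0.1)                  # minutes, uniform time grid
dt = t[1] - t[0]
cp = 10.0 * t * np.exp(-1.5 * t)             # synthetic input function (a.u.)

def one_tissue(t, K1, k2):
    # Ct(t) = K1 * conv(Cp, exp(-k2*t)) approximated on the uniform grid
    kernel = np.exp(-k2 * t)
    return K1 * np.convolve(cp, kernel)[: len(t)] * dt

ct_meas = one_tissue(t, 0.8, 0.15) + np.random.default_rng(0).normal(0, 0.02, len(t))
(K1, k2), _ = curve_fit(one_tissue, t, ct_meas, p0=[0.5, 0.1])
print(f"K1={K1:.3f}  k2={k2:.3f} 1/min")
```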
Smoothing of the bivariate LOD score for non-normal quantitative traits.
Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John
2005-12-30
Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtotic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.
Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.
NASA Technical Reports Server (NTRS)
Leonard, Desiree M.
1991-01-01
Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
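A minimal sketch of the object-recognition and quantitative-analysis phases on a toy binary image, using connected-component labelling and simple geometric properties; the segmentation phase is assumed to have already produced the binary array.

```python
# Minimal sketch: count objects in a binary image and compute geometric
# properties (area, centroid) via connected-component labelling.
import numpy as np
from scipy import ndimage

binary = np.zeros((64, 64), dtype=int)
binary[5:15, 5:20] = 1                 # object 1
binary[30:50, 40:55] = 1               # object 2

labels, n_objects = ndimage.label(binary)
idx = range(1, n_objects + 1)
areas = ndimage.sum(binary, labels, index=idx)            # pixel counts per object
centroids = ndimage.center_of_mass(binary, labels, idx)   # (row, col) centroids
print(n_objects, areas, centroids)
```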
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
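A minimal sketch of the PCA step of statistical shape modelling, assuming each aligned and registered specimen has been flattened to a fixed-length vector: the principal modes and per-specimen PC weights are obtained from an SVD of the centered data. The sample data are random placeholders.

```python
# Minimal sketch: PCA-based statistical shape model from aligned specimens,
# each flattened to one vector (synthetic stand-in data).
import numpy as np

rng = np.random.default_rng(0)
specimens = rng.random((12, 3000))       # 12 aligned, registered samples

mean_shape = specimens.mean(axis=0)
centered = specimens - mean_shape
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

pc_weights = U * S                       # per-specimen scores for inter-sample comparison
explained = S**2 / np.sum(S**2)          # variance explained by each mode
print(explained[:3], pc_weights[:, 0])
```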
NASA Astrophysics Data System (ADS)
Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA
2018-03-01
The chemical composition of alloys directly determines their mechanical behaviors and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgy quality control and material classification processes. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out combined correction of plasma temperature and spectral intensity by using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for major elements Cu and Zn and minor element Pb in the copper-lead alloys has been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.
Wang, Fei; He, Bei
2013-01-01
To investigate the role of endotracheal aspirate (EA) culture in the diagnosis and antibiotic management of ventilator-associated pneumonia (VAP). We searched the CNKI, Wanfang, PubMed and EMBASE databases for literature published from January 1990 to December 2011 on VAP microbiological diagnostic techniques, including EA and bronchoalveolar lavage fluid (BALF). The following key words were used: ventilator associated pneumonia, diagnosis and adult. A meta-analysis was performed and the sensitivity and specificity of EA for VAP diagnosis were calculated. Our literature search identified 1665 potential articles, 8 of which fulfilled our selection criteria, including 561 patients with paired cultures. Using BALF quantitative culture as the reference standard, the sensitivity and specificity of EA were 72% and 71%. When considering quantitative culture of EA only, the sensitivity and specificity improved to 90% and 65%, while the positive and negative predictive values were 68% and 89%, respectively. However, the sensitivity and specificity of semi-quantitative culture of EA were only 50% and 80%, with a positive predictive value of 77% and a negative predictive value of 58%. EA culture had relatively poor sensitivity and specificity, although quantitative culture of EA could improve the sensitivity. Initiating therapy on the basis of EA quantitative culture may still result in excessive antibiotic usage. Our data suggest that EA can provide some information for clinical decision-making but cannot replace BALF quantitative culture in VAP diagnosis.
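A minimal sketch of the diagnostic-accuracy arithmetic behind the pooled figures, from a 2x2 table of EA results against the BALF reference; the counts below are illustrative, not the pooled study data.

```python
# Minimal sketch: sensitivity, specificity and predictive values from a
# 2x2 table of index test vs. reference standard (illustrative counts).
tp, fp, fn, tn = 180, 70, 20, 130

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)     # positive predictive value
npv = tn / (tn + fn)     # negative predictive value
print(f"Se={sensitivity:.2f} Sp={specificity:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
```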
Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.
2016-01-01
Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions. Conversion and visualization tools utilizing this toolkit were developed. The encoded objects were validated for consistency and interoperability. The resulting dataset was deposited in the QIN-HEADNECK collection of The Cancer Imaging Archive (TCIA). Supporting tools for data analysis and DICOM conversion were made available as free open-source software. Discussion. We presented a detailed investigation of the development and application of the DICOM model, as well as the supporting open-source tools and toolkits, to accommodate representation of the research data in QI biomarker development. We demonstrated that the DICOM standard can be used to represent the types of data relevant in HNC QI biomarker development, and encode their complex relationships. The resulting annotated objects are amenable to data mining applications, and are interoperable with a variety of systems that support the DICOM standard. PMID:27257542
Role Of Social Networks In Resilience Of Naval Recruits: A Quantitative Analysis
2016-06-01
The data set comprises 1,297 total surveys from a total of eight divisions of recruits at two different time periods. Quantitative analyses using surveys and network data examine the effects… (Thesis by Andrea M. Watling, June 2016; Thesis Advisor: Edward H. Powley.)
QTest: Quantitative Testing of Theories of Binary Choice.
Regenwetter, Michel; Davis-Stober, Clintin P; Lim, Shiau Hong; Guo, Ying; Popova, Anna; Zwilling, Chris; Cha, Yun-Shil; Messner, William
2014-01-01
The goal of this paper is to make modeling and quantitative testing accessible to behavioral decision researchers interested in substantive questions. We provide a novel, rigorous, yet very general, quantitative diagnostic framework for testing theories of binary choice. This permits the nontechnical scholar to proceed far beyond traditionally rather superficial methods of analysis, and it permits the quantitatively savvy scholar to triage theoretical proposals before investing effort into complex and specialized quantitative analyses. Our theoretical framework links static algebraic decision theory with observed variability in behavioral binary choice data. The paper is supplemented with a custom-designed public-domain statistical analysis package, the QTest software. We illustrate our approach with a quantitative analysis using published laboratory data, including tests of novel versions of "Random Cumulative Prospect Theory." A major asset of the approach is the potential to distinguish decision makers who have a fixed preference and commit errors in observed choices from decision makers who waver in their preferences.
NASA Astrophysics Data System (ADS)
Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi
2009-10-01
As the production of engineered nanomaterials quantitatively expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system is needed for workplaces in the nanomaterial industry based on the precautionary principle. One of the problems in the risk management system is difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement especially at facilities with high levels of background NPs.
Alonso, Pablo; Cortizo, Millán; Cantón, Francisco R; Fernández, Belén; Rodríguez, Ana; Centeno, Maria L; Cánovas, Francisco M; Ordás, Ricardo J
2007-12-01
As part of a study aimed at understanding the physiological and molecular mechanisms involved in adventitious shoot bud formation in pine cotyledons, we conducted a transcriptome analysis to identify early-induced genes during the first phases of adventitious caulogenesis in Pinus pinea L. cotyledons cultured in the presence of benzyladenine. A subtractive cDNA library with more than 700 clones was constructed. Of these clones, 393 were sequenced, analyzed and grouped according to their putative function. Quantitative real-time PCR analysis was performed to confirm the differential expression of 30 candidate genes. Results are contrasted with available data for other species.
Safety evaluation methodology for advanced coal extraction systems
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1981-01-01
Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.
Quantitative petri net model of gene regulated metabolic networks in the cell.
Chen, Ming; Hofestädt, Ralf
2011-01-01
A method to exploit hybrid Petri nets (HPN) for quantitatively modeling and simulating gene regulated metabolic networks is demonstrated. A global kinetic modeling strategy and Petri net modeling algorithm are applied to perform the bioprocess functioning and model analysis. With the model, the interrelations between pathway analysis and metabolic control mechanism are outlined. Diagrammatical results of the dynamics of metabolites are simulated and observed by implementing a HPN tool, Visual Object Net ++. An explanation of the observed behavior of the urea cycle is proposed to indicate possibilities for metabolic engineering and medical care. Finally, the perspective of Petri nets on modeling and simulation of metabolic networks is discussed.
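A minimal sketch of the discrete core of a Petri net: places, transitions, and a token marking updated by firing enabled transitions. The hybrid and kinetic extensions used in the paper are not modelled, and the tiny net below is an invented example.

```python
# Minimal sketch: discrete Petri net step function (places x transitions
# incidence matrices, token marking, firing of enabled transitions).
import numpy as np

pre = np.array([[1, 0],     # tokens consumed by each transition
                [0, 1],
                [0, 0]])
post = np.array([[0, 0],    # tokens produced by each transition
                 [1, 0],
                 [0, 1]])
marking = np.array([3, 0, 0])     # initial tokens: substrate, intermediate, product

def fire_enabled(marking):
    for t in range(pre.shape[1]):
        if np.all(marking >= pre[:, t]):        # is transition t enabled?
            return marking - pre[:, t] + post[:, t]
    return marking                              # nothing enabled: deadlock

for step in range(6):
    marking = fire_enabled(marking)
    print(step, marking)
```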
Leakey, Tatiana I; Zielinski, Jerzy; Siegfried, Rachel N; Siegel, Eric R; Fan, Chun-Yang; Cooney, Craig A
2008-06-01
DNA methylation at cytosines is a widely studied epigenetic modification. Methylation is commonly detected using bisulfite modification of DNA followed by PCR and additional techniques such as restriction digestion or sequencing. These additional techniques are either laborious, require specialized equipment, or are not quantitative. Here we describe a simple algorithm that yields quantitative results from analysis of conventional four-dye-trace sequencing. We call this method Mquant and we compare it with the established laboratory method of combined bisulfite restriction assay (COBRA). This analysis of sequencing electropherograms provides a simple, easily applied method to quantify DNA methylation at specific CpG sites.
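A minimal sketch of the peak-height idea behind quantifying methylation from a bisulfite-sequencing electropherogram: at a CpG position, the methylation fraction is estimated from the C versus T peak heights. The peak values are hypothetical and this is not the published Mquant algorithm itself.

```python
# Minimal sketch: methylation percentage at one CpG site from the C and T
# peak heights of a bisulfite-converted four-dye sequencing trace.
c_peak, t_peak = 1350.0, 450.0      # hypothetical peak heights at the CpG position

methylation_pct = 100.0 * c_peak / (c_peak + t_peak)
print(f"{methylation_pct:.1f}% methylated at this CpG site")
```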
NASA Astrophysics Data System (ADS)
Zhang, Zhiyuan; Jiang, Wanrun; Wang, Bo; Wang, Zhigang
2017-06-01
We introduce the orbital-resolved electron density projected integral (EDPI) along the H-bond in real space to quantitatively investigate the specific contributions of molecular orbitals (MOs) in (H2O)2. Calculation results show that the occupied electronic orbital HOMO-4 of (H2O)2 accounts for a surprising ~40% of the electron density at the bond critical point. Moreover, electron density difference analysis visualizes the electron-accumulating effect of the orbital interaction within the H-bond between water molecules, supporting its covalent-like character. Our work expands the understanding of the H-bond in terms of specific contributions from certain MOs.
Ma, Shuguang; Li, Zhiling; Lee, Keun-Joong; Chowdhury, Swapan K
2010-12-20
A simple, reliable, and accurate method was developed for quantitative assessment of metabolite coverage in preclinical safety species by mixing equal volumes of human plasma with blank plasma of animal species and vice versa, followed by analysis using high-resolution full-scan accurate mass spectrometry. This approach provided results comparable (within ±15%) to those obtained from regulated bioanalysis and did not require synthetic standards or radiolabeled compounds. In addition, both qualitative and quantitative data were obtained on all metabolites from a single LC-MS analysis and, therefore, the coverage of any metabolite of interest can be obtained.
Recovery and Determination of Adsorbed Technetium on Savannah River Site Charcoal Stack Samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lahoda, Kristy G.; Engelmann, Mark D.; Farmer, Orville T.
2008-03-01
Experimental results are provided for the sample analyses for technetium (Tc) in charcoal samples placed in-line with a Savannah River Site (SRS) processing stack effluent stream as a part of an environmental surveillance program. The method for Tc removal from charcoal was based on that originally developed with high purity charcoal. Presented is the process that allowed for the quantitative analysis of 99Tc in SRS charcoal stack samples with and without 97Tc as a tracer. The results obtained with the method using the 97Tc tracer quantitatively confirm the results obtained with no tracer added. All samples contain 99Tc at the pg g-1 level.
Chapiro, Julius; Wood, Laura D.; Lin, MingDe; Duran, Rafael; Cornish, Toby; Lesage, David; Charu, Vivek; Schernthaner, Rüdiger; Wang, Zhijun; Tacher, Vania; Savic, Lynn Jeanette; Kamel, Ihab R.
2014-01-01
Purpose To evaluate the diagnostic performance of three-dimensional (3D) quantitative enhancement-based and diffusion-weighted volumetric magnetic resonance (MR) imaging assessment of hepatocellular carcinoma (HCC) lesions in determining the extent of pathologic tumor necrosis after transarterial chemoembolization (TACE). Materials and Methods This institutional review board-approved retrospective study included 17 patients with HCC who underwent TACE before surgery. Semiautomatic 3D volumetric segmentation of target lesions was performed at the last MR examination before orthotopic liver transplantation or surgical resection. The amount of necrotic tumor tissue on contrast material-enhanced arterial phase MR images and the amount of diffusion-restricted tumor tissue on apparent diffusion coefficient (ADC) maps were expressed as a percentage of the total tumor volume. Visual assessment of the extent of tumor necrosis and tumor response according to European Association for the Study of the Liver (EASL) criteria was performed. Pathologic tumor necrosis was quantified by using slide-by-slide segmentation. Correlation analysis was performed to evaluate the predictive values of the radiologic techniques. Results At histopathologic examination, the mean percentage of tumor necrosis was 70% (range, 10%-100%). Both 3D quantitative techniques demonstrated a strong correlation with tumor necrosis at pathologic examination (R2 = 0.9657 and R2 = 0.9662 for quantitative EASL and quantitative ADC, respectively) and a strong intermethod agreement (R2 = 0.9585). Both methods showed a significantly lower discrepancy with pathologically measured necrosis (residual standard error [RSE] = 6.38 and 6.33 for quantitative EASL and quantitative ADC, respectively) when compared with non-3D techniques (RSE = 12.18 for visual assessment). Conclusion This radiologic-pathologic correlation study demonstrates the diagnostic accuracy of 3D quantitative MR imaging techniques in identifying pathologically measured tumor necrosis in HCC lesions treated with TACE. © RSNA, 2014 Online supplemental material is available for this article. PMID:25028783
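A minimal sketch of how the agreement statistics quoted above (R2 and residual standard error) are computed from paired imaging and pathology necrosis percentages; the values are illustrative, not the study data.

```python
# Minimal sketch: R^2 and residual standard error of an imaging necrosis
# estimate against pathology (illustrative paired percentages).
import numpy as np

pathology = np.array([10, 30, 55, 70, 85, 100], dtype=float)   # % necrosis
imaging = np.array([14, 27, 58, 66, 88, 97], dtype=float)      # 3D quantitative estimate

slope, intercept = np.polyfit(pathology, imaging, 1)
pred = slope * pathology + intercept
ss_res = np.sum((imaging - pred) ** 2)
ss_tot = np.sum((imaging - imaging.mean()) ** 2)

r2 = 1 - ss_res / ss_tot
rse = np.sqrt(ss_res / (len(imaging) - 2))    # residual standard error
print(f"R^2={r2:.3f}  RSE={rse:.2f}")
```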
Davari, Seyyed Ali; Hu, Sheng; Mukherjee, Dibyendu
2017-03-01
Intermetallic nanoalloys (NAs) and nanocomposites (NCs) have increasingly gained prominence as efficient catalytic materials in electrochemical energy conversion and storage systems, but their morphology and chemical composition play a critical role in tuning their catalytic activities and precious metal contents. While advanced microscopy techniques facilitate morphological characterizations, traditional chemical characterizations are either qualitative or extremely involved. In this study, we apply Laser Induced Breakdown Spectroscopy (LIBS) for quantitative compositional analysis of NAs and NCs synthesized with varied elemental ratios by our in-house built pulsed laser ablation technique. Specifically, elemental ratios of binary PtNi, PdCo (NAs) and PtCo (NCs) of different compositions are determined from LIBS measurements employing an internal calibration scheme using the bulk matrix species as internal standards. Morphology and qualitative elemental compositions of the aforesaid NAs and NCs are confirmed from Transmission Electron Microscopy (TEM) images and Energy Dispersive X-ray Spectroscopy (EDX) measurements. LIBS experiments are carried out in ambient conditions with the NA and NC samples drop cast on silicon wafers after centrifugation to increase their concentrations. The technique does not call for cumbersome sample preparations including acid digestions and external calibration standards commonly required in Inductively Coupled Plasma-Optical Emission Spectroscopy (ICP-OES) techniques. Yet the quantitative LIBS results are in good agreement with the results from ICP-OES measurements. Our results indicate the feasibility of using LIBS in future for rapid and in-situ quantitative chemical characterizations of wide classes of synthesized NAs and NCs. Copyright © 2016 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
George A. Beitel
2004-02-01
In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.
EDITORIAL: SPECTROSCOPIC IMAGING
A foremost goal in biology is understanding the molecular basis of single cell behavior, as well as cell interactions that result in functioning tissues. Accomplishing this goal requires quantitative analysis of multiple, specific macromolecules (e.g. proteins, ligands and enzyme...
USDA-ARS?s Scientific Manuscript database
The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...
Accuracy and Precision of Silicon Based Impression Media for Quantitative Areal Texture Analysis
Goodall, Robert H.; Darras, Laurent P.; Purnell, Mark A.
2015-01-01
Areal surface texture analysis is becoming widespread across a diverse range of applications, from engineering to ecology. In many studies silicon based impression media are used to replicate surfaces, and the fidelity of replication defines the quality of data collected. However, while different investigators have used different impression media, the fidelity of surface replication has not been subjected to quantitative analysis based on areal texture data. Here we present the results of an analysis of the accuracy and precision with which different silicon based impression media of varying composition and viscosity replicate rough and smooth surfaces. Both accuracy and precision vary greatly between different media. High viscosity media tested show very low accuracy and precision, and most other compounds showed either the same pattern, or low accuracy and high precision, or low precision and high accuracy. Of the media tested, mid viscosity President Jet Regular Body and low viscosity President Jet Light Body (Coltène Whaledent) are the only compounds to show high levels of accuracy and precision on both surface types. Our results show that data acquired from different impression media are not comparable, supporting calls for greater standardisation of methods in areal texture analysis. PMID:25991505
ERIC Educational Resources Information Center
Funahashi, Atsushi; Gruebler, Anna; Aoki, Takeshi; Kadone, Hideki; Suzuki, Kenji
2014-01-01
We quantitatively measured the smiles of a child with autism spectrum disorder (ASD-C) using a wearable interface device during animal-assisted activities (AAA) for 7 months, and compared the results with a control of the same age. The participant was a 10-year-old boy with ASD, and a normal healthy boy of the same age was the control. They…
Quantitative Analysis of Cell Nucleus Organisation
Shiels, Carol; Adams, Niall M; Islam, Suhail A; Stephens, David A; Freemont, Paul S
2007-01-01
There are almost 1,300 entries for higher eukaryotes in the Nuclear Protein Database. The proteins' subcellular distribution patterns within interphase nuclei can be complex, ranging from diffuse to punctate or microspeckled, yet they all work together in a coordinated and controlled manner within the three-dimensional confines of the nuclear volume. In this review we describe recent advances in the use of quantitative methods to understand nuclear spatial organisation and discuss some of the practical applications resulting from this work. PMID:17676980
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carla J. Miller
This report provides a summary of a literature review based on previous work performed at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work supporting the feasibility of the analytical techniques that were developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique to perform nuclear fuel accountancy measurements.
[Free crystalline silica: a comparison of methods for its determination in total dust].
Maciejewska, Aleksandra; Szadkowska-Stańczyk, Irena; Kondratowicz, Grzegorz
2005-01-01
The major objective of the study was to compare and investigate the usefulness of quantitative analyses of free crystalline silica (FCS) in the assessment of dust exposure in samples of total dust of varied composition, using three methods: a chemical method in common use in Poland, infrared spectrometry (FT-IR), and X-ray powder diffraction (XRD). Mineral composition and FCS content were investigated in 9 laboratory samples of raw materials, materials, and industrial wastes containing from about 2 to over 80% crystalline silica and reduced to particles of a size corresponding to that of total dust. Sample components were identified using the XRD and FT-IR methods. Ten independent determinations of FCS with each of the three methods were performed on the dust samples. Linear correlation analysis was applied to investigate the interrelationship between mean FCS determinations. In the analyzed dust samples, numerous minerals that interfere with silica during quantitative analysis were present alongside the silica. Comparison of the mean FCS determinations showed that the results obtained using the FT-IR method were 12-13% lower than those obtained with the two other methods. However, the differences observed were within the limits of variability associated with the precision of the results and their dependence on the reference materials used. Assessment of occupational exposure to dusts containing crystalline silica can be performed on the basis of quantitative analysis of FCS in total dust using any of the compared methods. The FT-IR method is most appropriate for FCS determination in samples with a small amount of silica or collected at low dust concentrations; the XRD method for the analysis of multicomponent samples; and the chemical method for medium and high FCS contents in samples or high dust concentrations in the work environment.
Pi, Shan; Cao, Rong; Qiang, Jin Wei; Guo, Yan Hui
2018-01-01
Background Diffusion-weighted imaging (DWI) and quantitative apparent diffusion coefficient (ADC) values are widely used in the differential diagnosis of ovarian tumors. Purpose To assess the diagnostic performance of quantitative ADC values in ovarian tumors. Material and Methods PubMed, Embase, the Cochrane Library, and local databases were searched for studies assessing ovarian tumors using quantitative ADC values. We quantitatively analyzed the diagnostic performances for two clinical problems: benign vs. malignant tumors and borderline vs. malignant tumors. We evaluated diagnostic performances by the pooled sensitivity and specificity values and by summary receiver operating characteristic (SROC) curves. Subgroup analyses were used to analyze study heterogeneity. Results From the 742 studies identified in the search results, 16 studies met our inclusion criteria. A total of ten studies evaluated malignant vs. benign ovarian tumors and six studies assessed malignant vs. borderline ovarian tumors. Regarding the diagnostic accuracy of quantitative ADC values for distinguishing between malignant and benign ovarian tumors, the pooled sensitivity and specificity values were 0.91 and 0.91, respectively. The area under the SROC curve (AUC) was 0.96. For differentiating borderline from malignant tumors, the pooled sensitivity and specificity values were 0.89 and 0.79, and the AUC was 0.91. The methodological quality of the included studies was moderate. Conclusion Quantitative ADC values could serve as useful preoperative markers for predicting the nature of ovarian tumors. Nevertheless, prospective trials focused on standardized imaging parameters are needed to evaluate the clinical value of quantitative ADC values in ovarian tumors.
Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall
2016-07-01
Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, there are few published comparative data. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and CV for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation was seen among all assays on clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R(2) ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
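The pairwise regression and R(2) comparison described above amounts to a simple linear fit of paired log10 viral loads. The sketch below uses hypothetical paired results from two assays; the values are illustrative only.

    import numpy as np
    from scipy.stats import linregress

    # Hypothetical paired quantitative results (log10 copies/mL) from two assays
    assay_a = np.array([2.8, 3.4, 4.1, 4.9, 5.6, 6.3, 7.0])
    assay_b = np.array([2.6, 3.5, 4.0, 5.1, 5.5, 6.5, 6.9])

    fit = linregress(assay_a, assay_b)
    print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.2f}, R^2 = {fit.rvalue**2:.2f}")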
Modelling default and likelihood reasoning as probabilistic reasoning
NASA Technical Reports Server (NTRS)
Buntine, Wray
1990-01-01
A probabilistic analysis of plausible reasoning about defaults and about likelihood is presented. 'Likely' and 'by default' are in fact treated as duals in the same sense as 'possibility' and 'necessity'. To model these four forms probabilistically, a logic QDP and its quantitative counterpart DP are derived that allow qualitative and corresponding quantitative reasoning. Consistency and consequence results for subsets of the logics are given that require at most a quadratic number of satisfiability tests in the underlying propositional logic. The quantitative logic shows how to track the propagation error inherent in these reasoning forms. The methodology and sound framework of the system highlight their approximate nature, the dualities, and the need for complementary reasoning about relevance.
African Primary Care Research: Quantitative analysis and presentation of results
Ogunbanjo, Gboyega A.
2014-01-01
Abstract This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report. PMID:26245435
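As a small illustration of the descriptive statistics discussed in the article, the sketch below summarises one continuous and one categorical variable with pandas; the dataset and column names are hypothetical.

    import pandas as pd

    # Hypothetical dataset with one continuous and one categorical variable
    df = pd.DataFrame({
        "age":   [34, 41, 29, 55, 47, 38, 62, 33],
        "group": ["control", "case", "control", "case", "case", "control", "case", "control"],
    })

    print(df["age"].describe())                              # descriptive statistics for continuous data
    print(df["group"].value_counts())                        # frequency table for categorical data
    print(df.groupby("group")["age"].agg(["mean", "std"]))   # summary by group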
On normality, ethnicity, and missing values in quantitative trait locus mapping
Labbe, Aurélie; Wormald, Hanna
2005-01-01
Background This paper deals with the detection of significant linkage for quantitative traits using a variance components approach. Microsatellite markers were obtained for the Genetic Analysis Workshop 14 Collaborative Study on the Genetics of Alcoholism data. Ethnic heterogeneity, highly skewed quantitative measures, and a high rate of missing values are all present in this dataset and well known to impact upon linkage analysis. This makes it a good candidate for investigation. Results As expected, we observed a number of changes in LOD scores, especially for chromosomes 1, 7, and 18, as each of the three factors studied was addressed. A dramatic example of such changes can be found in chromosome 7. Highly significant linkage to one of the quantitative traits became non-significant when a proper normalizing transformation of the trait was used and when analysis was carried out on an ethnically homogeneous subset of the original pedigrees. Conclusion In agreement with existing literature, transforming a trait to ensure normality using a Box-Cox transformation is highly recommended in order to avoid false-positive linkages. Furthermore, pedigrees should be sorted by ethnic groups and analyses should be carried out separately. Finally, one should be aware that the inclusion of covariates with a high rate of missing values considerably reduces the number of subjects included in the model. In such a case, the loss in power may be large. Imputation methods are then recommended. PMID:16451664
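The normalizing transformation recommended above is routinely obtained with a maximum-likelihood Box-Cox fit. A minimal sketch, assuming a hypothetical right-skewed trait vector:

    import numpy as np
    from scipy.stats import boxcox

    # Hypothetical right-skewed quantitative trait values (must be strictly positive)
    trait = np.random.default_rng(1).lognormal(mean=0.0, sigma=0.8, size=200)

    transformed, lam = boxcox(trait)   # maximum-likelihood estimate of lambda
    print(f"estimated Box-Cox lambda = {lam:.2f}")
    # `transformed` would then replace the raw trait in the variance-components model.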
Sample normalization methods in quantitative metabolomics.
Wu, Yiman; Li, Liang
2016-01-22
To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
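One of the simplest normalization schemes discussed in this context is total-sum normalization, where each sample is rescaled by its summed signal. The sketch below is illustrative only, with a hypothetical intensity matrix; it is not a recommendation of one method over another.

    import numpy as np

    # Hypothetical peak-intensity matrix: rows = samples, columns = metabolites
    intensities = np.array([
        [1200.0,  340.0,  90.0, 2100.0],
        [ 800.0,  250.0,  60.0, 1500.0],
        [1600.0,  420.0, 120.0, 2700.0],
    ])

    # Total-sum normalization: scale each sample so its summed intensity equals the mean total
    totals = intensities.sum(axis=1, keepdims=True)
    normalized = intensities / totals * totals.mean()
    print(normalized.round(1))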
Zhao, Shilin; Li, Rongxia; Cai, Xiaofan; Chen, Wanjia; Li, Qingrun; Xing, Tao; Zhu, Wenjie; Chen, Y Eugene; Zeng, Rong; Deng, Yueyi
2013-01-01
The body fluid proteome is the most informative proteome from a medical viewpoint, but the lack of an accurate quantitation method for complex body fluids has limited its application in disease research and biomarker discovery. To address this problem, we introduced a novel strategy in which SILAC-labeled mouse serum was used as an internal standard for human serum and urine proteome analysis. The SILAC-labeled mouse serum was mixed with human serum and urine, and multidimensional separation coupled with tandem mass spectrometry (IEF-LC-MS/MS) analysis was performed. Peptides shared between the two species were quantified by their SILAC pairs, and human-only peptides were quantified by coeluting mouse peptides. Comparison of the results from two replicate experiments indicated the high repeatability of our strategy. Urine from treated and untreated immunoglobulin A nephropathy (IgAN) patients was then compared using this quantitation strategy. Fifty-three peptides were found to be significantly changed between the two groups, including both known diagnostic markers for IgAN and novel candidates such as complement C3, albumin, VDBP, ApoA1, and IGFBP7. In conclusion, we have developed a practical and accurate quantitation strategy for comparison of complex human body fluid proteomes. The results of this strategy could provide potential disease-related biomarkers for the evaluation of treatment.
Do adolescents support early marriage in Bangladesh? Evidence from study.
Rahman, M M; Kabir, M
2005-01-01
Adolescence is a critical period for female adolescents, as they have to make decisions regarding marriage, education and work that will influence and determine their future course of life. Although early marriage has negative consequences, a proportion of female adolescents still favour it because of prevailing cultural norms. This paper attempts to investigate the factors influencing attitudes towards early marriage among married and unmarried female adolescents. This is a quantitative and qualitative study. A multistage cluster sampling technique was used to select the sample. For the quantitative results, data on 3362 female adolescents from rural and urban areas, irrespective of their marital status, were analyzed. To supplement the quantitative findings, a series of focus group discussions were conducted among the adolescents. Analysis revealed that one fourth (25.9%) of the adolescents were in favour of early marriage. A number of societal factors pushed them towards early marriage, despite their awareness of its consequences for maternal and child health. Multivariate logistic regression analysis showed that current marital status, years of schooling, work status and parental marital decision are important predictors of attitudes towards early marriage (p < 0.05). The study concluded that female education is an important determinant of adolescent marriage; therefore, opportunities for education beyond the secondary level would help to bring about change in attitudes towards early marriage.
Dinç, Erdal; Büker, Eda
2012-01-01
A new application of continuous wavelet transform (CWT) to overlapping peaks in a chromatogram was developed for the quantitative analysis of amiloride hydrochloride (AML) and hydrochlorothiazide (HCT) in tablets. Chromatographic analysis was done by using an ACQUITY ultra-performance LC (UPLC) BEH C18 column (50 x 2.1 mm id, 1.7 μm particle size) and a mobile phase consisting of methanol-0.1 M acetic acid (21 + 79, v/v) at a constant flow rate of 0.3 mL/min with diode array detection at 274 nm. The overlapping chromatographic peaks of the calibration set consisting of AML and HCT mixtures were recorded rapidly by using an ACQUITY UPLC H-Class system. The overlapping UPLC data vectors of AML and HCT drugs and their samples were processed by CWT signal processing methods. The calibration graphs for AML and HCT were computed from the relationship between concentration and areas of chromatographic CWT peaks. The applicability and validity of the improved UPLC-CWT approaches were confirmed by recovery studies and the standard addition technique. The proposed UPLC-CWT methods were applied to the determination of AML and HCT in tablets. The experimental results indicated that the suggested UPLC-CWT signal processing provides accurate and precise results for industrial QC and quantitative evaluation of AML-HCT tablets.
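A CWT of an overlapped chromatogram can be sketched with the PyWavelets library. The signal, wavelet choice and scale below are assumptions for illustration; the published method defines its own wavelet families and calibration procedure.

    import numpy as np
    import pywt

    # Hypothetical chromatogram: two partially overlapping Gaussian peaks
    t = np.linspace(0, 5, 1000)                      # retention time (min)
    signal = (np.exp(-((t - 2.0) / 0.12) ** 2) +
              0.6 * np.exp(-((t - 2.3) / 0.12) ** 2))

    scales = np.arange(1, 64)
    coeffs, _ = pywt.cwt(signal, scales, "mexh")     # Mexican-hat (Ricker) wavelet
    # A well-chosen scale sharpens the overlapped region; the resulting CWT peak
    # areas or heights can then be related to concentration in a calibration graph.
    row = coeffs[20]
    print(row.argmax(), row.max())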
[Quantitative data analysis for live imaging of bone].
Seno, Shigeto
Because bone is a hard tissue, it has long been difficult to observe its interior in living animals. With recent progress in microscopy and fluorescent probe technology, it has become possible to observe the various activities of the cells that make up bone tissue. On the other hand, the growing volume of data and the diversity and complexity of the images make it difficult to perform quantitative analysis by visual inspection alone, and methodologies for processing microscopic images and analyzing the resulting data are needed. In this article, we introduce the research field of bioimage informatics, which lies at the boundary between biology and information science, and then outline the basic image processing technology for quantitative analysis of live imaging data of bone.
Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali
2012-03-01
Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10(2) to 10(8) with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.
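Quantitation over a log range of this kind is typically done by fitting a standard curve of Cq against log10 copy number and inverting it for unknowns. The sketch below uses hypothetical Cq values; the slope-to-efficiency conversion is the standard E = 10^(-1/slope) - 1 relation, not a figure from this study.

    import numpy as np
    from scipy.stats import linregress

    # Hypothetical standard curve: Cq values measured for 10^2 .. 10^8 RNA copies
    log10_copies = np.arange(2, 9, dtype=float)
    cq = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3, 13.0])

    fit = linregress(log10_copies, cq)
    efficiency = 10 ** (-1.0 / fit.slope) - 1.0      # amplification efficiency from the slope
    print(f"slope = {fit.slope:.2f}, efficiency = {efficiency:.1%}, R^2 = {fit.rvalue**2:.3f}")

    # Quantify an unknown sample from its Cq by inverting the calibration line
    cq_unknown = 21.5
    print(f"estimated load = 10^{(cq_unknown - fit.intercept) / fit.slope:.2f} copies")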
Development of a Biological Science Quantitative Reasoning Exam (BioSQuaRE)
Stanhope, Liz; Ziegler, Laura; Haque, Tabassum; Le, Laura; Vinces, Marcelo; Davis, Gregory K.; Zieffler, Andrew; Brodfuehrer, Peter; Preest, Marion; M. Belitsky, Jason; Umbanhowar, Charles; Overvoorde, Paul J.
2017-01-01
Multiple reports highlight the increasingly quantitative nature of biological research and the need to innovate means to ensure that students acquire quantitative skills. We present a tool to support such innovation. The Biological Science Quantitative Reasoning Exam (BioSQuaRE) is an assessment instrument designed to measure the quantitative skills of undergraduate students within a biological context. The instrument was developed by an interdisciplinary team of educators and aligns with skills included in national reports such as BIO2010, Scientific Foundations for Future Physicians, and Vision and Change. Undergraduate biology educators also confirmed the importance of items included in the instrument. The current version of the BioSQuaRE was developed through an iterative process using data from students at 12 postsecondary institutions. A psychometric analysis of these data provides multiple lines of evidence for the validity of inferences made using the instrument. Our results suggest that the BioSQuaRE will prove useful to faculty and departments interested in helping students acquire the quantitative competencies they need to successfully pursue biology, and useful to biology students by communicating the importance of quantitative skills. We invite educators to use the BioSQuaRE at their own institutions. PMID:29196427
Optical holographic structural analysis of Kevlar rocket motor cases
NASA Astrophysics Data System (ADS)
Harris, W. J.
1981-05-01
The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.
Yan, Xu; Bishop, David J.
2018-01-01
Gene expression analysis by quantitative PCR in skeletal muscle is routine in exercise studies. The reproducibility and reliability of the data fundamentally depend on how the experiments are performed and interpreted. Despite the popularity of the assay, there is considerable variation in experimental protocols and data analyses between laboratories, and there is a lack of consistent quality control steps throughout the assay. In this study, we present a number of experiments on various steps of the quantitative PCR workflow, and demonstrate how to perform a quantitative PCR experiment with human skeletal muscle samples in an exercise study. We also tested some common mistakes in performing qPCR. Interestingly, we found that mishandling of muscle for a short time span (10 min) before RNA extraction did not affect RNA quality, and isolated total RNA was preserved for up to one week at room temperature. As demonstrated by our data, the use of unstable reference genes leads to substantial differences in the final results. Alternatively, cDNA content can be used for data normalisation; however, complete removal of RNA from cDNA samples is essential for obtaining accurate cDNA content. PMID:29746477
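Reference-gene normalisation of the kind discussed above is usually expressed through the 2^-ddCt calculation. A minimal sketch with hypothetical Ct values, assuming a single stable reference gene:

    # Relative expression by the 2^-ddCt method, assuming a stable reference gene.
    # All Ct values below are hypothetical.
    def relative_expression(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
        d_ct_sample = ct_target - ct_ref             # normalise target to reference gene
        d_ct_control = ct_target_ctrl - ct_ref_ctrl
        dd_ct = d_ct_sample - d_ct_control
        return 2.0 ** (-dd_ct)

    # Post-exercise biopsy vs. resting control
    print(relative_expression(ct_target=24.1, ct_ref=18.0,
                              ct_target_ctrl=25.6, ct_ref_ctrl=18.1))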
Dual reporter transgene driven by 2.3Col1a1 promoter is active in differentiated osteoblasts
NASA Technical Reports Server (NTRS)
Marijanovic, Inga; Jiang, Xi; Kronenberg, Mark S.; Stover, Mary Louise; Erceg, Ivana; Lichtler, Alexander C.; Rowe, David W.
2003-01-01
AIM: As quantitative and spatial analyses of promoter reporter constructs are not easily performed in intact bone, we designed a reporter gene specific to bone, which could be analyzed both visually and quantitatively by using chloramphenicol acetyltransferase (CAT) and a cyan version of green fluorescent protein (GFPcyan), driven by a 2.3-kb fragment of the rat collagen promoter (Col2.3). METHODS: The construct Col2.3CATiresGFPcyan was used for generating transgenic mice. Quantitative measurement of promoter activity was performed by CAT analysis of different tissues derived from transgenic animals; localization was performed by visualizing GFP in frozen bone sections. To assess transgene expression during in vitro differentiation, marrow stromal cell and neonatal calvarial osteoblast cultures were analyzed for CAT and GFP activity. RESULTS: In mice, CAT activity was detected in the calvaria, long bone, teeth, and tendon, whereas histology showed that GFP expression was limited to osteoblasts and osteocytes. In cell culture, increased activity of CAT correlated with increased differentiation, and GFP activity was restricted to mineralized nodules. CONCLUSION: The concept of a dual reporter allows a simultaneous visual and quantitative analysis of transgene activity in bone.
Quantification of EEG reactivity in comatose patients
Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas
2016-01-01
Objective EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed best accuracy (Median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet’s AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts’ agreement regarding reactivity for each individual case. Conclusion Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
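The spectral temporal symmetry idea above reduces to comparing band powers before and after stimulation. The sketch below is a simplified, hypothetical single-channel version (sampling rate, segment length and synthetic data are assumptions); the published approach builds probability values from combinations of several quantitative parameters rather than raw band-power ratios.

    import numpy as np
    from scipy.signal import welch

    FS = 250                      # sampling rate (Hz); value is illustrative
    BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(segment):
        """Approximate power in each band for a 1-minute single-channel segment."""
        freqs, psd = welch(segment, fs=FS, nperseg=FS * 4)
        return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
                for name, (lo, hi) in BANDS.items()}

    rng = np.random.default_rng(2)
    pre, post = rng.normal(size=60 * FS), rng.normal(size=60 * FS)   # hypothetical EEG
    pre_p, post_p = band_powers(pre), band_powers(post)
    ratios = {b: post_p[b] / pre_p[b] for b in BANDS}
    print(ratios)   # ratios far from 1 indicate a spectral change, i.e. reactivity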
Claycamp, H Gregg; Kona, Ravikanth; Fahmy, Raafat; Hoag, Stephen W
2016-04-01
Qualitative risk assessment methods are often used as the first step to determining design space boundaries; however, quantitative assessments of risk with respect to the design space, i.e., calculating the probability of failure for a given severity, are needed to fully characterize design space boundaries. Quantitative risk assessment methods in design and operational spaces are a significant aid to evaluating proposed design space boundaries. The goal of this paper is to demonstrate a relatively simple strategy for design space definition using a simplified Bayesian Monte Carlo simulation. This paper builds on a previous paper that used failure mode and effects analysis (FMEA) qualitative risk assessment and Plackett-Burman design of experiments to identity the critical quality attributes. The results show that the sequential use of qualitative and quantitative risk assessments can focus the design of experiments on a reduced set of critical material and process parameters that determine a robust design space under conditions of limited laboratory experimentation. This approach provides a strategy by which the degree of risk associated with each known parameter can be calculated and allocates resources in a manner that manages risk to an acceptable level.
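A Monte Carlo estimate of the probability of failure at a candidate operating point can be sketched as below. The parameter names, distributions and response model are entirely hypothetical placeholders for the paper's Bayesian Monte Carlo treatment; they only illustrate the mechanics of simulating a quality attribute and counting out-of-specification outcomes.

    import numpy as np

    rng = np.random.default_rng(3)
    N = 100_000

    # Hypothetical critical process parameters with assumed operating distributions
    blend_time = rng.normal(10.0, 1.0, N)          # minutes
    compression_force = rng.normal(15.0, 1.5, N)   # kN

    # Hypothetical response model linking parameters to a critical quality attribute
    dissolution = 70 + 1.2 * blend_time + 0.8 * compression_force + rng.normal(0, 2.0, N)

    # Probability of failing the specification (e.g. dissolution < 85%) at this operating point
    p_fail = np.mean(dissolution < 85.0)
    print(f"estimated probability of failure = {p_fail:.3%}")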
Kellie, John F; Kehler, Jonathan R; Karlinsey, Molly Z; Summerfield, Scott G
2017-12-01
Typically, quantitation of biotherapeutics from biological matrices by LC-MS is based on a surrogate peptide approach to determine molecule concentration. Recent efforts have focused on quantitation of the intact protein molecules or larger mass subunits of monoclonal antibodies. To date, there has been limited guidance for large or intact protein mass quantitation in quantitative bioanalysis. Intact- and subunit-level analyses of biotherapeutics from biological matrices are performed in the 12-25 kDa mass range, and quantitation data are presented. Linearity, bias and other metrics are presented, along with recommendations on the viability of existing quantitation approaches. This communication is intended to start a discussion around intact protein data analysis and processing, recognizing that other published contributions will be required.
Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.
Neilson, Julia W; Jordan, Fiona L; Maier, Raina M
2013-03-01
PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.
Analysis of a document/reporting system
NASA Technical Reports Server (NTRS)
Narrow, B.
1971-01-01
An in-depth analysis of the information system within the Data Processing Branch is presented. Quantitative measures are used to evaluate the efficiency and effectiveness of the information system. It is believed that this is the first documented study which utilizes quantitative measures for full-scale system analysis. The quantitative measures and techniques for collecting and qualifying the basic data, as described, are applicable to any information system. Therefore this report is considered to be of interest to any persons concerned with the management, design, analysis, or evaluation of information systems.
Wu, Xinzhou; Li, Weifeng; Guo, Pengran; Zhang, Zhixiang; Xu, Hanhong
2018-04-18
Matrix-assisted laser desorption/ionization Fourier transform ion cyclotron resonance mass spectrometry (MALDI-FTICR-MS) has been applied for rapid, sensitive, undisputed, and quantitative detection of pesticide residues on fresh leaves with little sample pretreatment. Various pesticides (insecticides, bactericides, herbicides, and acaricides) are detected directly in the complex matrix with excellent limits of detection down to 4 μg/L. FTICR-MS could unambiguously identify pesticides with tiny mass differences (∼0.01775 Da), thereby avoiding false-positive results. Remarkably, pesticide isomers can be totally discriminated by use of diagnostic fragments, and quantitative analysis of pesticide isomers is demonstrated. The present results expand the horizons of the MALDI-FTICR-MS platform in the reliable determination of pesticides, with integrated advantages of ultrahigh mass resolution and accuracy. This method provides growing evidence for the resultant detrimental effects of pesticides, expediting the identification and evaluation of innovative pesticides.
Pansharpening on the Narrow Vnir and SWIR Spectral Bands of SENTINEL-2
NASA Astrophysics Data System (ADS)
Vaiopoulos, A. D.; Karantzalos, K.
2016-06-01
In this paper, results from the evaluation of several state-of-the-art pansharpening techniques are presented for the VNIR and SWIR bands of Sentinel-2. A pansharpening procedure is also proposed that aims to respect the closest spectral similarities between the higher and lower resolution bands. The evaluation included 21 different fusion algorithms and three evaluation frameworks based both on standard quantitative image similarity indexes and qualitative evaluation from remote sensing experts. The overall analysis of the evaluation results indicated that the remote sensing experts disagreed with the outcomes and method ranking of the quantitative assessment: the image quality similarity indexes and the quantitative evaluation frameworks from the literature, based on both full- and reduced-resolution data, failed to adequately capture the spatial information that was injected into the lower resolution images. Regarding the SWIR bands, none of the methods delivered significantly better results than a standard bicubic interpolation of the original low resolution bands.
ERIC Educational Resources Information Center
Muslihah, Oleh Eneng
2015-01-01
The research examines the correlation between the understanding of school-based management, emotional intelligence and headmaster performance. Data were collected using quantitative methods. The statistical analyses used were Pearson correlation and multivariate regression analysis. The results of this research suggest firstly that there is…
The U.S. EPA has initiated a new recreational water study to evaluate the correlation between illness rates in swimmers and Enterococcus concentrations determined by the mEI agar membrane filter (MF) method and several new technologies including QPCR analysis. Results of this stu...
Turner, Andrew D; Waack, Julia; Lewis, Adam; Edwards, Christine; Lawton, Linda
2018-02-01
A simple, rapid UHPLC-MS/MS method has been developed and optimised for the quantitation of microcystins and nodularin in a wide variety of sample matrices. Microcystin analogues targeted were MC-LR, MC-RR, MC-LA, MC-LY, MC-LF, MC-LW, MC-YR, MC-WR, [Asp3] MC-LR, [Dha7] MC-LR, MC-HilR and MC-HtyR. Optimisation studies were conducted to develop a simple, quick and efficient extraction protocol without the need for complex pre-analysis concentration procedures, together with a rapid sub-5 min chromatographic separation of toxins in shellfish and algal supplement tablet powders, as well as water and cyanobacterial bloom samples. Validation studies were undertaken on each matrix-analyte combination covering the full set of method performance characteristics, following international guidelines. The method was found to be specific and linear over the full calibration range. Method sensitivity in terms of limits of detection, quantitation and reporting was found to be significantly improved in comparison to LC-UV methods and applicable to the analysis of each of the four matrices. Overall, acceptable recoveries were determined for each of the matrices studied, with associated precision and within-laboratory reproducibility well within expected guidance limits. Results from the formalised ruggedness analysis of all available cyanotoxins showed that the method was robust for all parameters investigated. The results presented here show that the optimised LC-MS/MS method for cyanotoxins is fit for the purpose of detection and quantitation of a range of microcystins and nodularin in shellfish, algal supplement tablet powder, water and cyanobacteria. The method provides a valuable early warning tool for the rapid, routine extraction and analysis of natural waters, cyanobacterial blooms, algal powders, food supplements and shellfish tissues, enabling monitoring labs to supplement traditional microscopy techniques and report toxicity results within a short timeframe of sample receipt. The new method, now accredited to the ISO17025 standard, is simple, quick, applicable to multiple matrices and is highly suitable for use as a routine, high-throughput, fast-turnaround regulatory monitoring tool. Copyright © 2017 Elsevier B.V. All rights reserved.
Sung, Yun Ju; Di, Yanming; Fu, Audrey Q; Rothstein, Joseph H; Sieh, Weiva; Tong, Liping; Thompson, Elizabeth A; Wijsman, Ellen M
2007-01-01
We performed multipoint linkage analyses with multiple programs and models for several gene expression traits in the Centre d'Etude du Polymorphisme Humain families. All analyses provided consistent results for both peak location and shape. Variance-components (VC) analysis gave wider peaks and Bayes factors gave fewer peaks. Among programs from the MORGAN package, lm_multiple performed better than lm_markers, resulting in less Markov-chain Monte Carlo (MCMC) variability between runs, and the program lm_twoqtl provided higher LOD scores by also including either a polygenic component or an additional quantitative trait locus. PMID:18466597
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amirifar, Nooshin; Lardé, Rodrigue, E-mail: rodrigue.larde@univ-rouen.fr; Talbot, Etienne
2015-12-07
In the last decade, atom probe tomography has become a powerful tool to investigate semiconductor and insulator nanomaterials in microelectronics, spintronics, and optoelectronics. In this paper, we report an investigation of zinc oxide nanostructures using atom probe tomography. We observed that the chemical composition of zinc oxide is strongly dependent on the analysis parameters used for atom probe experiments. It was observed that at high laser pulse energies, the electric field at the specimen surface is strongly dependent on the crystallographic directions. This dependence leads to an inhomogeneous field evaporation of the surface atoms, resulting in unreliable measurements. We show that the laser pulse energy has to be well tuned to obtain reliable quantitative chemical composition measurements of undoped and doped ZnO nanomaterials.
Multifractal detrended cross-correlation analysis in the MENA area
NASA Astrophysics Data System (ADS)
El Alaoui, Marwane; Benbachir, Saâd
2013-12-01
In this paper, we investigated multifractal cross-correlations qualitatively and quantitatively using a cross-correlation test and the multifractal detrended cross-correlation analysis (MF-DCCA) method for markets in the MENA area. We used cross-correlation coefficients to measure the level of this correlation. The analysis concerns four stock market indices of Morocco, Tunisia, Egypt and Jordan. The countries chosen are signatories to the Agadir Agreement concerning the establishment of a free trade area comprising Arab Mediterranean countries. We computed the bivariate generalized Hurst exponent, the Rényi exponent and the singularity spectrum for each pair of indices to quantify the cross-correlations. By analyzing the results, we found multifractal cross-correlations between all of these markets. We compared the singularity spectrum widths of these indices and identified which pair of indices has the strongest multifractal cross-correlation.
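As a simplified illustration of detrended cross-correlation, the sketch below computes the DCCA cross-correlation coefficient at a single scale, i.e. the q = 2, non-multifractal special case of the MF-DCCA family; the return series are synthetic placeholders, not the MENA index data.

    import numpy as np

    def dcca_coefficient(x, y, scale, order=1):
        """Detrended cross-correlation coefficient rho_DCCA at a single scale."""
        X = np.cumsum(x - np.mean(x))     # integrated (profile) series
        Y = np.cumsum(y - np.mean(y))
        n_boxes = len(X) // scale
        f2_xy, f2_xx, f2_yy = [], [], []
        t = np.arange(scale)
        for b in range(n_boxes):
            seg = slice(b * scale, (b + 1) * scale)
            # Local polynomial detrending in each box
            res_x = X[seg] - np.polyval(np.polyfit(t, X[seg], order), t)
            res_y = Y[seg] - np.polyval(np.polyfit(t, Y[seg], order), t)
            f2_xy.append(np.mean(res_x * res_y))
            f2_xx.append(np.mean(res_x ** 2))
            f2_yy.append(np.mean(res_y ** 2))
        return np.mean(f2_xy) / np.sqrt(np.mean(f2_xx) * np.mean(f2_yy))

    # Hypothetical daily log-returns of two market indices
    rng = np.random.default_rng(0)
    r1, r2 = rng.normal(size=2000), rng.normal(size=2000)
    print(dcca_coefficient(r1, r2, scale=50))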
Econophysical visualization of Adam Smith’s invisible hand
NASA Astrophysics Data System (ADS)
Cohen, Morrel H.; Eliazar, Iddo I.
2013-02-01
Consider a complex system whose macrostate is statistically observable, yet whose operating mechanism is an unknown black-box. In this paper we address the problem of inferring, from the system’s macrostate statistics, the system’s intrinsic force yielding the observed statistics. The inference is established via two diametrically opposite approaches which result in the very same intrinsic force: a top-down approach based on the notion of entropy, and a bottom-up approach based on the notion of Langevin dynamics. The general results established are applied to the problem of visualizing the intrinsic socioeconomic force, Adam Smith’s invisible hand, that shapes the distribution of wealth in human societies. Our analysis yields quantitative econophysical representations of figurative socioeconomic forces, quantitative definitions of “poor” and “rich”, and a quantitative characterization of the “poor-get-poorer” and the “rich-get-richer” phenomena.
Quantitative determination of Auramine O by terahertz spectroscopy with 2DCOS-PLSR model
NASA Astrophysics Data System (ADS)
Zhang, Huo; Li, Zhi; Chen, Tao; Qin, Binyi
2017-09-01
Residues of harmful dyes such as Auramine O (AO) in herb and food products threaten human health, so fast and sensitive techniques for detecting these residues are needed. As a powerful tool for substance detection, terahertz (THz) spectroscopy was combined in this paper with an improved partial least-squares regression (PLSR) model for the quantitative determination of AO. Absorbance of herbal samples with different concentrations was obtained by THz-TDS in the band between 0.2 THz and 1.6 THz. We applied two-dimensional correlation spectroscopy (2DCOS) to improve the PLSR model. This method highlighted the spectral differences between concentrations, provided a clear criterion for input interval selection, and improved the accuracy of the detection result. The experimental results indicated that the combination of THz spectroscopy and 2DCOS-PLSR is an excellent quantitative analysis method.
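A basic PLSR calibration of spectra against concentration can be sketched with scikit-learn. The spectra, concentration range and number of latent components below are assumptions for illustration; in the 2DCOS-PLSR approach, 2D correlation spectra are first used to select the most concentration-sensitive spectral intervals before the PLSR fit.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(4)
    concentrations = np.linspace(0.5, 10, 40)                 # hypothetical AO levels (mg/kg)
    # Hypothetical THz absorbance spectra (0.2-1.6 THz digitised to 200 points)
    spectra = (concentrations[:, None] * np.linspace(0.1, 1.0, 200)
               + rng.normal(0, 0.05, (40, 200)))

    pls = PLSRegression(n_components=4)
    predicted = cross_val_predict(pls, spectra, concentrations, cv=5).ravel()
    rmse = np.sqrt(np.mean((predicted - concentrations) ** 2))
    print(f"cross-validated RMSE = {rmse:.2f} mg/kg")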
Urban Multisensory Laboratory, an Approach to Model Urban Space Human Perception
NASA Astrophysics Data System (ADS)
González, T.; Sol, D.; Saenz, J.; Clavijo, D.; García, H.
2017-09-01
An urban sensory lab (USL, or LUS by its Spanish acronym) is a new and avant-garde approach to studying and analyzing a city. This approach allows the development of new methodologies to identify the emotional response of public space users. The laboratory combines qualitative analysis proposed by urbanists with quantitative measures managed by data analysis applications. The USL is a new approach that goes beyond the traditional borders of urban knowledge. The design thinking strategy allows us to implement methods to understand the results provided by our technique. In this first version, the interpretation is carried out by hand; our goal, however, is to combine design thinking and machine learning in order to analyze the qualitative and quantitative data automatically. The results are currently being used by students in the Urbanism and Architecture courses to gain a better understanding of public spaces in Puebla, Mexico, and their interaction with people.