49 CFR 40.97 - What do laboratories report and how do they report it?
Code of Federal Regulations, 2011 CFR
2011-10-01
... review by the certifying scientist is completed. (e)(1) You must provide quantitative values for..., without a request from the MRO. (f) You must provide quantitative values for confirmed opiate results for morphine or codeine at 15,000 ng/mL or above, even if the MRO has not requested quantitative values for the...
49 CFR 40.97 - What do laboratories report and how do they report it?
Code of Federal Regulations, 2012 CFR
2012-10-01
... review by the certifying scientist is completed. (e)(1) You must provide quantitative values for..., without a request from the MRO. (f) You must provide quantitative values for confirmed opiate results for morphine or codeine at 15,000 ng/mL or above, even if the MRO has not requested quantitative values for the...
49 CFR 40.97 - What do laboratories report and how do they report it?
Code of Federal Regulations, 2013 CFR
2013-10-01
... review by the certifying scientist is completed. (e)(1) You must provide quantitative values for..., without a request from the MRO. (f) You must provide quantitative values for confirmed opiate results for morphine or codeine at 15,000 ng/mL or above, even if the MRO has not requested quantitative values for the...
49 CFR 40.97 - What do laboratories report and how do they report it?
Code of Federal Regulations, 2010 CFR
2010-10-01
... review by the certifying scientist is completed. (1) You must provide quantitative values for confirmed..., without a request from the MRO. (f) You must provide quantitative values for confirmed opiate results for morphine or codeine at 15,000 ng/mL or above, even if the MRO has not requested quantitative values for the...
49 CFR 40.97 - What do laboratories report and how do they report it?
Code of Federal Regulations, 2014 CFR
2014-10-01
... review by the certifying scientist is completed. (e)(1) You must provide quantitative values for..., without a request from the MRO. (f) You must provide quantitative values for confirmed opiate results for morphine or codeine at 15,000 ng/mL or above, even if the MRO has not requested quantitative values for the...
10 CFR 26.169 - Reporting Results.
Code of Federal Regulations, 2012 CFR
2012-01-01
... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...
10 CFR 26.169 - Reporting Results.
Code of Federal Regulations, 2013 CFR
2013-01-01
... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...
10 CFR 26.169 - Reporting Results.
Code of Federal Regulations, 2010 CFR
2010-01-01
... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...
10 CFR 26.169 - Reporting Results.
Code of Federal Regulations, 2011 CFR
2011-01-01
... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...
10 CFR 26.169 - Reporting Results.
Code of Federal Regulations, 2014 CFR
2014-01-01
... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...
Values in Qualitative and Quantitative Research
ERIC Educational Resources Information Center
Duffy, Maureen; Chenail, Ronald J.
2008-01-01
The authors identify the philosophical underpinnings and value-ladenness of major research paradigms. They argue that useful and meaningful research findings for counseling can be generated from both qualitative and quantitative research methodologies, provided that the researcher has an appreciation of the importance of philosophical coherence in…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-15
...: Notification of Annual Quantitative Limit on Certain Apparel under HOPE. DATES: Effective Date: December 20...-added program is subject to a quantitative limitation. HOPE provides that the quantitative limitation... quantitative limitation for qualifying apparel imported from Haiti under the value-added program will be an...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-16
...: Notification of Annual Quantitative Limit on Certain Apparel under HOPE. DATES: Effective Date: December 20...-added program is subject to a quantitative limitation. HOPE provides that the quantitative limitation... quantitative limitation for qualifying apparel imported from Haiti under the value-added program will be an...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-19
...: Notification of Annual Quantitative Limit on Certain Apparel under HOPE. DATES: Effective Date: December 20...-added program is subject to a quantitative limitation. HOPE provides that the quantitative limitation... quantitative limitation for qualifying apparel imported from Haiti under the value-added program will be an...
Quantitative angle-insensitive flow measurement using relative standard deviation OCT
NASA Astrophysics Data System (ADS)
Zhu, Jiang; Zhang, Buyun; Qi, Li; Wang, Ling; Yang, Qiang; Zhu, Zhuqing; Huo, Tiancheng; Chen, Zhongping
2017-10-01
Incorporating different data processing methods, optical coherence tomography (OCT) has the ability for high-resolution angiography and quantitative flow velocity measurements. However, OCT angiography cannot provide quantitative information of flow velocities, and the velocity measurement based on Doppler OCT requires the determination of Doppler angles, which is a challenge in a complex vascular network. In this study, we report on a relative standard deviation OCT (RSD-OCT) method which provides both vascular network mapping and quantitative information for flow velocities within a wide range of Doppler angles. The RSD values are angle-insensitive within a wide range of angles, and a nearly linear relationship was found between the RSD values and the flow velocities. The RSD-OCT measurement in a rat cortex shows that it can quantify the blood flow velocities as well as map the vascular network in vivo.
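A minimal sketch of the relative-standard-deviation computation the abstract describes: here the RSD at each voxel is taken as the temporal standard deviation divided by the temporal mean of the OCT signal magnitude over repeated frames. The signal definition, frame counts, and test data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def rsd_map(frames: np.ndarray) -> np.ndarray:
    """Relative standard deviation per voxel from repeated OCT frames.

    frames: array of shape (n_repeats, depth, width) holding signal
    magnitude from repeated B-scans at the same location.
    """
    mean = frames.mean(axis=0)
    std = frames.std(axis=0, ddof=1)
    # Avoid division by zero in empty/background voxels.
    return np.where(mean > 0, std / mean, 0.0)

# Toy usage: mostly static tissue plus a small region with flow-like fluctuation.
rng = np.random.default_rng(0)
frames = 100 + rng.normal(0, 1, size=(20, 64, 64))
frames[:, 20:30, 20:30] += rng.normal(0, 25, size=(20, 10, 10))
rsd = rsd_map(frames)
print(round(float(rsd[25, 25]), 2), round(float(rsd[5, 5]), 2))  # higher RSD where the signal fluctuates
```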
The conventional tuning fork as a quantitative tool for vibration threshold.
Alanazy, Mohammed H; Alfurayh, Nuha A; Almweisheer, Shaza N; Aljafen, Bandar N; Muayqil, Taim
2018-01-01
This study was undertaken to describe a method for quantifying vibration when using a conventional tuning fork (CTF) in comparison to a Rydel-Seiffer tuning fork (RSTF) and to provide reference values. Vibration thresholds at index finger and big toe were obtained in 281 participants. Spearman's correlations were performed. Age, weight, and height were analyzed for their covariate effects on vibration threshold. Reference values at the fifth percentile were obtained by quantile regression. The correlation coefficients between CTF and RSTF values at finger/toe were 0.59/0.64 (P = 0.001 for both). Among covariates, only age had a significant effect on vibration threshold. Reference values for CTF at finger/toe for the age groups 20-39 and 40-60 years were 7.4/4.9 and 5.8/4.6 s, respectively. Reference values for RSTF at finger/toe for the age groups 20-39 and 40-60 years were 6.9/5.5 and 6.2/4.7, respectively. CTF provides quantitative values that are as good as those provided by RSTF. Age-stratified reference data are provided. Muscle Nerve 57: 49-53, 2018. © 2017 Wiley Periodicals, Inc.
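The reference values in this study come from quantile regression at the fifth percentile with age as a covariate. The sketch below uses a simpler stand-in, the empirical fifth percentile within each age stratum, on invented vibration-duration data; the group labels and numbers are illustrative only, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
# Illustrative vibration-duration data (seconds), not the study's data.
durations = {"20-39 y": rng.normal(10.0, 1.0, 150), "40-60 y": rng.normal(8.5, 1.0, 130)}

# Age-stratified reference value: the fifth percentile of the healthy distribution,
# below which a measured perception duration would be considered abnormal.
for group, values in durations.items():
    print(group, round(float(np.percentile(values, 5)), 1), "s")
```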
Espousing Democratic Leadership Practices: A Study of Values in Action
ERIC Educational Resources Information Center
Devereaux, Lorraine
2003-01-01
This article examines principals' espoused values and their values in action. It provides a reanalysis of previously collected data through a values lens. The original research study was an international quantitative and qualitative investigation of principals' leadership approaches that was based in 15 schools. This particular excerpt of the…
Null Hypothesis Significance Testing and "p" Values
ERIC Educational Resources Information Center
Travers, Jason C.; Cook, Bryan G.; Cook, Lysandra
2017-01-01
"p" values are commonly reported in quantitative research, but are often misunderstood and misinterpreted by research consumers. Our aim in this article is to provide special educators with guidance for appropriately interpreting "p" values, with the broader goal of improving research consumers' understanding and interpretation…
Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi
The aim of this study was to investigate whether the use of combination quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratios) with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargements, and tonsillar herniations) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curve was used to evaluate the diagnostic performance of quantitative metrics and qualitative signs, and for the diagnosis of IH, optimum cutoff values of quantitative metrics were found with ROC analysis. Combined ROC curve was measured for the quantitative metrics, and qualitative signs combinations in determining diagnostic accuracy and sensitivity, specificity, and positive and negative predictive values were found, and the best model combination was formed. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH when compared with the control group (P < 0.001), mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). For qualitative signs, the highest individual distinctive power was dural enhancement with area under the ROC curve (AUC) of 0.838. For quantitative metrics, the highest individual distinctive power was MPD with AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by combination of dural enhancement, venous engorgement, and MPD with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had diagnostic accuracy of 100 % for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be provided with combination of quantitative metrics with qualitative signs.
Value of Information References
Morency, Christina
2014-12-12
This file contains a list of relevant references on value of information (VOI) in RIS format. VOI provides a quantitative analysis to evaluate the outcome of the combined technologies (seismology, hydrology, geodesy) used to monitor Brady's Geothermal Field.
Phillips, Brett T; Fourman, Mitchell S; Rivara, Andrew; Dagum, Alexander B; Huston, Tara L; Ganz, Jason C; Bui, Duc T; Khan, Sami U
2014-01-01
Several devices exist today to assist the intraoperative determination of skin flap perfusion. Laser-Assisted Indocyanine Green Dye Angiography (LAICGA) has been shown to accurately predict mastectomy skin flap necrosis using quantitative perfusion values. The laser properties of the latest LAICGA device (SPY Elite) differ significantly from its predecessor system (SPY 2001), preventing direct translation of previously published data. The purpose of this study was to establish a mathematical relationship of perfusion values between these 2 devices. Breast reconstruction patients were prospectively enrolled into a clinical trial where skin flap evaluation and excision was based on quantitative SPY Q values previously established in the literature. Initial study patients underwent mastectomy skin flap evaluation using both SPY systems simultaneously. Absolute perfusion unit (APU) values at identical locations on the breast were then compared graphically. 210 data points were identified on the same patients (n = 4) using both SPY systems. A linear relationship (y = 2.9883x + 12.726) was identified with a high level of correlation (R² = 0.744). Previously published values using SPY 2001 (APU 3.7) provided a value of 23.8 APU on the SPY Elite. In addition, postoperative necrosis in these patients correlated to regions of skin identified with the SPY Elite with APU less than 23.8. Intraoperative comparison of LAICGA systems has provided direct correlation of perfusion values predictive of necrosis that were previously established in the literature. An APU value of 3.7 from the SPY 2001 correlates to a SPY Elite APU value of 23.8.
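The abstract gives an explicit linear mapping between the two devices (y = 2.9883x + 12.726, R² = 0.744) and states that 3.7 APU on the SPY 2001 corresponds to about 23.8 APU on the SPY Elite, which implies x is the SPY 2001 reading and y the SPY Elite reading. A small sketch of using that fit as a conversion and as a necrosis-risk threshold; the function names are mine, not from the paper.

```python
def spy2001_to_elite(apu_2001: float) -> float:
    """Map a SPY 2001 absolute perfusion unit (APU) value to the SPY Elite
    scale using the linear fit reported in the abstract (R^2 = 0.744)."""
    return 2.9883 * apu_2001 + 12.726

NECROSIS_THRESHOLD_ELITE = spy2001_to_elite(3.7)   # ~23.8 APU

def at_risk(apu_elite: float) -> bool:
    """Flag a skin-flap region as at risk of necrosis on the SPY Elite scale."""
    return apu_elite < NECROSIS_THRESHOLD_ELITE

print(round(NECROSIS_THRESHOLD_ELITE, 1))  # 23.8
print(at_risk(20.0), at_risk(30.0))        # True False
```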
Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.
Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo
2018-05-01
This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and established risk factors, potentially representing an important step forward in the translation of quantitative CMR perfusion analysis to the clinical setting. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Stone, Geri L.; Major, Claire H.
2014-01-01
This quantitative study, which involved development of a Value Creation Survey, examined the perceived value of leadership development programs (LDPs) provided by continuing higher education for administrators in colleges and universities. Participants were administrators at Association for Continuing Higher Education (ACHE) member institutions.…
Low rank magnetic resonance fingerprinting.
Mazor, Gal; Weizman, Lior; Tal, Assaf; Eldar, Yonina C
2016-08-01
Magnetic Resonance Fingerprinting (MRF) is a relatively new approach that provides quantitative MRI using randomized acquisition. Extraction of physical quantitative tissue values is performed off-line, based on acquisition with varying parameters and a dictionary generated according to the Bloch equations. MRF uses hundreds of radio frequency (RF) excitation pulses for acquisition, and therefore a high under-sampling ratio in the sampling domain (k-space) is required. This under-sampling causes spatial artifacts that hamper the ability to accurately estimate the quantitative tissue values. In this work, we introduce a new approach for quantitative MRI using MRF, called Low Rank MRF. We exploit the low rank property of the temporal domain, on top of the well-known sparsity of the MRF signal in the generated dictionary domain. We present an iterative scheme that consists of a gradient step followed by a low rank projection using the singular value decomposition. Experiments on real MRI data demonstrate superior results compared to a conventional implementation of compressed sensing for MRF at 15% sampling ratio.
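The abstract's iterative scheme alternates a gradient step with a low-rank projection via the singular value decomposition. The sketch below shows only that projection step, truncating the SVD of a (voxels × time points) matrix to a fixed rank; the rank and test data are illustrative, and this is not the authors' full reconstruction.

```python
import numpy as np

def low_rank_project(X: np.ndarray, rank: int) -> np.ndarray:
    """Project a (voxels x time-points) matrix onto the set of rank-`rank`
    matrices by truncating its singular value decomposition."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s[rank:] = 0.0
    return (U * s) @ Vt

# Toy usage: a noisy matrix that is approximately rank 3.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 200)) + 0.01 * rng.normal(size=(500, 200))
X_lr = low_rank_project(X, rank=3)
print(np.linalg.matrix_rank(X_lr))  # 3
```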
Delay Discounting: I'm a "K", You're a "K"
ERIC Educational Resources Information Center
Odum, Amy L.
2011-01-01
Delay discounting is the decline in the present value of a reward with delay to its receipt. Across a variety of species, populations, and reward types, value declines hyperbolically with delay. Value declines steeply with shorter delays, but more shallowly with longer delays. Quantitative modeling provides precise measures to characterize the…
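The hyperbolic decline described here is usually written in Mazur's one-parameter form V = A / (1 + kD). A short sketch of that model; the discount parameter k and the time units are chosen purely for illustration.

```python
def discounted_value(amount: float, delay: float, k: float) -> float:
    """Hyperbolic delay discounting: present value of `amount` received
    after `delay` time units, with discounting-rate parameter k."""
    return amount / (1.0 + k * delay)

# Illustrative only: k = 0.05 per day for a 100-unit reward.
for d in (0, 1, 7, 30, 180):
    print(d, round(discounted_value(100.0, d, k=0.05), 1))
```

Value falls steeply over the first few delays and more shallowly thereafter, which is the pattern the abstract describes.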
ERIC Educational Resources Information Center
Vickers, Tracy Simone
2017-01-01
The economic value of international students to higher education had become important and global competition to attract and retain these lucrative students was fierce. In British Columbia, educational goals were set to ensure that all students receive quality learning experiences and provide maximum economic benefit. Cultural values affect…
Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie
2015-01-01
To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-poly-ethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also investigate the incremental role of semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86MBq of the radiotracer. Images were evaluated by visual interpretation and semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery, 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When grade 2 of the disease was used as cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the statistical difference was not significant (P=0.372). Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher, being: 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. Results of the present study suggest that the semi-quantitative and visual analysis statistically showed similar results. The semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. It seems from our results that, when the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.
NASA Astrophysics Data System (ADS)
Li, Jiajia; Li, Rongxi; Zhao, Bangsheng; Guo, Hui; Zhang, Shuan; Cheng, Jinghua; Wu, Xiaoli
2018-04-01
The use of Micro-Laser Raman spectroscopy technology for quantitatively determining gas carbon isotope composition is presented. In this study, 12CO2 and 13CO2 were mixed with N2 at various molar fraction ratios to obtain Raman quantification factors (F12CO2 and F13CO2), which provide a theoretical basis for calculating the δ13C value. The corresponding values were 0.523 (0 < C12CO2/CN2 < 2) and 1.11998 (0 < C13CO2/CN2 < 1.5), respectively. It is shown that the representative Raman peak area can be used for the determination of δ13C values within a relative error range of 0.076% to 1.154% in 13CO2/12CO2 binary mixtures when F12CO2/F13CO2 is 0.466972625. In addition, measurements of δ13C values by Micro-Laser Raman analysis were carried out on natural CO2 gas from the Shengli Oil-field at room temperature under different pressures. The δ13C values obtained by Micro-Laser Raman spectroscopy technology and Isotope Ratio Mass Spectrometry (IRMS) technology are in good agreement with each other, and the relative error range of the δ13C values is 1.232%-6.964%. This research provides a fundamental analysis tool for determining gas carbon isotope composition (δ13C values) quantitatively by using Micro-Laser Raman spectroscopy. The experimental results demonstrate that this method has the potential for obtaining δ13C values in natural CO2 gas reservoirs.
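A sketch of how δ13C might be computed from 13CO2 and 12CO2 peak areas using the reported quantification factors. The direction of the calibration (here C_i/C_N2 = (A_i/A_N2)/F_i, so the isotope ratio becomes (A13/A12)·(F12CO2/F13CO2)) and the VPDB reference ratio are assumptions, and the peak areas in the example are invented, not values from the paper.

```python
R_VPDB = 0.011180   # conventional 13C/12C ratio of the VPDB standard (assumed reference value)

F_12CO2 = 0.523     # Raman quantification factors reported in the abstract
F_13CO2 = 1.11998

def delta13C(area_13co2: float, area_12co2: float) -> float:
    """Estimate delta-13C (permil vs. VPDB) from 13CO2 and 12CO2 Raman peak areas.

    Assumes the calibration C_i / C_N2 = (A_i / A_N2) / F_i, so the molar
    isotope ratio is R = (A13 / A12) * (F_12CO2 / F_13CO2).
    """
    r_sample = (area_13co2 / area_12co2) * (F_12CO2 / F_13CO2)
    return (r_sample / R_VPDB - 1.0) * 1000.0

# Invented peak areas, for illustration only.
print(round(delta13C(area_13co2=2.36, area_12co2=100.0), 1))
```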
What's the Big Deal? Collection Evaluation at the National Level
ERIC Educational Resources Information Center
Jurczyk, Eva; Jacobs, Pamela
2014-01-01
This article discusses a project undertaken to assess the journals in a Big Deal package by applying a weighted value algorithm measuring quality, utility, and value of individual titles. Carried out by a national library consortium in Canada, the project confirmed the value of the Big Deal package while providing a quantitative approach for…
A quantitative description for efficient financial markets
NASA Astrophysics Data System (ADS)
Immonen, Eero
2015-09-01
In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.
The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
Shi, Yin; Zong, Min; Xu, Xiaoquan; Zou, Yuefen; Feng, Yang; Liu, Wei; Wang, Chuanbing; Wang, Dehang
2015-04-01
To quantitatively evaluate nerve roots by measuring fractional anisotropy (FA) values in healthy volunteers and sciatica patients, visualize nerve roots by tractography, and compare the diagnostic efficacy between conventional magnetic resonance imaging (MRI) and DTI. Seventy-five sciatica patients and thirty-six healthy volunteers underwent MR imaging using DTI. FA values for L5-S1 lumbar nerve roots were calculated at three levels from DTI images. Tractography was performed on L3-S1 nerve roots. ROC analysis was performed for FA values. The lumbar nerve roots were visualized and FA values were calculated in all subjects. FA values decreased in compressed nerve roots and declined from proximal to distal along the compressed nerve tracts. Mean FA values were more sensitive and specific than MR imaging for differentiating compressed nerve roots, especially in the far lateral zone at distal nerves. DTI can quantitatively evaluate compressed nerve roots, and DTT enables visualization of abnormal nerve tracts, providing vivid anatomic information and localization of probable nerve compression. DTI has great potential utility for evaluating lumbar nerve compression in sciatica. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Roebuck, Joseph R.; Haker, Steven J.; Mitsouras, Dimitris; Rybicki, Frank J.; Tempany, Clare M.; Mulkern, Robert V.
2009-01-01
Quantitative, apparent T2 values of suspected prostate cancer and healthy peripheral zone tissue in men with prostate cancer were measured using a Carr-Purcell-Meiboom-Gill (CPMG) imaging sequence in order to assess the cancer discrimination potential of tissue T2 values. The CPMG imaging sequence was used to image the prostates of 18 men with biopsy proven prostate cancer. Whole gland coverage with nominal voxel volumes of 0.54 × 1.1 × 4 mm3 was obtained in 10.7 minutes, resulting in data sets suitable for generating high quality images with variable T2-weighting and for evaluating quantitative T2 values on a pixel-by-pixel basis. Region-of-interest analysis of suspected healthy peripheral zone tissue and suspected cancer, identified on the basis of both T1- and T2-weighted signal intensities and available histopathology reports, yielded significantly (p < 0.0001) longer apparent T2 values in suspected healthy tissue (193 ± 49 ms) vs. suspected cancer (100 ± 26 ms), suggesting potential utility of this method as a tissue specific discrimination index for prostate cancer. We conclude that CPMG imaging of the prostate can be performed in reasonable scan times and can provide advantages over T2-weighted fast spin echo imaging alone, including quantitative T2 values for cancer discrimination as well as proton density maps without the point spread function degradation associated with short effective echo time fast spin echo (FSE) sequences. PMID:18823731
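Quantitative T2 maps from CPMG data are typically obtained by fitting a monoexponential decay S(TE) = S0·exp(−TE/T2) at each pixel. The abstract does not state the fitting method; the sketch below uses a simple log-linear least-squares fit, with echo times and noise-free signals invented to reproduce the reported ~193 ms vs. ~100 ms contrast.

```python
import numpy as np

def fit_t2(echo_times_ms: np.ndarray, signal: np.ndarray) -> tuple[float, float]:
    """Fit S(TE) = S0 * exp(-TE / T2) by log-linear least squares.
    Returns (T2 in ms, S0)."""
    slope, intercept = np.polyfit(echo_times_ms, np.log(signal), 1)
    return -1.0 / slope, float(np.exp(intercept))

# Toy usage with the contrast reported in the abstract:
te = np.arange(10, 161, 10, dtype=float)      # 16 echoes, 10 ms spacing (illustrative)
healthy = 1000 * np.exp(-te / 193.0)          # ~193 ms (healthy peripheral zone)
tumor = 1000 * np.exp(-te / 100.0)            # ~100 ms (suspected cancer)
print(round(float(fit_t2(te, healthy)[0])), round(float(fit_t2(te, tumor)[0])))  # 193 100
```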
Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M
2012-07-01
The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.
Economic valuation of landslide damage in hilly regions: a case study from Flanders, Belgium.
Vranken, Liesbet; Van Turnhout, Pieter; Van Den Eeckhaut, Miet; Vandekerckhove, Liesbeth; Poesen, Jean
2013-03-01
Several regions around the globe are at risk of incurring damage from landslides, but only a few studies have concentrated on a quantitative estimate of the overall damage caused by landslides at a regional scale. This study therefore starts with a quantitative economic assessment of the direct and indirect damage caused by landslides in a 2,910 km² study area located west of Brussels, a low-relief region susceptible to landslides. Based on focus interviews as well as on semi-structured interviews with homeowners, civil servants and the owners and providers of lifelines such as electricity and sewage, a quantitative damage assessment is provided. For private properties (houses, forest and pasture land) we estimate the real estate and production value losses for different damage scenarios, while for public infrastructure the costs of measures to repair and prevent landslide-induced damage are estimated. In addition, the increase in amenity value of forests and grasslands due to the occurrence of landslides is also calculated. The study illustrates that a minority of land (only 2.3%) within the study area is used for dwellings, roads and railway lines, but that these land use types are responsible for the vast majority of the economic damage due to the occurrence of landslides. The annual cost of direct damage due to landsliding amounts to 688,148 €/year, of which 550,740 €/year is direct damage to houses, while the annual indirect damage amounts to 3,020,049 €/year, of which 2,007,375 €/year is indirect damage to real estate. Next, the study illustrates that the increase of the amenity value of forests and grasslands outweighs the production value loss. As such, the study provides not only quantitative input data for the estimation of future risks but also important information for government officials, as it clearly shows the costs associated with certain land use types in landslide areas. Copyright © 2013 Elsevier B.V. All rights reserved.
3D quantitative analysis of early decomposition changes of the human face.
Caplova, Zuzana; Gibelli, Daniele Maria; Poppa, Pasquale; Cummaudo, Marco; Obertova, Zuzana; Sforza, Chiarella; Cattaneo, Cristina
2018-03-01
Decomposition of the human body and human face is influenced, among other things, by environmental conditions. The early decomposition changes that modify the appearance of the face may hamper the recognition and identification of the deceased. Quantitative assessment of those changes may provide important information for forensic identification. This report presents a pilot 3D quantitative approach of tracking early decomposition changes of a single cadaver in controlled environmental conditions by summarizing the change with weekly morphological descriptions. The root mean square (RMS) value was used to evaluate the changes of the face after death. The results showed a high correlation (r = 0.863) between the measured RMS and the time since death. RMS values of each scan are presented, as well as the average weekly RMS values. The quantification of decomposition changes could improve the accuracy of antemortem facial approximation and potentially could allow the direct comparisons of antemortem and postmortem 3D scans.
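The RMS value used here to track decomposition is, in essence, a surface-to-surface distance between registered 3D scans. A minimal sketch, assuming the RMS is taken over nearest-neighbour point distances between two already-registered point clouds; this brute-force version and the test data are illustrative, not the authors' software pipeline.

```python
import numpy as np

def rms_distance(scan_a: np.ndarray, scan_b: np.ndarray) -> float:
    """Root-mean-square of nearest-neighbour distances from scan_a to scan_b.

    scan_a, scan_b: (N, 3) and (M, 3) arrays of already-registered surface points.
    Brute-force nearest neighbour; fine for small point clouds.
    """
    d2 = ((scan_a[:, None, :] - scan_b[None, :, :]) ** 2).sum(axis=2)
    nearest = np.sqrt(d2.min(axis=1))
    return float(np.sqrt((nearest ** 2).mean()))

rng = np.random.default_rng(2)
base = rng.uniform(size=(200, 3))
changed = base + rng.normal(0, 0.01, size=base.shape)  # small surface change between weeks
print(round(rms_distance(base, changed), 3))
```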
The value of health care information exchange and interoperability.
Walker, Jan; Pan, Eric; Johnston, Douglas; Adler-Milstein, Julia; Bates, David W; Middleton, Blackford
2005-01-01
In this paper we assess the value of electronic health care information exchange and interoperability (HIEI) between providers (hospitals and medical group practices) and independent laboratories, radiology centers, pharmacies, payers, public health departments, and other providers. We have created an HIEI taxonomy and combined published evidence with expert opinion in a cost-benefit model. Fully standardized HIEI could yield a net value of $77.8 billion per year once fully implemented. Nonstandardized HIEI offers smaller positive financial returns. The clinical impact of HIEI, for which quantitative estimates cannot yet be made, would likely add further value. A compelling business case exists for national implementation of fully standardized HIEI.
Industrial ecology: Quantitative methods for exploring a lower carbon future
NASA Astrophysics Data System (ADS)
Thomas, Valerie M.
2015-03-01
Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
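Two of the metrics named here, net present value and the levelized cost of energy, are easy to state concretely. A minimal sketch with invented cash flows and an assumed 5% discount rate; the project numbers are illustrative only.

```python
def npv(rate: float, cashflows: list[float]) -> float:
    """Net present value of cashflows[t] occurring at the end of year t (t = 0, 1, ...)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def lcoe(rate: float, costs: list[float], energy_kwh: list[float]) -> float:
    """Levelized cost of energy: discounted costs divided by discounted energy output."""
    return npv(rate, costs) / npv(rate, energy_kwh)

# Illustrative numbers only: 1,000 capital outlay, 20/yr O&M, 500 kWh/yr for 20 years.
years = 20
costs = [1000.0] + [20.0] * years
energy = [0.0] + [500.0] * years
print(round(lcoe(0.05, costs, energy), 3), "per kWh")
```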
Azuma, M; Hirai, T; Yamada, K; Yamashita, S; Ando, Y; Tateishi, M; Iryo, Y; Yoneda, T; Kitajima, M; Wang, Y; Yamashita, Y
2016-05-01
Quantitative susceptibility mapping is useful for assessing iron deposition in the substantia nigra of patients with Parkinson disease. We aimed to determine whether quantitative susceptibility mapping is useful for assessing the lateral asymmetry and spatial difference in iron deposits in the substantia nigra of patients with Parkinson disease. Our study population comprised 24 patients with Parkinson disease and 24 age- and sex-matched healthy controls. They underwent 3T MR imaging by using a 3D multiecho gradient-echo sequence. On reconstructed quantitative susceptibility mapping, we measured the susceptibility values in the anterior, middle, and posterior parts of the substantia nigra, the whole substantia nigra, and other deep gray matter structures in both hemibrains. To identify the more and less affected hemibrains in patients with Parkinson disease, we assessed the severity of movement symptoms for each hemibrain by using the Unified Parkinson's Disease Rating Scale. In the posterior substantia nigra of patients with Parkinson disease, the mean susceptibility value was significantly higher in the more affected than in the less affected hemibrain (P < .05). This value was significantly higher in both the more and less affected hemibrains of patients with Parkinson disease than in controls (P < .05). Asymmetry of the mean susceptibility values was significantly greater for patients than controls (P < .05). Receiver operating characteristic analysis showed that quantitative susceptibility mapping of the posterior substantia nigra in the more affected hemibrain provided the highest power for discriminating patients with Parkinson disease from the controls. Quantitative susceptibility mapping is useful for assessing the lateral asymmetry and spatial difference of iron deposition in the substantia nigra of patients with Parkinson disease. © 2016 by American Journal of Neuroradiology.
NASA Technical Reports Server (NTRS)
Bush, Lance B.
1997-01-01
In the current political climate, NASA must be able to show reliable measures demonstrating successful technology transfer. The currently available quantitative data on intellectual property technology transfer efforts portray a less than successful performance. In this paper, the use of only quantitative values for measurement of technology transfer is shown to undervalue the effort. In addition, NASA's current policy in negotiating intellectual property rights results in undervalued royalty rates. NASA has maintained that its position of providing public good precludes it from negotiating fair market value for its technology and instead has negotiated for reasonable cost in order to recover processing fees. This measurement issue is examined and recommendations are made, including a new policy for negotiating intellectual property rights and two measures to supplement the intellectual property measures.
Wang, Fei; He, Bei
2013-01-01
To investigate the role of endotracheal aspirate (EA) culture in the diagnosis and antibiotic management in ventilator-associated pneumonia (VAP). We searched CNKI, Wanfang, PUBMED and EMBASE databases published from January 1990 to December 2011, to find relevant literatures on VAP microbiological diagnostic techniques including EA and bronchoalveolar lavage (BALF). The following key words were used: ventilator associated pneumonia, diagnosis and adult. Meta-analysis was performed and the sensitivity and specificity of EA on VAP diagnosis were calculated. Our literature search identified 1665 potential articles, 8 of which fulfilled our selection criteria including 561 patients with paired cultures. Using BALF quantitative culture as reference standard, the sensitivity and specificity of EA were 72% and 71%. When considering quantitative culture of EA only, the sensitivity and specificity improved to 90% and 65%, while the positive and the negative predictive values were 68% and 89% respectively. However, the sensitivity and specificity of semi-quantitative culture of EA were only 50% and 80%, with a positive predictive value of 77% and a negative predictive value of 58% respectively. EA culture had relatively poor sensitivity and specificity, although quantitative culture of EA only could improve the sensitivity. Initiating therapy on the basis of EA quantitative culture may still result in excessive antibiotic usage. Our data suggested that EA could provide some information for clinical decision but could not replace the role of BALF quantitative culture in VAP diagnosis.
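The pooled quantities reported here (sensitivity, specificity, positive and negative predictive values) all derive from a 2×2 table against the reference standard, BALF quantitative culture. A small sketch with invented counts, not the meta-analysis data:

```python
def diagnostic_metrics(tp: int, fp: int, fn: int, tn: int) -> dict[str, float]:
    """Sensitivity, specificity, PPV and NPV from a 2x2 table, with the
    reference-standard result (e.g. BALF quantitative culture) as truth."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only.
print(diagnostic_metrics(tp=180, fp=70, fn=20, tn=130))
```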
Fujiwara, Yasuhiro; Maruyama, Hirotoshi; Toyomaru, Kanako; Nishizaka, Yuri; Fukamatsu, Masahiro
2018-06-01
Magnetic resonance imaging (MRI) is widely used to detect carotid atherosclerotic plaques. Although it is important to evaluate vulnerable carotid plaques containing lipids and intra-plaque hemorrhages (IPHs) using T1-weighted images, the image contrast changes depending on the imaging settings. Moreover, to distinguish between a thrombus and a hemorrhage, it is useful to evaluate the iron content of the plaque using both T1-weighted and T2*-weighted images. Therefore, a quantitative evaluation of carotid atherosclerotic plaques using T1 and T2* values may be necessary for the accurate evaluation of plaque components. The purpose of this study was to determine whether the multi-echo phase-sensitive inversion recovery (mPSIR) sequence can improve T1 contrast while simultaneously providing accurate T1 and T2* values of an IPH. T1 and T2* values measured using mPSIR were compared to values from conventional methods in phantom and in vivo studies. In the phantom study, the T1 and T2* values estimated using mPSIR were linearly correlated with those of conventional methods. In the in vivo study, mPSIR demonstrated higher T1 contrast between the IPH phantom and sternocleidomastoid muscle than the conventional method. Moreover, the T1 and T2* values of the blood vessel wall and sternocleidomastoid muscle estimated using mPSIR were correlated with values measured by conventional methods and with values reported previously. The mPSIR sequence improved T1 contrast while simultaneously providing accurate T1 and T2* values of the neck region. Although further study is required to evaluate the clinical utility, mPSIR may improve carotid atherosclerotic plaque detection and provide detailed information about plaque components.
Quantitative Oxygenation Venography from MRI Phase
Fan, Audrey P.; Bilgic, Berkin; Gagnon, Louis; Witzel, Thomas; Bhat, Himanshu; Rosen, Bruce R.; Adalsteinsson, Elfar
2014-01-01
Purpose To demonstrate acquisition and processing methods for quantitative oxygenation venograms that map in vivo oxygen saturation (SvO2) along cerebral venous vasculature. Methods Regularized quantitative susceptibility mapping (QSM) is used to reconstruct susceptibility values and estimate SvO2 in veins. QSM with ℓ1 and ℓ2 regularization are compared in numerical simulations of vessel structures with known magnetic susceptibility. Dual-echo, flow-compensated phase images are collected in three healthy volunteers to create QSM images. Bright veins in the susceptibility maps are vectorized and used to form a three-dimensional vascular mesh, or venogram, along which to display SvO2 values from QSM. Results Quantitative oxygenation venograms that map SvO2 along brain vessels of arbitrary orientation and geometry are shown in vivo. SvO2 values in major cerebral veins lie within the normal physiological range reported by 15O positron emission tomography. SvO2 from QSM is consistent with previous MR susceptometry methods for vessel segments oriented parallel to the main magnetic field. In vessel simulations, ℓ1 regularization results in less than 10% SvO2 absolute error across all vessel tilt orientations and provides more accurate SvO2 estimation than ℓ2 regularization. Conclusion The proposed analysis of susceptibility images enables reliable mapping of quantitative SvO2 along venograms and may facilitate clinical use of venous oxygenation imaging. PMID:24006229
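A sketch of the susceptometry relation commonly used to turn a vein's susceptibility into SvO2, Δχ = Δχ_do·Hct·(1 − SvO2). The deoxyhemoglobin susceptibility constant, the hematocrit, and the example susceptibility value are assumed, literature-style numbers, and the paper's exact formulation (e.g., any fully-oxygenated offset term) is not reproduced here.

```python
import math

DELTA_CHI_DO_PPM = 4 * math.pi * 0.27  # susceptibility difference between fully deoxygenated and
                                       # fully oxygenated blood per unit hematocrit (ppm, SI);
                                       # an assumed constant, not taken from the abstract
HEMATOCRIT = 0.42                      # assumed venous hematocrit

def svo2_from_susceptibility(delta_chi_ppm: float) -> float:
    """SvO2 from the vein-minus-tissue susceptibility difference (ppm), using the
    simplified relation delta_chi = DELTA_CHI_DO_PPM * Hct * (1 - SvO2)."""
    return 1.0 - delta_chi_ppm / (DELTA_CHI_DO_PPM * HEMATOCRIT)

print(round(svo2_from_susceptibility(0.45), 2))  # an illustrative vein susceptibility value
```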
Electron Inelastic-Mean-Free-Path Database
National Institute of Standards and Technology Data Gateway
SRD 71 NIST Electron Inelastic-Mean-Free-Path Database (PC database, no charge) This database provides values of electron inelastic mean free paths (IMFPs) for use in quantitative surface analyses by AES and XPS.
Thind, Munveer; Ahmed, Mustafa I; Gok, Gulay; Joson, Marisa; Elsayed, Mahmoud; Tuck, Benjamin C; Townsley, Matthew M; Klas, Berthold; McGiffin, David C; Nanda, Navin C
2015-05-01
We report a case of a right atrial thrombus traversing a patent foramen ovale into the left atrium, where three-dimensional transesophageal echocardiography provided considerable incremental value over two-dimensional transesophageal echocardiography in its assessment. As well as allowing us to better spatially characterize the thrombus, three-dimensional transesophageal echocardiography provided a more quantitative assessment through estimation of total thrombus burden. © 2015, Wiley Periodicals, Inc.
Fundamentals of quantitative dynamic contrast-enhanced MR imaging.
Paldino, Michael J; Barboriak, Daniel P
2009-05-01
Quantitative analysis of dynamic contrast-enhanced MR imaging (DCE-MR imaging) has the power to provide information regarding physiologic characteristics of the microvasculature and is, therefore, of great potential value to the practice of oncology. In particular, these techniques could have a significant impact on the development of novel anticancer therapies as a promising biomarker of drug activity. Standardization of DCE-MR imaging acquisition and analysis to provide more reproducible measures of tumor vessel physiology is of crucial importance to realize this potential. The purpose of this article is to review the pathophysiologic basis and technical aspects of DCE-MR imaging techniques.
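Quantitative DCE-MRI analysis is most often built on the standard Tofts model, Ct(t) = Ktrans ∫ Cp(τ)·exp(−kep(t−τ)) dτ with kep = Ktrans/ve. The sketch below evaluates that convolution numerically; the arterial input function and parameter values are illustrative, and the review's own formulations are not reproduced here.

```python
import numpy as np

def tofts_ct(t_min: np.ndarray, cp: np.ndarray, ktrans: float, ve: float) -> np.ndarray:
    """Standard Tofts model: tissue concentration as Ktrans * conv(Cp, exp(-kep*t)),
    with kep = Ktrans / ve. Times in minutes, concentrations in mM."""
    kep = ktrans / ve
    dt = t_min[1] - t_min[0]
    kernel = np.exp(-kep * t_min)
    return ktrans * np.convolve(cp, kernel)[: len(t_min)] * dt

# Illustrative arterial input function (simple biexponential) and parameters.
t = np.arange(0, 5, 0.02)                       # 5 minutes sampled every 1.2 s
aif = 5.0 * (np.exp(-3.0 * t) + 0.3 * np.exp(-0.2 * t))
ct = tofts_ct(t, aif, ktrans=0.25, ve=0.3)      # Ktrans in 1/min, ve dimensionless
print(round(float(ct.max()), 3))
```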
Savageau, M A
1998-01-01
Induction of gene expression can be accomplished either by removing a restraining element (negative mode of control) or by providing a stimulatory element (positive mode of control). According to the demand theory of gene regulation, which was first presented in qualitative form in the 1970s, the negative mode will be selected for the control of a gene whose function is in low demand in the organism's natural environment, whereas the positive mode will be selected for the control of a gene whose function is in high demand. This theory has now been further developed in a quantitative form that reveals the importance of two key parameters: cycle time C, which is the average time for a gene to complete an ON/OFF cycle, and demand D, which is the fraction of the cycle time that the gene is ON. Here we estimate nominal values for the relevant mutation rates and growth rates and apply the quantitative demand theory to the lactose and maltose operons of Escherichia coli. The results define regions of the C vs. D plot within which selection for the wild-type regulatory mechanisms is realizable, and these in turn provide the first estimates for the minimum and maximum values of demand that are required for selection of the positive and negative modes of gene control found in these systems. The ratio of mutation rate to selection coefficient is the most relevant determinant of the realizable region for selection, and the most influential parameter is the selection coefficient that reflects the reduction in growth rate when there is superfluous expression of a gene. The quantitative theory predicts the rate and extent of selection for each mode of control. It also predicts three critical values for the cycle time. The predicted maximum value for the cycle time C is consistent with the lifetime of the host. The predicted minimum value for C is consistent with the time for transit through the intestinal tract without colonization. Finally, the theory predicts an optimum value of C that is in agreement with the observed frequency for E. coli colonizing the human intestinal tract. PMID:9691028
Contributions of Estuarine Habitats to Major Fisheries
Estuaries provide unique habitat conditions that are essential to the production of major fisheries throughout the world, but quantitatively demonstrating the value of these habitats to fisheries presents some difficult problems. The questions are important, because critical hab...
Atmospheric Science Data Center
2014-05-15
... appear in the upper right-hand corners of both images. Quantitative values for the vegetation changes are provided by the center and ... MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center in Hampton, VA. Image ...
Pi, Shan; Cao, Rong; Qiang, Jin Wei; Guo, Yan Hui
2018-01-01
Background Diffusion-weighted imaging (DWI) and quantitative apparent diffusion coefficient (ADC) values are widely used in the differential diagnosis of ovarian tumors. Purpose To assess the diagnostic performance of quantitative ADC values in ovarian tumors. Material and Methods PubMed, Embase, the Cochrane Library, and local databases were searched for studies assessing ovarian tumors using quantitative ADC values. We quantitatively analyzed the diagnostic performances for two clinical problems: benign vs. malignant tumors and borderline vs. malignant tumors. We evaluated diagnostic performances by the pooled sensitivity and specificity values and by summary receiver operating characteristic (SROC) curves. Subgroup analyses were used to analyze study heterogeneity. Results From the 742 studies identified in the search results, 16 studies met our inclusion criteria. A total of ten studies evaluated malignant vs. benign ovarian tumors and six studies assessed malignant vs. borderline ovarian tumors. Regarding the diagnostic accuracy of quantitative ADC values for distinguishing between malignant and benign ovarian tumors, the pooled sensitivity and specificity values were 0.91 and 0.91, respectively. The area under the SROC curve (AUC) was 0.96. For differentiating borderline from malignant tumors, the pooled sensitivity and specificity values were 0.89 and 0.79, and the AUC was 0.91. The methodological quality of the included studies was moderate. Conclusion Quantitative ADC values could serve as useful preoperative markers for predicting the nature of ovarian tumors. Nevertheless, prospective trials focused on standardized imaging parameters are needed to evaluate the clinical value of quantitative ADC values in ovarian tumors.
How to quantitatively evaluate safety of driver behavior upon accident? A biomechanical methodology.
Zhang, Wen; Cao, Jieer; Xu, Jun
2017-01-01
How to evaluate driver spontaneous reactions in various collision patterns in a quantitative way is one of the most important topics in vehicle safety. Firstly, this paper constructs representative numerical crash scenarios described by impact velocity, impact angle and contact position based on a finite element (FE) computation platform. Secondly, a driver cabin model is extracted and described in a well-validated multi-rigid body (MB) model to compute the value of a weighted injury criterion to quantitatively assess drivers' overall injury under certain circumstances. Furthermore, based on the coupling of FE and MB, parametric studies on various crash scenarios are conducted. The results reveal that the variation law of the WIC (Weighted Injury Criteria) value at high impact velocities is quite distinct from that at low impact velocities. In addition, the coupling effect is evident in that the difference in WIC values among the three impact velocities at smaller impact angles tends to be distinctly larger than that at larger impact angles. Meanwhile, high impact velocity also increases the sensitivity of WIC to different collision positions and impact angles. The results may provide a new methodology to quantitatively evaluate driving behaviors and serve as a significant guiding step towards collision avoidance for autonomous driving vehicles.
Quantitative imaging biomarkers: Effect of sample size and bias on confidence interval coverage.
Obuchowski, Nancy A; Bullen, Jennifer
2017-01-01
Introduction: Quantitative imaging biomarkers (QIBs) are being increasingly used in medical practice and clinical trials. An essential first step in the adoption of a quantitative imaging biomarker is the characterization of its technical performance, i.e. precision and bias, through one or more performance studies. Then, given the technical performance, a confidence interval for a new patient's true biomarker value can be constructed. Estimating bias and precision can be problematic because rarely are both estimated in the same study, precision studies are usually quite small, and bias cannot be measured when there is no reference standard. Methods: A Monte Carlo simulation study was conducted to assess factors affecting nominal coverage of confidence intervals for a new patient's quantitative imaging biomarker measurement and for change in the quantitative imaging biomarker over time. Factors considered include sample size for estimating bias and precision, effect of fixed and non-proportional bias, clustered data, and absence of a reference standard. Results: Technical performance studies of a quantitative imaging biomarker should include at least 35 test-retest subjects to estimate precision and 65 cases to estimate bias. Confidence intervals for a new patient's quantitative imaging biomarker measurement constructed under the no-bias assumption provide nominal coverage as long as the fixed bias is <12%. For confidence intervals of the true change over time, linearity must hold and the slope of the regression of the measurements vs. true values should be between 0.95 and 1.05. The regression slope can be assessed adequately as long as fixed multiples of the measurand can be generated. Even small non-proportional bias greatly reduces confidence interval coverage. Multiple lesions in the same subject can be treated as independent when estimating precision. Conclusion: Technical performance studies of quantitative imaging biomarkers require moderate sample sizes in order to provide robust estimates of bias and precision for constructing confidence intervals for new patients. Assumptions of linearity and non-proportional bias should be assessed thoroughly.
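A minimal sketch of the confidence-interval construction described above, under a simplified model in which a single measurement equals truth plus a fixed bias plus Gaussian error whose SD is the within-subject SD from a test-retest study; the function and the numbers are illustrative and are not the paper's simulation framework.

```python
def qib_confidence_interval(y, bias=0.0, wsd=1.0, z=1.96):
    """Approximate 95% CI for a new patient's true biomarker value.

    Simplified model: y = truth + bias + error, error ~ N(0, wsd**2), where wsd is
    the within-subject SD from a test-retest (precision) study and bias comes from
    a separate bias study.  Under the no-bias assumption, pass bias=0.
    """
    center = y - bias
    half_width = z * wsd
    return center - half_width, center + half_width

# Example: a measurement of 10.0 with wsd = 0.8 and an assumed zero bias
print(qib_confidence_interval(10.0, bias=0.0, wsd=0.8))  # (8.432, 11.568)
```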
Zhang, Yuxin; Holmes, James; Rabanillo, Iñaki; Guidon, Arnaud; Wells, Shane; Hernando, Diego
2018-09-01
To evaluate the reproducibility of quantitative diffusion measurements obtained with reduced Field of View (rFOV) and Multi-shot EPI (msEPI) acquisitions, using single-shot EPI (ssEPI) as a reference. Diffusion phantom experiments, and prostate diffusion-weighted imaging in healthy volunteers and patients with known or suspected prostate cancer, were performed across the three different sequences. Quantitative diffusion measurements of apparent diffusion coefficient, and diffusion kurtosis parameters (healthy volunteers), were obtained and compared across diffusion sequences (rFOV, msEPI, and ssEPI). Other possible confounding factors, such as b-value combinations and acquisition parameters, were also investigated. Both msEPI and rFOV showed reproducible quantitative diffusion measurements relative to ssEPI; no significant difference in ADC was observed across pulse sequences in the standard diffusion phantom (p = 0.156), healthy volunteers (p ≥ 0.12) or patients (p ≥ 0.26). The ADC values within the non-cancerous central gland and peripheral zone of patients were 1.29 ± 0.17 × 10⁻³ mm²/s and 1.74 ± 0.23 × 10⁻³ mm²/s, respectively. However, differences in quantitative diffusion parameters were observed across different numbers of averages for rFOV, and across b-value groups and diffusion models for all three sequences. Both rFOV and msEPI have the potential to provide high image quality with reproducible quantitative diffusion measurements in prostate diffusion MRI. Copyright © 2018 Elsevier Inc. All rights reserved.
Gámez-Cenzano, Cristina; Pino-Sorroche, Francisco
2014-04-01
There is a growing interest in using quantification in FDG-PET/CT in oncology, especially for evaluating response to therapy. Complex full quantitative procedures with blood sampling and dynamic scanning have been clinically replaced by the use of standardized uptake value measurements that provide an index of regional tracer uptake normalized to the administered dose of FDG. Some approaches have been proposed for assessing quantitative metabolic response, such as EORTC and PERCIST criteria in solid tumors. When using standardized uptake value in clinical routine and multicenter trials, standardization of protocols and quality control procedures of instrumentation is required. Copyright © 2014 Elsevier Inc. All rights reserved.
Deployment of e-health services - a business model engineering strategy.
Kijl, Björn; Nieuwenhuis, Lambert J M; Huis in 't Veld, Rianne M H A; Hermens, Hermie J; Vollenbroek-Hutten, Miriam M R
2010-01-01
We designed a business model for deploying a myofeedback-based teletreatment service. An iterative and combined qualitative and quantitative action design approach was used for developing the business model and the related value network. Insights from surveys, desk research, expert interviews, workshops and quantitative modelling were combined to produce the first business model and then to refine it in three design cycles. The business model engineering strategy provided important insights which led to an improved, more viable and feasible business model and related value network design. Based on this experience, we conclude that the process of early stage business model engineering reduces risk and produces substantial savings in costs and resources related to service deployment.
Evolutionary Technique for Automated Synthesis of Electronic Circuits
NASA Technical Reports Server (NTRS)
Stoica, Adrian (Inventor); Salazar-Lazaro, Carlos Harold (Inventor)
2003-01-01
A method for evolving a circuit comprising configuring a plurality of transistors using a plurality of reconfigurable switches, so that each of the plurality of transistors has a terminal coupled, via a single reconfigurable switch, to a terminal of another of the plurality of transistors. The plurality of reconfigurable switches is controlled in response to a chromosome pattern. The plurality of reconfigurable switches may be controlled using an annealing function. As such, the plurality of reconfigurable switches may be controlled by selecting qualitative values for the plurality of reconfigurable switches in response to the chromosome pattern, selecting initial quantitative values for the selected qualitative values, and morphing the initial quantitative values. Typically, subsequent quantitative values will be selected to be more divergent than the initial quantitative values. The morphing process may continue until the quantitative values are partially or completely polarized.
Trees and logs important to wildlife in the interior Columbia River basin.
Evelyn L. Bull; Catherine G. Parks; Torolf R. Torgersen
1997-01-01
This publication provides qualitative and quantitative information on five distinct structures: living trees with decayed parts, trees with hollow chambers, trees with brooms, dead trees, and logs. Information is provided on the value of these structures to wildlife, the decay or infection processes involved in the formation of these structures, and the principles to...
Metrics and the effective computational scientist: process, quality and communication.
Baldwin, Eric T
2012-09-01
Recent treatments of computational knowledge worker productivity have focused upon the value the discipline brings to drug discovery using positive anecdotes. While this big picture approach provides important validation of the contributions of these knowledge workers, the impact accounts do not provide the granular detail that can help individuals and teams perform better. I suggest balancing the impact-focus with quantitative measures that can inform the development of scientists. Measuring the quality of work, analyzing and improving processes, and the critical evaluation of communication can provide immediate performance feedback. The introduction of quantitative measures can complement the longer term reporting of impacts on drug discovery. These metric data can document effectiveness trends and can provide a stronger foundation for the impact dialogue. Copyright © 2012 Elsevier Ltd. All rights reserved.
Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng
2018-03-01
Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem service flows provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services. The types of ecosystem service flows were reclassified. Using entropy theory, the degree of disorder and the development trends of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem from 2004 to 2015 was 0.794 and the entropy flow was -0.024, suggesting a high degree of disorder, with the system close to an unhealthy state. The system reached maximum values three times, while the mean annual variation of the system entropy value increased gradually over the three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The coefficient of determination for the fitted relationship between Beijing's total permanent population and urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with total permanent population.
Chen, Chun; Huang, Minghua; Han, Zhihua; Shao, Lixin; Xie, Yan; Wu, Jianhong; Zhang, Yan; Xin, Hongkui; Ren, Aijun; Guo, Yong; Wang, Deli; He, Qing; Ruan, Dike
2014-01-01
The objective of this study was to evaluate the efficacy of quantitative T2 magnetic resonance imaging (MRI) for quantifying early cervical intervertebral disc (IVD) degeneration in asymptomatic young adults by correlating the T2 value with Pfirrmann grade, sex, and anatomic level. Seventy asymptomatic young subjects (34 men and 36 women; mean age, 22.80±2.11 yr; range, 18-25 years) underwent 3.0-T MRI to obtain morphological data (one T1-fast spin echo (FSE) and three-plane T2-FSE, used to assign a Pfirrmann grade (I-V)) and for T2 mapping (multi-echo spin echo). T2 values in the nucleus pulposus (NP, n = 350) and anulus fibrosus (AF, n = 700) were obtained. Differences in T2 values between sexes and anatomic levels were evaluated, and linear correlation analysis of T2 values versus degenerative grade was conducted. Cervical IVDs of healthy young adults were commonly determined to be at Pfirrmann grades I and II. T2 values of NPs were significantly higher than those of AF at all anatomic levels (P<0.001). The NP, anterior AF and posterior AF values did not differ significantly between sexes at the same anatomic level (P>0.05). T2 values decreased linearly with degenerative grade. Linear correlation analysis revealed a strong negative association between the Pfirrmann grade and the T2 values of the NP (P = 0.000) but not the T2 values of the AF (P = 0.854). However, non-degenerated discs (Pfirrmann grades I and II) showed a wide range of T2 relaxation times. T2 values according to disc degeneration level classification were as follows: grade I (>62.03 ms), grade II (54.60-62.03 ms), grade III (<54.60 ms). T2 quantitation provides a more sensitive and robust approach for detecting and characterizing the early stage of cervical IVD degeneration and for establishing reliable quantitative reference values in healthy young adults.
A practical guide to value of information analysis.
Wilson, Edward C F
2015-02-01
Value of information analysis is a quantitative method to estimate the return on investment in proposed research projects. It can be used in a number of ways. Funders of research may find it useful to rank projects in terms of the expected return on investment from a variety of competing projects. Alternatively, trialists can use the principles to identify the efficient sample size of a proposed study as an alternative to traditional power calculations, and finally, a value of information analysis can be conducted alongside an economic evaluation as a quantitative adjunct to the 'future research' or 'next steps' section of a study write up. The purpose of this paper is to present a brief introduction to the methods, a step-by-step guide to calculation and a discussion of issues that arise in their application to healthcare decision making. Worked examples are provided in the accompanying online appendices as Microsoft Excel spreadsheets.
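A minimal Monte Carlo sketch of the per-decision expected value of perfect information (EVPI), the core quantity behind the return-on-research calculations described above; the two-strategy net-benefit model and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative net monetary benefit of two strategies as a function of an
# uncertain effectiveness parameter theta (all numbers are hypothetical).
theta = rng.normal(loc=0.05, scale=0.03, size=n)   # uncertain QALY gain
wtp = 20_000                                        # willingness to pay per QALY
nb_standard = np.zeros(n)                           # baseline strategy
nb_new = wtp * theta - 500                          # new strategy: gain minus extra cost

nb = np.column_stack([nb_standard, nb_new])
ev_current_info = nb.mean(axis=0).max()             # pick the best strategy on average
ev_perfect_info = nb.max(axis=1).mean()             # pick the best strategy per simulation
evpi = ev_perfect_info - ev_current_info
print(f"Per-decision EVPI: {evpi:.1f}")
```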
Quantitative Study on Corrosion of Steel Strands Based on Self-Magnetic Flux Leakage.
Xia, Runchuan; Zhou, Jianting; Zhang, Hong; Liao, Leng; Zhao, Ruiqiang; Zhang, Zeyu
2018-05-02
This paper proposes a new computing method to quantitatively and non-destructively determine the corrosion of steel strands by analyzing the self-magnetic flux leakage (SMFL) signals from them. The magnetic dipole model and three growth models (logistic, exponential, and linear) were proposed to theoretically analyze the characteristic value of the SMFL. Then, an experimental study of corrosion detection with a magnetic sensor was carried out. The setup of the magnetic scanning device and the signal collection method are also introduced. The results show that the logistic growth model is the optimal model for calculating the magnetic field, with a good fit. Combined with analysis of the experimental data, the amplitudes of the calculated values (the B_xL(x, z) curves) agree with the measured values in general. This method offers significant application prospects for evaluating the corrosion and the residual bearing capacity of steel strands.
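A hedged sketch of fitting a generic three-parameter logistic growth curve to (corrosion level, SMFL characteristic value) pairs with SciPy; the data points and parameter names are made up, and the paper's exact model form may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, k, x0, L):
    """Three-parameter logistic growth curve."""
    return L / (1.0 + np.exp(-k * (x - x0)))

# Hypothetical corrosion levels (e.g. mass-loss ratio) and SMFL characteristic values
corrosion = np.array([0.00, 0.02, 0.05, 0.08, 0.12, 0.16, 0.20])
b_char = np.array([0.5, 0.9, 2.1, 4.8, 7.9, 9.2, 9.7])

params, _ = curve_fit(logistic, corrosion, b_char, p0=[50.0, 0.08, 10.0])
print("fitted k, x0, L:", params)
```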
Early prediction of coma recovery after cardiac arrest with blinded pupillometry.
Solari, Daria; Rossetti, Andrea O; Carteron, Laurent; Miroz, John-Paul; Novy, Jan; Eckert, Philippe; Oddo, Mauro
2017-06-01
Prognostication studies on comatose cardiac arrest (CA) patients are limited by lack of blinding, potentially causing overestimation of outcome predictors and self-fulfilling prophecy. Using a blinded approach, we analyzed the value of quantitative automated pupillometry to predict neurological recovery after CA. We examined a prospective cohort of 103 comatose adult patients who were unconscious 48 hours after CA and underwent repeated measurements of quantitative pupillary light reflex (PLR) using the Neurolight-Algiscan device. Clinical examination, electroencephalography (EEG), somatosensory evoked potentials (SSEP), and serum neuron-specific enolase were performed in parallel, as part of standard multimodal assessment. Automated pupillometry results were blinded to clinicians involved in patient care. Cerebral Performance Categories (CPC) at 1 year was the outcome endpoint. Survivors (n = 50 patients; 32 CPC 1, 16 CPC 2, 2 CPC 3) had higher quantitative PLR (median = 20 [range = 13-41] vs 11 [0-55] %, p < 0.0001) and constriction velocity (1.46 [0.85-4.63] vs 0.94 [0.16-4.97] mm/s, p < 0.0001) than nonsurvivors. At 48 hours, a quantitative PLR < 13% had 100% specificity and positive predictive value to predict poor recovery (0% false-positive rate), and provided equal performance to that of EEG and SSEP. Reduced quantitative PLR correlated with higher serum neuron-specific enolase (Spearman r = -0.52, p < 0.0001). Reduced quantitative PLR correlates with postanoxic brain injury and, when compared to standard multimodal assessment, is highly accurate in predicting long-term prognosis after CA. This is the first prognostication study to show the value of automated pupillometry using a blinded approach to minimize self-fulfilling prophecy. Ann Neurol 2017;81:804-810. © 2017 American Neurological Association.
Norton, Heather L; Edwards, Melissa; Krithika, S; Johnson, Monique; Werren, Elizabeth A; Parra, Esteban J
2016-08-01
The main goals of this study are to 1) quantitatively measure skin, hair, and iris pigmentation in a diverse sample of individuals, 2) describe variation within and between these samples, and 3) demonstrate how quantitative measures can facilitate genotype-phenotype association tests. We quantitatively characterize skin, hair, and iris pigmentation using the Melanin (M) Index (skin) and CIELab values (hair) in 1,450 individuals who self-identify as African American, East Asian, European, Hispanic, or South Asian. We also quantify iris pigmentation in a subset of these individuals using CIELab values from high-resolution iris photographs. We compare mean skin M index and hair and iris CIELab values among populations using ANOVA and MANOVA respectively and test for genotype-phenotype associations in the European sample. All five populations are significantly different for skin (P < 2 × 10⁻¹⁶) and hair color (P < 2 × 10⁻¹⁶). Our quantitative analysis of iris and hair pigmentation reinforces the continuous, rather than discrete, nature of these traits. We confirm the association of three loci (rs16891982, rs12203592, and rs12913832) with skin pigmentation and four loci (rs12913832, rs12203592, rs12896399, and rs16891982) with hair pigmentation. Interestingly, the derived rs12203592 T allele located within the IRF4 gene is associated with lighter skin but darker hair color. The quantitative methods used here provide a fine-scale assessment of pigmentation phenotype and facilitate genotype-phenotype associations, even with relatively small sample sizes. This represents an important expansion of current investigations into pigmentation phenotype and associated genetic variation by including non-European and admixed populations. Am J Phys Anthropol 160:570-581, 2016. © 2015 Wiley Periodicals, Inc.
Bousse, Tatiana; Shore, David A.; Goldsmith, Cynthia S.; Hossain, M. Jaber; Jang, Yunho; Davis, Charles T.; Donis, Ruben O.; Stevens, James
2017-01-01
Summary: Recent advances in instrumentation and data analysis in field flow fractionation and multi-angle light scattering (FFF-MALS) have enabled greater use of this technique to characterize and quantitate viruses. In this study, the FFF-MALS technique was applied to the characterization and quantitation of type A influenza virus particles to assess its usefulness for vaccine preparation. The use of FFF-MALS for quantitation and measurement of control particles provided data accurate to within 5% of known values, reproducible with a coefficient of variation of 1.9%. The methods, sensitivity, and limit of detection were established by analyzing different volumes of purified virus, which produced a linear regression with an R2 of 0.99. FFF-MALS was further applied to detect and quantitate influenza virus in the supernatant of infected MDCK cells and in the allantoic fluids of infected eggs. FFF fractograms of the virus present in these different fluids revealed similar distributions of monomeric and oligomeric virions. However, the monomer fraction of cell-grown virus showed greater size variety. Notably, β-propiolactone (BPL) inactivation of influenza viruses did not influence any of the FFF-MALS measurements. Quantitation by FFF-MALS was compared to infectivity assays and real-time RT-PCR (qRT-PCR), and the limitations of each assay are discussed. PMID:23916678
Medved, Milica; Sammet, Steffen; Yousuf, Ambereen; Oto, Aytekin
2015-01-01
Purpose: To determine the possibility of obtaining high-quality magnetic resonance (MR) images before, during, and immediately after ejaculation and detecting measurable changes in quantitative MR imaging parameters after ejaculation. Materials and Methods: In this prospective, institutional review board–approved, HIPAA-compliant study, eight young healthy volunteers (median age, 22.5 years), after providing informed consent, underwent MR imaging while masturbating to the point of ejaculation. A 1.5-T MR imaging unit was used, with an eight-channel surface coil and a dynamic single-shot fast spin-echo sequence. In addition, a quantitative MR imaging protocol that allowed calculation of T1, T2, and apparent diffusion coefficient (ADC) values was applied before and after ejaculation. Volumes of the prostate and seminal vesicles (SV) were calculated by using whole-volume segmentation on T2-weighted images, both before and after ejaculation. Pre- and postejaculation changes in quantitative MR parameters and measured volumes were evaluated by using the Wilcoxon signed rank test with Bonferroni adjustment. Results: There was no significant change in prostate volumes on pre- and postejaculation images, while the SV contracted by 41% on average (median, 44.5%; P = .004). No changes before and after ejaculation were observed in T1 values or in T2 and ADC values in the central gland, while T2 and ADC values were significantly reduced in the peripheral zone by 12% and 14%, respectively (median, 13% and 14.5%, respectively; P = .004). Conclusion: Successful dynamic MR imaging of ejaculation events and the ability to visualize internal sphincter closure, passage of ejaculate, and significant changes in SV volumes were demonstrated. Significant changes in peripheral zone T2 and ADC values were observed. PMID:24495265
The predictive power of Japanese candlestick charting in Chinese stock market
NASA Astrophysics Data System (ADS)
Chen, Shi; Bao, Si; Zhou, Yu
2016-09-01
This paper studies the predictive power of four popular pairs of two-day bullish and bearish Japanese candlestick patterns in the Chinese stock market. Based on Morris' study, we give the quantitative details of the definition of a long candlestick, which is important in two-day candlestick pattern recognition but was ignored by several previous studies, and we further give quantitative definitions of these four pairs of two-day candlestick patterns. To test the predictive power of candlestick patterns for short-term price movement, we propose the definition of a daily average return to alleviate the impact of correlation among stocks' overlap-time returns in statistical tests. To show the robustness of our results, two methods of trend definition are used for both the medium-market-value and large-market-value sample sets. We use the Step-SPA test to correct for data snooping bias. Statistical results show that the predictive power differs from pattern to pattern: three of the eight patterns provide both short-term and relatively long-term prediction, another pair provides significant forecasting power only within a very short-term period, while the remaining three patterns present contradictory results for different market-value groups. For all four pairs, the predictive power drops as the prediction horizon increases, and forecasting power is stronger for stocks with medium market value than for those with large market value.
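A hedged sketch of a two-day bullish pattern check with an explicit quantitative "long body" threshold, in the spirit of the definitions discussed above; the engulfing pattern, the threshold k, and the example candles are illustrative choices, not the paper's exact specifications.

```python
from dataclasses import dataclass

@dataclass
class Candle:
    open: float
    high: float
    low: float
    close: float

def is_long(candle: Candle, avg_body: float, k: float = 1.3) -> bool:
    """One possible quantitative definition: a candle is 'long' if its real body
    exceeds k times the recent average real body."""
    return abs(candle.close - candle.open) >= k * avg_body

def bullish_engulfing(day1: Candle, day2: Candle, avg_body: float) -> bool:
    """Two-day bullish engulfing: a bearish day fully engulfed by a long bullish day."""
    return (day1.close < day1.open                  # day 1 is bearish
            and day2.close > day2.open              # day 2 is bullish
            and is_long(day2, avg_body)             # day 2 has a long real body
            and day2.open <= day1.close             # day 2 body engulfs
            and day2.close >= day1.open)            #   the day 1 body

d1 = Candle(10.6, 10.7, 10.2, 10.3)
d2 = Candle(10.2, 11.1, 10.1, 11.0)
print(bullish_engulfing(d1, d2, avg_body=0.4))      # True
```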
Hefelfinger, Jenny; Patty, Alice; Ussery, Ann; Young, Walter
2013-10-24
This study assessed the value of technical assistance provided by state health department expert advisors and by the staff of the National Association of Chronic Disease Directors (NACDD) to community groups that participated in the Action Communities for Health, Innovation, and Environmental Change (ACHIEVE) Program, a CDC-funded health promotion program. We analyzed quantitative and qualitative data reported by community project coordinators to assess the nature and value of technical assistance provided by expert advisors and NACDD staff and the usefulness of ACHIEVE resources in the development and implementation of community action plans. A grounded theory approach was used to analyze and categorize phrases in text data provided by community coordinators. Open coding placed conceptual labels on text phrases. Frequency distributions of the quantitative data are described and discussed. The most valuable technical assistance and program support resources were those determined to be in the interpersonal domain (ie, interactions with state expert advisors, NACDD staff, and peer-to-peer support). The most valuable technical assistance events were action institutes, coaches' meetings, webinars, and technical assistance conference calls. This analysis suggests that ACHIEVE communities valued the management and training assistance provided by expert advisors and NACDD staff. State health department expert advisors provided technical guidance and support, including such skills or knowledge-based services as best-practice strategies, review and discussion of community assessment data, sustainability planning, and identification of possible funding opportunities. NACDD staff led development and implementation of technical assistance events.
Developing integrative primary healthcare delivery: adding a chiropractor to the team.
Garner, Michael J; Birmingham, Michael; Aker, Peter; Moher, David; Balon, Jeff; Keenan, Dirk; Manga, Pran
2008-01-01
The use of complementary and alternative medicine has been increasing in Canada despite the lack of coverage under the universal public health insurance system. Physicians and other healthcare practitioners are now being placed in multidisciplinary teams, yet little research on integration exists. We sought to investigate the effect of integrating chiropractic on the attitudes of providers on two healthcare teams. A mixed methods design with both quantitative and qualitative components was used to assess the healthcare teams. Assessment occurred prior to integration, at midstudy, and at the end of the study (18 months). Multidisciplinary healthcare teams at two community health centers in Ottawa, Ontario, participated in the study. All physicians, nurse practitioners, and degree-trained nurses employed at two study sites were approached to take part in the study. A chiropractor was introduced into each of the two healthcare teams. A quantitative questionnaire assessed providers' opinions, experiences with collaboration, and perceptions of chiropractic care. Focus groups were used to encourage providers to communicate their experiences and perceptions of the integration and of chiropractic. Twelve providers were followed for the full 18 months of integration. The providers expressed increased willingness to trust the chiropractors in shared care (F value = 7.18; P = .004). Questions regarding the legitimacy (F value = 12.33; P < .001) and effectiveness (F value = 11.17; P < .001) of chiropractic became increasingly positive by study end. This project has demonstrated the successful integration of chiropractors into primary healthcare teams.
Overview of T.E.S.T. (Toxicity Estimation Software Tool)
This talk provides an overview of T.E.S.T. (Toxicity Estimation Software Tool). T.E.S.T. predicts toxicity values and physical properties using a variety of different QSAR (quantitative structure activity relationship) approaches including hierarchical clustering, group contribut...
77 FR 71191 - 2012 Recreational Water Quality Criteria
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-29
... Criteria AGENCY: Environmental Protection Agency (EPA). ACTION: Notice of availability of the 2012... for beach monitoring, quantitative polymerase chain reaction (qPCR), for the detection of enterococci... managing recreational waters, such as predictive modeling; the EPA is providing a beach action value for...
Dolled-Filhart, Marisa P; Gustavson, Mark D
2012-11-01
Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Synovec, R.E.; Johnson, E.L.; Bahowick, T.J.
1990-08-01
This paper describes a new technique for data analysis in chromatography, based on taking the point-by-point ratio of sequential chromatograms that have been baseline corrected. This ratio chromatogram provides a robust means for the identification and the quantitation of analytes. In addition, the appearance of an interferent is made highly visible, even when it coelutes with desired analytes. For quantitative analysis, the region of the ratio chromatogram corresponding to the pure elution of an analyte is identified and is used to calculate a ratio value equal to the ratio of concentrations of the analyte in sequential injections. For the ratio value calculation, a variance-weighted average is used, which compensates for the varying signal-to-noise ratio. This ratio value, or equivalently the percent change in concentration, is the basis of a chromatographic standard addition method and an algorithm to monitor analyte concentration in a process stream. In the case of overlapped peaks, a spiking procedure is used to calculate both the original concentration of an analyte and its signal contribution to the original chromatogram. Thus, quantitation and curve resolution may be performed simultaneously, without peak modeling or curve fitting. These concepts are demonstrated by using data from ion chromatography, but the technique should be applicable to all chromatographic techniques.
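A minimal sketch of the ratio-chromatogram calculation described above: baseline-correct two sequential traces, take the point-by-point ratio over the analyte's pure-elution window, and form a variance-weighted average; the noise model and weighting details are one plausible choice, not necessarily the authors'.

```python
import numpy as np

def ratio_value(chrom1, chrom2, baseline1, baseline2, window, noise_sd=1.0):
    """Variance-weighted concentration ratio of an analyte between sequential injections.

    chrom1/chrom2: detector traces on the same time grid; window: slice covering the
    pure elution of the analyte.  Weighting by the propagated noise variance of the
    point-by-point ratio down-weights low signal-to-noise points.
    """
    s1 = np.asarray(chrom1) - np.asarray(baseline1)
    s2 = np.asarray(chrom2) - np.asarray(baseline2)
    r = s2[window] / s1[window]                                   # ratio chromatogram
    var_r = (noise_sd ** 2) * (1.0 + r ** 2) / s1[window] ** 2    # first-order propagation
    w = 1.0 / var_r
    return np.sum(w * r) / np.sum(w)

# Illustration: a 50% concentration increase between injections
t = np.linspace(0, 10, 200)
peak = np.exp(-(t - 5) ** 2)
c1, c2 = 1.0 * peak + 0.1, 1.5 * peak + 0.1
print(ratio_value(c1, c2, 0.1, 0.1, slice(80, 120)))   # ~1.5
```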
Influence of echo time in quantitative proton MR spectroscopy using LCModel.
Yamamoto, Tetsuya; Isobe, Tomonori; Akutsu, Hiroyoshi; Masumoto, Tomohiko; Ando, Hiroki; Sato, Eisuke; Takada, Kenta; Anno, Izumi; Matsumura, Akira
2015-06-01
The objective of this study was to elucidate how echo times (TE) longer than the values recommended in the spectrum acquisition specifications influence quantitative analysis using LCModel. A 3T magnetic resonance system was used to perform proton magnetic resonance spectroscopy. The participants were 5 healthy volunteers and 11 patients with glioma. Data were collected at TEs of 72, 144 and 288 ms. LCModel was used to quantify several metabolites (N-acetylaspartate, creatine and phosphocreatine, and choline-containing compounds). The results were compared with quantitative values obtained using the T2-corrected internal reference method. In healthy volunteers, when TE was long, the quantitative values obtained using LCModel were up to 6.8-fold larger (p<0.05) than those obtained using the T2-corrected internal reference method. The ratios of the quantitative values obtained by the two methods differed between metabolites (p<0.05). In patients with glioma, the ratios of quantitative values obtained by the two methods tended to be larger at longer TE, as in the healthy volunteers, and large between-individual variation in the ratios was observed. In clinical practice, TE is sometimes set longer than the value recommended for LCModel. If TE is long, LCModel overestimates the quantitative value, since it cannot compensate for signal attenuation, and this effect differs for each metabolite and condition. Therefore, if TE is longer than recommended, it is necessary to account for the possibly reduced reliability of quantitative values calculated using LCModel. Copyright © 2015 Elsevier Inc. All rights reserved.
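A minimal sketch of the T2-corrected internal reference idea that the LCModel results are compared against: back-extrapolate both the metabolite and the water-reference signals to TE = 0 assuming mono-exponential decay, S(TE) = S0·exp(-TE/T2), before taking their ratio. The T2 values below are illustrative only, not measurements from the study.

```python
import math

def t2_corrected_ratio(s_metab, s_water, te_ms, t2_metab_ms, t2_water_ms):
    """Metabolite/water ratio after back-extrapolating both signals to TE = 0,
    assuming mono-exponential decay S(TE) = S0 * exp(-TE / T2)."""
    return (s_metab * math.exp(te_ms / t2_metab_ms)) / (s_water * math.exp(te_ms / t2_water_ms))

# Illustrative numbers only: when the reference decays faster than the metabolite,
# the uncorrected ratio grows with TE, i.e. the metabolite is overestimated.
te, t2_metab, t2_water = 144.0, 280.0, 90.0
s_metab, s_water = 1.0, 10.0
print(s_metab / s_water)                                             # uncorrected: 0.100
print(t2_corrected_ratio(s_metab, s_water, te, t2_metab, t2_water))  # corrected: ~0.034
```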
Aydogdu, Ibrahim; Kiylioglu, Nefati; Tarlaci, Sultan; Tanriverdi, Zeynep; Alpaydin, Sezin; Acarer, Ahmet; Baysal, Leyla; Arpaci, Esra; Yuceyar, Nur; Secil, Yaprak; Ozdemirkiran, Tolga; Ertekin, Cumhur
2015-03-01
Neurogenic dysphagia (ND) is a prevalent condition that accounts for significant mortality and morbidity worldwide. Screening and follow-up are critical for early diagnosis and management, which can mitigate its complications and be cost-saving. The aims of this study are to provide a comprehensive investigation of the dysphagia limit (DL) in a large diverse cohort and to provide a longitudinal assessment of dysphagia in a subset of subjects. We developed a quantitative and noninvasive method for objective assessment of dysphagia using a laryngeal sensor and submental electromyography. The DL is the bolus volume at which a second or subsequent swallow becomes necessary to swallow the whole amount of the bolus. This study represents 17 years' experience with the DL approach in assessing ND in a cohort of 1278 adult subjects consisting of 292 healthy controls, 784 patients with dysphagia, and 202 patients without dysphagia. A total of 192 of all patients were also reevaluated longitudinally over a period of 1-19 months. The DL has 92% sensitivity, 91% specificity, 94% positive predictive value, and 88% negative predictive value, with an accuracy of 0.92. Patients with ALS, stroke, and movement disorders have the highest sensitivity (85-97%) and positive predictive value (90-99%). The clinical severity of dysphagia has a significant negative correlation with DL (r=-0.67, p<0.0001). We propose the DL as a reliable, quick, noninvasive, quantitative test to detect and follow both clinical and subclinical dysphagia; it can be performed in an EMG laboratory. Our study provides specific quantitative features of the DL test that can be readily utilized by the neurologic community and nominates the DL as an objective and robust method to evaluate dysphagia in a wide range of neurologic conditions. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
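The reported sensitivity, specificity, and predictive values follow directly from a 2 × 2 confusion matrix; the helper below shows the arithmetic with hypothetical counts chosen only to roughly reproduce the published figures, not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, and accuracy from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts (784 dysphagia, 494 non-dysphagia subjects), for illustration only
print(diagnostic_metrics(tp=720, fp=45, fn=64, tn=449))
```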
Skin condition measurement by using multispectral imaging system (Conference Presentation)
NASA Astrophysics Data System (ADS)
Jung, Geunho; Kim, Sungchul; Kim, Jae Gwan
2017-02-01
There are a number of commercially available low level light therapy (LLLT) devices on the market, and face whitening or wrinkle reduction is one of the targets of LLLT. Facial improvement can be assessed simply by visual observation of the face, but this provides no quantitative data and cannot detect subtle changes. Clinical diagnostic instruments such as the Mexameter can provide quantitative data, but their cost is too high for home users. Therefore, we designed a low-cost multi-spectral imaging device by adding additional LEDs (470 nm, 640 nm, white LED, 905 nm) to a commercial USB microscope which has two LEDs (395 nm, 940 nm) as light sources. Among the various LLLT skin treatments, we focused on obtaining melanin and wrinkle information. For melanin index measurements, multi-spectral images of nevi were acquired, and melanin index values from a color image (the conventional method) and from multi-spectral images were compared. The results showed that multi-spectral analysis of the melanin index can visualize nevi of different depths and concentrations. A cross-section of a skin wrinkle resembles a wedge, which can be a source of high-frequency components when the skin image is Fourier transformed into a spatial-frequency map. In that case, the entropy value of the spatial-frequency map can represent the frequency distribution, which is related to the amount and thickness of wrinkles. Entropy values from multi-spectral images can potentially separate the proportion of thin and shallow wrinkles from thick and deep wrinkles. From the results, we found that this low-cost multi-spectral imaging system could be beneficial for home users of LLLT by providing treatment efficacy in a quantitative way.
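A hedged sketch of the Fourier-entropy idea: compute the 2-D power spectrum of a skin patch, normalize it, and take its Shannon entropy as a summary of how spectral power is distributed; the synthetic patches (fine texture modeled crudely as noise) and preprocessing are illustrative assumptions, not the system's actual pipeline.

```python
import numpy as np

def spectral_entropy(patch: np.ndarray) -> float:
    """Shannon entropy (bits) of the normalized 2-D power spectrum of an image patch."""
    f = np.fft.fft2(patch - patch.mean())
    p = np.abs(f) ** 2
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Illustration only: a smooth patch versus the same patch with added fine-scale
# texture; the extra high-frequency content spreads power over more bins and
# raises the entropy.
y, x = np.mgrid[0:64, 0:64]
smooth = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 12.0 ** 2))
rng = np.random.default_rng(0)
textured = smooth + 0.2 * rng.standard_normal(smooth.shape)
print(spectral_entropy(smooth), spectral_entropy(textured))
```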
Does Lean healthcare improve patient satisfaction? A mixed-method investigation into primary care.
Poksinska, Bozena Bonnie; Fialkowska-Filipek, Malgorzata; Engström, Jon
2017-02-01
Lean healthcare is claimed to contribute to improved patient satisfaction, but there is limited evidence to support this notion. This study investigates how primary-care centres working with Lean define and improve value from the patient's perspective, and how the application of Lean healthcare influences patient satisfaction. This paper contains two qualitative case studies and a quantitative study based on results from the Swedish National Patient Survey. Through the case studies, we investigated how primary-care organisations realised the principle of defining and improving value from the patient's perspective. In the quantitative study, we compared results from the patient satisfaction survey for 23 primary-care centres working with Lean with a control group of 23 care centres not working with Lean. We also analysed changes in patient satisfaction over time. Our case studies reveal that Lean healthcare implementations primarily target efficiency and little attention is paid to the patient's perspective. The quantitative study shows no significantly better results in patient satisfaction for primary-care centres working with Lean healthcare compared with those not working with Lean. Further, care centres working with Lean show no significant improvements in patient satisfaction over time. Lean healthcare implementations seem to have a limited impact on improving patient satisfaction. Care providers need to pay more attention to integrating the patient's perspective in the application of Lean healthcare. Value needs to be defined and value streams need to be improved based on both the knowledge and clinical expertise of care providers, and the preferences and needs of patients. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
Park, Hae-Min; Park, Ju-Hyeong; Kim, Yoon-Woo; Kim, Kyoung-Jin; Jeong, Hee-Jin; Jang, Kyoung-Soon; Kim, Byung-Gee; Kim, Yun-Gon
2013-11-15
In recent years, the improvement of mass spectrometry-based glycomics techniques (i.e. highly sensitive, quantitative and high-throughput analytical tools) has enabled us to obtain a large dataset of glycans. Here we present a database named the Xeno-glycomics database (XDB) that contains cell- or tissue-specific pig glycomes analyzed with mass spectrometry-based techniques, including comprehensive pig glycan information on chemical structures, mass values, types and relative quantities. It was designed as a user-friendly web-based interface that allows users to query the database according to pig tissue/cell types or glycan masses. This database will contribute to providing qualitative and quantitative information on glycomes characterized from various pig cells/organs in xenotransplantation and might eventually provide new targets in the era of α1,3-galactosyltransferase gene-knockout pigs. The database can be accessed on the web at http://bioinformatics.snu.ac.kr/xdb.
[Doppler echocardiography of tricuspid insufficiency. Methods of quantification].
Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P
1994-01-01
Evaluation of tricuspid incompetence has benefitted considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method is able to provide an accurate evaluation, mainly through use of the Doppler mode. In addition to new criteria under evaluation (mainly the convergence zone of the regurgitant jet), some indices are recognised as good quantitative parameters: extension of the regurgitant jet into the right atrium, anterograde tricuspid flow, the laminar nature of the regurgitant flow, and analysis of the flow in the supra-hepatic veins. The evaluation remains only semi-quantitative, since calculation of the regurgitation fraction from pulsed Doppler does not appear to be reliable; this accurate semi-quantitative evaluation is made possible by careful and consistent use of all the available criteria. The authors discuss the value of the various evaluation criteria mentioned in the literature and attempt to define a practical approach.
Breach Risk Magnitude: A Quantitative Measure of Database Security.
Yasnoff, William A
2016-01-01
A quantitative methodology is described that provides objective evaluation of the potential for health record system breaches. It assumes that breach risk increases with the number of potential records that could be exposed, while it decreases when more authentication steps are required for access. The breach risk magnitude (BRM) is the maximum value for any system user of the common logarithm of the number of accessible database records divided by the number of authentication steps needed to achieve such access. For a one million record relational database, the BRM varies from 5.52 to 6 depending on authentication protocols. For an alternative data architecture designed specifically to increase security by separately storing and encrypting each patient record, the BRM ranges from 1.3 to 2.6. While the BRM only provides a limited quantitative assessment of breach risk, it may be useful to objectively evaluate the security implications of alternative database organization approaches.
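The BRM definition quoted above translates directly into a few lines. A minimal sketch, assuming the intended reading is the common logarithm of (accessible records divided by authentication steps), maximized over users; the function name and example numbers are mine, though one million records behind a single authentication step does reproduce the value of 6 cited in the abstract.

```python
import math

def breach_risk_magnitude(users):
    """BRM = max over users of log10(accessible_records / authentication_steps).

    `users` maps a user id to (number of records that user can reach, number of
    authentication steps required for that access)."""
    return max(math.log10(records / steps) for records, steps in users.values())

print(breach_risk_magnitude({"dba": (1_000_000, 1), "clerk": (5_000, 2)}))  # 6.0
print(breach_risk_magnitude({"dba": (1_000_000, 3)}))                       # ~5.52
```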
Non-animal approaches for toxicokinetics in risk evaluations of food chemicals.
Punt, Ans; Peijnenburg, Ad A C M; Hoogenboom, Ron L A P; Bouwmeester, Hans
2017-01-01
The objective of the present work was to review the availability and predictive value of non-animal toxicokinetic approaches and to evaluate their current use in European risk evaluations of food contaminants, additives and food contact materials, as well as pesticides and medicines. Results revealed little use of quantitative animal or human kinetic data in risk evaluations of food chemicals, compared with pesticides and medicines. Risk evaluations of medicines provided sufficient in vivo kinetic data from different species to evaluate the predictive value of animal kinetic data for humans. These data showed a relatively poor correlation between the in vivo bioavailability in rats and dogs versus that in humans. In contrast, in vitro (human) kinetic data have been demonstrated to provide adequate predictions of the fate of compounds in humans, using appropriate in vitro-in vivo scalers and by integration of in vitro kinetic data with in silico kinetic modelling. Even though in vitro kinetic data were found to be occasionally included within risk evaluations of food chemicals, particularly results from Caco-2 absorption experiments and in vitro data on gut-microbial conversions, only minor use of in vitro methods for metabolism and quantitative in vitro-in vivo extrapolation methods was identified. Yet, such quantitative predictions are essential in the development of alternatives to animal testing as well as to increase human relevance of toxicological risk evaluations. Future research should aim at further improving and validating quantitative alternative methods for kinetics, thereby increasing regulatory acceptance of non-animal kinetic data.
Quantitative influence of risk factors on blood glucose level.
Chen, Songjing; Luo, Senlin; Pan, Limin; Zhang, Tiemei; Han, Longfei; Zhao, Haixiu
2014-01-01
The aim of this study is to quantitatively analyze the influence of risk factors on the blood glucose level, and to provide a theoretical basis for understanding the characteristics of blood glucose change and for confirming intervention indices for type 2 diabetes. A quantitative method is proposed to analyze the influence of risk factors on blood glucose using a back-propagation (BP) neural network. Ten risk factors are screened first. Then the cohort is divided into nine groups by gender and age. According to the minimum error principle, nine BP models are trained, one for each group. The quantitative values of the influence of different risk factors on blood glucose change can be obtained by sensitivity calculation. The experimental results indicate that weight is the leading contributor to blood glucose change (0.2449), followed by cholesterol, age, and triglyceride. Together, these four factors account for 77% of the influence of the nine screened risk factors. The sensitivity rankings can inform individualized intervention. This method can be applied to quantitative risk-factor analysis of other diseases and can potentially help clinical practitioners identify high-risk populations for type 2 diabetes and other diseases.
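A hedged sketch of the sensitivity idea: train a small feed-forward network on risk factors versus blood glucose and score each input by the mean absolute change in the prediction when that input is nudged; scikit-learn's MLPRegressor stands in for the BP network, and the synthetic data and perturbation size are illustrative, not the paper's exact procedure.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                       # four standardized risk factors
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + 0.05 * rng.normal(size=500)

net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0).fit(X, y)

def sensitivity(model, X, delta=0.1):
    """Mean absolute change in the prediction when each input is shifted by `delta`."""
    base = model.predict(X)
    scores = []
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] += delta
        scores.append(np.mean(np.abs(model.predict(Xp) - base)))
    return np.array(scores)

print(sensitivity(net, X))   # larger score -> larger influence on predicted glucose
```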
Photosynthesis-related quantities for education and modeling.
Antal, Taras K; Kovalenko, Ilya B; Rubin, Andrew B; Tyystjärvi, Esa
2013-11-01
A quantitative understanding of the photosynthetic machinery depends largely on quantities, such as concentrations, sizes, absorption wavelengths, redox potentials, and rate constants. The present contribution is a collection of numbers and quantities related mainly to photosynthesis in higher plants. All numbers are taken directly from a literature or database source and the corresponding reference is provided. The numerical values, presented in this paper, provide ranges of values, obtained in specific experiments for specific organisms. However, the presented numbers can be useful for understanding the principles of structure and function of photosynthetic machinery and for guidance of future research.
Larsen, Frank Wugt; Petersen, Anders Højgård; Strange, Niels; Lund, Mette Palitzsch; Rahbek, Carsten
2008-05-01
Denmark has committed itself to the European 2010 target to halt the loss of biodiversity. Currently, Denmark is in the process of designating larger areas as national parks, and 7 areas (of a possible 32 larger nature areas) have been selected for pilot projects to test the feasibility of establishing national parks. In this article, we first evaluate the effectiveness of the a priori network of national parks proposed through expert and political consensus versus a network chosen specifically for biodiversity through quantitative analysis. Second, we analyze the potential synergy between preserving biodiversity in terms of species representation and recreational values in selecting a network of national parks. We use the actual distribution of 973 species within these 32 areas and 4 quantitative measures of recreational value. Our results show that the 7 pilot project areas are not significantly more effective in representing species than expected by chance and that considerably more efficient networks can be selected. Moreover, it is possible to select more-effective networks of areas that combine high representation of species with high ranking in terms of recreational values. Therefore, our findings suggest possible synergies between outdoor recreation and biodiversity conservation when selecting networks of national parks. Overall, this Danish case illustrates that data-driven analysis can not only provide valuable information to guide the decision-making process of designating national parks, but it can also be a means to identify solutions that simultaneously fulfill several goals (biodiversity preservation and recreational values).
Behavioral Economics and Empirical Public Policy
ERIC Educational Resources Information Center
Hursh, Steven R.; Roma, Peter G.
2013-01-01
The application of economics principles to the analysis of behavior has yielded novel insights on value and choice across contexts ranging from laboratory animal research to clinical populations to national trends of global impact. Recent innovations in demand curve methods provide a credible means of quantitatively comparing qualitatively…
Propagating Qualitative Values Through Quantitative Equations
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak
1992-01-01
In most practical problems where traditional numeric simulation is not adequate, one needs to reason about a system with both qualitative and quantitative equations. In this paper, we address the problem of propagating qualitative values, represented as interval values, through quantitative equations. Previous research has produced exponential-time algorithms for approximate solution of the problem, which may not meet the stringent requirements of many real-time applications. This paper advances the state of the art by producing a linear-time algorithm that can propagate a qualitative value through a class of complex quantitative equations exactly, and through arbitrary algebraic expressions approximately. The algorithm was found applicable to a Space Shuttle Reaction Control System model.
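A minimal interval-arithmetic sketch of propagating an interval-valued qualitative quantity through simple quantitative expressions; it illustrates the general idea only and is not the linear-time algorithm of the paper.

```python
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

def apply_monotone(f, x: Interval) -> Interval:
    """Exact propagation through a monotonically increasing function."""
    return Interval(f(x.lo), f(x.hi))

# Propagate "pressure is low" (1-2 bar) and an uncertain volume through p*V
p = Interval(1.0, 2.0)
v = Interval(0.9, 1.1)
print(p * v)                               # ≈ Interval(lo=0.9, hi=2.2)
print(apply_monotone(math.sqrt, p * v))    # ≈ Interval(lo=0.95, hi=1.48)
```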
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhandari, Deepak; Kertesz, Vilmos; Van Berkel, Gary J
RATIONALE: Ascorbic acid (AA) and folic acid (FA) are water-soluble vitamins and are usually fortified in food and dietary supplements. For the safety of human health, proper intake of these vitamins is recommended. Improvement in the analysis time required for the quantitative determination of these vitamins in food and nutritional formulations is desired. METHODS: A simple and fast (~5 min) in-tube sample preparation was performed, independently for FA and AA, by mixing extraction solvent with a powdered sample aliquot followed by agitation, centrifugation, and filtration to recover an extract for analysis. Quantitative detection was achieved by flow-injection (1 µL injection volume) electrospray ionization tandem mass spectrometry (ESI-MS/MS) in negative ion mode using the method of standard addition. RESULTS: The method of standard addition was employed for the quantitative estimation of each vitamin in a sample extract. At least 2 spiked and 1 non-spiked sample extracts were injected in triplicate for each quantitative analysis. Given an injection-to-injection interval of approximately 2 min, about 18 min was required to complete the quantitative estimation of each vitamin. The concentration values obtained for the respective vitamins in the standard reference material (SRM) 3280 using this approach were within the statistical range of the certified values provided in the NIST Certificate of Analysis. The estimated limits of detection of FA and AA were 13 and 5.9 ng/g, respectively. CONCLUSIONS: Flow-injection ESI-MS/MS was successfully applied for the rapid quantitation of FA and AA in SRM 3280 multivitamin/multielement tablets.
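A minimal sketch of the standard-addition calculation: regress the instrument response on the spiked concentration and take the magnitude of the x-axis intercept (intercept divided by slope) as the analyte concentration in the unspiked extract; the spike levels and peak areas are hypothetical.

```python
import numpy as np

def standard_addition(added_conc, response):
    """Estimate the analyte concentration in the unspiked extract.

    Fit response = a + b * added_concentration; the estimate is a / b, i.e. the
    magnitude of the x-axis intercept of the regression line."""
    b, a = np.polyfit(added_conc, response, 1)
    return a / b

# Hypothetical peak areas for 0, 1 and 2 ug/g folic acid spikes
added = np.array([0.0, 1.0, 2.0])
signal = np.array([1500.0, 2510.0, 3490.0])
print(standard_addition(added, signal))   # ~1.5 ug/g in the unspiked extract
```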
Aggett, Peter J; Hathcock, John; Jukes, David; Richardson, David P; Calder, Philip C; Bischoff-Ferrari, Heike; Nicklas, Theresa; Mühlebach, Stefan; Kwon, Oran; Lewis, Janine; Lugard, Maurits J F; Prock, Peter
2012-03-01
Codex documents may be used as educational and consensus materials for member governments. Also, the WTO SPS Agreement recognizes Codex as the presumptive international authority on food issues. Nutrient bioavailability is a critical factor in determining the ability of nutrients to provide beneficial effects. Bioavailability also influences the quantitative dietary requirements that are the basis of nutrient intake recommendations and NRVs. Codex, EFSA and some national regulatory authorities have established guidelines or regulations that will permit several types of health claims. The scientific basis for claims has been established by the US FDA and EFSA, but not yet by Codex. Evidence-based nutrition differs from evidence-based medicine, but the differences are only recently gaining recognition. Health claims on foods may provide useful information to consumers, but many will interpret the information to mean that they can rely upon the food or nutrient to eliminate a disease risk. NRVs are designed to provide a quantitative basis for comparing the nutritive values of foods, helping to illustrate how specific foods fit into the overall diet. The INL-98 and the mean of adult male and female values provide NRVs that are sufficient when used as targets for individual intakes by most adults. WTO recognizes Codex as the primary international authority on food issues. Current regulatory schemes based on recommended dietary allowances are trade restrictive. A substantial number of decisions by the EFSA could lead to violation of WTO agreements.
NASA Astrophysics Data System (ADS)
Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim
2013-01-01
Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
NASA Astrophysics Data System (ADS)
Ehn, Andreas; Jonsson, Malin; Johansson, Olof; Aldén, Marcus; Bood, Joakim
2012-12-01
Fluorescence lifetimes of toluene as a function of oxygen concentration in toluene/nitrogen/oxygen mixtures have been measured at room temperature using picosecond-laser excitation of the S1-S0 transition at 266 nm. The data satisfy the Stern-Volmer relation with high accuracy, providing an updated value of the Stern-Volmer slope. A newly developed fluorescence lifetime imaging scheme, called Dual Imaging with Modeling Evaluation (DIME), is evaluated and successfully demonstrated for quantitative oxygen concentration imaging in toluene-seeded O2/N2 gas mixtures.
Visual and Quantitative Analysis Methods of Respiratory Patterns for Respiratory Gated PET/CT.
Son, Hye Joo; Jeong, Young Jin; Yoon, Hyun Jin; Park, Jong-Hwan; Kang, Do-Young
2016-01-01
We integrated visual and quantitative methods for analyzing the stability of respiration using four methods: phase space diagrams, Fourier spectra, Poincaré maps, and Lyapunov exponents. Respiratory patterns of 139 patients were grouped based on the combination of the regularity of amplitude, period, and baseline positions. Visual grading was done by inspecting the shape of the diagram and classifying it into two states: regular and irregular. Quantitation was done by measuring the standard deviations of the x and v coordinates of the Poincaré map (SDx, SDv), the height of the fundamental peak (A1) in the Fourier spectrum, or the difference between maximal upward and downward drift (MUD-MDD). Each group showed a characteristic pattern on visual analysis. There were differences in the quantitative parameters (SDx, SDv, A1, and MUD-MDD) among the four groups (one-way ANOVA, p = 0.0001 for MUD-MDD, SDx, and SDv; p = 0.0002 for A1). In ROC analysis, the cutoff values were 0.11 for SDx (AUC: 0.982, p < 0.0001), 0.062 for SDv (AUC: 0.847, p < 0.0001), 0.117 for A1 (AUC: 0.876, p < 0.0001), and 0.349 for MUD-MDD (AUC: 0.948, p < 0.0001). This is the first study to analyze multiple aspects of respiration using various mathematical constructs; it provides quantitative indices of respiratory stability and quantitative cutoff values for differentiating regular from irregular respiration.
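One of the quantitative indices above, the spread of points on the Poincaré map, can be computed directly from a respiratory displacement trace. A minimal sketch with a synthetic signal; the construction of x and v follows the usual phase-space convention and is an assumption, not the authors' implementation:

    import numpy as np

    # Synthetic 60 s respiratory displacement trace sampled at 25 Hz.
    t = np.linspace(0, 60, 1500)
    x = np.sin(2 * np.pi * 0.25 * t) + 0.05 * np.random.randn(t.size)
    v = np.gradient(x, t)                      # velocity coordinate of the Poincaré map

    sd_x, sd_v = np.std(x), np.std(v)          # spread of the (x, v) point cloud
    print(f"SDx = {sd_x:.3f}, SDv = {sd_v:.3f}")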
Johnson, Rachel C.; Windell, Sean; Brandes, Patricia L.; Conrad, J. Louise; Ferguson, John; Goertler, Pascale A. L.; Harvey, Brett N.; Heublein, Joseph; Isreal, Joshua A.; Kratville, Daniel W.; Kirsch, Joseph E.; Perry, Russell W.; Pisciotto, Joseph; Poytress, William R.; Reece, Kevin; Swart, Brycen G.
2017-01-01
A robust monitoring network that provides quantitative information about the status of imperiled species at key life stages and geographic locations over time is fundamental for sustainable management of fisheries resources. For anadromous species, management actions in one geographic domain can substantially affect abundance of subsequent life stages that span broad geographic regions. Quantitative metrics (e.g., abundance, movement, survival, life history diversity, and condition) at multiple life stages are needed to inform how management actions (e.g., hatcheries, harvest, hydrology, and habitat restoration) influence salmon population dynamics. The existing monitoring network for endangered Sacramento River winter-run Chinook Salmon (SRWRC, Oncorhynchus tshawytscha) in California’s Central Valley was compared to conceptual models developed for each life stage and geographic region of the life cycle to identify relevant SRWRC metrics. We concluded that the current monitoring network was insufficient to diagnose when (life stage) and where (geographic domain) chronic or episodic reductions in SRWRC cohorts occur, precluding within- and among-year comparisons. The strongest quantitative data exist in the Upper Sacramento River, where abundance estimates are generated for adult spawners and emigrating juveniles. However, once SRWRC leave the upper river, our knowledge of their identity, abundance, and condition diminishes, despite the juvenile monitoring enterprise. We identified six system-wide recommended actions to strengthen the value of data generated from the existing monitoring network to assess resource management actions: (1) incorporate genetic run identification; (2) develop juvenile abundance estimates; (3) collect data for life history diversity metrics at multiple life stages; (4) expand and enhance real-time fish survival and movement monitoring; (5) collect fish condition data; and (6) provide timely public access to monitoring data in open data formats. To illustrate how updated technologies can enhance the existing monitoring to provide quantitative data on SRWRC, we provide examples of how each recommendation can address specific management issues.
Hefelfinger, Jenny; Patty, Alice; Ussery, Ann
2013-01-01
Introduction This study assessed the value of technical assistance provided by state health department expert advisors and by the staff of the National Association of Chronic Disease Directors (NACDD) to community groups that participated in the Action Communities for Health, Innovation, and Environmental Change (ACHIEVE) Program, a CDC-funded health promotion program. Methods We analyzed quantitative and qualitative data reported by community project coordinators to assess the nature and value of technical assistance provided by expert advisors and NACDD staff and the usefulness of ACHIEVE resources in the development and implementation of community action plans. A grounded theory approach was used to analyze and categorize phrases in text data provided by community coordinators. Open coding placed conceptual labels on text phrases. Frequency distributions of the quantitative data are described and discussed. Results The most valuable technical assistance and program support resources were those determined to be in the interpersonal domain (ie, interactions with state expert advisors, NACDD staff, and peer-to-peer support). The most valuable technical assistance events were action institutes, coaches’ meetings, webinars, and technical assistance conference calls. Conclusion This analysis suggests that ACHIEVE communities valued the management and training assistance provided by expert advisors and NACDD staff. State health department expert advisors provided technical guidance and support, including such skills or knowledge-based services as best-practice strategies, review and discussion of community assessment data, sustainability planning, and identification of possible funding opportunities. NACDD staff led development and implementation of technical assistance events. PMID:24157078
Diversifying natural resources value measurements: The Trinity River study
Taylor, J.G.; Douglas, A.J.
1999-01-01
An interdisciplinary team set out to establish the economic and social values of the Trinity River in northern California. This information was intended to support the Secretary of the Interior's decision on allocation of Trinity River flows. The team measured the values of Trinity River flows, fishery resources, and recreation amenities in several different ways. A survey was mailed to users of the Trinity River. This single instrument included economic measures (willingness-to-pay and costs incurred in visiting) and social-psychological measures (importance, satisfaction, and water allocation preferences). A closely related survey measured several of these same values among west coast regional households. The results of these surveys were compiled, and the measured economic and social values were compared. We found that integrating economic and social value information, together with the quantitative and qualitative results that emerge from it, provides a greater depth of understanding of the resource's value.
Nijran, Kuldip S; Houston, Alex S; Fleming, John S; Jarritt, Peter H; Heikkinen, Jari O; Skrypniuk, John V
2014-07-01
In this second UK audit of quantitative parameters obtained from renography, phantom simulations were used in cases in which the 'true' values could be estimated, allowing the accuracy of the parameters measured to be assessed. A renal physical phantom was used to generate a set of three phantom simulations (six kidney functions) acquired on three different gamma camera systems. A total of nine phantom simulations and three real patient studies were distributed to UK hospitals participating in the audit. Centres were asked to provide results for the following parameters: relative function and time-to-peak (whole kidney and cortical region). As with previous audits, a questionnaire collated information on methodology. Errors were assessed as the root mean square deviation from the true value. Sixty-one centres responded to the audit, with some hospitals providing multiple sets of results. Twenty-one centres provided a complete set of parameter measurements. Relative function and time-to-peak showed a reasonable degree of accuracy and precision in most UK centres. The overall average root mean square deviations of the results from the true value were 7.7% for the whole-kidney time-to-peak measurement and 4.5% for the relative function measurement. These results showed a measure of consistency in the relative function and time-to-peak that was similar to the results reported in a previous renogram audit by our group. Analysis of audit data suggests a reasonable degree of accuracy in the quantification of renography function using relative function and time-to-peak measurements. However, it is reasonable to conclude that the objectives of the audit could not be fully realized because of the limitations of the mechanical phantom in providing true values for renal parameters.
Economic values, ethics, and ecosystem health
Thomas P. Holmes; Randall A. Kramer
1995-01-01
Economic valuations of changes in ecosystem health can provide quantitative information for social decisions. However, willingness to pay for ecosystem health may be motivated by an environmental ethic regarding the right thing to do. Counterpreferential choices based on an environmental ethic are inconsistent with the normative basis of welfare economics. In this...
Hydrocarbon saturation determination using acoustic velocities obtained through casing
Moos, Daniel
2010-03-09
Compressional and shear velocities of earth formations are measured through casing. The determined compressional and shear velocities are used in a two-component mixing model to provide improved quantitative values for the solid, the dry frame, and the pore compressibility. These are used in the determination of hydrocarbon saturation.
Damman, Peter; Holmvang, Lene; Tijssen, Jan G P; Lagerqvist, Bo; Clayton, Tim C; Pocock, Stuart J; Windhausen, Fons; Hirsch, Alexander; Fox, Keith A A; Wallentin, Lars; de Winter, Robbert J
2012-01-01
The aim of this study was to evaluate the independent prognostic value of qualitative and quantitative admission electrocardiographic (ECG) analysis regarding long-term outcomes after non-ST-segment elevation acute coronary syndromes (NSTE-ACS). From the Fragmin and Fast Revascularization During Instability in Coronary Artery Disease (FRISC II), Invasive Versus Conservative Treatment in Unstable Coronary Syndromes (ICTUS), and Randomized Intervention Trial of Unstable Angina 3 (RITA-3) patient-pooled database, 5,420 patients with NSTE-ACS with qualitative ECG data, of whom 2,901 had quantitative data, were included in this analysis. The main outcome was 5-year cardiovascular death or myocardial infarction. Hazard ratios (HRs) were calculated with Cox regression models, and adjustments were made for established outcome predictors. The additional discriminative value was assessed with the category-less net reclassification improvement and integrated discrimination improvement indexes. In the 5,420 patients, the presence of ST-segment depression (≥1 mm; adjusted HR 1.43, 95% confidence interval [CI] 1.25 to 1.63) and left bundle branch block (adjusted HR 1.64, 95% CI 1.18 to 2.28) were independently associated with long-term cardiovascular death or myocardial infarction. Risk increases were present in both the short and the long term. On quantitative ECG analysis, cumulative ST-segment depression (≥5 mm; adjusted HR 1.34, 95% CI 1.05 to 1.70), the presence of left bundle branch block (adjusted HR 2.15, 95% CI 1.36 to 3.40), or ≥6 leads with inverse T waves (adjusted HR 1.22, 95% CI 0.97 to 1.55) was independently associated with long-term outcomes. No interaction was observed with treatment strategy. No improvements in net reclassification improvement and integrated discrimination improvement were observed after the addition of quantitative characteristics to a model including qualitative characteristics. In conclusion, in the FRISC II, ICTUS, and RITA-3 NSTE-ACS patient-pooled data set, admission ECG characteristics provided long-term prognostic value for cardiovascular death or myocardial infarction. Quantitative ECG characteristics provided no incremental discrimination compared to qualitative data. Copyright © 2012 Elsevier Inc. All rights reserved.
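Adjusted hazard ratios of the kind reported above come from a Cox proportional-hazards model. A minimal illustration of such a fit with the lifelines package, using its bundled example dataset purely to show the API; the columns have nothing to do with the NSTE-ACS trials:

    from lifelines import CoxPHFitter
    from lifelines.datasets import load_rossi

    # 'rossi' has a duration column ('week'), an event column ('arrest'), and covariates.
    df = load_rossi()
    cph = CoxPHFitter()
    cph.fit(df, duration_col="week", event_col="arrest")
    # exp(coef) are the hazard ratios adjusted for the other covariates in the model.
    print(cph.summary["exp(coef)"])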
NASA Astrophysics Data System (ADS)
Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.
2011-11-01
High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
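The correction step described above, maximizing the normalized cross-correlation between a simulated basis signal and the observed signal, can be sketched in a few lines. The spectra below are synthetic single resonances and the sample spacing is arbitrary; this only illustrates the idea, not the QM-QUEST code:

    import numpy as np

    ppm = np.linspace(0.0, 4.0, 2048)
    basis = np.exp(-(ppm - 2.00) ** 2 / (2 * 0.01 ** 2))      # simulated metabolite resonance
    observed = np.exp(-(ppm - 2.04) ** 2 / (2 * 0.01 ** 2))   # same resonance, pH-shifted

    # Cross-correlate the mean-subtracted signals and convert the best lag to ppm.
    xc = np.correlate(observed - observed.mean(), basis - basis.mean(), mode="full")
    lag = np.argmax(xc) - (len(basis) - 1)
    shift = lag * (ppm[1] - ppm[0])
    print(f"Chemical-shift correction: {shift:+.3f} ppm")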
van Rossum, Peter S N; Fried, David V; Zhang, Lifei; Hofstetter, Wayne L; van Vulpen, Marco; Meijer, Gert J; Court, Laurence E; Lin, Steven H
2016-05-01
A reliable prediction of a pathologic complete response (pathCR) to chemoradiotherapy before surgery for esophageal cancer would enable investigators to study the feasibility and outcome of an organ-preserving strategy after chemoradiotherapy. So far no clinical parameters or diagnostic studies are able to accurately predict which patients will achieve a pathCR. The aim of this study was to determine whether subjective and quantitative assessment of baseline and postchemoradiation (18)F-FDG PET can improve the accuracy of predicting pathCR to preoperative chemoradiotherapy in esophageal cancer beyond clinical predictors. This retrospective study was approved by the institutional review board, and the need for written informed consent was waived. Clinical parameters along with subjective and quantitative parameters from baseline and postchemoradiation (18)F-FDG PET were derived from 217 esophageal adenocarcinoma patients who underwent chemoradiotherapy followed by surgery. The associations between these parameters and pathCR were studied in univariable and multivariable logistic regression analysis. Four prediction models were constructed and internally validated using bootstrapping to study the incremental predictive values of subjective assessment of (18)F-FDG PET, conventional quantitative metabolic features, and comprehensive (18)F-FDG PET texture/geometry features, respectively. The clinical benefit of (18)F-FDG PET was determined using decision-curve analysis. A pathCR was found in 59 (27%) patients. A clinical prediction model (corrected c-index, 0.67) was improved by adding (18)F-FDG PET-based subjective assessment of response (corrected c-index, 0.72). This latter model was slightly improved by the addition of 1 conventional quantitative metabolic feature only (i.e., postchemoradiation total lesion glycolysis; corrected c-index, 0.73), and even more by subsequently adding 4 comprehensive (18)F-FDG PET texture/geometry features (corrected c-index, 0.77). However, at a decision threshold of 0.9 or higher, representing a clinically relevant predictive value for pathCR at which one may be willing to omit surgery, there was no clear incremental value. Subjective and quantitative assessment of (18)F-FDG PET provides statistical incremental value for predicting pathCR after preoperative chemoradiotherapy in esophageal cancer. However, the discriminatory improvement beyond clinical predictors does not translate into a clinically relevant benefit that could change decision making. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
Mapping Bone Mineral Density Obtained by Quantitative Computed Tomography to Bone Volume Fraction
NASA Technical Reports Server (NTRS)
Pennline, James A.; Mulugeta, Lealem
2017-01-01
Methods for relating or mapping estimates of volumetric Bone Mineral Density (vBMD) obtained by Quantitative Computed Tomography to Bone Volume Fraction (BVF) are outlined mathematically. The methods are based on definitions of bone properties, cited experimental studies, and regression relations derived from them for trabecular bone in the proximal femur. Using an experimental range of values in the intertrochanteric region obtained from male and female human subjects, age 18 to 49, the BVF values calculated from four different methods were compared to the experimental average and numerical range. The BVF values computed from the conversion method used data from two sources. One source provided pre-bed-rest vBMD values in the intertrochanteric region from 24 bed rest subjects who participated in a 70-day study. Another source contained preflight vBMD values from 18 astronauts who spent 4 to 6 months on the ISS. To aid the use of a mapping from BMD to BVF, the discussion includes how to formulate the conversions for purposes of computational modeling. An application of the conversions would be to aid in modeling time-varying changes in vBMD as they relate to changes in BVF via bone remodeling and/or modeling.
Jin, Yan; Huang, Jing-feng; Peng, Dai-liang
2009-01-01
Ecological compensation is becoming one of the key multidisciplinary issues in the field of resources and environmental management. Considering the changing relationship between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value in order to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among all the counties or districts. This model fills the gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile the conflicts among benefits in the economic, social, and ecological sectors. PMID:19353749
Jin, Yan; Huang, Jing-feng; Peng, Dai-liang
2009-04-01
Ecological compensation is becoming one of the key multidisciplinary issues in the field of resources and environmental management. Considering the changing relationship between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value in order to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among all the counties or districts. This model fills the gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile the conflicts among benefits in the economic, social, and ecological sectors.
NASA Astrophysics Data System (ADS)
Smart, Julie Brockman
2009-11-01
This study examined interactions between middle school science students' perceptions of teacher-student interactions and their motivation for learning science. Specifically, in order to better understand factors affecting middle school students' motivation for science, this study investigated the interactions between middle school students' perceptions of teacher interpersonal behavior in their science classroom and their efficacy, task value, mastery orientations, and goal orientation for learning science. This mixed methods study followed a sequential explanatory model (Cresswell & Plano-Clark, 2007). Quantitative and qualitative data were collected in two phases, with quantitative data in the first phase informing the selection of participants for the qualitative phase that followed. The qualitative phase also helped to clarify and explain results from the quantitative phase. Data mixing occurred between Phase One and Phase Two (participant selection) and at the interpretation level (explanatory) after quantitative and qualitative data were analyzed separately. Results from Phase One indicated that students' perceptions of teacher interpersonal behaviors were predictive of their efficacy for learning science, task value for learning science, mastery orientation, and performance orientation. These results were used to create motivation/perception composites, which were used in order to select students for the qualitative interviews. A total of 24 students with high motivation/high perceptions, low motivation/low perceptions, high motivation/low perceptions, and low motivation/high perceptions were selected in order to represent students whose profiles either supported or refuted the quantitative results. Results from Phase Two revealed themes relating to students' construction of their perceptions of teacher interpersonal behavior and dimensions of their efficacy and task value for science. Students who reported high motivation and high perceptions of teacher-student interactions during the quantitative phase described the most instances of teacher cooperative behaviors, such as teacher helpfulness and understanding. Conversely, students reporting low motivation and low perceptions of teacher-student interactions described the most instances of teacher oppositional behavior, such as harsh and impatient behaviors. An in-depth description of categories and subcategories is also provided. This study concludes with an interpretive analysis of quantitative and qualitative results considered both separately and together. Implications for middle grades science education are discussed, including recommendations for behavior management, scaffolding students' transition to middle school, making explicit connections to science careers, and providing opportunities for small successes within the science classroom. Implications for science teacher education, limitations of the study, and future research directions are also discussed.
Sherrouse, Benson C.; Semmens, Darius J.; Clement, Jessica M.
2014-01-01
Despite widespread recognition that social-value information is needed to inform stakeholders and decision makers regarding trade-offs in environmental management, it too often remains absent from ecosystem service assessments. Although quantitative indicators of social values need to be explicitly accounted for in the decision-making process, they need not be monetary. Ongoing efforts to map such values demonstrate how they can also be made spatially explicit and relatable to underlying ecological information. We originally developed Social Values for Ecosystem Services (SolVES) as a tool to assess, map, and quantify nonmarket values perceived by various groups of ecosystem stakeholders. With SolVES 2.0 we have extended the functionality by integrating SolVES with Maxent maximum entropy modeling software to generate more complete social-value maps from available value and preference survey data and to produce more robust models describing the relationship between social values and ecosystems. The current study has two objectives: (1) evaluate how effectively the value index, a quantitative, nonmonetary social-value indicator calculated by SolVES, reproduces results from more common statistical methods of social-survey data analysis and (2) examine how the spatial results produced by SolVES provide additional information that could be used by managers and stakeholders to better understand more complex relationships among stakeholder values, attitudes, and preferences. To achieve these objectives, we applied SolVES to value and preference survey data collected for three national forests, the Pike and San Isabel in Colorado and the Bridger–Teton and the Shoshone in Wyoming. Value index results were generally consistent with results found through more common statistical analyses of the survey data such as frequency, discriminant function, and correlation analyses. In addition, spatial analysis of the social-value maps produced by SolVES provided information that was useful for explaining relationships between stakeholder values and forest uses. Our results suggest that SolVES can effectively reproduce information derived from traditional statistical analyses while adding spatially explicit, social-value information that can contribute to integrated resource assessment, planning, and management of forests and other ecosystems.
Size Dependent Mechanical Properties of Monolayer Densely Arranged Polystyrene Nanospheres.
Huang, Peng; Zhang, Lijing; Yan, Qingfeng; Guo, Dan; Xie, Guoxin
2016-12-13
In contrast to macroscopic materials, the mechanical properties of polymer nanospheres are of considerable scientific and practical interest. However, experimental measurements on individual nanospheres and quantitative analysis of the underlying mechanisms remain poorly developed and understood. We provide a highly efficient and accurate method, based on atomic force microscopy (AFM) nanoindentation of monolayers of densely arranged honeycomb polystyrene (PS) nanospheres, for the quantitative mechanical characterization of individual nanospheres. The efficiency is improved by 1-2 orders of magnitude, and the accuracy is enhanced by almost half an order of magnitude. The elastic modulus measured in the experiments increases with decreasing radius down to the smallest nanospheres (25-35 nm in radius). A core-shell model is introduced to predict the size-dependent elasticity of PS nanospheres; the theoretical prediction agrees reasonably well with the experimental results and also shows a peak modulus value.
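AFM nanoindentation data of this kind are commonly reduced to an elastic modulus with a Hertz contact fit. The sketch below uses the generic spherical-indenter Hertz expression on synthetic data; it is not the authors' core-shell analysis, and the tip radius and Poisson ratio are assumptions:

    import numpy as np
    from scipy.optimize import curve_fit

    R = 10e-9        # assumed tip radius (m)
    nu = 0.33        # approximate Poisson ratio of polystyrene

    def hertz(delta, E):
        # Spherical-indenter Hertz model: F = (4/3) * E/(1 - nu^2) * sqrt(R) * delta^(3/2)
        return (4.0 / 3.0) * (E / (1.0 - nu ** 2)) * np.sqrt(R) * delta ** 1.5

    delta = np.linspace(0.0, 5e-9, 50)                                  # indentation depth (m)
    force = hertz(delta, 3.5e9) + 1e-10 * np.random.randn(delta.size)   # synthetic force curve

    (E_fit,), _ = curve_fit(hertz, delta, force, p0=[1e9])
    print(f"Fitted modulus: {E_fit / 1e9:.2f} GPa")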
Mechanistic and quantitative insight into cell surface targeted molecular imaging agent design.
Zhang, Liang; Bhatnagar, Sumit; Deschenes, Emily; Thurber, Greg M
2016-05-05
Molecular imaging agent design involves simultaneously optimizing multiple probe properties. While several desired characteristics are straightforward, including high affinity and low non-specific background signal, in practice there are quantitative trade-offs between these properties. These include plasma clearance, where fast clearance lowers background signal but can reduce target uptake, and binding, where high affinity compounds sometimes suffer from lower stability or increased non-specific interactions. Further complicating probe development, many of the optimal parameters vary depending on both target tissue and imaging agent properties, making empirical approaches or previous experience difficult to translate. Here, we focus on low molecular weight compounds targeting extracellular receptors, which have some of the highest contrast values for imaging agents. We use a mechanistic approach to provide a quantitative framework for weighing trade-offs between molecules. Our results show that specific target uptake is well-described by quantitative simulations for a variety of targeting agents, whereas non-specific background signal is more difficult to predict. Two in vitro experimental methods for estimating background signal in vivo are compared - non-specific cellular uptake and plasma protein binding. Together, these data provide a quantitative method to guide probe design and focus animal work for more cost-effective and time-efficient development of molecular imaging agents.
Quantitative assessment of chronic postsurgical pain using the McGill Pain Questionnaire.
Bruce, Julie; Poobalan, Amudha S; Smith, W Cairns S; Chambers, W Alastair
2004-01-01
The McGill Pain Questionnaire (MPQ) provides a quantitative profile of 3 major psychologic dimensions of pain: sensory-discriminative, motivational-affective, and cognitive-evaluative. Although the MPQ is frequently used as a pain measurement tool, no studies to date have compared the characteristics of chronic post-surgical pain after different surgical procedures using a quantitative scoring method. Three separate questionnaire surveys were administered to patients who had undergone surgery at different time points between 1990 and 2000. Surgical procedures selected were mastectomy (n = 511 patients), inguinal hernia repair (n = 351 patients), and cardiac surgery via a central chest wound with or without saphenous vein harvesting (n = 1348 patients). A standard questionnaire format with the MPQ was used for each survey. The IASP definition of chronic pain, continuously or intermittently for longer than 3 months, was used with other criteria for pain location. The type of chronic pain was compared between the surgical populations using 3 different analytical methods: the Pain Rating Intensity score using scale values, (PRI-S); the Pain Rating Intensity using weighted rank values multiplied by scale value (PRI-R); and number of words chosen (NWC). The prevalence of chronic pain after mastectomy, inguinal herniorrhaphy, and median sternotomy with or without saphenectomy was 43%, 30%, and 39% respectively. Chronic pain most frequently reported was sensory-discriminative in quality with similar proportions across different surgical sites. Average PRI-S values after mastectomy, hernia repair, sternotomy (without postoperative anginal symptoms), and saphenectomy were 14.06, 13.00, 12.03, and 8.06 respectively. Analysis was conducted on cardiac patients who reported anginal symptoms with chronic post-surgical pain (PRI-S value 14.28). Patients with moderate and severe pain were more likely to choose more than 10 pain descriptors, regardless of the operative site (P < 0.05). The prevalence and characteristics of chronic pain was remarkably similar across different operative groups. This study is the first to quantitatively compare chronic post-surgical pain using similar methodologies in heterogeneous post-surgical populations.
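The three MPQ summary measures used above are straightforward to compute once each chosen descriptor's within-subclass scale value and rank value are known. A toy sketch with made-up descriptor values:

    # Each chosen descriptor maps to (scale value, rank value); numbers here are illustrative only.
    chosen = {"throbbing": (3, 4), "sharp": (4, 1), "aching": (2, 5)}

    pri_s = sum(scale for scale, _ in chosen.values())            # PRI using scale values
    pri_r = sum(scale * rank for scale, rank in chosen.values())  # PRI using rank-weighted values
    nwc = len(chosen)                                             # number of words chosen
    print(pri_s, pri_r, nwc)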
Quantitative, spectrally-resolved intraoperative fluorescence imaging
Valdés, Pablo A.; Leblond, Frederic; Jacobs, Valerie L.; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.
2012-01-01
Intraoperative visual fluorescence imaging (vFI) has emerged as a promising aid to surgical guidance, but does not fully exploit the potential of the fluorescent agents that are currently available. Here, we introduce a quantitative fluorescence imaging (qFI) approach that converts spectrally-resolved data into images of absolute fluorophore concentration pixel-by-pixel across the surgical field of view (FOV). The resulting estimates are linear, accurate, and precise relative to true values, and spectral decomposition of multiple fluorophores is also achieved. Experiments with protoporphyrin IX in a glioma rodent model demonstrate in vivo quantitative and spectrally-resolved fluorescence imaging of infiltrating tumor margins for the first time. Moreover, we present images from human surgery which detect residual tumor not evident with state-of-the-art vFI. The wide-field qFI technique has broad implications for intraoperative surgical guidance because it provides near real-time quantitative assessment of multiple fluorescent biomarkers across the operative field. PMID:23152935
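The pixel-by-pixel spectral decomposition referred to above amounts to solving a small linear least-squares problem at each pixel: the measured emission spectrum is expressed as a weighted sum of known fluorophore basis spectra. A minimal single-pixel sketch with synthetic Gaussian basis spectra (not the actual qFI processing, which also corrects for tissue optical properties):

    import numpy as np

    wl = np.linspace(600.0, 720.0, 61)                     # emission wavelengths (nm)
    basis = np.vstack([
        np.exp(-(wl - 635.0) ** 2 / (2 * 10.0 ** 2)),      # fluorophore 1 basis spectrum
        np.exp(-(wl - 700.0) ** 2 / (2 * 15.0 ** 2)),      # fluorophore 2 basis spectrum
    ]).T                                                    # shape: (n_wavelengths, n_fluorophores)

    true_conc = np.array([2.0, 0.5])
    measured = basis @ true_conc + 0.01 * np.random.randn(wl.size)

    # Least-squares unmixing recovers the relative fluorophore concentrations at this pixel.
    conc, *_ = np.linalg.lstsq(basis, measured, rcond=None)
    print(conc)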
Calabrese, Edward J
2013-11-01
The most common quantitative feature of the hormetic-biphasic dose response is its modest stimulatory response which at maximum is only 30-60% greater than control values, an observation that is consistently independent of biological model, level of organization (i.e., cell, organ or individual), endpoint measured, chemical/physical agent studied, or mechanism. This quantitative feature suggests an underlying "upstream" mechanism common across biological systems, therefore basic and general. Hormetic dose response relationships represent an estimate of the peak performance of integrative biological processes that are allometrically based. Hormetic responses reflect both direct stimulatory or overcompensation responses to damage induced by relatively low doses of chemical or physical agents. The integration of the hormetic dose response within an allometric framework provides, for the first time, an explanation for both the generality and the quantitative features of the hormetic dose response. Copyright © 2013 Elsevier Ltd. All rights reserved.
Usefulness of quantitative susceptibility mapping for the diagnosis of Parkinson disease.
Murakami, Y; Kakeda, S; Watanabe, K; Ueda, I; Ogasawara, A; Moriya, J; Ide, S; Futatsuya, K; Sato, T; Okada, K; Uozumi, T; Tsuji, S; Liu, T; Wang, Y; Korogi, Y
2015-06-01
Quantitative susceptibility mapping allows overcoming several nonlocal restrictions of susceptibility-weighted and phase imaging and enables quantification of magnetic susceptibility. We compared the diagnostic accuracy of quantitative susceptibility mapping and R2* (1/T2*) mapping to discriminate between patients with Parkinson disease and controls. For 21 patients with Parkinson disease and 21 age- and sex-matched controls, 2 radiologists measured the quantitative susceptibility mapping values and R2* values in 6 brain structures (the thalamus, putamen, caudate nucleus, pallidum, substantia nigra, and red nucleus). The quantitative susceptibility mapping values and R2* values of the substantia nigra were significantly higher in patients with Parkinson disease (P < .01); measurements in other brain regions did not differ significantly between patients and controls. For the discrimination of patients with Parkinson disease from controls, receiver operating characteristic analysis suggested that the optimal cutoff values for the substantia nigra, based on the Youden Index, were >0.210 for quantitative susceptibility mapping and >28.8 for R2*. The sensitivity, specificity, and accuracy of quantitative susceptibility mapping were 90% (19 of 21), 86% (18 of 21), and 88% (37 of 42), respectively; for R2* mapping, they were 81% (17 of 21), 52% (11 of 21), and 67% (28 of 42). Pair-wise comparisons showed that the areas under the receiver operating characteristic curves were significantly larger for quantitative susceptibility mapping than for R2* mapping (0.91 versus 0.69, P < .05). Quantitative susceptibility mapping showed higher diagnostic performance than R2* mapping for the discrimination between patients with Parkinson disease and controls. © 2015 by American Journal of Neuroradiology.
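The cutoff selection described above, based on the Youden index, can be reproduced with a standard ROC routine. The susceptibility values below are invented; the study's actual cutoff for the substantia nigra was 0.210:

    import numpy as np
    from sklearn.metrics import roc_curve

    chi = np.array([0.15, 0.18, 0.20, 0.22, 0.25, 0.28, 0.19, 0.23, 0.30, 0.17])  # QSM values (ppm)
    is_pd = np.array([0, 0, 0, 1, 1, 1, 0, 1, 1, 0])                              # 1 = Parkinson disease

    fpr, tpr, thr = roc_curve(is_pd, chi)
    best = np.argmax(tpr - fpr)          # Youden index J = sensitivity + specificity - 1
    print(f"Optimal cutoff: chi > {thr[best]:.2f}")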
Kessner, Darren; Novembre, John
2015-01-01
Evolve and resequence studies combine artificial selection experiments with massively parallel sequencing technology to study the genetic basis for complex traits. In these experiments, individuals are selected for extreme values of a trait, causing alleles at quantitative trait loci (QTL) to increase or decrease in frequency in the experimental population. We present a new analysis of the power of artificial selection experiments to detect and localize quantitative trait loci. This analysis uses a simulation framework that explicitly models whole genomes of individuals, quantitative traits, and selection based on individual trait values. We find that explicitly modeling QTL provides qualitatively different insights than considering independent loci with constant selection coefficients. Specifically, we observe how interference between QTL under selection affects the trajectories and lengthens the fixation times of selected alleles. We also show that a substantial portion of the genetic variance of the trait (50–100%) can be explained by detected QTL in as little as 20 generations of selection, depending on the trait architecture and experimental design. Furthermore, we show that power depends crucially on the opportunity for recombination during the experiment. Finally, we show that an increase in power is obtained by leveraging founder haplotype information to obtain allele frequency estimates. PMID:25672748
Attendance at NHS mandatory training sessions.
Brand, Darren
2015-02-17
To identify factors that affect NHS healthcare professionals' attendance at mandatory training sessions. A quantitative approach was used, with a questionnaire sent to 400 randomly selected participants. A total of 122 responses were received, providing a mix of qualitative and quantitative data. Quantitative data were analysed using statistical methods. Open-ended responses were reviewed using thematic analysis. Clinical staff value mandatory training sessions highly. They are aware of the requirement to keep practice up-to-date and ensure patient safety remains a priority. However, changes to the delivery format of mandatory training sessions are required to enable staff to participate more easily, as staff are often unable to attend. The delivery of mandatory training should move from classroom-based sessions into the clinical area to maximise participation. Delivery should be assisted by local 'experts' who are able to customise course content to meet local requirements and the requirements of different staff groups. Improved arrangements to provide staff cover, for those attending training, would enable more staff to attend training sessions.
Vaidyanathan, Karthik
2017-01-01
Business continuity management is often thought of as a proactive planning process for minimising impact from large-scale incidents and disasters. While this is true, and it is critical to plan for the worst, consistently validating plan effectiveness against smaller disruptions can enable an organisation to gain key insights about its business continuity readiness, drive programme improvements, reduce costs and provide an opportunity to quantitatively demonstrate the value of the programme to management. This paper describes a post mortem framework which is used as a continuous improvement mechanism for tracking, reviewing and learning from real-world events at Microsoft Customer Service & Support. This approach was developed and adopted because conducting regular business continuity exercises proved difficult and expensive in a complex and distributed operations environment with high availability requirements. Using a quantitative approach to measure response to incidents, and categorising outcomes based on such responses, enables business continuity teams to provide data-driven insights to leadership, change perceptions of incident root cause, and instil a higher level of confidence towards disaster response readiness and incident management. The scope of the framework discussed here is specific to reviewing and driving improvements from operational incidents. However, the concept can be extended to learning and evolving readiness plans for other types of incidents.
Nielsen, Gitte; Fritz-Hansen, Thomas; Dirks, Christina G; Jensen, Gorm B; Larsson, Henrik B W
2004-09-01
To investigate the diagnostic ability of quantitative magnetic resonance imaging (MRI) heart perfusion in acute heart patients, a fast, multislice dynamic contrast-enhanced MRI sequence was applied to patients with acute myocardial infarction. Seven patients with acute transmural myocardial infarction were studied using a Turbo-fast low angle shot (FLASH) MRI sequence to monitor the first pass of an extravascular contrast agent (CA), gadolinium diethylene triamine pentaacetic acid (Gd-DTPA). Quantitation of perfusion, expressed as Ki (mL/100 g/minute), in five slices, each having 60 sectors, provided an estimation of the severity and extent of the perfusion deficiency. Reperfusion was assessed both by noninvasive criteria and by coronary angiography (CAG). The Ki maps clearly delineated the infarction in all patients. Thrombolytic treatment was clearly beneficial in one case, but had no effect in the two other cases. Over the time-course of the study, normal perfusion values were not reestablished following thrombolytic treatment in all cases investigated. This study shows that quantitative MRI perfusion values can be obtained from acutely ill patients following acute myocardial infarction. The technique provides information on both the volume and severity of affected myocardial tissue, enabling the power of treatment regimes to be assessed objectively, and this approach should aid individual patient stratification and prognosis. Copyright 2004 Wiley-Liss, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yi; Hu, Dehong; Markillie, Lye Meng
Quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple FISH sub-probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific binding of sub-probes and tissue autofluorescence, limiting its accuracy. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on blinking frequencies of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses blinking frequency patterns, emitted from a transcript bound to multiple sub-probes, which are distinct from blinking patterns emitted from partial or nonspecifically bound sub-probes and autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct blinking frequency patterns, laying the foundation for reliable single-cell transcriptomics.
Assessing Motivation To Read. Instructional Resource No. 14.
ERIC Educational Resources Information Center
Gambrell, Linda B.; And Others
The Motivation to Read Profile (MRP) is a public-domain instrument designed to provide teachers with an efficient and reliable way to assess reading motivation qualitatively and quantitatively by evaluating students' self-concept as readers and the value they place on reading. The MRP consists of two basic instruments: the Reading Survey (a…
There has been an ongoing dilemma for agencies who set criteria for safe recreational waters in how to provide for a seasonal assessment of a beach site versus guidance for day-to-day management. Typically an overall 'safe' criterion level is derived from epidemiologic studies o...
Andreani, Carla; Romanelli, Giovanni; Senesi, Roberto
2016-06-16
This study presents the first direct and quantitative measurement of the nuclear momentum distribution anisotropy and the quantum kinetic energy tensor in stable and metastable (supercooled) water near its triple point, using deep inelastic neutron scattering (DINS). From the experimental spectra, accurate line shapes of the hydrogen momentum distributions are derived using an anisotropic Gaussian and a model-independent framework. The experimental results, benchmarked against those obtained for the solid phase, provide state-of-the-art directional values of the hydrogen mean kinetic energy in metastable water. The determinations of the directional kinetic energies in the supercooled phase provide accurate and quantitative measurements of these dynamical observables in the metastable and stable phases, that is, key insight into the physical mechanisms of the hydrogen quantum state in both disordered and polycrystalline systems. The findings of this study offer insight that further expands the capacity and accuracy of DINS investigations of nuclear quantum effects in water and represent reference experimental values for theoretical investigations.
2013-06-01
measuring numerical risk to the government (Galway, 2004). However, quantitative risk analysis is rarely utilized in DoD acquisition programs because the...quantitative assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost...Kindle version]. Retrieved from Amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review
Missing Value Monitoring Enhances the Robustness in Proteomics Quantitation.
Matafora, Vittoria; Corno, Andrea; Ciliberto, Andrea; Bachi, Angela
2017-04-07
In global proteomic analysis, it is estimated that protein abundances span from millions to fewer than 100 copies per cell. The challenge of protein quantitation by classic shotgun proteomic techniques lies in the presence of missing values in peptides belonging to low-abundance proteins, which lowers intra-run reproducibility and affects downstream statistical analysis. Here, we present a new analytical workflow, MvM (missing value monitoring), able to recover quantitation of missing values generated by shotgun analysis. In particular, we used confident data-dependent acquisition (DDA) quantitation only for proteins measured in all the runs, while we filled the missing values with data-independent acquisition analysis using the library previously generated in DDA. We analyzed cell cycle regulated proteins, as they are low-abundance proteins with highly dynamic expression levels. Indeed, we found that cell cycle related proteins are the major components of the missing-value-rich proteome. Using the MvM workflow, we doubled the number of robustly quantified cell cycle related proteins, and we reduced the number of missing values, achieving robust quantitation for proteins over ∼50 molecules per cell. MvM yields lower quantification variance among replicates for low-abundance proteins with respect to DDA analysis, which demonstrates the potential of this novel workflow to measure low-abundance, dynamically regulated proteins.
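The MvM idea, keep DDA quantitation where it is complete and fill the remaining gaps from DIA, can be expressed compactly as a table operation. A toy sketch with hypothetical log2 intensities; the real workflow of course performs library-based DIA extraction rather than reading an existing DIA table:

    import numpy as np
    import pandas as pd

    dda = pd.DataFrame({"run1": [8.1, np.nan, 5.2], "run2": [8.3, np.nan, 5.0]},
                       index=["ProtA", "ProtB", "ProtC"])   # DDA log2 intensities
    dia = pd.DataFrame({"run1": [8.0, 4.1, 5.3], "run2": [8.2, 4.3, 5.1]},
                       index=dda.index)                      # DIA values from the DDA-derived library

    complete = dda.dropna()               # proteins confidently quantified by DDA in all runs
    filled = dda.combine_first(dia)       # missing DDA values recovered from DIA
    print(filled)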
What information on measurement uncertainty should be communicated to clinicians, and how?
Plebani, Mario; Sciacovelli, Laura; Bernardi, Daniela; Aita, Ada; Antonelli, Giorgia; Padoan, Andrea
2018-02-02
The communication of laboratory results to physicians and the quality of reports represent fundamental requirements of the post-analytical phase in order to assure the right interpretation and utilization of laboratory information. Accordingly, the International Standard for clinical laboratory accreditation (ISO 15189) requires that "laboratory reports shall include the information necessary for the interpretation of the examination results". Measurement uncertainty (MU) is an inherent property of any quantitative measurement result which expresses the lack of knowledge of the true value and quantifies the uncertainty of a result, incorporating the factors known to influence it. Even if MU is not included in the report attributes of ISO 15189 and cannot be considered a post-analytical requirement, it is suggested as information that should facilitate an appropriate interpretation of quantitative results (quantity values). Therefore, MU has two intended uses: for laboratory professionals, it gives information about the quality of measurements, providing evidence of compliance with analytical performance characteristics; for physicians (and patients), it may help in the interpretation of measurement results, especially when values are compared with reference intervals or clinical decision limits, providing objective information. Here we describe the way that MU should be added to laboratory reports in order to facilitate the interpretation of laboratory results, connecting the efforts made within the laboratory to provide more accurate and reliable results with a more objective tool for their interpretation by physicians. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
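In practice, adding MU to a report usually means stating the expanded uncertainty alongside the quantity value. A minimal sketch with invented numbers, assuming the usual coverage factor of k = 2 for approximately 95% coverage:

    value = 5.2      # measured quantity value, e.g. mmol/L
    u_c = 0.12       # combined standard uncertainty from the laboratory's uncertainty budget
    k = 2            # coverage factor (~95 % coverage)

    U = k * u_c
    print(f"Result: {value:.1f} mmol/L; expanded uncertainty U = {U:.2f} mmol/L (k = {k})")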
2013-06-30
QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004...assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and...www.amazon.com Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC
Lancione, Marta; Tosetti, Michela; Donatelli, Graziella; Cosottini, Mirco; Costagli, Mauro
2017-11-01
The aim of this work was to assess the impact of tissue structural orientation on quantitative susceptibility mapping (QSM) reliability, and to provide a criterion to identify voxels in which measures of magnetic susceptibility (χ) are most affected by spatial orientation effects. Four healthy volunteers underwent 7-T magnetic resonance imaging (MRI). Multi-echo, gradient-echo sequences were used to obtain quantitative maps of frequency shift (FS) and χ. Information from diffusion tensor imaging (DTI) was used to investigate the relationship between tissue orientation and FS measures and QSM. After sorting voxels on the basis of their fractional anisotropy (FA), the variations in FS and χ values over tissue orientation were measured. Using a K-means clustering algorithm, voxels were separated into two groups depending on the variability of measures within each FA interval. The consistency of FS and QSM values, observed at low FA, was disrupted for FA > 0.6. The standard deviation of χ measured at high FA (0.0103 ppm) was nearly five times that at low FA (0.0022 ppm). This result was consistent through data across different head positions and for different brain regions considered separately, which confirmed that such behavior does not depend on structures with different bulk susceptibility oriented along particular angles. The reliability of single-orientation QSM anticorrelates with local FA. QSM provides replicable values with little variability in brain regions with FA < 0.6, but QSM should be interpreted cautiously in major and coherent fiber bundles, which are strongly affected by structural anisotropy and magnetic susceptibility anisotropy. Copyright © 2017 John Wiley & Sons, Ltd.
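The clustering step described above, separating FA intervals into low- and high-variability groups, can be illustrated with a one-dimensional K-means on the per-bin standard deviation of chi. The per-bin values below are synthetic placeholders:

    import numpy as np
    from sklearn.cluster import KMeans

    fa_bins = np.linspace(0.05, 0.95, 10)
    sd_chi = np.array([0.002, 0.002, 0.003, 0.002, 0.003, 0.004,
                       0.008, 0.010, 0.011, 0.012])          # SD of chi per FA bin (ppm)

    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(sd_chi.reshape(-1, 1))
    for fa, sd, lab in zip(fa_bins, sd_chi, labels):
        print(f"FA = {fa:.2f}   SD(chi) = {sd:.3f} ppm   cluster {lab}")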
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blake, Thomas A.; Johnson, Timothy J.; Tonkyn, Russell G.
Infrared integrating sphere measurements of solid samples are important in providing reference data for contact, standoff, and remote sensing applications. At the Pacific Northwest National Laboratory (PNNL) we have developed protocols to measure both the directional-hemispherical and diffuse reflectances of powders, liquids, and disks of powders and solid materials using a commercially available, matte gold-coated integrating sphere and Fourier transform infrared spectrometer. Detailed descriptions of the sphere alignment and its use for making these reflectance measurements are given. Diffuse reflectance values were found to be dependent on the bidirectional reflection distribution function (BRDF) of the sample and the solid angle intercepted by the sphere's specular exclusion port. To determine how well the sphere and protocols produce quantitative reflectance data, measurements were made of three diffuse and two specular standards prepared by the National Institute of Standards and Technology (NIST, USA), LabSphere Infragold and Spectralon standards, hand-loaded sulfur and talc powder samples, and water. The five NIST standards behaved as expected: the three diffuse standards had a high degree of "diffuseness" (the ratio of diffuse to directional-hemispherical reflectance, D > 0.9), whereas the two specular standards had D ≤ 0.03. The average absolute differences between the NIST and PNNL measurements of the NIST standards for both directional-hemispherical and diffuse reflectances are on the order of 0.01 reflectance units. Other quantitative differences between the PNNL-measured and calibration (where available) or literature reflectance values for these standards and materials are given, and the possible origins of discrepancies are discussed. Random uncertainties and estimates of systematic uncertainties are presented. Corrections necessary to provide better agreement between the PNNL reflectance values as measured for the NIST standards and the NIST reflectance values for these same standards are also discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hartman, J.S.; Gordon, R.L.; Lessor, D.L.
1980-09-01
The application of reflective Nomarski differential interference contrast microscopy for the determination of quantitative sample topography data is presented. The discussion includes a review of key theoretical results presented previously plus the experimental implementation of the concepts using a commercial Nomarski microscope. The experimental work included the modification and characterization of a commercial microscope to allow its use for obtaining quantitative sample topography data. System usage for the measurement of slopes on flat planar samples is also discussed. The discussion has been designed to provide the theoretical basis, physical insight, and a cookbook procedure for implementation, allowing these results to be of value both to those interested in the microscope theory and to those interested in its practical usage in the metallography laboratory.
XSEDE Value Added, Cost Avoidance, and Return on Investment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, Craig A; Roskies, Ralph; Knepper, Richard
It is difficult for large research facilities to quantify a return on the investments that fund their operations. This is because there can be a time lag of years or decades between an innovation or discovery and the realization of its value through practical application. This report presents a three-part methodology that attempts to assess the value of federal investment in XSEDE: 1) a qualitative examination of the areas where XSEDE adds value to the activities of the open research community, 2) a thought model examining the cost avoidance realized by the National Science Foundation (NSF) through the centralization and coordination XSEDE provides, and 3) an assessment of the value XSEDE provides to Service Providers in the XD ecosystem. XSEDE adds significantly to the US research community because it functions as a unified interface to the XD ecosystem and because of its scale. A partly quantitative, partly qualitative analysis suggests the Return on Investment of NSF spending on XSEDE is greater than 1.0, indicating that the aggregate value received by the nation from XSEDE is greater than the cost of direct federal investment in XSEDE.
An, Y Y; Li, H X; Zhan, Y; Lei, X W
2017-10-10
Objective: To evaluate the value of the mDIXON-Quant sequence and diffusion-weighted imaging (DWI) in the quantitative diagnosis of sacroiliitis stages in patients with ankylosing spondylitis (AS). Methods: Based on the Bath Ankylosing Spondylitis Disease Activity Index (BASDAI) and laboratory parameters, a total of 51 patients were diagnosed with AS. They were divided into an early active group (n=20) and a chronic active group (n=31); 25 healthy people from Tianjin were included as a control group. Routine MRI sequences, the mDIXON-Quant sequence, and DWI were obtained. The apparent diffusion coefficient (ADC) and fat-signal fraction (FF) values of bone marrow with edema of the sacroiliac joints in the early active and chronic active groups, and of subchondral bone marrow of the sacroiliac joint in the control group, were measured on ADC maps and FF maps. Mean FF and ADC values were compared between groups. Results: The ADC values of the early active group, chronic active group, and control group were (1.07±0.20)×10⁻³ mm²/s, (1.00±0.22)×10⁻³ mm²/s, and (0.25±0.07)×10⁻³ mm²/s, respectively. The differences in ADC value between the early active group and the control group and between the chronic active group and the control group were significant (P<0.01), but the difference between the early active group and the chronic active group was not (P=0.394); that is, the ADC value cannot distinguish the early active group from the chronic active group. The differences in FF value between groups were significant (P<0.01), and the FF values of bone marrow with edema in the chronic active group were higher than those in the early active group. Conclusions: The mDIXON-Quant sequence can quantitatively distinguish the early active group from the chronic active group, and its diagnostic value is better than that of DWI. Thus, it can provide guidance for clinical treatment and prognosis.
NASA Astrophysics Data System (ADS)
Lamarche, G.; Le Gonidec, Y.; Lucieer, V.; Lurton, X.; Greinert, J.; Dupré, S.; Nau, A.; Heffron, E.; Roche, M.; Ladroit, Y.; Urban, P.
2017-12-01
Detecting liquid, solid or gaseous features in the ocean is generating considerable interest in the geoscience community, because of their potentially high economic value (oil & gas, mining), their significance for environmental management (oil/gas leakage, biodiversity mapping, greenhouse gas monitoring) as well as their potential cultural and traditional values (food, freshwater). Enhancing people's capability to quantify and manage the natural capital present in the ocean water goes hand in hand with the development of marine acoustic technology, as marine echosounders provide the most reliable and technologically advanced means to develop quantitative studies of water column backscatter data. This potential is not yet developed to its full capability because of (i) the complexity of the physics involved in relation to the constantly changing marine environment, and (ii) the rapid technological evolution of high resolution multibeam echosounder (MBES) water-column imaging systems. The Water Column Imaging Working Group is working on a series of multibeam echosounder (MBES) water column datasets acquired in a variety of environments, using a range of frequencies, and imaging a number of water-column features such as gas seeps, oil leaks, suspended particulate matter, vegetation and freshwater springs. Access to data from different acoustic frequencies and ocean dynamics enables us to discuss and test multifrequency approaches, which are the most promising means to develop a quantitative analysis of the physical properties of acoustic scatterers, while providing rigorous cross-calibration of the acoustic devices. In addition, the high redundancy of multibeam data, such as is available for some datasets, will allow us to develop data processing techniques leading to quantitative estimates of water column gas seeps. Each of the datasets has supporting ground-truthing data (underwater videos and photos, physical oceanography measurements) which provide information on the origin and chemistry of the seep content. This is of primary importance when assessing the physical properties of water column scatterers from backscatter acoustic measurements.
Yost, Erin E; Stanek, John; DeWoskin, Robert S; Burgoon, Lyle D
2016-07-19
The United States Environmental Protection Agency (EPA) identified 1173 chemicals associated with hydraulic fracturing fluids, flowback, or produced water, of which 1026 (87%) lack chronic oral toxicity values for human health assessments. To facilitate the ranking and prioritization of chemicals that lack toxicity values, it may be useful to employ toxicity estimates from quantitative structure-activity relationship (QSAR) models. Here we describe an approach for applying the results of a QSAR model from the TOPKAT program suite, which provides estimates of the rat chronic oral lowest-observed-adverse-effect level (LOAEL). Of the 1173 chemicals, TOPKAT was able to generate LOAEL estimates for 515 (44%). To address the uncertainty associated with these estimates, we assigned qualitative confidence scores (high, medium, or low) to each TOPKAT LOAEL estimate, and found 481 to be high-confidence. For 48 chemicals that had both a high-confidence TOPKAT LOAEL estimate and a chronic oral reference dose from EPA's Integrated Risk Information System (IRIS) database, Spearman rank correlation identified 68% agreement between the two values (permutation p-value = 1 × 10^-11). These results provide support for the use of TOPKAT LOAEL estimates in identifying and prioritizing potentially hazardous chemicals. High-confidence TOPKAT LOAEL estimates were available for 389 of 1026 hydraulic fracturing-related chemicals that lack chronic oral reference values (RfVs) and oral slope factors (OSFs) from EPA-identified sources, including a subset of chemicals that are frequently used in hydraulic fracturing fluids.
NASA Astrophysics Data System (ADS)
Dong, Huanhuan; Liu, Jing; Liu, Xiaoru; Yu, Yanying; Cao, Shuwen
2018-01-01
A collection of thirty-six aromatic heterocycle thiosemicarbazone analogues presenting a broad span of anti-tyrosinase activities was designed and synthesized. A robust and reliable two-dimensional quantitative structure-activity relationship (QSAR) model, as evidenced by the high q2 and r2 values (0.848 and 0.893, respectively), was built from the analogues to predict the quantitative chemical-biological relationship and to suggest new directions for modification. The inhibitory activities of the compounds were found to depend greatly on molecular shape and orbital energy. Substituents that produced large ovality and high values of the highest-occupied molecular orbital energy helped to improve the activity of these analogues. The molecular docking results provided visual evidence for the QSAR analysis and the inhibition mechanism. On this basis, two novel tyrosinase inhibitors, O04 and O05, with predicted IC50 values of 0.5384 and 0.8752 nM, were designed and suggested for further research.
Chang, Hui-Yin; Chen, Ching-Tai; Lih, T. Mamie; Lynn, Ke-Shiuan; Juo, Chiun-Gung; Hsu, Wen-Lian; Sung, Ting-Yi
2016-01-01
Efficient and accurate quantitation of metabolites from LC-MS data has become an important topic. Here we present an automated tool, called iMet-Q (intelligent Metabolomic Quantitation), for label-free metabolomics quantitation from high-throughput MS1 data. By performing peak detection and peak alignment, iMet-Q provides a summary of quantitation results and reports ion abundance at both replicate level and sample level. Furthermore, it gives the charge states and isotope ratios of detected metabolite peaks to facilitate metabolite identification. An in-house standard mixture and a public Arabidopsis metabolome data set were analyzed by iMet-Q. Three public quantitation tools, including XCMS, MetAlign, and MZmine 2, were used for performance comparison. From the mixture data set, seven standard metabolites were detected by the four quantitation tools, for which iMet-Q had a smaller quantitation error of 12% in both profile and centroid data sets. Our tool also correctly determined the charge states of seven standard metabolites. By searching the mass values for those standard metabolites against Human Metabolome Database, we obtained a total of 183 metabolite candidates. With the isotope ratios calculated by iMet-Q, 49% (89 out of 183) metabolite candidates were filtered out. From the public Arabidopsis data set reported with two internal standards and 167 elucidated metabolites, iMet-Q detected all of the peaks corresponding to the internal standards and 167 metabolites. Meanwhile, our tool had small abundance variation (≤0.19) when quantifying the two internal standards and had higher abundance correlation (≥0.92) when quantifying the 167 metabolites. iMet-Q provides user-friendly interfaces and is publicly available for download at http://ms.iis.sinica.edu.tw/comics/Software_iMet-Q.html. PMID:26784691
Neltner, Janna Hackett; Abner, Erin Lynn; Schmitt, Frederick A; Denison, Stephanie Kay; Anderson, Sonya; Patel, Ela; Nelson, Peter T
2012-12-01
Quantitative neuropathologic methods provide information that is important for both research and clinical applications. The technologic advancement of digital pathology and image analysis offers new solutions to enable valid quantification of pathologic severity that is reproducible between raters regardless of experience. Using an Aperio ScanScope XT and its accompanying image analysis software, we designed algorithms for quantitation of amyloid and tau pathologies on 65 β-amyloid (6F/3D antibody) and 48 phospho-tau (PHF-1)-immunostained sections of human temporal neocortex. Quantitative digital pathologic data were compared with manual pathology counts. There were excellent correlations between manually counted and digitally analyzed neuropathologic parameters (R² = 0.56-0.72). Data were highly reproducible among 3 participants with varying degrees of expertise in neuropathology (intraclass correlation coefficient values, >0.910). Digital quantification also provided additional parameters, including average plaque area, which shows statistically significant differences when samples are stratified according to apolipoprotein E allele status (average plaque area, 380.9 μm² in apolipoprotein E ε4 carriers vs 274.4 μm² for noncarriers; p < 0.001). Thus, digital pathology offers a rigorous and reproducible method for quantifying Alzheimer disease neuropathologic changes and may provide additional insights into morphologic characteristics that were previously more challenging to assess because of technical limitations.
Sadaie, Wakako; Harada, Yoshie; Matsuda, Michiyuki
2014-01-01
Computer-assisted simulation is a promising approach for clarifying complicated signaling networks. However, this approach is currently limited by a deficiency of kinetic parameters determined in living cells. To overcome this problem, we applied fluorescence cross-correlation spectroscopy (FCCS) to measure dissociation constant (Kd) values of signaling molecule complexes in living cells (in vivo Kd). Among the pairs of fluorescent molecules tested, that of monomerized enhanced green fluorescent protein (mEGFP) and HaloTag-tetramethylrhodamine was most suitable for the measurement of in vivo Kd by FCCS. Using this pair, we determined 22 in vivo Kd values of signaling molecule complexes comprising the epidermal growth factor receptor (EGFR)–Ras–extracellular signal-regulated kinase (ERK) mitogen-activated protein (MAP) kinase pathway. With these parameters, we developed a kinetic simulation model of the EGFR-Ras-ERK MAP kinase pathway and uncovered a potential role played by stoichiometry in Shc binding to EGFR during the peak activations of Ras, MEK, and ERK. Intriguingly, most of the in vivo Kd values determined in this study were higher than the in vitro Kd values reported previously, suggesting the significance of competitive bindings inside cells. These in vivo Kd values will provide a sound basis for the quantitative understanding of signal transduction. PMID:24958104
Pest management in Douglas-fir seed orchards: a microcomputer decision method
James B. Hoy; Michael I. Haverty
1988-01-01
The computer program described provides a Douglas-fir seed orchard manager (user) with a quantitative method for making insect pest management decisions on a desk-top computer. The decision system uses site-specific information such as estimates of seed crop size, insect attack rates, insecticide efficacy and application costs, weather, and crop value. At sites where...
The isolated red spruce communities of Virginia and West Virginia
Harold S. Adams; Steven Stephenson; Adam W. Rollins; Mary Beth Adams
2010-01-01
Quantitative data on the composition and structure of coniferous forests containing red spruce in the mountains of central and southwestern Virginia and eastern central West Virginia, based on sampling carried out in 67 stands during the 1982 to 1984 field seasons, are provided. The average importance value ([relative basal area + relative density]/2) of red spruce was...
Developing Tomorrow's Talent: The Case of an Undergraduate Mentoring Programme
ERIC Educational Resources Information Center
Gannon, Judie M.; Maher, Angela
2012-01-01
Purpose: The purpose of this paper is to explore the value of an alumni and employer engagement mentoring initiative in a hospitality and tourism school within a UK university. Design/methodology/approach: The paper uses the survey method and interviews to provide qualitative and quantitative data on the participants' reactions to the initiative.…
Huang, Hairong; Xu, Zanzan; Shao, Xianhong; Wismeijer, Daniel; Sun, Ping; Wang, Jingxiao
2017-01-01
Objectives: This study identified potential general influencing factors for a mathematical prediction of implant stability quotient (ISQ) values in clinical practice. Methods: We collected the ISQ values of 557 implants from 2 different brands (SICace and Osstem) placed by 2 surgeons in 336 patients. Surgeon 1 placed 329 SICace implants, and surgeon 2 placed 113 SICace implants and 115 Osstem implants. ISQ measurements were taken at T1 (immediately after implant placement) and T2 (before dental restoration). A multivariate linear regression model was used to analyze the influence of the following 11 candidate factors for stability prediction: sex, age, maxillary/mandibular location, bone type, immediate/delayed implantation, bone grafting, insertion torque, I-stage or II-stage healing pattern, implant diameter, implant length and T1-T2 time interval. Results: The need for bone grafting as a predictor significantly influenced ISQ values in all three groups at T1 (weight coefficients ranging from -4 to -5). In contrast, implant diameter consistently influenced the ISQ values in all three groups at T2 (weight coefficients ranging from 3.4 to 4.2). Other factors, such as sex, age, I/II-stage implantation and bone type, did not significantly influence ISQ values at T2, and implant length did not significantly influence ISQ values at T1 or T2. Conclusions: These findings provide a rational basis for mathematical models to quantitatively predict the ISQ values of implants in clinical practice. PMID:29084260
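The multivariate linear model described in this abstract lends itself to a compact illustration. The sketch below is a hypothetical example, not the study's code or data: the variable names, coefficients, and synthetic response are assumptions chosen only to show how ISQ at T2 could be regressed on candidate predictors such as implant diameter, bone grafting, and insertion torque.

```python
# Hypothetical sketch of a multivariate linear regression for ISQ prediction.
# All data and coefficients below are synthetic placeholders, not study values.
import numpy as np

rng = np.random.default_rng(1)
n = 120
implant_diameter = rng.uniform(3.5, 5.0, n)    # mm
bone_graft = rng.integers(0, 2, n)             # 1 = bone grafting needed
insertion_torque = rng.uniform(15.0, 45.0, n)  # N*cm

# Synthetic ISQ response that loosely mimics the reported signs of the weights
# (positive for diameter, negative for grafting).
isq_t2 = 55.0 + 3.8 * implant_diameter - 4.5 * bone_graft \
         + 0.1 * insertion_torque + rng.normal(0.0, 3.0, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), implant_diameter, bone_graft, insertion_torque])
coef, *_ = np.linalg.lstsq(X, isq_t2, rcond=None)
print(dict(zip(["intercept", "diameter", "graft", "torque"], np.round(coef, 2))))
```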
Yao, Y; Nguyen, T D; Pandya, S; Zhang, Y; Hurtado Rúa, S; Kovanlikaya, I; Kuceyeski, A; Liu, Z; Wang, Y; Gauthier, S A
2018-02-01
A hyperintense rim on susceptibility in chronic MS lesions is consistent with iron deposition, and the purpose of this study was to quantify iron-related myelin damage within these lesions as compared with those without rim. Forty-six patients underwent 2 longitudinal quantitative susceptibility mapping (with automatic zero reference) scans with a mean interval of 28.9 ± 11.4 months. Myelin water fraction mapping by using fast acquisition with spiral trajectory and T2 prep was obtained at the second time point to measure myelin damage. Mixed-effects models were used to assess lesion quantitative susceptibility mapping and myelin water fraction values. Quantitative susceptibility mapping values were on average 6.8 parts per billion higher in 116 rim-positive lesions compared with 441 rim-negative lesions (P < .001). All rim-positive lesions retained a hyperintense rim over time, with increasing quantitative susceptibility mapping values of both the rim and core regions (P < .001). Quantitative susceptibility mapping values and myelin water fraction in rim-positive lesions decreased from rim to core, which is consistent with rim iron deposition. Whole-lesion myelin water fractions for rim-positive and rim-negative lesions were 0.055 ± 0.07 and 0.066 ± 0.04, respectively. In the mixed-effects model, rim-positive lesions had on average 0.01 lower myelin water fraction compared with rim-negative lesions (P < .001). The volume of the rim at the initial quantitative susceptibility mapping scan was negatively associated with follow-up myelin water fraction (P < .01). Quantitative susceptibility mapping rim-positive lesions maintained a hyperintense rim, increased in susceptibility, and had more myelin damage compared with rim-negative lesions. Our results are consistent with the identification of chronic active MS lesions and may provide a target for therapeutic interventions to reduce myelin damage. © 2018 by American Journal of Neuroradiology.
Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa; ...
2016-06-22
Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45% and 63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of the propranolol concentrations obtained for the other half of the samples, used as quality control metrics, were determined. The resulting precision (±15%) and accuracy (±3%) values, respectively, were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Furthermore, this means that once the extraction efficiency is calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.
Status of human chromosome aberrations as a biological radiation dosimeter in the nuclear industry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, M.A.
1978-01-01
It seems that the determination of peripheral lymphocyte chromosome aberration levels is now firmly established as a means of biological dosimetry of great value in many phases of the nuclear industry. In the case of large external exposure it can provide valuable quantitative estimates, as well as information on dose distribution and radiation quality. In the case of routine occupational exposures the technique is more qualitative, but is of value particularly in resolving uncertainties as to whether suspected overexposures did in fact occur. Where workers accumulate burdens of internal emitters, aberration analysis provides a valuable, though at present quite qualitative, indicator. In spite of the expense of cytogenetic analyses, they are of sufficient value to justify much more widespread application, particularly in high risk situations.
Farberg, Aaron S; Winkelmann, Richard R; Tucker, Natalie; White, Richard; Rigel, Darrell S
2017-09-01
BACKGROUND: Early diagnosis of melanoma is critical to survival. New technologies, such as a multi-spectral digital skin lesion analysis (MSDSLA) device [MelaFind, STRATA Skin Sciences, Horsham, Pennsylvania] may be useful to enhance clinician evaluation of concerning pigmented skin lesions. Previous studies evaluated the effect of only the binary output. OBJECTIVE: The objective of this study was to determine how decisions dermatologists make regarding pigmented lesion biopsies are impacted by providing both the underlying classifier score (CS) and associated probability risk provided by multi-spectral digital skin lesion analysis. This outcome was also compared against the improvement reported with the provision of only the binary output. METHODS: Dermatologists attending an educational conference evaluated 50 pigmented lesions (25 melanomas and 25 benign lesions). Participants were asked if they would biopsy the lesion based on clinical images, and were asked this question again after being shown multi-spectral digital skin lesion analysis data that included the probability graphs and classifier score. RESULTS: Data were analyzed from a total of 160 United States board-certified dermatologists. Biopsy sensitivity for melanoma improved from 76 percent following clinical evaluation to 92 percent after quantitative multi-spectral digital skin lesion analysis information was provided ( p <0.0001). Specificity improved from 52 percent to 79 percent ( p <0.0001). The positive predictive value increased from 61 percent to 81 percent ( p <0.01) when the quantitative data were provided. Negative predictive value also increased (68% vs. 91%, p<0.01), and overall biopsy accuracy was greater with multi-spectral digital skin lesion analysis (64% vs. 86%, p <0.001). Interrater reliability improved (intraclass correlation 0.466 before, 0.559 after). CONCLUSION: Incorporating the classifier score and probability data into physician evaluation of pigmented lesions led to both increased sensitivity and specificity, thereby resulting in more accurate biopsy decisions.
Acoustic Facies Analysis of Side-Scan Sonar Data
NASA Astrophysics Data System (ADS)
Dwan, Fa Shu
Acoustic facies analysis methods have allowed the generation of system-independent values for the quantitative seafloor acoustic parameter, backscattering strength, from GLORIA and (TAMU)^2 side-scan sonar data. The resulting acoustic facies parameters enable quantitative comparisons of data collected by different sonar systems, data from different environments, and measurements made with different survey geometries. Backscattering strength values were extracted from the sonar amplitude data by inversion based on the sonar equation. Image processing products reveal seafloor features and patterns of relative intensity. To quantitatively compare data collected at different times or by different systems, and to ground-truth measurements and geoacoustic models, quantitative corrections must be made on any given data set for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area contribution, and grazing angle effects. In the sonar equation, backscattering strength is the sonar parameter which is directly related to seafloor properties. The GLORIA data used in this study are from the edge of a distal lobe of the Monterey Fan. An interfingered region of strong and weak seafloor signal returns from a flat seafloor region provides an ideal data set for this study. Inversion of imagery data from the region allows the quantitative definition of different acoustic facies. The (TAMU)^2 data used are from a calibration site near the Green Canyon area of the Gulf of Mexico. Acoustic facies analysis techniques were implemented to generate statistical information for acoustic facies based on the estimates of backscattering strength. The backscattering strength values have been compared with Lambert's Law and other functions to parameterize the description of the acoustic facies. The resulting Lambertian constant values range from -26 dB to -36 dB. A modified Lambert relationship, which consists of both intercept and slope terms, appears to represent the BSS versus grazing angle profiles better based on chi^2 testing and error ellipse generation. Different regression functions, composed of trigonometric functions, were analyzed for different segments of the BSS profiles. A cotangent or sine/cosine function shows promising results for representing the entire grazing angle span of the BSS profiles.
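For readers unfamiliar with the Lambertian parameterization mentioned above, the relations below give the usual form of Lambert's law for backscattering strength (BSS) versus grazing angle and a modified form with separate intercept and slope terms; the exact parameterization used in this work may differ, so treat this as a hedged reference rather than the thesis' formulas.

```latex
% Lambert's law and a modified (intercept + slope) form, with \theta_g the grazing angle:
\mathrm{BSS}(\theta_g) \approx \mu + 20\,\log_{10}(\sin\theta_g)
\qquad\text{and}\qquad
\mathrm{BSS}_{\mathrm{mod}}(\theta_g) \approx a + b\,\log_{10}(\sin\theta_g)
```

Here μ is the Lambertian constant (reported above between about -26 dB and -36 dB), and a and b are the intercept and slope fitted per acoustic facies.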
Correcting power and p-value calculations for bias in diffusion tensor imaging.
Lauzon, Carolyn B; Landman, Bennett A
2013-07-01
Diffusion tensor imaging (DTI) provides quantitative parametric maps sensitive to tissue microarchitecture (e.g., fractional anisotropy, FA). These maps are estimated through computational processes and subject to random distortions including variance and bias. Traditional statistical procedures commonly used for study planning (including power analyses and p-value/alpha-rate thresholds) specifically model variability, but neglect potential impacts of bias. Herein, we quantitatively investigate the impacts of bias in DTI on hypothesis test properties (power and alpha-rate) using a two-sided hypothesis testing framework. We present theoretical evaluation of bias on hypothesis test properties, evaluate the bias estimation technique SIMEX for DTI hypothesis testing using simulated data, and evaluate the impacts of bias on spatially varying power and alpha rates in an empirical study of 21 subjects. Bias is shown to inflate alpha rates, distort the power curve, and cause significant power loss even in empirical settings where the expected difference in bias between groups is zero. These adverse effects can be attenuated by properly accounting for bias in the calculation of power and p-values. Copyright © 2013 Elsevier Inc. All rights reserved.
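SIMEX, the bias-estimation technique evaluated above, can be illustrated with a toy example. The sketch below is not the paper's implementation: it uses a deliberately simple biased estimator (the mean of a squared noisy measurement) only to show the simulation-and-extrapolation idea of adding extra noise at increasing levels and extrapolating the estimate back to zero measurement error.

```python
# Toy SIMEX sketch: the naive estimate of true_value**2 from noisy data is biased
# upward by the noise variance; extrapolating over added-noise levels removes it.
import numpy as np

rng = np.random.default_rng(2)
true_value, sigma = 0.5, 0.1                     # assumed "truth" and noise SD
obs = true_value + rng.normal(0.0, sigma, 5000)  # noisy measurements

def naive_estimate(x):
    return np.mean(x ** 2)                       # E[x^2] = true^2 + sigma^2 (biased)

lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])    # extra noise variance, in multiples of sigma^2
est = [np.mean([naive_estimate(obs + rng.normal(0.0, np.sqrt(lam) * sigma, obs.size))
                for _ in range(50)]) for lam in lambdas]

# Fit the trend of the estimate versus lambda and extrapolate to lambda = -1,
# i.e. to zero total measurement error.
coeffs = np.polyfit(lambdas, est, 2)
print(f"naive: {est[0]:.4f}  SIMEX: {np.polyval(coeffs, -1.0):.4f}  truth: {true_value**2:.4f}")
```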
Harari, Gil
2014-01-01
Statistical significance, also known as the p-value, and the CI (confidence interval) are common statistical measures and are essential for the statistical analysis of studies in medicine and the life sciences. These measures provide complementary information about the statistical probability and conclusions regarding the clinical significance of study findings. This article is intended to describe the methodologies, compare the methods, assess their suitability for the different needs of study results analysis, and explain situations in which each method should be used.
A Quantitative Needs Assessment Technique for Cross-Cultural Work Adjustment Training.
ERIC Educational Resources Information Center
Selmer, Lyn
2000-01-01
A study of 67 Swedish expatriate bosses and 104 local Hong Kong middle managers tested a quantitative needs assessment technique measuring work values. Two-thirds of middle managers' work values were not correctly estimated by their bosses, especially instrumental values (pay, benefits, security, working hours and conditions), indicating a need…
Quantitative analysis of comparative genomic hybridization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manoir, S. du; Bentz, M.; Joos, S.
1995-01-01
Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
Heijtel, D F R; Petersen, E T; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; van Bavel, E T; Boellaard, R; Lammertsma, A A; Nederveen, A J
2016-04-01
The purpose of this study was to assess whether there was an agreement between quantitative cerebral blood flow (CBF) and arterial cerebral blood volume (CBVA) measurements by [15O]H2O positron emission tomography (PET) and model-free QUASAR MRI. Twelve healthy subjects were scanned within a week in separate MRI and PET imaging sessions, after which quantitative and qualitative agreement between both modalities was assessed for gray matter, white matter and whole brain regions of interest (ROIs). The correlation between CBF measurements obtained with both modalities was moderate to high (r²: 0.28-0.60, P < 0.05), although QUASAR significantly underestimated CBF by 30% (P < 0.001). CBVA was moderately correlated (r²: 0.28-0.43, P < 0.05), with QUASAR yielding values that were only 27% of the [15O]H2O-derived values (P < 0.001). Group-wise voxel statistics identified minor areas with significant contrast differences between [15O]H2O PET and QUASAR MRI, indicating similar qualitative CBVA and CBF information by both modalities. In conclusion, the results of this study demonstrate that QUASAR MRI and [15O]H2O PET provide similar CBF and CBVA information, but with systematic quantitative discrepancies. Copyright © 2016 John Wiley & Sons, Ltd.
A Privacy-Preserving Platform for User-Centric Quantitative Benchmarking
NASA Astrophysics Data System (ADS)
Herrmann, Dominik; Scheuer, Florian; Feustel, Philipp; Nowey, Thomas; Federrath, Hannes
We propose a centralised platform for quantitative benchmarking of key performance indicators (KPI) among mutually distrustful organisations. Our platform offers users the opportunity to request an ad-hoc benchmarking for a specific KPI within a peer group of their choice. Architecture and protocol are designed to provide anonymity to its users and to hide the sensitive KPI values from other clients and the central server. To this end, we integrate user-centric peer group formation, exchangeable secure multi-party computation protocols, short-lived ephemeral key pairs as pseudonyms, and attribute certificates. We show by empirical evaluation of a prototype that the performance is acceptable for reasonably sized peer groups.
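One building block such a platform could rely on is an additive secret-sharing sum, which lets a peer group learn an aggregate KPI (and hence benchmarks such as the mean) without any client or the central server seeing individual values. The sketch below is a generic illustration of that idea under assumed parameters, not the protocol specified by the authors.

```python
# Additive secret sharing: each organisation splits its KPI into random shares
# that sum to the value modulo a large prime; only aggregates are ever revealed.
import random

PRIME = 2**61 - 1

def share(value, n_parties):
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

kpis = [120, 340, 95, 210]          # hypothetical KPI values of four organisations
n = len(kpis)

all_shares = [share(v, n) for v in kpis]                    # party i produces one share per peer
partial = [sum(all_shares[i][j] for i in range(n)) % PRIME  # party j sums the shares it receives
           for j in range(n)]

total = sum(partial) % PRIME        # publishing the partial sums reveals only the total
print(f"peer-group mean KPI = {total / n:.1f}")
```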
Epileptogenic networks in nodular heterotopia: A stereoelectroencephalography study.
Pizzo, Francesca; Roehri, Nicolas; Catenoix, Hélène; Medina, Samuel; McGonigal, Aileen; Giusiano, Bernard; Carron, Romain; Scavarda, Didier; Ostrowsky, Karine; Lepine, Anne; Boulogne, Sébastien; Scholly, Julia; Hirsch, Edouard; Rheims, Sylvain; Bénar, Christian-George; Bartolomei, Fabrice
2017-12-01
Defining the roles of heterotopic and normotopic cortex in the epileptogenic networks in patients with nodular heterotopia is challenging. To elucidate this issue, we compared heterotopic and normotopic cortex using quantitative signal analysis on stereoelectroencephalography (SEEG) recordings. Clinically relevant biomarkers of epileptogenicity during ictal (epileptogenicity index; EI) and interictal recordings (high-frequency oscillation and spike) were evaluated in 19 patients undergoing SEEG. These biomarkers were then compared between heterotopic cortex and neocortical regions. Seizures were classified as normotopic, heterotopic, or normoheterotopic according to respective values of quantitative analysis (EI ≥0.3). A total of 1,246 contacts were analyzed: 259 in heterotopic tissue (heterotopic cortex), 873 in neocortex in the same lobe of the lesion (local neocortex), and 114 in neocortex distant from the lesion (distant neocortex). No significant difference in EI values, high-frequency oscillations, and spike rate was found comparing local neocortex and heterotopic cortex at a patient level, but local neocortex appears more epileptogenic (p < 0.001) than heterotopic cortex analyzing EI values at a seizure level. According to EI values, seizures were mostly normotopic (48.5%) or normoheterotopic (45.5%); only 6% were purely heterotopic. A good long-term treatment response was obtained in only two patients after thermocoagulation and surgical disconnection. This is the first quantitative SEEG study providing insight into the mechanisms generating seizures in nodular heterotopia. We demonstrate that both the heterotopic lesion and particularly the normotopic cortex are involved in the epileptogenic network. This could open new perspectives on multitarget treatments, other than resective surgery, aimed at modifying the epileptic network. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
Knapp, Julika; Allesch, Astrid; Müller, Wolfgang; Bockreis, Anke
2017-11-01
Recycling of waste materials is desirable to reduce the consumption of limited primary resources, but it also includes the risk of recycling unwanted, hazardous substances. In Austria, the legal framework demands that secondary products must not present a higher risk than comparable products derived from primary resources. However, the act provides no definition of how to assess this risk potential. This paper describes the development of different quantitative and qualitative methods to estimate the transfer of contaminants in recycling processes. The quantitative methods comprise the comparison of concentrations of harmful substances in recycling products to corresponding primary products and to existing limit values. The developed evaluation matrix, which considers further aspects, allows for the assessment of the qualitative risk potential. The results show that, depending on the assessed waste fraction, particular contaminants can be critical. Their concentrations were higher than in comparable primary materials and did not comply with existing limit values. On the other hand, the results show that a long-term, well-established quality control system can assure compliance with the limit values. The results of the qualitative assessment obtained with the evaluation matrix support the results of the quantitative assessment. Therefore, the evaluation matrix can be suitable for quickly screening waste streams used for recycling to estimate their potential environmental and health risks. To prevent the transfer of contaminants into product cycles, improved data on relevant substances in secondary resources are necessary. In addition, regulations for material recycling are required to assure adequate quality control measures, including limit values. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cui, Yi; Hu, Dehong; Markillie, Lye Meng; ...
2017-10-04
Here, quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple oligonucleotide probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific probe binding and tissue autofluorescence, especially when only a small number of probes can be fitted to the target transcript. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on on-off duty cycles of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses on-time fractions (measured over a series of exposures) collected from transcripts bound to as few as 8 probes, which are distinct from on-time fractions collected from nonspecifically bound probes or autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct on-time fractions, laying the foundation for reliable single-cell transcriptomics.
pH Mapping on Tooth Surfaces for Quantitative Caries Diagnosis Using Micro Ir/IrOx pH Sensor.
Ratanaporncharoen, Chindanai; Tabata, Miyuki; Kitasako, Yuichi; Ikeda, Masaomi; Goda, Tatsuro; Matsumoto, Akira; Tagami, Junji; Miyahara, Yuji
2018-04-03
A quantitative diagnostic method for dental caries would improve oral health, which directly affects quality of life. Here we describe the preparation and application of Ir/IrOx pH sensors, which are used to measure the surface pH of dental caries. The pH level is used as an indicator to distinguish between active and arrested caries. After a dentist visually inspected 18 dentinal caries lesions at various positions on extracted teeth and defined each as active or arrested, the surface pH values of sound and carious areas were measured directly with an Ir/IrOx pH sensor, 300 μm in diameter, used like a dental explorer. The average pH values of the sound root, the arrested caries, and the active caries were 6.85, 6.07, and 5.30, respectively. The pH obtained with the Ir/IrOx sensor was highly correlated with the dentist's inspection results, indicating that the types of caries were successfully categorized. This caries testing technique using a micro Ir/IrOx pH sensor provides an accurate quantitative caries evaluation and has potential in clinical diagnosis.
Music therapy for coma patients: preliminary results.
Sun, J; Chen, W
2015-04-01
We applied quantitative EEG (the (δ+θ)/(α+β) value) and the Glasgow Coma Scale (GCS) score to evaluate the role of music therapy for traumatic brain injury coma patients. Forty patients with traumatic brain injury coma who met the inclusion criteria were selected. Twenty cases from the rehabilitation, neurology and neurosurgery wards, whose families could actively cooperate and for whom long-term fixed nursing staff could deliver formal music therapy, formed the music group. Twenty cases were in the intensive care units of the rehabilitation, neurology and neurosurgery wards; their family members cooperated poorly, nursing staff changed often, and no formal music therapy was given (control group). After a one-month follow-up, the GCS score and quantitative EEG ((δ+θ)/(α+β) value) were compared between the two groups. Apart from the presence or absence of formal music therapy, the treatment of the two groups showed no significant difference, and the groups were matched by age, gender, and injury type. In the 40 cases of traumatic brain injury, the GCS score increased in the music group after treatment compared with the control group, and the difference between the two groups was significant (p < 0.05). The quantitative EEG ((δ+θ)/(α+β)) values of the music group decreased after treatment, and the difference was significant compared with the control group (p < 0.05). Based on the quantitative EEG ((δ+θ)/(α+β) value) and the GCS score, music therapy has a clear effect on promoting recovery of consciousness in patients with craniocerebral trauma coma. The quantitative EEG ((δ+θ)/(α+β) value) can be used as an objective index to evaluate the state of brain function.
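The quantitative EEG index used in this study is a simple band-power ratio. The sketch below shows one conventional way to compute (δ+θ)/(α+β) from a single channel with Welch's method; the sampling rate, band edges, and the synthetic signal are assumptions for illustration, not the study's processing pipeline.

```python
# Compute the slow/fast EEG power ratio (delta+theta)/(alpha+beta) from one channel.
import numpy as np
from scipy.signal import welch

fs = 250                                    # assumed sampling rate (Hz)
rng = np.random.default_rng(3)
eeg = rng.normal(size=fs * 60)              # placeholder for 60 s of one EEG channel

freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)

def band_power(lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

delta, theta = band_power(0.5, 4), band_power(4, 8)
alpha, beta = band_power(8, 13), band_power(13, 30)
ratio = (delta + theta) / (alpha + beta)    # lower values accompany recovery in this study
print(f"(delta+theta)/(alpha+beta) = {ratio:.2f}")
```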
Weak Value Amplification is Suboptimal for Estimation and Detection
NASA Astrophysics Data System (ADS)
Ferrie, Christopher; Combes, Joshua
2014-01-01
We show by using statistically rigorous arguments that the technique of weak value amplification does not perform better than standard statistical techniques for the tasks of single parameter estimation and signal detection. Specifically, we prove that postselection, a necessary ingredient for weak value amplification, decreases estimation accuracy and, moreover, arranging for anomalously large weak values is a suboptimal strategy. In doing so, we explicitly provide the optimal estimator, which in turn allows us to identify the optimal experimental arrangement to be the one in which all outcomes have equal weak values (all as small as possible) and the initial state of the meter is the maximal eigenvalue of the square of the system observable. Finally, we give precise quantitative conditions for when weak measurement (measurements without postselection or anomalously large weak values) can mitigate the effect of uncharacterized technical noise in estimation.
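For context, the weak value at the heart of the amplification technique discussed above has the standard definition below; the suboptimality argument turns on the fact that postselection on the final state succeeds only with probability |⟨φ_f|ψ_i⟩|², discarding the remaining trials.

```latex
% Weak value of an observable \hat{A} with preselected state |\psi_i\rangle
% and postselected state |\phi_f\rangle, and the postselection probability:
A_w \;=\; \frac{\langle \phi_f \,|\, \hat{A} \,|\, \psi_i \rangle}{\langle \phi_f \,|\, \psi_i \rangle},
\qquad
p_{\mathrm{post}} \;=\; \bigl|\langle \phi_f \,|\, \psi_i \rangle\bigr|^{2}
```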
A Method to Quantify Visual Information Processing in Children Using Eye Tracking
Kooiker, Marlou J.G.; Pel, Johan J.M.; van der Steen-Kant, Sanny P.; van der Steen, Johannes
2016-01-01
Visual problems that occur early in life can have major impact on a child's development. Without verbal communication and only based on observational methods, it is difficult to make a quantitative assessment of a child's visual problems. This limits accurate diagnostics in children under the age of 4 years and in children with intellectual disabilities. Here we describe a quantitative method that overcomes these problems. The method uses a remote eye tracker and a four choice preferential looking paradigm to measure eye movement responses to different visual stimuli. The child sits without head support in front of a monitor with integrated infrared cameras. In one of four monitor quadrants a visual stimulus is presented. Each stimulus has a specific visual modality with respect to the background, e.g., form, motion, contrast or color. From the reflexive eye movement responses to these specific visual modalities, output parameters such as reaction times, fixation accuracy and fixation duration are calculated to quantify a child's viewing behavior. With this approach, the quality of visual information processing can be assessed without the use of communication. By comparing results with reference values obtained in typically developing children from 0-12 years, the method provides a characterization of visual information processing in visually impaired children. The quantitative information provided by this method can be advantageous for the field of clinical visual assessment and rehabilitation in multiple ways. The parameter values provide a good basis to: (i) characterize early visual capacities and consequently to enable early interventions; (ii) compare risk groups and follow visual development over time; and (iii), construct an individual visual profile for each child. PMID:27500922
Yin, Chang-Xin; Jiang, Qian-Li; He, Han; Yu, Guo-Pan; Xu, Yue; Meng, Fan-Yi; Yang, Mo
2012-02-01
This study aimed to establish a method for rapidly detecting BK polyomavirus (BKV) and to investigate its feasibility and value for use in leukemia patients undergoing hematopoietic stem cell transplantation. Primers were designed according to the BKV gene sequence; quantitative standards for BKV and a real-time fluorescent quantitative PCR assay for BKV were established. The BKV level in urine samples from 36 patients after hematopoietic stem cell transplantation was detected by the established method. The results showed that the reconstructed plasmid standard and the real-time fluorescent quantitative PCR method were successfully established, and their good specificity, sensitivity and stability were confirmed by experiments. BKV was found in 55.56% of urine samples, and the BKV load in urine was 2.46 × 10^4 - 7.8 × 10^9 copies/ml. It is concluded that the establishment of real-time fluorescent quantitative PCR for BKV detection provides a method for early diagnosis of patients with hemorrhagic cystitis after hematopoietic stem cell transplantation.
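Absolute quantitation with a plasmid standard, as described above, amounts to fitting Ct against log10(copy number) and inverting the fitted line for unknown samples. The sketch below uses hypothetical Ct values and dilution points purely to illustrate the arithmetic; it is not the study's assay data.

```python
# Standard-curve quantitation for real-time PCR: fit Ct vs log10(copies), then invert.
import numpy as np

std_copies = np.array([1e3, 1e4, 1e5, 1e6, 1e7])   # hypothetical plasmid standards (copies/ml)
std_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])  # hypothetical measured Ct values

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0            # close to 1.0 (100%) for an ideal assay

def copies_from_ct(ct):
    return 10 ** ((ct - intercept) / slope)

print(f"slope = {slope:.2f}, amplification efficiency = {efficiency:.0%}")
print(f"sample with Ct = 27.5 -> {copies_from_ct(27.5):.2e} copies/ml")
```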
Bremner, J D; Baldwin, R; Horti, A; Staib, L H; Ng, C K; Tan, P Z; Zea-Ponce, Y; Zoghbi, S; Seibyl, J P; Soufer, R; Charney, D S; Innis, R B
1999-08-31
Although positron emission tomography (PET) and single photon emission computed tomography (SPECT) are increasingly used for quantitation of neuroreceptor binding, almost no studies to date have involved a direct comparison of the two. One study found a high level of agreement between the two techniques, although there was a systematic 30% increase in measures of benzodiazepine receptor binding in SPECT compared with PET. The purpose of the current study was to directly compare quantitation of benzodiazepine receptor binding in the same human subjects using PET and SPECT with high specific activity [11C]iomazenil and [123I]iomazenil, respectively. All subjects were administered a single bolus of high specific activity iomazenil labeled with 11C or 123I followed by dynamic PET or SPECT imaging of the brain. Arterial blood samples were obtained for measurement of metabolite-corrected radioligand in plasma. Compartmental modeling was used to fit values for kinetic rate constants of transfer of radioligand between plasma and brain compartments. These values were used for calculation of binding potential (BP = Bmax/Kd) and product of BP and the fraction of free non-protein-bound parent compound (V3'). Mean values for V3' in PET and SPECT were as follows: temporal cortex 23+/-5 and 22+/-3 ml/g, frontal cortex 23+/-6 and 22+/-3 ml/g, occipital cortex 28+/-3 and 31+/-5 ml/g, and striatum 4+/-4 and 7+/-4 ml/g. These preliminary findings indicate that PET and SPECT provide comparable results in quantitation of neuroreceptor binding in the human brain.
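The outcome measures compared above follow the standard receptor-binding definitions, reproduced here for reference:

```latex
% Binding potential and its free-fraction-scaled variant used in the comparison:
BP \;=\; \frac{B_{\max}}{K_d},
\qquad
V_3' \;=\; f_1 \cdot BP
```

where f1 denotes the fraction of free, non-protein-bound parent radioligand in plasma.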
Guo, Hailin; Ding, Wanwen; Chen, Jingbo; Chen, Xuan; Zheng, Yiqi; Wang, Zhiyong; Liu, Jianxiu
2014-01-01
Zoysiagrass (Zoysia Willd.) is an important warm season turfgrass that is grown in many parts of the world. Salt tolerance is an important trait in zoysiagrass breeding programs. In this study, a genetic linkage map was constructed using sequence-related amplified polymorphism markers and random amplified polymorphic DNA markers based on an F1 population comprising 120 progeny derived from a cross between Zoysia japonica Z105 (salt-tolerant accession) and Z061 (salt-sensitive accession). The linkage map covered 1211 cM with an average marker distance of 5.0 cM and contained 24 linkage groups with 242 marker loci (217 sequence-related amplified polymorphism markers and 25 random amplified polymorphic DNA markers). Quantitative trait loci affecting the salt tolerance of zoysiagrass were identified using the constructed genetic linkage map. Two significant quantitative trait loci (qLF-1 and qLF-2) for leaf firing percentage were detected; qLF-1 at 36.3 cM on linkage group LG4 with a logarithm of odds value of 3.27, which explained 13.1% of the total variation of leaf firing and qLF-2 at 42.3 cM on LG5 with a logarithm of odds value of 2.88, which explained 29.7% of the total variation of leaf firing. A significant quantitative trait locus (qSCW-1) for reduced percentage of dry shoot clipping weight was detected at 44.1 cM on LG5 with a logarithm of odds value of 4.0, which explained 65.6% of the total variation. This study provides important information for further functional analysis of salt-tolerance genes in zoysiagrass. Molecular markers linked with quantitative trait loci for salt tolerance will be useful in zoysiagrass breeding programs using marker-assisted selection.
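The logarithm of odds (LOD) values reported above follow the standard definition used in QTL mapping, namely the base-10 logarithm of the ratio of the likelihood of the data assuming a QTL at the test position to the likelihood assuming no linked QTL:

```latex
\mathrm{LOD} \;=\; \log_{10}\!\frac{L(\text{QTL at the test position})}{L(\text{no QTL})}
```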
Quantitative accuracy of the simplified strong ion equation to predict serum pH in dogs.
Cave, N J; Koo, S T
2015-01-01
An electrochemical approach to the assessment of acid-base states should provide a better mechanistic explanation of the metabolic component than methods that consider only pH and carbon dioxide. The hypothesis was that the simplified strong ion equation (SSIE), using published dog-specific values, would predict the measured serum pH of diseased dogs. Ten dogs, hospitalized for various reasons, were studied in this prospective study of a convenience sample of a consecutive series of dogs admitted to the Massey University Veterinary Teaching Hospital (MUVTH), from which serum biochemistry and blood gas analyses were performed at the same time. Serum pH was calculated (Hcal+) using the SSIE and published values for the concentration and dissociation constant of the nonvolatile weak acids (Atot and Ka), and Hcal+ was subsequently compared with each dog's actual pH (Hmeasured+). To determine the source of discordance between Hcal+ and Hmeasured+, the calculations were repeated using a series of substituted values for Atot and Ka. The Hcal+ did not approximate the Hmeasured+ for any dog (P = 0.499, r² = 0.068) and was consistently more basic. Substituted values of Atot and Ka did not significantly improve the accuracy (r² = 0.169 to <0.001). Substituting the effective SID (Atot-[HCO3-]) produced a strong association between Hcal+ and Hmeasured+ (r² = 0.977). Using the simplified strong ion equation and the published values for Atot and Ka does not appear to provide a quantitative explanation for the acid-base status of dogs. The efficacy of substituting the effective SID in the simplified strong ion equation suggests the error lies in calculating the SID. Copyright © 2015 The Authors. Journal of Veterinary Internal Medicine published by Wiley Periodicals, Inc. on behalf of the American College of Veterinary Internal Medicine.
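The simplified strong ion calculation evaluated above reduces to solving an electroneutrality balance, SID = [HCO3-] + [A-], for [H+]. The sketch below is a minimal illustration of that balance with placeholder inputs; the Atot, Ka, SID, and pCO2 values are assumptions, not the published dog-specific values used in the study, while the combined CO2 constant K1'·S ≈ 2.46 × 10^-11 (Eq/L)²/mmHg is a commonly cited textbook value.

```python
# Solve SID = [HCO3-] + [A-] for [H+] by bisection, then report pH = -log10([H+]).
# Constants and inputs are illustrative placeholders, not the study's values.
import math

K1_S = 2.46e-11  # combined CO2 solubility/dissociation constant, (Eq/L)^2 per mmHg

def ssie_ph(sid, atot, ka, pco2):
    """sid, atot, ka in Eq/L; pco2 in mmHg."""
    def residual(h):
        hco3 = K1_S * pco2 / h           # bicarbonate from Henderson-Hasselbalch in [H+] form
        a_minus = ka * atot / (ka + h)   # dissociated nonvolatile weak acid
        return hco3 + a_minus - sid      # zero at the equilibrium [H+]

    lo, hi = 1e-9, 1e-6                  # bracket [H+] between pH 9 and pH 6
    for _ in range(200):
        mid = math.sqrt(lo * hi)         # bisect on a log scale
        if residual(mid) > 0:
            lo = mid                     # anions exceed SID: need more H+
        else:
            hi = mid
    return -math.log10(math.sqrt(lo * hi))

# Placeholder inputs: SID 38 mEq/L, Atot 17 mmol/L, Ka ~ 8e-8 Eq/L, pCO2 40 mmHg.
print(f"pH = {ssie_ph(0.038, 0.017, 8.0e-8, 40.0):.2f}")
```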
NASA Astrophysics Data System (ADS)
Harden, J. W.; Loisel, J.; Hugelius, G.; Sulman, B. N.; Bond-Lamberty, B. P.; Abramoff, R. Z.; Malhotra, A.; Gill, A. L.
2017-12-01
Soils support ecological and human systems by providing a physical and biogeochemical basis for plant growth, ecological functions, water quality, and water storage, and by providing services and functions needed for economic development, human well-being, and conservation of natural resources. Quantitative evaluation of soil services, however, is inconsistent and poorly communicated, in part because we lack a scientific, unified basis for evaluating soils and their potential for serving our needs. We introduce an index of soil service (SSI) in which multiple services are numerically or quantitatively assessed, normalized to a unit-less scale for purposes of intercomparability, and evaluated for a given site or region. Services include organic matter and/or organic carbon storage; plant productivity; CO2 or GHG exchange with the atmosphere; water storage capacity; and nutrient storage and/or availability. The status of SSI can be evaluated by individual services or by a composite index that combines multiple services. The status can be monitored over time; and key services that are more highly valued for a given soil can be weighted accordingly in comparison to other services. As a first step, existing data for each service are captured from a literature and data review in order to establish the full range of values. A site value establishes the ranking relative to the full range. Key services are weighted according to local values. A final index is the sum of the normalized, weighted products. Metrics can be updated and adapted as new data or services are discovered or recognized. Metrics can be used to compare among sites, regions, or time periods.
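The composite index construction described above (normalize each service against its literature-derived range, weight it according to local values, and sum) can be sketched in a few lines. Everything below, including the service names, ranges, and weights, is a hypothetical placeholder meant only to illustrate the arithmetic, not values proposed by the authors.

```python
# Composite soil service index: normalise each service to [0, 1] against an
# assumed literature range, apply local weights, and sum to one unit-less score.
site = {"organic_carbon": 85.0,      # Mg C / ha (hypothetical site value)
        "plant_productivity": 6.2,   # Mg biomass / ha / yr
        "water_storage": 120.0}      # mm available water capacity

literature_range = {"organic_carbon": (10.0, 250.0),
                    "plant_productivity": (0.5, 15.0),
                    "water_storage": (20.0, 300.0)}

weights = {"organic_carbon": 0.5,    # weighted higher where carbon storage is most valued
           "plant_productivity": 0.3,
           "water_storage": 0.2}

def normalise(value, lo, hi):
    return min(max((value - lo) / (hi - lo), 0.0), 1.0)

ssi = sum(weights[k] * normalise(v, *literature_range[k]) for k, v in site.items())
print(f"composite soil service index = {ssi:.2f}")
```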
Quantitative comparison of some aesthetic factors among rivers
Leopold, Luna Bergere
1969-01-01
It is difficult to evaluate the factors contributing to aesthetic or nonmonetary aspects of a landscape. In contrast, aspects which lend themselves to cost-benefit comparisons are now treated in a routine way. As a result, nonmonetary values are described either in emotion-loaded words or else are mentioned and thence forgotten. The present report is a preliminary attempt to quantify some elements of aesthetic appeal while eliminating, insofar as possible, value judgments or personal preferences. If methods of recording such factors can be developed, the results promise to be a useful, new kind of basic data needed in many planning and decision-making circumstances. Such data would be especially useful when choices must be made among alternative courses of action. Such data would tend to provide a more prominent consideration of the nonmonetary aspects of a landscape. Assignment of quantitative estimates to aesthetic factors leads not so much to ratios of value as to relative rank positions. In fact, value itself tends to carry a connotation of preference, whereas ranking can more easily be used for categorization without attribution of preference and thus it tends to avoid the introduction at too early a stage of differences in preference. Because the Federal Power Commission has been studying an application for a permit to construct one or more additional hydropower dams in the vicinity of Hells Canyon of the Snake River, the localities studied for the present discussion are in that region of Idaho. Hopefully, the data collected will provide some useful information on factors related to nonmonetary values in the region. The present discussion has been kept free of the preference judgments of the writer, and throughout the discussions observations are treated as facts.
DOT/NASA comparative assessment of Brayton engines for guideway vehicle and buses. Volume 1: Summary
NASA Technical Reports Server (NTRS)
1975-01-01
The Department of Transportation requested that the NASA Office of Aeronautics and Space Technology evaluate and assess the potential of several types of gas turbine engines and fuels for the on-board power and propulsion of a future heavy-duty ground transportation system. The purpose of the investigation was threefold: (1) to provide a definition of the potential for turbine engines to minimize pollution, energy consumption, and noise; (2) to provide a useful means of comparison among the engine types based on consistent assumptions and a common analytical approach; and (3) to provide a compendium of comparative performance data that would serve as the technical basis for future planning. Emphasis was on establishing comparative trends rather than on absolute values or a definitive engine selection. The primary value of this study is intended to be the usefulness of the results in providing a quantitative basis for future judgment.
Quantitative Monitoring of Microbial Species during Bioleaching of a Copper Concentrate
Hedrich, Sabrina; Guézennec, Anne-Gwenaëlle; Charron, Mickaël; Schippers, Axel; Joulian, Catherine
2016-01-01
Monitoring of the microbial community in bioleaching processes is essential in order to control process parameters and enhance the leaching efficiency. Suitable methods are, however, limited as they are usually not adapted to bioleaching samples and often no taxon-specific assays are available in the literature for these types of consortia. Therefore, our study focused on the development of novel quantitative real-time PCR (qPCR) assays for the quantification of Acidithiobacillus caldus, Leptospirillum ferriphilum, Sulfobacillus thermosulfidooxidans, and Sulfobacillus benefaciens and comparison of the results with data from other common molecular monitoring methods in order to evaluate their accuracy and specificity. Stirred tank bioreactors for the leaching of copper concentrate, housing a consortium of acidophilic, moderately thermophilic bacteria, relevant in several bioleaching operations, served as a model system. The microbial community analysis via qPCR allowed a precise monitoring of the evolution of total biomass as well as abundance of specific species. Data achieved by the standard fingerprinting methods, terminal restriction fragment length polymorphism (T-RFLP) and capillary electrophoresis single strand conformation polymorphism (CE-SSCP) on the same samples followed the same trend as qPCR data. The main added value of qPCR was, however, to provide quantitative data for each species whereas only relative abundance could be deduced from T-RFLP and CE-SSCP profiles. Additional value was obtained by applying two further quantitative methods which do not require nucleic acid extraction, total cell counting after SYBR Green staining and metal sulfide oxidation activity measurements via microcalorimetry. Overall, these complementary methods allow for an efficient quantitative microbial community monitoring in various bioleaching operations. PMID:28066365
Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung
2013-03-01
The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
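A minimal sketch of the residue-count idea behind the two models compared above: M1 counts Arg and Lys, while M2 additionally counts His. The normalization rule shown (rescaling a Bradford estimate by per-residue dye-binding capacity relative to a reference protein) and the sequences are illustrative assumptions, not the study's exact procedure.

```python
def binding_residue_count(sequence, model="M2"):
    """Count dye-binding residues in a one-letter amino acid sequence."""
    residues = {"M1": "RK", "M2": "RKH"}[model]
    return sum(sequence.count(aa) for aa in residues)

def normalized_concentration(raw_conc, sample_seq, reference_seq, model="M2"):
    """Rescale a Bradford estimate by relative per-residue dye-binding capacity."""
    ref_density = binding_residue_count(reference_seq, model) / len(reference_seq)
    sample_density = binding_residue_count(sample_seq, model) / len(sample_seq)
    return raw_conc * ref_density / sample_density

# Hypothetical sequence fragments, for illustration only.
reference = "MKWVTFISLLLLFSSAYSRGVFRRDTHKSEIAHRFKDLGE"
sample = "MDHARKGLLKKSAHHEERRKMAAK"
print(binding_residue_count(sample, "M1"), binding_residue_count(sample, "M2"))
print(round(normalized_concentration(1.00, sample, reference, "M2"), 3))
```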
49 CFR 232.409 - Inspection and testing of end-of-train devices.
Code of Federal Regulations, 2013 CFR
2013-10-01
... be determined, after charging the train, by comparing the quantitative value of the air pressure displayed on the front unit with the quantitative value of the air pressure displayed on the rear unit or on...
49 CFR 232.409 - Inspection and testing of end-of-train devices.
Code of Federal Regulations, 2010 CFR
2010-10-01
... be determined, after charging the train, by comparing the quantitative value of the air pressure displayed on the front unit with the quantitative value of the air pressure displayed on the rear unit or on...
49 CFR 232.409 - Inspection and testing of end-of-train devices.
Code of Federal Regulations, 2014 CFR
2014-10-01
... be determined, after charging the train, by comparing the quantitative value of the air pressure displayed on the front unit with the quantitative value of the air pressure displayed on the rear unit or on...
49 CFR 232.409 - Inspection and testing of end-of-train devices.
Code of Federal Regulations, 2012 CFR
2012-10-01
... be determined, after charging the train, by comparing the quantitative value of the air pressure displayed on the front unit with the quantitative value of the air pressure displayed on the rear unit or on...
49 CFR 232.409 - Inspection and testing of end-of-train devices.
Code of Federal Regulations, 2011 CFR
2011-10-01
... be determined, after charging the train, by comparing the quantitative value of the air pressure displayed on the front unit with the quantitative value of the air pressure displayed on the rear unit or on...
Generalized PSF modeling for optimized quantitation in PET imaging.
Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman
2017-06-21
Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution-degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF modeling does not offer optimized PET quantitation, and that PSF overestimation may provide enhanced SUV quantitation. Furthermore, generalized PSF modeling may provide a valuable approach for quantitative tasks such as treatment-response assessment and prognostication.
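A minimal sketch of two of the figures of merit mentioned above, computed from ROI statistics of a reconstructed image; here the contrast recovery coefficient is taken as measured contrast divided by true contrast, and roughness as the background coefficient of variation. The synthetic ROI values are illustrative, not the study's phantom data.

```python
import numpy as np

def contrast_recovery(tumor_roi, background_roi, true_contrast):
    """CRC: measured contrast (tumor/background mean ratio - 1) over true contrast."""
    measured = tumor_roi.mean() / background_roi.mean() - 1.0
    return measured / true_contrast

def image_roughness(background_roi):
    """Coefficient of variation over a nominally uniform background region."""
    return background_roi.std() / background_roi.mean()

rng = np.random.default_rng(0)
background = rng.normal(1.0, 0.08, size=5000)   # noisy uniform background ROI
tumor = rng.normal(3.4, 0.25, size=200)         # tumor ROI; true contrast assumed 3.0
print(f"CRC = {contrast_recovery(tumor, background, true_contrast=3.0):.2f}")
print(f"roughness = {image_roughness(background):.3f}")
```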
Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists, and biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is a significant challenge in the early steps of computational modeling, as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes normalizing textual extractions to a standard representation, which may render the extractions less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining into a formal representation that may help in constructing an ontology for ion channel events. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), with the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt at formalizing the quantitative data assertions extracted from the biomedical text in a formal representation, which offers the potential to facilitate the integration of text mining into the ontological workflow, a novel aspect of this study. This work is a case study in which we created a platform that provides formal interaction between ontology development and text mining. We achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
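A minimal sketch of the kind of rule-based extraction described above, using a single regular expression to pull parameter-value-unit triples from electrophysiology text; the pattern, parameter names, and example sentence are illustrative assumptions rather than the published rule set.

```python
import re

# Match a named electrophysiology parameter followed (within a short span) by a value and unit.
QUANTITY = re.compile(
    r"(?P<param>half-activation voltage|time constant|conductance)\D{0,40}?"
    r"(?P<value>-?\d+(?:\.\d+)?)\s*(?P<unit>mV|ms|pS|nS)",
    re.IGNORECASE,
)

text = ("The half-activation voltage of the mutant channel was -32.5 mV, "
        "while the deactivation time constant increased to 14.2 ms.")

for m in QUANTITY.finditer(text):
    print(m.group("param"), "=", m.group("value"), m.group("unit"))
```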
Examining Provider Perspectives within Housing First and Traditional Programs
Henwood, Benjamin F.; Shinn, Marybeth; Tsemberis, Sam; Padgett, Deborah K.
2014-01-01
Pathways’ Housing First represents a radical departure from traditional programs that serve individuals experiencing homelessness and co-occurring psychiatric and substance use disorders. This paper considered two federally funded comparison studies of Pathways’ Housing First and traditional programs to examine whether differences were reflected in the perspectives of frontline providers. Both quantitative analysis of responses to structured questions with close-ended responses and qualitative analysis of open-ended responses to semistructured questions showed that Pathways providers had greater endorsement of consumer values, lesser endorsement of systems values, and greater tolerance for abnormal behavior that did not result in harm to others than their counterparts in traditional programs. Comparing provider perspectives also revealed an “implementation paradox”; traditional providers were inhibited from engaging consumers in treatment and services without housing, whereas HF providers could focus on issues other than securing housing. As programs increasingly adopt a Housing First approach, implementation challenges remain due to an existing workforce habituated to traditional services. PMID:24659925
Golditz, T; Steib, S; Pfeifer, K; Uder, M; Gelse, K; Janka, R; Hennig, F F; Welsch, G H
2014-10-01
The aim of this study was to investigate, using T2-mapping, the impact of functional instability in the ankle joint on the development of early cartilage damage. Ethical approval for this study was provided. Thirty-six volunteers from the university sports program were divided into three groups according to their ankle status: functional ankle instability (FAI, initial ankle sprain with residual instability); ankle sprain Copers (initial sprain, without residual instability); and controls (without a history of ankle injuries). Quantitative T2-mapping magnetic resonance imaging (MRI) was performed at the beginning ('early-unloading') and at the end ('late-unloading') of the MR-examination, with a mean time span of 27 min. Zonal region-of-interest T2-mapping was performed on the talar and tibial cartilage in the deep and superficial layers. The inter-group comparisons of T2-values were analyzed using paired and unpaired t-tests. Statistical analysis of variance was performed. T2-values showed significant to highly significant differences in 11 of 12 regions throughout the groups. In early-unloading, the FAI-group showed a significant increase in quantitative T2-values in the medial, talar regions (P = 0.008, P = 0.027), whereas the Coper-group showed this enhancement in the central-lateral regions (P = 0.05). In particular, the comparison of early-unloading to late-unloading values revealed significantly decreasing T2-values over time laterally and significantly increasing T2-values medially in the FAI-group, which were not present in the Coper- or control-group. Functional instability causes unbalanced loading in the ankle joint, resulting in cartilage alterations as assessed by quantitative T2-mapping. This approach can visualize and localize early cartilage abnormalities, possibly enabling specific treatment options to prevent osteoarthritis in young athletes. Copyright © 2014 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
Research on Customer Value Based on Extension Data Mining
NASA Astrophysics Data System (ADS)
Chun-Yan, Yang; Wei-Hua, Li
Extenics is a new discipline for dealing with contradiction problems using formalized models. Extension data mining (EDM) is a product of combining Extenics with data mining. It explores the acquisition of knowledge based on extension transformations, called extension knowledge (EK), taking advantage of extension methods and data mining technology. EK includes extensible classification knowledge, conductive knowledge, and so on. Extension data mining technology (EDMT) is a new data mining technology that mines EK in databases or data warehouses. Customer value (CV) weighs the importance of a customer relationship for an enterprise, with the enterprise as the subject assessing value and the customers as the objects whose value is assessed. CV varies continually. Mining the changing knowledge of CV in databases using EDMT, including quantitative change knowledge and qualitative change knowledge, can provide a foundation for an enterprise to decide its customer relationship management (CRM) strategy. It can also provide a new idea for studying CV.
Dyal, Jonathan; Akampurira, Andrew; Rhein, Joshua; Morawski, Bozena M; Kiggundu, Reuben; Nabeta, Henry W; Musubire, Abdu K; Bahr, Nathan C; Williams, Darlisha A; Bicanic, Tihana; Larsen, Robert A; Meya, David B; Boulware, David R
2016-05-01
Quantitative cerebrospinal fluid (CSF) cultures provide a measure of disease severity in cryptococcal meningitis. The fungal clearance rate by quantitative cultures has become a primary endpoint for phase II clinical trials. This study determined the inter-assay accuracy of three different quantitative culture methodologies. Among 91 participants with meningitis symptoms in Kampala, Uganda, during August-November 2013, 305 CSF samples were prospectively collected from patients at multiple time points during treatment. Samples were simultaneously cultured by three methods: (1) the St. George's method, using a 100 mcl input volume of CSF with five 1:10 serial dilutions; (2) the AIDS Clinical Trials Group (ACTG) method, using 1000, 100, and 10 mcl input volumes and two 1:100 dilutions with 100 and 10 mcl input volume per dilution on seven agar plates; and (3) a 10 mcl calibrated loop of undiluted and 1:100 diluted CSF (loop). Quantitative culture values did not statistically differ between the St. George and ACTG methods (P = .09) but did for St. George versus the 10 mcl loop (P < .001). Repeated-measures pairwise correlation between any of the methods was high (r ≥ 0.88). For detecting sterility, the ACTG method had the highest negative predictive value of 97% (91% St. George, 60% loop), but the ACTG method had occasional (~10%) difficulties in quantification due to colony clumping. For the CSF clearance rate, the St. George and ACTG methods did not differ overall (mean -0.05 ± 0.07 log10 CFU/mL/day; P = .14) on a group level; however, individual-level clearance varied. The St. George and ACTG quantitative CSF culture methods produced comparable but not identical results. Quantitative cultures can inform treatment management strategies. © The Author 2016. Published by Oxford University Press on behalf of The International Society for Human and Animal Mycology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
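A minimal sketch of the clearance-rate endpoint mentioned above: the rate is the fitted slope of log10 quantitative-culture counts against day of treatment. The colony counts below are hypothetical.

```python
import numpy as np

# Hypothetical CSF quantitative-culture counts for one patient over treatment.
days = np.array([0, 3, 7, 10, 14], dtype=float)
cfu_per_ml = np.array([150000, 40000, 6000, 900, 120], dtype=float)

# Clearance rate = slope of log10(CFU/mL) versus day.
slope, intercept = np.polyfit(days, np.log10(cfu_per_ml), 1)
print(f"clearance rate = {slope:.3f} log10 CFU/mL/day")
```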
Krajbich, Ian; Rangel, Antonio
2011-08-16
How do we make decisions when confronted with several alternatives (e.g., on a supermarket shelf)? Previous work has shown that accumulator models, such as the drift-diffusion model, can provide accurate descriptions of the psychometric data for binary value-based choices, and that the choice process is guided by visual attention. However, the computational processes used to make choices in more complicated situations involving three or more options are unknown. We propose a model of trinary value-based choice that generalizes what is known about binary choice, and test it using an eye-tracking experiment. We find that the model provides a quantitatively accurate description of the relationship between choice, reaction time, and visual fixation data using the same parameters that were estimated in previous work on binary choice. Our findings suggest that the brain uses similar computational processes to make binary and trinary choices.
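A minimal sketch of an attention-weighted accumulator race among three options, in the spirit of the model class described above; the independent-race form, the parameter names (d, theta, sigma, threshold), and their values are simplifying assumptions rather than the fitted relative-value drift-diffusion model.

```python
import random

def simulate_trial(values, d=0.0002, theta=0.3, sigma=0.02, threshold=1.0):
    """Return (chosen option index, reaction time in ms) for one simulated trial."""
    evidence = [0.0] * len(values)
    elapsed_ms = 0
    while True:
        attended = random.randrange(len(values))   # item currently fixated
        dwell = random.randint(100, 400)           # fixation duration in ms
        for _ in range(dwell):
            elapsed_ms += 1
            for i, v in enumerate(values):
                weight = 1.0 if i == attended else theta   # unattended items discounted
                evidence[i] += d * weight * v + random.gauss(0.0, sigma)
            leader = max(range(len(values)), key=lambda j: evidence[j])
            if evidence[leader] >= threshold:
                return leader, elapsed_ms

random.seed(1)
trials = [simulate_trial([4.0, 3.0, 2.0]) for _ in range(200)]
best_rate = sum(1 for choice, _ in trials if choice == 0) / len(trials)
mean_rt = sum(rt for _, rt in trials) / len(trials)
print(f"best option chosen {best_rate:.0%} of the time, mean RT = {mean_rt:.0f} ms")
```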
Jha, Abhinav K; Caffo, Brian; Frey, Eric C
2016-01-01
The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
TU-H-CAMPUS-IeP2-01: Quantitative Evaluation of PROPELLER DWI Using QIBA Diffusion Phantom
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yung, J; Ai, H; Liu, H
Purpose: The purpose of this study is to determine the quantitative variability of apparent diffusion coefficient (ADC) values when varying imaging parameters in a diffusion-weighted (DW) fast spin echo (FSE) sequence with a Periodically Rotated Overlapping ParallEL Lines with Enhanced Reconstruction (PROPELLER) k-space trajectory. Methods: Using a 3T MRI scanner, a NIST-traceable, quantitative magnetic resonance imaging (MRI) diffusion phantom (High Precision Devices, Inc, Boulder, Colorado) consisting of 13 vials filled with various concentrations of the polymer polyvinylpyrrolidone (PVP) in aqueous solution was imaged with a standard Quantitative Imaging Biomarkers Alliance (QIBA) DWI spin echo, echo planar imaging (SE EPI) acquisition. The same phantom was then imaged with a DWI PROPELLER sequence at varying echo train lengths (ETL) of 8, 20, and 32, as well as b-values of 400, 900, and 2000. QIBA DWI phantom analysis software was used to generate ADC maps and create regions of interest (ROIs) for quantitative measurements of each vial. Means and standard deviations of the ROIs were compared. Results: The SE EPI sequence generated ADC values that showed very good agreement with the known ADC values of the phantom (r2 = 0.9995, slope = 1.0061). The ADC values measured from the PROPELLER sequences were inflated, but were highly correlated, with an r2 range from 0.8754 to 0.9880. The PROPELLER sequence with an ETL of 20 and b-values of 0 and 2000 showed the closest agreement (r2 = 0.9034, slope = 0.9880). Conclusion: The DW PROPELLER sequence is promising for quantitative evaluation of ADC values. A drawback of the PROPELLER sequence is the longer acquisition time. The 180° refocusing pulses may also cause the observed increase in ADC values compared to the standard SE EPI DW sequence. However, the FSE sequence offers an advantage with respect to in-plane motion and geometric distortion, which will be investigated in future studies.
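A minimal sketch of the standard two-point ADC calculation underlying maps such as those discussed above, from the mono-exponential model S(b) = S0 * exp(-b * ADC); the signal values are hypothetical.

```python
import numpy as np

def adc_two_point(s_b0, s_b, b):
    """ADC from two measurements: b in s/mm^2, result in mm^2/s."""
    return np.log(s_b0 / s_b) / b

s0, s2000 = 1800.0, 320.0     # hypothetical signals at b = 0 and b = 2000 s/mm^2
print(f"ADC = {adc_two_point(s0, s2000, 2000):.2e} mm^2/s")
```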
Data mining for signals in spontaneous reporting databases: proceed with caution.
Stephenson, Wendy P; Hauben, Manfred
2007-04-01
To provide commentary and points of caution to consider before incorporating data mining as a routine component of any Pharmacovigilance program, and to stimulate further research aimed at better defining the predictive value of these new tools as well as their incremental value as an adjunct to traditional methods of post-marketing surveillance. Commentary includes review of current data mining methodologies employed and their limitations, caveats to consider in the use of spontaneous reporting databases and caution against over-confidence in the results of data mining. Future research should focus on more clearly delineating the limitations of the various quantitative approaches as well as the incremental value that they bring to traditional methods of pharmacovigilance.
Sadaie, Wakako; Harada, Yoshie; Matsuda, Michiyuki; Aoki, Kazuhiro
2014-09-01
Computer-assisted simulation is a promising approach for clarifying complicated signaling networks. However, this approach is currently limited by a deficiency of kinetic parameters determined in living cells. To overcome this problem, we applied fluorescence cross-correlation spectroscopy (FCCS) to measure dissociation constant (Kd) values of signaling molecule complexes in living cells (in vivo Kd). Among the pairs of fluorescent molecules tested, that of monomerized enhanced green fluorescent protein (mEGFP) and HaloTag-tetramethylrhodamine was most suitable for the measurement of in vivo Kd by FCCS. Using this pair, we determined 22 in vivo Kd values of signaling molecule complexes comprising the epidermal growth factor receptor (EGFR)-Ras-extracellular signal-regulated kinase (ERK) mitogen-activated protein (MAP) kinase pathway. With these parameters, we developed a kinetic simulation model of the EGFR-Ras-ERK MAP kinase pathway and uncovered a potential role played by stoichiometry in Shc binding to EGFR during the peak activations of Ras, MEK, and ERK. Intriguingly, most of the in vivo Kd values determined in this study were higher than the in vitro Kd values reported previously, suggesting the significance of competitive bindings inside cells. These in vivo Kd values will provide a sound basis for the quantitative understanding of signal transduction. Copyright © 2014, American Society for Microbiology. All Rights Reserved.
Bossard, N; Descotes, F; Bremond, A G; Bobin, Y; De Saint Hilaire, P; Golfier, F; Awada, A; Mathevet, P M; Berrerd, L; Barbier, Y; Estève, J
2003-11-01
The prognostic value of cathepsin D has been recently recognized, but as with many quantitative tumor markers, its clinical use remains unclear, partly because of methodological issues in defining cut-off values. Guidelines have been proposed for analyzing quantitative prognostic factors, underlining the need to keep data continuous instead of categorizing them. Flexible approaches, parametric and non-parametric, have been proposed in order to improve the knowledge of the functional form relating a continuous factor to the risk. We studied the prognostic value of cathepsin D in a retrospective hospital cohort of 771 patients with breast cancer, and focused our overall survival analysis, based on the Cox regression, on two flexible approaches: smoothing splines and fractional polynomials. We also determined a cut-off value from the maximum likelihood estimate of a threshold model. These different approaches complemented each other for (1) identifying the functional form relating cathepsin D to the risk and obtaining a cut-off value, and (2) optimizing the adjustment for a complex covariate like age at diagnosis in the final multivariate Cox model. We found a significant increase in the death rate, reaching 70% with a doubling of the level of cathepsin D, above the threshold of 37.5 pmol mg⁻¹. The proper prognostic impact of this marker could be confirmed, and a methodology providing appropriate ways to use markers in clinical practice was proposed.
Using multi-species occupancy models in structured decision making on managed lands
Sauer, John R.; Blank, Peter J.; Zipkin, Elise F.; Fallon, Jane E.; Fallon, Frederick W.
2013-01-01
Land managers must balance the needs of a variety of species when manipulating habitats. Structured decision making provides a systematic means of defining choices and choosing among alternative management options; implementation of a structured decision requires quantitative approaches to predicting consequences of management on the relevant species. Multi-species occupancy models provide a convenient framework for making structured decisions when the management objective is focused on a collection of species. These models use replicate survey data that are often collected on managed lands. Occupancy can be modeled for each species as a function of habitat and other environmental features, and Bayesian methods allow for estimation and prediction of collective responses of groups of species to alternative scenarios of habitat management. We provide an example of this approach using data from breeding bird surveys conducted in 2008 at the Patuxent Research Refuge in Laurel, Maryland, evaluating the effects of eliminating meadow and wetland habitats on scrub-successional and woodland-breeding bird species using summed total occupancy of species as an objective function. Removal of meadows and wetlands decreased value of an objective function based on scrub-successional species by 23.3% (95% CI: 20.3–26.5), but caused only a 2% (0.5, 3.5) increase in value of an objective function based on woodland species, documenting differential effects of elimination of meadows and wetlands on these groups of breeding birds. This approach provides a useful quantitative tool for managers interested in structured decision making.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luria, Paolo; Aspinall, Peter A
2003-08-01
The aim of this paper is to describe a new approach to major industrial hazard assessment, which has been recently studied by the authors in conjunction with the Italian Environmental Protection Agency ('ARPAV'). The real opportunity for developing a different approach arose from the need of the Italian EPA to provide the Venice Port Authority with an appropriate estimation of major industrial hazards in Porto Marghera, an industrial estate near Venice (Italy). However, the standard model, quantitative risk analysis (QRA), only provided a list of individual quantitative risk values related to single locations. The experimental model is based on a multi-criteria approach, the Analytic Hierarchy Process, which introduces the use of expert opinions, complementary skills and expertise from different disciplines in conjunction with traditional quantitative analysis. This permitted the generation of quantitative data on risk assessment from a series of qualitative assessments, on the present situation and on three other future scenarios, and the use of this information as indirect quantitative measures, which could be aggregated to obtain the global risk rate. This approach is in line with the main concepts proposed by the latest European directive on Major Hazard Accidents, which recommends increasing the participation of operators, taking the other players into account and, moreover, paying more attention to the concepts of 'urban control', 'subjective risk' (risk perception) and intangible factors (factors not directly quantifiable).
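A minimal sketch of the Analytic Hierarchy Process step referred to above: expert pairwise judgments form a comparison matrix, criterion weights are taken from its principal eigenvector, and a consistency ratio checks the judgments. The criteria and matrix entries are illustrative assumptions.

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical hazard criteria
# (e.g., toxic release vs fire vs explosion), Saaty 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # criterion priority vector

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)          # consistency index
ri = 0.58                                     # random index for n = 3
print("weights:", np.round(weights, 3), "consistency ratio:", round(ci / ri, 3))
```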
Fan, Qianrui; Wang, Wenyu; Hao, Jingcan; He, Awen; Wen, Yan; Guo, Xiong; Wu, Cuiyan; Ning, Yujie; Wang, Xi; Wang, Sen; Zhang, Feng
2017-08-01
Neuroticism is a fundamental personality trait with a significant genetic determinant. To identify novel susceptibility genes for neuroticism, we conducted an integrative analysis of genomic and transcriptomic data from a genome wide association study (GWAS) and an expression quantitative trait locus (eQTL) study. GWAS summary data were derived from published studies of neuroticism involving a total of 170,906 subjects. The eQTL dataset, containing 927,753 eQTLs, was obtained from an eQTL meta-analysis of 5311 samples. Integrative analysis of GWAS and eQTL data was conducted with the summary data-based Mendelian randomization (SMR) analysis software. To identify neuroticism associated gene sets, the SMR analysis results were further subjected to gene set enrichment analysis (GSEA). The gene set annotation dataset (containing 13,311 annotated gene sets) of the GSEA Molecular Signatures Database was used. SMR single gene analysis identified 6 significant genes for neuroticism, including MSRA (p value = 2.27×10⁻¹⁰), MGC57346 (p value = 6.92×10⁻⁷), BLK (p value = 1.01×10⁻⁶), XKR6 (p value = 1.11×10⁻⁶), C17ORF69 (p value = 1.12×10⁻⁶) and KIAA1267 (p value = 4.00×10⁻⁶). Gene set enrichment analysis observed a significant association for the Chr8p23 gene set (false discovery rate = 0.033). Our results provide novel clues for genetic mechanism studies of neuroticism. Copyright © 2017. Published by Elsevier Inc.
A Routine Experimental Protocol for qHNMR Illustrated with Taxol
Pauli, Guido F.; Jaki, Birgit U.; Lankin, David C.
2012-01-01
Quantitative 1H NMR (qHNMR) provides a value-added dimension to the standard spectroscopic data set involved in structure analysis, especially when analyzing bioactive molecules and elucidating new natural products. The qHNMR method can be integrated into any routine qualitative workflow without much additional effort by simply establishing quantitative conditions for the standard solution 1H NMR experiments. Moreover, examination of different chemical lots of taxol and a Taxus brevifolia extract as working examples led to a blueprint for a generic approach to performing a routinely practiced 13C-decoupled qHNMR experiment, and for recognizing its potential and main limitations. The proposed protocol is based on a newly assembled 13C GARP broadband decoupled proton acquisition sequence that reduces spectroscopic complexity by removal of carbon satellites. The method is capable of providing qualitative and quantitative NMR data simultaneously and covers various analytes from pure compounds to complex mixtures such as metabolomes. Due to a routinely achievable dynamic range of 300:1 (0.3%) or better, qHNMR qualifies for applications ranging from reference standards to biologically active compounds to metabolome analysis. Providing a “cookbook” approach to qHNMR, acquisition conditions are described that can be adapted for contemporary NMR spectrometers of all major manufacturers. PMID:17298095
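A minimal sketch of the internal-calibrant relation commonly used in qHNMR purity determination: purity_a = (I_a/I_cal) * (N_cal/N_a) * (MW_a/MW_cal) * (m_cal/m_a) * purity_cal. All integrals, weighed masses, and the calibrant values below are hypothetical and are not data from the study above.

```python
def qhnmr_purity(i_a, n_a, mw_a, m_a, i_cal, n_cal, mw_cal, m_cal, p_cal):
    """Internal-calibrant qHNMR purity from integrals I, proton counts N,
    molar masses MW, weighed masses m, and calibrant purity p_cal."""
    return (i_a / i_cal) * (n_cal / n_a) * (mw_a / mw_cal) * (m_cal / m_a) * p_cal

purity = qhnmr_purity(
    i_a=0.38, n_a=1, mw_a=853.9, m_a=10.2,         # analyte signal (taxol, ~853.9 g/mol; mg weighed)
    i_cal=5.02, n_cal=2, mw_cal=116.1, m_cal=8.7,  # hypothetical calibrant signal
    p_cal=0.999)
print(f"analyte purity = {purity:.1%}")
```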
Monte Carlo modeling of light-tissue interactions in narrow band imaging.
Le, Du V N; Wang, Quanzeng; Ramella-Roman, Jessica C; Pfefer, T Joshua
2013-01-01
Light-tissue interactions that influence vascular contrast enhancement in narrow band imaging (NBI) have not been the subject of extensive theoretical study. In order to elucidate relevant mechanisms in a systematic and quantitative manner, we have developed and validated a Monte Carlo model of NBI and used it to study the effect of device and tissue parameters, specifically, imaging wavelength (415 versus 540 nm) and vessel diameter and depth. Simulations provided quantitative predictions of contrast, including up to 125% improvement in small, superficial vessel contrast for 415 nm over 540 nm. Our findings indicated that absorption, rather than scattering (the mechanism often cited in prior studies), was the dominant factor behind spectral variations in vessel depth-selectivity. Narrow-band images of a tissue-simulating phantom showed good agreement in terms of trends and quantitative values. Numerical modeling represents a powerful tool for elucidating the factors that affect the performance of spectral imaging approaches such as NBI.
Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping
2018-05-22
Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods, such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly used data visualization methods, including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. Availability: PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
NASA Technical Reports Server (NTRS)
Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.
1999-01-01
Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.
Quantitative Ultrasound Assessment of Duchenne Muscular Dystrophy Using Edge Detection Analysis.
Koppaka, Sisir; Shklyar, Irina; Rutkove, Seward B; Darras, Basil T; Anthony, Brian W; Zaidman, Craig M; Wu, Jim S
2016-09-01
The purpose of this study was to investigate the ability of quantitative ultrasound (US) using edge detection analysis to assess patients with Duchenne muscular dystrophy (DMD). After Institutional Review Board approval, US examinations with fixed technical parameters were performed unilaterally in 6 muscles (biceps, deltoid, wrist flexors, quadriceps, medial gastrocnemius, and tibialis anterior) in 19 boys with DMD and 21 age-matched control participants. The muscles of interest were outlined by a tracing tool, and the upper third of the muscle was used for analysis. Edge detection values for each muscle were quantified by the Canny edge detection algorithm and then normalized to the number of edge pixels in the muscle region. The edge detection values were extracted at multiple sensitivity thresholds (0.01-0.99) to determine the optimal threshold for distinguishing DMD from normal. Area under the receiver operating characteristic (ROC) curve values were generated for each muscle and averaged across the 6 muscles. The average age in the DMD group was 8.8 years (range, 3.0-14.3 years), and the average age in the control group was 8.7 years (range, 3.4-13.5 years). For edge detection, a Canny threshold of 0.05 provided the best discrimination between DMD and normal (area under the curve, 0.96; 95% confidence interval, 0.84-1.00). According to a Mann-Whitney test, edge detection values were significantly different between DMD and controls (P < .0001). Quantitative US imaging using edge detection can distinguish patients with DMD from healthy controls at low Canny thresholds, at which discrimination of small structures is best. Edge detection by itself or in combination with other tests can potentially serve as a useful biomarker of disease progression and effectiveness of therapy in muscle disorders.
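A minimal sketch of an edge-detection metric in the spirit of the study above: Canny edges are computed on a synthetic frame and the edge-pixel count is normalized by the traced ROI area. scikit-image's Canny is used as a stand-in, so its thresholds are not directly comparable to the MATLAB-style sensitivity thresholds reported, and the image and ROI are illustrative.

```python
import numpy as np
from skimage import feature

rng = np.random.default_rng(1)
image = rng.normal(0.4, 0.05, size=(256, 256))      # stand-in for an ultrasound frame
image[100:140, :] += 0.3                             # a bright band to create edges

roi = np.zeros_like(image, dtype=bool)
roi[60:200, 40:220] = True                           # traced muscle region

edges = feature.canny(image, sigma=2.0, low_threshold=0.02, high_threshold=0.05)
edge_fraction = (edges & roi).sum() / roi.sum()      # edge pixels per ROI pixel
print(f"normalized edge value = {edge_fraction:.4f}")
```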
Allahdina, Ali M; Stetson, Paul F; Vitale, Susan; Wong, Wai T; Chew, Emily Y; Ferris, Fredrick L; Sieving, Paul A; Cukras, Catherine
2018-04-01
As optical coherence tomography (OCT) minimum intensity (MI) analysis provides a quantitative assessment of changes in the outer nuclear layer (ONL), we evaluated the ability of OCT-MI analysis to detect hydroxychloroquine toxicity. Fifty-seven predominantly female participants (91.2% female; mean age, 55.7 ± 10.4 years; mean time on hydroxychloroquine, 15.0 ± 7.5 years) were enrolled in a case-control study and categorized into affected (i.e., with toxicity, n = 19) and unaffected (n = 38) groups using objective multifocal electroretinographic (mfERG) criteria. Spectral-domain OCT scans of the macula were analyzed and OCT-MI values quantitated for each subfield of the Early Treatment Diabetic Retinopathy Study (ETDRS) grid. A two-sample U-test and a cross-validation approach were used to assess the sensitivity and specificity of toxicity detection according to OCT-MI criteria. The medians of the OCT-MI values in all nine of the ETDRS subfields were significantly elevated in the affected group relative to the unaffected group (P < 0.005 for all comparisons), with the largest difference found for the inner inferior subfield (P < 0.0001). The receiver operating characteristic analysis of median MI values of the inner inferior subfields showed high sensitivity and high specificity in the detection of toxicity with area under the curve = 0.99. Retinal changes secondary to hydroxychloroquine toxicity result in increased OCT reflectivity in the ONL that can be detected and quantitated using OCT-MI analysis. Analysis of OCT-MI values demonstrates high sensitivity and specificity for detecting the presence of hydroxychloroquine toxicity in this cohort and may contribute additionally to current screening practices.
NASA Astrophysics Data System (ADS)
Dannemiller, Karen C.; Lang-Yona, Naama; Yamamoto, Naomichi; Rudich, Yinon; Peccia, Jordan
2014-02-01
We examined fungal communities associated with the PM10 mass of Rehovot, Israel outdoor air samples collected in the spring and fall seasons. Fungal communities were described by 454 pyrosequencing of the internal transcribed spacer (ITS) region of the fungal ribosomal RNA encoding gene. To allow for a more quantitative comparison of fungal exposure in humans, the relative abundance values of specific taxa were transformed to absolute concentrations through multiplying these values by the sample's total fungal spore concentration (derived from universal fungal qPCR). Next, the sequencing-based absolute concentrations for Alternaria alternata, Cladosporium cladosporioides, Epicoccum nigrum, and Penicillium/Aspergillus spp. were compared to taxon-specific qPCR concentrations for A. alternata, C. cladosporioides, E. nigrum, and Penicillium/Aspergillus spp. derived from the same spring and fall aerosol samples. Results of these comparisons showed that the absolute concentration values generated from pyrosequencing were strongly associated with the concentration values derived from taxon-specific qPCR (for all four species, p < 0.005, all R > 0.70). The correlation coefficients were greater for species present in higher concentrations. Our microbial aerosol population analyses demonstrated that fungal diversity (number of fungal operational taxonomic units) was higher in the spring compared to the fall (p = 0.02), and principal coordinate analysis showed distinct seasonal differences in taxa distribution (ANOSIM p = 0.004). Among genera containing allergenic and/or pathogenic species, the absolute concentrations of Alternaria, Aspergillus, Fusarium, and Cladosporium were greater in the fall, while Cryptococcus, Penicillium, and Ulocladium concentrations were greater in the spring. The transformation of pyrosequencing fungal population relative abundance data to absolute concentrations can improve next-generation DNA sequencing-based quantitative aerosol exposure assessment.
FT-IR imaging for quantitative determination of liver fat content in non-alcoholic fatty liver.
Kochan, K; Maslak, E; Chlopicki, S; Baranska, M
2015-08-07
In this work we apply FT-IR imaging of large areas of liver tissue cross-section samples (~5 cm × 5 cm) for quantitative assessment of steatosis in a murine model of Non-Alcoholic Fatty Liver Disease (NAFLD). We quantified the area of liver tissue occupied by lipid droplets (LDs) by FT-IR imaging and, for comparison, by Oil Red O (ORO) staining. Two alternative FT-IR based approaches are presented. The first, straightforward method was based on average spectra from tissues and provided values of the fat content by using a PLS regression model together with the reference method. The second, chemometric-based method enabled us to determine the fat content independently of the reference method by means of k-means cluster (KMC) analysis. In summary, FT-IR images of large liver sections may prove useful for quantifying liver steatosis without the need for tissue staining.
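A minimal sketch of the chemometric route mentioned above: pixel spectra are clustered with k-means and the area fraction of the lipid-dominated cluster is reported. The synthetic spectra and the rule used to label the lipid cluster (highest mean intensity in an assumed lipid band) are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_pixels, n_wavenumbers = 2000, 60
spectra = rng.normal(0.2, 0.02, size=(n_pixels, n_wavenumbers))   # synthetic pixel spectra
lipid_pixels = rng.random(n_pixels) < 0.15                         # ~15% steatotic area
spectra[lipid_pixels, 20:30] += 0.5                                # stronger assumed lipid band

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(spectra)
band_mean = [spectra[labels == k, 20:30].mean() for k in range(3)]
lipid_cluster = int(np.argmax(band_mean))
fat_fraction = (labels == lipid_cluster).mean()
print(f"estimated fat area fraction = {fat_fraction:.1%}")
```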
Calibrant-Free Analyte Quantitation via a Variable Velocity Flow Cell.
Beck, Jason G; Skuratovsky, Aleksander; Granger, Michael C; Porter, Marc D
2017-01-17
In this paper, we describe a novel method for analyte quantitation that does not rely on calibrants, internal standards, or calibration curves but, rather, leverages the relationship between disparate and predictable surface-directed analyte flux to an array of sensing addresses and a measured resultant signal. To reduce this concept to practice, we fabricated two flow cells such that the mean linear fluid velocity, U, was varied systematically over an array of electrodes positioned along the flow axis. This resulted in a predictable variation of the address-directed flux of a redox analyte, ferrocenedimethanol (FDM). The resultant limiting currents measured at a series of these electrodes, and accurately described by a convective-diffusive transport model, provided a means to calculate an "unknown" concentration without the use of calibrants, internal standards, or a calibration curve. Furthermore, the experiment and concentration calculation only takes minutes to perform. Deviation in calculated FDM concentrations from true values was minimized to less than 0.5% when empirically derived values of U were employed.
Yu, S; Gao, S; Gan, Y; Zhang, Y; Ruan, X; Wang, Y; Yang, L; Shi, J
2016-04-01
Quantitative structure-property relationship modelling can be a valuable alternative method to replace or reduce experimental testing. In particular, some endpoints such as octanol-water (KOW) and organic carbon-water (KOC) partition coefficients of polychlorinated biphenyls (PCBs) are easier to predict and various models have been already developed. In this paper, two different methods, which are multiple linear regression based on the descriptors generated using Dragon software and hologram quantitative structure-activity relationships, were employed to predict suspended particulate matter (SPM) derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of 209 PCBs. The predictive ability of the derived models was validated using a test set. The performances of all these models were compared with EPI Suite™ software. The results indicated that the proposed models were robust and satisfactory, and could provide feasible and promising tools for the rapid assessment of the SPM derived log KOC and generator column, shake flask and slow stirring method derived log KOW values of PCBs.
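A minimal sketch of the multiple-linear-regression route described above, with random stand-ins for Dragon descriptors regressed against synthetic log KOW values and validated on a held-out test set; the descriptor count, coefficients, and noise level are arbitrary.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
X = rng.normal(size=(209, 4))                        # 4 hypothetical descriptors for 209 congeners
true_coef = np.array([0.8, -0.3, 0.5, 0.1])
y = 6.0 + X @ true_coef + rng.normal(0, 0.2, 209)    # synthetic log KOW values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LinearRegression().fit(X_tr, y_tr)
print(f"external R2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```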
NASA Astrophysics Data System (ADS)
Fallica, Roberto; Stowers, Jason K.; Grenville, Andrew; Frommhold, Andreas; Robinson, Alex P. G.; Ekinci, Yasin
2016-07-01
The dynamic absorption coefficients of several chemically amplified resists (CAR) and non-CAR extreme ultraviolet (EUV) photoresists are measured experimentally using a specifically developed setup in transmission mode at the x-ray interference lithography beamline of the Swiss Light Source. The absorption coefficient α and the Dill parameters ABC were measured with unprecedented accuracy. In general, the α values of the resists match very closely the theoretical values calculated from elemental densities and absorption coefficients, although exceptions are observed. In addition, from the direct measurements of the absorption coefficients and dose-to-clear values, we introduce a new figure of merit, called chemical sensitivity, to account for all the postabsorption chemical reactions ongoing in the resist; it also predicts a quantitative clearing volume and clearing radius due to photon absorption in the resist. These parameters may help provide deeper insight into the underlying EUV exposure mechanisms; the concepts of clearing volume and clearing radius are then defined and quantitatively calculated.
Study on Quality Standard of Processed Curcuma Longa Radix
Zhao, Yongfeng; Quan, Liang; Zhou, Haiting; Cao, Dong; Li, Wenbing; Yang, Zhuo
2017-01-01
To control the quality of Curcuma Longa Radix by establishing quality standards, this paper increased the contents of extract and volatile oil determination. Meanwhile, the curcumin was selected as the internal marker, and the relative correlation factors (RCFs) of demethoxycurcumin and bisdemethoxycurcumin were established by high performance liquid chromatography (HPLC). The contents of multicomponents were calculated based on their RCFs. The rationality and feasibility of the methods were evaluated by comparison of the quantitative results between external standard method (ESM) and quantitative analysis of multicomponents by single-marker (QAMS). Ethanol extracts ranged from 9.749 to 15.644% and the mean value was 13.473%. The volatile oil ranged from 0.45 to 0.90 mL/100 g and the mean value was 0.66 mL/100 g. This method was accurate and feasible and could provide a reference for further comprehensive and effective control of the quality standard of Curcuma Longa Radix and its processed products. PMID:29375640
High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum
Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.
2015-01-01
Objective: To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods: Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results: SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion: Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581
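A minimal sketch of the mono-exponential fit behind quantitative T2* values such as those discussed above: S(TE) = S0 * exp(-TE / T2*), fit across a series of echo times. The echo times and signals are hypothetical UTE measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exp(te, s0, t2star):
    """Mono-exponential signal decay with echo time."""
    return s0 * np.exp(-te / t2star)

te_ms = np.array([0.05, 0.5, 2.0, 5.0, 10.0, 15.0])          # echo times in ms
signal = np.array([980.0, 890.0, 640.0, 370.0, 150.0, 60.0])  # hypothetical signals

(p_s0, p_t2star), _ = curve_fit(mono_exp, te_ms, signal, p0=(1000.0, 5.0))
print(f"T2* = {p_t2star:.1f} ms")
```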
Gao, Jia-Suo; Tong, Xu-Peng; Chang, Yi-Qun; He, Yu-Xuan; Mei, Yu-Dan; Tan, Pei-Hong; Guo, Jia-Liang; Liao, Guo-Chao; Xiao, Gao-Keng; Chen, Wei-Min; Zhou, Shu-Feng; Sun, Ping-Hua
2015-01-01
Factor IXa (FIXa), a blood coagulation factor, is specifically inhibited at the initiation stage of the coagulation cascade, promising an excellent approach for developing selective and safe anticoagulants. Eighty-four amidinobenzothiophene antithrombotic derivatives targeting FIXa were selected to establish three-dimensional quantitative structure-activity relationship (3D-QSAR) and three-dimensional quantitative structure-selectivity relationship (3D-QSSR) models using comparative molecular field analysis and comparative similarity indices analysis methods. Internal and external cross-validation techniques were investigated as well as region focusing and bootstrapping. The satisfactory q2 values of 0.753 and 0.770, and r2 values of 0.940 and 0.965 for 3D-QSAR and 3D-QSSR, respectively, indicated that the models are able to predict both the inhibitory activity and the selectivity for FIXa against Factor Xa, the activated status of Factor X. This work revealed that the steric, hydrophobic, and H-bond factors should appropriately be taken into account in future rational design, especially the modifications at the 2'-position of the benzene and the 6-position of the benzothiophene in the R group, providing helpful clues to design more active and selective FIXa inhibitors for the treatment of thrombosis. On the basis of the three-dimensional quantitative structure-property relationships, 16 new potent molecules have been designed and are predicted to be more active and selective than Compound 33, which has the best activity as reported in the literature.
Hoferer, Marc; Braun, Anne; Sting, Reinhard
2017-07-01
Standards are pivotal for pathogen quantification by real-time PCR (qPCR); however, the creation of a complete and universally applicable virus particle standard is challenging. In the present study a procedure based on purification of bovine herpes virus type 1 (BoHV-1) and subsequent quantification by transmission electron microscopy (TEM) is described. Accompanying quantitative quality controls of the TEM preparation procedure using qPCR yielded recovery rates of more than 95% of the BoHV-1 virus particles on the grid used for virus counting, which was attributed to pre-treatment of the grid with 5% bovine albumin. To compare the value of the new virus particle standard for use in qPCR, virus counter based quantification and established pure DNA standards represented by a plasmid and an oligonucleotide were included. It could be shown that the numbers of virus particles, plasmid and oligonucleotide equivalents were within one log10 range determined on the basis of standard curves indicating that different approaches provide comparable quantitative values. However, only virus particles represent a complete, universally applicable quantitative virus standard that meets the high requirements of an RNA and DNA virus gold standard. In contrast, standards based on pure DNA have to be considered as sub-standard due to limited applications. Copyright © 2017 International Alliance for Biological Standardization. Published by Elsevier Ltd. All rights reserved.
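The comparison of virus particle, plasmid and oligonucleotide standards above rests on ordinary qPCR standard curves. The sketch below shows the generic calculation (quantification cycle regressed against log10 copy number, with amplification efficiency derived from the slope); all dilution values and Cq numbers are hypothetical and not taken from the study.

```python
import numpy as np

# Hypothetical dilution series of a quantitative standard: copies per reaction
# and the measured quantification cycles (Cq).
copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
cq     = np.array([17.1, 20.5, 23.9, 27.2, 30.6])

# Linear standard curve: Cq = slope * log10(copies) + intercept
slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 corresponds to 100% efficiency

def copies_from_cq(sample_cq):
    """Interpolate an unknown sample on the standard curve."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"slope = {slope:.3f}, efficiency = {efficiency:.1%}")
print(f"sample at Cq 25.0 -> {copies_from_cq(25.0):.0f} copies/reaction")
```

Running the same unknowns against curves built from different standards (virus particles, plasmid, oligonucleotide) is how the one-log agreement reported above would be assessed.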
Yang, Qian; Manicke, Nicholas E.; Wang, He; Petucci, Christopher; Cooks, R. Graham
2013-01-01
A simple protocol for rapid quantitation of acylcarnitines in serum and whole blood has been developed using paper spray mass spectrometry. Dried serum and whole blood containing a mixture of ten acylcarnitines at various concentrations were analyzed as spots from paper directly without any sample pretreatment, separation, or derivatization. The composition of the spray solvent was found to be a critical factor: for serum samples, spray solvent of methanol/water/formic acid (80:20:0.1) gave the best signal intensity while for blood samples which contain more matrix components, acetonitrile/water (90:10) was a much more suitable spray solvent. For the paper type and size used, 0.5 μL of sample provided an optimal signal for both serum and whole blood samples. For quantitative profiling, the limits of quantitation obtained from both serum and blood were much lower than the clinically validated cutoff values for diagnosis of fatty acid oxidation disorders in newborn screening. Linearity (R2>0.95) and reproducibility (RSD ~10 %) were achieved in the concentration ranges from 100 nM to 5 μM for the C2 acylcarnitine, and for other acylcarnitines, these values were from 10 to 500 nM. Acylcarnitine profiles offer an effective demonstration of the fact that paper spray mass spectrometry is an appropriate, simple, rapid method with high sensitivity and high reproducibility applicable to newborn screening tests. PMID:22760507
Spatially resolved hazard and exposure assessments: an example of lead in soil at Lavrion, Greece.
Tristán, E; Demetriades, A; Ramsey, M H; Rosenbaum, M S; Stavrakis, P; Thornton, I; Vassiliades, E; Vergou, K
2000-01-01
Spatially resolved hazard assessment (SRHA) and spatially resolved exposure assessment (SREA) are methodologies that have been devised for assessing child exposure to soil containing environmental pollutants. These are based on either a quantitative or a semiquantitative approach. The feasibility of the methodologies has been demonstrated in a study assessing child exposure to Pb accessible in soil at the town of Lavrion in Greece. Using a quantitative approach, both measured and kriged concentrations of Pb in soil are compared with an "established" statutory threshold value. The probabilistic approach gives a refined classification of the contaminated land, since it takes into consideration the uncertainty in both the actual measurement and estimated kriged values. Two exposure assessment models (i.e., IEUBK and HESP) are used as the basis of the quantitative SREA methodologies. The significant correlation between the blood-Pb predictions, using the IEUBK model, and measured concentrations provides a partial validation of the method, because it allows for the uncertainty in the measurements and the lack of some site-specific measurements. The semiquantitative applications of SRHA and SREA incorporate both qualitative information (e.g., land use and dustiness of waste) and quantitative information (e.g., distance from wastes and distance from industry). The significant correlation between the results of these assessments and the measured blood-Pb levels confirms the robust nature of this approach. Successful application of these methodologies could reduce the cost of the assessment and allow areas to be prioritized for further investigation, remediation, or risk management.
Gao, Jia-Suo; Tong, Xu-Peng; Chang, Yi-Qun; He, Yu-Xuan; Mei, Yu-Dan; Tan, Pei-Hong; Guo, Jia-Liang; Liao, Guo-Chao; Xiao, Gao-Keng; Chen, Wei-Min; Zhou, Shu-Feng; Sun, Ping-Hua
2015-01-01
Factor IXa (FIXa), a blood coagulation factor, is specifically inhibited at the initiation stage of the coagulation cascade, promising an excellent approach for developing selective and safe anticoagulants. Eighty-four amidinobenzothiophene antithrombotic derivatives targeting FIXa were selected to establish three-dimensional quantitative structure–activity relationship (3D-QSAR) and three-dimensional quantitative structure–selectivity relationship (3D-QSSR) models using comparative molecular field analysis and comparative similarity indices analysis methods. Internal and external cross-validation techniques were investigated as well as region focusing and bootstrapping. The satisfactory q2 values of 0.753 and 0.770, and r2 values of 0.940 and 0.965 for 3D-QSAR and 3D-QSSR, respectively, indicated that the models are available to predict both the inhibitory activity and selectivity on FIXa against Factor Xa, the activated status of Factor X. This work revealed that the steric, hydrophobic, and H-bond factors should appropriately be taken into account in future rational design, especially the modifications at the 2′-position of the benzene and the 6-position of the benzothiophene in the R group, providing helpful clues to design more active and selective FIXa inhibitors for the treatment of thrombosis. On the basis of the three-dimensional quantitative structure–property relationships, 16 new potent molecules have been designed and are predicted to be more active and selective than Compound 33, which has the best activity as reported in the literature. PMID:25848211
Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia
2015-11-03
Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100%, 96.6%, 89.5%, 100%, and 97.4%, respectively. Reproducibility of the test was 99.2%, and interobserver variation was 8%, with a false positive rate of 3.4%. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.
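The performance figures reported above follow from the usual 2x2 confusion-matrix definitions. The sketch below computes sensitivity, specificity, predictive values and accuracy; the counts used are hypothetical, since the abstract does not give the raw 2x2 table.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance measures from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)                       # positive predictive value
    npv = tn / (tn + fn)                       # negative predictive value
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return sensitivity, specificity, ppv, npv, accuracy

# Hypothetical counts for illustration only (the study enrolled n = 305 men).
sens, spec, ppv, npv, acc = diagnostic_metrics(tp=51, fp=6, fn=0, tn=248)
print(f"sens={sens:.1%} spec={spec:.1%} ppv={ppv:.1%} npv={npv:.1%} acc={acc:.1%}")
```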
NASA Astrophysics Data System (ADS)
Rothman, D. S.; Siraj, A.; Hughes, B.
2013-12-01
The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios are the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly by the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.
The use of immunohistochemistry for biomarker assessment--can it compete with other technologies?
Dunstan, Robert W; Wharton, Keith A; Quigley, Catherine; Lowe, Amanda
2011-10-01
A morphology-based assay such as immunohistochemistry (IHC) should be a highly effective means to define the expression of a target molecule of interest, especially if the target is a protein. However, over the past decade, IHC as a platform for biomarkers has been challenged by more quantitative molecular assays with reference standards but that lack morphologic context. For IHC to be considered a "top-tier" biomarker assay, it must provide truly quantitative data on par with non-morphologic assays, which means it needs to be run with reference standards. However, creating such standards for IHC will require optimizing all aspects of tissue collection, fixation, section thickness, morphologic criteria for assessment, staining processes, digitization of images, and image analysis. This will also require anatomic pathology to evolve from a discipline that is descriptive to one that is quantitative. A major step in this transformation will be replacing traditional ocular microscopes with computer monitors and whole slide images, for without digitization, there can be no accurate quantitation; without quantitation, there can be no standardization; and without standardization, the value of morphology-based IHC assays will not be realized.
Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A
2016-07-01
Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.
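The automated phenotyping described above was implemented as an ImageJ batch macro; a rough Python analogue of the two core quantities (percent leaf area covered by lesions and a pycnidia count within lesions) is sketched below using simple intensity thresholds. The threshold values and the synthetic "leaf" are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from scipy.ndimage import label

def stb_metrics(gray_leaf, leaf_mask, lesion_thresh=170, pycnidium_thresh=80):
    """Percent leaf area covered by lesions and a crude pycnidia count.

    gray_leaf : 2D uint8 array (grayscale flatbed scan)
    leaf_mask : boolean array marking leaf pixels
    Thresholds are illustrative, not the published macro's values.
    """
    lesion_mask = (gray_leaf < lesion_thresh) & leaf_mask          # necrotic tissue
    pycnidia_mask = (gray_leaf < pycnidium_thresh) & lesion_mask   # dark fruiting bodies

    pct_lesion_area = 100.0 * lesion_mask.sum() / leaf_mask.sum()
    _, n_pycnidia = label(pycnidia_mask)        # count connected dark blobs
    return pct_lesion_area, n_pycnidia

# Example with a synthetic 'leaf': mid-gray background, one lesion with a dark spot.
leaf = np.full((200, 600), 200, dtype=np.uint8)
leaf[80:120, 100:300] = 150                      # lesion
leaf[95:100, 150:155] = 40                       # a pycnidium-like spot
mask = np.ones_like(leaf, dtype=bool)
print(stb_metrics(leaf, mask))                   # (~6.7 % lesion area, 1 blob)
```

Dividing the pycnidia count by the lesion or leaf area then gives the density measures used as quantitative resistance traits.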
Quantitative Diagnosis of Continuous-Valued, Steady-State Systems
NASA Technical Reports Server (NTRS)
Rouquette, N.
1995-01-01
Quantitative diagnosis involves numerically estimating the values of unobservable parameters that best explain the observed parameter values. We consider quantitative diagnosis for continuous, lumped- parameter, steady-state physical systems because such models are easy to construct and the diagnosis problem is considerably simpler than that for corresponding dynamic models. To further tackle the difficulties of numerically inverting a simulation model to compute a diagnosis, we propose to decompose a physical system model in terms of feedback loops. This decomposition reduces the dimension of the problem and consequently decreases the diagnosis search space. We illustrate this approach on a model of thermal control system studied in earlier research.
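Quantitative diagnosis in this sense, estimating the unobservable parameters that best reproduce observed steady-state values, can be sketched as a small nonlinear least-squares inversion. The two-node thermal model below is a toy example chosen for illustration, not the system studied in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

def steady_state_temps(g12, g2a, q1=50.0, t_amb=290.0):
    """Toy steady-state model: node 1 heated by q1 [W], coupled to node 2 by
    conductance g12 [W/K]; node 2 coupled to ambient by g2a [W/K].
    Returns (T1, T2) from the steady-state heat balance."""
    t2 = t_amb + q1 / g2a
    t1 = t2 + q1 / g12
    return np.array([t1, t2])

# Observed temperatures (synthetic: generated from "true" conductances plus noise).
t_obs = steady_state_temps(g12=2.0, g2a=5.0) + np.array([0.3, -0.2])

# Diagnosis: recover the unobservable conductances from the observed temperatures.
def residuals(params):
    g12, g2a = params
    return steady_state_temps(g12, g2a) - t_obs

fit = least_squares(residuals, x0=[1.0, 1.0], bounds=([0.1, 0.1], [50.0, 50.0]))
print("estimated conductances:", fit.x)
```

Decomposing a larger model into feedback loops, as the paper proposes, amounts to solving several such small inversions instead of one high-dimensional one.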
Yao, Shun; Zhong, Yi; Xu, Yuhao; Qin, Jiasheng; Zhang, Ningning; Zhu, Xiaolan; Li, Yuefeng
2017-01-01
Previous studies have detected abnormal serum ferritin levels in patients with depression; however, the results have been inconsistent. This study used quantitative susceptibility mapping (QSM) for the first time to examine brain iron concentration in depressed patients and evaluated whether it is related to severity. We included three groups of age- and gender-matched participants: 30 patients with mild-moderate depression (MD), 14 patients with major depression disorder (MDD) and 20 control subjects. All participants underwent MR scans with a 3D gradient-echo sequence reconstructing for QSM and performed the 17-item Hamilton Depression Rating Scale (HDRS) test. In MDD, the susceptibility value in the bilateral putamen was significantly increased compared with MD or control subjects. In addition, a significant difference was also observed in the left thalamus in MDD patients compared with controls. However, the susceptibility values did not differ between MD patients and controls. The susceptibility values positively correlated with the severity of depression as indicated by the HDRS scores. Our results provide evidence that brain iron deposition may be associated with depression and may even be a biomarker for investigating the pathophysiological mechanism of depression. PMID:28900391
Huang, Weidong; Reinholz, Monica; Weidler, Jodi; Lie, Yolanda; Paquet, Agnes; Whitcomb, Jeannette; Lingle, Wilma; Jenkins, Robert B; Chen, Beiyun; Larson, Jeffrey S; Tan, Yuping; Sherwood, Thomas; Bates, Michael; Perez, Edith A
2010-08-01
The accuracy and reliability of immunohistochemical analysis and in situ hybridization for the assessment of HER2 status remains a subject of debate. We developed a novel assay (HERmark Breast Cancer Assay, Monogram Biosciences, South San Francisco, CA) that provides precise quantification of total HER2 protein expression (H2T) and HER2 homodimers (H2D) in formalin-fixed, paraffin-embedded tissue specimens. H2T and H2D results of 237 breast cancers were compared with those of immunohistochemical studies and fluorescence in situ hybridization (FISH) centrally performed at the Mayo Clinic, Rochester, MN. H2T described a continuum across a wide dynamic range ( approximately 2.5 log). Excluding the equivocal cases, HERmark showed 98% concordance with immunohistochemical studies for positive and negative assay values. For the 94 immunohistochemically equivocal cases, 67% and 39% concordance values were observed between HERmark and FISH for positive and negative assay values, respectively. Polysomy 17 in the absence of HER2 gene amplification did not result in HER2 overexpression as evaluated quantitatively using the HERmark assay.
Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.
Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L
2017-10-01
The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies however primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic encephalopathy. Copyright © 2017 Elsevier B.V. All rights reserved.
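The "state space velocity" above quantifies how fast the EEG spectral state moves between consecutive epochs. A generic version of this quantity (the step-to-step Euclidean distance between spectral feature vectors) is sketched below; the study's actual state space model is more elaborate, and the band-power features and epoch length here are illustrative assumptions only.

```python
import numpy as np
from scipy.signal import welch

def spectral_state(epoch, fs=250.0, bands=((1, 4), (4, 8), (8, 13), (13, 30))):
    """Log band power in classical EEG bands for one epoch of a single channel."""
    f, pxx = welch(epoch, fs=fs, nperseg=int(2 * fs))
    return np.array([np.log(pxx[(f >= lo) & (f < hi)].mean()) for lo, hi in bands])

def state_space_velocity(signal, fs=250.0, epoch_s=10.0):
    """Mean Euclidean distance between spectral states of consecutive epochs."""
    n = int(epoch_s * fs)
    states = [spectral_state(signal[i:i + n], fs)
              for i in range(0, len(signal) - n + 1, n)]
    steps = np.diff(np.array(states), axis=0)
    return np.linalg.norm(steps, axis=1).mean()

# Example: a monotonous signal yields low velocity, a variable one yields higher velocity.
rng = np.random.default_rng(0)
t = np.arange(0, 60, 1 / 250.0)
flat = np.sin(2 * np.pi * 10 * t) + 0.01 * rng.standard_normal(t.size)
noisy = rng.standard_normal(t.size)
print(state_space_velocity(flat), state_space_velocity(noisy))
```

Low spectral variability of this kind is what the study associates with poor outcome in temporal channels.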
Luchins, Daniel
2012-01-01
The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/ expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.
Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero
2013-05-06
We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summated to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of which 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved the maximum scores for participation as an observer or assistant. Medical students reported that use of an electronic portfolio that provided quantitative feedback on their progress was useful when the number and complexity of targets were appropriate, but not when the portfolio offered only formative evaluations based on reflection. Students felt that use of the e-Portfolio guided their learning process by indicating knowledge gaps to themselves and teachers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kopyra, Janina; Abdoul-Carime, Hassan, E-mail: hcarime@ipnl.in2p3.fr
Providing experimental values for absolute Dissociative Electron Attachment (DEA) cross sections for nucleobases at realistic biological conditions is a considerable challenge. In this work, we provide the temperature dependence of the cross section, σ, of the dehydrogenated thymine anion (T − H)⁻ produced via DEA. Within the 393-443 K temperature range, it is observed that σ varies by one order of magnitude. By extrapolating to a temperature of 313 K, the relative DEA cross section for the production of the dehydrogenated thymine anion at an incident energy of 1 eV decreases by 2 orders of magnitude and the absolute value reaches approximately 6 × 10⁻¹⁹ cm². These quantitative measurements provide a benchmark for theoretical prediction and also a contribution to a more accurate description of the effects of ionizing radiation on molecular medium.
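If one assumes a simple Arrhenius-like temperature dependence, σ(T) ∝ exp(−Ea/kBT), the reported one-decade change across 393-443 K and the roughly two-decade drop extrapolated to 313 K can be reproduced as below. This functional form is an assumption made here for illustration, not necessarily the analysis used by the authors.

```python
import numpy as np

K_B = 8.617e-5  # Boltzmann constant, eV/K

# Reported: sigma changes by one order of magnitude between 393 K and 443 K.
t_low, t_high = 393.0, 443.0
decades_measured = 1.0

# Effective activation energy from sigma(T) = A * exp(-Ea / (kB * T)).
ea = decades_measured * np.log(10) * K_B / (1 / t_low - 1 / t_high)
print(f"effective Ea ~ {ea:.2f} eV")

# Extrapolate the relative cross section from 393 K down to 313 K.
drop = np.exp(-ea / K_B * (1 / 313.0 - 1 / t_low))
print(f"sigma(313 K) / sigma(393 K) ~ {drop:.1e}  (~{-np.log10(drop):.1f} decades)")
```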
Accuracy of quantitative visual soil assessment
NASA Astrophysics Data System (ADS)
van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne
2016-04-01
Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis in many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible, when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. Quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7 farmers carried out quantitative visual observations, all independently of each other. All observers assessed five sites, having a sand, peat or clay soil. For almost all quantitative visual observations the spread of observed values was low (coefficient of variation < 1.0), except for the number of biopores and gley mottles. Furthermore, farmers' observed mean values were significantly higher than soil scientists' mean values, for soil structure, amount of gley mottles and compaction. This study showed that VSA could be a valuable tool to assess soil quality. Subjectivity, due to the background of the observer, might influence the outcome of visual assessment of some soil properties. In countries where soil analyses can easily be carried out, VSA might be a good complement to available soil chemical analyses, and in countries where it is not feasible to carry out soil analyses, VSA might be a good start to assess soil quality.
Erwin, Elizabeth A; Custis, Natalie J; Satinover, Shama M; Perzanowski, Matthew S; Woodfolk, Judith A; Crane, Julian; Wickens, Kristin; Platts-Mills, Thomas A E
2005-05-01
Commercially available assays for IgE antibody provide results in international units per milliliter for many allergen extracts, but this is not easily achieved with purified or novel allergens. To develop assays for IgE antibody suitable for purified or novel allergens by using a commercially available immunosorbent. Streptavidin coupled to a high-capacity immunosorbent (CAP) was used to bind biotinylated purified allergens from mite (Der p 1 and Der p 2), cat (Fel d 1), and dog (Can f 1). Assays for IgE antibody to these allergens were performed on sera from children (asthma and control) as well as adults with atopic dermatitis. The results were validated by serial dilution of sera with high and low levels of IgE antibody and were quantitated in international units per milliliter by using a standard curve. Values for IgE antibody to Der p 1, Der p 2, and Fel d 1 correlated with values obtained with the allergen extracts (r2 = 0.80, 0.84, and 0.95, respectively; P < .001 in each case). Furthermore, the values for IgE antibody in sera from children with high exposure to mite and cat allergens demonstrated 10-fold higher levels of IgE antibody to Der p 1 and Der p 2 than to Fel d 1 (P < .001). The streptavidin immunosorbent technique provides a new method for quantifying IgE antibody to purified proteins. The results provide evidence about the high quantities of IgE antibody to purified inhalant allergens in patients with atopic dermatitis. In addition, the results demonstrate major differences in IgE antibodies specific for mite and cat allergens among children with high exposure to both allergens.
Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.
2016-01-01
A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested on two digital PCR systems (Bio-Rad and RainDance) for quantitation of cytomegalovirus (CMV). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some showing poorer correlations when testing samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685
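Digital PCR concentrations are derived from Poisson statistics on the fraction of positive partitions; the short sketch below shows the standard calculation. The droplet/partition counts and volumes used are illustrative assumptions (roughly nanoliter droplets for ddPCR-style systems, picoliter partitions for RainDance-style systems), not measurements from the study.

```python
import math

def dpcr_copies_per_ul(n_positive, n_total, partition_volume_nl):
    """Copies per microliter of reaction from partition counts.

    lambda = -ln(1 - p) is the mean number of template copies per partition,
    where p is the fraction of positive partitions.
    """
    p = n_positive / n_total
    lam = -math.log(1.0 - p)
    return lam / (partition_volume_nl * 1e-3)   # nl -> ul

# Illustrative counts and nominal partition volumes (assumptions, not measurements).
print(dpcr_copies_per_ul(n_positive=1800, n_total=15000, partition_volume_nl=0.85))
print(dpcr_copies_per_ul(n_positive=900, n_total=5_000_000, partition_volume_nl=0.005))
```

Because the calculation depends on the nominal partition volume, platform-specific volume errors are one plausible source of the residual between-platform variation noted above.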
Jiang, Shun-Yuan; Sun, Hong-Bing; Sun, Hui; Ma, Yu-Ying; Chen, Hong-Yu; Zhu, Wen-Tao; Zhou, Yi
2016-03-01
This paper aims to explore a comprehensive assessment method combining traditional Chinese medicinal material specifications with quantitative quality indicators. Seventy-six samples of Notopterygii Rhizoma et Radix were collected from markets and producing areas. Traditional commercial specifications were described and assigned, and 10 chemical components and volatile oils were determined for each sample. Cluster analysis, Fisher discriminant analysis and correspondence analysis were used to establish the relationship between the traditional qualitative commercial specifications and quantitative chemical indices for comprehensively evaluating the quality of medicinal materials, and for quantitative classification of commercial grade and quality grade. A herb quality index (HQI) including traditional commercial specifications and chemical components for quantitative grade classification was established, and corresponding discriminant functions were derived for precise determination of the quality grade and sub-grade of Notopterygii Rhizoma et Radix. The results showed that notopterol, isoimperatorin and volatile oil were the major components for determination of chemical quality, and their dividing values were specified for every grade and sub-grade of the commercial materials of Notopterygii Rhizoma et Radix. According to these results, the essential relationship between traditional medicinal indicators, qualitative commercial specifications, and quantitative chemical composition indicators can be examined by K-means clustering, Fisher discriminant analysis and correspondence analysis, which provides a new method for comprehensive quantitative evaluation of traditional Chinese medicine quality, integrating traditional commodity specifications and modern quantitative chemical indices. Copyright© by the Chinese Pharmaceutical Association.
NASA Astrophysics Data System (ADS)
Mobberley, Sean David
Accurate, cross-scanner assessment of in-vivo air density used to quantitatively assess amount and distribution of emphysema in COPD subjects has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how the quantitative measures of lung density compare between dual-source and single-source scan modes. This study has sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N=6), swine (N=13: more human-like rib cage shape), lung phantom and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU respectively. When using image data obtained in the SS mode the air CT numbers demonstrated a consistent positive shift of up to 35 HU with respect to the nominal -1000 HU value. In vivo data demonstrated considerable variability in tracheal air HU values, influenced by local anatomy, with SS mode scanning, while tracheal air was more consistent with DSDE imaging. Scatter effects in the lung parenchyma differed from adjacent tracheal measures. In summary, data suggest that enhanced scatter correction serves to provide more accurate CT lung density measures sought to quantitatively assess the presence and distribution of emphysema in COPD subjects. Data further suggest that CT images, acquired without adequate scatter correction, cannot be corrected by linear algorithms given the variability in tracheal air HU values and the independent scatter effects on lung parenchyma.
An anthropomorphic phantom for quantitative evaluation of breast MRI.
Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo
2011-02-01
In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of breast MRI imaging protocols for lesion detection and characterization.
Paradigm Privilege: Determining the Value of Research in Teacher Education Policy Making.
ERIC Educational Resources Information Center
Bales, Barbara L.
This paper explains that despite the long debate over the relative value of quantitative and qualitative educational research and attempts to talk across disciplines, quantitative research dominates educational policy circles. As a result, quality qualitative research may not enter into educational policy conversations. The paper discusses whether…
Uncovering the genetic signature of quantitative trait evolution with replicated time series data.
Franssen, S U; Kofler, R; Schlötterer, C
2017-01-01
The genetic architecture of adaptation in natural populations has not yet been resolved: it is not clear to what extent the spread of beneficial mutations (selective sweeps) or the response of many quantitative trait loci drive adaptation to environmental changes. Although much attention has been given to the genomic footprint of selective sweeps, the importance of selection on quantitative traits is still not well studied, as the associated genomic signature is extremely difficult to detect. We propose 'Evolve and Resequence' as a promising tool to study polygenic adaptation of quantitative traits in evolving populations. Simulating replicated time series data, we show that adaptation to a new intermediate trait optimum has three characteristic phases that are reflected on the genomic level: (1) directional frequency changes towards the new trait optimum, (2) plateauing of allele frequencies when the new trait optimum has been reached and (3) subsequent divergence between replicated trajectories ultimately leading to the loss or fixation of alleles while the trait value does not change. We explore the characteristics of these three phases for relevant population genetic parameters to provide expectations for various experimental evolution designs. Remarkably, over a broad range of parameters the trajectories of selected alleles display a pattern across replicates, which differs both from neutrality and directional selection. We conclude that replicated time series data from experimental evolution studies provide a promising framework to study polygenic adaptation from whole-genome population genetics data.
Ethnobotany of medicinal plants in district Mastung of Balochistan province-Pakistan.
Bibi, Tahira; Ahmad, Mushtaq; Bakhsh Tareen, Rsool; Mohammad Tareen, Niaz; Jabeen, Rukhsana; Rehman, Saeed-Ur; Sultana, Shazia; Zafar, Muhammad; Yaseen, Ghulam
2014-11-18
The aim of this study was to document the medicinal uses of plants in district Mastung of Balochistan province, Pakistan. The ethnobotanical results contain quantitative information on medicinal plant diversity documented for the first time in the area. The information was collected through semi-structured interviews, a rapid appraisal approach, open-ended questionnaires and personal observations. Results were analyzed using the quantitative indices of informant consensus factor (ICF), fidelity level (FL), use value (UV), frequency of citation (FC) and relative frequency of citation (RFC). In total, 102 plant species belonging to 47 families were reported for medicinal purposes. Asteraceae was found to be the dominant family in terms of species in the area, with 11 species. The whole plant and leaves were noted as the most frequently used parts (24%). Decoction (31%, with 40 species) was the most commonly used preparation method. The highest ICF value (1) was recorded for the antidote category. A 100% fidelity level was found for four plant species, i.e. Achillea welhemsii, Caralluma tuberculata, Citrullus colocynthis, and Seripidium quettense. The highest use value was reported for Acroptilon repens (0.5), while the highest RFC value was calculated for Berberis balochistanica and Citrullus colocynthis (0.18). The highest number of use reports was calculated for Caralluma tuberculata, Citrullus colocynthis, Malva neglecta and Mentha longifolia, with five use reports each. The area is rich in medicinal plants and these plants are still commonly used for medicinal purposes among the people in their daily lives. However, there is a gradual loss of traditional knowledge about these plants in the younger generation. This study provides a basis for the conservation of the local flora and its use as food and medicine. It also documents various socio-economic dimensions associated with the local people. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
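The quantitative indices used here (UV, RFC, FL, ICF) have standard definitions in the ethnobotanical literature; the sketch below computes them from toy interview counts. The counts are invented for illustration and merely echo the magnitudes reported above.

```python
def use_value(use_reports, n_informants):
    """UV = total use reports for a species / number of informants."""
    return use_reports / n_informants

def relative_frequency_citation(fc, n_informants):
    """RFC = number of informants citing the species / total informants."""
    return fc / n_informants

def fidelity_level(np_ailment, n_species):
    """FL (%) = informants citing the species for one ailment / informants citing it
    for any ailment, x 100."""
    return 100.0 * np_ailment / n_species

def informant_consensus_factor(nur, nt):
    """ICF = (Nur - Nt) / (Nur - 1): use reports vs. taxa in an ailment category."""
    return (nur - nt) / (nur - 1)

n_informants = 100                                                     # toy survey size
print(use_value(use_reports=50, n_informants=n_informants))            # 0.50
print(relative_frequency_citation(fc=18, n_informants=n_informants))   # 0.18
print(fidelity_level(np_ailment=12, n_species=12))                     # 100.0
print(informant_consensus_factor(nur=25, nt=1))                        # 1.0
```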
Low-intensity calibration source for optical imaging systems
NASA Astrophysics Data System (ADS)
Holdsworth, David W.
2017-03-01
Laboratory optical imaging systems for fluorescence and bioluminescence imaging have become widely available for research applications. These systems use an ultra-sensitive CCD camera to produce quantitative measurements of very low light intensity, detecting signals from small-animal models labeled with optical fluorophores or luminescent emitters. Commercially available systems typically provide quantitative measurements of light output, in units of radiance (photons s⁻¹ cm⁻² sr⁻¹) or intensity (photons s⁻¹ cm⁻²). One limitation to current systems is that there is often no provision for routine quality assurance and performance evaluation. We describe such a quality assurance system, based on an LED-illuminated thin-film transistor (TFT) liquid-crystal display module. The light intensity is controlled by pulse-width modulation of the backlight, producing radiance values ranging from 1.8 × 10⁶ photons s⁻¹ cm⁻² sr⁻¹ to 4.2 × 10¹³ photons s⁻¹ cm⁻² sr⁻¹. The lowest light intensity values are produced by very short backlight pulses (i.e. approximately 10 μs), repeated every 300 s. This very low duty cycle is appropriate for laboratory optical imaging systems, which typically operate with long-duration exposures (up to 5 minutes). The low-intensity light source provides a stable, traceable radiance standard that can be used for routine quality assurance of laboratory optical imaging systems.
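The dynamic range quoted above follows directly from scaling a constant "on" radiance by the backlight duty cycle. A minimal sketch is shown below; treating the full-on panel radiance as equal to the reported maximum is an assumption made here for illustration.

```python
def time_averaged_radiance(radiance_on, pulse_width_s, period_s):
    """Effective radiance of a pulse-width-modulated backlight = on-radiance x duty cycle."""
    return radiance_on * (pulse_width_s / period_s)

# Assume the full-on panel radiance equals the reported maximum (~4.2e13 photons/s/cm^2/sr).
RADIANCE_ON = 4.2e13

# A ~10-microsecond pulse repeated every 300 s then gives the lowest levels.
print(f"{time_averaged_radiance(RADIANCE_ON, 10e-6, 300.0):.1e} photons/s/cm^2/sr")
# -> ~1.4e6, the same order as the quoted minimum of 1.8e6
```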
Grussu, Francesco; Ianus, Andrada; Schneider, Torben; Prados, Ferran; Fairney, James; Ourselin, Sebastien; Alexander, Daniel C.; Cercignani, Mara; Gandini Wheeler‐Kingshott, Claudia A.M.; Samson, Rebecca S.
2017-01-01
Purpose To develop a framework to fully characterize quantitative magnetization transfer indices in the human cervical cord in vivo within a clinically feasible time. Methods A dedicated spinal cord imaging protocol for quantitative magnetization transfer was developed using a reduced field-of-view approach with echo planar imaging (EPI) readout. Sequence parameters were optimized based on the Cramér-Rao lower bound. Quantitative model parameters (i.e., bound pool fraction, free and bound pool transverse relaxation times [T2F, T2B], and forward exchange rate [kFB]) were estimated implementing a numerical model capable of dealing with the novelties of the sequence adopted. The framework was tested on five healthy subjects. Results Cramér-Rao lower bound minimization produces optimal sampling schemes without requiring the establishment of a steady-state MT effect. The proposed framework allows quantitative voxel-wise estimation of model parameters at the resolution typically used for spinal cord imaging (i.e. 0.75 × 0.75 × 5 mm³), with a protocol duration of ∼35 min. Quantitative magnetization transfer parametric maps agree with literature values. Whole-cord mean values are: bound pool fraction = 0.11 (±0.01), T2F = 46.5 (±1.6) ms, T2B = 11.0 (±0.2) µs, and kFB = 1.95 (±0.06) Hz. Protocol optimization has a beneficial effect on reproducibility, especially for T2B and kFB. Conclusion The framework developed enables robust characterization of spinal cord microstructure in vivo using qMT. Magn Reson Med 79:2576-2588, 2018. © 2017 The Authors Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited. PMID:28921614
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan, X; Arbique, G; Guild, J
Purpose: To evaluate the quantitative image quality of spectral reconstructions of phantom data from a spectral CT scanner. Methods: The spectral CT scanner (IQon Spectral CT, Philips Healthcare) is equipped with a dual-layer detector and generates conventional 80-140 kVp images and a variety of spectral reconstructions, e.g., virtual monochromatic (VM) images, virtual non-contrast (VNC) images, iodine maps, and effective atomic number (Z) images. A cylindrical solid water phantom (Gammex 472, 33 cm diameter and 5 cm thick) with iodine (2.0-20.0 mg I/ml) and calcium (50-600 mg/ml) rod inserts was scanned at 120 kVp and 27 mGy CTDIvol. Spectral reconstructions were evaluated by comparing image measurements with theoretical values calculated from nominal rod compositions provided by the phantom manufacturer. The theoretical VNC was calculated using water and iodine basis material decomposition, and the theoretical Z was calculated using two common methods, the chemical formula method (Z1) and the dual-energy ratio method (Z2). Results: Beam-hardening-like artifacts between high-attenuation calcium rods (≥300 mg/ml, >800 HU) influenced quantitative measurements, so the quantitative analysis was only performed on iodine rods using the images from the scan with all the calcium rods removed. The CT numbers of the iodine rods in the VM images (50∼150 keV) were close to theoretical values with average difference of 2.4±6.9 HU. Compared with theoretical values, the average difference for iodine concentration, VNC CT number and effective Z of iodine rods were −0.10±0.38 mg/ml, −0.1±8.2 HU, 0.25±0.06 (Z1) and −0.23±0.07 (Z2). Conclusion: The results indicate that the spectral CT scanner generates quantitatively accurate spectral reconstructions at clinically relevant iodine concentrations. Beam-hardening-like artifacts still exist when high-attenuation objects are present and their impact on patient images needs further investigation. YY is an employee of Philips Healthcare.
Pressoir, G; Berthaud, J
2004-02-01
To conserve the long-term selection potential of maize, it is necessary to investigate past and present evolutionary processes that have shaped quantitative trait variation. Understanding the dynamics of quantitative trait evolution is crucial to future crop breeding. We characterized population differentiation of maize landraces from the State of Oaxaca, Mexico, for quantitative traits and molecular markers. Qst values were much higher than Fst values obtained for molecular markers. While low values of Fst (0.011 within-village and 0.003 among-villages) suggest that considerable gene flow occurred among the studied populations, high levels of population differentiation for quantitative traits were observed (i.e., an among-village Qst value of 0.535 for kernel weight). Our results suggest that although quantitative traits appear to be under strong divergent selection, a considerable amount of gene flow occurs among populations. Furthermore, we characterized nonproportional changes in the G matrix structure both within and among villages that are consequences of farmer selection. As a consequence of these differences in the G matrix structure, the response to multivariate selection will be different from one population to another. Large changes in the G matrix structure could indicate that farmers select for genes of major and pleiotropic effect. Farmers' decision and selection strategies have a great impact on phenotypic diversification in maize landraces.
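Qst for a quantitative trait is computed from the between- and within-population components of additive genetic variance and is then compared with marker-based Fst. A minimal sketch under that standard definition is given below; the variance components are illustrative numbers chosen only to reproduce the order of the reported value.

```python
def qst(var_between, var_within_additive):
    """Qst = sigma^2_B / (sigma^2_B + 2 * sigma^2_W) for diploid, outcrossing populations."""
    return var_between / (var_between + 2.0 * var_within_additive)

# Illustrative variance components for a trait such as kernel weight.
print(qst(var_between=2.3, var_within_additive=1.0))   # -> 0.535, cf. the among-village value
# Qst >> Fst (here 0.535 vs. 0.003) is the classic signature of divergent selection
# maintained despite gene flow.
```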
Space memoirs: Value hierarchies before and after missions—A pilot study
NASA Astrophysics Data System (ADS)
Suedfeld, Peter
2006-06-01
The purpose of this pilot study was to provide a quantitative content analysis of how the completion of space missions affects the value hierarchies of people in the space program. The autobiographies of two high-level NASA administrators, three female astronauts, and seven male astronauts were scored for indicators of four values: Achievement, Enjoyment, Benevolence, and Transcendence. Achievement was very high in the hierarchies of all three groups before they had mission experience, and remained high afterwards except among the female astronauts. Administrators showed essentially no pre-post mission value change; among astronauts of both sexes, Transcendence (a combination of Spirituality and Universality) rose dramatically in the hierarchy after spaceflight. The findings, if upheld after the inclusion of more subjects, have important implications for the post-NASA lives of astronauts and for their families, friends, and colleagues.
Supporting Worth Mapping with Sentence Completion
NASA Astrophysics Data System (ADS)
Cockton, Gilbert; Kujala, Sari; Nurkka, Piia; Hölttä, Taneli
Expectations for design and evaluation approaches are set by the development practices within which they are used. Worth-Centred Development (WCD) seeks to both shape and fit such practices. We report a study that combined two WCD approaches. Sentence completion gathered credible quantitative data on user values, which were used to identify relevant values and aversions of two player groups for an on-line gambling site. These values provided human value elements for a complementary WCD approach of worth mapping. Initial worth maps were extended in three workshops, which focused on outcomes and user experiences that could be better addressed in the current product and associated marketing materials. We describe how worth maps were prepared for, and presented in, workshops, and how product owners and associated business roles evaluated the combination of WCD approaches. Based on our experiences, we offer practical advice on this combination.
Comparing and using assessments of the value of information to clinical decision-making.
Urquhart, C J; Hepworth, J B
1996-01-01
This paper discusses the Value project, which assessed the value to clinical decision-making of information supplied by National Health Service (NHS) library and information services. The project not only showed how health libraries in the United Kingdom help clinicians in decision-making but also provided quality assurance guidelines for these libraries to help make their information services more effective. The paper reviews methods and results used in previous studies of the value of health libraries, noting that methodological differences appear to affect the results. The paper also discusses aspects of user involvement, categories of clinical decision-making, the value of information to present and future clinical decisions, and the combination of quantitative and qualitative assessments of value, as applied to the Value project and the studies reviewed. The Value project also demonstrated that the value placed on information depends in part on the career stage of the physician. The paper outlines the structure of the quality assurance tool kit, which is based on the findings and methods used in the Value project. PMID:8913550
Byrd, Darrin; Christopfel, Rebecca; Arabasz, Grae; Catana, Ciprian; Karp, Joel; Lodge, Martin A; Laymon, Charles; Moros, Eduardo G; Budzevich, Mikalai; Nehmeh, Sadek; Scheuermann, Joshua; Sunderland, John; Zhang, Jun; Kinahan, Paul
2018-01-01
Positron emission tomography (PET) is a quantitative imaging modality, but the computation of standardized uptake values (SUVs) requires several instruments to be correctly calibrated. Variability in the calibration process may lead to unreliable quantitation. Sealed source kits containing traceable amounts of [Formula: see text] were used to measure signal stability for 19 PET scanners at nine hospitals in the National Cancer Institute's Quantitative Imaging Network. Repeated measurements of the sources were performed on PET scanners and in dose calibrators. The measured scanner and dose calibrator signal biases were used to compute the bias in SUVs at multiple time points for each site over a 14-month period. Estimation of absolute SUV accuracy was confounded by bias from the solid phantoms' physical properties. On average, the intrascanner coefficient of variation for SUV measurements was 3.5%. Over the entire length of the study, single-scanner SUV values varied over a range of 11%. Dose calibrator bias was not correlated with scanner bias. Calibration factors from the image metadata were nearly as variable as scanner signal, and were correlated with signal for many scanners. SUVs often showed low intrascanner variability between successive measurements but were also prone to shifts in apparent bias, possibly in part due to scanner recalibrations that are part of regular scanner quality control. Biases of key factors in the computation of SUVs were not correlated and their temporal variations did not cancel out of the computation. Long-lived sources and image metadata may provide a check on the recalibration process.
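SUVs combine the image-derived activity concentration with the dose-calibrator reading and patient weight, which is why biases in either instrument propagate directly into the reported value. A minimal body-weight SUV sketch is given below; all numbers are hypothetical.

```python
import math

def suv_bw(roi_kbq_per_ml, injected_mbq, body_weight_kg,
           uptake_time_min=0.0, half_life_min=109.77):
    """Body-weight SUV = tissue activity concentration / (decay-corrected dose / weight).

    half_life_min defaults to 18F; uptake_time_min decay-corrects the injected
    dose to scan time. Assumes 1 g of tissue ~ 1 mL.
    """
    dose_at_scan_kbq = injected_mbq * 1e3 * math.exp(
        -math.log(2) * uptake_time_min / half_life_min)
    return roi_kbq_per_ml / (dose_at_scan_kbq / (body_weight_kg * 1e3))

# Hypothetical example: 5 kBq/mL lesion, 370 MBq injected, 70 kg patient, 60 min uptake.
print(f"SUV = {suv_bw(5.0, 370.0, 70.0, uptake_time_min=60.0):.2f}")
```

A scanner calibration bias scales roi_kbq_per_ml and a dose-calibrator bias scales the denominator, so, as the study notes, the two error sources do not cancel unless they happen to be correlated.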
NASA Astrophysics Data System (ADS)
Zhou, Jialing; He, Honghui; Wang, Ye; Ma, Hui
2017-02-01
Fiber structure changes in various pathological processes, such as increasing fibrosis in liver disease and derangement of fibers in cervical cancer. Currently, clinical pathological diagnosis is regarded as the gold standard, but doctors with different knowledge and experience may reach different conclusions. To some extent, quantitative evaluation of the fiber structure in pathological tissue can greatly aid quantitative diagnosis. Mueller matrix measurement is capable of probing comprehensive microstructural information of samples, and different wavelengths of light can provide additional information. In this paper, we use a Mueller matrix microscope with light sources at six different wavelengths. We use unstained, dewaxed liver tissue slices at four fibrosis stages and pathological biopsies of the filtration channels from rabbit eyes as samples. We apply the Mueller matrix polar decomposition (MMPD) parameter δ, which corresponds to retardance, to the liver slices. The mean value in the abnormal region increases as the level of fibrosis increases, and light at short wavelengths is more sensitive to the microstructure of fiber. In addition, we use the Mueller matrix transformation (MMT) parameter Φ, which is associated with the angle of the fast axis, in the analysis of the slices of the filtration channels from rabbit eyes. The kurtosis and skewness values show large differences between the newly formed region and the normal region and can reveal the arrangement of fibers. These results indicate that the Mueller matrix microscope has great potential in auxiliary diagnosis.
Relations between soil hydraulic properties and burn severity
Moody, John A.; Ebel, Brian A.; Nyman, Petter; Martin, Deborah A.; Stoof, Cathelijne R.; McKinley, Randy
2015-01-01
Wildfire can affect soil hydraulic properties, often resulting in reduced infiltration. The magnitude of change in infiltration varies depending on the burn severity. Quantitative approaches to link burn severity with changes in infiltration are lacking. This study uses controlled laboratory measurements to determine relations between a remotely sensed burn severity metric (dNBR, change in normalised burn ratio) and soil hydraulic properties (SHPs). SHPs were measured on soil cores collected from an area burned by the 2013 Black Forest fire in Colorado, USA. Six sites with the same soil type were selected across a range of burn severities, and 10 random soil cores were collected from each site within a 30-m diameter circle. Cumulative infiltration measurements were made in the laboratory using a tension infiltrometer to determine field-saturated hydraulic conductivity, Kfs, and sorptivity, S. These measurements were correlated with dNBR for values ranging from 124 (low severity) to 886 (high severity). SHPs were related to dNBR by inverse functions for specific conditions of water repellency (at the time of sampling) and soil texture. Both functions had a threshold value for dNBR between 124 and 420, where Kfs and S were unchanged and equal to values for soil unaffected by fire. For dNBRs >~420, the Kfs was an exponentially decreasing function of dNBR and S was a linearly decreasing function of dNBR. These initial quantitative empirical relations provide a first step to link SHPs to burn severity, and can be used in quantitative infiltration models to predict post-wildfire infiltration and resulting runoff.
Computer-oriented synthesis of wide-band non-uniform negative resistance amplifiers
NASA Technical Reports Server (NTRS)
Branner, G. R.; Chan, S.-P.
1975-01-01
This paper presents a synthesis procedure which provides design values for broad-band amplifiers using non-uniform negative resistance devices. Employing a weighted least squares optimization scheme, the technique, based on an extension of procedures for uniform negative resistance devices, is capable of providing designs for a variety of matching network topologies. It also provides, for the first time, quantitative results for predicting the effects of parameter element variations on overall amplifier performance. The technique is also unique in that it employs exact partial derivatives for optimization and sensitivity computation. In comparison with conventional procedures, significantly improved broad-band designs are shown to result.
Faculty Grading of Quantitative Problems: A Mismatch between Values and Practice
ERIC Educational Resources Information Center
Petcovic, Heather L.; Fynewever, Herb; Henderson, Charles; Mutambuki, Jacinta M.; Barney, Jeffrey A.
2013-01-01
Grading practices can send a powerful message to students about course expectations. A study by Henderson et al. ("American Journal of Physics" 72:164-169, 2004) in physics education has identified a misalignment between what college instructors say they value and their actual scoring of quantitative student solutions. This work identified three…
Loveless, S E; Api, A-M; Crevel, R W R; Debruyne, E; Gamer, A; Jowsey, I R; Kern, P; Kimber, I; Lea, L; Lloyd, P; Mehmood, Z; Steiling, W; Veenstra, G; Woolhiser, M; Hennes, C
2010-02-01
Hundreds of chemicals are contact allergens but there remains a need to identify and characterise accurately skin sensitising hazards. The purpose of this review was fourfold. First, when using the local lymph node assay (LLNA), consider whether an exposure concentration (EC3 value) lower than 100% can be defined and used as a threshold criterion for classification and labelling. Second, is there any reason to revise the recommendation of a previous ECETOC Task Force regarding specific EC3 values used for sub-categorisation of substances based upon potency? Third, what recommendations can be made regarding classification and labelling of preparations under GHS? Finally, consider how to integrate LLNA data into risk assessment and provide a rationale for using concentration responses and corresponding no-effect concentrations. Although skin sensitising chemicals having high EC3 values may represent only relatively low risks to humans, it is not possible currently to define an EC3 value below 100% that would serve as an appropriate threshold for classification and labelling. The conclusion drawn from reviewing the use of distinct categories for characterising contact allergens was that the most appropriate, science-based classification of contact allergens according to potency is one in which four sub-categories are identified: 'extreme', 'strong', 'moderate' and 'weak'. Since draining lymph node cell proliferation is related causally and quantitatively to potency, LLNA EC3 values are recommended for determination of a no expected sensitisation induction level that represents the first step in quantitative risk assessment. 2009 Elsevier Inc. All rights reserved.
Paturzo, Marco; Colaceci, Sofia; Clari, Marco; Mottola, Antonella; Alvaro, Rosaria; Vellone, Ercole
2016-01-01
Mixed methods designs: an innovative methodological approach for nursing research. Mixed method (MM) research designs combine qualitative and quantitative approaches in the research process, in a single study or a series of studies. Their use can provide a wider understanding of multifaceted phenomena. This article presents a general overview of the structure and design of MM to disseminate this approach within the Italian nursing research community. The MM designs most commonly used in the nursing field are the convergent parallel design, the sequential explanatory design, the exploratory sequential design and the embedded design. For each method a research example is presented. The use of MM can add value in improving clinical practice because, through the integration of qualitative and quantitative methods, researchers can better assess complex phenomena typical of nursing.
Bae, Won C; Ruangchaijatuporn, Thumanoon; Chang, Eric Y; Biswas, Reni; Du, Jiang; Statum, Sheronda; Chung, Christine B
2016-04-01
To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high-resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Five cadaveric wrists (22-70 years) were imaged at 3 T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques.
Gardner, Shea Nicole
2007-10-23
A method and system for tailoring treatment regimens to individual patients with diseased cells exhibiting evolution of resistance to such treatments. A mathematical model is provided which models rates of population change of proliferating and quiescent diseased cells using cell kinetics and evolution of resistance of the diseased cells, and pharmacokinetic and pharmacodynamic models. Cell kinetic parameters are obtained from an individual patient and applied to the mathematical model to solve for a plurality of treatment regimens, each having a quantitative efficacy value associated therewith. A treatment regimen may then be selected from the plurality of treatment options based on the efficacy value.
Looking beyond satisfaction: evaluating the value and impact of information skills training.
Raynor, Michael; Craven, Jenny
2015-03-01
In this feature guest writers Michael Raynor and Jenny Craven from the National Institute for Health and Care Excellence (NICE) present an overview of their evaluative research study on the value and impact of the information skills training courses they provide at NICE. In particular, this small study used a combination of qualitative and quantitative data to look beyond satisfaction and confidence levels and identify whether learning had actually taken place as a result of attending the sessions, and how new skills were used by the attendees in their day-to-day work. H.S. © 2015 Health Libraries Journal.
Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.
2017-01-01
Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
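The patient-level bootstrap mentioned above can be sketched independently of the NGS estimator itself. In the sketch below, `estimate_fom` is a placeholder for any figure-of-merit estimator and the data are synthetic; nothing here reproduces the authors' actual estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_fom(measurements_by_patient, estimate_fom, n_boot=1000):
    """Resample patients (not lesions) with replacement and re-estimate the FoM."""
    patients = list(measurements_by_patient)
    foms = []
    for _ in range(n_boot):
        sample = rng.choice(patients, size=len(patients), replace=True)
        pooled = np.concatenate([measurements_by_patient[p] for p in sample])
        foms.append(estimate_fom(pooled))
    foms = np.asarray(foms)
    return foms.mean(), np.percentile(foms, [2.5, 97.5])

# Toy data: a few lesion measurements per patient; coefficient of variation as a stand-in FoM.
data = {p: rng.normal(50.0, 5.0, size=rng.integers(1, 6)) for p in range(100)}
mean_fom, ci = bootstrap_fom(data, lambda x: x.std() / x.mean())
print(mean_fom, ci)
```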
Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan
2017-12-01
Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b values were acquired with a 1.5-T MRI scanner, and trace images were generated. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated on postprocessing software. These ADC maps were compared on the basis of ROIs using paired t test, Bland-Altman plot, mountain plot, and Passing-Bablok regression plot. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for the ADC map generation. For using ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm is essential across processing software packages, especially in view of the implementation of vendor-neutral archiving.
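For reference, the ADC computation that such post-processing packages perform can be sketched as a log-linear fit of signal versus b-value. This is a generic illustration with invented intensities, not any vendor's algorithm; differences in fitting choices like this are one plausible source of the discrepancies reported above.

```python
import numpy as np

def fit_adc(b_values, signals):
    """Least-squares fit of ln S(b) = ln S0 - b * ADC; b in s/mm^2, returns ADC in mm^2/s."""
    b = np.asarray(b_values, dtype=float)
    log_s = np.log(np.asarray(signals, dtype=float))
    slope, _intercept = np.polyfit(b, log_s, 1)
    return -slope

# Three b-values, as in the acquisition described above; intensities are illustrative.
print(fit_adc([0, 400, 800], [1000.0, 670.0, 455.0]))   # ~1.0e-3 mm^2/s
```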
Bannon, William
2015-04-01
Missing data typically refer to the absence of one or more values within a study variable(s) contained in a dataset. Missing values often arise when a study participant chooses not to provide a response to a survey item. In general, a greater number of missing values within a dataset reflects a greater challenge to the data analyst. However, if researchers are armed with just a few basic tools, they can quite effectively diagnose how serious the issue of missing data is within a dataset, as well as prescribe the most appropriate solution. Specifically, the keys to effectively assessing and treating missing data values within a dataset involve specifying how missing data will be defined in a study, assessing the amount of missing data, identifying the pattern of the missing data, and selecting the best way to treat the missing data values. I will touch on each of these processes and provide a brief illustration of how the validity of study findings is at great risk if missing data values are not treated effectively. ©2015 American Association of Nurse Practitioners.
Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A
2012-04-01
Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. © Health Research and Educational Trust.
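The comparison reported above is, in essence, a chi-square test on a 2x2 table of component presence by article type, summarized with Cramér's V and an odds ratio. A sketch with made-up counts (not the study's data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Rows: mixed-methods articles, quantitative articles; columns: component present, absent.
table = np.array([[120, 427],
                  [600, 675]])
chi2, p, dof, _expected = chi2_contingency(table, correction=False)
n = table.sum()
cramers_v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3g}, V = {cramers_v:.2f}, OR = {odds_ratio:.2f}")
```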
Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A
2012-01-01
Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and the presence of key methodological components in published reports. PMID:22092040
Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng
2016-04-01
The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammograms (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB-approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm which combines region-based active contours and a level set approach. FGT (%) was calculated as [segmented volume of FGT (mm³) / segmented volume of whole breast (mm³)] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT correlated with calculated mean quantitative FGT (%) values of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), and 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed with data derived from breast MRI and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable tool in clinical use by providing computer-generated standardized measurements with limited intra- or inter-observer variability.
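The reported FGT fraction is a simple ratio of segmented volumes; a one-line sketch follows (the segmentation algorithm itself, region-based active contours with a level set, is not reproduced here, and the volumes are illustrative).

```python
def fgt_percent(fgt_volume_mm3, whole_breast_volume_mm3):
    """FGT (%) = segmented FGT volume / segmented whole-breast volume * 100."""
    return 100.0 * fgt_volume_mm3 / whole_breast_volume_mm3

# 180,000 / 1,000,000 mm^3 gives 18%, near the third qualitative category's mean above.
print(fgt_percent(180_000, 1_000_000))
```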
NASA Technical Reports Server (NTRS)
Melick, H. C., Jr.; Ybarra, A. H.; Bencze, D. P.
1975-01-01
An inexpensive method is developed to determine the extreme values of instantaneous inlet distortion. This method also provides insight into the basic mechanics of unsteady inlet flow and the associated engine reaction. The analysis is based on fundamental fluid dynamics and statistical methods to provide an understanding of the turbulent inlet flow and quantitatively relate the rms level and power spectral density (PSD) function of the measured time variant total pressure fluctuations to the strength and size of the low pressure regions. The most probable extreme value of the instantaneous distortion is then synthesized from this information in conjunction with the steady state distortion. Results of the analysis show the extreme values to be dependent upon the steady state distortion, the measured turbulence rms level and PSD function, the time on point, and the engine response characteristics. Analytical projections of instantaneous distortion are presented and compared with data obtained by a conventional, highly time correlated, 40 probe instantaneous pressure measurement system.
The impact of smart metal artefact reduction algorithm for use in radiotherapy treatment planning.
Guilfoile, Connor; Rampant, Peter; House, Michael
2017-06-01
The presence of metal artefacts in computed tomography (CT) creates issues in radiation oncology. The loss of anatomical information and incorrect Hounsfield unit (HU) values produce inaccuracies in dose calculations, leading to suboptimal patient treatment. Metal artefact reduction (MAR) algorithms were developed to combat these problems. This study provides a qualitative and quantitative analysis of the "Smart MAR" software (General Electric Healthcare, Chicago, IL, USA), determining its usefulness in a clinical setting. A detailed analysis was conducted using both patient and phantom data, noting any improvements in HU values and dosimetry with the GE-MAR enabled. This study indicates qualitative improvements in the severity of the streak artefacts produced by metals, allowing for easier patient contouring. Furthermore, the GE-MAR managed to recover previously lost anatomical information. Additionally, phantom data showed an improvement in HU values with GE-MAR correction, producing more accurate point dose calculations in the treatment planning system. Overall, the GE-MAR is a useful tool and is suitable for clinical environments.
Selection and Presentation of Imaging Figures in the Medical Literature
Siontis, George C. M.; Patsopoulos, Nikolaos A.; Vlahos, Antonios P.; Ioannidis, John P. A.
2010-01-01
Background Images are important for conveying information, but there is no empirical evidence on whether imaging figures are properly selected and presented in the published medical literature. We therefore evaluated the selection and presentation of radiological imaging figures in major medical journals. Methodology/Principal Findings We analyzed articles published in 2005 in 12 major general and specialty medical journals that had radiological imaging figures. For each figure, we recorded information on selection, study population, provision of quantitative measurements, color scales and contrast use. Overall, 417 images from 212 articles were analyzed. Any comment/hint on image selection was made in 44 (11%) images (range 0–50% across the 12 journals) and another 37 (9%) (range 0–60%) showed both a normal and abnormal appearance. In 108 images (26%) (range 0–43%) it was unclear whether the image came from the presented study population. Eighty-three images (20%) (range 0–60%) had any quantitative or ordered categorical value on a measure of interest. Information on the distribution of the measure of interest in the study population was given in 59 cases. For 43 images (range 0–40%), a quantitative measurement was provided for the depicted case and the distribution of values in the study population was also available; in those 43 cases, extreme cases were not over-represented relative to average cases (p = 0.37). Significance The selection and presentation of images in the medical literature is often insufficiently documented; quantitative data are sparse and difficult to place in context. PMID:20526360
Mapping quantitative trait loci for binary trait in the F2:3 design.
Zhu, Chengsong; Zhang, Yuan-Ming; Guo, Zhigang
2008-12-01
In the analysis of inheritance of quantitative traits with low heritability, an F2:3 design that genotypes plants in F2 and phenotypes plants in the F2:3 progeny is often used in plant genetics. Although statistical approaches for mapping quantitative trait loci (QTL) in the F2:3 design have been well developed, those for binary traits of biological interest and economic importance are seldom addressed. In this study, an attempt was made to map binary trait loci (BTL) in the F2:3 design. The fundamental idea was: the F2 plants were genotyped, all phenotypic values of each F2:3 progeny were measured for the binary trait, and these binary trait values and the marker genotype information were used to detect BTL under the penetrance and liability models. The proposed method was verified by a series of Monte-Carlo simulation experiments. These results showed that maximum likelihood approaches under the penetrance and liability models provide accurate estimates for the effects and the locations of BTL with high statistical power, even under low heritability. Moreover, the penetrance model is as efficient as the liability model, and the F2:3 design is more efficient than the classical F2 design, even though only a single progeny is collected from each F2:3 family. With the maximum likelihood approaches under the penetrance and the liability models developed in this study, we can map binary traits as we do quantitative traits in the F2:3 design.
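The penetrance-model idea can be illustrated with a deliberately simplified likelihood-ratio test that compares one penetrance per F2 marker genotype class against a single shared penetrance. This toy sketch pools affected counts per genotype class and ignores the within-family segregation of F2:3 progeny that the authors' full method models, so it is conceptual only; all counts are invented.

```python
import numpy as np
from scipy.stats import chi2

def binom_loglik(affected, total, p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return affected * np.log(p) + (total - affected) * np.log(1 - p)

# Affected / total F2:3 progeny pooled by the F2 marker genotype (AA, Aa, aa).
affected = np.array([60.0, 35.0, 12.0])
total = np.array([200.0, 400.0, 200.0])

ll_full = binom_loglik(affected, total, affected / total).sum()              # one penetrance per class
ll_null = binom_loglik(affected, total, affected.sum() / total.sum()).sum()  # shared penetrance
lrt = 2.0 * (ll_full - ll_null)
print(f"LRT = {lrt:.1f}, p = {chi2.sf(lrt, df=2):.2e}")
```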
Li, Lin; Xu, Shuo; An, Xin; Zhang, Lu-Da
2011-10-01
In near-infrared spectral quantitative analysis, the precision of the samples' measured chemical values sets the theoretical limit on the precision of quantitative analysis with mathematical models. However, only a small number of samples can have their chemical values measured accurately. Many models exclude samples without chemical values and consider only samples with chemical values when modeling the contents of sample compositions. To address this problem, a semi-supervised LS-SVR (S2 LS-SVR) model is proposed on the basis of LS-SVR, which can utilize samples without chemical values as well as those with chemical values. As with LS-SVR, training this model is equivalent to solving a linear system. Finally, samples of flue-cured tobacco were taken as the experimental material, and corresponding quantitative analysis models were constructed for the contents of four sample compositions (total sugar, reducing sugar, total nitrogen and nicotine) with PLS regression, LS-SVR and S2 LS-SVR. For the S2 LS-SVR model, the average relative errors between actual and predicted values for the four sample compositions' contents are 6.62%, 7.56%, 6.11% and 8.20%, respectively, and the correlation coefficients are 0.9741, 0.9733, 0.9230 and 0.9486, respectively. Experimental results show the S2 LS-SVR model outperforms the other two, which verifies the feasibility and efficiency of the S2 LS-SVR model.
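For context, training a standard LS-SVR does reduce to a single linear solve. The sketch below is a minimal reference implementation of plain LS-SVR with an RBF kernel, not the authors' semi-supervised S2 LS-SVR, and all hyperparameter values and data are illustrative.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_ls_svr(X, y, gamma=10.0, sigma=1.0):
    """Solve [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda X_new: rbf_kernel(X_new, X, sigma) @ alpha + b

rng = np.random.default_rng(0)
X = rng.random((50, 3))                                   # e.g. 50 spectra reduced to 3 features
y = X @ np.array([1.0, -2.0, 0.5]) + 0.01 * rng.standard_normal(50)
predict = train_ls_svr(X, y)
print(predict(X[:3]), y[:3])                              # predictions track the targets
```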
Krølner, Rikke; Rasmussen, Mette; Brug, Johannes; Klepp, Knut-Inge; Wind, Marianne; Due, Pernille
2011-10-14
Large proportions of children do not fulfil the World Health Organization recommendation of eating at least 400 grams of fruit and vegetables (FV) per day. To promote an increased FV intake among children it is important to identify factors which influence their consumption. Both qualitative and quantitative studies are needed. Earlier reviews have analysed evidence from quantitative studies. The aim of this paper is to present a systematic review of qualitative studies of determinants of children's FV intake. Relevant studies were identified by searching Anthropology Plus, Cinahl, CSA illumine, Embase, International Bibliography of the Social Sciences, Medline, PsycINFO, and Web of Science using combinations of synonyms for FV intake, children/adolescents and qualitative methods as search terms. The literature search was completed by December 1st 2010. Papers were included if they applied qualitative methods to investigate 6-18-year-olds' perceptions of factors influencing their FV consumption. Quantitative studies, review studies, studies reported in other languages than English, and non-peer reviewed or unpublished manuscripts were excluded. The papers were reviewed systematically using standardised templates for summary of papers, quality assessment, and synthesis of findings across papers. The review included 31 studies, mostly based on US populations and focus group discussions. The synthesis identified the following potential determinants for FV intake which supplement the quantitative knowledge base: Time costs; lack of taste guarantee; satiety value; appropriate time/occasions/settings for eating FV; sensory and physical aspects; variety, visibility, methods of preparation; access to unhealthy food; the symbolic value of food for image, gender identity and social interaction with peers; short term outcome expectancies. The review highlights numerous potential determinants which have not been investigated thoroughly in quantitative studies. Future large scale quantitative studies should attempt to quantify the importance of these factors. Further, mechanisms behind gender, age and socioeconomic differences in FV consumption are proposed which should be tested quantitatively in order to better tailor interventions to vulnerable groups. Finally, the review provides input to the conceptualisation and measurements of concepts (i.e. peer influence, availability in schools) which may refine survey instruments and theoretical frameworks concerning eating behaviours.
Padroni, Marina; Bernardoni, Andrea; Tamborino, Carmine; Roversi, Gloria; Borrelli, Massimo; Saletti, Andrea; De Vito, Alessandro; Azzini, Cristiano; Borgatti, Luca; Marcello, Onofrio; d'Esterre, Christopher; Ceruti, Stefano; Casetta, Ilaria; Lee, Ting-Yim; Fainardi, Enrico
2016-01-01
The capability of CT perfusion (CTP) Alberta Stroke Program Early CT Score (ASPECTS) to predict outcome and identify ischemia severity in acute ischemic stroke (AIS) patients is still questioned. 62 patients with AIS were imaged within 8 hours of symptom onset by non-contrast CT, CT angiography and CTP scans at admission and 24 hours. CTP ASPECTS was calculated on the affected hemisphere using cerebral blood flow (CBF), cerebral blood volume (CBV) and mean transit time (MTT) maps by subtracting 1 point for any abnormalities visually detected or measured within multiple cortical circular regions of interest according to previously established thresholds. MTT-CBV ASPECTS was considered as CTP ASPECTS mismatch. Hemorrhagic transformation (HT), recanalization status and reperfusion grade at 24 hours, final infarct volume at 7 days and modified Rankin scale (mRS) at 3 months after onset were recorded. Semi-quantitative and quantitative CTP ASPECTS were highly correlated (p<0.00001). CBF, CBV and MTT ASPECTS were higher in patients with no HT and mRS ≤ 2 and inversely associated with final infarct volume and mRS (p values: from p<0.05 to p<0.00001). CTP ASPECTS mismatch was slightly associated with radiological and clinical outcomes (p values: from p<0.05 to p<0.02) only if evaluated quantitatively. A CBV ASPECTS of 9 was the optimal semi-quantitative value for predicting outcome. Our findings suggest that visual inspection of CTP ASPECTS recognizes infarct and ischemic absolute values. Semi-quantitative CBV ASPECTS, but not CTP ASPECTS mismatch, represents a strong prognostic indicator, implying that core extent is the main determinant of outcome, irrespective of penumbra size.
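The CTP ASPECTS computation described above amounts to subtracting one point from 10 for each ASPECTS region whose perfusion value crosses a threshold. A sketch assuming the standard 10-region ASPECTS template, with an invented CBV threshold and invented regional values (the study's thresholds are not reproduced here):

```python
def ctp_aspects(region_values, threshold, abnormal_below=True):
    """region_values: perfusion value (e.g. CBV in mL/100 g) for each of the 10 ASPECTS regions."""
    abnormal = [v < threshold if abnormal_below else v > threshold
                for v in region_values.values()]
    return 10 - sum(abnormal)

cbv = {"C": 2.1, "IC": 2.3, "L": 1.4, "I": 2.0, "M1": 1.2,
       "M2": 2.2, "M3": 2.4, "M4": 1.5, "M5": 2.3, "M6": 2.5}
print(ctp_aspects(cbv, threshold=1.8))   # 3 regions below threshold -> score 7
```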
Schwendener, Nicole; Jackowski, Christian; Persson, Anders; Warntjes, Marcel J; Schuster, Frederick; Riva, Fabiano; Zech, Wolf-Dieter
2017-01-01
Recently, quantitative MR sequences have started being used in post-mortem imaging. The goal of the present study was to evaluate if early acute and following age stages of myocardial infarction can be detected and discerned by quantitative 1.5T post-mortem cardiac magnetic resonance (PMCMR) based on quantitative T1, T2 and PD values. In 80 deceased individuals (25 female, 55 male), a cardiac MR quantification sequence was performed prior to cardiac dissection at autopsy in a prospective study. Focal myocardial signal alterations detected in synthetically generated MR images were MR quantified for their T1, T2 and PD values. The locations of signal alteration measurements in PMCMR were targeted at autopsy heart dissection and cardiac tissue specimens were taken for histologic examinations. Quantified signal alterations in PMCMR were correlated to their according histologic age stage of myocardial infarction. In PMCMR seventy-three focal myocardial signal alterations were detected in 49 of 80 investigated hearts. These signal alterations were diagnosed histologically as early acute (n=39), acute (n=14), subacute (n=10) and chronic (n=10) age stages of myocardial infarction. Statistical analysis revealed that based on their quantitative T1, T2 and PD values, a significant difference between all defined age groups of myocardial infarction can be determined. It can be concluded that quantitative 1.5T PMCMR quantification based on quantitative T1, T2 and PD values is feasible for characterization and differentiation of early acute and following age stages of myocardial infarction. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Padroni, Marina; Bernardoni, Andrea; Tamborino, Carmine; Roversi, Gloria; Borrelli, Massimo; Saletti, Andrea; De Vito, Alessandro; Azzini, Cristiano; Borgatti, Luca; Marcello, Onofrio; d’Esterre, Christopher; Ceruti, Stefano; Casetta, Ilaria; Lee, Ting-Yim; Fainardi, Enrico
2016-01-01
Introduction The capability of CT perfusion (CTP) Alberta Stroke Program Early CT Score (ASPECTS) to predict outcome and identify ischemia severity in acute ischemic stroke (AIS) patients is still questioned. Methods 62 patients with AIS were imaged within 8 hours of symptom onset by non-contrast CT, CT angiography and CTP scans at admission and 24 hours. CTP ASPECTS was calculated on the affected hemisphere using cerebral blood flow (CBF), cerebral blood volume (CBV) and mean transit time (MTT) maps by subtracting 1 point for any abnormalities visually detected or measured within multiple cortical circular regions of interest according to previously established thresholds. MTT-CBV ASPECTS was considered as CTP ASPECTS mismatch. Hemorrhagic transformation (HT), recanalization status and reperfusion grade at 24 hours, final infarct volume at 7 days and modified Rankin scale (mRS) at 3 months after onset were recorded. Results Semi-quantitative and quantitative CTP ASPECTS were highly correlated (p<0.00001). CBF, CBV and MTT ASPECTS were higher in patients with no HT and mRS≤2 and inversely associated with final infarct volume and mRS (p values: from p<0.05 to p<0.00001). CTP ASPECTS mismatch was slightly associated with radiological and clinical outcomes (p values: from p<0.05 to p<0.02) only if evaluated quantitatively. A CBV ASPECTS of 9 was the optimal semi-quantitative value for predicting outcome. Conclusions Our findings suggest that visual inspection of CTP ASPECTS recognizes infarct and ischemic absolute values. Semi-quantitative CBV ASPECTS, but not CTP ASPECTS mismatch, represents a strong prognostic indicator, implying that core extent is the main determinant of outcome, irrespective of penumbra size. PMID:26824672
Takase, Miyuki; Oba, Keiko; Yamashita, Noriko
2009-07-01
Although nurse turnover is a serious problem, the fact that each nurse has different work-related needs/values, and leaves their job for different reasons makes it difficult for organisations to develop effective countermeasures against it. Understanding nurses' needs and the reasons for job turnover by the generation in which they were born may provide some feasible solutions. The purpose of the study was to identify specific work-related needs and values of nurses in three generations (born in 1946-1959, 1960-1974, 1975-present). The study also aimed to explore generation-specific reasons that might make nurses consider leaving the jobs. The study was conducted in three public hospitals in Japan. A convenience sample of 315 registered nurses participated in the study. A survey method was used to collect quantitative and qualitative data. Quantitative data were analysed by ANOVA, and qualitative data were analysed by content analysis. Nurses born between 1960 and 1974 embraced high needs and values in professional privileges such as autonomy and recognition, while those born after 1975 expressed low needs and values in the opportunities for clinical challenge. For nurses born between 1960 and 1974, the imbalance between their jobs and personal life made them consider leaving their jobs. For those born after 1975, losing the confidence to care made them consider turning over. Nurses born after 1960 tended to value economic return and job security more highly compared to those born between 1946 and 1959. Nurses in different generations have different sets of needs/values and reasons for job turnover. Understanding generation-specific needs and values of nurses may enable organisations and Nurse Managers to develop feasible and effective countermeasures to reduce nurse turnover.
Method for the concentration and separation of actinides from biological and environmental samples
Horwitz, E. Philip; Dietz, Mark L.
1989-01-01
A method and apparatus for the quantitative recovery of actinide values from biological and environmental samples by passing appropriately prepared samples in a mineral acid solution through a separation column of a dialkyl(phenyl)-N,N-dialkylcarbamoylmethylphosphine oxide dissolved in tri-n-butyl phosphate on an inert substrate, which selectively extracts the actinide values. The actinide values can be eluted either as a group or individually and their presence quantitatively detected by alpha counting.
Alqasim, Khalid M; Ali, Eman N; Evers, Silvia M; Hiligsmann, Mickaël
2016-01-01
To assess the views, knowledge, and experience of Dutch physicians with regard to the general objectives and values of the pay-for-performance (P4P) system, as the Dutch healthcare industry might find it useful, in terms of governance, to explore this approach further. A quantitative cross-sectional survey study was conducted among 48 physicians in surgical specialties in the Netherlands between May 2014 and July 2014. The survey questionnaire was designed to gather information regarding the intensity of feelings, on a 7-point Likert scale, toward statements that address the P4P system. Confidence intervals were calculated using the bootstrap technique with 1000 iterations. Physicians see a positive value in P4P for their organizations rather than for personal attainment (mean = 5.00; 95% CI = 4.62-5.39), even though they feared that P4P might put financial pressure on them (mean = 5.03; 95% CI = 4.50-5.54). They strongly share the view that other colleagues will resist adopting P4P as a business model (mean = 5.74; 95% CI = 5.43-6.04). Respondents stated that they would not leave their current jobs if P4P were to be incorporated in their organization. Physicians see value in P4P for their organizations, and consider that P4P could provide an incentive for improving medical outcomes. There seems to be potential for the P4P system in the Netherlands as participants expressed positive support for its values. There is an intersection of interests between the value of P4P and the physicians' aim of achieving quality outcomes; however, further studies would be needed to investigate perceptions about specific design features in a larger sample. In addition, prior to implementing P4P, broad education about the system should be provided in order to counteract pre-conceptions and prevent resistance.
Venkatakrishnan, K; Ecsedy, J A
2017-01-01
Clinical pharmacodynamic evaluation is a key component of the "pharmacologic audit trail" in oncology drug development. We posit that its value can and should be greatly enhanced via application of a robust quantitative pharmacology framework informed by biologically mechanistic considerations. Herein, we illustrate examples of intersectional blindspots across the disciplines of quantitative pharmacology and translational science and offer a roadmap aimed at enhancing the caliber of clinical pharmacodynamic research in the development of oncology therapeutics. © 2016 American Society for Clinical Pharmacology and Therapeutics.
ERIC Educational Resources Information Center
Corrao, Jocelyn J.
2016-01-01
The researcher designed this quantitative dissertation research to explore the perceptions of beginning nursing students toward professionalism in nursing, specific to professional values within the context of curriculum delivery for a leadership and management course in one baccalaureate nursing program. In addition, the researcher reviewed the…
ERIC Educational Resources Information Center
Punch, Raymond J.
2012-01-01
The purpose of the quantitative regression study was to explore and identify relationships between college instructors' attitudes toward the use of computer-based simulation programs and their perceptions of the value of such programs. A relationship has been reported between attitudes toward use and perceptions of the value of…
You, Jun; Chen, Juan; Xiang, Feixiang; Song, Yue; Khamis, Simai; Lu, Chengfa; Lv, Qing; Zhang, Yanrong; Xie, Mingxing
2018-04-01
This study aimed at evaluating the diagnostic performance of quantitative shear wave elastography (SWE) in differentiating metastatic cervical lymph nodes from benign nodes in patients with thyroid nodules. One hundred and forty-one cervical lymph nodes from 39 patients with thyroid nodules that were diagnosed as papillary thyroid cancer had been imaged with SWE. The shear elasticity modulus, which indicates the stiffness of the lymph nodes, was measured in terms of maximum shear elasticity modulus (maxSM), minimum shear elasticity modulus (minSM), mean shear elasticity modulus (meanSM), and standard deviation (SD) of the shear elasticity modulus. All the patients underwent thyroid surgery, 50 of the suspicious lymph nodes were resected, and 91 lymph nodes were followed up for 6 months. The maxSM value, minSM value, meanSM value, and SD value of the metastatic lymph nodes were significantly higher than those of the benign nodes. The area under the curve of the maxSM value, minSM value, meanSM value, and SD value were 0.918, 0.606, 0.865, and 0.915, respectively. SWE can differentiate metastasis from benign cervical lymph nodes in patients with thyroid nodules, and the maxSM, meanSM, and SD may be valuable quantitative indicators for characterizing cervical lymph nodes.
Jing, Guojie; Yao, Xiaoteng; Li, Yiyi; Xie, Yituan; Li, Wang'an; Liu, Kejun; Jing, Yingchao; Li, Baisheng; Lv, Yifan; Ma, Baoxin
2014-01-01
Fractional anisotropy values in diffusion tensor imaging can quantitatively reflect the consistency of nerve fibers after brain damage, where higher values generally indicate less damage to nerve fibers. Therefore, we hypothesized that diffusion tensor imaging could be used to evaluate the effect of mild hypothermia on diffuse axonal injury. A total of 102 patients with diffuse axonal injury were randomly divided into two groups: normothermic and mild hypothermic treatment groups. Patients' modified Rankin scale scores 2 months after mild hypothermia were significantly lower than those for the normothermia group. The difference in average fractional anisotropy value for each region of interest before and after mild hypothermia was 1.32-1.36 times higher than the value in the normothermia group. Quantitative assessment of diffusion tensor imaging indicates that mild hypothermia therapy may be beneficial for patients with diffuse axonal injury. PMID:25206800
Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons
2014-01-01
Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the open-source proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.
Establishing the credibility of qualitative research findings: the plot thickens.
Cutcliffe, J R; McKenna, H P
1999-08-01
Qualitative research is increasingly recognized and valued, and its unique place in nursing research is highlighted by many. Despite this, some nurse researchers continue to raise epistemological issues about the problems of objectivity and the validity of qualitative research findings. This paper explores the issues relating to the representativeness or credibility of qualitative research findings. It therefore critiques the existing distinct philosophical and methodological positions concerning the trustworthiness of qualitative research findings, which are described as follows: qualitative studies should be judged using the same criteria and terminology as quantitative studies; it is impossible, in a meaningful way, for any criteria to be used to judge qualitative studies; qualitative studies should be judged using criteria that are developed for and fit the qualitative paradigm; and the credibility of qualitative research findings could be established by testing out the emerging theory by means of conducting a deductive quantitative study. The authors conclude by providing some guidelines for establishing the credibility of qualitative research findings.
NASA Astrophysics Data System (ADS)
Galmed, A. H.; Elshemey, Wael M.
2017-08-01
Differentiating between normal, benign and malignant excised breast tissues is one of the major worldwide challenges that need a quantitative, fast and reliable technique in order to avoid personal errors in diagnosis. Laser induced fluorescence (LIF) is a promising technique that has been applied for the characterization of biological tissues including breast tissue. Unfortunately, only few studies have adopted a quantitative approach that can be directly applied for breast tissue characterization. This work provides a quantitative means for such characterization via introduction of several LIF characterization parameters and determining the diagnostic accuracy of each parameter in the differentiation between normal, benign and malignant excised breast tissues. Extensive analysis on 41 lyophilized breast samples using scatter diagrams, cut-off values, diagnostic indices and receiver operating characteristic (ROC) curves, shows that some spectral parameters (peak height and area under the peak) are superior for characterization of normal, benign and malignant breast tissues with high sensitivity (up to 0.91), specificity (up to 0.91) and accuracy ranking (highly accurate).
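As a reminder of how cut-off values like those above translate into the reported sensitivity and specificity, here is a small sketch; the parameter distributions are invented, not the paper's measurements.

```python
import numpy as np

def sens_spec(diseased_values, normal_values, cutoff, positive_above=True):
    """Sensitivity and specificity for a single cut-off on a scalar parameter."""
    d = np.asarray(diseased_values)
    n = np.asarray(normal_values)
    if positive_above:
        return np.mean(d >= cutoff), np.mean(n < cutoff)
    return np.mean(d <= cutoff), np.mean(n > cutoff)

rng = np.random.default_rng(1)
malignant_peak_height = rng.normal(10.0, 1.5, 20)   # hypothetical LIF peak heights
normal_peak_height = rng.normal(6.0, 1.5, 21)
print(sens_spec(malignant_peak_height, normal_peak_height, cutoff=8.0))
```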
Semi-Quantitative Scoring of an Immunochromatographic Test for Circulating Filarial Antigen
Chesnais, Cédric B.; Missamou, François; Pion, Sébastien D. S.; Bopda, Jean; Louya, Frédéric; Majewski, Andrew C.; Weil, Gary J.; Boussinesq, Michel
2013-01-01
The value of a semi-quantitative scoring of the filarial antigen test (Binax Now Filariasis card test, ICT) results was evaluated during a field survey in the Republic of Congo. One hundred and thirty-four (134) of 774 tests (17.3%) were clearly positive and were scored 1, 2, or 3; and 11 (1.4%) had questionable results. Wuchereria bancrofti microfilariae (mf) were detected in 41 of those 133 individuals with an ICT test score ≥ 1 who also had a night blood smear; none of the 11 individuals with questionable ICT results harbored night mf. Cuzick's test showed a significant trend for higher microfilarial densities in groups with higher ICT scores (P < 0.001). The ICT scores were also significantly correlated with blood mf counts. Because filarial antigen levels provide an indication of adult worm infection intensity, our results suggest that semi-quantitative reading of the ICT may be useful for grading the intensity of filarial infections in individuals and populations. PMID:24019435
Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.
Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse
2018-05-01
Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of the power spectral Capon estimator in quantitative Doppler analysis using data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations using in vivo and string phantom acquisitions. In vivo results demonstrated that the Capon estimator can provide spectral estimates with sufficient quality for quantitative analysis using packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated using spectrograms estimated on a commercial ultrasound scanner.
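For readers unfamiliar with the estimator, a minimal Capon (minimum-variance) spectral estimate over a packet of slow-time samples looks like the sketch below, with the covariance averaged over neighbouring spatial samples as the abstract describes. Packet size, diagonal loading, and all signal parameters are illustrative choices, not the paper's settings, and clutter rejection is omitted.

```python
import numpy as np

def capon_spectrum(X, n_freq=128, loading=1e-2):
    """X: complex slow-time data, shape (n_spatial, n_pulses)."""
    X = np.asarray(X, dtype=complex)
    M, N = X.shape
    R = X.T @ X.conj() / M                                # spatially averaged covariance
    R += loading * np.real(np.trace(R)) / N * np.eye(N)   # diagonal loading
    R_inv = np.linalg.inv(R)
    freqs = np.linspace(-0.5, 0.5, n_freq, endpoint=False)
    n = np.arange(N)
    power = np.empty(n_freq)
    for i, f in enumerate(freqs):
        a = np.exp(2j * np.pi * f * n)                    # steering vector
        power[i] = 1.0 / np.real(a.conj() @ R_inv @ a)
    return freqs, power

# Toy packet: 16 pulses at 8 spatial samples, Doppler shift of 0.2 cycles/pulse plus noise.
rng = np.random.default_rng(2)
signal = np.exp(2j * np.pi * 0.2 * np.arange(16))[None, :]
X = signal + 0.1 * (rng.standard_normal((8, 16)) + 1j * rng.standard_normal((8, 16)))
freqs, power = capon_spectrum(X)
print(freqs[np.argmax(power)])   # close to 0.2
```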
The Science Manager's Guide to Case Studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Branch, Kristi M.; Peffers, Melissa S.; Ruegg, Rosalie T.
2001-09-24
This guide takes the science manager through the steps of planning, implementing, validating, communicating, and using case studies. It outlines the major methods of analysis, describing their relative merits and applicability while providing relevant examples and sources of additional information. Well-designed case studies can provide a combination of rich qualitative and quantitative information, offering valuable insights into the nature, outputs, and longer-term impacts of the research. An objective, systematic, and credible approach to the evaluation of U.S. Department of Energy Office of Science programs adds value to the research process and is the subject of this guide.
Oyster reef restoration supports increased nekton biomass and potential commercial fishery value
2015-01-01
Across the globe, discussions centered on the value of nature drive many conservation and restoration decisions. As a result, justification for management activities increasingly asks for two lines of evidence: (1) biological proof of augmented ecosystem function or service, and (2) monetary valuation of these services. For oyster reefs, which have seen significant global declines and increasing restoration work, the need to provide both biological and monetary evidence of reef services on a local-level has become more critical in a time of declining resources. Here, we quantified species biomass and potential commercial value of nekton collected from restored oyster (Crassostrea virginica) reefs in coastal Louisiana over a 3-year period, providing multiple snapshots of biomass support over time. Overall, and with little change over time, fish and invertebrate biomass is 212% greater at restored oyster reefs than mud-bottom, or 0.12 kg m−2. The additional biomass of commercial species is equivalent to an increase of local fisheries value by 226%, or $0.09 m−2. Understanding the ecosystem value of restoration projects, and how they interact with regional management priorities, is critical to inform local decision-making and provide testable predictions. Quantitative estimates of potential commercial fisheries enhancement by oyster reef restoration such as this one can be used directly by local managers to determine the expected return on investment. PMID:26336635
Oyster reef restoration supports increased nekton biomass and potential commercial fishery value
Humphries, Austin T.; LaPeyre, Megan K.
2015-01-01
Across the globe, discussions centered on the value of nature drive many conservation and restoration decisions. As a result, justification for management activities increasingly asks for two lines of evidence: (1) biological proof of augmented ecosystem function or service, and (2) monetary valuation of these services. For oyster reefs, which have seen significant global declines and increasing restoration work, the need to provide both biological and monetary evidence of reef services on a local-level has become more critical in a time of declining resources. Here, we quantified species biomass and potential commercial value of nekton collected from restored oyster (Crassostrea virginica) reefs in coastal Louisiana over a 3-year period, providing multiple snapshots of biomass support over time. Overall, and with little change over time, fish and invertebrate biomass is 212% greater at restored oyster reefs than mud-bottom, or 0.12 kg m−2. The additional biomass of commercial species is equivalent to an increase of local fisheries value by 226%, or $0.09 m−2. Understanding the ecosystem value of restoration projects, and how they interact with regional management priorities, is critical to inform local decision-making and provide testable predictions. Quantitative estimates of potential commercial fisheries enhancement by oyster reef restoration such as this one can be used directly by local managers to determine the expected return on investment.
Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu
2017-11-14
This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 (p<0.05). However, the Ve values decreased significantly only at week 9 (p=0.032), and no difference in the Kep was found between the two groups. The BMD values of the OVX group decreased significantly compared with those of the control group from week 3 (p<0.05). Transmission electron microscopy showed tighter gaps between vascular endothelial cells with swollen mitochondria in the OVX group from week 3. The MVD values of the OVX group decreased significantly compared with those of the control group only at week 12 (p=0.023). A weak positive correlation of Emax and a strong positive correlation of Ktrans with MVD were found. Compared with semi-quantitative DCE-MRI, the quantitative DCE-MRI parameter Ktrans is a more sensitive and accurate index for detecting early reduced perfusion in osteoporotic bone.
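The Ktrans, Ve, and Kep parameters above are typically tied together through a Tofts-type pharmacokinetic model, in which Kep = Ktrans / Ve. The abstract does not state which model the authors fitted, so the forward model below is only a generic illustration with invented inputs.

```python
import numpy as np

def tofts_tissue_curve(t, cp, ktrans, ve):
    """Ct(t) = Ktrans * integral of cp(tau) * exp(-Kep * (t - tau)) dtau, with Kep = Ktrans / Ve."""
    kep = ktrans / ve
    dt = t[1] - t[0]
    kernel = np.exp(-kep * t)
    return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

t = np.arange(0.0, 300.0, 1.0)                         # seconds
cp = 5.0 * (t / 30.0) * np.exp(1.0 - t / 30.0)         # toy arterial input function, mM
ct = tofts_tissue_curve(t, cp, ktrans=0.005, ve=0.3)   # ktrans in 1/s
print(f"Kep = {0.005 / 0.3:.4f} 1/s, peak Ct = {ct.max():.3f} mM")
```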
Bhatta, Dharma Nand; Liabsuetrakul, Tippawan
2016-09-01
Human immunodeficiency virus (HIV) infection affects the quality of life of infected people. It is well known that empowerment is important for appropriate access to health care and quality of care. However, explicit methods to increase the empowerment of HIV-infected people are currently limited. This study aimed to develop and test the feasibility of a social self-value package as an empowerment intervention for HIV-infected people. Each group included 8-10 participants, and each session lasted one and a half hours. Six sessions in total were developed, with one session conducted per week. A total of 66 participants were randomly selected to take part in the package, and its feasibility was assessed both quantitatively and qualitatively. Attitudes towards HIV-related issues changed significantly and positively after each session. Client satisfaction and acceptability of the intervention were very high, indicating high feasibility and good design. The qualitative findings supported the quantitative findings: both participants and counselors accepted, and were satisfied with, the structure and contents of the package. This study revealed that providing an inclusive six-week social self-value package for HIV-infected Nepali people appears to be feasible. Its effect as an empowerment intervention will be measured in a randomized controlled trial.
Method for the concentration and separation of actinides from biological and environmental samples
Horwitz, E.P.; Dietz, M.L.
1989-05-30
A method and apparatus for the quantitative recovery of actinide values from biological and environmental samples by passing appropriately prepared samples in a mineral acid solution through a separation column of a dialkyl(phenyl)-N,N-dialkylcarbamoylmethylphosphine oxide dissolved in tri-n-butyl phosphate on an inert substrate, which selectively extracts the actinide values. The actinide values can be eluted either as a group or individually and their presence quantitatively detected by alpha counting. 3 figs.
Quantitative observations of hydrogen-induced, slow crack growth in a low alloy steel
NASA Technical Reports Server (NTRS)
Nelson, H. G.; Williams, D. P.
1973-01-01
Hydrogen-induced slow crack growth, da/dt, was studied in AISI-SAE 4130 low alloy steel in gaseous hydrogen and distilled water environments as a function of applied stress intensity, K, at various temperatures, hydrogen pressures, and alloy strength levels. At low values of K, da/dt was found to exhibit a strong exponential K dependence (Stage 1 growth) in both hydrogen and water. At intermediate values of K, da/dt exhibited a small but finite K dependence (Stage 2), with the Stage 2 slope being greater in hydrogen than in water. In hydrogen, at a constant K, (da/dt)_2 varied inversely with alloy strength level and varied essentially in the same complex manner with temperature and hydrogen pressure as noted previously. The results of this study provide support for most of the qualitative predictions of the lattice decohesion theory as recently modified by Oriani. The lack of quantitative agreement between data and theory and the inability of theory to explain the observed pressure dependence of slow crack growth are mentioned and possible rationalizations to account for these differences are presented.
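The two growth regimes described above are often summarized with simple functional forms. As a hedged sketch only (the exact fitting expressions used in the study are not given in the abstract), the Stage 1 and Stage 2 behavior might be written as:

```latex
% Stage 1: strong exponential dependence of crack growth rate on stress intensity K
\[
  \left(\frac{da}{dt}\right)_{1} = A\, e^{\beta K}
\]
% Stage 2: weak (nearly constant) K dependence, with magnitude set by temperature,
% hydrogen pressure, and alloy strength level (functional form assumed)
\[
  \left(\frac{da}{dt}\right)_{2} \approx C\!\left(T,\, p_{\mathrm{H_2}},\, \sigma_{ys}\right)\bigl(1 + m K\bigr), \qquad m \ \text{small}
\]
```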
Computed Tomography Perfusion Imaging for the Diagnosis of Hepatic Alveolar Echinococcosis
Sade, Recep; Kantarci, Mecit; Genc, Berhan; Ogul, Hayri; Gundogdu, Betul; Yilmaz, Omer
2018-01-01
Objective: Alveolar echinococcosis (AE) is a rare life-threatening parasitic infection. Computed tomography perfusion (CTP) imaging has the potential to provide both quantitative and qualitative information about tissue perfusion characteristics. The purpose of this study was to examine the characteristic features and feasibility of CTP in AE liver lesions. Material and Methods: CTP scanning was performed in 25 patients who had a total of 35 lesions identified as AE of the liver. Blood flow (BF), blood volume (BV), portal venous perfusion (PVP), arterial liver perfusion (ALP), and hepatic perfusion indexes (HPI) were computed for background liver parenchyma and each AE lesion. Results: Significant differences were detected between perfusion values of the AE lesions and background liver tissue. The BV, BF, ALP, and PVP values for all components of the AE liver lesions were significantly lower than those of the normal liver parenchyma (p<0.01). Conclusions: We suggest that perfusion imaging can be used in AE of the liver. Thus, quantitative knowledge of perfusion parameters can be obtained via CT perfusion imaging. PMID:29531482
Wisnieff, Cynthia; Ramanan, Sriram; Olesik, John; Gauthier, Susan; Wang, Yi; Pitt, David
2014-01-01
Purpose Within multiple sclerosis (MS) lesions iron is present in chronically activated microglia. Thus, iron detection with MRI might provide a biomarker for chronic inflammation within lesions. Here, we examine contributions of iron and myelin to magnetic susceptibility of lesions on quantitative susceptibility mapping (QSM). Methods Fixed MS brain tissue was assessed with MRI including gradient echo data, which was processed to generate field (phase), R2* and QSM. Five lesions were sectioned and evaluated by immunohistochemistry for presence of myelin, iron and microglia/macrophages. Two of the lesions had an elemental analysis for iron concentration mapping, and their phospholipid content was estimated from the difference in the iron and QSM data. Results Three of the five lesions had substantial iron deposition that was associated with microglia and positive susceptibility values. For the two lesions with elemental analysis, the QSM derived phospholipid content maps were consistent with myelin labeled histology. Conclusion Positive susceptibility values with respect to water indicate the presence of iron in MS lesions, though both demyelination and iron deposition contribute to QSM. PMID:25137340
Sasai-Sakuma, Taeko; Frauscher, Birgit; Mitterling, Thomas; Ehrmann, Laura; Gabelia, David; Brandauer, Elisabeth; Inoue, Yuichi; Poewe, Werner; Högl, Birgit
2014-09-01
Rapid eye movement (REM) sleep without atonia (RWA) is observed in some patients without a clinical history of REM sleep behavior disorder (RBD). It remains unknown whether these patients meet the refined quantitative electromyographic (EMG) criteria supporting a clinical RBD diagnosis. We quantitatively evaluated EMG activity and investigated its overnight distribution in patients with isolated qualitative RWA. Fifty participants with an incidental polysomnographic finding of RWA (isolated qualitative RWA) were included. Tonic, phasic, and 'any' EMG activity during REM sleep on PSG were quantified retrospectively. Referring to the quantitative cut-off values for a polysomnographic diagnosis of RBD, 7/50 (14%) and 6/50 (12%) of the patients showed phasic and 'any' EMG activity in the mentalis muscle above the respective cut-off values. No patient was above the cut-off value for tonic EMG activity or phasic EMG activity in the anterior tibialis muscles. Patients with RWA above the cut-off value showed higher amounts of RWA during later REM sleep periods. This is the first study showing that some subjects with incidental RWA meet the refined quantitative EMG criteria for a diagnosis of RBD. Future longitudinal studies must investigate whether this subgroup with isolated qualitative RWA is at an increased risk of developing fully expressed RBD and/or neurodegenerative disease. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Milyakov, Hristo; Tanev, Stoyan; Ruskov, Petko
2011-03-01
Value co-creation is an emerging business and innovation paradigm; however, there is not enough clarity on the distinctive characteristics of value co-creation as compared to more traditional value creation approaches. The present paper summarizes the results of an empirically derived research study focusing on the development of a systematic procedure for the identification of firms that are active in value co-creation. The study is based on a sample of 273 firms that were selected as representative of the breadth of their value co-creation activities. The results include: i) the identification of the key components of value co-creation based on a research methodology using web search and Principal Component Analysis techniques, and ii) the comparison of two different classification techniques identifying the firms with the highest degree of involvement in value co-creation practices. To the best of our knowledge, this is the first study using sophisticated data collection techniques to provide a classification of firms according to the degree of their involvement in value co-creation.
Smith, Eric G.
2015-01-01
Background: Nonrandomized studies typically cannot account for confounding from unmeasured factors. Method: A method is presented that exploits the recently-identified phenomenon of “confounding amplification” to produce, in principle, a quantitative estimate of total residual confounding resulting from both measured and unmeasured factors. Two nested propensity score models are constructed that differ only in the deliberate introduction of an additional variable(s) that substantially predicts treatment exposure. Residual confounding is then estimated by dividing the change in treatment effect estimate between models by the degree of confounding amplification estimated to occur, adjusting for any association between the additional variable(s) and outcome. Results: Several hypothetical examples are provided to illustrate how the method produces a quantitative estimate of residual confounding if the method’s requirements and assumptions are met. Previously published data is used to illustrate that, whether or not the method routinely provides precise quantitative estimates of residual confounding, the method appears to produce a valuable qualitative estimate of the likely direction and general size of residual confounding. Limitations: Uncertainties exist, including identifying the best approaches for: 1) predicting the amount of confounding amplification, 2) minimizing changes between the nested models unrelated to confounding amplification, 3) adjusting for the association of the introduced variable(s) with outcome, and 4) deriving confidence intervals for the method’s estimates (although bootstrapping is one plausible approach). Conclusions: To this author’s knowledge, it has not been previously suggested that the phenomenon of confounding amplification, if such amplification is as predictable as suggested by a recent simulation, provides a logical basis for estimating total residual confounding. The method's basic approach is straightforward. The method's routine usefulness, however, has not yet been established, nor has the method been fully validated. Rapid further investigation of this novel method is clearly indicated, given the potential value of its quantitative or qualitative output. PMID:25580226
2015-01-01
Although both rhizome and root of Panax notoginseng are officially utilized as notoginseng in “Chinese Pharmacopoeia”, individual parts of the root were differently used in practice. To provide chemical evidence for the differentiated usage, quantitative comparison and metabolite profiling of different portions derived from the whole root, as well as commercial samples, were carried out, showing an overall higher content of saponins in rhizome, followed by main root, branch root, and fibrous root. Ginsenoside Rb2 was proposed as a potential marker with a content of 0.5 mg/g as a threshold value for differentiating rhizome from other parts. Multivariate analysis of the metabolite profile further suggested 32 saponins as potential markers for the discrimination of different parts of notoginseng. Collectively, the study provided comprehensive chemical evidence for the distinct usage of different parts of notoginseng and, hence, is of great importance for the rational application and exploitation of individual parts of notoginseng. PMID:25118819
The attentional drift-diffusion model extends to simple purchasing decisions.
Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio
2012-01-01
How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions.
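The core of the aDDM can be illustrated with a small simulation. The sketch below is a minimal, hypothetical implementation, assuming a relative decision value that drifts toward the currently fixated option while the unattended option's value is discounted by a factor theta; the parameter values and fixation schedule are illustrative, not the fitted values reported in the study.

```python
import numpy as np

def simulate_addm_trial(v_left, v_right, d=0.002, theta=0.5, sigma=0.02,
                        fix_dur=300, rng=np.random.default_rng(0)):
    """Simulate one attention-weighted drift-diffusion trial.

    A relative decision value (RDV) accumulates toward +1 (choose left) or
    -1 (choose right); while fixating one option, the other option's value
    is down-weighted by theta. All parameters here are illustrative only.
    """
    rdv, t = 0.0, 0
    fix_left = bool(rng.integers(2))          # random first fixation
    while abs(rdv) < 1.0:
        if t > 0 and t % fix_dur == 0:
            fix_left = not fix_left           # alternate fixations
        if fix_left:
            drift = d * (v_left - theta * v_right)
        else:
            drift = d * (theta * v_left - v_right)
        rdv += drift + rng.normal(0.0, sigma)
        t += 1                                # 1 ms time steps
    return ("left" if rdv >= 1.0 else "right"), t

choice, rt_ms = simulate_addm_trial(v_left=3, v_right=2)
print(choice, rt_ms)
```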
Simulation of FRET dyes allows quantitative comparison against experimental data
NASA Astrophysics Data System (ADS)
Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander
2018-03-01
Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only a few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other, leading to new insights into biomolecular dynamics and function.
Hyperspectral Imaging and SPA-LDA Quantitative Analysis for Detection of Colon Cancer Tissue
NASA Astrophysics Data System (ADS)
Yuan, X.; Zhang, D.; Wang, Ch.; Dai, B.; Zhao, M.; Li, B.
2018-05-01
Hyperspectral imaging (HSI) has been demonstrated to provide a rapid, precise, and noninvasive method for cancer detection. However, because HSI produces large amounts of data, quantitative analysis is often necessary to distill information useful for distinguishing cancerous from normal tissue. To demonstrate that HSI with our proposed algorithm can make this distinction, we built a Vis-NIR HSI setup and acquired many spectral images of colon tissues, and then used a successive projection algorithm (SPA) to analyze the hyperspectral image data of the tissues. This was used to build an identification model based on linear discriminant analysis (LDA) using the relative reflectance values of the effective wavelengths. Other tissues were used as a prediction set to verify the reliability of the identification model. The results suggest that Vis-NIR hyperspectral images, together with the spectroscopic classification method, provide a new approach for reliable and safe diagnosis of colon cancer and could lead to advances in cancer diagnosis generally.
Simulation of UV atomic radiation for application in exhaust plume spectrometry
NASA Astrophysics Data System (ADS)
Wallace, T. L.; Powers, W. T.; Cooper, A. E.
1993-06-01
Quantitative analysis of exhaust plume spectral data has long been a goal of developers of advanced engine health monitoring systems which incorporate optical measurements of rocket exhaust constituents. Discussed herein is the status of present efforts to model and predict atomic radiation spectra and infer free-atom densities from emission/absorption measurements as part of the Optical Plume Anomaly Detection (OPAD) program at Marshall Space Flight Center (MSFC). A brief examination of the mathematical formalism is provided in the context of predicting radiation from the Mach disk region of the SSME exhaust flow at nominal conditions during ground level testing at MSFC. Computational results are provided for Chromium and Copper at selected transitions which indicate a strong dependence upon broadening parameter values determining the absorption-emission line shape. Representative plots of recent spectral data from the Stennis Space Center (SSC) Diagnostic Test Facility (DTF) rocket engine are presented and compared to numerical results from the present self-absorbing model; a comprehensive quantitative analysis will be reported at a later date.
SVD compression for magnetic resonance fingerprinting in the time domain.
McGivney, Debra F; Pierre, Eric; Ma, Dan; Jiang, Yun; Saybasili, Haris; Gulani, Vikas; Griswold, Mark A
2014-12-01
Magnetic resonance (MR) fingerprinting is a technique for acquiring and processing MR data that simultaneously provides quantitative maps of different tissue parameters through a pattern recognition algorithm. A predefined dictionary models the possible signal evolutions simulated using the Bloch equations with different combinations of various MR parameters and pattern recognition is completed by computing the inner product between the observed signal and each of the predicted signals within the dictionary. Though this matching algorithm has been shown to accurately predict the MR parameters of interest, one desires a more efficient method to obtain the quantitative images. We propose to compress the dictionary using the singular value decomposition, which will provide a low-rank approximation. By compressing the size of the dictionary in the time domain, we are able to speed up the pattern recognition algorithm, by a factor of between 3.4-4.8, without sacrificing the high signal-to-noise ratio of the original scheme presented previously.
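The compression step described above can be sketched compactly. The example below is a minimal illustration, assuming a dictionary matrix with one row per simulated signal evolution; the sizes, rank, and random stand-in data are hypothetical, and matching is done by inner products in the compressed time domain.

```python
import numpy as np

rng = np.random.default_rng(0)
n_entries, n_timepoints, k = 5000, 1000, 25        # hypothetical sizes and rank

D = rng.standard_normal((n_entries, n_timepoints))  # stand-in for the dictionary
# SVD of the dictionary; the first k right-singular vectors span the dominant
# temporal subspace used for the low-rank approximation.
U, s, Vt = np.linalg.svd(D, full_matrices=False)
Vk = Vt[:k].T                                        # n_timepoints x k

D_c = D @ Vk                                         # compressed dictionary
signal = D[123] + 0.05 * rng.standard_normal(n_timepoints)
sig_c = signal @ Vk                                  # project the observed signal

# Pattern recognition via normalized inner products in the compressed space.
scores = (D_c @ sig_c) / (np.linalg.norm(D_c, axis=1) * np.linalg.norm(sig_c))
print("best match:", int(np.argmax(scores)))         # expected: entry 123
```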
Field Assessment of Energy Audit Tools for Retrofit Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, J.; Bohac, D.; Nelson, C.
2013-07-01
This project focused on the use of home energy ratings as a tool to promote energy retrofits in existing homes. A home energy rating provides a quantitative appraisal of a home's energy performance, usually compared to a benchmark such as the average energy use of similar homes in the same region. Rating systems based on energy performance models, the focus of this report, can establish a home's achievable energy efficiency potential and provide a quantitative assessment of energy savings after retrofits are completed, although their accuracy needs to be verified by actual measurement or billing data. Ratings can also show homeowners where they stand compared to their neighbors, thus creating social pressure to conform to or surpass others. This project field-tested three different building performance models of varying complexity, in order to assess their value as rating systems in the context of a residential retrofit program: Home Energy Score, SIMPLE, and REM/Rate.
The Attentional Drift-Diffusion Model Extends to Simple Purchasing Decisions
Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio
2012-01-01
How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions. PMID:22707945
Are quantitative cultures useful in the diagnosis of hospital-acquired pneumonia?
San Pedro, G
2001-02-01
Noninvasive and invasive tests have been developed and studied for their utility in diagnosing and guiding the treatment of hospital-acquired pneumonia, a condition with an inherently high mortality. Early empiric antibiotic treatment has been shown to reduce mortality, so delaying this treatment until test results are available is not justifiable. Furthermore, tailoring therapy based on results of either noninvasive or invasive tests has not been clearly shown to affect morbidity and mortality. This may be related to quantitative limitations of these tests or possibly to a high false-negative rate in patients who receive early antibiotic treatment and may therefore have suppressed bacterial counts. Results of these tests, however, do influence treatment. It is therefore hoped that they may ultimately provide a rational basis for making therapeutic decisions. In the future, outcomes research should be a part of large-scale clinical trials, and noninvasive and invasive tests should be incorporated into the design in an attempt to provide a better understanding of the value of such tests.
SVD Compression for Magnetic Resonance Fingerprinting in the Time Domain
McGivney, Debra F.; Pierre, Eric; Ma, Dan; Jiang, Yun; Saybasili, Haris; Gulani, Vikas; Griswold, Mark A.
2016-01-01
Magnetic resonance fingerprinting is a technique for acquiring and processing MR data that simultaneously provides quantitative maps of different tissue parameters through a pattern recognition algorithm. A predefined dictionary models the possible signal evolutions simulated using the Bloch equations with different combinations of various MR parameters and pattern recognition is completed by computing the inner product between the observed signal and each of the predicted signals within the dictionary. Though this matching algorithm has been shown to accurately predict the MR parameters of interest, one desires a more efficient method to obtain the quantitative images. We propose to compress the dictionary using the singular value decomposition (SVD), which will provide a low-rank approximation. By compressing the size of the dictionary in the time domain, we are able to speed up the pattern recognition algorithm, by a factor of between 3.4-4.8, without sacrificing the high signal-to-noise ratio of the original scheme presented previously. PMID:25029380
Bae, Won C.; Ruangchaijatuporn, Thumanoon; Chang, Eric Y; Biswas, Reni; Du, Jiang; Statum, Sheronda
2016-01-01
Objective To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Materials and Methods Five cadaveric wrists (22 to 70 yrs) were imaged at 3T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. Results On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. Conclusion These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques. PMID:26691643
Harn, Nicholas R; Hunt, Suzanne L; Hill, Jacqueline; Vidoni, Eric; Perry, Mark; Burns, Jeffrey M
2017-08-01
Establishing reliable methods for interpreting elevated cerebral amyloid-β plaque on PET scans is increasingly important for radiologists, as availability of PET imaging in clinical practice increases. We examined a 3-step method to detect plaque in cognitively normal older adults, focusing on the additive value of quantitative information during the PET scan interpretation process. Fifty-five 18F-florbetapir PET scans were evaluated by 3 experienced raters. Scans were first visually interpreted as having "elevated" or "nonelevated" plaque burden ("Visual Read"). Images were then processed using standardized quantitative analysis software (MIMneuro) to generate whole brain and region of interest SUV ratios. This "Quantitative Read" was considered elevated if at least 2 of 6 regions of interest had an SUV ratio of more than 1.1. The final interpretation combined both visual and quantitative data together ("VisQ Read"). Cohen kappa values were assessed as a measure of interpretation agreement. Plaque was elevated in 25.5% to 29.1% of the 165 total Visual Reads. Interrater agreement was strong (kappa = 0.73-0.82) and consistent with reported values. Quantitative Reads were elevated in 45.5% of participants. Final VisQ Reads changed from initial Visual Reads in 16 interpretations (9.7%), with most changing from "nonelevated" Visual Reads to "elevated." These changed interpretations demonstrated lower plaque quantification than those initially read as "elevated" that remained unchanged. Interrater variability improved for VisQ Reads with the addition of quantitative information (kappa = 0.88-0.96). Inclusion of quantitative information increases consistency of PET scan interpretations for early detection of cerebral amyloid-β plaque accumulation.
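The Quantitative Read rule described above reduces to a simple count over regional SUV ratios. The sketch below is a trivial illustration with hypothetical regional values; the region names and numbers are assumptions, only the reported threshold (at least 2 of 6 regions with SUVr above 1.1) comes from the abstract.

```python
def quantitative_read(suvr_by_region, threshold=1.1, min_regions=2):
    """Return 'elevated' if at least min_regions regions exceed the SUVr threshold."""
    n_above = sum(1 for v in suvr_by_region.values() if v > threshold)
    return "elevated" if n_above >= min_regions else "nonelevated"

# Hypothetical regional SUV ratios for one scan.
example = {"frontal": 1.05, "temporal": 1.14, "parietal": 1.12,
           "precuneus": 1.08, "anterior_cingulate": 1.02, "posterior_cingulate": 1.09}
print(quantitative_read(example))   # two regions > 1.1, so 'elevated'
```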
Modelling the co-evolution of indirect genetic effects and inherited variability.
Marjanovic, Jovana; Mulder, Han A; Rönnegård, Lars; Bijma, Piter
2018-03-28
When individuals interact, their phenotypes may be affected not only by their own genes but also by genes in their social partners. This phenomenon is known as Indirect Genetic Effects (IGEs). In aquaculture species and some plants, however, competition not only affects trait levels of individuals, but also inflates variability of trait values among individuals. In the field of quantitative genetics, the variability of trait values has been studied as a quantitative trait in itself, and is often referred to as inherited variability. Such studies, however, consider only the genetic effect of the focal individual on trait variability and do not make a connection to competition. Although the observed phenotypic relationship between competition and variability suggests an underlying genetic relationship, the current quantitative genetic models of IGE and inherited variability do not allow for such a relationship. The lack of quantitative genetic models that connect IGEs to inherited variability limits our understanding of the potential of variability to respond to selection, both in nature and agriculture. Models of trait levels, for example, show that IGEs may considerably change heritable variation in trait values. Currently, we lack the tools to investigate whether this result extends to variability of trait values. Here we present a model that integrates IGEs and inherited variability. In this model, the target phenotype, say growth rate, is a function of the genetic and environmental effects of the focal individual and of the difference in trait value between the social partner and the focal individual, multiplied by a regression coefficient. The regression coefficient is a genetic trait, which is a measure of cooperation; a negative value indicates competition, a positive value cooperation, and an increasing value due to selection indicates the evolution of cooperation. In contrast to the existing quantitative genetic models, our model allows for co-evolution of IGEs and variability, as the regression coefficient can respond to selection. Our simulations show that the model results in increased variability of body weight with increasing competition. When competition decreases, i.e., cooperation evolves, variability becomes significantly smaller. Hence, our model facilitates quantitative genetic studies on the relationship between IGEs and inherited variability. Moreover, our findings suggest that we may have been overlooking an entire level of genetic variation in variability, the one due to IGEs.
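Written out under the assumptions stated in the abstract, the trait model for a focal individual i interacting with a social partner j might take a form like the following; the notation is hypothetical, with γ the heritable social regression coefficient (negative for competition, positive for cooperation):

```latex
% z_i: trait value of focal individual i;  A_i, E_i: its additive genetic and
% environmental effects;  gamma_i: heritable social regression coefficient.
\[
  z_i \;=\; \mu \;+\; A_i \;+\; E_i \;+\; \gamma_i \,\bigl(z_j - z_i\bigr)
\]
```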
Whittaker, Alexandra L; Williams, David L
2015-01-01
Rabbits are a common animal model in eye research and in safety testing of novel chemical agents. In addition, ocular disease is a routine presentation in clinical practice. However, few studies have quantitatively examined lacrimation kinetics in this species. This study used a noninvasive method of tear measurement (the Schirmer tear test, STT) to quantify values for basal and reflex tearing and to determine the kinetic nature of tear production in 76 New Zealand white rabbits. We obtained a value of 7.58 ± 2.3 mm/min for the standard 1-min STT. Calculated values for mean residual tear volume and reflex tear flow were 1.95 µL and 0.035 µL/s, respectively. In addition, this study provides preliminary evidence for an interaction effect between eyes given that higher STT values were obtained from the second eye tested. PMID:26632789
Normalization is a general neural mechanism for context-dependent decision making
Louie, Kenway; Khaw, Mel W.; Glimcher, Paul W.
2013-01-01
Understanding the neural code is critical to linking brain and behavior. In sensory systems, divisive normalization seems to be a canonical neural computation, observed in areas ranging from retina to cortex and mediating processes including contrast adaptation, surround suppression, visual attention, and multisensory integration. Recent electrophysiological studies have extended these insights beyond the sensory domain, demonstrating an analogous algorithm for the value signals that guide decision making, but the effects of normalization on choice behavior are unknown. Here, we show that choice models using normalization generate significant (and classically irrational) choice phenomena driven by either the value or number of alternative options. In value-guided choice experiments, both monkey and human choosers show novel context-dependent behavior consistent with normalization. These findings suggest that the neural mechanism of value coding critically influences stochastic choice behavior and provide a generalizable quantitative framework for examining context effects in decision making. PMID:23530203
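A minimal sketch of divisive value normalization follows, assuming the canonical form in which each option's value is divided by a semi-saturation constant plus the summed value of all options on offer; the constants and values are illustrative, not fitted quantities from the study.

```python
import numpy as np

def normalize_values(values, sigma=1.0, weight=1.0):
    """Divisive normalization: v_i / (sigma + weight * sum_j v_j)."""
    values = np.asarray(values, dtype=float)
    return values / (sigma + weight * values.sum())

# The same two target values yield smaller, more compressed signals as the
# value of a third alternative grows, producing context-dependent coding.
for distractor in (0.0, 5.0, 20.0):
    print(distractor, normalize_values([10.0, 8.0, distractor])[:2])
```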
NASA Astrophysics Data System (ADS)
Bocher, Thomas; Beuthan, Juergen; Scheller, M.; Hopf, Juergen U. G.; Linnarz, Marietta; Naber, Rolf-Dieter; Minet, Olaf; Becker, Wolfgang; Mueller, Gerhard J.
1995-12-01
Conventional laser-induced fluorescence spectroscopy (LIFS) of endogenous chromophores like NADH (nicotinamide adenine dinucleotide, reduced form) and PP IX (protoporphyrin IX) provides information about the relative amounts of these metabolites in the observed cells. For diagnostic applications, however, the concentrations of these chromophores have to be determined quantitatively to establish tissue-independent differentiation criteria. It is well known that the individually and locally varying optical tissue parameters are major obstacles to the determination of the true chromophore concentrations by simple fluorescence spectroscopy. To overcome these problems, a fiber-based, 2-channel technique including a rescaled NADH channel (delivering quantitative values) and a relative PP IX channel was developed. Using the accumulated information of both channels can provide good tissue state separation. Ex-vivo studies were performed with resected and frozen (LN2) samples of squamous cells in the histologically confirmed states normal, tumor border, inflammation, and hyperplasia. Each state was represented in this series by at least 7 samples. At the identical tissue spot, both the rescaled NADH fluorescence and the relative PP IX fluorescence were determined. In the first case a nitrogen laser (337 nm, 500 ps, 200 microjoule, 10 Hz) and in the latter case a diode laser (633 nm, 15 mW, cw) were used as excitation sources. In this ex-vivo study a good separation between the different tissue states was achieved. With a device constructed for clinical usage, one quantitative, in-vivo NADH measurement was done recently, showing similar separation capabilities.
General Methods for Evolutionary Quantitative Genetic Inference from Generalized Mixed Models.
de Villemereuil, Pierre; Schielzeth, Holger; Nakagawa, Shinichi; Morrissey, Michael
2016-11-01
Methods for inference and interpretation of evolutionary quantitative genetic parameters, and for prediction of the response to selection, are best developed for traits with normal distributions. Many traits of evolutionary interest, including many life history and behavioral traits, have inherently nonnormal distributions. The generalized linear mixed model (GLMM) framework has become a widely used tool for estimating quantitative genetic parameters for nonnormal traits. However, whereas GLMMs provide inference on a statistically convenient latent scale, it is often desirable to express quantitative genetic parameters on the scale upon which traits are measured. The parameters of fitted GLMMs, despite being on a latent scale, fully determine all quantities of potential interest on the scale on which traits are expressed. We provide expressions for deriving each of such quantities, including population means, phenotypic (co)variances, variance components including additive genetic (co)variances, and parameters such as heritability. We demonstrate that fixed effects have a strong impact on those parameters and show how to deal with this by averaging or integrating over fixed effects. The expressions require integration of quantities determined by the link function, over distributions of latent values. In general cases, the required integrals must be solved numerically, but efficient methods are available and we provide an implementation in an R package, QGglmm. We show that known formulas for quantities such as heritability of traits with binomial and Poisson distributions are special cases of our expressions. Additionally, we show how fitted GLMM can be incorporated into existing methods for predicting evolutionary trajectories. We demonstrate the accuracy of the resulting method for evolutionary prediction by simulation and apply our approach to data from a wild pedigreed vertebrate population. Copyright © 2016 de Villemereuil et al.
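One way to see the integration the authors describe is a numerical example for a Poisson trait with a log link: the observed-scale population mean is the expectation of the inverse link over the distribution of latent values. The sketch below is a generic illustration (it is not the QGglmm implementation), with hypothetical latent-scale mean and variance.

```python
import numpy as np
from scipy import stats, integrate

mu, var = 1.2, 0.4      # hypothetical latent-scale mean and total variance

# Observed-scale mean for a Poisson trait with a log link: integrate exp(l)
# over the Normal(mu, var) distribution of latent values l.
integrand = lambda l: np.exp(l) * stats.norm.pdf(l, loc=mu, scale=np.sqrt(var))
mean_obs, _ = integrate.quad(integrand, mu - 10 * np.sqrt(var), mu + 10 * np.sqrt(var))

# For the log link this integral has a closed form, exp(mu + var / 2),
# which provides a check on the numerical result.
print(mean_obs, np.exp(mu + var / 2))
```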
42 CFR 493.923 - Syphilis serology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...
42 CFR 493.923 - Syphilis serology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...
42 CFR 493.923 - Syphilis serology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...
42 CFR 493.923 - Syphilis serology.
Code of Federal Regulations, 2010 CFR
2010-10-01
... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...
42 CFR 493.923 - Syphilis serology.
Code of Federal Regulations, 2011 CFR
2011-10-01
... a laboratory's response for qualitative and quantitative syphilis tests, the program must compare... under paragraphs (b)(2) and (b)(3) of this section. (2) For quantitative syphilis tests, the program... quantitative syphilis serology tests is the target value ±1 dilution. (3) The criterion for acceptable...
Henshall, John M; Dierens, Leanne; Sellars, Melony J
2014-09-02
While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.
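The pooled-sample analysis rests on treating genotypes as quantitative allele dosages rather than discrete classes. The sketch below is a simplified, hypothetical illustration: per-individual quantitative genotypes (expected dosages of one allele, between 0 and 2) are averaged to give the allele frequency implied by a pool's members, which can then be compared with a frequency estimated directly from pooled DNA. The data, noise levels, and panel size are stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_individuals, n_snps = 200, 63

# Hypothetical quantitative genotypes: expected dosage (0-2) of the reference
# allele per individual and SNP, e.g. posterior means from the genotyping assay.
true_freq = rng.uniform(0.1, 0.9, size=n_snps)
dosages = rng.binomial(2, true_freq, size=(n_individuals, n_snps)).astype(float)
dosages += rng.normal(0.0, 0.1, size=dosages.shape)   # genotyping uncertainty

# Allele frequency implied by the individual quantitative genotypes.
freq_from_individuals = dosages.mean(axis=0) / 2.0

# Stand-in for a frequency estimate measured directly on pooled DNA.
freq_from_pool = true_freq + rng.normal(0.0, 0.03, size=n_snps)

r = np.corrcoef(freq_from_individuals, freq_from_pool)[0, 1]
print(f"correlation between pool and individual estimates: {r:.2f}")
```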
van den Berg, M M J; Dancet, E A F; Erlikh, T; van der Veen, F; Goddijn, M; Hajenius, P J
2018-01-01
Early pregnancy complications, defined as miscarriage, recurrent miscarriage or ectopic pregnancy, affect the physical and psychological well-being of intended parents. Research in this field so far has focused mainly on improving accuracy of diagnostic tests and safety and effectiveness of therapeutic management. An overview of aspects of care valued by women and/or their partners is missing. This systematic review aims to provide an overview of aspects of care valued by women and/or their partners faced with early pregnancy complications and to identify potential targets for improvement in early pregnancy healthcare. We searched five electronic databases for empirical quantitative or qualitative studies on patients' perspectives of early pregnancy care in July 2017. We first identified aspects of early pregnancy care valued by women and/or their partners based on qualitative and quantitative data and organized these aspects of care according to the eight dimensions of patient-centered care. Second, we extracted the assessment of service quality from women and/or their partners on each of these aspects of care based on quantitative data. Third, we combined the findings on patients' values with the findings of service quality assessment to identify potential targets for improvement in five groups according to how likely these targets are to require improvement. The search yielded 6240 publications, of which 27 studies were eligible for inclusion in this review. All included studies focused on miscarriage or recurrent miscarriage care. We identified 24 valued aspects of care, which all covered the eight dimensions of patient-centered care. The most frequently reported valued aspect was 'being treated as an individual person experiencing a significant life event rather than a common condition'. Assessment of service quality from women and/or their partners was available for 13 of the 24 identified aspects of care. Quantitative studies all documented service quality as problematic for these 13 aspects of care. We thus identified 13 potential targets for improvement in the patient-centeredness of miscarriage and recurrent miscarriage care, of which none were very likely, four were likely, six were unlikely and three were very unlikely to require improvement. The four likely potential targets for improvement were 'Understandable information provision about the etiology of pregnancy', 'Staff discussing patients' distress', 'Informing patients on pregnancy loss in the presence of a friend or partner' and 'Staff performing follow-up phone calls to support their patients after a miscarriage'. It is important for clinicians to realize that women and their partners undergoing a miscarriage experience a significant life event and appreciate an individual approach. Future qualitative studies are needed to explore the identified potential targets for improvement of (recurrent) miscarriage care and to explore the perspectives of women suspected of or treated for ectopic pregnancy. © The Author 2017. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Visualization techniques to aid in the analysis of multi-spectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, Edward W.; Domik, Gitta O.; Ayres, Thomas R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions, and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists.
Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A
2018-05-01
Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop quantitative QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q2 of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
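The reported cross-validation Q2 can be computed, for any regression-based QSAR model, along the lines of the generic sketch below. The descriptors, toxicity values, and choice of learner are stand-ins for illustration only, not the modeling pipeline used in the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_chem, n_descriptors = 300, 50

# Stand-in data: molecular descriptors and log10-scale toxicity values.
X = rng.standard_normal((n_chem, n_descriptors))
y = X[:, :5].sum(axis=1) + rng.normal(0.0, 0.5, size=n_chem)

model = RandomForestRegressor(n_estimators=200, random_state=0)
y_cv = cross_val_predict(model, X, y, cv=5)       # 5-fold cross-validated predictions

# Q2 = 1 - PRESS / total sum of squares, computed on the held-out predictions.
q2 = 1.0 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
mae = np.mean(np.abs(y - y_cv))                   # mean model error (log10 units)
print(f"Q2 = {q2:.2f}, mean absolute error = {mae:.2f} log10 units")
```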
[A comparison of convenience sampling and purposive sampling].
Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien
2014-06-01
Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.
Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.
Dyck, P J
1991-01-01
Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and increasingly for other autonomic functions. In any assessment of nerve function the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.
Objectives and metrics for wildlife monitoring
Sauer, J.R.; Knutson, M.G.
2008-01-01
Monitoring surveys allow managers to document system status and provide the quantitative basis for management decision-making, and large amounts of effort and funding are devoted to monitoring. Still, monitoring surveys often fall short of providing required information; inadequacies exist in survey designs, analysis procedures, or in the ability to integrate the information into an appropriate evaluation of management actions. We describe current uses of monitoring data, provide our perspective on the value and limitations of current approaches to monitoring, and set the stage for 3 papers that discuss current goals and implementation of monitoring programs. These papers were derived from presentations at a symposium at The Wildlife Society's 13th Annual Conference in Anchorage, Alaska, USA (2006).
Dumaswala, Bhavin; Dumaswala, Komal; Hsiung, Ming Chon; Quiroz, Luis David Meggo; Sungur, Aylin; Escanuela, Maximilliano German Amado; Mehta, Kruti; Oz, Tugba Kemaloglu; Bhagatwala, Kunal; Karia, Nidhi M; Nanda, Navin C
2013-09-01
In this retrospective study, we identified 7 cases where Lambl's excrescences were identified by two-dimensional transesophageal echocardiography (2DTEE) and also had live/real time three-dimensional transesophageal echocardiography (3DTEE) studies available for comparison. We subsequently assessed them for the presence of Lambl's excrescences (LE) and nodules of Arantius (NA) on the aortic valve. After their identification, we qualitatively and quantitatively organized our findings by number, cusp location, measurements, and orientation if applicable. A greater number of LE was found by 3DTEE than 2DTEE (19 vs. 11, respectively). In all 3DTEE studies, their cusp attachment site, their x-, y-, and z-axis measurements, and orientation were clearly visualized and described. Only 3DTEE studies provided confident visualization of the cusp attachment sites. Similarly, a greater number of NA was found by 3DTEE than 2DTEE (21 vs. 5, respectively). The triad of NA was visualized in all 3DTEE studies and each was described using its x-, y-, and z- axis measurements. Only three 2DTEE studies provided reliable identification of the NA. In conclusion, we present further evidence of the incremental value of 3DTEE over 2DTEE in the qualitative and quantitative assessment of cardiac structures including LE and NA on the aortic valve. © 2013, Wiley Periodicals, Inc.
Quantization of liver tissue in dual kVp computed tomography using linear discriminant analysis
NASA Astrophysics Data System (ADS)
Tkaczyk, J. Eric; Langan, David; Wu, Xiaoye; Xu, Daniel; Benson, Thomas; Pack, Jed D.; Schmitz, Andrea; Hara, Amy; Palicek, William; Licato, Paul; Leverentz, Jaynne
2009-02-01
Linear discriminant analysis (LDA) is applied to dual kVp CT and used for tissue characterization. The potential to quantitatively model both malignant and benign, hypo-intense liver lesions is evaluated by analysis of portal-phase, intravenous CT scan data obtained on human patients. Masses with an a priori classification are mapped to a distribution of points in basis material space. The degree of localization of tissue types in the material basis space is related to both quantum noise and real compositional differences. The density maps are analyzed with LDA and studied with system simulations to differentiate these factors. The discriminant analysis is formulated so as to incorporate the known statistical properties of the data. Effective kVp separation and mAs relate to the precision of tissue localization. Bias in the material position is related to the degree of X-ray scatter and partial-volume effect. Experimental data and simulations demonstrate that for single energy (HU) imaging or image-based decomposition, pixel values of water-like tissues depend on proximity to other iodine-filled bodies. Beam-hardening errors cause a shift in image value on the scale of the difference sought between cancerous and cystic lesions. In contrast, projection-based decomposition, or its equivalent when implemented on a carefully calibrated system, can provide accurate data. On such a system, LDA may provide novel quantitative capabilities for tissue characterization in dual energy CT.
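A schematic version of the classification step follows, assuming each lesion measurement has already been decomposed into two basis-material densities (e.g., water- and iodine-equivalent values). The class labels, means, and covariances are synthetic, and the analysis simply applies Fisher's linear discriminant as implemented in scikit-learn rather than the study's specific formulation.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic basis-material coordinates (water, iodine) for two lesion types.
benign = rng.multivariate_normal([1.00, 0.002], [[1e-4, 0], [0, 1e-6]], 200)
malignant = rng.multivariate_normal([1.01, 0.006], [[1e-4, 0], [0, 1e-6]], 200)

X = np.vstack([benign, malignant])
y = np.array([0] * len(benign) + [1] * len(malignant))

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)
print("training accuracy:", lda.score(X, y))
print("discriminant direction in material space:", lda.coef_[0])
```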
Choi, Jaesung P.; Foley, Matthew; Zhou, Zinan; Wong, Weng-Yew; Gokoolparsadh, Naveena; Arthur, J. Simon C.; Li, Dean Y.; Zheng, Xiangjian
2016-01-01
Mutations in the CCM1 (aka KRIT1), CCM2, or CCM3 (aka PDCD10) genes cause cerebral cavernous malformations in humans. Mouse models of CCM disease have been established by deleting Ccm genes in postnatal animals. These mouse models provide invaluable tools to investigate molecular mechanisms and therapeutic approaches for CCM disease. However, the full value of these animal models is limited by the lack of an accurate and quantitative method to assess lesion burden and progression. In the present study we have established a refined and detailed contrast-enhanced X-ray micro-CT method to measure CCM lesion burden in mouse brains. Because this study utilized a voxel dimension of 9.5 μm (leading to a minimum feature size of approximately 25 μm), the method is sufficient to measure CCM lesion volume and number globally and accurately and to provide high-resolution 3-D mapping of CCM lesions in mouse brains. Using this method, we found that loss of Ccm1 or Ccm2 in neonatal endothelium confers CCM lesions in the mouse hindbrain with similar total volume and number. This quantitative approach also demonstrated a rescue of CCM lesions with simultaneous deletion of one allele of Mekk3. This method would enhance the value of the established mouse models to study the molecular basis and potential therapies for CCM and other cerebrovascular diseases. PMID:27513872
Direct Regularized Estimation of Retinal Vascular Oxygen Tension Based on an Experimental Model
Yildirim, Isa; Ansari, Rashid; Yetik, I. Samil; Shahidi, Mahnaz
2014-01-01
Phosphorescence lifetime imaging is commonly used to generate oxygen tension maps of retinal blood vessels by the classical least squares (LS) estimation method. A spatial regularization method was later proposed and provided improved results. However, both methods obtain oxygen tension values from the estimates of intermediate variables and do not yield an optimum estimate of oxygen tension values, due to their nonlinear dependence on the ratio of intermediate variables. In this paper, we provide an improved solution by devising a regularized direct least squares (RDLS) method that exploits knowledge available in studies that provide models of oxygen tension in retinal arteries and veins, unlike the earlier regularized LS approach, where knowledge about intermediate variables is limited. The performance of the proposed RDLS method is evaluated by investigating and comparing the bias, variance, oxygen tension maps, 1-D profiles of arterial oxygen tension, and mean absolute error with those of earlier methods, and its superior performance both quantitatively and qualitatively is demonstrated. PMID:23732915
NASA Astrophysics Data System (ADS)
Lederman, Dror; Leader, Joseph K.; Zheng, Bin; Sciurba, Frank C.; Tan, Jun; Gur, David
2011-03-01
Quantitative computed tomography (CT) has been widely used to detect and evaluate the presence (or absence) of emphysema by applying density masks at specific thresholds, e.g., -910 or -950 Hounsfield Units (HU). However, it has also been observed that subjects with similar density-mask based emphysema scores could have varying lung function, possibly indicating differences in disease severity. To assess this possible discrepancy, we investigated whether the density distribution of "viable" lung parenchyma regions with pixel values > -910 HU correlates with lung function. A dataset of 38 subjects, who underwent both pulmonary function testing and CT examinations in a COPD SCCOR study, was assembled. After the lung regions depicted on CT images were automatically segmented by a computerized scheme, we systematically divided the lung parenchyma into different density groups (bins) and computed a number of statistical features (i.e., mean, standard deviation (STD), and skewness of the pixel value distributions) in these density bins. We then analyzed the correlations between each feature and lung function. The correlation between diffusion lung capacity (DLCO) and the STD of pixel values in the bin of -910 HU <= PV < -750 HU was -0.43, as compared with a correlation of -0.49 between the post-bronchodilator ratio FEV1/FVC (the forced expiratory volume in 1 second divided by the forced vital capacity) and the STD of pixel values in the bin of -1024 HU <= PV < -910 HU. The results showed an association between the distribution of pixel values in "viable" lung parenchyma and lung function, which indicates that, similar to the conventional density mask method, the pixel value distribution features in "viable" lung parenchyma areas may also provide clinically useful information to improve assessments of lung disease severity as measured by lung functional tests.
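A minimal Python sketch of the density-bin feature extraction described above, assuming synthetic pixel-value and DLCO data in place of the segmented CT images and pulmonary function tests used in the study:

```python
import numpy as np
from scipy.stats import skew, pearsonr

rng = np.random.default_rng(1)

# Hypothetical per-subject lung-parenchyma pixel values (HU) and DLCO scores.
n_subjects = 38
hu_values = [rng.normal(-860, 60, size=5000) for _ in range(n_subjects)]
dlco = rng.normal(70, 15, size=n_subjects)

def bin_features(hu, lo, hi):
    """Mean, standard deviation, and skewness of pixel values in [lo, hi)."""
    in_bin = hu[(hu >= lo) & (hu < hi)]
    return in_bin.mean(), in_bin.std(), skew(in_bin)

# Feature highlighted in the abstract: STD of pixel values in the
# "viable parenchyma" bin -910 HU <= PV < -750 HU.
std_viable = np.array([bin_features(hu, -910, -750)[1] for hu in hu_values])

r, p = pearsonr(std_viable, dlco)
print(f"correlation between bin STD and DLCO: r = {r:.2f}, p = {p:.3f}")
```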
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kiely, J Blanco; Olszanski, A; Both, S
Purpose: To develop a quantitative decision making metric for automatically detecting irregular breathing using a large patient population that received phase-sorted 4DCT. Methods: This study employed two patient cohorts. Cohort#1 contained 256 patients who received a phase-sorted 4DCT. Cohort#2 contained 86 patients who received three weekly phase-sorted 4DCT scans. A previously published technique used a single abdominal surrogate to calculate the ratio of extreme inhalation tidal volume to normal inhalation tidal volume, referred to as the κ metric. Since a single surrogate is standard for phase-sorted 4DCT in radiation oncology clinical practice, tidal volume was not quantified. Without tidal volume, the absolute κ metric could not be determined, so a relative κ (κrel) metric was defined based on the measured surrogate amplitude instead of tidal volume. Receiver operating characteristic (ROC) curves were used to quantitatively determine the optimal cutoff value (jk) and efficiency cutoff value (τk) of κrel to automatically identify irregular breathing that would reduce the image quality of phase-sorted 4DCT. Discriminatory accuracy (area under the ROC curve) of κrel was calculated by a trapezoidal numeric integration technique. Results: The discriminatory accuracy of κrel was found to be 0.746. The key values of jk and τk were calculated to be 1.45 and 1.72 respectively. For values of κrel such that jk≤κrel≤τk, the decision to reacquire the 4DCT would be at the discretion of the physician. This accounted for only 11.9% of the patients in this study. The magnitude of κrel held consistent over 3 weeks for 73% of the patients in cohort#2. Conclusion: The decision making metric, κrel, was shown to be an accurate classifier of irregular breathing patients in a large patient population. This work provided an automatic quantitative decision making metric to quickly and accurately assess the extent to which irregular breathing is occurring during phase-sorted 4DCT.
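The following Python sketch illustrates, under stated assumptions, the two computational steps named above: forming a relative κ from surrogate amplitudes and measuring discriminatory accuracy as the area under a ROC curve (computed with the trapezoidal rule). The amplitude distributions and irregular-breathing labels are simulated, not taken from either cohort.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(2)

def kappa_rel(extreme_inhale_amp, normal_inhale_amp):
    """Relative kappa: extreme inhalation amplitude over normal inhalation amplitude."""
    return extreme_inhale_amp / normal_inhale_amp

# Hypothetical surrogate amplitudes for regular and irregular breathers.
regular = kappa_rel(rng.normal(1.2, 0.15, 200), np.ones(200))
irregular = kappa_rel(rng.normal(1.8, 0.35, 60), np.ones(60))

scores = np.concatenate([regular, irregular])
labels = np.concatenate([np.zeros(200), np.ones(60)])  # 1 = irregular breathing

# ROC curve and discriminatory accuracy; sklearn's auc() applies the
# trapezoidal rule, as in the abstract.
fpr, tpr, thresholds = roc_curve(labels, scores)
print(f"discriminatory accuracy (AUC): {auc(fpr, tpr):.3f}")
```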
Bansback, Nick; Harrison, Mark; Marra, Carlo
2016-05-01
Imprecision in estimates of benefits and harms around treatment choices is rarely described to patients. Variation in sampling error between treatment alternatives (e.g., treatments have similar average risks, but one treatment has a larger confidence interval) can result in patients failing to choose the option that is best for them. The aim of this study is to use a discrete choice experiment to describe how 2 methods for conveying imprecision in risk influence people's treatment decisions. We randomized a representative sample of the Canadian general population to 1 of 3 surveys that sought choices between hypothetical treatments for rheumatoid arthritis based on different levels of 7 attributes: route and frequency of administration, chance of benefit, serious and minor side effects and life expectancy, and imprecision in benefit and side-effect estimates. The surveys differed in the way imprecision was described: 1) no imprecision, 2) quantitative description based on a range with a visual graphic, and 3) qualitative description simply describing the confidence in the evidence. The analyzed data were from 2663 respondents. Results suggested that more people understood imprecision when it was described qualitatively (88%) versus quantitatively (68%). Respondents who appeared to understand imprecision descriptions placed high value on increased precision regarding the actual benefits and harms of treatment, equivalent to the value placed on the information about the probability of serious side effects. Both qualitative and quantitative methods led to small but significant increases in decision uncertainty for choosing any treatment. Limitations included some issues in defining understanding of imprecision and the use of an internet survey of panel members. These findings provide insight into how conveying imprecision information influences patient treatment choices. © The Author(s) 2015.
Sun, T T; Liu, W H; Zhang, Y Q; Li, L H; Wang, R; Ye, Y Y
2017-08-01
Objective: To explore the difference in value between dynamic contrast-enhanced MRI quantitative pharmacokinetic parameters and relative quantitative pharmacokinetic parameters in breast lesions. Methods: Retrospective analysis of 255 patients (262 breast lesions) detected by clinical palpation, ultrasound or full-field digital mammography, all pathologically confirmed in Zhongda Hospital, Southeast University from May 2012 to May 2016. A 3.0 T MRI scanner was used to obtain the quantitative MR pharmacokinetic parameters: volume transfer constant (K(trans)), exchange rate constant (k(ep)) and extravascular extracellular volume fraction (V(e)). The quantitative pharmacokinetic parameters of normal gland tissue on the same side and at the same level as the lesion were also measured, and the relative pharmacokinetic parameters rK(trans), rk(ep) and rV(e) were then calculated. The diagnostic value of the two sets of pharmacokinetic parameters in the differential diagnosis of benign and malignant breast lesions was explored using receiver operating characteristic (ROC) curves and a logistic regression model. Results: (1) There were significant differences between benign and malignant lesions in K(trans) and k(ep) (t = 15.489 and 15.022, respectively, P < 0.05), but no significant difference in V(e) (t = -2.346, P > 0.05). The areas under the ROC curve (AUC) of K(trans), k(ep) and V(e) for differentiating malignant from benign lesions were 0.933, 0.948 and 0.387; the sensitivities of K(trans), k(ep) and V(e) were 77.1%, 85.0% and 51.0%, and the specificities were 96.3%, 93.6% and 60.8% for the differential diagnosis of breast lesions, taking the maximum Youden's index as the cut-off. (2) There were significant differences between benign and malignant lesions in rK(trans), rk(ep) and rV(e) (t = 14.177, 11.726 and 2.477, respectively, P < 0.05). The AUCs of rK(trans), rk(ep) and rV(e) were 0.963, 0.903 and 0.575; the sensitivities were 85.6%, 71.9% and 52.9%, and the specificities were 94.5%, 92.7% and 60.6% for the differential diagnosis of breast lesions. (3) There was no significant difference in the area under the ROC curve between the predictive probability based on the quantitative pharmacokinetic parameters and that based on the relative quantitative pharmacokinetic parameters (Z = 0.867, P = 0.195). Conclusion: There was no significant difference between the quantitative parameter values (K(trans), k(ep)) and the relative quantitative parameter values (rK(trans), rk(ep)) in the diagnosis of breast lesions; both are important parameters in the differential diagnosis of benign and malignant breast lesions.
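A small Python sketch of how a relative pharmacokinetic parameter of the kind described above could be formed and compared against the absolute parameter with ROC analysis. The ratio definition (lesion value divided by the value in normal gland tissue) and all numbers are assumptions for illustration, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Hypothetical Ktrans values (min^-1) for 150 malignant and 112 benign lesions,
# plus ipsilateral normal-gland values measured at the same level.
ktrans_lesion = np.concatenate([rng.normal(0.25, 0.08, 150),
                                rng.normal(0.08, 0.03, 112)])
ktrans_normal = rng.normal(0.05, 0.01, 262)
labels = np.array([1] * 150 + [0] * 112)  # 1 = malignant, 0 = benign

# Assumed definition of the relative parameter: lesion value divided by the
# normal-gland value.
r_ktrans = ktrans_lesion / ktrans_normal

print("AUC, absolute Ktrans:", round(roc_auc_score(labels, ktrans_lesion), 3))
print("AUC, relative rKtrans:", round(roc_auc_score(labels, r_ktrans), 3))
```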
Nursing students' perceptions of hospital learning environments--an Australian perspective.
Chan, Dominic S
2004-01-01
Clinical education is a vital component in the curricula of pre-registration nursing courses and provides student nurses with the opportunity to combine cognitive, psychomotor, and affective skills. Various studies have suggested that not all practice settings are able to provide nursing students with a positive learning environment. In order to maximize nursing students' clinical learning outcomes, there is a need to examine the clinical learning environment. The purpose of this study was to assess pre-registration nursing students' perceptions of hospital learning environments during clinical field placement. Quantitative and qualitative methodologies were used. One hundred and eight students provided quantitative data through completion of the survey instrument, the Clinical Learning Environment Inventory (Actual and Preferred forms). Each form is a 5-point Likert-type questionnaire of 35 items arranged in 5 scales with 7 items per scale. Qualitative data, obtained through semi-structured interviews of 21 students from the same cohort, were used to explain and support the quantitative findings. There were significant differences between students' actual and preferred perceptions of the clinical learning environments. Generally, students preferred a more positive and favourable clinical environment than they perceived as being actually present. Since participants consisted of nursing students from just one university nursing school in South Australia, the findings may not be representative of all nursing students in general with respect to their clinical placement. However, the value of this study lies in the resulting implications for nursing education and future research. A better understanding of what constitutes quality clinical education from the students' perspective would be valuable in providing better educational experiences.
O'Connor, Michael; Lee, Caroline; Ellens, Harma; Bentz, Joe
2015-02-01
Current USFDA and EMA guidance for drug transporter interactions is dependent on IC50 measurements as these are utilized in determining whether a clinical interaction study is warranted. It is therefore important not only to standardize transport inhibition assay systems but also to develop uniform statistical criteria with associated probability statements for generation of robust IC50 values, which can be easily adopted across the industry. The current work provides a quantitative examination of critical factors affecting the quality of IC50 fits for P-gp inhibition through simulations of perfect data with randomly added error as commonly observed in the large data set collected by the P-gp IC50 initiative. The types of errors simulated were (1) variability in replicate measures of transport activity; (2) transformations of error-contaminated transport activity data prior to IC50 fitting (such as performed when determining an IC50 for inhibition of P-gp based on efflux ratio); and (3) the lack of well defined "no inhibition" and "complete inhibition" plateaus. The effect of the algorithm used in fitting the inhibition curve (e.g., two or three parameter fits) was also investigated. These simulations provide strong quantitative support for the recommendations provided in Bentz et al. (2013) for the determination of IC50 values for P-gp and demonstrate the adverse effect of data transformation prior to fitting. Furthermore, the simulations validate uniform statistical criteria for robust IC50 fits in general, which can be easily implemented across the industry. A calibration of the t-statistic is provided through calculation of confidence intervals associated with the t-statistic.
O'Connor, Michael; Lee, Caroline; Ellens, Harma; Bentz, Joe
2015-01-01
Current USFDA and EMA guidance for drug transporter interactions is dependent on IC50 measurements as these are utilized in determining whether a clinical interaction study is warranted. It is therefore important not only to standardize transport inhibition assay systems but also to develop uniform statistical criteria with associated probability statements for generation of robust IC50 values, which can be easily adopted across the industry. The current work provides a quantitative examination of critical factors affecting the quality of IC50 fits for P-gp inhibition through simulations of perfect data with randomly added error as commonly observed in the large data set collected by the P-gp IC50 initiative. The types of errors simulated were (1) variability in replicate measures of transport activity; (2) transformations of error-contaminated transport activity data prior to IC50 fitting (such as performed when determining an IC50 for inhibition of P-gp based on efflux ratio); and (3) the lack of well defined “no inhibition” and “complete inhibition” plateaus. The effect of the algorithm used in fitting the inhibition curve (e.g., two or three parameter fits) was also investigated. These simulations provide strong quantitative support for the recommendations provided in Bentz et al. (2013) for the determination of IC50 values for P-gp and demonstrate the adverse effect of data transformation prior to fitting. Furthermore, the simulations validate uniform statistical criteria for robust IC50 fits in general, which can be easily implemented across the industry. A calibration of the t-statistic is provided through calculation of confidence intervals associated with the t-statistic. PMID:25692007
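A minimal Python sketch of the kind of simulation described in the two records above: a three-parameter inhibition curve is generated from "perfect" data, contaminated with replicate error, and refit to recover an IC50 with an approximate confidence interval. The parameter values, error magnitude, and concentration range are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)

def inhibition(conc, top, bottom, ic50):
    """Three-parameter inhibition curve (Hill slope fixed at 1)."""
    return bottom + (top - bottom) / (1.0 + conc / ic50)

# Simulated "perfect" transport activity over a dilution series of inhibitor
# concentrations (uM), then contaminated with replicate measurement error.
conc = np.logspace(-2, 2, 9)
true_activity = inhibition(conc, top=100.0, bottom=0.0, ic50=1.5)
noisy = true_activity + rng.normal(0, 5.0, size=conc.shape)

# Fit and report the IC50 with a rough interval from the parameter covariance,
# analogous in spirit to the t-statistic-based criteria discussed above.
popt, pcov = curve_fit(inhibition, conc, noisy, p0=[100.0, 0.0, 1.0])
ic50, ic50_se = popt[2], np.sqrt(pcov[2, 2])
print(f"fitted IC50 = {ic50:.2f} uM (+/- {1.96 * ic50_se:.2f}, approx. 95% interval)")
```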
Amujoyegbe, O O; Idu, M; Agbedahunsi, J M; Erhabor, J O
2016-06-05
The present study documents the medicinal plant species used to manage sickle cell disorder in the Southern States of Nigeria. The ethnomedicinal information was gathered through a multistage approach from three purposively selected geopolitical zones of Southern Nigeria. Semi-structured questionnaires were administered to 500 respondents in 125 locations. The ethnomedicinal data collected were analyzed using quantitative value indices such as fidelity level (percentage) and use value. The information obtained was cross-checked against the literature and other related materials. The 500 respondents comprised 53.12% females and 46.88% males; 26.70% were illiterate while 73.30% had formal education. Seventy-nine percent were traditional healers, 27% herb traders, and the other 4% were people with an awareness of sickle cell disease. One hundred and seventy-five plant species belonging to 70 families were recorded, with Fabaceae (26.76%) and Euphorbiaceae (16.90%) showing the highest occurrence. Leaves were the most commonly used plant part (69.10%), followed by root (15%) and stem bark (14%), in preparations for sickle cell management. The majority (48.57%) of these plants were harvested from the wild, and 38.86% were trees. Citrus aurantifolia and Newbouldia laevis had the highest use values of 0.69 and 0.64, respectively. Plants with the least use value (0.001) included Abrus canescens, Acacia xanthophloea, Aerva lanata and Axonopus compressus. The fidelity level values of the plant species for the management of Sickle Cell Disorder (SCD) revealed that Citrus aurantifolia had the highest value of 70.2%, while Angraecum distichum and Axonopus compressus had the lowest fidelity level value of 0.18%. The study revealed that people in the studied areas were well grounded in the medicinal plants used to manage sickle cell disease. This study reports for the first time 102 plant species with anti-sickling potential, with Fabaceae and Euphorbiaceae as the most dominant plant families. Many of the claimed plants were harvested from the wild, indicating a threat and a need for plant conservation. The documented plants had high use value and fidelity level, providing quantitative and qualitative ethnomedicinal evaluation within and across the plant families and opening the way for further investigation of their pharmacological profiles. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
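For readers unfamiliar with the quantitative indices mentioned above, the Python sketch below computes a use value and a fidelity level using the standard textbook definitions (UV = use-reports / informants; FL = informants citing the species for the condition / all informants citing the species × 100). These formulas and the counts are assumptions for illustration; they are not taken from the study's data tables.

```python
def use_value(use_reports, n_informants):
    """UV: total use-reports for a species divided by the number of informants."""
    return use_reports / n_informants

def fidelity_level(n_reports_for_purpose, n_reports_total):
    """FL (%): informants citing the species for SCD over all informants citing it."""
    return 100.0 * n_reports_for_purpose / n_reports_total

# Illustrative counts only.
n_informants = 500
print("UV, hypothetical species A:", use_value(345, n_informants))
print("FL, hypothetical species A: %.1f%%" % fidelity_level(245, 349))
```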
Verma, Shefali S; Lucas, Anastasia M; Lavage, Daniel R; Leader, Joseph B; Metpally, Raghu; Krishnamurthy, Sarathbabu; Dewey, Frederick; Borecki, Ingrid; Lopez, Alexander; Overton, John; Penn, John; Reid, Jeffrey; Pendergrass, Sarah A; Breitwieser, Gerda; Ritchie, Marylyn D
2017-01-01
A wide range of patient health data is recorded in Electronic Health Records (EHR). This data includes diagnosis, surgical procedures, clinical laboratory measurements, and medication information. Together this information reflects the patient's medical history. Many studies have efficiently used this data from the EHR to find associations that are clinically relevant, either by utilizing International Classification of Diseases, version 9 (ICD-9) codes or laboratory measurements, or by designing phenotype algorithms to extract case and control status with accuracy from the EHR. Here we developed a strategy to utilize longitudinal quantitative trait data from the EHR at Geisinger Health System focusing on outpatient metabolic and complete blood panel data as a starting point. Comprehensive Metabolic Panel (CMP) as well as Complete Blood Counts (CBC) are parts of routine care and provide a comprehensive picture from high level screening of patients' overall health and disease. We randomly split our data into two datasets to allow for discovery and replication. We first conducted a genome-wide association study (GWAS) with median values of 25 different clinical laboratory measurements to identify variants from Human Omni Express Exome beadchip data that are associated with these measurements. We identified 687 variants that associated and replicated with the tested clinical measurements at p < 5×10⁻⁸. Since longitudinal data from the EHR provides a record of a patient's medical history, we utilized this information to further investigate the ICD-9 codes that might be associated with differences in variability of the measurements in the longitudinal dataset. We identified low and high variance patients by looking at changes within their individual longitudinal EHR laboratory results for each of the 25 clinical lab values (thus creating 50 groups - a high variance and a low variance for each lab variable). We then performed a PheWAS analysis with ICD-9 diagnosis codes, separately in the high variance group and the low variance group for each lab variable. We found 717 PheWAS associations that replicated at a p-value less than 0.001. Next, we evaluated the results of this study by comparing the association results between the high and low variance groups. For example, we found 39 SNPs (in multiple genes) associated with ICD-9 250.01 (Type-I diabetes) in patients with high variance of plasma glucose levels, but not in patients with low variance in plasma glucose levels. Another example is the association of 4 SNPs in UMOD with chronic kidney disease in patients with high variance for aspartate aminotransferase (discovery p-value: 8.71×10⁻⁹ and replication p-value: 2.03×10⁻⁶). In general, we see a pattern of many more statistically significant associations from patients with high variance in the quantitative lab variables, in comparison with the low variance group across all of the 25 laboratory measurements. This study is one of the first of its kind to utilize quantitative trait variance from longitudinal laboratory data to find associations among genetic variants and clinical phenotypes obtained from an EHR, integrating laboratory values and diagnosis codes to understand the genetic complexities of common diseases.
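The grouping step described above (splitting patients into high- and low-variance groups for each laboratory variable) can be sketched in a few lines of Python. The synthetic lab values and the median-split cutoff are assumptions for illustration; the abstract does not state the exact cutoff used.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical longitudinal glucose measurements (one array per patient);
# real values would come from the EHR laboratory tables.
patients = {
    f"pt{i:03d}": rng.normal(95, rng.uniform(2, 25), size=rng.integers(5, 40))
    for i in range(200)
}

# Within-patient variance of the longitudinal values for one lab variable.
variance = {pid: np.var(vals, ddof=1) for pid, vals in patients.items()}

# Median split into high- and low-variance groups (assumed rule); each group
# would then be carried into a separate PheWAS against ICD-9 codes.
cutoff = np.median(list(variance.values()))
high_var = [pid for pid, v in variance.items() if v > cutoff]
low_var = [pid for pid, v in variance.items() if v <= cutoff]
print(len(high_var), "high-variance and", len(low_var), "low-variance patients")
```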
Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike
2018-01-01
The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining outcome measures most sensitive to capture treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments such as the 6-minute walk test, motor function measure and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients, aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated with p < 0.001. The strongest correlation with the motor function measure and its D1-subscore was shown by the 6-minute walk test. Clinical assessments showed no correlation with age. Importantly, quantitative muscle MRI values significantly correlated with all clinical assessments with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Dai, Yunpeng; Chen, Xueli; Yin, Jipeng; Kang, Xiaoyu; Wang, Guodong; Zhang, Xianghan; Nie, Yongzhan; Wu, Kaichun; Liang, Jimin
2016-08-01
The aim of this article is to investigate the influence of a tracer injection dose (ID) and camera integration time (IT) on quantifying pharmacokinetics of Cy5.5-GX1 in gastric cancer BGC-823 cell xenografted mice. Based on three factors, including whether or not to inject free GX1, the ID of Cy5.5-GX1, and the camera IT, 32 mice were randomly divided into eight groups and received 60-min dynamic fluorescence imaging. The Gurfinkel exponential model (GEXPM) and the Lammertsma simplified reference tissue model (SRTM) combined with a singular value decomposition analysis were used to quantitatively analyze the acquired dynamic fluorescent images. The binding potential (Bp) and the sum of the pharmacokinetic rate constants (SKRC) of Cy5.5-GX1 were determined by the SRTM and GEXPM, respectively. In the tumor region, the SKRC value exhibited an obvious trend with change in the tracer ID, but the Bp value was not sensitive to it. Both the Bp and SKRC values were independent of the camera IT. In addition, the tumor-to-muscle ratio was correlated with the camera IT but was independent of the tracer ID. Dynamic fluorescence imaging in conjunction with a kinetic analysis may provide more quantitative information than static fluorescence imaging, especially for a priori information on the optimal ID of targeted probes for individual therapy.
Overview of Student Affairs Research Methods: Qualitative and Quantitative.
ERIC Educational Resources Information Center
Perl, Emily J.; Noldon, Denise F.
2000-01-01
Reviews the strengths and weaknesses of quantitative and qualitative research in student affairs research, noting that many student affairs professionals question the value of more traditional quantitative approaches to research, though they typically have very good people skills that they have applied to being good qualitative researchers.…
Yang, Chuang-Bo; Zhang, Shuang; Jia, Yong-Jun; Yu, Yong; Duan, Hai-Feng; Zhang, Xi-Rong; Ma, Guang-Ming; Ren, Chenglong; Yu, Nan
2017-10-01
To study the clinical value of dual-energy spectral CT in the quantitative assessment of microvascular invasion of small hepatocellular carcinoma. This study was approved by our ethics committee. 50 patients with small hepatocellular carcinoma who underwent contrast-enhanced spectral CT in the arterial phase (AP) and portal venous phase (VP) were enrolled. Tumour CT value and iodine concentration (IC) were measured from spectral CT images. The slope of the spectral curve, normalized iodine concentration (NIC, normalized to the abdominal aorta) and the ratio of the IC difference between AP and VP (RIC(AP-VP) = [IC(AP) - IC(VP)]/IC(AP)) were calculated. Tumours were identified as either with or without microvascular invasion based on pathological results. Measurements were statistically compared using the independent samples t test. Receiver operating characteristic (ROC) analysis was used to evaluate the diagnostic performance of tumour microvascular invasion assessment. The 70 keV images were used to simulate the results of conventional CT scans for comparison. 56 small hepatocellular carcinomas were detected, with 37 lesions (Group A) with microvascular invasion and 19 (Group B) without. There were significant differences in IC, NIC and slope in AP and in RIC(AP-VP) between Group A (2.48 ± 0.70 mg/ml, 0.23 ± 0.05, 3.39 ± 1.01 and 0.28 ± 0.16) and Group B (1.65 ± 0.47 mg/ml, 0.15 ± 0.05, 2.22 ± 0.64 and 0.03 ± 0.24) (all p < 0.05). Using 0.188 as the threshold for NIC, one could obtain an area under the curve (AUC) of 0.87 in ROC analysis to differentiate between tumours with and without microvascular invasion. The AUC was 0.71 with the CT value at 70 keV and improved to 0.81 at 40 keV. Dual-energy spectral CT provides additional quantitative parameters beyond conventional CT to improve the differentiation between small hepatocellular carcinoma with and without microvascular invasion. Quantitative iodine concentration measurement in spectral CT may provide a new method to improve the evaluation of small hepatocellular carcinoma microvascular invasion. Copyright © 2017 Elsevier B.V. All rights reserved.
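The iodine-based parameters defined in the abstract translate directly into a short Python sketch. The RIC(AP-VP) formula is taken from the text; the ratio form of the NIC normalization and the example measurements are assumptions.

```python
def normalized_ic(ic_lesion, ic_aorta):
    """NIC: lesion iodine concentration normalized to the abdominal aorta
    (ratio form assumed; the abstract states the normalization, not the formula)."""
    return ic_lesion / ic_aorta

def ric_ap_vp(ic_ap, ic_vp):
    """RIC(AP-VP) = [IC(AP) - IC(VP)] / IC(AP), as given in the abstract."""
    return (ic_ap - ic_vp) / ic_ap

# Illustrative measurements for one lesion (mg/ml).
ic_ap_lesion, ic_vp_lesion, ic_ap_aorta = 2.4, 1.7, 10.5

nic_ap = normalized_ic(ic_ap_lesion, ic_ap_aorta)
print("NIC (arterial phase): %.3f" % nic_ap)
print("RIC(AP-VP): %.3f" % ric_ap_vp(ic_ap_lesion, ic_vp_lesion))

# Threshold from the reported ROC analysis: NIC above 0.188 suggests
# microvascular invasion.
print("suggests microvascular invasion:", nic_ap > 0.188)
```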
Shi, Ximin; Li, Nan; Ding, Haiyan; Dang, Yonghong; Hu, Guilan; Liu, Shuai; Cui, Jie; Zhang, Yue; Li, Fang; Zhang, Hui; Huo, Li
2018-01-01
Kinetic modeling of dynamic 11C-acetate PET imaging provides quantitative information for myocardium assessment. The quality and quantitation of PET images are known to be dependent on PET reconstruction methods. This study aims to investigate the impacts of reconstruction algorithms on the quantitative analysis of dynamic 11C-acetate cardiac PET imaging. Suspected alcoholic cardiomyopathy patients (N = 24) underwent 11C-acetate dynamic PET imaging after a low-dose CT scan. PET images were reconstructed using four algorithms: filtered backprojection (FBP), ordered subsets expectation maximization (OSEM), OSEM with time-of-flight (TOF), and OSEM with both time-of-flight and point-spread-function (TPSF). Standardized uptake values (SUVs) at different time points were compared among images reconstructed using the four algorithms. Time-activity curves (TACs) in the myocardium and blood pools of the ventricles were generated from the dynamic image series. Kinetic parameters K1 and k2 were derived using a 1-tissue-compartment model for kinetic modeling of cardiac flow from 11C-acetate PET images. Significant image quality improvement was found in the images reconstructed using iterative OSEM-type algorithms (OSEM, TOF, and TPSF) compared with FBP. However, no statistical differences in SUVs were observed among the four reconstruction methods at the selected time points. Kinetic parameters K1 and k2 also exhibited no statistical difference among the four reconstruction algorithms in terms of mean value and standard deviation. However, in the correlation analysis, OSEM reconstruction presented relatively higher residuals in correlation with FBP reconstruction compared with TOF and TPSF reconstruction, and TOF and TPSF reconstruction were highly correlated with each other. All the tested reconstruction algorithms performed similarly for quantitative analysis of 11C-acetate cardiac PET imaging. TOF and TPSF yielded highly consistent kinetic parameter results with superior image quality compared with FBP. OSEM was relatively less reliable. Both TOF and TPSF were recommended for cardiac 11C-acetate kinetic analysis.
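A minimal Python sketch of fitting a one-tissue-compartment model, the kinetic model named above, to a tissue time-activity curve. The analytic form C_t(t) = K1 * (Cp convolved with exp(-k2*t)) follows from dC_t/dt = K1*Cp(t) - k2*C_t(t); the input function, frame timing, noise level, and units are synthetic assumptions rather than values from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.linspace(0, 60, 121)          # minutes, 0.5-min sampling
dt = t[1] - t[0]
cp = 10.0 * t * np.exp(-t / 2.0)     # synthetic arterial input function

def one_tissue(t, K1, k2):
    """C_t(t) = K1 * (Cp convolved with exp(-k2*t)), discretized on the grid t."""
    impulse = np.exp(-k2 * t)
    return K1 * np.convolve(cp, impulse)[: len(t)] * dt

rng = np.random.default_rng(6)
ct_measured = one_tissue(t, 0.8, 0.15) + rng.normal(0, 0.3, size=t.shape)

# Recover K1 and k2 from the noisy tissue curve.
(K1, k2), _ = curve_fit(one_tissue, t, ct_measured, p0=[0.5, 0.1])
print(f"K1 = {K1:.2f} (assumed units ml/min/g), k2 = {k2:.2f} 1/min")
```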
Cost benefit analysis of the transfer of NASA remote sensing technology to the state of Georgia
NASA Technical Reports Server (NTRS)
Zimmer, R. P. (Principal Investigator); Wilkins, R. D.; Kelly, D. L.; Brown, D. M.
1977-01-01
The author has identified the following significant results. First order benefits can generally be quantified, thus allowing quantitative comparisons of candidate land cover data systems. A meaningful dollar evaluation of LANDSAT can be made by a cost comparison with equally effective data systems. Users of LANDSAT data can be usefully categorized as performing three general functions: planning, permitting, and enforcing. The value of LANDSAT data to the State of Georgia is most sensitive to the following parameters: discount rate, digitization cost, and photo acquisition cost. Under a constrained budget, LANDSAT could provide digitized land cover information roughly seven times more frequently than could otherwise be obtained. Thus, the services derived from LANDSAT data have a positive net present value in comparison to the baseline system, and under a constrained budget the LANDSAT system could provide more frequent information than could otherwise be obtained.
Space-Time Data fusion for Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Braverman, Amy; Nguyen, H.; Cressie, N.
2011-01-01
NASA has been collecting massive amounts of remote sensing data about Earth's systems for more than a decade. Missions are selected to be complementary in quantities measured, retrieval techniques, and sampling characteristics, so these datasets are highly synergistic. To fully exploit this, a rigorous methodology for combining data with heterogeneous sampling characteristics is required. For scientific purposes, the methodology must also provide quantitative measures of uncertainty that propagate input-data uncertainty appropriately. We view this as a statistical inference problem. The true but not directly observed quantities form a vector-valued field continuous in space and time. Our goal is to infer those true values, or some function of them, and to provide uncertainty quantification for those inferences. We use a spatiotemporal statistical model that relates the unobserved quantities of interest at point-level to the spatially aggregated, observed data. We describe and illustrate our method using CO2 data from two NASA data sets.
Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.
Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro
2016-03-01
Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on qualitative approach and surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of frontozygomatic suture, most superior point of temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen) and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index of each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval while considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002) and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. The evaluation as a global score and in different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might allow a quantitative approach for planning and following-up maxillo-facial surgery in OAVS patients.
Comparison of 18F-FDG PET/CT and PET/MRI in patients with multiple myeloma
Sachpekidis, Christos; Hillengass, Jens; Goldschmidt, Hartmut; Mosebach, Jennifer; Pan, Leyun; Schlemmer, Heinz-Peter; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2015-01-01
PET/MRI represents a promising hybrid imaging modality with several potential clinical applications. Although PET/MRI seems highly attractive in the diagnostic approach of multiple myeloma (MM), its role has not yet been evaluated. The aims of this prospective study are to evaluate the feasibility of 18F-FDG PET/MRI in detection of MM lesions, and to investigate the reproducibility of bone marrow lesions detection and quantitative data of 18F-FDG uptake between the functional (PET) component of PET/CT and PET/MRI in MM patients. The study includes 30 MM patients. All patients initially underwent 18F-FDG PET/CT (60 min p.i.), followed by PET/MRI (120 min p.i.). PET/CT and PET/MRI data were assessed and compared based on qualitative (lesion detection) and quantitative (SUV) evaluation. The hybrid PET/MRI system provided good image quality in all cases without artefacts. PET/MRI identified 65 of the 69 lesions, which were detectable with PET/CT (94.2%). Quantitative PET evaluations showed the following mean values in MM lesions: SUVaverage=5.5 and SUVmax=7.9 for PET/CT; SUVaverage=3.9 and SUVmax=5.8 for PET/MRI. Both SUVaverage and SUVmax were significantly higher on PET/CT than on PET/MRI. Spearman correlation analysis demonstrated a strong correlation between both lesional SUVaverage (r=0.744) and lesional SUVmax (r=0.855) values derived from PET/CT and PET/MRI. Regarding detection of myeloma skeletal lesions, PET/MRI exhibited equivalent performance to PET/CT. In terms of tracer uptake quantitation, a significant correlation between the two techniques was demonstrated, despite the statistically significant differences in lesional SUVs between PET/CT and PET/MRI. PMID:26550538
Wang, Lin; Du, Jing; Li, Feng-Hua; Fang, Hua; Hua, Jia; Wan, Cai-Feng
2013-10-01
The purpose of this study was to evaluate the diagnostic efficacy of contrast-enhanced sonography for differentiation of breast lesions by combined qualitative and quantitative analyses in comparison to magnetic resonance imaging (MRI). Fifty-six patients with American College of Radiology Breast Imaging Reporting and Data System category 3 to 5 breast lesions on conventional sonography were evaluated by contrast-enhanced sonography and MRI. A comparative analysis of diagnostic results between contrast-enhanced sonography and MRI was conducted in light of the pathologic findings. Pathologic analysis showed 26 benign and 30 malignant lesions. The predominant enhancement patterns of the benign lesions on contrast-enhanced sonography were homogeneous, centrifugal, and isoenhancement or hypoenhancement, whereas the patterns of the malignant lesions were mainly heterogeneous, centripetal, and hyperenhancement. The detection rates for perfusion defects and peripheral radial vessels in the malignant group were much higher than those in the benign group (P < .05). As to quantitative analysis, statistically significant differences were found in peak and time-to-peak values between the groups (P < .05). With pathologic findings as the reference standard, the sensitivity, specificity, and accuracy of contrast-enhanced sonography and MRI were 90.0%, 92.3%, 91.1% and 96.7%, 88.5%, and 92.9%, respectively. The two methods had a concordant rate of 87.5% (49 of 56), and the concordance test gave a value of κ = 0.75, indicating that there was high concordance in breast lesion assessment between the two diagnostic modalities. Contrast-enhanced sonography provided typical enhancement patterns and valuable quantitative parameters, which showed good agreement with MRI in diagnostic efficacy and may potentially improve characterization of breast lesions.
Lemanski, Charlotte
2011-01-01
In the global South, policies providing property titles to low-income households are increasingly implemented as a solution to poverty. Integrating poor households into the capitalist economy using state-subsidized homeownership is intended to provide poor people with an asset that can be used in a productive manner. In this article the South African "housing subsidy system" is assessed using quantitative and qualitative data from in-depth research in a state-subsidized housing settlement in the city of Cape Town. The findings show that while state-subsidized property ownership provides long-term shelter and tenure security to low-income households, houses have mixed value as a financial asset. Although state-subsidized houses in South Africa are a financially tradable asset, transaction values are too low for low-income vendors to reach the next rung on the housing ladder, the township market. Furthermore, low-income homeowners are reticent to use their (typically primary) asset as collateral security for credit, and thus property ownership is not providing the financial returns that titling theories assume.
On estimating the economic value of insectivorous bats: Prospects and priorities for biologists
Boyles, Justin G.; Sole, Catherine L.; Cryan, Paul M.; McCracken, Gary F.
2013-01-01
Bats are among the most economically important nondomesticated mammals in the world. They are well-known pollinators and seed dispersers, but crop pest suppression is probably the most valuable ecosystem service provided by bats. Scientific literature and popular media often include reports of crop pests in the diet of bats and anecdotal or extrapolated estimates of how many insects are eaten by bats. However, quantitative estimates of the ecosystem services provided by bats in agricultural systems are rare, and the few estimates that are available are limited to a single cotton-dominated system in Texas. Despite the tremendous value for conservation and economic security of such information, surprisingly few scientific efforts have been dedicated to quantifying the economic value of bats. Here, we outline the types of information needed to better quantify the value of bats in agricultural ecosystems. Because of the complexity of the ecosystems involved, creative experimental design and innovative new methods will help advance our knowledge in this area. Experiments involving bats in agricultural systems may be needed sooner than later, before population declines associated with white-nose syndrome and wind turbines potentially render them impossible.
The Concept of “Peak Ecological Water” in a Changing Climate. (Invited)
NASA Astrophysics Data System (ADS)
Gleick, P. H.
2009-12-01
In a paper prepared in 2008, Meena Palaniappan and Peter Gleick proposed the concept of "peak ecological water" as a way to characterize vulnerable watersheds in the context of human withdrawals and water use. The value that humans obtain from water produced through incremental increases in supply (e.g., drinking water, irrigation) can be compared with the declining value of the ecological services (e.g., water for plants and animals) that were being satisfied with this water. As more water is appropriated from watersheds, the pace or severity of ecological disruptions increases. At a certain theoretical point, the value of ecological services provided by water is equivalent to the value of human services provided by water. After this point, increasing appropriation of water leads to ecological disruptions beyond the value that this increased water provides to humans (the slope of the decline in ecological services is greater than the slope of the increase in value to humans). At the point of "peak ecological water," society will maximize the ecological and human benefits provided by water. This concept will be presented here as a way to identify water resources at risk of overabstraction and overuse. Conversely, the concept can also help in identifying strategies to adapt to changing water availability due to climate change. A key challenge remains coming up with quantitative measures of peak ecological water and using those measures to develop policies to maintain ecological health under variable conditions. Palaniappan, M. and P.H. Gleick. 2008. "Peak Water." In The World's Water 2008-2009 (P.H. Gleick, editor). Island Press, Washington, D.C. pp. 1-16.
Investigating genetic loci that encode plant-derived paleoclimate proxies
NASA Astrophysics Data System (ADS)
Bender, A. L. D.; Suess, M.; Chitwood, D. H.; Bradley, A. S.
2016-12-01
Long chain (>C25) n-alkanes in sediments predominantly derive from terrestrial plant waxes. Hydrogen isotope ratios (δD) of leaf wax hydrocarbons correlate with δDH2O of precipitation and are commonly used as paleoclimate proxies. However, biological variability in the isotopic fractionations between water and plant materials also affects the n-alkane δD values. Correct interpretation of this paleoclimate proxy requires that we resolve genetic and environmental effects. Genetic variability underlying differences in leaf wax structure and isotopic composition can be quantitatively determined through the use of model organisms. Interfertile Solanum sect. Lycopersicon (tomato) species provide an ideal model species complex for this approach. We used a set of 76 precisely defined near-isogenic lines (introgression lines [ILs]) in which small genomic regions from the wild tomato relative Solanum pennellii have been introduced into the genome of the domestic tomato, S. lycopersicum. By characterizing quantitative traits of these ILs (leaf wax structure and isotopic composition), we can resolve the degree to which each trait is regulated by genetic versus environmental factors. We present data from two growth experiments conducted with all 76 ILs. In this study, we quantify leaf wax traits, including δD values, δ13C values, and structural metrics including the methylation index (a variable that describes the ratio of iso- and anteiso- to n-alkanes). Among ILs, δD values vary by up to 35‰ and 60‰ for C31 and C33 n-alkanes, respectively. Many ILs have methylation indices that are discernably different from the parent domesticated tomato (p < 0.001), which suggests that methylation is a highly polygenic trait. This pattern is similar to the genetics that control leaf shape, another trait commonly used as a paleoclimate proxy. Based on our preliminary analysis, we propose candidate genes that control aspects of plant physiology that affect these quantitative traits. Our results have important implications for uncovering the degree to which we can expect environmental versus genetic factors to modulate variability in n-alkane δD values. These findings can inform the interpretation of the proxy signal recovered from the geological record.
Ma, Ya-Jun; Tadros, Anthony; Du, Jiang; Chang, Eric Y
2018-04-01
To investigate quantitative 2D ultrashort echo time magnetization transfer (UTE-MT) imaging in ex vivo bovine cortical bone and in vivo human tibial cortical bone. Data were acquired from five fresh bovine cortical bone samples and five healthy volunteer tibial cortical bones using a 2D UTE-MT sequence on a clinical 3T scanner. The 2D UTE-MT sequence used four or five MT powers with five frequency offsets. Results were analyzed with a two-pool quantitative MT model, providing measurements of the macromolecular fraction (f), macromolecular proton transverse relaxation time (T2m), proton exchange rates from the water/macromolecular to the macromolecular/water pool (RM0m/RM0w), and spin-lattice relaxation rate of the water pool (R1w). A sequential air-drying study of a small bovine cortical bone chip was used to investigate whether the above MT modeling parameters were sensitive to water loss. Mean fresh bovine cortical bone values for f, T2m, R1w, RM0m, and RM0w were 59.9 ± 7.3%, 14.6 ± 0.3 μs, 9.9 ± 2.4 s⁻¹, 17.9 ± 3.6 s⁻¹, and 11.8 ± 2.0 s⁻¹, respectively. Mean in vivo human cortical bone values for f, T2m, R1w, RM0m and RM0w were 54.5 ± 4.9%, 15.4 ± 0.6 μs, 8.9 ± 1.1 s⁻¹, 11.5 ± 3.5 s⁻¹, and 9.5 ± 1.9 s⁻¹, respectively. The sequential air-drying study shows that f, RM0m, and R1w increased with longer drying time. UTE-MT two-pool modeling provides novel and useful quantitative information for cortical bone. Magn Reson Med 79:1941-1949, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo
2003-06-01
Through visual assessment by three-dimensional (3D) brain image analysis methods based on a stereotactic brain coordinate system, such as three-dimensional stereotactic surface projections and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Through quantitative local abnormality assessment using this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). Using twenty-five cases with DAT (mean age, 68.9 years old), all of whom were diagnosed as probable Alzheimer's disease based on NINCDS-ADRDA, we collected I-123 iodoamphetamine SPECT data. A 3D brain map generated with the 3D-SSP program was compared with data from 20 age-matched cases in the control group. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. We assessed the extent of the abnormal region in each segment (the proportion of coordinates with a Z-value exceeding the threshold value, among all coordinates within a segment) and its severity (the average Z-value of the coordinates with a Z-value exceeding the threshold value). This method clarified the orientation and expansion of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure. The method was considered useful for quantitatively grasping distribution abnormalities in the brain and changes in abnormality distribution.
A new method to evaluate image quality of CBCT images quantitatively without observers
Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori
2017-01-01
Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee
2018-06-01
We quantitatively analyzed background parenchymal enhancement (BPE) in whole breast according to menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of whole breast, the software calculated BPE using following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group showing significant difference (p = .009 for minimal vs. mild, p < 0.001 for other comparisons). Spearman's correlation test showed that there was strong significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% for those in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not different in other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
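The BPE equation quoted above maps directly onto array operations; the Python sketch below applies it voxel-wise and summarizes the whole-breast distribution. The signal-intensity arrays are synthetic stand-ins for the DCE-MRI series restricted to a breast mask.

```python
import numpy as np

def bpe_map(si_post, si_baseline):
    """Voxel-wise BPE (%) = (SI_post - SI_baseline) / SI_baseline * 100,
    following the equation given in the abstract."""
    return (si_post - si_baseline) / si_baseline * 100.0

rng = np.random.default_rng(7)

# Hypothetical signal intensities at baseline and at 1 min 30 s after
# contrast injection for a 64 x 64 x 32 breast volume.
baseline = rng.uniform(200, 400, size=(64, 64, 32))
post = baseline * rng.uniform(1.2, 1.8, size=baseline.shape)

bpe = bpe_map(post, baseline)
print("mean BPE: %.1f%%" % bpe.mean())
print("10th / median / 90th percentiles:", np.percentile(bpe, [10, 50, 90]).round(1))
```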
Battiston, Marco; Grussu, Francesco; Ianus, Andrada; Schneider, Torben; Prados, Ferran; Fairney, James; Ourselin, Sebastien; Alexander, Daniel C; Cercignani, Mara; Gandini Wheeler-Kingshott, Claudia A M; Samson, Rebecca S
2018-05-01
To develop a framework to fully characterize quantitative magnetization transfer indices in the human cervical cord in vivo within a clinically feasible time. A dedicated spinal cord imaging protocol for quantitative magnetization transfer was developed using a reduced field-of-view approach with an echo planar imaging (EPI) readout. Sequence parameters were optimized based on the Cramér-Rao lower bound. Quantitative model parameters (i.e., bound pool fraction, free and bound pool transverse relaxation times [T2F, T2B], and forward exchange rate [kFB]) were estimated by implementing a numerical model capable of dealing with the novelties of the sequence adopted. The framework was tested on five healthy subjects. Cramér-Rao lower bound minimization produces optimal sampling schemes without requiring the establishment of a steady-state MT effect. The proposed framework allows quantitative voxel-wise estimation of model parameters at the resolution typically used for spinal cord imaging (i.e., 0.75 × 0.75 × 5 mm³), with a protocol duration of ∼35 min. Quantitative magnetization transfer parametric maps agree with literature values. Whole-cord mean values are: bound pool fraction = 0.11 (±0.01), T2F = 46.5 (±1.6) ms, T2B = 11.0 (±0.2) µs, and kFB = 1.95 (±0.06) Hz. Protocol optimization has a beneficial effect on reproducibility, especially for T2B and kFB. The framework developed enables robust characterization of spinal cord microstructure in vivo using qMT. Magn Reson Med 79:2576-2588, 2018. © 2017 The Authors Magnetic Resonance in Medicine published by Wiley Periodicals, Inc. on behalf of the International Society for Magnetic Resonance in Medicine. This is an open access article under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
Size and maceral association of pyrite in Illinois coals and their float-sink fractions
Harvey, R.D.; DeMaris, P.J.
1987-01-01
The amount of pyrite (FeS2) removed by physical cleaning varies with differences in the amount of pyrite enclosed within minerals and of free pyrite in feed coals. A microscopic procedure for characterizing the size and maceral association of pyrite grains was developed and evaluated by testing three coals and their washed products. The results yield an index of the cleanability of pyrite. The index is dependent upon particle size and has intermediate values for feed coals, lower values for cleaned fractions, and higher values for refuse fractions; furthermore, it correlates with pyritic sulfur content. In the coals examined, the summed percentage of grain diameters of pyrite enclosed in vitrinite, liptinite, and bi- and trimacerite provides a quantitative measure of the proportion of early diagenetic deposition of pyrite. © 1987.
Lander, Rebecca L; Hambidge, K Michael; Krebs, Nancy F; Westcott, Jamie E; Garces, Ana; Figueroa, Lester; Tejeda, Gabriela; Lokangaka, Adrien; Diba, Tshilenge S; Somannavar, Manjunath S; Honnayya, Ranjitha; Ali, Sumera A; Khan, Umber S; McClure, Elizabeth M; Thorsten, Vanessa R; Stolka, Kristen B
2017-01-01
Background : Our aim was to utilize a feasible quantitative methodology to estimate the dietary adequacy of >900 first-trimester pregnant women in poor rural areas of the Democratic Republic of the Congo, Guatemala, India and Pakistan. This paper outlines the dietary methods used. Methods : Local nutritionists were trained at the sites by the lead study nutritionist and received ongoing mentoring throughout the study. Training topics focused on the standardized conduct of repeat multiple-pass 24-hr dietary recalls, including interview techniques, estimation of portion sizes, and construction of a unique site-specific food composition database (FCDB). Each FCDB was based on 13 food groups and included values for moisture, energy, 20 nutrients (i.e. macro- and micronutrients), and phytate (an anti-nutrient). Nutrient values for individual foods or beverages were taken from recently developed FAO-supported regional food composition tables or the USDA national nutrient database. Appropriate adjustments for differences in moisture and application of nutrient retention and yield factors after cooking were applied, as needed. Generic recipes for mixed dishes consumed by the study population were compiled at each site, followed by calculation of a median recipe per 100 g. Each recipe's nutrient values were included in the FCDB. Final site FCDB checks were planned according to FAO/INFOODS guidelines. Discussion : This dietary strategy provides the opportunity to assess estimated mean group usual energy and nutrient intakes and estimated prevalence of the population 'at risk' of inadequate intakes in first-trimester pregnant women living in four low- and middle-income countries. While challenges and limitations exist, this methodology demonstrates the practical application of a quantitative dietary strategy for a large international multi-site nutrition trial, providing within- and between-site comparisons. Moreover, it provides an excellent opportunity for local capacity building and each site FCDB can be easily modified for additional research activities conducted in other populations living in the same area.
Nardo, Lorenzo; Karampinos, Dimitrios C.; Joseph, Gabby B.; Yap, Samuel P.; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M.
2013-01-01
Objective The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Methods Sixty-two women (age 61±6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. Results A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P<0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0–4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Conclusion Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. PMID:22411305
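A minimal sketch of the kind of rank-correlation analysis reported above, relating ordinal grades to continuous fat-fraction measurements; the grade and fat-fraction values are hypothetical, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical example: ordinal Goutallier grades (0-4) from one reader and
# quantitative MRI fat fractions (%) for the same muscle compartments.
grades = np.array([0, 1, 1, 2, 2, 3, 3, 4, 0, 2])
fat_fraction = np.array([3.5, 5.2, 6.0, 8.1, 9.4, 12.7, 13.9, 19.0, 4.1, 8.8])

rho, p_value = stats.spearmanr(grades, fat_fraction)
print(f"Spearman rho = {rho:.2f}, P = {p_value:.4f}")
```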
Oakley, Robert H; Hudson, Christine C; Cruickshank, Rachael D; Meyers, Diane M; Payne, Richard E; Rhem, Shay M; Loomis, Carson R
2002-11-01
G protein-coupled receptors (GPCRs) have proven to be a rich source of therapeutic targets; therefore, finding compounds that regulate these receptors is a critical goal in drug discovery. The Transfluor technology utilizes the redistribution of fluorescently labeled arrestins from the cytoplasm to agonist-occupied receptors at the plasma membrane to monitor quantitatively the activation or inactivation of GPCRs. Here, we show that the Transfluor technology can be quantitated on the INCell Analyzer system (INCAS) using the vasopressin V(2) receptor (V(2)R), which binds arrestin with high affinity, and the beta(2)-adrenergic receptor (beta(2)AR), which binds arrestin with low affinity. U2OS cells stably expressing an arrestin-green fluorescent protein conjugate and either the V(2)R or the beta(2)AR were plated in 96-well plastic plates and analyzed by the INCAS at a screening rate of 5 min per plate. Agonist dose-response and antagonist dose-inhibition curves revealed signal-to-background ratios of approximately 25:1 and 8:1 for the V(2)R and beta(2)AR, respectively. EC(50) values agreed closely with K(d) values reported in the literature for the different receptor agonists. In addition, small amounts of arrestin translocation induced by sub-EC(50) doses of agonist were distinguished from the background noise of untreated cells. Furthermore, differences in the magnitude of arrestin translocation distinguished partial agonists from full agonists, and Z' values for these ligands were >0.5. These data show that the Transfluor technology, combined with an automated image analysis system, provides a direct, robust, and universal assay for high throughput screening of known and orphan GPCRs.
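The EC(50) values mentioned above come from fitting dose-response curves; a minimal sketch of such a fit with a four-parameter Hill model is shown below. The dose and response values and starting parameters are hypothetical, not the assay's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, n):
    """Four-parameter logistic (Hill) dose-response curve."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** n)

# Hypothetical agonist doses (M) and normalized arrestin-translocation responses.
dose = np.array([1e-10, 1e-9, 1e-8, 1e-7, 1e-6, 1e-5])
resp = np.array([0.02, 0.08, 0.30, 0.72, 0.95, 1.00])

params, _ = curve_fit(hill, dose, resp, p0=[0.0, 1.0, 1e-8, 1.0], maxfev=10000)
print("EC50 estimate (M):", params[2])
```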
Pargett, Michael; Umulis, David M
2013-07-15
Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended to both inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
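One simple instance of the optimal-scaling idea mentioned above is to fit an unknown scale factor between relative measurements and an absolute model prediction before computing residuals; the sketch below shows that least-squares scaling step with fabricated numbers, and is not drawn from the review itself.

```python
import numpy as np

def scaled_residuals(prediction, data):
    """Optimal-scaling comparison: data are meaningful only up to an unknown
    scale factor, so fit that factor by least squares before taking residuals."""
    s = np.dot(prediction, data) / np.dot(prediction, prediction)
    return data - s * prediction

# Hypothetical relative measurements (arbitrary units) and a model prediction
# in absolute units; direct subtraction of the two would be meaningless.
data = np.array([0.1, 0.4, 0.9, 1.0, 0.7])
prediction = np.array([12.0, 44.0, 95.0, 110.0, 80.0])
print("sum of squared scaled residuals:", np.sum(scaled_residuals(prediction, data) ** 2))
```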
2017-09-01
environment outcome. The value is site specific. It may depend on the travel time of groundwater from a source to a property boundary, sentry well...Figure 4.3.4. Decline in Concentrations of PCE, TCE, and cDCE + t-DCE Over Time in Well
2014-03-01
in a crisis. In this study, it was found that besides providing recognition and support, managers who create a positive climate in the work...issues related to hiring freezes at other commands by means of Ensuring employees feel valued and that their hard work is successfully impacting the...stress that can lead to poor health and absence from work. Stress is also created when employees are not only doing the tasks found in their job
Modeling the filament winding process
NASA Technical Reports Server (NTRS)
Calius, E. P.; Springer, G. S.
1985-01-01
A model is presented which can be used to determine the appropriate values of the process variables for filament winding a cylinder. The model provides the cylinder temperature, viscosity, degree of cure, fiber position and fiber tension as functions of position and time during the filament winding and subsequent cure, and the residual stresses and strains within the cylinder during and after the cure. A computer code was developed to obtain quantitative results. Sample results are given which illustrate the information that can be generated with this code.
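The abstract does not give the governing equations, but cure models of this kind are often built around Arrhenius-type kinetics; the sketch below integrates a generic nth-order cure law purely as an illustration, with made-up rate constants and a hypothetical temperature profile rather than the paper's formulation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic nth-order Arrhenius cure kinetics (illustrative constants only):
# d(alpha)/dt = A * exp(-Ea / (R * T)) * (1 - alpha)**n
A, Ea, R, n = 1.0e5, 6.0e4, 8.314, 1.5     # 1/s, J/mol, J/(mol K), reaction order

def temperature(t):
    """Hypothetical winding/cure temperature profile: ramp to 450 K, then hold."""
    return min(300.0 + 0.5 * t, 450.0)

def cure_rate(t, alpha):
    T = temperature(t)
    return A * np.exp(-Ea / (R * T)) * (1.0 - alpha) ** n

sol = solve_ivp(cure_rate, t_span=(0.0, 3600.0), y0=[0.0], max_step=5.0)
print("degree of cure after 1 h:", sol.y[0, -1])
```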
Discovering the Quantity of Quality: Scoring "Regional Identity" for Quantitative Research
ERIC Educational Resources Information Center
Miller, Daniel A.
2008-01-01
The variationist paradigm in sociolinguistics is at a disadvantage when dealing with variables that are traditionally treated qualitatively, e.g., "identity". This study essays to level the accuracy and descriptive value of qualitative research in a quantitative setting by rendering such a variable quantitatively accessible. To this end,…
Characterization of Microporous Insulation, Microsil
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, R.
Microsil microporous insulation has been characterized by Lawrence Livermore National Laboratory for possible use in structural and thermal applications in the DPP-1 design. Qualitative test results have provided mechanical behavioral characteristics for DPP-1 design studies and focused on the material behavioral response to being crushed, cyclically loaded, and subjected to vibration for a confined material with an interference fit or a radial gap. Quantitative test results have provided data to support the DPP-1 FEA model analysis and verification and were used to determine mechanical property values for the material under a compression load. The test results are documented within this report.
Madej, Roberta M.; Davis, Jack; Holden, Marcia J.; Kwang, Stan; Labourier, Emmanuel; Schneider, George J.
2010-01-01
The utility of quantitative molecular diagnostics for patient management depends on the ability to relate patient results to prior results or to absolute values in clinical practice guidelines. To do this, those results need to be comparable across time and methods, either by producing the same value across methods and test versions or by using reliable and stable conversions. Universally available standards and reference materials specific to quantitative molecular technologies are critical to this process but are few in number. This review describes recent history in the establishment of international standards for nucleic acid test development, organizations involved in current efforts, and future issues and initiatives. PMID:20075208
A color prediction model for imagery analysis
NASA Technical Reports Server (NTRS)
Skaley, J. E.; Fisher, J. R.; Hardy, E. E.
1977-01-01
A simple model has been devised to selectively construct several points within a scene using multispectral imagery. The model correlates black-and-white density values to color components of diazo film so as to maximize the color contrast of two or three points per composite. The CIE (Commission Internationale de l'Eclairage) color coordinate system is used as a quantitative reference to locate these points in color space. Superimposed on this quantitative reference is a perceptional framework which functionally contrasts color values in a psychophysical sense. This methodology permits a more quantitative approach to the manual interpretation of multispectral imagery while resulting in improved accuracy and lower costs.
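As a small illustration of using the CIE system as a quantitative reference, the sketch below projects tristimulus values onto (x, y) chromaticity coordinates; the mapping from film densities to XYZ used in the paper is not reproduced, and the input values are hypothetical.

```python
def xyz_to_xy(X, Y, Z):
    """Project CIE XYZ tristimulus values to (x, y) chromaticity coordinates."""
    s = X + Y + Z
    return X / s, Y / s

# Two hypothetical composite colours to be contrasted in chromaticity space.
point_a = xyz_to_xy(41.2, 21.3, 1.9)
point_b = xyz_to_xy(18.0, 35.0, 10.0)
print(point_a, point_b)
```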
Ning, Lei; Song, Li-Jiang; Fan, Shun-Wu; Zhao, Xing; Chen, Yi-Lei; Li, Zhao-Zhi; Hu, Zi-Ang
2017-10-11
This study established gender-specific reference values in mainland Chinese (MC) and is important for quantitative morphometry for diagnosis and epidemiological study of osteoporotic vertebral compressive fracture. Comparisons of reference values among different racial populations are then performed to demonstrate the MC-specific characteristic. Osteoporotic vertebral compressive fracture (OVCF) is a common complication of osteoporosis in the elderly population. Clinical diagnosis and epidemiological study of OVCF often employ quantitative morphometry, which relies heavily on the comparison of patients' vertebral parameters to existing reference values derived from the normal population. Thus, reference values are crucial in clinical diagnosis. To our knowledge, this is the first study to establish reference values of the mainland Chinese (MC) for quantitative morphometry. Vertebral heights including anterior (Ha), middle (Hm), and posterior (Hp) heights, and predicted posterior height (pp), from T4 to L5 were obtained, and ratios of Ha/Hp, Hm/Hp, and Hp/pp were calculated from 585 MC (both female and male) for establishing reference values and subsequent comparisons with other studies. Vertebral heights increased progressively from T4 to L3 but then decreased in L4 and L5. Both genders showed similar ratios of vertebral dimensions, but male vertebrae were statistically larger than those of females (P < 0.01). Vertebral size of the MC population was smaller than that of the US and UK populations, but was surprisingly larger than that of Hong Kong Chinese, although these two are commonly considered as one race. Data from different racial populations showed similar dimensional ratios in all vertebrae. We established gender-specific reference values for MC. Our results also indicated the necessity of establishing reference values that are not only race- and gender-specific, but also population- or region-specific for accurate quantitative morphometric assessment of OVCF.
Fuzzy Performance between Surface Fitting and Energy Distribution in Turbulence Runner
Liang, Zhongwei; Liu, Xiaochu; Ye, Bangyan; Brauwer, Richard Kars
2012-01-01
Because the application of surface fitting algorithms exerts a considerable fuzzy influence on the mathematical features of kinetic energy distribution, their relation mechanism under different external conditional parameters must be quantitatively analyzed. Through determining the kinetic energy value of each selected representative position coordinate point by calculating kinetic energy parameters, several typical algorithms of complicated surface fitting are applied for constructing microkinetic energy distribution surface models in the objective turbulence runner with those obtained kinetic energy values. On the basis of the newly proposed mathematical features, we construct a fuzzy evaluation data sequence and present a new three-dimensional fuzzy quantitative evaluation method; the value change tendencies of kinetic energy distribution surface features can then be clearly quantified, and the fuzzy performance mechanism discipline between the performance results of surface fitting algorithms, the spatial features of the turbulence kinetic energy distribution surface, and their respective environmental parameter conditions can be quantitatively analyzed in detail, leading to final conclusions concerning the inherent turbulence kinetic energy distribution performance mechanism and its mathematical relation. This enables further quantitative study of turbulence energy. PMID:23213287
NASA Astrophysics Data System (ADS)
Jiang, Y.; Liu, J.-R.; Luo, Y.; Yang, Y.; Tian, F.; Lei, K.-C.
2015-11-01
Groundwater in Beijing has been excessively exploited for a long time, causing the groundwater level to keep declining and land subsidence areas to expand, which has restrained sustainable economic and social development. Long-term studies show a good time-space correspondence between groundwater level and land subsidence. To provide a scientific basis for subsequent land subsidence prevention and treatment, quantitative research on the relationship between groundwater level and settlement is necessary. Multi-linear regression models are set up from long monitoring series of layered water table and settlement data at the Tianzhu monitoring station. The results show that layered settlement is closely related to the water table, water level variation, and amplitude, especially the water table. Finally, according to the threshold values in the land subsidence prevention and control plan of China (45, 30, 25 mm), the minimum allowable layered water level in this region when settlement reaches the threshold is calculated to lie between -18.448 and -10.082 m. The results provide a reasonable and operable groundwater level control target for rational adjustment of the exploited groundwater horizon in the future.
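A minimal sketch of the multi-linear regression and threshold inversion described above, using fabricated monitoring values purely for illustration; the reported allowable levels (-18.448 to -10.082 m) come from the station data, not from this example.

```python
import numpy as np

# Hypothetical monitoring series: layered water table (m), annual level change (m),
# seasonal amplitude (m), and observed layer settlement (mm).
water_table = np.array([-8.0, -9.5, -11.0, -12.6, -14.1, -15.9, -17.3, -18.4])
level_change = np.array([-1.2, -1.5, -1.5, -1.6, -1.5, -1.8, -1.4, -1.1])
amplitude = np.array([2.1, 2.4, 2.0, 2.6, 2.3, 2.7, 2.2, 2.5])
settlement = np.array([12.0, 15.5, 19.8, 24.1, 28.0, 33.2, 37.5, 41.0])

# Multi-linear regression via least squares:
# settlement ~ b0 + b1*water_table + b2*level_change + b3*amplitude
X = np.column_stack([np.ones_like(water_table), water_table, level_change, amplitude])
coef, *_ = np.linalg.lstsq(X, settlement, rcond=None)
print("intercept and coefficients:", coef)

# Invert the fitted relation to find the water table at which a settlement
# threshold (e.g. 30 mm) is reached, holding the other terms at their means.
threshold = 30.0
allowable_table = (threshold - coef[0] - coef[2] * level_change.mean()
                   - coef[3] * amplitude.mean()) / coef[1]
print("allowable water table for a 30 mm threshold:", allowable_table)
```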
van den Beld, Maaike J C; Friedrich, Alexander W; van Zanten, Evert; Reubsaet, Frans A G; Kooistra-Smid, Mirjam A M D; Rossen, John W A
2016-12-01
An inter-laboratory collaborative trial for the evaluation of diagnostics for detection and identification of Shigella species and Entero-invasive Escherichia coli (EIEC) was performed. Sixteen Medical Microbiological Laboratories (MMLs) participated. MMLs were interviewed about their diagnostic methods and a sample panel, consisting of DNA-extracts and spiked stool samples with different concentrations of Shigella flexneri, was provided to each MML. The results of the trial showed an enormous variety in culture-dependent and molecular diagnostic techniques currently used among MMLs. Despite the various molecular procedures, 15 out of 16 MMLs were able to detect Shigella species or EIEC in all the samples provided, showing that the diversity of methods has no effect on the qualitative detection of Shigella flexneri. In the semi-quantitative analysis, by contrast, the minimum and maximum values per sample differed by approximately five threshold cycles (Ct value) between the MMLs included in the study. This indicates that defining a uniform Ct-value cut-off for notification to health authorities is not advisable. Copyright © 2016 Elsevier B.V. All rights reserved.
Wildlife use of back channels associated with islands on the Ohio River
Zadnik, A.K.; Anderson, James T.; Wood, P.B.; Bledsoe, K.
2009-01-01
The back channels of islands on the Ohio River are assumed to provide habitat critical for several wildlife species. However, quantitative information on the wildlife value of back channels is needed by natural resource managers for the conservation of these forested islands and embayments in the face of increasing shoreline development and recreational boating. We compared the relative abundance of waterbirds, turtles, anurans, and riparian furbearing mammals during 2001 and 2002 in back and main channels of the Ohio River in West Virginia. Wood ducks (Aix sponsa), snapping turtles (Chelydra serpentina), beavers (Castor canadensis), and muskrats (Ondatra zibethicus) were more abundant in back than main channels. Spring peepers (Pseudacris crucifer) and American toads (Bufo americanus) occurred more frequently on back than main channels. These results provide quantitative evidence that back channels are important for several wildlife species. The narrowness of the back channels, the protection they provide from the main current of the river, and their ability to support vegetated shorelines and woody debris, are characteristics that appear to benefit these species. As a conservation measure for important riparian wildlife habitat, we suggest limiting building of piers and development of the shoreline in back channel areas. © 2009, The Society of Wetland Scientists.
Aragaw, Amanu; Yigzaw, Tegbar; Tetemke, Desalegn; G/Amlak, Wubalem
2015-09-24
Cultural competency is now a core requirement for maternal health providers working in a multicultural society. However, it has not yet received due attention in Ethiopia. This study aimed to determine the level of cultural competence and its associated factors among maternal health care providers in Bahir Dar City Administration, Northwest Ethiopia. An institution-based cross-sectional study was carried out using both quantitative and qualitative methods. Maternal health care providers from all health facilities were our study participants. A structured questionnaire based on a modified version of Campinha-Bacote's tool was used to collect quantitative data from health workers, and a semi-structured guideline was used to collect qualitative data among women. Quantitative data analysis was done using SPSS, while qualitative data were analyzed using Open Code software. A P-value of less than 0.05 was taken to determine statistical significance. Cronbach's alpha was used to test internal reliability, and a factor loading of 0.3 or greater was the criterion used to retain items. Two hundred seventy-four health workers and seven women were involved in the study. The overall competency level was 57.3 %, though it varied across subscales or stages. Of the culturally competent health workers, nearly three-fourths (73.0 %) were in the awareness stage, the earliest stage of competence, in which individuals are aware only of their own culture but not the world view of their clients. The voices of mothers in the qualitative assessment also showed discordance in cultural competence with their healthcare providers. Female health workers were almost six times more competent than male providers [AOR 5.5; 2.71, 11.30], and those who received in-service training related to maternal care provided more culturally competent services than their counterparts [AOR 3.5; 1.4, 8.64]. Cronbach's α reliability coefficients of the cultural competence subscales were 0.672, 0.719, 0.658, 0.714, and 0.631 for cultural awareness, knowledge, skill, encounter, and desire, respectively. The overall competence level of health workers was low, and the mean competence level fell in the awareness stage of the continuum from culturally incompetent through culturally aware and culturally competent to culturally proficient, indicating that providers were aware of only their own culture but not the world view of their clients. The voices of mothers also showed that they were dissatisfied with the services they received and the interactions they had with health care providers. Hence, we recommend on-the-job training of health workers and incorporation of cultural components into the curriculum of health workers as key steps toward providing culturally acceptable services.
NASA Astrophysics Data System (ADS)
Lanen, Theo A.; Watt, David W.
1995-10-01
Singular value decomposition has served as a diagnostic tool in optical computed tomography by using its capability to provide insight into the condition of ill-posed inverse problems. Various tomographic geometries are compared to one another through the singular value spectrum of their weight matrices. The number of significant singular values in the singular value spectrum of a weight matrix is a quantitative measure of the condition of the system of linear equations defined by a tomographic geometry. The analysis involves variation of the following five parameters characterizing a tomographic geometry: 1) the spatial resolution of the reconstruction domain, 2) the number of views, 3) the number of projection rays per view, 4) the total observation angle spanned by the views, and 5) the selected basis function. Five local basis functions are considered: the square pulse, the triangle, the cubic B-spline, the Hanning window, and the Gaussian distribution. Also assessed are items such as the presence of noise in the views, the coding accuracy of the weight matrix, and the accuracy of the singular value decomposition procedure itself.
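A small sketch of the central quantity discussed above, counting the significant singular values of a weight matrix; the matrices, sizes, and relative tolerance are placeholders rather than any of the paper's tomographic geometries.

```python
import numpy as np

def significant_singular_values(W, rel_tol=1e-3):
    """Count singular values of a weight matrix W that exceed rel_tol times the
    largest singular value; a larger count indicates a better-conditioned system."""
    s = np.linalg.svd(W, compute_uv=False)
    return int(np.sum(s > rel_tol * s[0])), s

# Hypothetical weight matrices for two view geometries (rows: projection rays,
# columns: basis-function coefficients on the reconstruction grid).
rng = np.random.default_rng(0)
W_few_views = rng.random((40, 100))
W_many_views = rng.random((160, 100))

for name, W in [("few views", W_few_views), ("many views", W_many_views)]:
    n_sig, _ = significant_singular_values(W)
    print(name, "significant singular values:", n_sig)
```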
2011-01-01
Background Large proportions of children do not fulfil the World Health Organization recommendation of eating at least 400 grams of fruit and vegetables (FV) per day. To promote an increased FV intake among children it is important to identify factors which influence their consumption. Both qualitative and quantitative studies are needed. Earlier reviews have analysed evidence from quantitative studies. The aim of this paper is to present a systematic review of qualitative studies of determinants of children's FV intake. Methods Relevant studies were identified by searching Anthropology Plus, Cinahl, CSA illumine, Embase, International Bibliography of the Social Sciences, Medline, PsycINFO, and Web of Science using combinations of synonyms for FV intake, children/adolescents and qualitative methods as search terms. The literature search was completed by December 1st 2010. Papers were included if they applied qualitative methods to investigate 6-18-year-olds' perceptions of factors influencing their FV consumption. Quantitative studies, review studies, studies reported in other languages than English, and non-peer reviewed or unpublished manuscripts were excluded. The papers were reviewed systematically using standardised templates for summary of papers, quality assessment, and synthesis of findings across papers. Results The review included 31 studies, mostly based on US populations and focus group discussions. The synthesis identified the following potential determinants for FV intake which supplement the quantitative knowledge base: Time costs; lack of taste guarantee; satiety value; appropriate time/occasions/settings for eating FV; sensory and physical aspects; variety, visibility, methods of preparation; access to unhealthy food; the symbolic value of food for image, gender identity and social interaction with peers; short term outcome expectancies. Conclusions The review highlights numerous potential determinants which have not been investigated thoroughly in quantitative studies. Future large scale quantitative studies should attempt to quantify the importance of these factors. Further, mechanisms behind gender, age and socioeconomic differences in FV consumption are proposed which should be tested quantitatively in order to better tailor interventions to vulnerable groups. Finally, the review provides input to the conceptualisation and measurements of concepts (i.e. peer influence, availability in schools) which may refine survey instruments and theoretical frameworks concerning eating behaviours. PMID:21999291
Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin
2018-03-14
The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently questions the value of histopathologic methods in the evaluation of biological changes. To date, the available tools of evaluation are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of the regenerated tissues. For example, metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy property of the collagen (maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively present in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
Geerts, Hugo; Roberts, Patrick; Spiros, Athan; Potkin, Steven
2015-04-01
The concept of targeted therapies remains a holy grail for the pharmaceutical drug industry for identifying responder populations or new drug targets. Here we provide quantitative systems pharmacology as an alternative to the more traditional approach of retrospective responder pharmacogenomics analysis and applied this to the case of iloperidone in schizophrenia. This approach implements the actual neurophysiological effect of genotypes in a computer-based biophysically realistic model of human neuronal circuits, is parameterized with human imaging and pathology, and is calibrated by clinical data. We keep the drug pharmacology constant, but allowed the biological model coupling values to fluctuate in a restricted range around their calibrated values, thereby simulating random genetic mutations and representing variability in patient response. Using hypothesis-free Design of Experiments methods the dopamine D4 R-AMPA (receptor-alpha-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid) receptor coupling in cortical neurons was found to drive the beneficial effect of iloperidone, likely corresponding to the rs2513265 upstream of the GRIA4 gene identified in a traditional pharmacogenomics analysis. The serotonin 5-HT3 receptor-mediated effect on interneuron gamma-aminobutyric acid conductance was identified as the process that moderately drove the differentiation of iloperidone versus ziprasidone. This paper suggests that reverse-engineered quantitative systems pharmacology is a powerful alternative tool to characterize the underlying neurobiology of a responder population and possibly identifying new targets. © The Author(s) 2015.
Sabour, Siamak
2018-03-08
The purpose of this letter, in response to Hall, Mehta, and Fackrell (2017), is to provide important knowledge about methodology and statistical issues in assessing the reliability and validity of an audiologist-administered tinnitus loudness matching test and a patient-reported tinnitus loudness rating. The author uses reference textbooks and published articles regarding scientific assessment of the validity and reliability of a clinical test to discuss the statistical test and the methodological approach in assessing validity and reliability in clinical research. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess reliability and validity. The qualitative variables of sensitivity, specificity, positive predictive value, negative predictive value, false positive and false negative rates, likelihood ratio positive and likelihood ratio negative, as well as odds ratio (i.e., ratio of true to false results), are the most appropriate estimates to evaluate validity of a test compared to a gold standard. In the case of quantitative variables, depending on distribution of the variable, Pearson r or Spearman rho can be applied. Diagnostic accuracy (validity) and diagnostic precision (reliability or agreement) are two completely different methodological issues. Depending on the type of the variable (qualitative or quantitative), well-known statistical tests can be applied to assess validity.
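For the qualitative-variable case described above, the standard validity estimates follow directly from a 2x2 table of test results against a gold standard; a minimal sketch with hypothetical counts is shown below.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Validity estimates from a 2x2 table of test results vs. a gold standard."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)                       # positive predictive value
    npv = tn / (tn + fn)                       # negative predictive value
    lr_pos = sensitivity / (1 - specificity)   # likelihood ratio positive
    lr_neg = (1 - sensitivity) / specificity   # likelihood ratio negative
    odds_ratio = (tp * tn) / (fp * fn)         # ratio of true to false results
    return sensitivity, specificity, ppv, npv, lr_pos, lr_neg, odds_ratio

# Hypothetical counts: 40 true positives, 5 false positives,
# 10 false negatives, 45 true negatives.
print(diagnostic_metrics(40, 5, 10, 45))
```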
The impact of injector-based contrast agent administration in time-resolved MRA.
Budjan, Johannes; Attenberger, Ulrike I; Schoenberg, Stefan O; Pietsch, Hubertus; Jost, Gregor
2018-05-01
Time-resolved contrast-enhanced MR angiography (4D-MRA), which allows the simultaneous visualization of the vasculature and blood-flow dynamics, is widely used in clinical routine. In this study, the impact of two different contrast agent injection methods on 4D-MRA was examined in a controlled, standardized setting in an animal model. Six anesthetized Goettingen minipigs underwent two identical 4D-MRA examinations at 1.5 T in a single session. The contrast agent (0.1 mmol/kg body weight gadobutrol, followed by 20 ml saline) was injected using either manual injection or an automated injection system. A quantitative comparison of vascular signal enhancement and quantitative renal perfusion analyses were performed. Analysis of signal enhancement revealed higher peak enhancements and shorter time to peak intervals for the automated injection. Significantly different bolus shapes were found: automated injection resulted in a compact first-pass bolus shape clearly separated from the recirculation while manual injection resulted in a disrupted first-pass bolus with two peaks. In the quantitative perfusion analyses, statistically significant differences in plasma flow values were found between the injection methods. The results of both qualitative and quantitative 4D-MRA depend on the contrast agent injection method, with automated injection providing more defined bolus shapes and more standardized examination protocols. • Automated and manual contrast agent injection result in different bolus shapes in 4D-MRA. • Manual injection results in an undefined and interrupted bolus with two peaks. • Automated injection provides more defined bolus shapes. • Automated injection can lead to more standardized examination protocols.
Google glass based immunochromatographic diagnostic test analysis
NASA Astrophysics Data System (ADS)
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2015-03-01
Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
Digital PCR analysis of circulating nucleic acids.
Hudecova, Irena
2015-10-01
Detection of plasma circulating nucleic acids (CNAs) requires the use of extremely sensitive and precise methods. The commonly used quantitative real-time polymerase chain reaction (PCR) poses certain technical limitations in relation to the precise measurement of CNAs whereas the costs of massively parallel sequencing are still relatively high. Digital PCR (dPCR) now represents an affordable and powerful single molecule counting strategy to detect minute amounts of genetic material with performance surpassing many quantitative methods. Microfluidic (chip) and emulsion (droplet)-based technologies have already been integrated into platforms offering hundreds to millions of nanoliter- or even picoliter-scale reaction partitions. The compelling observations reported in the field of cancer research, prenatal testing, transplantation medicine and virology support translation of this technology into routine use. Extremely sensitive plasma detection of rare mutations originating from tumor or placental cells among a large background of homologous sequences facilitates unraveling of the early stages of cancer or the detection of fetal mutations. Digital measurement of quantitative changes in plasma CNAs associated with cancer or graft rejection provides valuable information on the monitoring of disease burden or the recipient's immune response and subsequent therapy treatment. Furthermore, careful quantitative assessment of the viral load offers great value for effective monitoring of antiviral therapy for immunosuppressed or transplant patients. The present review describes the inherent features of dPCR that make it exceptionally robust in precise and sensitive quantification of CNAs. Moreover, I provide an insight into the types of potential clinical applications that have been developed by researchers to date. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
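Digital PCR quantification rests on Poisson statistics over the reaction partitions; the sketch below shows the standard Poisson correction from the fraction of positive partitions, with hypothetical partition counts and volume rather than values from the review.

```python
import math

def dpcr_concentration(positive, total, partition_volume_nl):
    """Poisson-corrected target concentration (copies/uL) from dPCR partition counts."""
    p = positive / total                                 # fraction of positive partitions
    mean_copies = -math.log(1.0 - p)                     # average copies per partition (Poisson)
    return mean_copies / (partition_volume_nl * 1e-3)    # nl -> uL

# Hypothetical droplet dPCR run: 14,000 of 20,000 droplets positive, 0.85 nl droplets.
print(dpcr_concentration(14000, 20000, 0.85), "copies/uL")
```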
Mulder, R Joshua; Guerra, Célia Fonseca; Bickelhaupt, F Matthias
2010-07-22
We have computed the methyl cation affinities in the gas phase of archetypal anionic and neutral bases across the periodic table using ZORA-relativistic density functional theory (DFT) at BP86/QZ4P//BP86/TZ2P. The main purpose of this work is to provide the methyl cation affinities (and corresponding entropies) at 298 K of all anionic (XH(n-1)(-)) and neutral bases (XH(n)) constituted by main-group-element hydrides of groups 14-17 and the noble gases (i.e., group 18) along periods 2-6. The cation affinity of the bases decreases from H(+) to CH(3)(+). To understand this trend, we have carried out quantitative bond energy decomposition analyses (EDA). Quantitative correlations are established between the MCA and PA values.
Prokushkin, S G; Prokushkin, A S; Sorokin, N D
2014-01-01
Based on the results of long-term investigations, a quantitative assessment of phytodetrite mineralization rates is provided. Their role in the biological cycle of larch stands growing in the permafrost zone of Central Evenkia is discussed. It is demonstrated that their destruction in the subshrub-sphagnum and cowberry-green moss larch stands is extremely slow: the plant litter contains the most recalcitrant organic matter, demonstrating the lowest decomposition coefficient of 0.03-0.04 year(-1), whereas fresh components of the plant litter have 3- to 4-fold higher values. An insignificant input of N and C from the analyzed mortmass to the soil has been registered. It has been revealed that the changes in N and C in the decomposing components are closely related to the quantitative dynamics (biomass) of microorganisms, such as hydrolytics and, especially, micromycetes.
Gyrokinetic modeling of impurity peaking in JET H-mode plasmas
NASA Astrophysics Data System (ADS)
Manas, P.; Camenen, Y.; Benkadda, S.; Weisen, H.; Angioni, C.; Casson, F. J.; Giroud, C.; Gelfusa, M.; Maslov, M.
2017-06-01
Quantitative comparisons are presented between gyrokinetic simulations and experimental values of the carbon impurity peaking factor in a database of JET H-modes during the carbon wall era. These plasmas feature strong NBI heating and hence high values of toroidal rotation and corresponding gradient. Furthermore, the carbon profiles present particularly interesting shapes for fusion devices, i.e., hollow in the core and peaked near the edge. Dependencies of the experimental carbon peaking factor (R/LnC) on plasma parameters are investigated via multilinear regressions. A marked correlation between R/LnC and the normalised toroidal rotation gradient is observed in the core, which suggests an important role of the rotation in establishing hollow carbon profiles. The carbon peaking factor is then computed with the gyrokinetic code GKW, using a quasi-linear approach, supported by a few non-linear simulations. The comparison of the quasi-linear predictions to the experimental values at mid-radius reveals two main regimes. At low normalised collisionality, ν*, and Te/Ti < 1, the gyrokinetic simulations quantitatively recover experimental carbon density profiles, provided that rotodiffusion is taken into account. In contrast, at higher ν* and Te/Ti > 1, the very hollow experimental carbon density profiles are never predicted by the simulations and the carbon density peaking is systematically overestimated. This points to a possible missing ingredient in this regime.
Tang, Yunhua; Han, Ming; Chen, Maogen; Wang, Xiaoping; Ji, Fei; Zhao, Qiang; Zhang, Zhiheng; Ju, Weiqiang; Wang, Dongping; Guo, Zhiyong; He, Xiaoshun
2017-11-01
Transplantation centers have given much attention to donor availability. However, no reliable quantitative methods have been employed to accurately assess graft quality before transplantation. Here, we report that the indocyanine green (ICG) clearance test is a valuable index for liver grafts. We performed the ICG clearance test on 90 brain-dead donors within 6 h before organ procurement between March 2015 and November 2016. We also analyzed the relationship between graft liver function and early graft survival after liver transplantation (LT). Our results suggest that the ICG retention rate at 15 min (ICGR15) of donors before procurement was independently associated with 3-month graft survival after LT. The best donor ICGR15 cutoff value was 11.0%/min, and we observed a significant increase in 3-month graft failure among patients with a donor ICGR15 above this value. On the other hand, a donor ICGR15 value of ≤ 11.0%/min could be used as an early assessment index of graft quality because it provides additional information to the transplant surgeon or organ procurement organization members who must maintain or improve organ function to adapt the LT. An ICG clearance test before liver procurement might be an effective quantitative method to predict graft availability and improve early graft prognosis after LT.
Ramanathan, Rajesh; Duane, Therese M; Kaplan, Brian J; Farquhar, Doris; Kasirajan, Vigneshwar; Ferrada, Paula
2015-01-01
To describe and evaluate a root cause analysis (RCA)-based educational curriculum for quality improvement (QI) practice-based learning and implementation in general surgery residency. A QI curriculum was designed using RCA and spaced-learning approaches to education. The program included a didactic session about the RCA methodology. Resident teams comprising multiple postgraduate years then selected a personal complication, completed an RCA, and presented the findings to the Department of Surgery. Mixed methods consisting of quantitative assessment of performance and qualitative feedback about the program were used to assess the value, strengths, and limitations of the program. Urban tertiary academic medical center. General surgery residents, faculty, and medical students. An RCA was completed by 4 resident teams for the following 4 adverse outcomes: postoperative neck hematoma, suboptimal massive transfusion for trauma, venous thromboembolism, and decubitus ulcer complications. Quantitative peer assessment of their performance revealed proficiency in selecting an appropriate case, defining the central problem, identifying root causes, and proposing solutions. During the qualitative feedback assessment, residents noted value of the course, with the greatest limitation being time constraints and equal participation. An RCA-based curriculum can provide general surgery residents with QI exposure and training that they value. Barriers to successful implementation include time restrictions and equal participation from all involved members. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Yoon, Jung Hyun; Ko, Kyung Hee; Jung, Hae Kyoung; Lee, Jong Tae
2013-12-01
To determine the correlation of qualitative shear wave elastography (SWE) pattern classification with quantitative SWE measurements and whether it is representative of quantitative SWE values with similar performance. From October 2012 to January 2013, 267 breast masses of 236 women (mean age: 45.12 ± 10.54 years, range: 21-88 years) who had undergone ultrasonography (US), SWE, and subsequent biopsy were included. US BI-RADS final assessment and qualitative and quantitative SWE measurements were recorded. Correlation between pattern classification and mean elasticity, maximum elasticity, elasticity ratio, and standard deviation was evaluated. Diagnostic performances of grayscale US, SWE parameters, and US combined with SWE values were calculated and compared. Of the 267 breast masses, 208 (77.9%) were benign and 59 (22.1%) were malignant. Pattern classifications significantly correlated with all quantitative SWE measurements, showing the highest correlation with maximum elasticity, r = 0.721 (P<0.001). Compared with grayscale US alone, sensitivity was significantly decreased when US was combined with SWE measurements (from 100.0% to 69.5-89.8%), while specificity was significantly improved (from 13.9% to 62.5-81.7%) (P<0.001). Area under the ROC curve (Az) did not show significant differences between grayscale US and US combined with SWE (P>0.05). Pattern classification shows high correlation with maximum stiffness and may be representative of quantitative SWE values. When combined with grayscale US, SWE improves the specificity of US. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Phelps, Charles E; Lakdawalla, Darius N; Basu, Anirban; Drummond, Michael F; Towse, Adrian; Danzon, Patricia M
2018-02-01
The fifth section of our Special Task Force report identifies and discusses two aggregation issues: 1) aggregation of cost and benefit information across individuals to a population level for benefit plan decision making and 2) combining multiple elements of value into a single value metric for individuals. First, we argue that additional elements could be included in measures of value, but such elements have not generally been included in measures of quality-adjusted life-years. For example, we describe a recently developed extended cost-effectiveness analysis (ECEA) that provides a good example of how to use a broader concept of utility. ECEA adds two features-measures of financial risk protection and income distributional consequences. We then discuss a further option for expanding this approach-augmented CEA, which can introduce many value measures. Neither of these approaches, however, provide a comprehensive measure of value. To resolve this issue, we review a technique called multicriteria decision analysis that can provide a comprehensive measure of value. We then discuss budget-setting and prioritization using multicriteria decision analysis, issues not yet fully resolved. Next, we discuss deliberative processes, which represent another important approach for population- or plan-level decisions used by many health technology assessment bodies. These use quantitative information on CEA and other elements, but the group decisions are reached by a deliberative voting process. Finally, we briefly discuss the use of stated preference methods for developing "hedonic" value frameworks, and conclude with some recommendations in this area. Copyright © 2018 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Hollowell, Jennifer; Li, Yangmei; Malouf, Reem; Buchanan, James
2016-08-08
Current clinical guidelines and national policy in England support offering 'low risk' women a choice of birth setting, but despite an increase in provision of midwifery units in England, the vast majority of women still give birth in obstetric units and there is uncertainty around how best to configure services. There is therefore a need to better understand women's birth place preferences. The aim of this review was to summarise the recent quantitative evidence on UK women's birth place preferences with a focus on identifying the service attributes that 'low risk' women prefer and on identifying which attributes women prioritise when choosing their intended maternity unit or birth setting. We searched Medline, Embase, PsycINFO, Science Citation Index, Social Science Index, CINAHL and ASSIA to identify quantitative studies published in scientific journals since 1992 and designed to describe and explore women's preferences in relation to place of birth. We included experimental stated preference studies, surveys and mixed-methods studies containing relevant quantitative data, where participants were 'low risk' or 'unselected' groups of women with experience of UK maternity services. We included five experimental stated preference studies and four observational surveys, including a total of 4201 respondents. Most studies were old, with only three conducted since 2000. Methodological quality was generally poor. The attributes and preferences most commonly explored related to pain relief, continuity of midwife, involvement/availability of medical staff, 'homely' environment/atmosphere, decision-making style, distance/travel time and need for transfer. Service attributes that were almost universally valued by women included local services, being attended by a known midwife and a preference for a degree of control and involvement in decision-making. A substantial proportion of women had a strong preference for care in a hospital setting where medical staff are not necessarily involved in their care, but are readily available. The majority of women appear to value some service attributes while preferences differ for others. Policy makers, commissioners and service providers might usefully consider how to extend the availability of services that most women value while offering a choice of options that enable women to access services that best fit their needs and preferences.
Vlot, Anna H C; de Witte, Wilhelmus E A; Danhof, Meindert; van der Graaf, Piet H; van Westen, Gerard J P; de Lange, Elizabeth C M
2017-12-04
Selectivity is an important attribute of effective and safe drugs, and prediction of in vivo target and tissue selectivity would likely improve drug development success rates. However, a lack of understanding of the underlying (pharmacological) mechanisms and availability of directly applicable predictive methods complicates the prediction of selectivity. We explore the value of combining physiologically based pharmacokinetic (PBPK) modeling with quantitative structure-activity relationship (QSAR) modeling to predict the influence of the target dissociation constant (KD) and the target dissociation rate constant on target and tissue selectivity. The KD values of CB1 ligands in the ChEMBL database are predicted by QSAR random forest (RF) modeling for the CB1 receptor and known off-targets (TRPV1, mGlu5, 5-HT1a). Of these CB1 ligands, rimonabant, CP-55940, and Δ8-tetrahydrocannabinol, one of the active ingredients of cannabis, were selected for simulations of target occupancy for CB1, TRPV1, mGlu5, and 5-HT1a in three brain regions, to illustrate the principles of the combined PBPK-QSAR modeling. Our combined PBPK and target binding modeling demonstrated that the optimal values of KD and koff for target and tissue selectivity were dependent on target concentration and tissue distribution kinetics. Interestingly, if the target concentration is high and the perfusion of the target site is low, the optimal KD value is often not the lowest KD value, suggesting that optimization towards high drug-target affinity can decrease the benefit-risk ratio. The presented integrative structure-pharmacokinetic-pharmacodynamic modeling provides an improved understanding of tissue and target selectivity.
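The role of KD in occupancy can be illustrated with the basic equilibrium binding relation; the sketch below is not the paper's PBPK-QSAR model, and all rate constants and concentrations are hypothetical.

```python
import numpy as np

def equilibrium_occupancy(conc, kd):
    """Fractional target occupancy at equilibrium for free drug concentration conc."""
    return conc / (conc + kd)

# KD follows from the binding kinetics: KD = koff / kon.
kon, koff = 1.0e6, 0.01          # hypothetical values, 1/(M*s) and 1/s
kd = koff / kon
print("KD (M):", kd)

# Occupancy of an on-target (low KD) vs. an off-target (higher KD) site over a
# range of free concentrations; the ratio is a crude selectivity indicator.
conc = np.logspace(-10, -6, 5)
print("on-target occupancy :", equilibrium_occupancy(conc, kd))
print("off-target occupancy:", equilibrium_occupancy(conc, 100 * kd))
```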
Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.
Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C
2015-02-01
Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
Smeal, Steven W; Schmitt, Margaret A; Pereira, Ronnie Rodrigues; Prasad, Ashok; Fisk, John D
2017-01-01
To expand the quantitative, systems level understanding and foster the expansion of the biotechnological applications of the filamentous bacteriophage M13, we have unified the accumulated quantitative information on M13 biology into a genetically-structured, experimentally-based computational simulation of the entire phage life cycle. The deterministic chemical kinetic simulation explicitly includes the molecular details of DNA replication, mRNA transcription, protein translation and particle assembly, as well as the competing protein-protein and protein-nucleic acid interactions that control the timing and extent of phage production. The simulation reproduces the holistic behavior of M13, closely matching experimentally reported values of the intracellular levels of phage species and the timing of events in the M13 life cycle. The computational model provides a quantitative description of phage biology, highlights gaps in the present understanding of M13, and offers a framework for exploring alternative mechanisms of regulation in the context of the complete M13 life cycle. Copyright © 2016 Elsevier Inc. All rights reserved.
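The paper's genetically structured simulation is far richer than can be reproduced here; the sketch below only illustrates the deterministic transcription/translation kinetics that such life-cycle models are assembled from, with illustrative rate constants that are not M13's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal deterministic transcription/translation module (illustrative constants).
k_tx, d_m = 0.5, 0.05      # mRNA synthesis (1/s) and degradation (1/s)
k_tl, d_p = 2.0, 0.001     # translation per mRNA (1/s) and protein dilution (1/s)

def rhs(t, y):
    m, p = y
    return [k_tx - d_m * m,            # d[mRNA]/dt
            k_tl * m - d_p * p]        # d[protein]/dt

sol = solve_ivp(rhs, (0.0, 600.0), [0.0, 0.0], max_step=1.0)
print("mRNA and protein levels after 10 min:", sol.y[:, -1])
```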
de Gramatica, Martina; Massacci, Fabio; Shim, Woohyun; Turhan, Uğur; Williams, Julian
2017-02-01
We analyze the issue of agency costs in aviation security by combining results from a quantitative economic model with a qualitative study based on semi-structured interviews. Our model extends previous principal-agent models by combining the traditional fixed and varying monetary responses to physical and cognitive effort with nonmonetary welfare and potentially transferable value of employees' own human capital. To provide empirical evidence for the tradeoffs identified in the quantitative model, we have undertaken an extensive interview process with regulators, airport managers, security personnel, and those tasked with training security personnel from an airport operating in a relatively high-risk state, Turkey. Our results indicate that the effectiveness of additional training depends on the mix of "transferable skills" and "emotional" buy-in of the security agents. Principals need to identify on which side of a critical tipping point their agents are to ensure that additional training, with attached expectations of the burden of work, aligns the incentives of employees with the principals' own objectives. © 2016 Society for Risk Analysis.
Quantitative Comparison of Photothermal Heat Generation between Gold Nanospheres and Nanorods.
Qin, Zhenpeng; Wang, Yiru; Randrianalisoa, Jaona; Raeesi, Vahid; Chan, Warren C W; Lipiński, Wojciech; Bischof, John C
2016-07-21
Gold nanoparticles (GNPs) are widely used for biomedical applications due to unique optical properties, established synthesis methods, and biological compatibility. Despite important applications of plasmonic heating in thermal therapy, imaging, and diagnostics, the lack of quantification in heat generation leads to difficulties in comparing the heating capability for new plasmonic nanostructures and predicting the therapeutic and diagnostic outcome. This study quantifies GNP heat generation by experimental measurements and theoretical predictions for gold nanospheres (GNS) and nanorods (GNR). Interestingly, the results show a GNP-type dependent agreement between experiment and theory. The measured heat generation of GNS matches well with theory, while the measured heat generation of GNR is only 30% of that predicted theoretically at peak absorption. This then leads to a surprising finding that the polydispersity, the deviation of nanoparticle size and shape from nominal value, significantly influences GNR heat generation (>70% reduction), while having a limited effect for GNS (<10% change). This work demonstrates that polydispersity is an important metric in quantitatively predicting plasmonic heat generation and provides a validated framework to quantitatively compare the heating capabilities between gold and other plasmonic nanostructures.
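The theoretical heat generation compared above follows from the absorbed optical power per particle, Q = C_abs × I; a minimal sketch with hypothetical cross-section and irradiance values (not the study's measured or computed quantities) is shown below.

```python
def particle_heat_generation(c_abs_nm2, irradiance_w_cm2):
    """Steady-state heat generated by one nanoparticle (W): absorbed power
    Q = C_abs * I, with the absorption cross section converted from nm^2 to cm^2."""
    return c_abs_nm2 * 1e-14 * irradiance_w_cm2

# Hypothetical values: absorption cross section ~9e3 nm^2, CW laser at 1 W/cm^2.
print(particle_heat_generation(9.0e3, 1.0), "W per particle")
```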
Underestimating extreme events in power-law behavior due to machine-dependent cutoffs
NASA Astrophysics Data System (ADS)
Radicchi, Filippo
2014-11-01
Power-law distributions are typical macroscopic features occurring in almost all complex systems observable in nature. As a result, researchers in quantitative analyses must often generate random synthetic variates obeying power-law distributions. The task is usually performed through standard methods that map uniform random variates into the desired probability space. Whereas all these algorithms are theoretically solid, in this paper we show that they are subject to severe machine-dependent limitations. As a result, two dramatic consequences arise: (i) the sampling in the tail of the distribution is not random but deterministic; (ii) the moments of the sample distribution, which are theoretically expected to diverge as functions of the sample sizes, converge instead to finite values. We provide quantitative indications for the range of distribution parameters that can be safely handled by standard libraries used in computational analyses. Whereas our findings indicate possible reinterpretations of numerical results obtained through flawed sampling methodologies, they also pave the way for the search for a concrete solution to this central issue shared by all quantitative sciences dealing with complexity.
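As a rough illustration of the issue described above, the sketch below (not the authors' code; the exponent and sample size are illustrative) shows the standard inverse-transform generator for power-law variates and the hard ceiling that double-precision uniform variates impose on the tail.

```python
import numpy as np

def sample_power_law(alpha, x_min, size, rng=None):
    """Draw variates with p(x) ~ x^(-alpha) for x >= x_min via inverse-transform sampling."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.random(size)                                   # uniform variates in [0, 1)
    return x_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

alpha, x_min = 2.5, 1.0

# The largest value the generator can ever return is fixed by the resolution of the
# uniform variate (machine epsilon), not by chance: the tail is effectively truncated.
eps = np.finfo(np.float64).eps
x_max_machine = x_min * eps ** (-1.0 / (alpha - 1.0))
print(f"machine-imposed cutoff on the tail: {x_max_machine:.3e}")

# Sample moments therefore converge even when they should diverge
# (for this alpha the variance is theoretically infinite).
x = sample_power_law(alpha, x_min, 1_000_000)
print("sample mean:", x.mean(), "sample variance:", x.var())
```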
Indirect scaling methods for testing quantitative emotion theories.
Junge, Martin; Reisenzein, Rainer
2013-01-01
Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.
Optical coherence tomography for the quantitative study of cerebrovascular physiology
Srinivasan, Vivek J; Atochin, Dmitriy N; Radhakrishnan, Harsha; Jiang, James Y; Ruvinskaya, Svetlana; Wu, Weicheng; Barry, Scott; Cable, Alex E; Ayata, Cenk; Huang, Paul L; Boas, David A
2011-01-01
Doppler optical coherence tomography (DOCT) and OCT angiography are novel methods to investigate cerebrovascular physiology. In the rodent cortex, DOCT flow displays features characteristic of cerebral blood flow, including conservation along nonbranching vascular segments and at branch points. Moreover, DOCT flow values correlate with hydrogen clearance flow values when both are measured simultaneously. These data validate DOCT as a noninvasive quantitative method to measure tissue perfusion over a physiologic range. PMID:21364599
ERIC Educational Resources Information Center
McLafferty, Charles L., Jr.; Slate, John R.; Onwuegbuzie, Anthony J.
2010-01-01
Quantitative research dominates published literature in the helping professions. Mixed methods research, which integrates quantitative and qualitative methodologies, has received a lukewarm reception. The authors address the iterative separation that infuses theory, praxis, philosophy, methodology, training, and public perception and propose a…
Robust biological parametric mapping: an improved technique for multimodal brain image analysis
NASA Astrophysics Data System (ADS)
Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.
2011-03-01
Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
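The robust-regression idea can be sketched at the level of a single voxel with a hand-rolled Huber iteratively reweighted least squares fit; this is only an illustration of the general approach (not the authors' software), with made-up data for one structural and one functional metric.

```python
import numpy as np

def huber_irls(X, y, k=1.345, n_iter=50, tol=1e-8):
    """Iteratively reweighted least squares with Huber weights:
    a simple robust alternative to OLS for one voxel's GLM fit."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]            # ordinary least squares start
    for _ in range(n_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745 + 1e-12   # robust scale (MAD)
        u = r / (k * s)
        w = np.where(np.abs(u) <= 1.0, 1.0, 1.0 / np.abs(u))       # Huber weights
        beta_new = np.linalg.lstsq(np.sqrt(w)[:, None] * X, np.sqrt(w) * y, rcond=None)[0]
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Toy voxelwise use: regress a functional metric on a structural one plus an intercept,
# with one outlier subject standing in for a mis-registration artifact.
rng = np.random.default_rng(0)
struct = rng.normal(size=40)
func = 0.8 * struct + rng.normal(scale=0.3, size=40)
func[0] += 5.0
X = np.column_stack([np.ones_like(struct), struct])
print("robust slope estimate:", huber_irls(X, func)[1])
```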
Migheli, Francesca; Stoccoro, Andrea; Coppedè, Fabio; Wan Omar, Wan Adnan; Failli, Alessandra; Consolini, Rita; Seccia, Massimo; Spisni, Roberto; Miccoli, Paolo; Mathers, John C.; Migliore, Lucia
2013-01-01
There is increasing interest in the development of cost-effective techniques for the quantification of DNA methylation biomarkers. We analyzed 90 samples of surgically resected colorectal cancer tissues for APC and CDKN2A promoter methylation using methylation sensitive-high resolution melting (MS-HRM) and pyrosequencing. MS-HRM is a less expensive technique compared with pyrosequencing but is usually more limited because it gives a range of methylation estimates rather than a single value. Here, we developed a method for deriving single estimates, rather than a range, of methylation using MS-HRM and compared the values obtained in this way with those obtained using the gold standard quantitative method of pyrosequencing. We derived an interpolation curve using standards of known methylated/unmethylated ratio (0%, 12.5%, 25%, 50%, 75%, and 100% of methylation) to obtain the best estimate of the extent of methylation for each of our samples. We observed similar profiles of methylation and a high correlation coefficient between the two techniques. Overall, our new approach allows MS-HRM to be used as a quantitative assay which provides results which are comparable with those obtained by pyrosequencing. PMID:23326336
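A minimal sketch of the calibration step described above, assuming a hypothetical monotone HRM-derived signal for the 0-100% methylation standards; interpolating an unknown sample against this standard curve yields a single methylation estimate rather than a range.

```python
import numpy as np

# Known standards (percent methylation) and a hypothetical HRM-derived melting metric;
# the signal values are invented for illustration and must be monotone for interpolation.
std_methylation = np.array([0.0, 12.5, 25.0, 50.0, 75.0, 100.0])
std_signal = np.array([0.02, 0.11, 0.23, 0.47, 0.74, 0.98])

def estimate_methylation(sample_signal):
    """Interpolate a single methylation estimate from the standard curve."""
    return np.interp(sample_signal, std_signal, std_methylation)

print(estimate_methylation(0.35))   # -> one % methylation value for the sample, not a range
```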
Cultural Values Predicting Acculturation Orientations: Operationalizing a Quantitative Measure
ERIC Educational Resources Information Center
Ehala, Martin
2012-01-01
This article proposes that acculturation orientations are related to two sets of cultural values: utilitarianism (Ut) and traditionalism (Tr). While utilitarian values enhance assimilation, traditional values support language and identity maintenance. It is proposed that the propensity to either end of this value opposition can be measured by an…
Institutions' Espoused Values Perceived by Chinese Educational Leaders
ERIC Educational Resources Information Center
Pang, Nicholas Sun-Keung; Wang, Ting
2012-01-01
This paper presents some key findings of a quantitative study which assessed a group of Chinese educational leaders' value orientations. A survey instrument "The Institutional Values Inventory" was used to investigate their perspectives on the values espoused by their institutions in terms of traditional Confucian ethics and values of…
Groundwater availability in the United States: the value of quantitative regional assessments
Dennehy, Kevin F.; Reilly, Thomas E.; Cunningham, William L.
2015-01-01
The sustainability of water resources is under continued threat from the challenges associated with a growing population, competing demands, and a changing climate. Freshwater scarcity has become a fact in many areas. Much of the United States surface-water supplies are fully apportioned for use; thus, in some areas the only potential alternative freshwater source that can provide needed quantities is groundwater. Although frequently overlooked, groundwater serves as the principal reserve of freshwater in the US and represents much of the potential supply during periods of drought. Some nations have requirements to monitor and characterize the availability of groundwater such as the European Union’s Water Framework Directive (EPCEU 2000). In the US there is no such national requirement. Quantitative regional groundwater availability assessments, however, are essential to document the status and trends of groundwater availability for the US and make informed water-resource decisions possible now and in the future. Barthel (2014) highlighted that the value of regional groundwater assessments goes well beyond just quantifying the resource so that it can be better managed. The tools and techniques required to evaluate these unique regional systems advance the science of hydrogeology and provide enhanced methods that can benefit local-scale groundwater investigations. In addition, a significant, yet under-utilized benefit is the digital spatial and temporal data sets routinely generated as part of these studies. Even though there is no legal or regulatory requirement for regional groundwater assessments in the US, there is a logical basis for their implementation. The purpose of this essay is to articulate the rationale for and reaffirm the value of regional groundwater assessments primarily in the US; however, the arguments hold for all nations. The importance of the data sets and the methods and model development that occur as part of these assessments is stressed. These high-value data sets and models should be available in readily accessible formats for use today and in the future. Examples of advances in and accomplishments of two regional groundwater assessments are presented to demonstrate their function, relevance, and value for determining the sustainability of the groundwater resources of the US.
NASA Astrophysics Data System (ADS)
Ader, M.; Cadeau, P.; Jezequel, D.; Chaduteau, C.; Fouilland, E.; Bernard, C.; Leboulanger, C.
2017-12-01
Precambrian nitrogen biogeochemistry models rely on δ15N signatures in sedimentary rocks, but some of the underlying assumptions still need to be more robustly established, especially when measured δ15N values are above 3‰. Several processes have been proposed to explain these values: non-quantitative reduction of nitrate to N2O/N2 (denitrification), non-quantitative oxidation of ammonium to N2O/N2, or ammonia degassing to the atmosphere. The denitrification hypothesis implies oxygenation of part of the water column, allowing nitrate to accumulate. The ammonium oxidation hypothesis implies a largely anoxic water column, where ammonium can accumulate, with limited oxygenation of surface waters. This hypothesis currently lacks modern analogues to support it. We propose here that the volcanic crater lake Dziani Dzaha (Mayotte, Indian Ocean) might be one of them, on the basis of several analogies including: permanently anoxic conditions at depth in spite of seasonal mixing; nitrate content below detection limit in the oxic surface waters; accumulation of ammonium at depth during the stratified season; primary productivity massively dominated by cyanobacteria. One aspect may restrict the analogy: the pH value of 9-9.5. In this lake, δ15N values of primary producers and ammonium range from 6 to 9‰ and are recorded with a positive offset in the sediments (9<δ15N<13‰). Because N-sources to the system present more negative δ15N values, such positive values can only be achieved if 14N-enriched N is lost from the lake. Although NH3 degassing might play a small role, the main pathway envisaged for this N-loss is NH4+ oxidation to N2O/N2. If confirmed, this would provide strong support for the hypothesis that positive δ15N values in Precambrian rocks may indicate dominantly anoxic oceans, devoid of nitrate, in which ammonium was partly oxidized to N2O/N2.
Naik, P K; Singh, T; Singh, H
2009-07-01
Quantitative structure-activity relationship (QSAR) analyses were performed independently on data sets belonging to two groups of insecticides, namely the organophosphates and carbamates. Several types of descriptors including topological, spatial, thermodynamic, information content, lead likeness and E-state indices were used to derive quantitative relationships between insecticide activities and structural properties of chemicals. A systematic search approach based on missing value, zero value, simple correlation and multi-collinearity tests as well as the use of a genetic algorithm allowed the optimal selection of the descriptors used to generate the models. The QSAR models developed for both organophosphate and carbamate groups revealed good predictability with r(2) values of 0.949 and 0.838 as well as cross-validated values of 0.890 and 0.765, respectively. In addition, a linear correlation was observed between the predicted and experimental LD(50) values for the test set data with r(2) of 0.871 and 0.788 for both the organophosphate and carbamate groups, indicating that the prediction accuracy of the QSAR models was acceptable. The models were also tested successfully against external validation criteria. QSAR models developed in this study should help further design of novel potent insecticides.
Radar QPE for hydrological design: Intensity-Duration-Frequency curves
NASA Astrophysics Data System (ADS)
Marra, Francesco; Morin, Efrat
2015-04-01
Intensity-duration-frequency (IDF) curves are widely used in flood risk management since they provide an easy link between the characteristics of a rainfall event and the probability of its occurrence. They are estimated by analyzing the extreme values of rainfall records, usually based on rain gauge data. This point-based approach raises two issues: first, hydrological design applications generally need IDF information for the entire catchment rather than a point; second, the representativeness of point measurements decreases with the distance from the measurement location, especially in regions characterized by steep climatological gradients. Weather radar, providing high-resolution distributed rainfall estimates over wide areas, has the potential to overcome these issues. Two objections usually restrain this approach: (i) the short length of data records and (ii) the reliability of quantitative precipitation estimation (QPE) of the extremes. This work explores the potential use of weather radar estimates for the identification of IDF curves by means of a long radar archive and a combined physical and quantitative adjustment of radar estimates. Shacham weather radar, located in the eastern Mediterranean area (Tel Aviv, Israel), has archived data since 1990, providing rainfall estimates for 23 years over a region characterized by strong climatological gradients. Radar QPE is obtained by correcting the effects of pointing errors, ground echoes, beam blockage, attenuation and vertical variations of reflectivity. Quantitative accuracy is then ensured with a range-dependent bias adjustment technique, and the reliability of radar QPE is assessed by comparison with gauge measurements. IDF curves are derived from the radar data using the annual extremes method and compared with gauge-based curves. Results from 14 study cases will be presented, focusing on the effects of record length and QPE accuracy, exploring the potential application of radar IDF curves for ungauged locations and providing insights on the use of radar QPE for hydrological design studies.
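The annual extremes method mentioned above can be sketched, for a single duration and location, as fitting an extreme-value distribution to the annual maxima and reading off return-period quantiles; the Gumbel choice and all numbers below are illustrative assumptions, not the study's actual procedure.

```python
import numpy as np
from scipy.stats import gumbel_r

def idf_point(annual_max_intensity, return_periods):
    """Fit a Gumbel distribution to annual maximum intensities (one duration)
    and return the intensity quantiles for the requested return periods."""
    loc, scale = gumbel_r.fit(annual_max_intensity)
    p_non_exceed = 1.0 - 1.0 / np.asarray(return_periods, dtype=float)
    return gumbel_r.ppf(p_non_exceed, loc=loc, scale=scale)

# Toy example: 23 years of annual maxima (mm/h) for a 60-min duration at one radar pixel.
rng = np.random.default_rng(1)
annual_max = gumbel_r.rvs(loc=25, scale=8, size=23, random_state=rng)
print(idf_point(annual_max, return_periods=[2, 10, 50, 100]))
```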
Data analysis and theoretical studies for atmospheric Explorer C, D and E
NASA Technical Reports Server (NTRS)
Dalgarno, A.
1983-01-01
The research concentrated on construction of a comprehensive model of the chemistry of the ionosphere. It proceeded by comparing detailed predictions of the atmospheric parameters observed by the instrumentation on board the Atmospheric Explorer Satellites with the measured values and modifying the chemistry to bring about consistency. Full account was taken of laboratory measurements of the processes identified as important. The research programs were made available to the AE team members. Regularly updated tables of recommended values of photoionization cross sections and electron impact excitation and ionization cross sections were provided. The research did indeed lead to a chemistry model in which the main pathways are quantitatively secure. The accuracy was sufficient that remaining differences are small.
Bond energy prediction of Curie temperature of lithium niobate crystals.
Zhang, Xu; Xue, Dongfeng
2007-03-15
A general expression of the Curie temperature (Tc) and spontaneous polarization (Ps) of lithium niobate (LN) crystals is energetically proposed by employing the viewpoint of the bond energy of constituent chemical bonds within the LN crystallographic frame. The calculated Tc values of various pure and doped LN crystals are in good agreement with reported data. Ps values of these LN crystals can also be quantitatively estimated in this work. It is found that the Li site is a sensitive lattice position that dominates the ferroelectricity of LN crystals. This novel method provides a good understanding of the ferroelectric behaviors of LN crystals, which may be applicable to the estimation of ferroelectric behaviors of LN-type solids.
NASA Astrophysics Data System (ADS)
Manaa, M. Riad
2017-06-01
Adiabatic ionization potentials (IPad) and electron affinities (EAad) are determined with the Gaussian-4 (G4) method for the energetic molecules PETN, RDX, β-δ-HMX, CL-17, TNB, TNT, CL-14, DADNE, TNA, and TATB. The IPad and EAad values are in the range of 8.43-11.73 and 0.74-2.86 eV, respectively. Variations are due to substitutional effects of electron withdrawing and donating functional groups. Enthalpies of formation are also determined for several of these molecules to augment the list of recently reported G4 values. The calculated IPad and EAad provide quantitative assessment of such molecular properties as chemical hardness, molecular electronegativity, and "intrinsic" molecular physical hardness.
Wang, Li; Zou, Zhi-Qiang; Wang, Kai; Yu, Ji-Guang; Liu, Xiang-Zhong
2016-01-01
The purpose of this study was to characterize the role of serum hepatitis B virus marker quantitation in differentiating the natural phases of HBV infection. A total of 184 chronic hepatitis B (CHB) patients were analyzed retrospectively. Patients were classified into four categories: immune tolerant phase (IT, n = 36), immune clearance phase (IC, n = 81), low-replicative phase (LR, n = 31), and HBeAg-negative hepatitis phase (ENH, n = 36), based on clinical, biochemical, serological, HBV DNA level and histological data. Hepatitis B surface antigen (HBsAg) quantitation in the four phases was 4.7 ± 0.2, 3.8 ± 0.5, 2.5 ± 1.2 and 3.4 ± 0.4 log10 IU/mL, respectively. There were significant differences between IT and IC (p < 0.001) and between LR and ENH phases (p < 0.001). Quantitation of hepatitis B e antigen (HBeAg) in the IT and IC phases was 1317.9 ± 332.9 and 673.4 ± 562.1 S/CO, respectively (p < 0.001). Hepatitis B core antibody (HBcAb) quantitation in the four groups was 9.48 ± 3.3, 11.7 ± 2.8, 11.2 ± 2.6 and 13.2 ± 2.9 S/CO, respectively. The areas under the receiver operating characteristic curve (AUCs) of HBsAg and HBeAg at cutoff values of 4.41 log10 IU/mL and 1118.96 S/CO for differentiation of the IT and IC phases were 0.984 and 0.828, with sensitivity 94.4% and 85.2%, specificity 98.7% and 75%, respectively. The AUCs of HBsAg and HBcAb at cutoff values of 3.4 log10 IU/mL and 10.5 S/CO for differentiation of the LR and ENH phases were 0.796 and 0.705, with sensitivity 58.1% and 85.7%, and specificity 94.4% and 46.2%, respectively. HBsAg quantitation has high predictive value and HBeAg quantitation has moderate predictive value for discriminating the IT and IC phases. HBsAg and HBcAb quantitations have moderate predictive values for differentiation of the LR and ENH phases.
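As an illustration of how such cutoff performance figures are computed, the sketch below evaluates AUC and sensitivity/specificity of a quantitative marker at a fixed cutoff on simulated HBsAg values; the group means and the 4.41 log10 IU/mL cutoff mirror the abstract, everything else is hypothetical.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def sens_spec_at_cutoff(values, labels, cutoff):
    """Sensitivity/specificity of a marker at a fixed cutoff
    (labels: 1 = immune clearance, 0 = immune tolerant; lower HBsAg suggests IC)."""
    pred_ic = values <= cutoff
    labels = np.asarray(labels, dtype=bool)
    sens = np.mean(pred_ic[labels])          # true positives among IC patients
    spec = np.mean(~pred_ic[~labels])        # true negatives among IT patients
    return sens, spec

# Hypothetical HBsAg quantitation (log10 IU/mL): IT values tend to be higher than IC values.
rng = np.random.default_rng(2)
hbsag = np.concatenate([rng.normal(4.7, 0.2, 36), rng.normal(3.8, 0.5, 81)])
is_ic = np.concatenate([np.zeros(36), np.ones(81)])
print("AUC:", roc_auc_score(is_ic, -hbsag))          # negate: lower values indicate IC
print("sens/spec at cutoff 4.41:", sens_spec_at_cutoff(hbsag, is_ic, 4.41))
```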
Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand
2016-03-15
A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
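The mass-extraction window itself is a simple ppm-based m/z interval; a minimal sketch (the analyte mass and the 10 ppm width are assumptions for illustration) is given below.

```python
def extraction_window(mz, mew_ppm):
    """m/z bounds of the extracted-ion chromatogram for a given
    mass-extraction window (MEW) expressed in ppm."""
    half_width = mz * mew_ppm * 1e-6 / 2.0
    return mz - half_width, mz + half_width

# e.g. a 10 ppm window around a hypothetical analyte at m/z 445.1200
print(extraction_window(445.1200, 10.0))
```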
Goudet, Sophie; Griffiths, Paula L; Wainaina, Caroline W; Macharia, Teresia N; Wekesah, Frederick M; Wanjohi, Milka; Muriuki, Peter; Kimani-Murage, Elizabeth
2018-04-02
In Kenya, poor maternal nutrition, suboptimal infant and young child feeding practices and high levels of malnutrition have been shown among the urban poor. An intervention aimed at promoting optimal maternal infant and young child nutrition (MIYCN) practices in urban poor settings in Nairobi, Kenya was implemented. The intervention involved home-based counselling of pregnant and breastfeeding women and mothers of young children by community health volunteers (CHVs) on optimal MIYCN practices. This study assesses the social impact of the intervention using a Social Return on Investment (SROI) approach. Data collection was based on SROI methods and used a mixed methods approach (focus group discussions, key informant interviews, in-depth interviews, quantitative stakeholder surveys, and revealed preference approach for outcomes using value games). The SROI analysis revealed that the MIYCN intervention was assessed to be highly effective and created social value, particularly for mothers and their children. Positive changes that participants experienced included mothers being more confident in child care and children and mothers being healthier. Overall, the intervention had a negative social impact on daycare centers and on health care providers, by putting too much pressure on them to provide care without providing extra support. The study calculated that, after accounting for discounting factors, the input ($USD 419,716) generated $USD 8 million of social value at the end of the project. The net present value created by the project was estimated at $USD 29.5 million. $USD 1 invested in the project was estimated to bring USD$ 71 (sensitivity analysis: USD$ 34-136) of social value for the stakeholders. The MIYCN intervention showed an important social impact in which mothers and children benefited the most. The intervention resulted in better perceived health of mothers and children and increased confidence of mothers to provide care for their children, while it resulted in negative impacts for day care center owners and health care providers.
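Conceptually, the SROI ratio and net present value reported above come from discounting the projected social value back to present terms and dividing by the investment; the sketch below shows that arithmetic with hypothetical yearly figures, not the study's actual valuation inputs.

```python
def sroi(investment, yearly_social_value, discount_rate=0.035):
    """Discount projected yearly social value to present value, then
    report the SROI ratio and the net present value."""
    pv = sum(v / (1.0 + discount_rate) ** (t + 1)
             for t, v in enumerate(yearly_social_value))
    return pv / investment, pv - investment

# Hypothetical figures in USD (the discount rate and yearly values are assumptions).
ratio, npv = sroi(419_716, [10_000_000, 10_000_000, 12_000_000])
print(f"SROI ratio: {ratio:.1f}, NPV: {npv:,.0f}")
```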
Using mixed methods effectively in prevention science: designs, procedures, and examples.
Zhang, Wanqing; Watanabe-Galloway, Shinobu
2014-10-01
There is growing interest in using a combination of quantitative and qualitative methods to generate evidence about the effectiveness of health prevention, services, and intervention programs. With the emerging importance of mixed methods research across the social and health sciences, there has been an increased recognition of the value of using mixed methods for addressing research questions in different disciplines. We illustrate the mixed methods approach in prevention research, showing design procedures used in several published research articles. In this paper, we focused on two commonly used mixed methods designs: concurrent and sequential mixed methods designs. We discuss the types of mixed methods designs, the reasons for, and advantages of using a particular type of design, and the procedures of qualitative and quantitative data collection and integration. The studies reviewed in this paper show that the essence of qualitative research is to explore complex dynamic phenomena in prevention science, and the advantage of using mixed methods is that quantitative data can yield generalizable results and qualitative data can provide extensive insights. However, the emphasis of methodological rigor in a mixed methods application also requires considerable expertise in both qualitative and quantitative methods. Besides the necessary skills and effective interdisciplinary collaboration, this combined approach also requires an open-mindedness and reflection from the involved researchers.
Simulating realistic predator signatures in quantitative fatty acid signature analysis
Bromaghin, Jeffrey F.
2015-01-01
Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
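The construction of a pseudo-predator signature can be sketched as bootstrap-sampling each prey type, averaging, mixing by the chosen diet proportions and renormalizing; the prey data and bootstrap sample sizes below are invented for illustration and do not reproduce the author's sample-size algorithm.

```python
import numpy as np

def pseudo_predator(prey_sigs, diet, boot_n, rng):
    """Build one pseudo-predator fatty acid signature with a known diet:
    bootstrap boot_n[k] signatures from prey type k, average them,
    mix the prey means by the diet proportions, and renormalize."""
    parts = []
    for k, sigs in enumerate(prey_sigs):                   # sigs: (n_samples, n_fatty_acids)
        idx = rng.integers(0, sigs.shape[0], size=boot_n[k])
        parts.append(diet[k] * sigs[idx].mean(axis=0))
    sig = np.sum(parts, axis=0)
    return sig / sig.sum()

rng = np.random.default_rng(3)
# Two hypothetical prey types with 8 fatty acid proportions each.
prey = [rng.dirichlet(np.ones(8), size=50), rng.dirichlet(np.ones(8) * 2, size=60)]
print(pseudo_predator(prey, diet=[0.7, 0.3], boot_n=[30, 30], rng=rng))
```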
Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J
2018-02-05
This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio. Copyright © 2017 Elsevier B.V. All rights reserved.
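A minimal sketch of the PLSR calibration/prediction workflow with RMSEP evaluation is shown below, using synthetic ternary-mixture spectra in place of the measured Raman data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

# Hypothetical spectra (rows) for mixtures of three solid-state forms with known fractions.
rng = np.random.default_rng(4)
n_cal, n_test, n_points = 30, 10, 200
pure = rng.random((3, n_points))                       # stand-in pure-form spectra
y_cal = rng.dirichlet(np.ones(3), size=n_cal)          # beta / alpha2 / monohydrate fractions
y_test = rng.dirichlet(np.ones(3), size=n_test)
X_cal = y_cal @ pure + rng.normal(scale=0.01, size=(n_cal, n_points))
X_test = y_test @ pure + rng.normal(scale=0.01, size=(n_test, n_points))

pls = PLSRegression(n_components=3).fit(X_cal, y_cal)
pred = pls.predict(X_test)
rmsep = np.sqrt(mean_squared_error(y_test, pred, multioutput="raw_values"))
print("RMSEP per form (mass fraction):", rmsep)
```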
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on results of a multisite collaborative project launched by the MRI subgroup of Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard parametric diffusion map Digital Imaging and Communication in Medicine (DICOM) in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
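The ADC parametric map itself follows from the mono-exponential diffusion model; a minimal two-b-value sketch (the b-values and signal intensities are illustrative) is given below.

```python
import numpy as np

def adc_map(s_b0, s_b, b0=0.0, b=1000.0, eps=1e-6):
    """Mono-exponential ADC (mm^2/s) from two diffusion weightings:
    S(b) = S(b0) * exp(-(b - b0) * ADC), with b in s/mm^2."""
    s_b0 = np.clip(np.asarray(s_b0, float), eps, None)
    s_b = np.clip(np.asarray(s_b, float), eps, None)
    return np.log(s_b0 / s_b) / (b - b0)

# Toy 2x2 "image": a true ADC of 1.1e-3 mm^2/s everywhere.
s0 = np.full((2, 2), 1000.0)
s1000 = s0 * np.exp(-1000.0 * 1.1e-3)
print(adc_map(s0, s1000))          # ~1.1e-3 in every voxel
```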
Pollen preservation and Quaternary environmental history in the southeastern United States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delcourt, P.A.; Delcourt, H.R.
Reconstructions of Quaternary environmental history based upon modern pollen/vegetation/climate calibrations are more tenable if the factors responsible for variation in pollen assemblages are evaluated. Examination of the state of preservation of Quaternary palynomorphs provides quantitative data concerning the degree of information loss due to alteration of pollen assemblages by syndepositional and post-depositional deterioration. The percentage, concentration, and influx values for total indeterminable pollen are useful criteria in providing an objective and quantitative basis for evaluating the comparability of pollen spectra within and between sites. Supporting data concerning sediment particle-size distribution, organic matter content, and concentration, influx, and taxonomic composition of both determinable pollen and plant macrofossils aid in reconstructing past depositional environments. The potential is high for deterioration of pollen in sediments from the southeastern United States, although considerable variation is found in both kind and degree of deterioration between lacustrine and alluvial sites of different ages and in different latitudes. Modern analogs are a basis for late Quaternary environmental reconstructions when pollen deterioration has not significantly biased the information content of fossil pollen assemblages.
Quantifiable outcomes from corporate and higher education learning collaborations
NASA Astrophysics Data System (ADS)
Devine, Thomas G.
The study investigated the existence of measurable learning outcomes that emerged out of the shared strengths of collaborating sponsors. The study identified quantifiable learning outcomes that confirm corporate, academic and learner participation in learning collaborations. Each of the three hypotheses and the synergy indicator quantitatively and qualitatively confirmed learning outcomes benefiting participants. The academic indicator quantitatively confirmed that learning outcomes attract learners to the institution. The corporate indicator confirmed that learning outcomes include knowledge exchange and enhanced workforce talents for careers in the energy-utility industry. The learner indicator confirmed that learning outcomes provide professional development opportunities for employment. The synergy indicator confirmed that best learning practices in learning collaborations emanate out of the sponsors' shared strengths, and that partnerships can be elevated to strategic alliances, going beyond response to the desires of sponsors to create learner-centered cultures. The synergy indicator also confirmed the value of organizational processes that elevate sponsors' interactions to shared strengths, creating a learner-centered culture. The study's series of qualitative questions confirmed prior success factors, while verifying the hypothesis results and providing insight not available from quantitative data. The direct beneficiaries of the study are the energy-utility learning-collaboration participants and the corporations, academic institutions, and learners of the collaboration. The indirect beneficiaries are the stakeholders of future learning collaborations, through improved knowledge of the existence or absence of quantifiable learning outcomes.
Zhang, Y; Gauthier, S A; Gupta, A; Chen, W; Comunale, J; Chiang, G C-Y; Zhou, D; Askin, G; Zhu, W; Pitt, D; Wang, Y
2016-09-01
Quantitative susceptibility mapping and R2* are sensitive to myelin and iron changes in multiple sclerosis lesions. This study was designed to characterize lesion changes on quantitative susceptibility mapping and R2* at various gadolinium-enhancement stages. This study included 64 patients with MS with different enhancing patterns in white matter lesions: nodular, shell-like, nonenhancing < 1 year old, and nonenhancing 1-3 years old. These represent acute, late acute, early chronic, and late chronic lesions, respectively. Susceptibility values measured on quantitative susceptibility mapping and R2* values were compared among the 4 lesion types. Their differences were assessed with a generalized estimating equation, controlling for Expanded Disability Status Scale score, age, and disease duration. We analyzed 203 lesions: 80 were nodular-enhancing, of which 77 (96.2%) were isointense on quantitative susceptibility mapping; 33 were shell-enhancing, of which 30 (90.9%) were hyperintense on quantitative susceptibility mapping; and 49 were nonenhancing lesions < 1 year old and 41 were nonenhancing lesions 1-3 years old, all of which were hyperintense on quantitative susceptibility mapping. Their relative susceptibility/R2* values were 0.5 ± 4.4 parts per billion/-5.6 ± 2.9 Hz, 10.2 ± 5.4 parts per billion/-8.0 ± 2.6 Hz, 20.2 ± 7.8 parts per billion/-3.1 ± 2.3 Hz, and 33.2 ± 8.2 parts per billion/-2.0 ± 2.6 Hz, respectively, and were significantly different (P < .005). Early active MS lesions with nodular enhancement show R2* decrease but no quantitative susceptibility mapping change, reflecting myelin breakdown; late active lesions with peripheral enhancement show R2* decrease and quantitative susceptibility mapping increase in the lesion center, reflecting further degradation and removal of myelin debris; and early or late chronic nonenhancing lesions show both quantitative susceptibility mapping and R2* increase, reflecting iron accumulation. © 2016 by American Journal of Neuroradiology.
NASA Astrophysics Data System (ADS)
Runge, Melody G.; Bebbington, Mark S.; Cronin, Shane J.; Lindsay, Jan M.; Moufti, Mohammed R.
2016-02-01
During probabilistic volcanic hazard analysis of volcanic fields, a greater variety of spatial data on crustal features should help improve forecasts of future vent locations. Without further examination, however, geophysical estimations of crustal or other features may be non-informative. Here, we present a new, robust, non-parametric method to quantitatively determine the existence of any relationship between natural phenomena (e.g., volcanic eruptions) and a variety of geophysical data. This provides a new validation tool for incorporating a range of potentially hazard-diagnostic observable data into recurrence rate estimates and hazard analyses. Through this study it is shown that the locations of Cenozoic volcanic fields across the Arabian Shield appear to be related to the locations of major and minor faults, to higher elevations, and to regions where gravity anomaly values were between -125 mGal and 0 mGal. These findings support earlier hypotheses that the western shield uplift was related to Cenozoic volcanism. At the harrat (volcanic field) scale, higher vent density regions are related to both elevation and gravity anomaly values. A by-product of this work is the collection of existing data on the volcanism across Saudi Arabia, with all vent locations provided herein, as well as updated maps for Harrats Kura, Khaybar, Ithnayn, Kishb, and Rahat. This work also highlights the potential dangers of assuming relationships between observed data and the occurrence of a natural phenomenon without quantitative assessment or proper consideration of the effects of data resolution.
Bonomo, Robert A.; Bahniuk, Nadzeya; Bulitta, Juergen B.; VanScoy, Brian; DeFiglio, Holland; Fikes, Steven; Brown, David; Drawz, Sarah M.; Kulawy, Robert; Louie, Arnold
2012-01-01
The panoply of resistance mechanisms in Pseudomonas aeruginosa makes resistance suppression difficult. Defining optimal regimens is critical. Cefepime is a cephalosporin whose 3′ side chain provides some stability against AmpC β-lactamases. We examined the activity of cefepime against P. aeruginosa wild-type strain PAO1 and its isogenic AmpC stably derepressed mutant in our hollow-fiber infection model. Dose-ranging studies demonstrated complete failure with resistance emergence (both isolates). Inoculum range studies demonstrated ultimate failure for all inocula. Lower inocula failed last (10 days to 2 weeks). Addition of a β-lactamase inhibitor suppressed resistance even with the stably derepressed isolate. Tobramycin combination studies demonstrated resistance suppression in both the wild-type and the stably derepressed isolates. Quantitating the RNA message by quantitative PCR demonstrated that tobramycin decreased the message relative to that in cefepime-alone experiments. Western blotting with AmpC-specific antibody for P. aeruginosa demonstrated decreased expression. We concluded that suppression of β-lactamase expression by tobramycin (a protein synthesis inhibitor) was at least part of the mechanism behind resistance suppression. Monte Carlo simulation demonstrated that a regimen of 2 g of cefepime every 8 h plus 7 mg/kg of body weight of tobramycin daily would provide robust resistance suppression for Pseudomonas isolates with cefepime MIC values up to 8 mg/liter and tobramycin MIC values up to 1 mg/liter. For P. aeruginosa resistance suppression, combination therapy is critical. PMID:22005996
Tenti, Lorenzo; Maynau, Daniel; Angeli, Celestino; Calzado, Carmen J
2016-07-21
A new strategy based on orthogonal valence-bond analysis of the wave function combined with intermediate Hamiltonian theory has been applied to the evaluation of the magnetic coupling constants in two AF systems. This approach provides both a quantitative estimate of the J value and a detailed analysis of the main physical mechanisms controlling the coupling, using a combined perturbative + variational scheme. The procedure requires a selection of the dominant excitations to be treated variationally. Two methods have been employed: a brute-force selection, using a logic similar to that of the CIPSI approach, or entanglement measures, which identify the most interacting orbitals in the system. Once a reduced set of excitations (about 300 determinants) is established, the interaction matrix is dressed at the second-order of perturbation by the remaining excitations of the CI space. The diagonalization of the dressed matrix provides J values in good agreement with experimental ones, at a very low-cost. This approach demonstrates the key role of d → d* excitations in the quantitative description of the magnetic coupling, as well as the importance of using an extended active space, including the bridging ligand orbitals, for the binuclear model of the intermediates of multicopper oxidases. The method is a promising tool for dealing with complex systems containing several active centers, as an alternative to both pure variational and DFT approaches.
Wang, Qiang; Jia, Qingzhu; Yan, Lihong; Xia, Shuqian; Ma, Peisheng
2014-08-01
The aquatic toxicity value of hazardous contaminants plays an important role in the risk assessments of aquatic ecosystems. The following study presents a stable and accurate structure-toxicity relationship model based on the norm indexes for the prediction of toxicity value (log(LC50)) for 190 diverse narcotic pollutants (96 h LC50 data for Poecilia reticulata). Research indicates that this new model is very efficient and provides satisfactory results. The suggested prediction model is evidenced by R(2) (square correlation coefficient) and ARD (average relative difference) values of 0.9376 and 10.45%, respectively, for the training set, and 0.9264 and 13.90% for the testing set. Comparison results with reference models demonstrate that this new method, based on the norm indexes proposed in this work, results in significant improvements, both in accuracy and stability for predicting aquatic toxicity values of narcotic pollutants. Copyright © 2014 Elsevier Ltd. All rights reserved.
Thin-Layering Effect On Estimating Seismic Attenuation In Methane Hydrate-Bearing Sediments
NASA Astrophysics Data System (ADS)
Lee, K.; Matsushima, J.
2012-12-01
Seismic attenuation is one of the important parameters that provide information concerning both the detection and quantitative assessment of gas hydrates. We estimated seismic attenuation (1/Q) from surface seismic data acquired at the Nankai Trough in Japan. We adapt the Q-versus-offset (QVO) method to calculate robust and continuous interval attenuations from CMP gathers. We observe high attenuation in methane hydrate-bearing sediments over the BSR region. However, some negative 1/Q values are also obtained, which means that the amplitude of high-frequency components increases with depth. Such results may be due to the tuning effect. Here, we carried out a numerical test to see how the thin-layering effect influences seismic attenuation estimates. The results showed that tuning considerably influences the attenuation results, causing lower 1/Q values (lower attenuation) and negative 1/Q values.
Visualization techniques to aid in the analysis of multispectral astrophysical data sets
NASA Technical Reports Server (NTRS)
Brugel, E. W.; Domik, Gitta O.; Ayres, T. R.
1993-01-01
The goal of this project was to support the scientific analysis of multi-spectral astrophysical data by means of scientific visualization. Scientific visualization offers its greatest value if it is not used as a method separate or alternative to other data analysis methods but rather in addition to these methods. Together with quantitative analysis of data, such as offered by statistical analysis, image or signal processing, visualization attempts to explore all information inherent in astrophysical data in the most effective way. Data visualization is one aspect of data analysis. Our taxonomy as developed in Section 2 includes identification and access to existing information, preprocessing and quantitative analysis of data, visual representation and the user interface as major components to the software environment of astrophysical data analysis. In pursuing our goal to provide methods and tools for scientific visualization of multi-spectral astrophysical data, we therefore looked at scientific data analysis as one whole process, adding visualization tools to an already existing environment and integrating the various components that define a scientific data analysis environment. As long as the software development process of each component is separate from all other components, users of data analysis software are constantly interrupted in their scientific work in order to convert from one data format to another, or to move from one storage medium to another, or to switch from one user interface to another. We also took an in-depth look at scientific visualization and its underlying concepts, current visualization systems, their contributions and their shortcomings. The role of data visualization is to stimulate mental processes different from quantitative data analysis, such as the perception of spatial relationships or the discovery of patterns or anomalies while browsing through large data sets. Visualization often leads to an intuitive understanding of the meaning of data values and their relationships by sacrificing accuracy in interpreting the data values. In order to be accurate in the interpretation, data values need to be measured, computed on, and compared to theoretical or empirical models (quantitative analysis). If visualization software hampers quantitative analysis (which happens with some commercial visualization products), its use is greatly diminished for astrophysical data analysis. The software system STAR (Scientific Toolkit for Astrophysical Research) was developed as a prototype during the course of the project to better understand the pragmatic concerns raised in the project. STAR led to a better understanding on the importance of collaboration between astrophysicists and computer scientists. Twenty-one examples of the use of visualization for astrophysical data are included with this report. Sixteen publications related to efforts performed during or initiated through work on this project are listed at the end of this report.
Liu, Lihua; Long, Miaomiao; Wang, Junping; Liu, Ning; Ge, Xihong; Hu, Zhandong; Shen, Wen
2015-02-01
Puerperal breast abscess after polyacrylamide hydrogel (PAAG) augmentation mammoplasty can induce breast auto-inflation resulting in serious consequences. Mammography, ultrasound, and conventional MRI are poor at detecting related PAAG abnormality histologically. We evaluated the value of diffusion-weighted imaging (DWI) in the quantitative analysis of puerperal PAAG abscess after augmentation mammoplasty. This was a retrospective study, and a waiver for informed consent was granted. Sixteen puerperal women with breast discomfort underwent conventional breast non-enhanced MRI and axial DWI using a 3T MR scanner. Qualitative analysis of the signal intensity on DWI and conventional sequences was performed. The apparent diffusion coefficient (ADC) values of the affected and contralateral normal PAAG cysts were measured quantitatively. Paired t test was used to evaluate whether there was significant difference. Both affected and normal PAAG cysts showed equal signal intensity on conventional T1WI and fat saturation T2WI, which were not helpful in detecting puerperal PAAG abscess. However, the affected PAAG cysts had a significantly decreased ADC value of 1.477 ± 0.332 × 10(-3)mm(2)/s and showed obvious hypo-intensity on the ADC map and increased signal intensity on DWI compared with the ADC value of 2.775 ± 0.233 × 10(-3)mm(2)/s of the contralateral normal PAAG cysts. DWI and quantitative measurement of ADC values are of great value for the diagnosis of puerperal PAAG abscess. Standardized MRI should be suggested to these puerperal women with breast discomfort or just for the purpose of check up. DWI should be selected as the essential MRI sequence.
Quantitative data standardization of X-ray based densitometry methods
NASA Astrophysics Data System (ADS)
Sergunova, K. A.; Petraikin, A. V.; Petrjajkin, F. A.; Akhmad, K. S.; Semenov, D. S.; Potrakhov, N. N.
2018-02-01
In the present work, the design of a special liquid phantom for assessing the accuracy of quantitative densitometric data is proposed. The dependencies between the measured bone mineral density values and the given values are also presented for different X-ray based densitometry techniques. The linear graphs shown make it possible to introduce correction factors that increase the accuracy of BMD measurement by the QCT, DXA and DECT methods, and to use them for standardization and comparison of measurements.
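A minimal sketch of deriving such correction factors is a linear fit of measured versus given BMD on the phantom and its inversion; the phantom values below are hypothetical.

```python
import numpy as np

# Hypothetical phantom readings: given (true) BMD values vs. values measured by one method.
given_bmd = np.array([50.0, 100.0, 200.0, 400.0, 800.0])       # mg/cm^3
measured_bmd = np.array([46.0, 95.0, 188.0, 385.0, 760.0])

# Linear fit measured = a * given + b, then invert it as a correction.
a, b = np.polyfit(given_bmd, measured_bmd, 1)

def corrected(measured):
    """Apply the inverted calibration line to a measured BMD value."""
    return (measured - b) / a

print("correction coefficients (slope, offset):", a, b)
print("corrected value for a reading of 300:", corrected(300.0))
```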
Surowiec, Rachel K; Lucas, Erin P; Wilson, Katharine J; Saroki, Adriana J; Ho, Charles P
2014-01-01
Before quantitative imaging techniques can become clinically valuable, the method, and more specifically, the regions of locating and reporting these values should be standardized toward reproducibility comparisons across centers and longitudinal follow-up of individual patients. The purpose of this technical note is to describe a rigorous and reproducible method of locating, analyzing, and reporting quantitative MRI values in hip articular cartilage with an approach that is consistent with current orthopedic literature. To demonstrate this localization and documentation, 3 patients (age, 23 ± 5.1 years; 2 males, 1 female) who presented with symptomatic mixed-type femoroacetabular impingement (α angle, 63.3° ± 2.1°; center edge angle, 39° ± 4.2°) were evaluated with T2-mapping at 3 T MRI prior to hip arthroscopy. Manual segmentation was performed and cartilage of the acetabulum and femur was divided into 12 subregions adapted from the geographic zone method. Bone landmarks in the acetabulum and femur, identifiable both in arthroscopy and MR images, were manually selected and the coordinates exported for division of cartilage. Mean T2 values in each zone are presented. The current work outlines a standardized system to locate and describe quantitative mapping values that could aid in surgical decision making, planning, and the noninvasive longitudinal follow-up of implemented cartilage preservation and restoration techniques.
Normalised quantitative polymerase chain reaction for diagnosis of tuberculosis-associated uveitis.
Barik, Manas Ranjan; Rath, Soveeta; Modi, Rohit; Rana, Rajkishori; Reddy, Mamatha M; Basu, Soumyava
2018-05-01
Polymerase chain reaction (PCR)-based diagnosis of tuberculosis-associated uveitis (TBU) in TB-endemic countries is challenging due to likelihood of latent mycobacterial infection in both immune and non-immune cells. In this study, we investigated normalised quantitative PCR (nqPCR) in ocular fluids (aqueous/vitreous) for diagnosis of TBU in a TB-endemic population. Mycobacterial copy numbers (mpb64 gene) were normalised to host genome copy numbers (RNAse P RNA component H1 [RPPH1] gene) in TBU (n = 16) and control (n = 13) samples (discovery cohort). The mpb64:RPPH1 ratios (normalised value) from each TBU and control sample were tested against the current reference standard i.e. clinically-diagnosed TBU, to generate Receiver Operating Characteristic (ROC) curves. The optimum cut-off value of mpb64:RPPH1 ratio (0.011) for diagnosing TBU was identified from the highest Youden index. This cut-off value was then tested in a different cohort of TBU and controls (validation cohort, 20 cases and 18 controls), where it yielded specificity, sensitivity and diagnostic accuracy of 94.4%, 85.0%, and 89.4% respectively. The above values for conventional quantitative PCR (≥1 copy of mpb64 per reaction) were 61.1%, 90.0%, and 74.3% respectively. Normalisation markedly improved the specificity and diagnostic accuracy of quantitative PCR for diagnosis of TBU. Copyright © 2018 Elsevier Ltd. All rights reserved.
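The cutoff selection described above can be sketched as scanning the ROC curve of the normalised ratio and taking the threshold with the highest Youden index; the cohort values below are simulated stand-ins, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve

def youden_cutoff(ratio, is_tbu):
    """Pick the mpb64:RPPH1 ratio cutoff that maximises the Youden index
    (sensitivity + specificity - 1) on a discovery cohort."""
    fpr, tpr, thresholds = roc_curve(is_tbu, ratio)
    return thresholds[np.argmax(tpr - fpr)]

# Hypothetical normalised ratios for a small discovery cohort (16 TBU, 13 controls).
rng = np.random.default_rng(5)
ratios = np.concatenate([rng.lognormal(-4.0, 1.0, 16), rng.lognormal(-7.0, 1.0, 13)])
labels = np.concatenate([np.ones(16), np.zeros(13)])
print("optimal cutoff:", youden_cutoff(ratios, labels))
```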
NASA Astrophysics Data System (ADS)
Li, Chunhui; Guan, Guangying; Ling, Yuting; Lang, Stephen; Wang, Ruikang K.; Huang, Zhihong; Nabi, Ghulam
2015-03-01
Objectives. Prostate cancer is the most frequently diagnosed malignancy in men. Digital rectal examination (DRE), a well-known clinical tool based on the alteration in the mechanical properties of tissues due to cancer, has traditionally been used for screening prostate cancer. Essentially, DRE estimates the relative stiffness of cancerous and normal prostate tissue. Optical coherence elastography (OCE) is a new optical imaging technique capable of providing cross-sectional imaging of tissue microstructure as well as elastograms in vivo and in real time. In this preliminary study, OCE was used in the setting of human prostate biopsies ex vivo, and the images acquired were compared with those obtained using standard histopathologic methods. Methods. 120 prostate biopsies were obtained by TRUS-guided needle biopsy procedures from 9 patients with clinically suspected cancer of the prostate. The biopsies were approximately 0.8 mm in diameter and 12 mm in length, and prepared in formalin solution. Quantitative assessment of biopsy samples using OCE was obtained in kilopascals (kPa) before histopathologic evaluation. The results obtained from OCE and standard histopathologic evaluation were compared to provide cross-validation. Sensitivity, specificity, and positive and negative predictive values were calculated for OCE (histopathology was the reference standard). Results. OCE could provide quantitative elasticity properties of prostate biopsies within benign prostate tissue, prostatic intraepithelial neoplasia, atypical hyperplasia and malignant prostate cancer. The data analysed showed that the sensitivity and specificity of OCE for PCa detection were 1 and 0.91, respectively. PCa had significantly higher stiffness values compared to benign tissues, with a trend of increasing stiffness with increasing malignancy. Conclusions. Using OCE, microscopic-resolution elastography is promising in the diagnosis of human prostatic diseases. Further studies using this technique to improve the detection and staging of malignant cancer of the prostate are ongoing.
The protective role of coastal marshes: a systematic review and meta-analysis.
Shepard, Christine C; Crain, Caitlin M; Beck, Michael W
2011-01-01
Salt marshes lie between many human communities and the coast and have been presumed to protect these communities from coastal hazards by providing important ecosystem services. However, previous characterizations of these ecosystem services have typically been based on a small number of historical studies, and the consistency and extent to which marshes provide these services has not been investigated. Here, we review the current evidence for the specific processes of wave attenuation, shoreline stabilization and floodwater attenuation to determine if and under what conditions salt marshes offer these coastal protection services. We conducted a thorough search and synthesis of the literature with reference to these processes. Seventy-five publications met our selection criteria, and we conducted meta-analyses for publications with sufficient data available for quantitative analysis. We found that combined across all studies (n = 7), salt marsh vegetation had a significant positive effect on wave attenuation as measured by reductions in wave height per unit distance across marsh vegetation. Salt marsh vegetation also had a significant positive effect on shoreline stabilization as measured by accretion, lateral erosion reduction, and marsh surface elevation change (n = 30). Salt marsh characteristics that were positively correlated to both wave attenuation and shoreline stabilization were vegetation density, biomass production, and marsh size. Although we could not find studies quantitatively evaluating floodwater attenuation within salt marshes, there are several studies noting the negative effects of wetland alteration on water quantity regulation within coastal areas. Our results show that salt marshes have value for coastal hazard mitigation and climate change adaptation. Because we do not yet fully understand the magnitude of this value, we propose that decision makers employ natural systems to maximize the benefits and ecosystem services provided by salt marshes and exercise caution when making decisions that erode these services.
White, Crow; Halpern, Benjamin S.; Kappel, Carrie V.
2012-01-01
Marine spatial planning (MSP) is an emerging responsibility of resource managers around the United States and elsewhere. A key proposed advantage of MSP is that it makes tradeoffs in resource use and sector (stakeholder group) values explicit, but doing so requires tools to assess tradeoffs. We extended tradeoff analyses from economics to simultaneously assess multiple ecosystem services and the values they provide to sectors using a robust, quantitative, and transparent framework. We used the framework to assess potential conflicts among offshore wind energy, commercial fishing, and whale-watching sectors in Massachusetts and to identify and quantify the value of choosing optimal wind farm designs that minimize conflicts among these sectors. Most notably, we show that using MSP over conventional planning could prevent >$1 million in losses to the incumbent fishery and whale-watching sectors and could generate >$10 billion in extra value to the energy sector. The value of MSP increased with the number of sectors considered and with the size of the area under management. Importantly, the framework can be applied even when sectors are not measured in dollars (e.g., conservation). Making tradeoffs explicit improves transparency in decision-making, helps avoid unnecessary conflicts attributable to perceived but weak tradeoffs, and focuses debate on finding the most efficient solutions to mitigate real tradeoffs and maximize sector values. Our analysis demonstrates the utility, feasibility, and value of MSP and provides timely support for the management transitions needed for society to address the challenges of an increasingly crowded ocean environment. PMID:22392996
Consumer product chemical weight fractions from ingredient lists.
Isaacs, Kristin K; Phillips, Katherine A; Biryol, Derya; Dionisio, Kathie L; Price, Paul S
2018-05-01
Assessing human exposures to chemicals in consumer products requires composition information. However, comprehensive composition data for products in commerce are not generally available. Many consumer products have reported ingredient lists that are constructed using specific guidelines. A probabilistic model was developed to estimate quantitative weight fraction (WF) values that are consistent with the rank of an ingredient in the list, the number of reported ingredients, and labeling rules. The model provides the mean, median, and 95% upper and lower confidence limit WFs for ingredients of any rank in lists of any length. WFs predicted by the model compared favorably with those reported on Material Safety Data Sheets. Predictions for chemicals known to provide specific functions in products were also found to reasonably agree with reported WFs. The model was applied to a selection of publicly available ingredient lists, thereby estimating WFs for 1293 unique ingredients in 1123 products in 81 product categories. Predicted WFs, although less precise than reported values, can be estimated for large numbers of product-chemical combinations and thus provide a useful source of data for high-throughput or screening-level exposure assessments.
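One plausible way to read such a rank-based model is as Monte Carlo sampling of compositions that respect the labeling rule that ingredients appear in descending order of weight. The sketch below illustrates this idea only; it assumes uniform sampling on the simplex and is not the published model.

```python
import numpy as np

def wf_by_rank(n_ingredients, n_draws=100_000, seed=0):
    """Monte Carlo sketch: sample compositions uniformly on the simplex, sort each
    draw in descending order (rank 1 = largest weight fraction), summarize by rank."""
    rng = np.random.default_rng(seed)
    draws = rng.dirichlet(np.ones(n_ingredients), size=n_draws)
    draws.sort(axis=1)
    draws = draws[:, ::-1]                      # descending: column 0 is rank 1
    return {
        "mean": draws.mean(axis=0),
        "median": np.median(draws, axis=0),
        "lcl_95": np.percentile(draws, 2.5, axis=0),
        "ucl_95": np.percentile(draws, 97.5, axis=0),
    }

print(wf_by_rank(n_ingredients=8)["median"])    # median weight fraction for ranks 1..8
```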
Quantification of EEG reactivity in comatose patients
Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas
2016-01-01
Objective: EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods: In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results: The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinion of experts (Gwet's AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-experts' agreement regarding reactivity for each individual case. Conclusion: Automated quantitative EEG approaches based on probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance: Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
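The following sketch illustrates the underlying idea of comparing pre- and post-stimulation spectra in the four bands using Welch periodograms; the sampling rate, band edges and synthetic signals are assumptions, and this is not the authors' classifier.

```python
import numpy as np
from scipy.signal import welch

FS = 256                     # sampling rate (Hz), assumed
BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(segment, fs=FS):
    freqs, psd = welch(segment, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}

def reactivity_ratios(pre, post):
    """Log power ratio per band; values far from 0 suggest a spectral change."""
    p_pre, p_post = band_powers(pre), band_powers(post)
    return {b: float(np.log(p_post[b] / p_pre[b])) for b in BANDS}

rng = np.random.default_rng(1)
pre = rng.standard_normal(60 * FS)      # synthetic 1-minute pre-stimulation segment
post = rng.standard_normal(60 * FS)     # synthetic 1-minute post-stimulation segment
print(reactivity_ratios(pre, post))
```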
Quantitation of polymethoxylated flavones in orange juice by high-performance liquid chromatography.
Rouseff, R L; Ting, S V
1979-08-01
A quantitative high-performance liquid chromatographic (HPLC) procedure for the determination of the five major polymethoxylated flavones (PMFs) in orange juice has been developed. It employs a unique ternary solvent system with coupled UV-fluorescence detection. The dual detectors were employed to determine the presence of interfering substances and served as a cross-check on quantitation. Stop-flow UV and fluorescence scanning was used to identify peaks and determine the presence of impurities. Although all five citrus PMFs fluoresce, some HPLC fluorescence peaks were too small to be of much practical use. All five citrus PMFs could be quantitated satisfactorily with the fixed-wavelength UV (313 nm) detector. The HPLC procedure has been used to evaluate each step in the sample preparation. The optimum extracting solvent was selected and one time-consuming step was eliminated, as it was found to be unnecessary. HPLC values for nobiletin and sinensetin are in good agreement with the thin-layer chromatographic (TLC) values in the literature. HPLC values for the other three flavones were considerably lower than those reported in the literature. The HPLC procedure is considerably faster than the TLC procedure, with equal or superior precision and accuracy.
Graham, John D; Chang, Joice
2015-02-01
The use of table saws in the United States is associated with approximately 28,000 emergency department (ED) visits and 2,000 cases of finger amputation per year. This article provides a quantitative estimate of the economic benefits of automatic protection systems that could be designed into new table saw products. Benefits are defined as reduced health-care costs, enhanced production at work, and diminished pain and suffering. The present value of the benefits of automatic protection over the life of the table saw is interpreted as the switch-point cost value, the maximum investment in automatic protection that can be justified by benefit-cost comparison. Using two alternative methods for monetizing pain and suffering, the study finds switch-point cost values of $753 and $561 per saw. These point estimates are sensitive to the values of inputs, especially the average cost of injury. The various switch-point cost values are substantially higher than rough estimates of the incremental cost of automatic protection systems. Uncertainties and future research needs are discussed. © 2014 Society for Risk Analysis.
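A switch-point cost of this kind is essentially the discounted present value of a stream of annual injury-reduction benefits over the saw's service life. The sketch below shows that arithmetic under assumed placeholder inputs; it does not reproduce the article's estimates.

```python
def switch_point_cost(annual_benefit_per_saw, life_years, discount_rate):
    """Present value of a constant annual benefit stream; this is the maximum
    incremental cost of automatic protection justified by benefit-cost comparison."""
    return sum(annual_benefit_per_saw / (1 + discount_rate) ** t
               for t in range(1, life_years + 1))

# Placeholder inputs (annual benefit, saw life, discount rate), not the article's values.
print(round(switch_point_cost(annual_benefit_per_saw=75.0,
                              life_years=12, discount_rate=0.03), 2))
```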
Phommasone, Koukeo; Althaus, Thomas; Souvanthong, Phonesavanh; Phakhounthong, Khansoudaphone; Soyvienvong, Laxoy; Malapheth, Phatthaphone; Mayxay, Mayfong; Pavlicek, Rebecca L; Paris, Daniel H; Dance, David; Newton, Paul; Lubell, Yoel
2016-02-04
C-Reactive Protein (CRP) has been shown to be an accurate biomarker for discriminating bacterial from viral infections in febrile patients in Southeast Asia. Here we investigate the accuracy of existing rapid qualitative and semi-quantitative tests as compared with a quantitative reference test to assess their potential for use in remote tropical settings. Blood samples were obtained from consecutive patients recruited to a prospective fever study at three sites in rural Laos. At each site, one of three rapid qualitative or semi-quantitative tests was performed, as well as a corresponding quantitative NycoCard Reader II as a reference test. We estimated the sensitivity and specificity of the three tests against a threshold of 10 mg/L, and kappa values for the agreement of the two semi-quantitative tests with the results of the reference test. All three tests showed high sensitivity, specificity and kappa values as compared with the NycoCard Reader II. With a threshold of 10 mg/L, the sensitivity of the tests ranged from 87-98 % and the specificity from 91-98 %. The weighted kappa values for the semi-quantitative tests were 0.7 and 0.8. The use of CRP rapid tests could offer an inexpensive and effective approach to improve the targeting of antibiotics in remote settings where health facilities are basic and laboratories are absent. This study demonstrates that accurate CRP rapid tests are commercially available; evaluations of their clinical impact and cost-effectiveness at the point of care are warranted.
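For readers unfamiliar with weighted kappa, the sketch below implements a linearly weighted Cohen's kappa for ordinal agreement; the category coding and the example ratings are placeholders, not the study's data or its exact weighting scheme.

```python
import numpy as np

def weighted_kappa(a, b, n_categories):
    """Linearly weighted Cohen's kappa for two ordinal ratings coded 0..n-1."""
    a, b = np.asarray(a), np.asarray(b)
    obs = np.zeros((n_categories, n_categories))
    for i, j in zip(a, b):
        obs[i, j] += 1
    obs /= obs.sum()                                   # observed joint proportions
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))   # chance-expected proportions
    idx = np.arange(n_categories)
    w = np.abs(idx[:, None] - idx[None, :]) / (n_categories - 1)  # linear weights
    return 1 - (w * obs).sum() / (w * exp).sum()

# Placeholder ordinal CRP categories (e.g., <10, 10-40, >40 mg/L) for two methods.
ref  = [0, 1, 2, 2, 0, 1, 1, 2, 0, 0]
test = [0, 1, 2, 1, 0, 1, 2, 2, 0, 1]
print(round(weighted_kappa(ref, test, n_categories=3), 2))
```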
A Mixed-Methods Research Framework for Healthcare Process Improvement.
Bastian, Nathaniel D; Munoz, David; Ventura, Marta
2016-01-01
The healthcare system in the United States is spiraling out of control due to ever-increasing costs without significant improvements in quality, access to care, satisfaction, and efficiency. Efficient workflow is paramount to improving healthcare value while maintaining the utmost standards of patient care and provider satisfaction in high stress environments. This article provides healthcare managers and quality engineers with a practical healthcare process improvement framework to assess, measure and improve clinical workflow processes. The proposed mixed-methods research framework integrates qualitative and quantitative tools to foster the improvement of processes and workflow in a systematic way. The framework consists of three distinct phases: 1) stakeholder analysis, 2a) survey design, 2b) time-motion study, and 3) process improvement. The proposed framework is applied to the pediatric intensive care unit of the Penn State Hershey Children's Hospital. The implementation of this methodology led to identification and categorization of different workflow tasks and activities into both value-added and non-value added in an effort to provide more valuable and higher quality patient care. Based upon the lessons learned from the case study, the three-phase methodology provides a better, broader, leaner, and holistic assessment of clinical workflow. The proposed framework can be implemented in various healthcare settings to support continuous improvement efforts in which complexity is a daily element that impacts workflow. We proffer a general methodology for process improvement in a healthcare setting, providing decision makers and stakeholders with a useful framework to help their organizations improve efficiency. Published by Elsevier Inc.
Investment appraisal using quantitative risk analysis.
Johansson, Henrik
2002-07-01
Investment appraisal concerned with investments in fire safety systems is discussed. Particular attention is directed at evaluating, in terms of Bayesian decision theory, the risk reduction that investment in a fire safety system involves. It is shown how the monetary value of the change from a building design without any specific fire protection system to one including such a system can be estimated by use of quantitative risk analysis, the results of which are expressed as a risk-adjusted net present value. This represents the intrinsic monetary value of investing in the fire safety system. The method suggested is exemplified by a case study performed in an Avesta Sheffield factory.
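One simple way to picture a risk-adjusted net present value is as the discounted reduction in expected annual fire loss (taken from a quantitative risk analysis) minus the investment and running costs. The sketch below shows that calculation with placeholder figures; it is not the paper's Bayesian decision model or the case-study data.

```python
def risk_adjusted_npv(investment, annual_maintenance,
                      expected_loss_without, expected_loss_with,
                      horizon_years, discount_rate):
    """Discounted expected-loss reduction minus investment and maintenance costs."""
    annual_benefit = expected_loss_without - expected_loss_with
    npv = -investment
    for t in range(1, horizon_years + 1):
        npv += (annual_benefit - annual_maintenance) / (1 + discount_rate) ** t
    return npv

# Placeholder figures, not the Avesta Sheffield case-study values.
print(round(risk_adjusted_npv(investment=200_000, annual_maintenance=5_000,
                              expected_loss_without=60_000, expected_loss_with=20_000,
                              horizon_years=20, discount_rate=0.05), 2))
```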
NASA Technical Reports Server (NTRS)
Montgomery, L. D.
1974-01-01
A literature search was made to obtain values of human forearm, hand and finger blood flow as functions of environmental temperature. The sources used include both government and laboratory reports and the research presented in the open literature. An attempt was made to review many of the more quantitative noninvasive determinations and to collate the results in such a way as to yield blood flow values for each body segment as continuous functions of temperature. A brief review of the various ways used to measure blood flow is included, along with an abstract of each work from which data were taken.
Cho, Junghun; Kee, Youngwook; Spincemaille, Pascal; Nguyen, Thanh D; Zhang, Jingwei; Gupta, Ajay; Zhang, Shun; Wang, Yi
2018-03-07
To map the cerebral metabolic rate of oxygen (CMRO2) by estimating the oxygen extraction fraction (OEF) from gradient echo imaging (GRE) using the phase and magnitude of the GRE data. 3D multi-echo gradient echo imaging and perfusion imaging with arterial spin labeling were performed in 11 healthy subjects. CMRO2 and OEF maps were reconstructed by joint quantitative susceptibility mapping (QSM), to process the GRE phase, and quantitative blood oxygen level-dependent (qBOLD) modeling, to process the GRE magnitude. Comparisons with QSM and qBOLD alone were performed using ROI analysis, paired t-tests, and Bland-Altman plot. The average CMRO2 values in cortical gray matter across subjects were 140.4 ± 14.9, 134.1 ± 12.5, and 184.6 ± 17.9 μmol/100 g/min, with corresponding OEFs of 30.9 ± 3.4%, 30.0 ± 1.8%, and 40.9 ± 2.4% for the methods based on QSM, qBOLD, and QSM+qBOLD, respectively. QSM+qBOLD provided the highest CMRO2 contrast between gray and white matter, more uniform OEF than QSM, and less noisy OEF than qBOLD. Quantitative CMRO2 mapping that fits the entire complex GRE data is feasible by combining QSM analysis of phase and qBOLD analysis of magnitude. © 2018 International Society for Magnetic Resonance in Medicine.
Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C
2015-04-13
Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is by comparing their performance in the end task that is required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e. we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques to optimize and evaluate systems and algorithms in the absence of gold standard are required. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.
Della Bona, Alvaro
2005-03-01
The appeal of ceramics as structural dental materials is based on their light weight, high hardness values, chemical inertness, and anticipated unique tribological characteristics. A major goal of current ceramic research and development is to produce tough, strong ceramics that can provide reliable performance in dental applications. Quantifying microstructural parameters is important to develop structure/property relationships. Quantitative microstructural analysis provides an association among the constitution, physical properties, and structural characteristics of materials. Structural reliability of dental ceramics is a major factor in the clinical success of ceramic restorations. Complex stress distributions are present in most practical conditions and strength data alone cannot be directly extrapolated to predict structural performance.
Quantitative trait locus for reading disability on chromosome 6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cardon, L.R.; Smith, S.D.; Kimberling, W.J.
1994-10-14
Interval mapping of data from two independent samples of sib pairs, at least one member of whom was reading disabled, revealed evidence for a quantitative trait locus (QTL) on chromosome 6. Results obtained from analyses of reading performance from 114 sib pairs genotyped for DNA markers localized the QTL to 6p21.3. Analyses of corresponding data from an independent sample of 50 dizygotic twin pairs provided evidence for linkage to the same region. In combination, the replicate samples yielded a χ2 value of 16.73 (P = 0.0002). Examination of twin and kindred siblings with more extreme deficits in reading performance yielded even stronger evidence for a QTL (χ2 = 27.35, P < 0.00001). The position of the QTL was narrowly defined with a 100:1 confidence interval to a 2-centimorgan region within the human leukocyte antigen complex. 23 refs., 4 figs.
NASA Astrophysics Data System (ADS)
Aldrin, John C.; Lindgren, Eric A.
2018-04-01
This paper expands on the objective and motivation for NDE-based characterization and includes a discussion of the current approach using model-assisted inversion being pursued within the Air Force Research Laboratory (AFRL). This includes a discussion of the multiple model-based methods that can be used, including physics-based models, deep machine learning, and heuristic approaches. The benefits and drawbacks of each method are reviewed and the potential to integrate multiple methods is discussed. Initial successes are included to highlight the ability to obtain quantitative values of damage. Additional steps remaining to realize this capability with statistical metrics of accuracy are discussed, and how these results can be used to enable probabilistic life management is addressed. The outcome of this initiative will realize the long-term desired capability of NDE methods to provide quantitative characterization to accelerate certification of new materials and enhance life management of engineered systems.
Smoothing of the bivariate LOD score for non-normal quantitative traits.
Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John
2005-12-30
Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, type I error of variance components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis). This results in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem with univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtotic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.
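The paper's empirical correction is not reproduced here, but two standard building blocks behind this discussion can be illustrated briefly: converting a 1-df likelihood-ratio chi-square statistic to a LOD score, and a rank-based inverse-normal transform sometimes used to blunt kurtosis-driven inflation. Both are shown below under stated assumptions with synthetic data.

```python
import numpy as np
from scipy.stats import rankdata, norm

def chi2_to_lod(chi2):
    """Convert a 1-df likelihood-ratio chi-square statistic to a LOD score."""
    return chi2 / (2 * np.log(10))

def inverse_normal_transform(phenotype):
    """Rank-based inverse-normal transform to reduce the impact of heavy tails."""
    x = np.asarray(phenotype, dtype=float)
    ranks = rankdata(x)
    return norm.ppf((ranks - 0.5) / len(x))

rng = np.random.default_rng(2)
y = rng.standard_t(df=3, size=500)          # heavy-tailed (leptokurtotic) synthetic trait
print(round(chi2_to_lod(16.73), 2))         # approximately 3.63
print(np.round(inverse_normal_transform(y)[:5], 3))
```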
Diffusion and ideal MRI techniques to characterize limb-girdle muscular dystrophy
NASA Astrophysics Data System (ADS)
Hernández-Salazar, G.; Hidalgo-Tobon, S.; Vargas-Cañas, S.; Marrufo-Melendez, O.; Solis-Najera, S.; Taboada-Barajas, J.; Rodríguez, A. O.; Delgado-Hernández, R.
2012-10-01
Limb-girdle muscular dystrophies (LGMD) are a group of autosomal dominantly or recessively inherited muscular dystrophies that present with primary proximal (limb-girdle) muscle weakness. In the thigh, the muscles at the back are affected, with a tendency to preserve the tibialis anterior and gastrocnemius. The aim of this study was to compare quantitative MRI measurements from IDEAL-based imaging and DW imaging in the thigh muscles of adults with LGMDs and healthy volunteers (HC). Six women (three patients and three healthy volunteers) were examined. Imaging experiments were conducted on a 1.5T GE scanner (General Electric Medical Systems, Milwaukee). T1 IDEAL 2D images and diffusion images were acquired. The results demonstrated that the use of noninvasive MRI techniques may provide the means to characterize the muscle through quantitative methods to determine the percentage of fat and ADC values.
Quantitative analysis of the evolution of novelty in cinema through crowdsourced keywords.
Sreenivasan, Sameet
2013-09-26
The generation of novelty is central to any creative endeavor. Novelty generation and the relationship between novelty and individual hedonic value have long been subjects of study in social psychology. However, few studies have utilized large-scale datasets to quantitatively investigate these issues. Here we consider the domain of American cinema and explore these questions using a database of films spanning a 70 year period. We use crowdsourced keywords from the Internet Movie Database as a window into the contents of films, and prescribe novelty scores for each film based on occurrence probabilities of individual keywords and keyword-pairs. These scores provide revealing insights into the dynamics of novelty in cinema. We investigate how novelty influences the revenue generated by a film, and find a relationship that resembles the Wundt-Berlyne curve. We also study the statistics of keyword occurrence and the aggregate distribution of keywords over a 100 year period.
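One simple reading of a keyword-based novelty score is the mean surprisal (negative log occurrence probability) of a film's keywords relative to keyword frequencies in earlier films. The sketch below illustrates that idea only; the smoothing choice and toy data are assumptions, not the paper's exact prescription.

```python
import math
from collections import Counter

def novelty_score(film_keywords, prior_films):
    """Mean surprisal of a film's keywords given keyword frequencies in prior films."""
    counts = Counter(kw for film in prior_films for kw in film)
    total = sum(counts.values())
    vocab = len(counts)
    surprisals = []
    for kw in film_keywords:
        p = (counts.get(kw, 0) + 1) / (total + vocab)   # add-one smoothing
        surprisals.append(-math.log(p))
    return sum(surprisals) / len(surprisals)

prior = [{"heist", "betrayal"}, {"romance", "war"}, {"heist", "car-chase"}]
print(round(novelty_score({"heist", "time-travel"}, prior), 2))
```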
Chiu, Pearl H.; Kayali, M. Amin; Kishida, Kenneth T.; Tomlin, Damon; Klinger, Laura G.; Klinger, Mark R.; Montague, P. Read
2014-01-01
Attributing behavioral outcomes correctly to oneself or to other agents is essential for all productive social exchange. We approach this issue in high-functioning males with autism spectrum disorder (ASD) using two separate fMRI paradigms. First, using a visual imagery task, we extract a basis set for responses along the cingulate cortex of control subjects that reveals an agent-specific eigenvector (self eigenmode) associated with imagining oneself executing a specific motor act. Second, we show that the same self eigenmode arises during one's own decision (the self phase) in an interpersonal exchange game (iterated trust game). Third, using this exchange game, we show that ASD males exhibit a severely diminished self eigenmode when playing the game with a human partner. This diminished response covaries parametrically with their behaviorally assessed symptom severity, suggesting its value as an objective endophenotype. These findings may provide a quantitative assessment tool for high-functioning ASD. PMID:18255038
NASA Astrophysics Data System (ADS)
Golikov, S. Yu; Dulepov, V. I.; Maiorov, I. S.
2017-11-01
The issues of applying autonomous underwater vehicles to assess the abundance, biomass, distribution and reserves of invertebrates in marine benthic ecosystems, and for environmental monitoring, are discussed. An example of applying the methodology to assess some quantitative characteristics of macrobenthos is provided, based on information obtained with the TSL AUV in Peter the Great Gulf (Sea of Japan), in Paris Bay and in the Eastern Bosphorus Strait near the bridge to Russky Island. For quantitative determination of the benthic invertebrate reserves, the biomass density values of specific species are determined. Based on the data of direct measurements and weighings, equations relating weight to animal size are estimated for the studied species; these are well described by a power-law dependence.
Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia
2017-01-01
This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
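The core of such a statistical shape model is a PCA over registered sample vectors, from which per-sample principal component weights can be compared. The sketch below shows only that PCA step with synthetic stand-in data; alignment, registration and micro-CT processing are outside its scope, and it is not the authors' pipeline.

```python
import numpy as np

def build_ssm(sample_vectors, n_modes=3):
    """PCA on registered sample vectors: mean model, principal modes, PC weights."""
    X = np.asarray(sample_vectors, dtype=float)
    mean = X.mean(axis=0)
    Xc = X - mean
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    modes = Vt[:n_modes]                       # principal modes of variation
    weights = Xc @ modes.T                     # per-sample PC weights for comparison
    explained = (S**2 / (S**2).sum())[:n_modes]
    return mean, modes, weights, explained

rng = np.random.default_rng(3)
samples = rng.normal(size=(20, 300))           # 20 registered samples, 300 values each
mean, modes, weights, explained = build_ssm(samples)
print(weights.shape, np.round(explained, 3))
```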
Leaman, D J; Arnason, J T; Yusuf, R; Sangat-Roemantyo, H; Soedjito, H; Angerhofer, C K; Pezzuto, J M
1995-11-17
Traditional remedies have been a source of important antimalarial drugs and continue to provide novel and effective treatments, both where pharmaceuticals are not available and where the disease is increasingly resistant to commonly prescribed drugs. The Kenyah of the Apo Kayan, a remote forested plateau in Indonesian Borneo, use 17 malaria remedies derived from natural sources. A quantitative analysis of the relationship between a 'local importance value' index for each malaria remedy (IVmal) and inhibition of cultured Plasmodium falciparum by ethanolic extracts supports the hypothesis that the degree of local consensus about a given remedy is a good indicator of its potential biological efficacy. Our results confirm the rational selection and use of traditional remedies for malaria by the Kenyah. We have identified target species for further research directed toward safe and effective treatments for malaria.
NASA Astrophysics Data System (ADS)
Zhao, H.; Zhang, S.
2008-01-01
One of the most effective means to achieve controlled auto-ignition (CAI) combustion in a gasoline engine is by the residual gas trapping method. The amount of residual gas and mixture composition have significant effects on the subsequent combustion process and engine emissions. In order to obtain quantitative measurements of in-cylinder residual gas concentration and air/fuel ratio, a spontaneous Raman scattering (SRS) system has been developed recently. The optimized optical SRS setups are presented and discussed. The temperature effect on the SRS measurement is considered and a method has been developed to correct for the overestimated values due to the temperature effect. Simultaneous measurements of O2, H2O, CO2 and fuel were obtained throughout the intake, compression, combustion and expansion strokes. It shows that the SRS can provide valuable data on this process in a CAI combustion engine.
Quantitative analysis of the evolution of novelty in cinema through crowdsourced keywords
Sreenivasan, Sameet
2013-01-01
The generation of novelty is central to any creative endeavor. Novelty generation and the relationship between novelty and individual hedonic value have long been subjects of study in social psychology. However, few studies have utilized large-scale datasets to quantitatively investigate these issues. Here we consider the domain of American cinema and explore these questions using a database of films spanning a 70 year period. We use crowdsourced keywords from the Internet Movie Database as a window into the contents of films, and prescribe novelty scores for each film based on occurrence probabilities of individual keywords and keyword-pairs. These scores provide revealing insights into the dynamics of novelty in cinema. We investigate how novelty influences the revenue generated by a film, and find a relationship that resembles the Wundt-Berlyne curve. We also study the statistics of keyword occurrence and the aggregate distribution of keywords over a 100 year period. PMID:24067890
Otis, David L.; Crumpton, William R.; Green, David; Loan-Wilsey, Anna; Cooper, Tom; Johnson, Rex R.
2013-01-01
Justification for investment in restored or constructed wetland projects is often based on presumed net increases in ecosystem services. However, quantitative assessment of performance metrics is often difficult and restricted to a single objective. More comprehensive performance assessments could help inform decision-makers about trade-offs in services provided by alternative restoration program design attributes. The primary goal of the Iowa Conservation Reserve Enhancement Program is to establish wetlands that efficiently remove nitrates from tile-drained agricultural landscapes. A secondary objective is provision of wildlife habitat. We used existing wildlife habitat models to compare relative net change in potential wildlife habitat value for four alternative landscape positions of wetlands within the watershed. Predicted species richness and habitat value for birds, mammals, amphibians, and reptiles generally increased as the wetland position moved lower in the watershed. However, the predicted average net increase between pre- and post-project value was dependent on taxonomic group. The increased average wetland area and changes in surrounding upland habitat composition among landscape positions were responsible for these differences. Net change in predicted densities of several grassland bird species at the four landscape positions was variable and species-dependent. Predicted waterfowl breeding activity was greater for wetlands at lower drainage positions. Although our models are simplistic and provide only a predictive index of potential habitat value, we believe such assessment exercises can provide a tool for coarse-level comparisons of alternative proposed project attributes and a basis for constructing informed hypotheses in auxiliary empirical field studies.
Reducible or irreducible? Mathematical reasoning and the ontological method.
Fisher, William P
2010-01-01
Science is often described as nothing but the practice of measurement. This perspective follows from longstanding respect for the roles mathematics and quantification have played as media through which alternative hypotheses are evaluated and experience becomes better managed. Many figures in the history of science and psychology have contributed to what has been called the "quantitative imperative," the demand that fields of study employ number and mathematics even when they do not constitute the language in which investigators think together. But what makes an area of study scientific is, of course, not the mere use of number, but communities of investigators who share common mathematical languages for exchanging quantitative value. Such languages require rigorous theoretical underpinning, a basis in data sufficient to the task, and instruments traceable to reference standard quantitative metrics. The values shared and exchanged by such communities typically involve the application of mathematical models that specify the sufficient and invariant relationships necessary for rigorous theorizing and instrument equating. The mathematical metaphysics of science is explored with the aim of connecting principles of quantitative measurement with the structures of sufficient reason.
Ultrasound introscopic image quantitative characteristics for medical diagnosis
NASA Astrophysics Data System (ADS)
Novoselets, Mikhail K.; Sarkisov, Sergey S.; Gridko, Alexander N.; Tcheban, Anatoliy K.
1993-09-01
The results of computer-aided extraction of quantitative characteristics (QC) of ultrasound introscopic images for medical diagnosis are presented. Thyroid gland (TG) images of Chernobyl accident sufferers are considered. It is shown that TG diseases can be associated with particular values of selected QCs of the random echo distribution in the image. The possibility of using these QCs to recognize TG diseases on the basis of the calculated values is analyzed. The role of speckle-noise elimination in solving the problem of TG diagnosis is also considered.
Shaheen, Shabnam; Abbas, Safdar; Hussain, Javid; Mabood, Fazal; Umair, Muhammad; Ali, Maroof; Ahmad, Mushtaq; Zafar, Muhammad; Farooq, Umar; Khan, Ajmal
2017-01-01
Medicinal plants are important treasures for the treatment of different types of diseases. The current study provides significant ethnopharmacological information, both qualitative and quantitative, on medicinal plants related to children's disorders from district Bannu, Khyber Pakhtunkhwa (KPK) province of Pakistan. The information gathered was quantitatively analyzed using the informant consensus factor, relative frequency of citation and use value methods to establish baseline data for more comprehensive investigations of bioactive compounds of indigenous medicinal plants specifically related to children's disorders. To the best of our knowledge, this is the first attempt to document ethnobotanical information on these medicinal plants using quantitative approaches. A total of 130 informants were interviewed using questionnaires during 2014–2016 to identify the preparations and uses of the medicinal plants for treating children's diseases. A total of 55 species of flowering plants belonging to 49 genera and 32 families were used as ethno-medicines in the study area. The largest numbers of species belonged to the Leguminosae and Cucurbitaceae families (4 species each), followed by Apiaceae, Moraceae, Poaceae, Rosaceae, and Solanaceae (3 species each). Leaves and fruits were the most used parts (28%), herbs were the most used life form (47%), decoction was the most common method of preparation (27%), and oral ingestion was the main route of administration (68.5%). The highest use value was reported for the species Momordica charantia and Raphanus sativus (1 for each), and the highest informant consensus factor was observed for the cardiovascular and rheumatic disease categories (0.5 for each). Most of the species in the present study were used to cure gastrointestinal diseases (39 species). The results of the present study reveal the importance of medicinal plant species and their significant role in the health care of the inhabitants of the study area. The people of Bannu hold extensive traditional knowledge related to children's diseases. In conclusion, we recommend giving priority for further phytochemical investigation to plants that scored the highest informant consensus factor and use values, as such values can be considered good indicators of prospective plants for discovering new drugs, and may attract future generations toward traditional healing practices. PMID:28769789
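The two quantitative-ethnobotany indices named in the abstract follow standard formulas: use value UV = ΣU/N and informant consensus factor ICF = (Nur - Nt)/(Nur - 1). The sketch below computes both; the counts used are placeholders, not the study's data.

```python
def use_value(use_reports_per_informant, n_informants):
    """UV = sum of use reports for a species across informants / number of informants."""
    return sum(use_reports_per_informant) / n_informants

def informant_consensus_factor(n_use_reports, n_taxa):
    """ICF = (Nur - Nt) / (Nur - 1) for one disease category."""
    return (n_use_reports - n_taxa) / (n_use_reports - 1)

# Placeholder counts, not the study's data.
print(round(use_value([1, 0, 2, 1, 1], n_informants=5), 2))               # 1.0
print(round(informant_consensus_factor(n_use_reports=21, n_taxa=11), 2))  # 0.5
```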
Satyagraha, Ari W.; Sadhewa, Arkasha; Elvira, Rosalie; Elyazar, Iqbal; Feriandika, Denny; Antonjaya, Ungke; Oyong, Damian; Subekti, Decy; Rozi, Ismail E.; Domingo, Gonzalo J.; Harahap, Alida R.; Baird, J. Kevin
2016-01-01
Background: Patients infected by Plasmodium vivax or Plasmodium ovale suffer repeated clinical attacks without primaquine therapy against latent stages in liver. Primaquine causes seriously threatening acute hemolytic anemia in patients having inherited glucose-6-phosphate dehydrogenase (G6PD) deficiency. Access to safe primaquine therapy hinges upon the ability to confirm G6PD normal status. CareStart G6PD, a qualitative G6PD rapid diagnostic test (G6PD RDT) intended for use at point-of-care in impoverished rural settings where most malaria patients live, was evaluated. Methodology/Principal Findings: This device and the standard qualitative fluorescent spot test (FST) were each compared against the quantitative spectrophotometric assay for G6PD activity as the diagnostic gold standard. The assessment occurred at meso-endemic Panenggo Ede in western Sumba Island in eastern Indonesia, where 610 residents provided venous blood. The G6PD RDT and FST qualitative assessments were performed in the field, whereas the quantitative assay was performed in a research laboratory at Jakarta. The median G6PD activity ≥5 U/gHb was 9.7 U/gHb and was considered 100% of normal activity. The prevalence of G6PD deficiency by quantitative assessment (<5 U/gHb) was 7.2%. Applying 30% of normal G6PD activity as the cut-off for qualitative testing, the sensitivity, specificity, positive predictive value, and negative predictive value for G6PD RDT versus FST among males were as follows: 100%, 98.7%, 89%, and 100% versus 91.7%, 92%, 55%, and 99%; P = 0.49, 0.001, 0.004, and 0.24, respectively. These values among females were: 83%, 92.7%, 17%, and 99.7% versus 100%, 92%, 18%, and 100%; P = 1.0, 0.89, 1.0 and 1.0, respectively. Conclusions/Significance: The overall performance of G6PD RDT, especially 100% negative predictive value, demonstrates suitable safety for G6PD screening prior to administering hemolytic drugs like primaquine and many others. Relatively poor diagnostic performance among females due to mosaic G6PD phenotype is an inherent limitation of any current practical screening methodology. PMID:26894297
Satyagraha, Ari W; Sadhewa, Arkasha; Elvira, Rosalie; Elyazar, Iqbal; Feriandika, Denny; Antonjaya, Ungke; Oyong, Damian; Subekti, Decy; Rozi, Ismail E; Domingo, Gonzalo J; Harahap, Alida R; Baird, J Kevin
2016-02-01
Patients infected by Plasmodium vivax or Plasmodium ovale suffer repeated clinical attacks without primaquine therapy against latent stages in liver. Primaquine causes seriously threatening acute hemolytic anemia in patients having inherited glucose-6-phosphate dehydrogenase (G6PD) deficiency. Access to safe primaquine therapy hinges upon the ability to confirm G6PD normal status. CareStart G6PD, a qualitative G6PD rapid diagnostic test (G6PD RDT) intended for use at point-of-care in impoverished rural settings where most malaria patients live, was evaluated. This device and the standard qualitative fluorescent spot test (FST) were each compared against the quantitative spectrophotometric assay for G6PD activity as the diagnostic gold standard. The assessment occurred at meso-endemic Panenggo Ede in western Sumba Island in eastern Indonesia, where 610 residents provided venous blood. The G6PD RDT and FST qualitative assessments were performed in the field, whereas the quantitative assay was performed in a research laboratory at Jakarta. The median G6PD activity ≥ 5 U/gHb was 9.7 U/gHb and was considered 100% of normal activity. The prevalence of G6PD deficiency by quantitative assessment (<5 U/gHb) was 7.2%. Applying 30% of normal G6PD activity as the cut-off for qualitative testing, the sensitivity, specificity, positive predictive value, and negative predictive value for G6PD RDT versus FST among males were as follows: 100%, 98.7%, 89%, and 100% versus 91.7%, 92%, 55%, and 99%; P = 0.49, 0.001, 0.004, and 0.24, respectively. These values among females were: 83%, 92.7%, 17%, and 99.7% versus 100%, 92%, 18%, and 100%; P = 1.0, 0.89, 1.0 and 1.0, respectively. The overall performance of G6PD RDT, especially 100% negative predictive value, demonstrates suitable safety for G6PD screening prior to administering hemolytic drugs like primaquine and many others. Relatively poor diagnostic performance among females due to mosaic G6PD phenotype is an inherent limitation of any current practical screening methodology.
Shaheen, Shabnam; Abbas, Safdar; Hussain, Javid; Mabood, Fazal; Umair, Muhammad; Ali, Maroof; Ahmad, Mushtaq; Zafar, Muhammad; Farooq, Umar; Khan, Ajmal
2017-01-01
Medicinal plants are important treasures for the treatment of different types of diseases. The current study provides significant ethnopharmacological information, both qualitative and quantitative, on medicinal plants related to children's disorders from district Bannu, Khyber Pakhtunkhwa (KPK) province of Pakistan. The information gathered was quantitatively analyzed using the informant consensus factor, relative frequency of citation and use value methods to establish baseline data for more comprehensive investigations of bioactive compounds of indigenous medicinal plants specifically related to children's disorders. To the best of our knowledge, this is the first attempt to document ethnobotanical information on these medicinal plants using quantitative approaches. A total of 130 informants were interviewed using questionnaires during 2014-2016 to identify the preparations and uses of the medicinal plants for treating children's diseases. A total of 55 species of flowering plants belonging to 49 genera and 32 families were used as ethno-medicines in the study area. The largest numbers of species belonged to the Leguminosae and Cucurbitaceae families (4 species each), followed by Apiaceae, Moraceae, Poaceae, Rosaceae, and Solanaceae (3 species each). Leaves and fruits were the most used parts (28%), herbs were the most used life form (47%), decoction was the most common method of preparation (27%), and oral ingestion was the main route of administration (68.5%). The highest use value was reported for the species Momordica charantia and Raphanus sativus (1 for each), and the highest informant consensus factor was observed for the cardiovascular and rheumatic disease categories (0.5 for each). Most of the species in the present study were used to cure gastrointestinal diseases (39 species). The results of the present study reveal the importance of medicinal plant species and their significant role in the health care of the inhabitants of the study area. The people of Bannu hold extensive traditional knowledge related to children's diseases. In conclusion, we recommend giving priority for further phytochemical investigation to plants that scored the highest informant consensus factor and use values, as such values can be considered good indicators of prospective plants for discovering new drugs, and may attract future generations toward traditional healing practices.
Characterization of potassium dichromate solutions for spectrophotometer calibration
NASA Astrophysics Data System (ADS)
Conceição, F. C.; Silva, E. M.; Gomes, J. F. S.; Borges, P. P.
2018-03-01
Spectrophotometric analysis in the ultraviolet (UV) region is used in the determination of several quantitative and qualitative parameters. To ensure the reliability of analyses performed on spectrophotometers, verification/calibration of the equipment must be performed periodically using certified reference materials (CRMs). This work presents the characterization stage needed to produce a potassium dichromate CRM. The property value characterized was the absorbance at wavelengths in the UV spectral region. This CRM will help guarantee the accuracy and linearity of the absorbance scale of spectrophotometers, so that analytical measurement results can be provided with metrological traceability.
On the stability and instantaneous velocity of grasped frictionless objects
NASA Technical Reports Server (NTRS)
Trinkle, Jeffrey C.
1992-01-01
A quantitative test for form closure valid for any number of contact points is formulated as a linear program, the optimal objective value of which provides a measure of how far a grasp is from losing form closure. Another contribution of the study is the formulation of a linear program whose solution yields the same information as the classical approach. The benefit of the formulation is that explicit testing of all possible combinations of contact interactions can be avoided by the algorithm used to solve the linear program.
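As a hedged illustration of a linear program of this general kind (not the paper's exact formulation), the sketch below maximizes a margin d such that the contact wrenches admit strictly positive coefficients summing to zero; a positive optimum indicates form closure when the grasp matrix has full row rank. The grasp matrix is a toy planar example with four frictionless contacts on a square, each offset from its side's midpoint.

```python
import numpy as np
from scipy.optimize import linprog

def form_closure_margin(G):
    """LP sketch: maximize d s.t. G z = 0, z_i >= d, sum(z) <= n, z >= 0, d >= 0.
    A strictly positive optimum indicates form closure (assuming rank(G) is full);
    its size serves as a margin of how far the grasp is from losing it."""
    m, n = G.shape
    c = np.zeros(n + 1); c[-1] = -1.0                   # minimize -d, i.e. maximize d
    A_eq = np.hstack([G, np.zeros((m, 1))])             # G z = 0
    b_eq = np.zeros(m)
    A_ub = np.hstack([-np.eye(n), np.ones((n, 1))])     # d - z_i <= 0
    b_ub = np.zeros(n)
    A_ub = np.vstack([A_ub, np.hstack([np.ones(n), [0.0]])])   # sum(z) <= n (normalization)
    b_ub = np.append(b_ub, n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (n + 1))
    return -res.fun if res.success else 0.0

# Toy planar grasp: columns are contact wrenches (fx, fy, torque) of four
# frictionless contacts on a unit square, offset from the side midpoints.
G = np.array([[-1.0,  1.0,  0.0,  0.0],
              [ 0.0,  0.0, -1.0,  1.0],
              [ 0.5,  0.5, -0.5, -0.5]])
print(round(form_closure_margin(G), 3))
```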
Contextual Fraction as a Measure of Contextuality.
Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane
2017-08-04
We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programing; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
Contextual Fraction as a Measure of Contextuality
NASA Astrophysics Data System (ADS)
Abramsky, Samson; Barbosa, Rui Soares; Mansfield, Shane
2017-08-01
We consider the contextual fraction as a quantitative measure of contextuality of empirical models, i.e., tables of probabilities of measurement outcomes in an experimental scenario. It provides a general way to compare the degree of contextuality across measurement scenarios; it bears a precise relationship to violations of Bell inequalities; its value, and a witnessing inequality, can be computed using linear programing; it is monotonic with respect to the "free" operations of a resource theory for contextuality; and it measures quantifiable advantages in informatic tasks, such as games and a form of measurement-based quantum computing.
Colloidal Dynamics Simulations of Rheology and Stability of Concentrated Fuel Slurries.
1987-04-10
…potential as the adsorbed polymer concentration and Hamaker constant are changed. These calculations provide quantitative evidence for the… The attractive interaction energy U(r) between particles of diameter d was taken from the expression derived by Hamaker (Eq. (31)). A value of 5.0 × 10⁻²⁰ J was used for the Hamaker constant, A. A plot of Eq. (31) is… parameter controlling the strength of the repulsive steric potential. The Hamaker constant A (Eq. (33)) is the natural choice for the attractive…
Shiel, R E; Pinilla, M; McAllister, H; Mooney, C T
2012-05-01
To assess the value of thyroid scintigraphy to determine thyroid status in dogs with hypothyroidism and various non-thyroidal illnesses. Thyroid hormone concentrations were measured and quantitative thyroid scintigraphy performed in 21 dogs with clinical and/or clinicopathological features consistent with hypothyroidism. In 14 dogs with technetium thyroidal uptake values consistent with euthyroidism, further investigations supported non-thyroidal illness. In five dogs with technetium thyroidal uptake values within the hypothyroid range, primary hypothyroidism was confirmed as the only disease in four. The remaining dog had pituitary-dependent hyperadrenocorticism. Two dogs had technetium thyroidal uptake values in the non-diagnostic range. One dog had iodothyronine concentrations indicative of euthyroidism. In the other, a dog receiving glucocorticoid therapy, all iodothyronine concentrations were decreased. Markedly asymmetric technetium thyroidal uptake was present in two dogs. All iodothyronine concentrations were within reference interval but canine thyroid stimulating hormone concentration was elevated in one. Non-thyroidal illness was identified in both cases. In dogs, technetium thyroidal uptake is a useful test to determine thyroid function. However, values may be non-diagnostic, asymmetric uptake can occur and excess glucocorticoids may variably suppress technetium thyroidal uptake and/or thyroid hormone concentrations. Further studies are necessary to evaluate quantitative thyroid scintigraphy as a gold standard method for determining canine thyroid function. © 2012 British Small Animal Veterinary Association.
The Application of a Residual Risk Evaluation Technique Used for Expendable Launch Vehicles
NASA Technical Reports Server (NTRS)
Latimer, John A.
2009-01-01
This presentation provides a Residual Risk Evaluation Technique (RRET) developed by the Kennedy Space Center (KSC) Safety and Mission Assurance (S&MA) Launch Services Division. This technique is one of many procedures used by S&MA at KSC to evaluate residual risks for each Expendable Launch Vehicle (ELV) mission. RRET is a straightforward technique that incorporates the proven methodologies of risk management, fault tree analysis, and reliability prediction. RRET derives a system reliability impact indicator from the system baseline reliability and the system residual risk reliability values. The system reliability impact indicator provides a quantitative measure of the reduction in the system baseline reliability due to the identified residual risks associated with the designated ELV mission. An example is discussed to provide insight into the application of RRET.
NASA Astrophysics Data System (ADS)
Liu, C.; Mcgovern, G. P.; Horita, J.
2015-12-01
Traditional isotope ratio mass spectrometry methods for measuring the 2H/1H and 13C/12C ratios of organic molecules provide only average isotopic values for whole molecules. During the measurement process, valuable information on position-specific isotope fractionation (PSIF) between non-equivalent H and C positions is lost, which could provide additional, very useful information about the origins and history of organic molecules. Quantitative nuclear magnetic resonance (NMR) spectrometry can measure 2H and 13C PSIF of organic molecules without destruction. The 2H and 13C signals from different positions of a given molecule show up as distinctive peaks in an NMR spectrum, and their peak areas are proportional to the 2H and 13C populations at each position. Moreover, quantitative NMR can be applied to a wide variety of organic molecules. We have been developing quantitative NMR methods to determine the 2H and 13C PSIF of light hydrocarbons (propane, butane and pentane), using J-Young and custom-made high-pressure NMR cells. With careful conditioning of the NMR spectrometer (e.g. tuning, shimming) and effective 1H-13C decoupling, precision of ± <10‰ (2H) and ± <1‰ (13C) can be readily attained after several hours of acquisition. Measurement time depends on the relaxation time of the nucleus of interest and the total number of scans needed for high signal-to-noise ratios. Our data for commercial, pure hydrocarbon samples showed that 2H PSIF in these hydrocarbons can be larger than 60‰ and that 13C PSIF can be as large as 15‰. Comparison with theoretical calculations indicates that the PSIF patterns of some hydrocarbon samples reflect non-equilibrium processes in their production.
Rohman, A; Man, Yb Che; Sismindari
2009-10-01
Today, virgin coconut oil (VCO) is becoming a valuable oil and an attractive topic for researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material that functions as a skin moisturizer and softener. It is therefore important to develop a quantitative analytical method that offers a fast and reliable technique. Fourier transform infrared (FTIR) spectroscopy with the sample handling technique of attenuated total reflectance (ATR) can be successfully used to analyze VCO quantitatively in cream cosmetic preparations. A multivariate calibration using a partial least squares (PLS) model revealed a good relationship between the actual and FTIR-predicted VCO values, with a coefficient of determination (R2) of 0.998.
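A minimal PLS calibration in the spirit of this approach can be sketched with scikit-learn; the synthetic "spectra" below stand in for ATR-FTIR absorbance values and the component count is an assumption, so this is an illustration rather than the published calibration.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(4)
n_samples, n_wavenumbers = 40, 200
true_wf = rng.uniform(0.0, 0.3, size=n_samples)         # assumed VCO content in cream
pure_spectrum = rng.random(n_wavenumbers)                # stand-in VCO reference spectrum
X = np.outer(true_wf, pure_spectrum) + 0.01 * rng.standard_normal((n_samples, n_wavenumbers))

pls = PLSRegression(n_components=3)                      # component count is an assumption
pls.fit(X, true_wf)
predicted = pls.predict(X).ravel()
print(round(r2_score(true_wf, predicted), 3))            # R^2 of actual vs predicted values
```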
Hagiwara, Akifumi; Warntjes, Marcel; Hori, Masaaki; Andica, Christina; Nakazawa, Misaki; Kumamaru, Kanako Kunishima; Abe, Osamu; Aoki, Shigeki
2017-01-01
Abstract Conventional magnetic resonance images are usually evaluated using the image signal contrast between tissues and not based on their absolute signal intensities. Quantification of tissue parameters, such as relaxation rates and proton density, would provide an absolute scale; however, these methods have mainly been performed in a research setting. The development of rapid quantification, with scan times in the order of 6 minutes for full head coverage, has provided the prerequisites for clinical use. The aim of this review article was to introduce a specific quantification method and synthesis of contrast-weighted images based on the acquired absolute values, and to present automatic segmentation of brain tissues and measurement of myelin based on the quantitative values, along with application of these techniques to various brain diseases. The entire technique is referred to as “SyMRI” in this review. SyMRI has shown promising results in previous studies when used for multiple sclerosis, brain metastases, Sturge-Weber syndrome, idiopathic normal pressure hydrocephalus, meningitis, and postmortem imaging. PMID:28257339
Health and Environmental Effects Profile for benzotrichloride
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1986-07-01
The Health and Environmental Effects Profile for benzotrichloride was prepared to support listings of hazardous constituents of a wide range of waste streams under Section 3001 of the Resource Conservation and Recovery Act (RCRA) and to provide health-related limits for emergency actions under Section 101 of the Comprehensive Environmental Response, Compensation and Liability Act (CERCLA). Both published literature and information obtained from Agency program office files were evaluated as they pertained to potential human health, aquatic life and environmental effects of hazardous waste constituents. Quantitative estimates are presented provided sufficient data are available. Benzotrichloride has been evaluated as a carcinogen. The human carcinogen potency factor for benzotrichloride is 12.63 (mg/kg/day) for oral exposure. The Reportable Quantity (RQ) value of 1, 10, 100, 1000 or 5000 pounds is used to determine the quantity of a hazardous substance for which notification is required in the event of a release as specified by CERCLA based on chronic toxicity. The RQ value for benzotrichloride is 10.
Gong, Nan-Jie; Wong, Chun-Sing; Hui, Edward S; Chan, Chun-Chung; Leung, Lam-Ming
2015-10-01
The purpose of this work was to investigate the effects of hemispheric location, gender and age on susceptibility value, as well as the association between susceptibility value and diffusional metrics, in deep gray matter. Iron content was estimated in vivo using quantitative susceptibility mapping. Microstructure was probed using diffusional kurtosis imaging. Regional susceptibility and diffusional metrics were measured for the putamen, caudate nucleus, globus pallidus, thalamus, substantia nigra and red nucleus in 42 healthy adults (age range 25-78 years). Susceptibility value was significantly higher in the left than the right side of the caudate nucleus (P = 0.043) and substantia nigra (P < 0.001). Women exhibited lower susceptibility values than men in the thalamus (P < 0.001) and red nucleus (P = 0.032). Significant age-related increases of susceptibility were observed in the putamen (P < 0.001), red nucleus (P < 0.001), substantia nigra (P = 0.004), caudate nucleus (P < 0.001) and globus pallidus (P = 0.017). The putamen exhibited the highest rate of iron accumulation with aging (slope of linear regression = 0.73 × 10(-3) ppm/year), which was nearly twice those in substantia nigra (slope = 0.40 × 10(-3) ppm/year) and caudate nucleus (slope = 0.39 × 10(-3) ppm/year). Significant positive correlations between the susceptibility value and diffusion measurements were observed for fractional anisotropy (P = 0.045) and mean kurtosis (P = 0.048) in the putamen without controlling for age. Neither correlation was significant after controlling for age. Hemisphere, gender and age-related differences in iron measurements were observed in deep gray matter. Notably, the putamen exhibited the highest rate of increase in susceptibility with aging. Correlations between susceptibility value and microstructural measurements were inconclusive. These findings could provide new clues for unveiling mechanisms underlying iron-related neurodegenerative diseases. Copyright © 2015 John Wiley & Sons, Ltd.
Bering Sea Nd isotope records of North Pacific Intermediate Water circulation
NASA Astrophysics Data System (ADS)
Rabbat, C.; Knudson, K. P.; Goldstein, S. L.
2017-12-01
North Pacific Intermediate Water (NPIW) is the primary water mass associated with Pacific meridional overturning circulation. While the relationship between Atlantic meridional overturning circulation and climate has been extensively studied, a lack of suitable sediment cores has limited past investigations of North Pacific climate and NPIW variability. Integrated Ocean Drilling Program Site U1342 (818 m water depth) on Bowers Ridge in the Bering Sea is located at a sensitive depth for detecting changes in NPIW, and it is the only available subarctic North Pacific site that offers long, continuous core recovery, relatively high sedimentation rates, excellent foraminifera preservation, and a well-constrained age model over multiple glacial-interglacial cycles. Previous work at Site U1342 by Knudson and Ravelo (2015), using non-quantitative circulation proxies, provides evidence for enhanced NPIW formation during extreme glacials associated with the closure of the Bering Strait and suggests that NPIW was formed locally within the Bering Sea. Our work builds on the potential importance of these results and applies more robust and potentially quantitative circulation proxies to constrain NPIW variability. Here, we present new records of NPIW circulation from Site U1342 based on Nd isotope analyses of fish debris and Fe-Mn encrusted foraminifera, which serve as semi-quantitative "water mass tracers." Weak Bering Sea NPIW formation and ventilation are reflected by relatively lower εNd values indicative of open subarctic North Pacific waters, which are presently predominant, whereas enhanced Bering Sea NPIW formation and ventilation would be reflected by relatively higher εNd values due to the input of Nd from regional volcanic rocks.
Quantitative analysis of ground penetrating radar data in the Mu Us Sandland
NASA Astrophysics Data System (ADS)
Fu, Tianyang; Tan, Lihua; Wu, Yongqiu; Wen, Yanglei; Li, Dawei; Duan, Jinlong
2018-06-01
Ground penetrating radar (GPR), which can reveal the sedimentary structure and development process of dunes, is widely used to evaluate aeolian landforms. Interpretations of GPR profiles are mostly based on qualitative descriptions of the geometric features of the radar reflections. This research quantitatively analyzed the waveform parameter characteristics of different radar units by extracting the amplitude and time interval parameters of GPR data in the Mu Us Sandland in China, and then identified and interpreted different sedimentary structures. The results showed that different types of radar units had specific waveform parameter characteristics. The main waveform parameter characteristics of sand dune radar facies and sandstone radar facies included low amplitudes and wide ranges of time intervals, ranging from 0 to 0.25 and 4 to 33 ns respectively, and the mean amplitudes changed gradually with time intervals. The amplitude distribution curves of the various sand dune radar facies were similar, all showing unimodal distributions. The radar surfaces showed high amplitudes with time intervals concentrated in high-value areas, ranging from 0.08 to 0.61 and 9 to 34 ns respectively, and the mean amplitudes changed drastically with time intervals. The amplitude and time interval values of lacustrine radar facies were between those of sand dune radar facies and radar surfaces, ranging from 0.08 to 0.29 and 11 to 30 ns respectively, and the mean amplitude and time interval curve was approximately trapezoidal. The quantitative extraction and analysis of GPR reflections could help distinguish various radar units and provide evidence for identifying sedimentary structure in aeolian landforms.
Gittins, Rebecca; Harrison, Paul J
2004-03-15
There are an increasing number of quantitative morphometric studies of the human cerebral cortex, especially as part of comparative investigations of major psychiatric disorders. In this context, the present study had two aims. First, to provide quantitative data regarding key neuronal morphometric parameters in the anterior cingulate cortex. Second, to compare the results of conventional Nissl staining with those observed after immunostaining with NeuN, an antibody becoming widely used as a selective neuronal marker. We stained adjacent sections of area 24b from 16 adult brains with cresyl violet or NeuN. We measured the density of pyramidal and non-pyramidal neurons, and the size and shape of pyramidal neurons, in laminae II, III, Va, Vb and VI, using two-dimensional counting methods. Strong correlations between the two modes of staining were seen for all variables. However, NeuN gave slightly higher estimates of neuronal density and size, and a more circular perikaryal shape. Brain pH was correlated with neuronal size, measured with both methods, and with neuronal shape. Age and post-mortem interval showed no correlations with any parameter. These data confirm the value of NeuN as a tool for quantitative neuronal morphometric studies in routinely processed human brain tissue. Absolute values are highly correlated between NeuN and cresyl violet stains, but cannot be interchanged. NeuN may be particularly useful when it is important to distinguish small neurons from glia, such as in cytoarchitectural studies of the cerebral cortex in depression and schizophrenia.
Pressure and cold pain threshold reference values in a large, young adult, pain-free population.
Waller, Robert; Smith, Anne Julia; O'Sullivan, Peter Bruce; Slater, Helen; Sterling, Michele; McVeigh, Joanne Alexandra; Straker, Leon Melville
2016-10-01
Currently there is a lack of large population studies that have investigated pain sensitivity distributions in healthy pain free people. The aims of this study were: (1) to provide sex-specific reference values of pressure and cold pain thresholds in young pain-free adults; (2) to examine the association of potential correlates of pain sensitivity with pain threshold values. This study investigated sex specific pressure and cold pain threshold estimates for young pain free adults aged 21-24 years. A cross-sectional design was utilised using participants (n=617) from the Western Australian Pregnancy Cohort (Raine) Study at the 22-year follow-up. The association of site, sex, height, weight, smoking, health related quality of life, psychological measures and activity with pain threshold values was examined. Pressure pain threshold (lumbar spine, tibialis anterior, neck and dorsal wrist) and cold pain threshold (dorsal wrist) were assessed using standardised quantitative sensory testing protocols. Reference values for pressure pain threshold (four body sites) stratified by sex and site, and cold pain threshold (dorsal wrist) stratified by sex are provided. Statistically significant, independent correlates of increased pressure pain sensitivity measures were site (neck, dorsal wrist), sex (female), higher waist-hip ratio and poorer mental health. Statistically significant, independent correlates of increased cold pain sensitivity measures were, sex (female), poorer mental health and smoking. These data provide the most comprehensive and robust sex specific reference values for pressure pain threshold specific to four body sites and cold pain threshold at the dorsal wrist for young adults aged 21-24 years. Establishing normative values in this young age group is important given that the transition from adolescence to adulthood is a critical temporal period during which trajectories for persistent pain can be established. These data will provide an important research resource to enable more accurate profiling and interpretation of pain sensitivity in clinical pain disorders in young adults. The robust and comprehensive data can assist interpretation of future clinical pain studies and provide further insight into the complex associations of pain sensitivity that can be used in future research. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.
Diversity of wetland plants used traditionally in China: a literature review.
Zhang, Yin; Xu, Hualin; Chen, Hui; Wang, Fei; Huai, Huyin
2014-10-15
In comparison with terrestrial plants, those growing in wetlands have been rarely studied ethnobotanically, including in China, yet people living in or near wetlands can accumulate much knowledge of the uses of local wetland plants. A characteristic of wetlands, cutting across climatic zones, is that many species are widely distributed, providing opportunities for studying general patterns of knowledge of the uses of plants across extensive areas, in the present case China. There is urgency in undertaking such studies, given the rapid rates of loss of traditional knowledge of wetland plants as is now occurring. There have been very few studies specifically on the traditional knowledge of wetland plants in China. However, much information on such knowledge does exist, but dispersed through a wide body of literature that is not specifically ethnobotanical, such as regional Floras. We have undertaken an extensive study of such literature to determine which species of wetland plants have been used traditionally and the main factors influencing patterns shown by such knowledge. Quantitative techniques have been used to evaluate the relative usefulness of different types of wetland plants and regression analyses to determine the extent to which different quantitative indices give similar results. 350 wetland plant species, belonging to 66 families and 187 genera, were found to have been used traditionally in China for a wide range of purposes. The top ten families used, in terms of numbers of species, were Poaceae, Polygonaceae, Cyperaceae, Lamiaceae, Asteraceae, Ranunculaceae, Hydrocharitaceae, Potamogetonaceae, Fabaceae, and Brassicaceae, in total accounting for 58.6% of all species used. These families often dominate wetland vegetation in China. The three most widely used genera were Polygonum, Potamogeton and Cyperus. The main uses of wetland plants, in terms of numbers of species, were for medicine, food, and forage. Three different ways of assigning an importance value to species (Relative Frequency of Citation, RFC; Cultural Importance, CI; Cultural Value Index, CV) all gave similar results. A diverse range of wetland plants, in terms of both taxonomic affiliation and type of use, have been used traditionally in China. Medicine, forage and food are the three most important categories of use, the plants providing basic resources used by local people in their everyday lives. Local availability is the main factor influencing which species are used. Quantitative indices, especially the Cultural Value Index, proved very useful for evaluating the usefulness of plants as recorded in the literature.
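For readers unfamiliar with the three indices, the sketch below computes them using standard definitions from the quantitative-ethnobotany literature; these definitions, and the toy records, are assumptions on my part rather than formulas quoted from this paper.

# Hedged sketch of RFC, CI and one common form of the Cultural Value index;
# "sources" here stand in for literature records rather than informants.
from collections import defaultdict

def indices(records, n_sources, n_use_categories):
    """records: iterable of (species, source_id, use_category) tuples."""
    cites = defaultdict(set)            # sources citing each species
    uses = defaultdict(set)             # distinct use categories per species
    use_reports = defaultdict(int)      # total use reports per species
    for sp, src, cat in records:
        cites[sp].add(src)
        uses[sp].add(cat)
        use_reports[sp] += 1
    out = {}
    for sp in cites:
        rfc = len(cites[sp]) / n_sources                       # Relative Frequency of Citation
        ci = use_reports[sp] / n_sources                       # Cultural Importance
        cv = (len(uses[sp]) / n_use_categories) * rfc * ci     # Cultural Value (one common form)
        out[sp] = (rfc, ci, cv)
    return out

demo = [("Polygonum hydropiper", 1, "medicine"), ("Polygonum hydropiper", 2, "food"),
        ("Cyperus rotundus", 1, "forage")]
print(indices(demo, n_sources=3, n_use_categories=10))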
NASA Astrophysics Data System (ADS)
Rogers, K.; Cooper, W. T.; Hodgkins, S. B.; Verbeke, B. A.; Chanton, J.
2017-12-01
Solid state direct polarization 13C NMR spectroscopy (DP-NMR) is generally considered the most quantitatively reliable method for soil organic matter (SOM) characterization, including determination of the relative abundances of carbon functional groups. These functional abundances can then be used to calculate important soil parameters such as degree of humification and extent of aromaticity that reveal differences in reactivity or compositional changes along gradients (e.g. thaw chronosequence in permafrost). Unfortunately, the DP-NMR experiment is time-consuming, with a single sample often requiring over 24 hours of instrument time. Alternatively, solid state cross polarization 13C NMR (CP-NMR) can circumvent this problem, reducing analysis times to 4-6 hours but with some loss of quantitative reliability. Attenuated Total Reflectance Fourier Transform Infrared spectroscopy (ATR-FTIR) is a quick and relatively inexpensive method for characterizing solid materials, and has been suggested as an alternative to NMR for analysis of soil organic matter and determination of humification (HI) and aromatization (AI) indices. However, the quantitative reliability of ATR-FTIR for SOM analyses has never been verified, nor have any ATR-FTIR data been compared to similar measurements by NMR. In this work we focused on FTIR vibrational bands that correspond to the three functional groups used to calculate HI and AI values: carbohydrates (1030 cm⁻¹), aromatics (1510, 1630 cm⁻¹), and aliphatics (2850, 2920 cm⁻¹). Data from ATR-FTIR measurements were compared to analogous quantitation by DP- and CP-NMR using peat samples from Sweden, Minnesota, and North Carolina. DP- and CP-NMR correlate very strongly, although the correlations are not always 1:1. Direct comparison of relative abundances of the three functional groups determined by NMR and ATR-FTIR yielded satisfactory results for carbohydrates (r² = 0.78) and aliphatics (r² = 0.58), but less so for aromatics (r² = 0.395). ATR-FTIR has to this point been used primarily for relative abundance analyses (e.g. calculating HI and AI values), but these results suggest FTIR can provide quantitative reliability that approaches that of NMR.
Robson, Philip M; Madhuranthakam, Ananth J; Smith, Martin P; Sun, Maryellen R M; Dai, Weiying; Rofsky, Neil M; Pedrosa, Ivan; Alsop, David C
2016-02-01
Renal perfusion measurements using noninvasive arterial spin-labeled (ASL) magnetic resonance imaging techniques are gaining interest. Currently, focus has been on perfusion in the context of renal transplant. Our objectives were to explore the use of ASL in patients with renal cancer, and to evaluate three-dimensional (3D) fast spin echo (FSE) acquisition, a robust volumetric imaging method for abdominal applications. We evaluate 3D ASL perfusion magnetic resonance imaging in the kidneys compared to two-dimensional (2D) ASL in patients and healthy subjects. Isotropic resolution (2.6 × 2.6 × 2.8 mm³) 3D ASL using segmented FSE was compared to 2D single-shot FSE. ASL used pseudo-continuous labeling, suppression of background signal, and synchronized breathing. Quantitative perfusion values and signal-to-noise ratio (SNR) were compared between 3D and 2D ASL in four healthy volunteers and semiquantitative assessments were made by four radiologists in four patients with known renal masses (primary renal cell carcinoma). Renal cortex perfusion in healthy subjects was 284 ± 21 mL/100 g/min, with test-retest repeatability of 8.8%. No significant differences were found between the quantitative perfusion value and SNR in volunteers between 3D ASL and 2D ASL, or in 3D ASL with synchronized or free breathing. In patients, semiquantitative assessment by radiologists showed no significant difference in image quality between 2D ASL and 3D ASL. In one case, 2D ASL missed a high perfusion focus in a mass that was seen by 3D ASL. 3D ASL renal perfusion imaging provides isotropic-resolution images, with comparable quantitative perfusion values and image SNR in similar imaging time to single-slice 2D ASL. Copyright © 2015 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Change analysis in the United Arab Emirates: An investigation of techniques
Sohl, Terry L.
1999-01-01
Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
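A minimal sketch of univariate image differencing, the simplest of the change-detection techniques listed above; the threshold rule (k standard deviations of the difference image) is an illustrative assumption, not the "enhanced" EROS procedure.

# Illustrative univariate image differencing between two co-registered bands,
# with change flagged where the difference departs from its mean by k sigma.
import numpy as np

def difference_change(band_t1, band_t2, k=2.0):
    diff = band_t2.astype(float) - band_t1.astype(float)
    mu, sigma = diff.mean(), diff.std()
    change_mask = np.abs(diff - mu) > k * sigma   # True where change is detected
    return diff, change_mask

t1 = np.random.default_rng(1).integers(0, 255, (100, 100))
t2 = t1.copy(); t2[40:60, 40:60] += 80            # simulated "greening" patch
_, mask = difference_change(t1, t2)
print("changed pixels:", int(mask.sum()))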
A Spectral Method for Color Quantitation of a Protein Drug Solution.
Swartz, Trevor E; Yin, Jian; Patapoff, Thomas W; Horst, Travis; Skieresz, Susan M; Leggett, Gordon; Morgan, Charles J; Rahimi, Kimia; Marhoul, Joseph; Kabakoff, Bruce
2016-01-01
Color is an important quality attribute for biotherapeutics. In the biotechnology industry, a visual method is most commonly utilized for color characterization of liquid drug protein solutions. The color testing method is used both for batch release and for stability testing in quality control. Using that method, an analyst visually determines the color of the sample by choosing the closest matching European Pharmacopeia reference color solution. The requirement to judge the best match makes it a subjective method. Furthermore, the visual method does not capture data on hue or chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we describe a quantitative method for color determination that greatly reduces the variability in measuring color and allows for a more precise understanding of color differences. Following color industry standards established by the International Commission on Illumination, this method converts a protein solution's visible absorption spectra to L*a*b* color space. Color matching is achieved within the L*a*b* color space, a practice that is already widely used in other industries. The work performed here is to facilitate the adoption of a quantitative spectral method and the transition from the traditional visual assessment method. We describe here the algorithm used such that the quantitative spectral method correlates with the currently used visual method. In addition, we provide the L*a*b* values for the European Pharmacopeia reference color solutions required for the quantitative method. We have determined these L*a*b* values by gravimetrically preparing and measuring multiple lots of the reference color solutions. We demonstrate that the visual assessment and the quantitative spectral method are comparable using both low- and high-concentration antibody solutions and solutions with varying turbidity. In the biotechnology industry, a visual assessment is the most commonly used method for color characterization, batch release, and stability testing of liquid protein drug solutions. Using this method, an analyst visually determines the color of the sample by choosing the closest match to a standard color series. This visual method can be subjective because it requires an analyst to make a judgment of the best match of color of the sample to the standard color series, and it does not capture data on hue and chroma that would allow for improved product characterization and the ability to detect subtle differences between samples. To overcome these challenges, we developed a quantitative spectral method for color determination that greatly reduces the variability in measuring color and allows for a more precise understanding of color differences. The details of the spectral quantitative method are described. A comparison between the visual assessment method and spectral quantitative method is presented. This study supports the transition to a quantitative spectral method from the visual assessment method for quality testing of protein solutions. © PDA, Inc. 2016.
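The conversion described (absorbance to transmittance, weighting by an illuminant and the CIE color matching functions to obtain XYZ, then XYZ to L*a*b*) can be sketched as follows; this is a generic implementation of the CIE formulas, not the authors' validated algorithm, and the matching-function and illuminant arrays are placeholders that must be filled from standard CIE tables.

# Hedged sketch: visible absorbance spectrum of a solution -> L*a*b* coordinates.
import numpy as np

def absorbance_to_lab(wavelengths, absorbance, illuminant, xbar, ybar, zbar):
    T = 10.0 ** (-absorbance)                       # transmittance of the solution
    k = 100.0 / np.trapz(illuminant * ybar, wavelengths)
    X = k * np.trapz(T * illuminant * xbar, wavelengths)
    Y = k * np.trapz(T * illuminant * ybar, wavelengths)
    Z = k * np.trapz(T * illuminant * zbar, wavelengths)
    # Reference white = the illuminant viewed without the sample
    Xn = k * np.trapz(illuminant * xbar, wavelengths)
    Yn = 100.0
    Zn = k * np.trapz(illuminant * zbar, wavelengths)

    def f(t):
        d = 6.0 / 29.0
        return np.where(t > d ** 3, np.cbrt(t), t / (3 * d ** 2) + 4.0 / 29.0)

    L = 116 * f(Y / Yn) - 16
    a = 500 * (f(X / Xn) - f(Y / Yn))
    b = 200 * (f(Y / Yn) - f(Z / Zn))
    return float(L), float(a), float(b)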
Geith, Tobias; Schmidt, Gerwin; Biffar, Andreas; Dietrich, Olaf; Dürr, Hans Roland; Reiser, Maximilian; Baur-Melnyk, Andrea
2012-11-01
The objective of our study was to compare the diagnostic value of qualitative diffusion-weighted imaging (DWI), quantitative DWI, and chemical-shift imaging in a single prospective cohort of patients with acute osteoporotic and malignant vertebral fractures. The study group was composed of patients with 26 osteoporotic vertebral fractures (18 women, eight men; mean age, 69 years; age range, 31 years 6 months to 86 years 2 months) and 20 malignant vertebral fractures (nine women, 11 men; mean age, 63.4 years; age range, 24 years 8 months to 86 years 4 months). T1-weighted, STIR, and T2-weighted sequences were acquired at 1.5 T. A DW reverse fast imaging with steady-state free precession (PSIF) sequence at different delta values was evaluated qualitatively. A DW echo-planar imaging (EPI) sequence and a DW single-shot turbo spin-echo (TSE) sequence at different b values were evaluated qualitatively and quantitatively using the apparent diffusion coefficient. Opposed-phase sequences were used to assess signal intensity qualitatively. The signal loss between in- and opposed-phase images was determined quantitatively. Two-tailed Fisher exact test, Mann-Whitney test, and receiver operating characteristic analysis were performed. Sensitivities, specificities, and accuracies were determined. Qualitative DW-PSIF imaging (delta = 3 ms) showed the best performance for distinguishing between benign and malignant fractures (sensitivity, 100%; specificity, 88.5%; accuracy, 93.5%). Qualitative DW-EPI (b = 50 s/mm² [p = 1.00]; b = 250 s/mm² [p = 0.50]) and DW single-shot TSE imaging (b = 100 s/mm² [p = 1.00]; b = 250 s/mm² [p = 0.18]; b = 400 s/mm² [p = 0.18]; b = 600 s/mm² [p = 0.39]) did not indicate significant differences between benign and malignant fractures. DW-EPI using a b value of 500 s/mm² (p = 0.01) indicated significant differences between benign and malignant vertebral fractures. Quantitative DW-EPI (p = 0.09) and qualitative opposed-phase imaging (p = 0.06) did not exhibit significant differences, whereas quantitative DW single-shot TSE imaging (p = 0.002) and quantitative chemical-shift imaging (p = 0.01) showed significant differences between benign and malignant fractures. The DW-PSIF sequence (delta = 3 ms) had the highest accuracy in differentiating benign from malignant vertebral fractures. Quantitative chemical-shift imaging and quantitative DW single-shot TSE imaging had a lower accuracy than DW-PSIF imaging because of a large overlap. Qualitative assessment of opposed-phase, DW-EPI, and DW single-shot TSE sequences and quantitative assessment of the DW-EPI sequence were not suitable for distinguishing between benign and malignant vertebral fractures.
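For context, the quantitative DWI evaluation above relies on the apparent diffusion coefficient; a generic two-point ADC calculation is sketched below with illustrative signal values, not the study's fitting procedure.

# Simple two-point ADC formula from signal intensities at two b-values (s/mm^2).
import numpy as np

def adc(signal_b_low, signal_b_high, b_low, b_high):
    """Apparent diffusion coefficient in mm^2/s."""
    return np.log(signal_b_low / signal_b_high) / (b_high - b_low)

s50, s500 = 820.0, 430.0                 # hypothetical vertebral marrow signals
print(f"ADC = {adc(s50, s500, 50, 500):.2e} mm^2/s")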
NASA Astrophysics Data System (ADS)
Kang, Kwang-Song; Hu, Nai-Lian; Sin, Chung-Sik; Rim, Song-Ho; Han, Eun-Cheol; Kim, Chol-Nam
2017-08-01
It is very important to obtain the mechanical parameters of rock mass for excavation design, support design, slope design and stability analysis of the underground structure. In order to estimate the mechanical parameters of rock mass exactly, a new method combining a geological strength index (GSI) system with intelligent displacement back analysis is proposed in this paper. Firstly, average spacing of joints (d) and rock mass block rating (RBR, a new quantitative factor), surface condition rating (SCR) and joint condition factor (Jc) are obtained for in situ rock masses using the scanline method, and the GSI values of rock masses are obtained from a new quantitative GSI chart. A correction method of GSI value is newly introduced by considering the influence of joint orientation and groundwater on rock mass mechanical properties, and then value ranges of rock mass mechanical parameters are chosen by the Hoek-Brown failure criterion. Secondly, on the basis of the measurement results of vault settlements and horizontal convergence displacements of an in situ tunnel, optimal parameters are estimated by a combination of genetic algorithm (GA) and numerical simulation analysis using FLAC3D. This method has been applied in a lead-zinc mine. By utilizing the improved GSI quantization, correction method and displacement back analysis, the mechanical parameters of the ore body, hanging wall and footwall rock mass were determined, so that reliable foundations were provided for mining design and stability analysis.
[QUALITY MEASURES IN MEDICINE-- A PLEA FOR NEW, VALUE BASED THINKING].
Fisher, Menachem; Wagner, Oded; Keinarl, Talia; Solt, Ido
2015-09-01
Quality is an important and basic attribute of complex systems in general and health systems in particular. Quality is a cornerstone of medicine, necessary in the eyes of the community of consumers, caregivers, and the systems that manage both. In Israel, the Ministry of Health has set the quality issue on the agenda of healthcare organizations in all existing frameworks. In this article we seek to offer an acceptable alternative perspective on examining the quality of public health. We suggest highlighting the ethical aspect of medical care, while reducing the quantitative monitoring component of existing quality metrics. Relying solely on indices has negative effects that might cause damage. The proposed alternative focuses on the personal responsibility of health care providers, using values and moral reasoning.
Acoustical numerology and lucky equal temperaments
NASA Astrophysics Data System (ADS)
Hall, Donald E.
1988-04-01
Equally tempered musical scales with N steps per octave are known to work especially well in approximating justly tuned intervals for such values as N=12, 19, 31, and 53. A quantitative measure of the closeness of such fits is suggested, in terms of the probabilities of coming as close to randomly chosen intervals as to the justly tuned targets. When two or more harmonic intervals are considered simultaneously, this involves a Monte Carlo evaluation of the probabilities. The results can be used to gauge how much advantage the special values of N mentioned above have over others. This article presents the rationale and method of computation, together with illustrative results in a few of the most interesting cases. References are provided to help relate these results to earlier works by music theorists.
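A rough sketch of the kind of measure described: for an N-step temperament, find the worst-case error in approximating a set of just intervals, then estimate by Monte Carlo how often randomly chosen intervals would be matched at least as well. The interval set and the use of the worst-case error are my assumptions, not necessarily the article's exact metric.

# Monte Carlo estimate of how "lucky" an N-tone equal temperament is.
import numpy as np

def cents(ratio):
    return 1200 * np.log2(ratio)

def worst_error(n, just_ratios):
    targets = cents(np.array(just_ratios))
    step = 1200.0 / n
    return max(abs(t - round(t / step) * step) for t in targets)

def luck(n, just_ratios=(3/2, 5/4, 5/3), trials=100_000, seed=0):
    rng = np.random.default_rng(seed)
    e_just = worst_error(n, just_ratios)
    step = 1200.0 / n
    random_targets = rng.uniform(0, 1200, size=(trials, len(just_ratios)))
    e_rand = np.abs(random_targets - np.round(random_targets / step) * step).max(axis=1)
    return (e_rand <= e_just).mean()   # small value = N is unusually "lucky"

for n in (12, 19, 31, 53, 17):
    print(n, round(luck(n), 4))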
Schaid, Daniel J
2010-01-01
Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
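A small illustration of the kernel idea reviewed here: a linear kernel over hypothetical genotype vectors yields a similarity matrix whose positive semidefiniteness (the requirement stated above) can be verified from its eigenvalues.

# Linear-kernel similarity matrix over simulated genotypes, with a PSD check.
import numpy as np

rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=(20, 100)).astype(float)   # 20 subjects x 100 SNPs
centered = genotypes - genotypes.mean(axis=0)

K = centered @ centered.T / centered.shape[1]    # pairwise similarity (Gram) matrix
eigvals = np.linalg.eigvalsh(K)
print("positive semidefinite:", bool(eigvals.min() >= -1e-10))
# K can then feed kernel machine score tests, mixed models (as a random-effects
# covariance), or support vector machines, as discussed in the review.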
Matsunaga, Nikita; Rogers, Donald W; Zavitsas, Andreas A
2003-04-18
Contrary to other recent reports, Pauling's original electronegativity equation, applied as Pauling specified, describes quite accurately homolytic bond dissociation enthalpies of common covalent bonds, including highly polar ones, with an average deviation of ±1.5 kcal mol⁻¹ from literature values for 117 such bonds. Dissociation enthalpies are presented for more than 250 bonds, including 79 for which experimental values are not available. Some previous evaluations of accuracy gave misleadingly poor results by applying the equation to cases for which it was not derived and for which it should not reproduce experimental values. Properly interpreted, the results of the equation provide new and quantitative insights into many facets of chemistry such as radical stabilities, factors influencing reactivity in electrophilic aromatic substitutions, the magnitude of steric effects, conjugative stabilization in unsaturated systems, rotational barriers, molecular and electronic structure, and aspects of autoxidation. A new corollary of the original equation expands its applicability and provides a rationale for previously observed empirical correlations. The equation raises doubts about a new bonding theory. Hydrogen is unique in that its electronegativity is not constant.
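For orientation, the arithmetic-mean form commonly cited as Pauling's original equation is sketched below with H-Cl as a worked example; the paper's exact formulation and reference data may differ, so treat this as an assumed illustration rather than the authors' method.

# D(A-B) ~ 0.5*[D(A-A) + D(B-B)] + 23.06*(chi_A - chi_B)^2, in kcal/mol.
def pauling_bde(d_aa, d_bb, chi_a, chi_b):
    return 0.5 * (d_aa + d_bb) + 23.06 * (chi_a - chi_b) ** 2

# Approximate literature values (kcal/mol) and Pauling electronegativities.
d_hh, d_clcl = 104.2, 58.0
chi_h, chi_cl = 2.20, 3.16
print(f"predicted D(H-Cl) = {pauling_bde(d_hh, d_clcl, chi_h, chi_cl):.1f} kcal/mol")
# The predicted value (~102 kcal/mol) lies close to the experimental H-Cl bond
# dissociation enthalpy of roughly 103 kcal/mol.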
Teaching quantitative biology: goals, assessments, and resources
Aikens, Melissa L.; Dolan, Erin L.
2014-01-01
More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425
Lasnon, Charline; Quak, Elske; Briand, Mélanie; Gu, Zheng; Louis, Marie-Hélène; Aide, Nicolas
2013-01-17
The use of iodinated contrast media in small-animal positron emission tomography (PET)/computed tomography (CT) could improve anatomic referencing and tumor delineation but may introduce inaccuracies in the attenuation correction of the PET images. This study evaluated the diagnostic performance and accuracy of quantitative values in contrast-enhanced small-animal PET/CT (CEPET/CT) as compared to unenhanced small-animal PET/CT (UEPET/CT). Firstly, a NEMA NU 4-2008 phantom (filled with 18F-FDG or 18F-FDG plus contrast media) and a homemade phantom, mimicking an abdominal tumor surrounded by water or contrast media, were used to evaluate the impact of iodinated contrast media on the image quality parameters and accuracy of quantitative values for a pertinent-sized target. Secondly, two studies in 22 abdominal tumor-bearing mice and rats were performed. The first animal experiment studied the impact of a dual-contrast media protocol, comprising the intravenous injection of a long-lasting contrast agent mixed with 18F-FDG and the intraperitoneal injection of contrast media, on tumor delineation and the accuracy of quantitative values. The second animal experiment compared the diagnostic performance and quantitative values of CEPET/CT versus UEPET/CT by sacrificing the animals after the tracer uptake period and imaging them before and after intraperitoneal injection of contrast media. There was minimal impact on IQ parameters (%SDunif and spillover ratios in air and water) when the NEMA NU 4-2008 phantom was filled with 18F-FDG plus contrast media. In the homemade phantom, measured activity was similar to true activity (-0.02%) and overestimated by 10.30% when vials were surrounded by water or by an iodine solution, respectively. The first animal experiment showed excellent tumor delineation and a good correlation between small-animal (SA)-PET and ex vivo quantification (r² = 0.87, P < 0.0001). The second animal experiment showed a good correlation between CEPET/CT and UEPET/CT quantitative values (r² = 0.99, P < 0.0001). Receiver operating characteristic analysis demonstrated better diagnostic accuracy of CEPET/CT versus UEPET/CT (senior researcher, area under the curve (AUC) 0.96 versus 0.77, P = 0.004; junior researcher, AUC 0.78 versus 0.58, P = 0.004). The use of iodinated contrast media for small-animal PET imaging significantly improves tumor delineation and diagnostic performance, without significant alteration of SA-PET quantitative accuracy and NEMA NU 4-2008 IQ parameters.
Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.
Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R
2014-08-01
To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive method for assessing ventilation- and perfusion-related information in the lungs, where the perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. The values of 10 healthy volunteers are compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
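One way to read the normalization described above is sketched below; the heart-rate scaling and the example numbers are my assumptions based on the abstract, not the authors' published pipeline.

# Hedged sketch: lung-pixel FD amplitude divided by the fully blood-filled
# aortic amplitude gives the fraction of the voxel refilled with blood each
# cardiac cycle; multiplying by heart rate gives a flow in mL/min per 100 mL.
import numpy as np

def quantify_pbf(perfusion_map, aorta_amplitude, heart_rate_bpm):
    blood_fraction = perfusion_map / aorta_amplitude        # dimensionless, 0..1
    return blood_fraction * heart_rate_bpm * 100.0           # mL/min/100 mL

fd_map = np.array([[0.02, 0.03], [0.025, 0.028]])            # hypothetical FD amplitudes
print(quantify_pbf(fd_map, aorta_amplitude=1.0, heart_rate_bpm=70))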
Efficacy and experiences of telephone counselling for informal carers of people with dementia.
Lins, Sabine; Hayder-Beichel, Daniela; Rücker, Gerta; Motschall, Edith; Antes, Gerd; Meyer, Gabriele; Langer, Gero
2014-09-01
Informal carers of people with dementia can suffer from depressive symptoms, emotional distress and other physiological, social and financial consequences. This review focuses on three main objectives: (1) to produce a quantitative review of the efficacy of telephone counselling for informal carers of people with dementia; (2) to synthesize qualitative studies to explore carers' experiences of receiving telephone counselling and counsellors' experiences of conducting telephone counselling; and (3) to integrate (1) and (2) to identify aspects of the intervention that are valued and work well, and those interventional components that should be improved or redesigned. The Cochrane Dementia and Cognitive Improvement Group's Specialized Register, The Cochrane Library, MEDLINE, MEDLINE in Process, EMBASE, CINAHL, PSYNDEX, PsycINFO, Web of Science, DIMDI databases, Springer database, Science direct and trial registers were searched on 3 May 2011 and updated on 25 February 2013. A forward citation search was conducted for included studies in Web of Science and Google Scholar. We used the Related Articles service of PubMed for included studies, contacted experts and hand-searched abstracts of five congresses. Randomised controlled trials (RCTs) or cross-over trials that compared telephone counselling for informal carers of people with dementia against no treatment, usual care or friendly calls for chatting were included for the evaluation of efficacy. Qualitative studies with qualitative methods of data collection and analysis were also included to address experiences with telephone counselling. Two authors independently screened articles for inclusion criteria, extracted data and assessed the quantitative trials with the Cochrane 'Risk of bias' tool and the qualitative studies with the Critical Appraisal Skills Program (CASP) tool. The authors conducted meta-analyses, but reported some results in narrative form due to clinical heterogeneity. The authors synthesised the qualitative data and integrated quantitative RCT data with the qualitative data. Nine RCTs and two qualitative studies were included. Six studies investigated telephone counselling without additional intervention, one study combined telephone counselling with video sessions, and two studies combined it with video sessions and a workbook. All quantitative studies had a high risk of bias in terms of blinding of participants and outcome assessment. Most studies provided no information about random sequence generation and allocation concealment. The quality of the qualitative studies ('thin descriptions') was assessed as moderate. Meta-analyses indicated a reduction of depressive symptoms for telephone counselling without additional intervention (three trials, 163 participants: standardised mean difference (SMD) 0.32, 95% confidence interval (CI) 0.01 to 0.63, P value 0.04; moderate quality evidence). The estimated effects on other outcomes (burden, distress, anxiety, quality of life, self-efficacy, satisfaction and social support) were uncertain and differences could not be excluded (burden: four trials, 165 participants: SMD 0.45, 95% CI -0.01 to 0.90, P value 0.05; moderate quality evidence; support: two trials, 67 participants: SMD 0.25, 95% CI -0.24 to 0.73, P value 0.32; low quality evidence). None of the quantitative studies included reported adverse effects or harm due to telephone counselling.
Three analytical themes (barriers and facilitators for successful implementation of telephone counselling, counsellor's emotional attitude and content of telephone counselling) and 16 descriptive themes that present the carers' needs for telephone counselling were identified in the thematic synthesis. Integration of quantitative and qualitative data shows potential for improvement. For example, no RCT reported that the counsellor provided 24-hour availability or that there was debriefing of the counsellor. Also, the qualitative studies covered a limited range of ways of performing telephone counselling. There is evidence that telephone counselling can reduce depressive symptoms for carers of people with dementia and that telephone counselling meets important needs of the carer. This result needs to be confirmed in future studies that evaluate efficacy through robust RCTs and the experience aspect through qualitative studies with rich data.
Smile line assessment comparing quantitative measurement and visual estimation.
Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie
2011-02-01
Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
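Agreement between the semiquantitative estimates and the measurement-derived categories can be expressed with Cohen's kappa, as in the sketch below; the labels are hypothetical and an unweighted kappa is used, whereas the study's exact weighting scheme may differ.

# Kappa agreement between a 3-grade visual estimate and the measured category.
from sklearn.metrics import cohen_kappa_score

visual = ["low", "average", "high", "average", "high", "low", "average", "high"]
measured = ["low", "average", "high", "high", "high", "low", "average", "average"]
print(cohen_kappa_score(visual, measured))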
Sachpekidis, Christos; Anwar, Hoda; Winkler, Julia K; Kopp-Schneider, Annette; Larribere, Lionel; Haberkorn, Uwe; Hassel, Jessica C; Dimitrakopoulou-Strauss, Antonia
2018-06-05
Immunotherapy has raised the issue of appropriate treatment response evaluation, due to the unique mechanism of action of the immunotherapeutic agents. The aim of this analysis is to evaluate the potential role of quantitative analysis of 2-deoxy-2-(18F)fluoro-D-glucose (18F-FDG) positron emission tomography/computed tomography (PET/CT) data in monitoring of patients with metastatic melanoma undergoing ipilimumab therapy. 25 patients with unresectable metastatic melanoma underwent dynamic PET/CT (dPET/CT) of the thorax and upper abdomen as well as static, whole body PET/CT with 18F-FDG before the start of ipilimumab treatment (baseline PET/CT), after two cycles of treatment (interim PET/CT) and at the end of treatment after four cycles (late PET/CT). The evaluation of dPET/CT studies was based on semi-quantitative (standardized uptake value, SUV) calculation as well as quantitative analysis, based on two-tissue compartment modeling and a fractal approach. Patients' best clinical response, assessed at a mean of 59 weeks, was used as reference. According to their best clinical response, patients were dichotomized into those demonstrating clinical benefit (CB, n = 16 patients) and those demonstrating no clinical benefit (no-CB, n = 9 patients). No statistically significant differences were observed between CB and no-CB regarding either semi-quantitative or quantitative parameters in all scans. By contrast, the application of the recently introduced PET response evaluation criteria for immunotherapy (PERCIMT) led to a correct classification rate of 84% (21/25 patients). Quantitative analysis of 18F-FDG PET data does not provide additional information in treatment response evaluation of metastatic melanoma patients receiving ipilimumab. PERCIMT criteria correlated better with clinical response.
Chen, Teresa C.
2009-01-01
Purpose: To demonstrate that video-rate spectral domain optical coherence tomography (SDOCT) can qualitatively and quantitatively evaluate optic nerve head (ONH) and retinal nerve fiber layer (RNFL) glaucomatous structural changes. To correlate quantitative SDOCT parameters with disc photography and visual fields. Methods: SDOCT images from 4 glaucoma eyes (4 patients) with varying stages of open-angle glaucoma (ie, early, moderate, late) were qualitatively contrasted with 2 age-matched normal eyes (2 patients). Of 61 other consecutive patients recruited in an institutional setting, 53 eyes (33 patients) met inclusion/exclusion criteria for quantitative studies. Images were obtained using two experimental SDOCT systems, one utilizing a superluminescent diode and the other a titanium:sapphire laser source, with axial resolutions of about 6 μm and 3 μm, respectively. Results: Classic glaucomatous ONH and RNFL structural changes were seen in SDOCT images. An SDOCT reference plane 139 μm above the retinal pigment epithelium yielded cup-disc ratios that best correlated with masked physician disc photography cup-disc ratio assessments. The minimum distance band, a novel SDOCT neuroretinal rim parameter, showed good correlation with physician cup-disc ratio assessments, visual field mean deviation, and pattern standard deviation (P values range, .0003–.024). RNFL and retinal thickness maps correlated well with disc photography and visual field testing. Conclusions: To our knowledge, this thesis presents the first comprehensive qualitative and quantitative evaluation of SDOCT images of the ONH and RNFL in glaucoma. This pilot study provides basis for developing more automated quantitative SDOCT-specific glaucoma algorithms needed for future prospective multicenter national trials. PMID:20126502
Wycherley, Thomas; Ferguson, Megan; O'Dea, Kerin; McMahon, Emma; Liberato, Selma; Brimblecombe, Julie
2016-12-01
Determine how very-remote Indigenous community (RIC) food and beverage (F&B) turnover quantities and associated dietary intake estimates derived from only stores compare with values derived from all community F&B providers. F&B turnover quantity and associated dietary intake estimates (energy, micro/macronutrients and major contributing food types) were derived from 12 months of transaction data from all F&B providers in three RICs (NT, Australia). F&B turnover quantities and dietary intake estimates from only stores (plus only the primary store in multiple-store communities) were expressed as a proportion of complete F&B provider turnover values. Food types and macronutrient distribution (%E) estimates were quantitatively compared. Combined stores F&B turnover accounted for the majority of F&B quantity (98.1%) and absolute dietary intake estimates (energy [97.8%], macronutrients [≥96.7%] and micronutrients [≥83.8%]). Macronutrient distribution estimates from combined stores and only the primary store closely aligned with complete provider estimates (≤0.9% absolute). Food types were similar using combined stores, primary store or complete provider turnover. Evaluating combined stores F&B turnover represents an efficient method to estimate total F&B turnover quantity and associated dietary intake in RICs. In multiple-store communities, evaluating only primary store F&B turnover provides an efficient estimate of macronutrient distribution and major food types. © 2016 Public Health Association of Australia.
Quantitative evaluation methods of skin condition based on texture feature parameters.
Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing
2017-03-01
In order to quantitatively evaluate the improvement of skin condition after using skin care products or undergoing beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the locations of hairy pixels on the skin are then accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and tiny hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, the four characteristic parameters, namely the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that the skin condition evaluated with this method is consistent both with biochemical skin evaluation indicators and with human visual experience. This method overcomes the shortcomings of the biochemical evaluation method, namely skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can also quantitatively evaluate the subtle improvement of skin condition after using skin care products or undergoing beauty treatments.
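An illustrative sketch of the GLCM features named above, computed with scikit-image at the four 45-degree-spaced directions; the pixel distance, gray-level count and preprocessing here are assumptions, not the paper's exact settings. Entropy is derived directly from the normalized co-occurrence matrix since scikit-image does not expose it as a built-in property.

# GLCM texture features (angular second moment, contrast, entropy, correlation).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(gray_image, levels=64):
    img = (gray_image / gray_image.max() * (levels - 1)).astype(np.uint8)
    angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
    glcm = graycomatrix(img, distances=[1], angles=angles, levels=levels,
                        symmetric=True, normed=True)
    asm = graycoprops(glcm, "ASM").mean()            # second (angular) moment
    contrast = graycoprops(glcm, "contrast").mean()
    correlation = graycoprops(glcm, "correlation").mean()
    p = glcm[:, :, 0, :]
    entropy = float(np.mean(-np.sum(p * np.log2(p + 1e-12), axis=(0, 1))))
    return {"ASM": asm, "contrast": contrast, "entropy": entropy,
            "correlation": correlation}

skin_patch = np.random.default_rng(0).integers(0, 256, (128, 128))
print(glcm_features(skin_patch))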
Griffin, M
2004-01-01
In 2002, the Parliament and Commission of the European Community agreed "minimum health and safety requirements" for the exposure of workers to the risks arising from vibration. The Directive defines qualitative requirements and also quantitative requirements in the form of "exposure action values" and "exposure limit values". The quantitative guidance is based on, but appears to conflict with, the guidance in International Standards for hand-transmitted vibration (ISO 5349) and whole-body vibration (ISO 2631). There is a large internal inconsistency within the Directive for short duration exposures to whole-body vibration: the two alternative methods give very different values. It would appear prudent to base actions on the qualitative guidance (i.e. reducing risk to a minimum) and only refer to the quantitative guidance where there is no other reasonable basis for the identification of risk (i.e. similar exposures are not a suspected cause of injury). Health surveillance and other precautions will be appropriate wherever there is reason to suspect a risk and will not be restricted to conditions where the exposure action value is exceeded. PMID:15090658
Quantitative real-time monitoring of dryer effluent using fiber optic near-infrared spectroscopy.
Harris, S C; Walker, D S
2000-09-01
This paper describes a method for real-time quantitation of the solvents evaporating from a dryer. The vapor stream in the vacuum line of a dryer was monitored in real time using a fiber optic-coupled acousto-optic tunable filter near-infrared (AOTF-NIR) spectrometer. A balance was placed in the dryer, and mass readings were recorded for every scan of the AOTF-NIR. A partial least-squares (PLS) calibration was subsequently built based on change in mass over change in time for solvents typically used in a chemical manufacturing plant. Controlling software for the AOTF-NIR was developed. The software collects spectra, builds the PLS calibration model, and continuously fits subsequently collected spectra to the calibration, allowing the operator to follow the mass loss of solvent from the dryer. The results indicate that solvent loss can be monitored and quantitated in real time using NIR for the optimization of drying times. These time-based mass loss values have also been used to calculate "dynamic" vapor density values for the solvents. The values calculated are in agreement with values determined from the ideal gas law and could prove valuable as tools to measure temperature or pressure indirectly.
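A schematic example of the calibration step described above; the spectra, mass-loss values and component count are hypothetical stand-ins rather than plant data, and the real model would use the balance readings recorded for each AOTF-NIR scan.

# PLS calibration relating NIR spectra to the instantaneous solvent mass-loss rate.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_scans, n_wavelengths = 200, 256
spectra = rng.normal(size=(n_scans, n_wavelengths))                         # stand-in NIR spectra
true_coef = rng.normal(size=n_wavelengths)
mass_loss_rate = spectra @ true_coef * 1e-3 + rng.normal(0, 0.01, n_scans)  # simulated g/s

pls = PLSRegression(n_components=5)
pls.fit(spectra, mass_loss_rate)                     # build the calibration model
new_scan = rng.normal(size=(1, n_wavelengths))
print("predicted solvent loss rate:", pls.predict(new_scan).ravel()[0])
# Dividing the predicted mass-loss rate by the measured volumetric flow in the
# vacuum line would give the "dynamic" vapor density mentioned in the abstract.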