Sample records for accurate quantitative assessment

  1. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additively manufactured implants. Micro-computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered the two modalities using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, qualitative and quantitative histomorphometric bone descriptors were directly correlated with 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also showed the importance of the location of the histological section, demonstrating that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D-printed titanium lattice implants.

  2. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  3. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. The accurate assessment of small-angle X-ray scattering data

    DOE PAGES

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; ...

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  5. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346
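The bias percentages quoted above follow from a simple relative-error calculation against the known phantom activity; a minimal sketch (the activity values below are hypothetical, not taken from the study):

```python
def percent_bias(measured, truth):
    """Relative quantitation error, in percent, against the known true activity."""
    return 100.0 * (measured - truth) / truth

# Hypothetical phantom example: true activity 100 kBq, but a reconstruction
# without attenuation compensation recovers only 82 kBq.
print(percent_bias(82.0, 100.0))  # -18.0
```

The quantitation-accuracy figures in the abstract (e.g. −18% for attenuation alone) are exactly this kind of signed relative deviation from the simulated ground truth.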

  6. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  7. Accurate Construction of Photoactivated Localization Microscopy (PALM) Images for Quantitative Measurements

    PubMed Central

    Coltharp, Carla; Kessler, Rene P.; Xiao, Jie

    2012-01-01

    Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. 
This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and

  8. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe.

    PubMed

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-09

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+,Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may lead to relatively large uncertainty in the results. Owing to efficient FRET and a background-free fluorescence signal, highly sensitive and accurate sensing has been achieved, featuring a sensitivity of 3.56 per pHi unit over the range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.

  9. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signals, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may lead to relatively large uncertainty in the results. Owing to efficient FRET and a background-free fluorescence signal, highly sensitive and accurate sensing has been achieved, featuring a sensitivity of 3.56 per pHi unit over the range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
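The self-ratiometric readout described above divides the pH-sensitive 475 nm band by the pH-insensitive 645 nm reference band. A minimal sketch of that conversion, assuming a linear ratio-vs-pH calibration over 3.0-7.0 (the slope reuses the reported 3.56-per-pH-unit sensitivity; the intercept and intensity values are hypothetical, not the paper's fitted constants):

```python
# Hypothetical self-ratiometric pH readout: the 475 nm upconversion band is
# quenched by FITC in a pH-dependent way (via FRET); the 645 nm band is the
# internal reference. SLOPE reuses the reported sensitivity; INTERCEPT is an
# assumed calibration offset, not a value from the paper.
SLOPE = 3.56      # ratio change per pHi unit
INTERCEPT = -8.0  # hypothetical offset of the linear calibration

def ph_from_intensities(i_475, i_645):
    """Invert the assumed linear calibration: ratio = SLOPE * pH + INTERCEPT."""
    ratio = i_475 / i_645
    return (ratio - INTERCEPT) / SLOPE

print(round(ph_from_intensities(9.8, 1.0), 2))  # 5.0 under these assumed constants
```

Because both bands come from the same probe under the same 980 nm excitation, the ratio cancels probe concentration and excitation fluctuations, which is what makes the readout "self-ratiometric".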

  10. The Quantitative Reasoning for College Science (QuaRCS) Assessment: Emerging Themes from 5 Years of Data

    NASA Astrophysics Data System (ADS)

    Follette, Katherine; Dokter, Erin; Buxner, Sanlyn

    2018-01-01

    The Quantitative Reasoning for College Science (QuaRCS) Assessment is a validated assessment instrument that was designed to measure changes in students' quantitative reasoning skills, attitudes toward mathematics, and ability to accurately assess their own quantitative abilities. It has been administered to more than 5,000 students at a variety of institutions at the start and end of a semester of general education college science instruction. I will begin by briefly summarizing our published work surrounding validation of the instrument and identification of underlying attitudinal factors (composite variables identified via factor analysis) that predict 50% of the variation in students' scores on the assessment. I will then discuss more recent unpublished work, including: (1) Development and validation of an abbreviated version of the assessment (The QuaRCS Light), which results in marked improvements in students' ability to maintain a high effort level throughout the assessment and has broad implications for quantitative reasoning assessments in general, and (2) Our efforts to revise the attitudinal portion of the assessment to better assess math anxiety level, another key factor in student performance on numerical assessments.

  11. Accurate single-shot quantitative phase imaging of biological specimens with telecentric digital holographic microscopy.

    PubMed

    Doblas, Ana; Sánchez-Ortiga, Emilio; Martínez-Corral, Manuel; Saavedra, Genaro; Garcia-Sucerquia, Jorge

    2014-04-01

    The advantages of using a telecentric imaging system in digital holographic microscopy (DHM) to study biological specimens are highlighted. To this end, the performances of non-telecentric DHM and telecentric DHM are evaluated from the quantitative phase imaging (QPI) point of view. The evaluated stability of the microscope allows single-shot QPI in DHM by using telecentric imaging systems. Quantitative phase maps of a section of the head of the fly Drosophila melanogaster and of red blood cells are obtained via single-shot DHM with no numerical postprocessing. With these maps we show that the use of telecentric DHM provides a larger field of view for a given magnification and permits more accurate QPI measurements with fewer computational operations.
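Quantitative phase maps such as those above are commonly converted to specimen thickness via the standard QPI relation t = λ·Δφ / (2π·Δn). This is a textbook relation rather than a detail taken from the paper, and the wavelength and refractive-index contrast below are illustrative defaults:

```python
from math import pi

def optical_thickness_um(phase_rad, wavelength_um=0.633, delta_n=0.05):
    """Standard QPI relation t = lambda * delta_phi / (2 * pi * delta_n).
    The default wavelength and refractive-index contrast are illustrative
    assumptions, not parameters from the paper."""
    return wavelength_um * phase_rad / (2 * pi * delta_n)

# A full pi phase shift at 633 nm with delta_n = 0.05 maps to about 6.33 um.
print(round(optical_thickness_um(pi), 2))
```

The accuracy claims in the abstract matter precisely because any phase error propagates linearly into such thickness (or dry-mass) estimates.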

  12. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  13. A quantitative framework for assessing ecological resilience

    EPA Science Inventory

    Quantitative approaches to measure and assess resilience are needed to bridge gaps between science, policy, and management. In this paper, we suggest a quantitative framework for assessing ecological resilience. Ecological resilience as an emergent ecosystem phenomenon can be de...

  14. Accurate virus quantitation using a Scanning Transmission Electron Microscopy (STEM) detector in a scanning electron microscope.

    PubMed

    Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G

    2017-10-01

    A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  15. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements: grass cover correlated well with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant.
For the reproducibility study, a group of 9 soil scientists and 7

  16. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women.

    PubMed

    Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M

    2012-07-01

    The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with the quantitative fat fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated, as were intra-observer and inter-observer agreement. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had fat fractions ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and the quantitative fat fraction were significantly correlated, and both techniques had excellent reproducibility; however, the clinical grading was found to overestimate muscle fat. Key points: • Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.
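Spearman correlation between the ordinal Goutallier grades and the continuous fat fraction, as used in the study, can be sketched as follows. The grade/fat-fraction pairs below are made-up illustrative data, not the study's measurements; tie-aware midranks are needed because each grade occurs repeatedly:

```python
from statistics import mean

def midranks(xs):
    """Average (mid) ranks, with tied values sharing the same rank."""
    s = sorted(xs)
    return [(2 * s.index(v) + 1 + s.count(v)) / 2 for v in xs]

def pearson(a, b):
    """Sample Pearson correlation coefficient."""
    ma, mb = mean(a), mean(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den

def spearman(a, b):
    """Spearman rho = Pearson correlation of the midranks."""
    return pearson(midranks(a), midranks(b))

# Made-up example: Goutallier grades 0-4 versus fat fraction in percent
# (the study reports rho between 0.79 and 0.88 across compartments).
grades = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
fat_fraction = [3.5, 4.1, 6.0, 7.2, 9.5, 10.1, 13.0, 14.2, 18.0, 19.0]
print(round(spearman(grades, fat_fraction), 2))
```

Rank correlation is the right choice here because the Goutallier grade is ordinal: only the ordering of grades is meaningful, not the spacing between them.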

  17. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease that has devastating conditions associated with high morbidity and mortality. Calciphylaxis is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which includes a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions using different equipment and photographers cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on that scale. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free hand photography is assessed regarding geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variations yield 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will impact a better understanding of this rare but fatal disease.
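The least-squares colour normalization described above fits a transform mapping the measured colour-pad fields to their printed reference values. As a simplified sketch, a per-channel gain/offset fit is shown below; the paper fits a full affine colour transform over all 24 fields (which also mixes channels), and the field values here are hypothetical:

```python
from statistics import mean

def fit_gain_offset(measured, reference):
    """Least-squares fit of reference ~= gain * measured + offset (one channel)."""
    mx, my = mean(measured), mean(reference)
    gain = (sum((x - mx) * (y - my) for x, y in zip(measured, reference))
            / sum((x - mx) ** 2 for x in measured))
    offset = my - gain * mx
    return gain, offset

def correct(values, gain, offset):
    """Apply the fitted correction to a measured channel."""
    return [gain * v + offset for v in values]

# Hypothetical red-channel values of five colour-pad fields:
measured  = [30.0, 80.0, 130.0, 180.0, 230.0]   # as captured by the camera
reference = [40.0, 90.0, 140.0, 190.0, 240.0]   # printed reference values
gain, offset = fit_gain_offset(measured, reference)
print(gain, offset)  # exact linear relation here: 1.0 10.0
```

Once the transform is fitted from the calibration pad, applying it to the whole photograph makes colour measurements of the necrosis comparable across cameras, lighting and centers, which is the point of the registry normalization.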

  18. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    PubMed Central

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. A continued search for better alternatives is encouraged to enhance marker-based predictions for individual quantitative traits in molecular plant breeding. PMID:28729875

  19. Knowledge gaps in host-parasite interaction preclude accurate assessment of meat-borne exposure to Toxoplasma gondii.

    PubMed

    Crotta, M; Limon, G; Blake, D P; Guitian, J

    2017-11-16

    viable cyst resulted in 1.14% and 9.97%, indicating that the uncertainty and lack of data surrounding key input parameters of the model preclude accurate estimation of T. gondii exposure through consumption of meat products. The hypothetical model conceptualized here is coherent with current knowledge of the biology of the parasite. Simulation outputs clearly identify the key gaps in our knowledge of the host-parasite interaction that, when filled, will support quantitative assessments and much-needed accurate estimates of the risk of human exposure. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    PubMed Central

    Rastogi, L.; Dash, K.; Arunachalam, J.

    2013-01-01

    The quantitative analysis of glutathione (GSH) is important in different fields such as medicine, biology, and biotechnology. Accurate quantitative measurement of this analyte has been hampered by the lack of well-characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometric sulfur content in purified GSH offers an approach for its quantitation, and calibration through an appropriate certified reference material (CRM) for sulfur provides a methodology for the certification of GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the "high performance" methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U), expressed at the 95% confidence level, varied from 0.1% to 0.3% for ICP-OES analyses, and between 0.2% and 1.2% for IC. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
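The traceability chain above rests on simple stoichiometry: each GSH molecule (C10H17N3O6S, molar mass ≈307.32 g/mol) contains exactly one sulfur atom (≈32.06 g/mol), so a traceable sulfur mass fraction converts directly to a GSH mass fraction:

```python
M_GSH = 307.32  # g/mol, glutathione (C10H17N3O6S), one sulfur atom per molecule
M_S = 32.06     # g/mol, sulfur

def gsh_from_sulfur(sulfur_mg_per_g):
    """GSH mass fraction (mg/g) implied by a traceable sulfur mass fraction,
    assuming all sulfur in the purified standard comes from GSH."""
    return sulfur_mg_per_g * M_GSH / M_S

# A perfectly pure GSH standard carries about 104.3 mg/g sulfur:
pure_sulfur = 1000.0 * M_S / M_GSH
print(round(pure_sulfur, 1))                   # 104.3
print(round(gsh_from_sulfur(pure_sulfur), 1))  # 1000.0 (round trip)
```

The assumption that no other sulfur species is present is exactly why the method is restricted to well-purified primary standards rather than routine samples.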

  21. Accurate Virus Quantitation Using a Scanning Transmission Electron Microscopy (STEM) Detector in a Scanning Electron Microscope

    DTIC Science & Technology

    2017-06-29

    Accurate Virus Quantitation Using a Scanning Transmission Electron Microscopy (STEM) Detector in a Scanning Electron Microscope Candace D Blancett1...L Norris2, Cynthia A Rossi4 , Pamela J Glass3, Mei G Sun1,* 1 Pathology Division, United States Army Medical Research Institute of Infectious...Diseases (USAMRIID), 1425 Porter Street, Fort Detrick, Maryland, 21702 2Biostatistics Division, United States Army Medical Research Institute of

  22. Exploring a new quantitative image marker to assess benefit of chemotherapy to ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Mirniaharikandehei, Seyedehnafiseh; Patil, Omkar; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin

    2017-03-01

    Accurately assessing the potential benefit of chemotherapy to cancer patients is an important prerequisite to developing precision medicine in cancer treatment. A previous study has shown that total psoas area (TPA) measured on preoperative cross-sectional CT images might be a good image marker to predict the long-term outcome of pancreatic cancer patients after surgery. However, accurate and automated segmentation of TPA from the CT image is difficult due to the fuzzy boundary or connection of TPA to other muscle areas. In this study, we developed a new interactive computer-aided detection (ICAD) scheme aiming to segment TPA from abdominal CT images more accurately and to assess the feasibility of using this new quantitative image marker to predict the benefit to ovarian cancer patients receiving Bevacizumab-based chemotherapy. The ICAD scheme identifies a CT image slice of interest located at the level of L3 (vertebral spine). The cross-sections of the right and left TPA are segmented using a set of adaptively adjusted boundary conditions, and TPA is then quantitatively measured. In addition, recent studies have suggested that muscle radiation attenuation, which reflects fat deposition in the tissue, might be a good image feature for predicting the survival rate of cancer patients. The scheme and the TPA measurement task were applied to a large national clinical trial database involving 1,247 ovarian cancer patients. By comparison with manual segmentation results, we found that the ICAD scheme yields higher accuracy and consistency for this task. The new ICAD scheme provides clinical researchers a useful tool to more efficiently and accurately extract TPA and muscle radiation attenuation as new image markers, allowing them to investigate their discriminatory power in predicting progression-free survival and/or overall survival of cancer patients before and after chemotherapy.

  3. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2017-12-01

    An accurate prognostic tool to identify the severity of Arrhythmia is yet to be established, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series were collected from the MIT-BIH database, and normal ECG time series were acquired using the POLYPARA system. Both types of time series were analyzed in the light of the non-linear approach following the method of "Rescaled Range Analysis", from which the quantitative parameter "Fractal Dimension" (D) was obtained. The major finding is that Arrhythmia ECG shows lower values of D than normal ECG. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction of prognosis, and suitable software may be developed for use in medical practice.
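
    The record above names Hurst rescaled range analysis and the fractal dimension D, but gives no implementation. As an illustration only, a minimal R/S sketch (the window sizes, the non-overlapping windowing scheme, and the standard self-affine conversion D = 2 - H are our assumptions, not details from the paper):

```python
import numpy as np

def hurst_rs(series, window_sizes):
    """Estimate the Hurst exponent H of a 1-D series by rescaled range
    (R/S) analysis, then derive a fractal dimension as D = 2 - H."""
    series = np.asarray(series, dtype=float)
    rs_means = []
    for n in window_sizes:
        rs_vals = []
        # split the series into non-overlapping windows of length n
        for start in range(0, len(series) - n + 1, n):
            w = series[start:start + n]
            dev = np.cumsum(w - w.mean())   # mean-adjusted cumulative deviate
            r = dev.max() - dev.min()       # range of the cumulative deviate
            s = w.std()                     # standard deviation of the window
            if s > 0:
                rs_vals.append(r / s)
        rs_means.append(np.mean(rs_vals))
    # H is the slope of log(R/S) against log(n)
    h, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return h, 2.0 - h                       # (Hurst exponent, fractal dimension)
```

    By construction D = 2 - H here, so the lower D reported for Arrhythmia ECG corresponds to a higher Hurst exponent; an uncorrelated series gives H near 0.5.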

  4. Quantitative Assessment of Arrhythmia Using Non-linear Approach: A Non-invasive Prognostic Tool

    NASA Astrophysics Data System (ADS)

    Chakraborty, Monisha; Ghosh, Dipak

    2018-04-01

    An accurate prognostic tool to identify the severity of Arrhythmia is yet to be established, owing to the complexity of the ECG signal. In this paper, we show that quantitative assessment of Arrhythmia is possible using a non-linear technique based on "Hurst Rescaled Range Analysis". Although the concept of applying "non-linearity" to the study of various cardiac dysfunctions is not entirely new, the novel objective of this paper is to identify the severity of the disease, to monitor different medicines and their doses, and to assess the efficiency of different medicines. The approach presented in this work is simple, which in turn will help doctors in efficient disease management. In this work, Arrhythmia ECG time series were collected from the MIT-BIH database, and normal ECG time series were acquired using the POLYPARA system. Both types of time series were analyzed in the light of the non-linear approach following the method of "Rescaled Range Analysis", from which the quantitative parameter "Fractal Dimension" (D) was obtained. The major finding is that Arrhythmia ECG shows lower values of D than normal ECG. Further, this information can be used to assess the severity of Arrhythmia quantitatively, which is a new direction of prognosis, and suitable software may be developed for use in medical practice.

  5. Three-Dimensional Photography for Quantitative Assessment of Penile Volume-Loss Deformities in Peyronie's Disease.

    PubMed

    Margolin, Ezra J; Mlynarczyk, Carrie M; Mulhall, John P; Stember, Doron S; Stahl, Peter J

    2017-06-01

    Non-curvature penile deformities are prevalent and bothersome manifestations of Peyronie's disease (PD), but the quantitative metrics that are currently used to describe these deformities are inadequate and non-standardized, presenting a barrier to clinical research and patient care. To introduce erect penile volume (EPV) and percentage of erect penile volume loss (percent EPVL) as novel metrics that provide detailed quantitative information about non-curvature penile deformities and to study the feasibility and reliability of three-dimensional (3D) photography for measurement of quantitative penile parameters. We constructed seven penis models simulating deformities found in PD. The 3D photographs of each model were captured in triplicate by four observers using a 3D camera. Computer software was used to generate automated measurements of EPV, percent EPVL, penile length, minimum circumference, maximum circumference, and angle of curvature. The automated measurements were statistically compared with measurements obtained using water-displacement experiments, a tape measure, and a goniometer. The outcomes were the accuracy of 3D photography for average measurements of all parameters compared with manual measurements, and the inter-test, intra-observer, and inter-observer reliabilities of EPV and percent EPVL measurements as assessed by the intraclass correlation coefficient. The 3D images were captured in a median of 52 seconds (interquartile range = 45-61). On average, 3D photography was accurate to within 0.3% for measurement of penile length. It overestimated maximum and minimum circumferences by averages of 4.2% and 1.6%, respectively; overestimated EPV by an average of 7.1%; and underestimated percent EPVL by an average of 1.9%. All inter-test, inter-observer, and intra-observer intraclass correlation coefficients for EPV and percent EPVL measurements were greater than 0.75, reflective of excellent methodologic reliability. By providing highly descriptive and reliable measurements of

  6. Highly Accurate Quantitative Analysis Of Enantiomeric Mixtures from Spatially Frequency Encoded 1H NMR Spectra.

    PubMed

    Plainchont, Bertrand; Pitoux, Daisy; Cyrille, Mathieu; Giraud, Nicolas

    2018-02-06

    We propose an original concept to accurately measure enantiomeric excesses on proton NMR spectra, which combines high-resolution techniques based on spatial encoding of the sample with the use of optically active, weakly orienting solvents. We show that it is possible to accurately simulate dipolar-edited spectra of enantiomers dissolved in a chiral liquid crystalline phase, and to use these simulations to calibrate integrations measured on experimental data in order to perform a quantitative chiral analysis. This approach is demonstrated on a chemical intermediate for which optical purity is an essential criterion. We find a very good correlation between the experimental and calculated integration ratios extracted from G-SERF spectra, which paves the way to a general method for the determination of enantiomeric excesses based on the observation of 1H nuclei.
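
    The final quantitative step above is computing an enantiomeric excess from calibrated signal integrations. The excess itself follows the textbook definition; a trivial sketch (the calibration of integrations against simulated dipolar-edited spectra, which is the substance of the paper, is not reproduced here, and the function name is ours):

```python
def enantiomeric_excess(i_major, i_minor):
    """Enantiomeric excess (%) from the integrated intensities of the
    resolved signals of the major and minor enantiomers."""
    return 100.0 * (i_major - i_minor) / (i_major + i_minor)
```
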

  7. Foresight begins with FMEA. Delivering accurate risk assessments.

    PubMed

    Passey, R D

    1999-03-01

    If sufficient factors are taken into account and two- or three-stage analysis is employed, failure mode and effect analysis represents an excellent technique for delivering accurate risk assessments for products and processes, and for relating them to legal liability. This article describes a format that facilitates easy interpretation.

  8. Quantitative Microbial Risk Assessment Tutorial - Primer

    EPA Science Inventory

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  9. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  10. Can blind persons accurately assess body size from the voice?

    PubMed

    Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-04-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, or congenitally blind or who had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. © 2016 The Author(s).

  11. Can blind persons accurately assess body size from the voice?

    PubMed Central

    Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-01-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20–65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, or congenitally blind or who had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. PMID:27095264

  12. Accurate assessment and identification of naturally occurring cellular cobalamins.

    PubMed

    Hannibal, Luciana; Axhemi, Armend; Glushchenko, Alla V; Moreira, Edward S; Brasch, Nicola E; Jacobsen, Donald W

    2008-01-01

    Accurate assessment of cobalamin profiles in human serum, cells, and tissues may have clinical diagnostic value. However, non-alkyl forms of cobalamin undergo beta-axial ligand exchange reactions during extraction, which leads to inaccurate profiles having little or no diagnostic value. Experiments were designed to: 1) assess beta-axial ligand exchange chemistry during the extraction and isolation of cobalamins from cultured bovine aortic endothelial cells, human foreskin fibroblasts, and human hepatoma HepG2 cells, and 2) establish extraction conditions that would provide a more accurate assessment of endogenous forms containing both exchangeable and non-exchangeable beta-axial ligands. The cobalamin profile of cells grown in the presence of [57Co]-cyanocobalamin as a source of vitamin B12 shows that the following derivatives are present: [57Co]-aquacobalamin, [57Co]-glutathionylcobalamin, [57Co]-sulfitocobalamin, [57Co]-cyanocobalamin, [57Co]-adenosylcobalamin, [57Co]-methylcobalamin, as well as other as yet unidentified corrinoids. When the extraction is performed in the presence of excess cold aquacobalamin acting as a scavenger cobalamin (i.e., "cold trapping"), the recovery of both [57Co]-glutathionylcobalamin and [57Co]-sulfitocobalamin decreases to low but consistent levels. In contrast, the [57Co]-nitrocobalamin observed in extracts prepared without excess aquacobalamin is undetectable in extracts prepared with cold trapping. This demonstrates that beta-ligand exchange occurs with non-covalently bound beta-ligands. The exception to this observation is cyanocobalamin, with its non-exchangeable CN- group. It is now possible to obtain accurate profiles of cellular cobalamins.

  13. Accurate assessment and identification of naturally occurring cellular cobalamins

    PubMed Central

    Hannibal, Luciana; Axhemi, Armend; Glushchenko, Alla V.; Moreira, Edward S.; Brasch, Nicola E.; Jacobsen, Donald W.

    2009-01-01

    Background Accurate assessment of cobalamin profiles in human serum, cells, and tissues may have clinical diagnostic value. However, non-alkyl forms of cobalamin undergo β-axial ligand exchange reactions during extraction, which leads to inaccurate profiles having little or no diagnostic value. Methods Experiments were designed to: 1) assess β-axial ligand exchange chemistry during the extraction and isolation of cobalamins from cultured bovine aortic endothelial cells, human foreskin fibroblasts, and human hepatoma HepG2 cells, and 2) establish extraction conditions that would provide a more accurate assessment of endogenous forms containing both exchangeable and non-exchangeable β-axial ligands. Results The cobalamin profile of cells grown in the presence of [57Co]-cyanocobalamin as a source of vitamin B12 shows that the following derivatives are present: [57Co]-aquacobalamin, [57Co]-glutathionylcobalamin, [57Co]-sulfitocobalamin, [57Co]-cyanocobalamin, [57Co]-adenosylcobalamin, [57Co]-methylcobalamin, as well as other as yet unidentified corrinoids. When the extraction is performed in the presence of excess cold aquacobalamin acting as a scavenger cobalamin (i.e., “cold trapping”), the recovery of both [57Co]-glutathionylcobalamin and [57Co]-sulfitocobalamin decreases to low but consistent levels. In contrast, the [57Co]-nitrocobalamin observed in extracts prepared without excess aquacobalamin is undetectable in extracts prepared with cold trapping. Conclusions This demonstrates that β-ligand exchange occurs with non-covalently bound β-ligands. The exception to this observation is cyanocobalamin, with its non-covalently bound but non-exchangeable CN− group. It is now possible to obtain accurate profiles of cellular cobalamins. PMID:18973458

  14. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  15. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Tan, Zhibin (Inventor); Mosleh, Ali (Inventor); Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Chang, Yung-Hsien (Inventor); Groen, Francisco J (Inventor); Swaminathan, Sankaran (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
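
    The analysis runs described above quantify and rank the lowest-level scenarios stored in event-tree form. A heavily simplified sketch of that style of computation (the names and the product-of-branch-probabilities model are our assumptions, not QRAS internals):

```python
def scenario_frequency(initiator_freq, branch_probs):
    """Frequency of one event-tree end state: the initiating-event
    frequency times the probabilities along the branch path."""
    f = initiator_freq
    for p in branch_probs:
        f *= p
    return f

def rank_scenarios(scenarios):
    """Rank end states by frequency, highest risk first.
    scenarios: mapping of name -> (initiator_freq, branch_probs)."""
    freqs = {name: scenario_frequency(f0, probs)
             for name, (f0, probs) in scenarios.items()}
    return sorted(freqs.items(), key=lambda kv: kv[1], reverse=True)
```
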

  16. A unique charge-coupled device/xenon arc lamp based imaging system for the accurate detection and quantitation of multicolour fluorescence.

    PubMed

    Spibey, C A; Jackson, P; Herick, K

    2001-03-01

    In recent years the use of fluorescent dyes in biological applications has dramatically increased. The continual improvement in the capabilities of these fluorescent dyes demands increasingly sensitive detection systems that provide accurate quantitation over a wide linear dynamic range. In the field of proteomics, the detection, quantitation and identification of very low abundance proteins are of extreme importance in understanding cellular processes. Therefore, the instrumentation used to acquire an image of such samples, for spot picking and identification by mass spectrometry, must be sensitive enough not only to maximise the sensitivity and dynamic range of the staining dyes but, as importantly, to adapt to the ever-changing portfolio of fluorescent dyes as they become available. Just as the available fluorescent probes are improving and evolving, so are the users' application requirements; the instrumentation chosen must therefore be flexible enough to address those changing needs. As a result, a highly competitive market for the supply and production of such dyes and the instrumentation for their detection and quantitation has emerged. The instrumentation currently available is based on either laser/photomultiplier tube (PMT) scanning or lamp/charge-coupled device (CCD) based mechanisms. This review briefly discusses the advantages and disadvantages of both system types for fluorescence imaging, gives a technical overview of CCD technology and describes in detail a unique xenon arc lamp CCD-based instrument from PerkinElmer Life Sciences. The Wallac-1442 ARTHUR is unique in its ability both to scan large areas at high resolution and to give accurate selectable excitation over the whole of the UV/visible range. It operates by filtering both the excitation and emission wavelengths, providing optimal and accurate measurement and quantitation of virtually any available dye, and allows excellent spectral resolution between different fluorophores

  17. A quantitative reconstruction software suite for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

    Quantitative single-photon emission computed tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction in hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as positron emission tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter, and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm, and a novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom, and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
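
    The reconstruction named above is OSEM with attenuation, scatter, and collimator-response corrections. A toy OSEM sketch under the simplifying assumption that all physics is folded into an explicit nonnegative system matrix (production implementations use on-the-fly projectors, and the paper's scatter-fraction estimation is not modeled here):

```python
import numpy as np

def osem(y, A, n_iter=10, n_subsets=2):
    """Ordered Subset Expectation Maximization sketch.
    y: measured projections (m,); A: system matrix (m, n), assumed to
    already fold in attenuation/scatter/collimator modelling.
    Returns the reconstructed activity estimate (n,)."""
    m, n = A.shape
    x = np.ones(n)                                   # flat initial estimate
    subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for idx in subsets:
            As, ys = A[idx], y[idx]
            fwd = As @ x                             # forward projection
            ratio = np.divide(ys, fwd, out=np.zeros_like(ys), where=fwd > 0)
            sens = As.sum(axis=0)                    # subset sensitivity image
            x = x * (As.T @ ratio) / np.where(sens > 0, sens, 1.0)
    return x
```

    For noiseless, consistent data the true activity distribution is a fixed point of every subset update, which is what the test below exploits.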

  18. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This assessment will help determine what useful quantitative and qualitative structural data may be provided from raw materials through vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or that are strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and are presented along with a description of the structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations are provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic methods. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.

  19. Quantitative assessment of upper extremities motor function in multiple sclerosis.

    PubMed

    Daunoraviciene, Kristina; Ziziene, Jurgita; Griskevicius, Julius; Pauk, Jolanta; Ovcinikova, Agne; Kizlaitiene, Rasa; Kaubrys, Gintaras

    2018-05-18

    Upper extremity (UE) motor function deficits are commonly noted in multiple sclerosis (MS) patients, and assessing them is challenging because of the lack of consensus regarding their definition. Instrumented biomechanical analysis of upper extremity movements can quantify coordination with different spatiotemporal measures and facilitate disability rating in MS patients. The aim was to identify objective quantitative parameters for more accurate evaluation of UE disability and to relate them to existing clinical scores. Thirty-four MS patients and 24 healthy controls (CG) performed a finger-to-nose test as fast as possible and, in addition to clinical evaluation, kinematic parameters of the UE were measured using inertial sensors. Generally, a higher disability score was associated with an increase in several temporal parameters, such as slower task performance. The time taken to touch the nose was longer when the task was performed with eyes closed. Time to peak angular velocity changed significantly in MS patients (EDSS > 5.0), and inter-joint coordination decreased significantly in MS patients (EDSS 3.0-5.5). Spatial parameters indicated that the maximal ROM changes were in elbow flexion. Our findings reveal that spatiotemporal parameters are related to UE motor function and MS disability level. Moreover, they facilitate clinical rating by supporting clinical decisions with quantitative data.

  20. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  1. Physiologic Basis for Understanding Quantitative Dehydration Assessment

    DTIC Science & Technology

    2012-01-01

    Perspective: Physiologic basis for understanding quantitative dehydration assessment. Samuel N Cheuvront, Robert W Kenefick, Nisha Charkoudian, and Michael N Sawka. ABSTRACT: Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance... review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration...

  2. Quantitative assessment of 12-lead ECG synthesis using CAVIAR.

    PubMed

    Scherer, J A; Rubel, P; Fayn, J; Willems, J L

    1992-01-01

    The objective of this study is to assess the performance of patient-specific segment-specific (PSSS) synthesis of QRST complexes using CAVIAR, a new method for the serial comparison of electrocardiograms and vectorcardiograms. A collection of 250 multi-lead recordings from the Common Standards for Quantitative Electrocardiography (CSE) diagnostic pilot study is employed. QRS and ST-T segments are independently synthesized using the PSSS algorithm so that the mean-squared error between the original and estimated waveforms is minimized. CAVIAR compares the recorded and synthesized QRS and ST-T segments and calculates the mean-quadratic deviation as a measure of error. The results of this study indicate that estimated QRS complexes are good representatives of their recorded counterparts, and that the integrity of the spatial information is maintained by the PSSS synthesis process. Analysis of the ST-T segments suggests that the deviations between recorded and synthesized waveforms are considerably greater than those associated with the QRS complexes. The poorer performance of the ST-T segments is attributed to magnitude normalization of the spatial loops, low-voltage passages, and noise interference. Using the mean-quadratic deviation and CAVIAR as methods of performance assessment, this study indicates that the PSSS-synthesis algorithm accurately maintains the signal information within the 12-lead electrocardiogram.
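
    PSSS synthesis is described as minimizing the mean-squared error between original and estimated waveforms, which for linear lead synthesis is ordinary least squares. A minimal sketch (the choice of predictor leads and the per-segment handling here are assumptions, not the paper's exact formulation):

```python
import numpy as np

def fit_synthesis_coeffs(measured, target):
    """Least-squares coefficients reconstructing a target lead from a set
    of measured leads (patient-specific fit over one segment).
    measured: (n_samples, n_leads); target: (n_samples,)."""
    coeffs, *_ = np.linalg.lstsq(measured, target, rcond=None)
    return coeffs

def synthesize(measured, coeffs):
    """Apply the fitted coefficients to synthesize the target lead."""
    return measured @ coeffs
```
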

  3. Quantitative Assessment of Eye Phenotypes for Functional Genetic Studies Using Drosophila melanogaster

    PubMed Central

    Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S.; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R.; Isaacs, Adrian M.; Partridge, Linda; Lu, Bingwei; Kumar, Justin P.; Girirajan, Santhosh

    2016-01-01

    About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292
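
    Flynotyper's phenotypic score quantifies the disorderliness of the ommatidial arrangement, but the abstract does not give the formula. One illustrative, hypothetical disorder measure over detected ommatidial centres (not Flynotyper's actual metric):

```python
import numpy as np

def disorder_score(centers):
    """Illustrative disorder measure: coefficient of variation of
    nearest-neighbour distances between ommatidial centres. Zero for a
    perfectly regular lattice, larger for disordered arrangements."""
    pts = np.asarray(centers, dtype=float)
    # pairwise distance matrix, with the diagonal masked out
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = d.min(axis=1)                  # nearest-neighbour distance per centre
    return nn.std() / nn.mean()
```
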

  4. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  5. Accurate quantitation of D+ fetomaternal hemorrhage by flow cytometry using a novel reagent to eliminate granulocytes from analysis.

    PubMed

    Kumpel, Belinda; Hazell, Matthew; Guest, Alan; Dixey, Jonathan; Mushens, Rosey; Bishop, Debbie; Wreford-Bush, Tim; Lee, Edmond

    2014-05-01

    Quantitation of fetomaternal hemorrhage (FMH) is performed to determine the dose of prophylactic anti-D (RhIG) required to prevent D immunization of D- women. Flow cytometry (FC) is the most accurate method. However, maternal white blood cells (WBCs) can give high background by binding anti-D nonspecifically, compromising accuracy. Sixty-nine maternal blood samples were sent for FC quantitation of FMH after positive Kleihauer-Betke test (KBT) analysis and RhIG administration. Reagents used were BRAD-3-fluorescein isothiocyanate (FITC; anti-D), AEVZ5.3-FITC (anti-varicella zoster [anti-VZ], negative control), anti-fetal hemoglobin (HbF)-FITC, and the blended two-color reagents BRAD-3-FITC/anti-CD45-phycoerythrin (PE; anti-D/L) and BRAD-3-FITC/anti-CD66b-PE (anti-D/G). PE-positive WBCs were eliminated from analysis by gating. Full blood counts were performed on maternal samples and female donors. Elevated numbers of neutrophils were present in 80% of patients, and red blood cell (RBC) indices varied widely in maternal blood. D+ FMH values obtained with anti-D/L, anti-D/G, and anti-HbF-FITC were very similar (r = 0.99, p < 0.001). Correlation between KBT and anti-HbF-FITC FMH results was low (r = 0.716). Inaccurate FMH quantitation using the current method (anti-D minus anti-VZ) occurred with 71% of samples having less than 15 mL of D+ FMH (RBCs), and insufficient RhIG was calculated for 9%. Using the two-color reagents and anti-HbF-FITC, approximately 30% of patients had elevated F cells, 26% had no fetal cells, 6% had D- FMH, 26% had 4 to 15 mL of D+ FMH, and 12% had more than 15 mL of D+ FMH (RBCs), requiring more than 300 μg of RhIG. Without accurate quantitation of D+ FMH by FC, some women would receive inappropriate or inadequate anti-D prophylaxis; the latter may be at risk of immunization leading to hemolytic disease of the newborn. © 2013 American Association of Blood Banks.

  6. Quantitative cervical vertebral maturation assessment in adolescents with normal occlusion: a mixed longitudinal study.

    PubMed

    Chen, Li-Li; Xu, Tian-Min; Jiang, Jiu-Hui; Zhang, Xing-Zhong; Lin, Jiu-Xiang

    2008-12-01

The purpose of this study was to establish a quantitative cervical vertebral maturation (CVM) system for adolescents with normal occlusion. Mixed longitudinal data were used. The subjects included 87 children and adolescents from 8 to 18 years old with normal occlusion (32 boys, 55 girls) selected from 901 candidates. Sequential lateral cephalograms and hand-wrist films were taken once a year for 6 years. The lateral cephalograms of all subjects were divided into 11 maturation groups according to the Fishman skeletal maturity indicators. The morphologic characteristics of the second, third, and fourth cervical vertebrae at 11 developmental stages were measured and analyzed. Three characteristic parameters (H4/W4, AH3/PH3, @2) were selected to determine the classification of CVM. With 3 morphologic variables, the quantitative CVM system including 4 maturational stages was established. An equation that can accurately estimate the maturation of the cervical vertebrae was established: CVM stage = -4.13 + 3.57 × H4/W4 + 4.07 × AH3/PH3 + 0.03 × @2. The quantitative CVM method is an efficient, objective, and relatively simple approach to assess the level of skeletal maturation during adolescence.
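Applied directly, the regression gives a continuous stage estimate. The parameter names below are descriptive stand-ins for the measurements defined in the paper, and '@2' is kept as an opaque third parameter whose exact definition the paper gives:

```python
def cvm_stage(h4_w4, ah3_ph3, p2):
    """Quantitative CVM stage from the abstract's regression:
        stage = -4.13 + 3.57*(H4/W4) + 4.07*(AH3/PH3) + 0.03*(@2).
    h4_w4 and ah3_ph3 are vertebral-body ratios; p2 stands in for the
    '@2' parameter defined in the original paper."""
    return -4.13 + 3.57 * h4_w4 + 4.07 * ah3_ph3 + 0.03 * p2
```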

  7. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines microanalytical techniques that allow the analysis of major and minor elements at high spatial resolution (e.g., electron microbeam analysis) with 2D mapping of samples, providing unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large-area X-ray element maps obtained with an energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full-thin-section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used not only to accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section) but, critically, to enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps, and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
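Once each pixel of the element map has been assigned a phase, the mineral-mode step reduces to pixel counting. A minimal sketch of that step (this is an illustration of the idea, not the Quack API):

```python
import numpy as np

def mineral_modes(phase_map):
    """Modal abundances (area %) from a classified element map in which
    each pixel holds a phase label."""
    labels, counts = np.unique(phase_map, return_counts=True)
    total = phase_map.size
    return {str(lab): 100.0 * c / total for lab, c in zip(labels, counts)}

# Example: a tiny 2x2 "map" with two phases (olivine, plagioclase)
modes = mineral_modes(np.array([["ol", "ol"], ["plag", "ol"]]))
```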

  8. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

Polarization-sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of phase retardation (or birefringence) images introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependence of phase retardation and birefringence on the SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we developed a maximum a posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes the stochastic property of SNR into account. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of birefringence and SNR. The PDF was pre-computed by a Monte Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  9. A High Resolution/Accurate Mass (HRAM) Data-Dependent MS3 Neutral Loss Screening, Classification, and Relative Quantitation Methodology for Carbonyl Compounds in Saliva

    NASA Astrophysics Data System (ADS)

    Dator, Romel; Carrà, Andrea; Maertens, Laura; Guidolin, Valeria; Villalta, Peter W.; Balbo, Silvia

    2017-04-01

Reactive carbonyl compounds (RCCs) are ubiquitous in the environment and are generated endogenously as a result of various physiological and pathological processes. These compounds can react with biological molecules, inducing deleterious processes believed to be at the basis of their toxic effects. Several of these compounds are implicated in neurotoxic processes, aging disorders, and cancer. Therefore, a method characterizing exposures to these chemicals will provide insights into how they may influence overall health and contribute to disease pathogenesis. Here, we have developed a high resolution accurate mass (HRAM) screening strategy allowing simultaneous identification and relative quantitation of DNPH-derivatized carbonyls in human biological fluids. The screening strategy involves the diagnostic neutral loss of a hydroxyl radical triggering MS3 fragmentation, which is observed only in the positive ionization mode of DNPH-derivatized carbonyls. Unique fragmentation pathways were used to develop a classification scheme for characterizing known and unanticipated/unknown carbonyl compounds present in saliva. Furthermore, a relative quantitation strategy was implemented to assess variations in the levels of carbonyl compounds before and after exposure using deuterated d3-DNPH. This relative quantitation method was tested on human samples before and after exposure to specific amounts of alcohol. Nano-electrospray ionization (nano-ESI) in positive mode afforded excellent sensitivity, with on-column detection limits at high-attomole levels. To the best of our knowledge, this is the first report of a method using HRAM neutral loss screening of carbonyl compounds. In addition, the method allows simultaneous characterization and relative quantitation of DNPH-derivatized compounds using nano-ESI in positive mode.
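The diagnostic trigger is a fixed mass difference between precursor and fragment. A sketch of that screening test, assuming singly charged ions (the 10-ppm tolerance is an illustrative choice, not the paper's setting):

```python
OH_RADICAL = 17.00274  # monoisotopic mass of the hydroxyl radical, Da

def has_oh_neutral_loss(precursor_mz, fragment_mzs, tol_ppm=10.0):
    """Flag an MS2 spectrum whose fragment list contains the diagnostic
    loss of a hydroxyl radical from the precursor.  Singly charged ions
    are assumed; this illustrates the screening logic, not the vendor
    data-dependent acquisition software."""
    tol = precursor_mz * tol_ppm * 1e-6
    return any(abs((precursor_mz - f) - OH_RADICAL) <= tol for f in fragment_mzs)
```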

  10. Quantitative assessment of hematopoietic chimerism by quantitative real-time polymerase chain reaction of sequence polymorphism systems after hematopoietic stem cell transplantation.

    PubMed

    Qin, Xiao-ying; Li, Guo-xuan; Qin, Ya-zhen; Wang, Yu; Wang, Feng-rong; Liu, Dai-hong; Xu, Lan-ping; Chen, Huan; Han, Wei; Wang, Jing-zhi; Zhang, Xiao-hui; Li, Jin-lan; Li, Ling-di; Liu, Kai-yan; Huang, Xiao-jun

    2011-08-01

Analysis of changes in recipient and donor hematopoietic cell origin is extremely useful to monitor the effect of hematopoietic stem cell transplantation (HSCT) and sequential adoptive immunotherapy by donor lymphocyte infusions. We developed a sensitive, reliable, and rapid real-time PCR method based on sequence polymorphism systems to quantitatively assess hematopoietic chimerism after HSCT. A panel of 29 selected sequence polymorphism (SP) markers was screened by real-time PCR in 101 HSCT patients with leukemia and other hematological diseases. The chimerism kinetics of bone marrow samples of 8 HSCT patients in remission and relapse situations were followed longitudinally. Recipient genotype discrimination was possible in 97.0% (98 of 101), with a mean of 2.5 (range, 1-7) informative markers per recipient/donor pair. Using serial dilutions of plasmids containing specific SP markers, a linear correlation (r) of 0.99, slopes between -3.2 and -3.7, and a sensitivity of 0.1% proved reproducible. With this method, it was possible to detect autologous signals very accurately in the range from 0.1% to 30%. The accuracy of the method in the very important range of autologous signals below 5% was extraordinarily high (standard deviation <1.85%), which might significantly improve detection accuracy of changes in autologous signals early in the post-transplantation course of follow-up. The main advantage of the real-time PCR method over short tandem repeat PCR chimerism assays is the absence of PCR competition and plateau biases, with demonstrated greater sensitivity and linearity. Finally, we prospectively analyzed bone marrow samples of 8 patients who received allografts and presented the chimerism kinetics of remission and relapse situations, illustrating the sensitivity level and the promising clinical application of this method.
This SP-based real-time PCR assay provides a rapid, sensitive, and accurate quantitative assessment of mixed chimerism that can
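The standard-curve arithmetic behind the reported slopes (-3.2 to -3.7) and sensitivity can be sketched as follows. The default slope and intercept are placeholders for illustration, not values from the study:

```python
def amplification_efficiency(slope):
    """PCR efficiency implied by a standard-curve slope; a slope near
    -3.32 corresponds to ~100% efficiency."""
    return 10 ** (-1.0 / slope) - 1.0

def quantity_from_ct(ct, slope=-3.4, intercept=40.0):
    """Copy number from the standard curve Ct = slope*log10(q) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def percent_autologous(ct_recipient_marker, ct_total_control,
                       slope=-3.4, intercept=40.0):
    """Recipient (autologous) signal as a percentage of total DNA,
    assuming the same illustrative curve applies to both targets."""
    q_rec = quantity_from_ct(ct_recipient_marker, slope, intercept)
    q_tot = quantity_from_ct(ct_total_control, slope, intercept)
    return 100.0 * q_rec / q_tot
```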

  11. Physiologic basis for understanding quantitative dehydration assessment.

    PubMed

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.

  12. Quantitative analysis of naphthenic acids in water by liquid chromatography-accurate mass time-of-flight mass spectrometry.

    PubMed

    Hindle, Ralph; Noestheden, Matthew; Peru, Kerry; Headley, John

    2013-04-19

This study details the development of a routine method for quantitative analysis of oil sands naphthenic acids, a complex class of compounds found naturally and as contaminants in oil sands process waters from Alberta's Athabasca region. Expanding beyond classical naphthenic acids (CnH2n-zO2), compounds conforming to the formula CnH2n-zOx (where 2 ≤ x ≤ 4) were examined in commercial naphthenic acid and environmental water samples. HPLC facilitated a five-fold reduction in ion suppression compared to the more commonly used flow injection analysis. A comparison of 39 model naphthenic acids revealed significant variability in response factors, demonstrating the necessity of using naphthenic acid mixtures for quantitation rather than model compounds. It was also demonstrated that naphthenic acid heterogeneity (commercial and environmental) necessitates establishing a single NA mix as the standard against which all quantitation is performed. The authors present the first ISO 17025-accredited method for the analysis of naphthenic acids in water using HPLC high-resolution accurate-mass time-of-flight mass spectrometry. The method detection limit was 1 mg/L total oxy-naphthenic acids (Sigma technical mix). Copyright © 2013 Elsevier B.V. All rights reserved.
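The CnH2n-zOx formula family maps directly onto exact-mass targets for the TOF instrument. A sketch of the mass arithmetic (the species used in the example is an illustrative choice):

```python
# Monoisotopic atomic masses in Da
MASS = {"C": 12.0, "H": 1.0078250319, "O": 15.9949146221}

def naphthenic_acid_mass(n, z, x):
    """Neutral monoisotopic mass of a CnH(2n-z)Ox species.  The [M-H]-
    ion observed in negative-mode ESI is this value minus the mass of
    a proton (~1.00728 Da)."""
    return n * MASS["C"] + (2 * n - z) * MASS["H"] + x * MASS["O"]

# Example: a classical naphthenic acid C12H22O2 (n=12, z=2, x=2)
mass = naphthenic_acid_mass(12, 2, 2)
```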

  13. Accurate ECG diagnosis of atrial tachyarrhythmias using quantitative analysis: a prospective diagnostic and cost-effectiveness study.

    PubMed

    Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M

    2010-11-01

Quantitative ECG Analysis. Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found that 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity; these were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but diagnosed correctly by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved by the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in average cost savings of $1,303 and a gain of 0.007 quality-adjusted life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.

  14. Analytical method for the accurate determination of trichothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyldeoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three experiments were performed for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three levels of spiking concentration of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying an MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively. The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are

  15. Assessing the reporting of categorised quantitative variables in observational epidemiological studies.

    PubMed

    Mabikwa, Onkabetse V; Greenwood, Darren C; Baxter, Paul D; Fleming, Sarah J

    2017-03-14

One aspect to consider when reporting results of observational studies in epidemiology is how quantitative risk factors are analysed. The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines recommend that researchers describe how they handle quantitative variables when analysing data. For categorised quantitative variables, the authors are required to provide reasons and justifications informing their practice. We investigated and assessed the practices and reporting of categorised quantitative variables in epidemiology. The assessment was based on five medical journals that publish epidemiological research. Observational studies published between April and June 2015 and investigating the relationships between quantitative exposures (or risk factors) and the outcomes were considered for assessment. A standard form was used to collect the data, and the reporting patterns amongst eligible studies were quantified and described. Out of 61 articles assessed for eligibility, 23 observational studies were included in the assessment. Categorisation of quantitative exposures occurred in 61% of these studies, and reasons informing the practice were rarely provided. Only one article explained the choice of categorisation in the analysis. Transformation of quantitative exposures into four or five groups was common and dominant amongst studies using equally spaced categories. Dichotomisation was not popular; the practice featured in one article. Overall, the majority (86%) of the studies preferred ordered or arbitrary group categories. Other criteria used to set category boundaries were based on established guidelines such as consensus statements and WHO standards. Categorisation of continuous variables remains a dominant practice in epidemiological studies. The reasons informing the practice of categorisation within published work are limited and remain unknown in most articles.
The existing STROBE guidelines could provide stronger

  16. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike those of its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear, the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for
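The multistage functional form itself is compact. A sketch of the dose-response and the extra-risk quantity regulators extrapolate (the coefficients in the example are arbitrary illustrations; the LMS upper-confidence-bound fitting step is not shown):

```python
import math

def multistage_prob(dose, q):
    """Multistage model: P(d) = 1 - exp(-(q0 + q1*d + q2*d^2 + ...)),
    with q a list of non-negative coefficients [q0, q1, ...]."""
    poly = sum(qk * dose ** k for k, qk in enumerate(q))
    return 1.0 - math.exp(-poly)

def extra_risk(dose, q):
    """Extra risk over background: A(d) = (P(d) - P(0)) / (1 - P(0)).
    At low doses this is approximately q1*d, the linear slope whose
    upper bound the LMS procedure reports."""
    p0 = multistage_prob(0.0, q)
    return (multistage_prob(dose, q) - p0) / (1.0 - p0)
```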

  17. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a six-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
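The regression step is ordinary least squares on a dilution series of SIS-normalized peak-area ratios. A minimal sketch of that calculation (the Qualis-SIS software wraps far more than this):

```python
import statistics

def calibration(concentrations, responses):
    """Least-squares line response = m*conc + b fitted to a standard
    dilution series of SIS-normalized peak-area ratios."""
    mx = statistics.fmean(concentrations)
    my = statistics.fmean(responses)
    m = sum((x - mx) * (y - my) for x, y in zip(concentrations, responses)) / \
        sum((x - mx) ** 2 for x in concentrations)
    b = my - m * mx
    return m, b

def quantify(response, m, b):
    """Back-calculate an unknown's concentration from its area ratio."""
    return (response - b) / m
```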

  18. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography

    NASA Astrophysics Data System (ADS)

    Mobberley, Sean David

Accurate, cross-scanner assessment of in-vivo air density, used to quantitatively assess the amount and distribution of emphysema in COPD subjects, has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how quantitative measures of lung density compare between dual-source and single-source scan modes. This study sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N=6), swine (N=13; more human-like rib cage shape), a lung phantom, and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water, and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU respectively.
When using image data obtained in the SS mode the air CT numbers demonstrated a consistent positive shift of up to 35 HU

  19. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent in successive steps of the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
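As a toy illustration of the discrete-event idea, a single FIFO step served at a fixed rate already produces the mutual dependence of successive storage times that the authors highlight. This sketch is far simpler than a full QMRA chain model:

```python
def fifo_storage_times(arrivals, service_time):
    """Time each product spends in one FIFO step (e.g. a shelf cleared
    at a fixed rate).  Because each product must wait for those ahead
    of it, successive storage times are mutually dependent, which is
    exactly what aggregate storage-time distributions miss."""
    times, free_at = [], 0.0
    for t in sorted(arrivals):
        start = max(t, free_at)   # wait until the step is free
        free_at = start + service_time
        times.append(free_at - t)  # total residence time of this product
    return times
```

Three products arriving together spend 1, 2, and 3 time units in the step, while well-spaced arrivals all spend exactly one unit; the tail of the residence-time distribution comes from the queueing, not from the service time itself.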

  20. Light-propagation management in coupled waveguide arrays: Quantitative experimental and theoretical assessment from band structures to functional patterns

    NASA Astrophysics Data System (ADS)

    Moison, Jean-Marie; Belabas, Nadia; Levenson, Juan Ariel; Minot, Christophe

    2012-09-01

    We assess the band structure of arrays of coupled optical waveguides both by ab initio calculations and by experiments, with an excellent quantitative agreement without any adjustable physical parameter. The band structures we obtain can deviate strongly from the expectations of the standard coupled mode theory approximation, but we describe them efficiently by a few parameters within an extended coupled mode theory. We also demonstrate that this description is in turn a firm and simple basis for accurate beam management in functional patterns of coupled waveguides, in full accordance with their design.
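The standard coupled-mode description that the authors extend can be sketched numerically: a single-guide excitation spreads as discrete diffraction while total power is conserved. This is the generic textbook model, not the extended theory or parameters of the paper:

```python
import numpy as np

def propagate(u0, coupling, z):
    """Evolve modal amplitudes of an array of identical, nearest-neighbor
    coupled waveguides under standard coupled-mode theory,
        -i du_n/dz = C (u_{n-1} + u_{n+1}),
    solved exactly via the eigendecomposition of the symmetric
    coupling matrix."""
    n = len(u0)
    A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
    w, V = np.linalg.eigh(A)
    return V @ (np.exp(1j * coupling * z * w) * (V.conj().T @ u0))

# Light injected into the central guide of a 41-guide array spreads as
# discrete diffraction; the evolution is unitary, so power is conserved.
u0 = np.zeros(41, dtype=complex)
u0[20] = 1.0
u = propagate(u0, coupling=1.0, z=5.0)
```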

  1. Using digital photography in a clinical setting: a valid, accurate, and applicable method to assess food intake.

    PubMed

    Winzer, Eva; Luger, Maria; Schindler, Karin

    2018-06-01

Regular monitoring of food intake is hardly integrated in clinical routine. Therefore, the aim was to examine the validity, accuracy, and applicability of an appropriate, quick, and easy-to-use tool for recording food intake in a clinical setting. Two digital photography methods, the postMeal method with a picture after the meal and the pre-postMeal method with a picture before and after the meal, and the visual estimation method (plate diagram; PD) were compared against the reference method (weighed food records; WFR). A total of 420 dishes from lunch (7 weeks) were estimated with both photography methods and the visual method. Validity, applicability, accuracy, and precision of the estimation methods, and additionally food waste, macronutrient composition, and energy content, were examined. Tests of validity revealed stronger correlations for the photography methods (postMeal: r = 0.971, p < 0.001; pre-postMeal: r = 0.995, p < 0.001) than for the visual estimation method (r = 0.810; p < 0.001). The pre-postMeal method showed smaller variability (bias < 1 g) and smaller overestimation and underestimation, and it accurately and precisely estimated portion sizes in all food items. Furthermore, total food waste was 22% for lunch over the study period; the highest food waste was observed in salads and the lowest in desserts. The pre-postMeal digital photography method is valid, accurate, and applicable for monitoring food intake in a clinical setting, enabling quantitative and qualitative dietary assessment. Thus, nutritional care might be initiated earlier. The method might also be advantageous for quantitative and qualitative evaluation of food waste, with a resulting reduction in costs.

  2. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves identification of the rock slide risk components, which are hazard, elements at risk, and vulnerability. For a quantitative or semi-quantitative risk assessment of rock slides, a mathematical value for the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, providing better decision support and easier communication for decision makers. A quantitative/semi-quantitative risk assessment procedure involves danger identification, hazard assessment, identification of elements at risk, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analysis, hazard assessment, analysis of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from Mount Ramnefjell into Lake Loen are considered one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in western Norway. Mount Ramnefjell is heavily jointed, leading to the formation of vertical rock slices 400-450 m high and 7-10 m wide. These slices threaten the settlements around the Loen Valley and the tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. In the past, several rock slides were recorded from Mount Ramnefjell between 1905 and 1950.
Among them
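
The risk computation step of such a procedure can be sketched as the standard product of hazard, vulnerability and value of elements at risk. This is a generic illustration, not code from the paper; all figures are invented.

```python
def rock_slide_risk(hazard_prob: float, vulnerability: float, elements_value: float) -> float:
    """Annualized risk = hazard probability x vulnerability x value of elements at risk."""
    if not (0.0 <= hazard_prob <= 1.0 and 0.0 <= vulnerability <= 1.0):
        raise ValueError("hazard probability and vulnerability must lie in [0, 1]")
    return hazard_prob * vulnerability * elements_value

# Example (invented figures): a 1/100 annual slide probability, vulnerability 0.5,
# and exposed assets valued at 2,000,000 give an annualized risk of about 10,000.
print(rock_slide_risk(0.01, 0.5, 2_000_000))
```

A quantitative value like this is what makes the computed rock slide risk directly comparable with the risk of other natural or human-made hazards.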

  3. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both the economic viability of, and the uncertainty associated with, the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all possible errors in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors because, given the dominance of tonnage, deposit models are the best known predictor of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found already, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can serve as training tracts. Cover has a profound effect on uncertainty and on assessment methods and procedures because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed at the surface; these will need to be relearned from covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types occur in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral

  4. Capillary nano-immunoassays: advancing quantitative proteomics analysis, biomarker assessment, and molecular diagnostics.

    PubMed

    Chen, Jin-Qiu; Wakefield, Lalage M; Goldstein, David J

    2015-06-06

    There is an emerging demand for the use of molecular profiling to facilitate biomarker identification and development, and to stratify patients for more efficient treatment decisions with reduced adverse effects. In the past decade, great strides have been made to advance genomic, transcriptomic and proteomic approaches to address these demands. While there has been much progress with these large-scale approaches, profiling at the protein level still faces challenges due to limitations in clinical sample size, poor reproducibility, unreliable quantitation and lack of assay robustness. A novel automated capillary nano-immunoassay (CNIA) technology has been developed. This technology offers precise and accurate measurement of proteins and their post-translational modifications using either charge-based or size-based separation formats. The system not only uses ultralow nanogram levels of protein but also allows multi-analyte analysis using a parallel single-analyte format for increased sensitivity and specificity. The high sensitivity and excellent reproducibility of this technology make it particularly powerful for the analysis of clinical samples. Furthermore, the system can distinguish and detect specific protein post-translational modifications that conventional Western blot and other immunoassays cannot easily capture. This review summarizes and evaluates the latest progress in optimizing the CNIA system for comprehensive, quantitative characterization of proteins and signaling events. It also discusses how the technology has been successfully applied in both discovery research and clinical studies, for signaling pathway dissection, proteomic biomarker assessment, targeted treatment evaluation and quantitative proteomic analysis. Lastly, the system is compared with other conventional immunoassay platforms.

  5. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed, with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement, especially at facilities with high levels of background nanoparticles.

  6. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for the treatment of thyroid disease and neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various corrections for these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), (a) geometric Gaussian CDRs and (b) Monte Carlo-simulated CDRs were compared. Quantitative accuracy, contrast-to-noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter-corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density than TEW or no scatter correction: the quantification errors relative to a dose-calibrator-derived measurement were <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo-simulated CDR correction than with geometric Gaussian or no CDR modelling. Scatter correction had a small effect on the quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since the factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR
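
As a hedged aside, the TEW scatter correction compared above is simple enough to sketch: counts in two narrow windows flanking the photopeak are interpolated trapezoidally to estimate the scatter inside the main window. The window widths and counts below are invented for illustration.

```python
def tew_scatter(c_lower: float, c_upper: float,
                w_lower: float, w_upper: float, w_main: float) -> float:
    """Estimated scatter counts in the main window (trapezoidal interpolation
    of the count densities in the two flanking windows)."""
    return (c_lower / w_lower + c_upper / w_upper) * w_main / 2.0

def tew_primary(c_main, c_lower, c_upper, w_lower, w_upper, w_main):
    """Scatter-corrected (primary) counts, clipped at zero."""
    return max(c_main - tew_scatter(c_lower, c_upper, w_lower, w_upper, w_main), 0.0)

# Example (invented counts): 10000 counts in a 20% main window around the
# 364 keV I-131 photopeak (~72.8 keV wide), 60 and 80 counts in 6 keV side windows.
print(tew_primary(10_000, 60, 80, 6.0, 6.0, 72.8))
```

The patient-dependent weighting factor mentioned in the abstract would scale this scatter estimate before subtraction.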

  7. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  8. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the
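
A minimal sketch of a benefit and harm comparison metric of the kind discussed, with the normal-approximation 95% confidence interval that only four of the reviewed approaches provide. This is a generic illustration, not one of the 16 approaches from the review; all event counts are invented.

```python
import math

def risk_difference(events_treat: int, n_treat: int, events_ctrl: int, n_ctrl: int):
    """Risk difference and its standard error (normal approximation)."""
    p1, p0 = events_treat / n_treat, events_ctrl / n_ctrl
    se = math.sqrt(p1 * (1 - p1) / n_treat + p0 * (1 - p0) / n_ctrl)
    return p1 - p0, se

def net_benefit(benefit_rd, benefit_se, harm_rd, harm_se):
    """Net benefit = benefit RD minus harm RD, with a 95% CI."""
    net = benefit_rd - harm_rd
    se = math.sqrt(benefit_se ** 2 + harm_se ** 2)
    return net, (net - 1.96 * se, net + 1.96 * se)

# Invented example: treatment prevents events (benefit) but causes bleeds (harm).
b_rd, b_se = risk_difference(30, 1000, 60, 1000)  # fewer events on treatment
h_rd, h_se = risk_difference(20, 1000, 10, 1000)  # more bleeds on treatment
net, ci = net_benefit(-b_rd, b_se, h_rd, h_se)    # flip sign so benefit is positive
print(round(net, 3), [round(x, 3) for x in ci])
```

Note the simplification: this treats the benefit and harm outcomes as independent, whereas the review points out that none of the approaches considers their actual joint distribution.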

  9. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of

  10. Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.

    PubMed

    Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W

    2017-01-01

    Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and tailored surgical planning to the individual.

  11. ICSH recommendations for assessing automated high-performance liquid chromatography and capillary electrophoresis equipment for the quantitation of HbA2.

    PubMed

    Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J

    2015-10-01

    Automated high-performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between carriers and non-carriers, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed. © 2015 John Wiley & Sons Ltd.

  12. Towards quantitative condition assessment of biodiversity outcomes: Insights from Australian marine protected areas.

    PubMed

    Addison, Prue F E; Flander, Louisa B; Cook, Carly N

    2017-08-01

    Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). However, more commonly qualitative condition assessments are employed in PAME, which use descriptive condition categories and are evaluated largely with expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from undertaking qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies overcome these challenges, including new decision support tools, approaches to better finance conservation efforts, and to promote more management relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice. Copyright

  13. Accurate quantitation of circulating cell-free mitochondrial DNA in plasma by droplet digital PCR.

    PubMed

    Ye, Wei; Tang, Xiaojun; Liu, Chu; Wen, Chaowei; Li, Wei; Lyu, Jianxin

    2017-04-01

    To establish a method for accurate quantitation of circulating cell-free mitochondrial DNA (ccf-mtDNA) in plasma by droplet digital PCR (ddPCR), we designed a ddPCR method to determine the copy number of ccf-mtDNA by amplifying mitochondrial ND1 (MT-ND1). To evaluate the sensitivity and specificity of the method, a recombinant pMD18-T plasmid containing MT-ND1 sequences and mtDNA-deleted (ρ0) HeLa cells were used, respectively. Subsequently, different plasma samples were prepared for ddPCR to evaluate the feasibility of detecting plasma ccf-mtDNA. In the results, the ddPCR method showed high sensitivity and specificity. When the DNA was extracted from plasma prior to ddPCR, the ccf-mtDNA copy number was higher than that measured without extraction. This difference was not due to a PCR inhibitor, such as EDTA-Na2, an anti-coagulant in plasma, because a standard EDTA-Na2 concentration (5 mM) did not significantly inhibit ddPCR reactions. The difference might be attributable to plasma exosomal mtDNA, which was 4.21 ± 0.38 copies/μL of plasma, accounting for ∼19% of plasma ccf-mtDNA. Therefore, ddPCR can quickly and reliably detect ccf-mtDNA from plasma with a prior DNA extraction step, providing for a more accurate detection of ccf-mtDNA. The direct use of plasma as a template in ddPCR is suitable for the detection of exogenous cell-free nucleic acids within plasma, but not of nucleic acids that have a vesicle-associated form, such as exosomal mtDNA. Graphical Abstract: designs of the present work (*: Module 1, #: Module 2, &: Module 3).
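
The ddPCR quantitation above rests on Poisson statistics over droplets, which can be sketched as follows. This is a generic illustration, not the authors' code; the 0.85 nL droplet volume is a typical value for ddPCR systems, not a figure taken from the paper.

```python
import math

def ddpcr_copies_per_ul(negative: int, total: int, droplet_volume_nl: float = 0.85) -> float:
    """Template concentration (copies/uL of reaction) from droplet counts.

    With N droplets of known volume, of which k stay negative, the Poisson
    mean copies per droplet is lambda = -ln(k / N); dividing by the droplet
    volume gives the concentration.
    """
    if not 0 < negative <= total:
        raise ValueError("need at least one negative droplet")
    lam = -math.log(negative / total)        # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # convert nL to uL

# Invented example: 14000 negative droplets out of 15000 accepted droplets.
print(round(ddpcr_copies_per_ul(14_000, 15_000), 1))
```

The Poisson correction is what lets ddPCR remain accurate even when some droplets receive more than one template copy.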

  14. Investigating the Validity of Two Widely Used Quantitative Text Tools

    ERIC Educational Resources Information Center

    Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne

    2018-01-01

    In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…
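
As an illustration of the variables such quantitative text tools count, here is a rough sketch of the Flesch-Kincaid grade level using a crude vowel-group syllable heuristic. Real readability tools use better syllable counters and tokenizers; this is not the instrument examined in the article.

```python
import re

def count_syllables(word: str) -> int:
    """Crude heuristic: count runs of vowels (including y) as syllables."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Grade level from mean sentence length and mean syllables per word."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / len(sentences) + 11.8 * syllables / len(words) - 15.59

sample = "The cat sat on the mat. It was a sunny day."
print(round(flesch_kincaid_grade(sample), 2))
```

Word frequency, the other variable mentioned above, would require a reference corpus and is omitted here.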

  15. Quantitative phylogenetic assessment of microbial communities in diverse environments.

    PubMed

    von Mering, C; Hugenholtz, P; Raes, J; Tringe, S G; Doerks, T; Jensen, L J; Ward, N; Bork, P

    2007-02-23

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. We used a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative, and accurate picture of community composition than that provided by traditional ribosomal RNA-based approaches depending on the polymerase chain reaction. Mapping marker genes from four diverse environmental data sets onto a reference species phylogeny shows that certain communities evolve faster than others. The method also enables determination of preferred habitats for entire microbial clades and provides evidence that such habitat preferences are often remarkably stable over time.

  16. Quantitative Phase Microscopy for Accurate Characterization of Microlens Arrays

    NASA Astrophysics Data System (ADS)

    Grilli, Simonetta; Miccio, Lisa; Merola, Francesco; Finizio, Andrea; Paturzo, Melania; Coppola, Sara; Vespini, Veronica; Ferraro, Pietro

    Microlens arrays are of fundamental importance in a wide variety of applications in optics and photonics. This chapter deals with an accurate digital holography-based characterization of both liquid and polymeric microlenses fabricated by an innovative pyro-electrowetting process. The actuation of liquid and polymeric films is obtained through the use of pyroelectric charges generated into polar dielectric lithium niobate crystals.

  17. Accurate quantitative CF-LIBS analysis of both major and minor elements in alloys via iterative correction of plasma temperature and spectral intensity

    NASA Astrophysics Data System (ADS)

    Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA

    2018-03-01

    The chemical composition of alloys directly determines their mechanical behavior and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgical quality control and material classification. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which applies a combined correction of plasma temperature and spectral intensity using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared with conventional CF-LIBS analysis, the relative errors for the major elements Cu and Zn and the minor element Pb in copper-lead alloys have been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.

  18. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  19. Quantitative Procedures for the Assessment of Quality in Higher Education Institutions.

    ERIC Educational Resources Information Center

    Moran, Tom; Rowse, Glenwood

    The development of procedures designed to provide quantitative assessments of quality in higher education institutions are reviewed. These procedures employ a systems framework and utilize quantitative data to compare institutions or programs of similar types with one another. Three major elements essential in the development of models focusing on…

  20. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  1. Accurate determination of reference materials and natural isolates by means of quantitative (1)H NMR spectroscopy.

    PubMed

    Frank, Oliver; Kreissl, Johanna Karoline; Daschner, Andreas; Hofmann, Thomas

    2014-03-26

    A fast and precise proton nuclear magnetic resonance (qHNMR) method for the quantitative determination of low molecular weight target molecules in reference materials and natural isolates has been validated using ERETIC 2 (Electronic REference To access In vivo Concentrations) based on the PULCON (PULse length based CONcentration determination) methodology and compared to the gravimetric results. Using an Avance III NMR spectrometer (400 MHz) equipped with a broad band observe (BBO) probe, the qHNMR method was validated by determining its linearity, range, precision, and accuracy as well as robustness and limit of quantitation. The linearity of the method was assessed by measuring samples of l-tyrosine, caffeine, or benzoic acid in a concentration range between 0.3 and 16.5 mmol/L (r(2) ≥ 0.99), whereas the interday and intraday precisions were found to be ≤2%. The recovery of a range of reference compounds was ≥98.5%, thus demonstrating the qHNMR method as a precise tool for the rapid quantitation (~15 min) of food-related target compounds in reference materials and natural isolates such as nucleotides, polyphenols, or cyclic peptides.
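
The PULCON principle behind this method can be sketched in a few lines: the absolute signal integral, normalized per proton, is proportional to concentration, so an external reference of known concentration calibrates the unknown. This is a simplified illustration with invented numbers; the pulse-length and receiver-gain corrections used in practice are omitted.

```python
def pulcon_concentration(integral_sample: float, protons_sample: int,
                         integral_ref: float, protons_ref: int,
                         conc_ref_mmol: float) -> float:
    """Target concentration (mmol/L) from a reference spectrum of known
    concentration, comparing per-proton signal integrals (simplified PULCON)."""
    per_proton_sample = integral_sample / protons_sample
    per_proton_ref = integral_ref / protons_ref
    return conc_ref_mmol * per_proton_sample / per_proton_ref

# Invented example: a 3-proton singlet in the sample vs a 1-proton reference
# signal from a 5.0 mmol/L standard.
print(pulcon_concentration(6.0, 3, 2.5, 1, 5.0))
```

Because no internal standard has to be weighed into the sample, this is what makes the ~15 min quantitation cited above possible.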

  2. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus of urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made on quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather-station data toward dynamic simulation and forecasting of the thermal environment at various scales. This study reviews the progress of ground meteorological observation, thermal infrared remote sensing and numerical simulation. The potential advantages and disadvantages, applicability and development trends of these techniques are also summarized, aiming to provide a foundation for understanding urban thermal environment assessment and optimization.

  3. Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution

    EPA Science Inventory

    Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...

  4. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    PubMed

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-12-01

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerosis plaque. A total of 224 plaques were divided into mild stenosis (<50%; 135 plaques, 60.27%), moderate stenosis (50%-69%; 39 plaques, 17.41%) and severe stenosis (70%-99%; 50 plaques, 22.32%) groups. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine the relationship between plaque echo, thickness and neovascularization. Correlation analysis revealed no relationship of neovascularization with plaque echo in the groups using either quantitative or semi-quantitative methods. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002), 2.3-2.8 mm and ≥3.5 mm groups (p <0.001) and 2.9-3.4 mm and ≥3.5 mm groups (p <0.001). Both semi-quantitative and quantitative CEUS methods characterizing neovascularization of plaque are equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method could fail for plaque <3.5 mm because of motion artifacts. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.

  5. The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health

    DTIC Science & Technology

    2016-10-01

    Report front matter: Award Number W81XWH-15-1-0669; Title: The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health; Dates Covered: 30 Sep 2015 to 29 Sep 2016. Abstract (fragment): ...amputation and subsequently evaluate the utility of non-invasive imaging for evaluating the impact of next-generation socket technologies on the health of the residual limb.

  6. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to these framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCMs) was developed to quantify the framing assumptions in the assessment stage of an HIA, and was then applied to a housing intervention (tightening insulation) as a case study. Framing assumptions for the case study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in an HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity were identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that framing assumptions should be explored and quantified prior to conducting a detailed quantitative HIA during the assessment stage.
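A minimal fuzzy cognitive map can be simulated as below. This is a generic sketch of the FCM formalism (sigmoid state update over a signed weight matrix), not the authors' actual map; the four concepts and all weights are invented for illustration:

```python
import numpy as np

def fcm_simulate(W, x0, n_iter=100):
    """Iterate an FCM toward a steady state: x <- sigmoid(W @ x).
    W[i, j] is the signed causal influence of concept j on concept i;
    concept activations stay in (0, 1)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = 1.0 / (1.0 + np.exp(-(W @ x)))
    return x

# Toy concepts: 0 air-tightness, 1 ventilation, 2 indoor air quality, 3 health
W = np.array([
    [0.0,  0.0, 0.0, 0.0],
    [-0.8, 0.0, 0.0, 0.0],   # tighter envelope suppresses ventilation
    [0.0,  0.7, 0.0, 0.0],   # ventilation improves indoor air quality
    [0.0,  0.0, 0.9, 0.0],   # indoor air quality drives health
])
steady = fcm_simulate(W, x0=[1.0, 0.5, 0.5, 0.5])
```

Ranking concepts by the magnitude of their outgoing weights (or by perturbing each and re-simulating) is one simple way to identify the most influential variables, as the abstract describes.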

  7. A qualitative and quantitative assessment for a bone marrow harvest simulator.

    PubMed

    Machado, Liliane S; Moraes, Ronei M

    2009-01-01

Several approaches have been proposed for performing assessment in virtual reality-based training simulators. Assessment methods fall into two kinds: offline and online. The main requirements for online training assessment in virtual reality systems are low computational complexity and high accuracy. Several approaches in the literature satisfy these requirements for the general case. Their drawback is that they handle specific cases poorly, as in some medical procedures where both quantitative and qualitative information are available to perform the assessment. In this paper, we present an approach to online training assessment based on a Modified Naive Bayes classifier that can manipulate qualitative and quantitative variables simultaneously. A specific medical case was simulated in a bone marrow harvest simulator. The results obtained were satisfactory and demonstrated the applicability of the method.
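A naive Bayes classifier that handles quantitative and qualitative variables at once can be sketched as follows. This is a generic mixed-variable formulation (Gaussian likelihoods for continuous features, Laplace-smoothed frequencies for categorical ones), not the authors' exact Modified Naive Bayes; the toy training data are invented:

```python
import math
from collections import Counter

def gaussian_logpdf(x, mean, var):
    return -0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)

class MixedNaiveBayes:
    """Naive Bayes over mixed variables: Gaussian likelihoods for
    quantitative features, Laplace-smoothed frequencies for qualitative ones."""

    def fit(self, Xq, Xc, y):
        self.models = {}
        for c in sorted(set(y)):
            idx = [i for i, yi in enumerate(y) if yi == c]
            prior = len(idx) / len(y)
            gauss = []
            for j in range(len(Xq[0])):
                vals = [Xq[i][j] for i in idx]
                m = sum(vals) / len(vals)
                v = max(sum((x - m) ** 2 for x in vals) / len(vals), 1e-6)
                gauss.append((m, v))
            cats = [Counter(Xc[i][j] for i in idx) for j in range(len(Xc[0]))]
            self.models[c] = (prior, gauss, cats)
        return self

    def predict(self, xq, xc):
        def score(model):
            prior, gauss, cats = model
            lp = math.log(prior)
            lp += sum(gaussian_logpdf(x, m, v) for x, (m, v) in zip(xq, gauss))
            for j, v in enumerate(xc):
                n, k = sum(cats[j].values()), len(cats[j]) + 1
                lp += math.log((cats[j][v] + 1) / (n + k))  # Laplace smoothing
            return lp
        return max(self.models, key=lambda c: score(self.models[c]))

# Hypothetical trainee data: (needle force, region touched) -> performance
clf = MixedNaiveBayes().fit(
    Xq=[[0.9], [1.1], [1.0], [3.0], [3.2], [2.8]],
    Xc=[("cortex",), ("cortex",), ("cortex",),
        ("marrow",), ("marrow",), ("cortex",)],
    y=["good", "good", "good", "poor", "poor", "poor"],
)
```

Because scoring a sample is a constant-time pass over the features, the classifier satisfies the low-complexity requirement for online assessment.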

  8. A Quantitative Point-of-Need Assay for the Assessment of Vitamin D3 Deficiency.

    PubMed

    Vemulapati, S; Rey, E; O'Dell, D; Mehta, S; Erickson, D

    2017-10-26

Vitamin D is necessary for the healthy growth and development of bone and muscle. Vitamin D deficiency, which is present in 42% of the US population, is often undiagnosed as symptoms may not manifest for several years, and long-term deficiency has been linked to osteoporosis, diabetes and cancer. Currently the majority of vitamin D testing is performed in large-scale commercial laboratories, which have high operational costs and long times-to-result. Development of a low-cost point-of-need assay could be transformative to deficiency analysis in limited-resource settings. The best biomarker of vitamin D status, 25-hydroxyvitamin D3 (25(OH)D3), however, is particularly challenging to measure in such a format due to complexities involved in sample preparation, including the need to separate the marker from its binding protein. Here we present a rapid diagnostic test for the accurate, quantitative assessment of 25(OH)D3 in finger-stick blood. The assay is accompanied by a smartphone-assisted portable imaging device that can autonomously perform the necessary image processing. To achieve accurate quantification of 25(OH)D3, we also demonstrate a novel elution buffer that separates 25(OH)D3 from its binding protein in situ, eliminating the need for sample preparation. In human trials, the accuracy of our platform is 90.5%.

  9. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

We propose an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines by the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin estimated stochastically by ISO 11843-7:2012 fell within the 95% confidence interval of the RSD obtained statistically from repeated measurements (n = 6). Our findings thus show that the method is applicable for estimating the repeatability of HPLC-UV determination of baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. The method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. With the present repeatability assessment method, a reliable measurement RSD was obtained stochastically and the experimental time was remarkably reduced.
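For comparison, the conventional repetitive-measurement estimate that the stochastic ISO 11843-7 value is checked against can be computed directly. The sketch below (illustrative peak areas, not the paper's data) computes the RSD of n = 6 injections together with a chi-square confidence interval for the underlying dispersion:

```python
import numpy as np
from scipy import stats

def rsd_with_ci(x, alpha=0.05):
    """Relative standard deviation (%) of repeated peak areas, with a
    chi-square confidence interval for the underlying sigma."""
    x = np.asarray(x, dtype=float)
    n, mean, s = len(x), x.mean(), x.std(ddof=1)
    lo = s * np.sqrt((n - 1) / stats.chi2.ppf(1 - alpha / 2, n - 1))
    hi = s * np.sqrt((n - 1) / stats.chi2.ppf(alpha / 2, n - 1))
    return 100 * s / mean, (100 * lo / mean, 100 * hi / mean)

# Hypothetical baicalin peak areas from n = 6 repeated injections
rsd, (ci_lo, ci_hi) = rsd_with_ci([1021, 1010, 1033, 1018, 1025, 1015])
```

A stochastic RSD estimate falling inside `(ci_lo, ci_hi)` is consistent with the repeated-measurement result, which is the comparison the abstract reports.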

  10. Home Circadian Phase Assessments with Measures of Compliance Yield Accurate Dim Light Melatonin Onsets

    PubMed Central

    Burgess, Helen J.; Wyatt, James K.; Park, Margaret; Fogg, Louis F.

    2015-01-01

    Study Objectives: There is a need for the accurate assessment of circadian phase outside of the clinic/laboratory, particularly with the gold standard dim light melatonin onset (DLMO). We tested a novel kit designed to assist in saliva sampling at home for later determination of the DLMO. The home kit includes objective measures of compliance to the requirements for dim light and half-hourly saliva sampling. Design: Participants were randomized to one of two 10-day protocols. Each protocol consisted of two back-to-back home and laboratory phase assessments in counterbalanced order, separated by a 5-day break. Setting: Laboratory or participants' homes. Participants: Thirty-five healthy adults, age 21–62 y. Interventions: N/A. Measurements and Results: Most participants received at least one 30-sec epoch of light > 50 lux during the home phase assessments (average light intensity 4.5 lux), but on average for < 9 min of the required 8.5 h. Most participants collected every saliva sample within 5 min of the scheduled time. Ninety-two percent of home DLMOs were not affected by light > 50 lux or sampling errors. There was no significant difference between the home and laboratory DLMOs (P > 0.05); on average the home DLMOs occurred 9.6 min before the laboratory DLMOs. The home DLMOs were highly correlated with the laboratory DLMOs (r = 0.91, P < 0.001). Conclusions: Participants were reasonably compliant to the home phase assessment procedures. The good agreement between the home and laboratory dim light melatonin onsets (DLMOs) demonstrates that including objective measures of light exposure and sample timing during home saliva sampling can lead to accurate home DLMOs. Clinical Trial Registration: Circadian Phase Assessments at Home, http://clinicaltrials.gov/show/NCT01487252, NCT01487252. Citation: Burgess HJ, Wyatt JK, Park M, Fogg LF. Home circadian phase assessments with measures of compliance yield accurate dim light melatonin onsets. SLEEP 2015;38(6):889–897
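DLMO determination itself reduces to finding when the half-hourly melatonin profile first crosses a threshold. A minimal sketch (hypothetical values; the 4 pg/mL threshold is a commonly used convention, not necessarily this study's):

```python
import numpy as np

def dlmo(times_h, melatonin_pg_ml, threshold=4.0):
    """Dim light melatonin onset: the first time the saliva melatonin
    profile crosses the threshold, by linear interpolation between samples."""
    t = np.asarray(times_h, dtype=float)
    m = np.asarray(melatonin_pg_ml, dtype=float)
    for i in range(1, len(m)):
        if m[i - 1] < threshold <= m[i]:
            frac = (threshold - m[i - 1]) / (m[i] - m[i - 1])
            return t[i - 1] + frac * (t[i] - t[i - 1])
    return None  # profile never crossed the threshold

# Half-hourly samples starting at 19:00; hypothetical melatonin values
onset = dlmo([19.0, 19.5, 20.0, 20.5, 21.0], [1.0, 2.0, 3.0, 5.0, 9.0])
```

Here the profile crosses 4 pg/mL halfway between the 20:00 and 20:30 samples, so the interpolated onset is 20:15.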

  11. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  12. NecroQuant: quantitative assessment of radiological necrosis

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay

    2017-11-01

Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be reconfigured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types.
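The core quantitative step, classifying voxels as necrotic by a Hounsfield unit cutoff within a user-drawn ROI, can be sketched as below. The 30 HU cutoff and the toy image are illustrative assumptions, not NecroQuant's actual parameters:

```python
import numpy as np

def necrosis_fraction(ct_hu, roi_mask, hu_max=30.0):
    """Fraction of tumor ROI voxels at or below an HU cutoff, a simple
    proxy for non-enhancing (necrotic) tissue on contrast-enhanced CT."""
    vox = ct_hu[roi_mask]
    return float(np.count_nonzero(vox <= hu_max)) / vox.size

# Toy 2x2 slice: two voxels near water density, two enhancing voxels
ct = np.array([[10.0, 15.0], [80.0, 90.0]])
mask = np.ones((2, 2), dtype=bool)
frac = necrosis_fraction(ct, mask)   # two of four voxels <= 30 HU
```

In a multiphase workflow the same mask could instead be thresholded on the enhancement difference between pre- and post-contrast phases.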

  13. PconsD: ultra rapid, accurate model quality assessment for protein structure prediction.

    PubMed

    Skwark, Marcin J; Elofsson, Arne

    2013-07-15

Clustering methods are often needed for accurately assessing the quality of modeled protein structures. Recent blind evaluation of quality assessment methods in CASP10 showed that there is little difference between many different methods as far as ranking models and selecting the best model are concerned. When comparing many models, the computational cost of the model comparison can become significant. Here, we present PconsD, a fast, stream-computing method for distance-driven model quality assessment that runs on consumer hardware. PconsD is at least one order of magnitude faster than other methods of comparable accuracy. The source code for PconsD is freely available at http://d.pcons.net/, along with supplementary benchmarking data. Contact: arne@bioinfo.se. Supplementary data are available at Bioinformatics online.

  14. Cardiovascular magnetic resonance of myocardial edema using a short inversion time inversion recovery (STIR) black-blood technique: Diagnostic accuracy of visual and semi-quantitative assessment

    PubMed Central

    2012-01-01

    Background The short inversion time inversion recovery (STIR) black-blood technique has been used to visualize myocardial edema, and thus to differentiate acute from chronic myocardial lesions. However, some cardiovascular magnetic resonance (CMR) groups have reported variable image quality, and hence the diagnostic value of STIR in routine clinical practice has been put into question. The aim of our study was to analyze image quality and diagnostic performance of STIR using a set of pulse sequence parameters dedicated to edema detection, and to discuss possible factors that influence image quality. We hypothesized that STIR imaging is an accurate and robust way of detecting myocardial edema in non-selected patients with acute myocardial infarction. Methods Forty-six consecutive patients with acute myocardial infarction underwent CMR (day 4.5, +/- 1.6) including STIR for the assessment of myocardial edema and late gadolinium enhancement (LGE) for quantification of myocardial necrosis. Thirty of these patients underwent a follow-up CMR at approximately six months (195 +/- 39 days). Both STIR and LGE images were evaluated separately on a segmental basis for image quality as well as for presence and extent of myocardial hyper-intensity, with both visual and semi-quantitative (threshold-based) analysis. LGE was used as a reference standard for localization and extent of myocardial necrosis (acute) or scar (chronic). Results Image quality of STIR images was rated as diagnostic in 99.5% of cases. At the acute stage, the sensitivity and specificity of STIR to detect infarcted segments on visual assessment was 95% and 78% respectively, and on semi-quantitative assessment was 99% and 83%, respectively. STIR differentiated acutely from chronically infarcted segments with a sensitivity of 95% by both methods and with a specificity of 99% by visual assessment and 97% by semi-quantitative assessment. The extent of hyper-intense areas on acute STIR images was 85% larger than

  15. Do doctors accurately assess coronary risk in their patients? Preliminary results of the coronary health assessment study.

    PubMed Central

    Grover, S. A.; Lowensteyn, I.; Esrey, K. L.; Steinert, Y.; Joseph, L.; Abrahamowicz, M.

    1995-01-01

OBJECTIVE--To evaluate the ability of doctors in primary care to assess their patients' risk of coronary heart disease. DESIGN--Questionnaire survey. SETTING--Continuing medical education meetings, Ontario and Quebec, Canada. SUBJECTS--Community based doctors who agreed to enroll in the coronary health assessment study. MAIN OUTCOME MEASURE--Ratings of coronary risk factors and estimates by doctors of relative and absolute coronary risk of two hypothetical patients and the "average" 40 year old Canadian man and 70 year old Canadian woman. RESULTS--253 doctors answered the questionnaire. For 30 year olds the doctors rated cigarette smoking as the most important risk factor and raised serum triglyceride concentrations as the least important; for 70 year old patients they rated diabetes as the most important risk factor and raised serum triglyceride concentrations as the least important. They rated each individual risk factor as significantly less important for 70 year olds than for 30 year olds (all risk factors, P < 0.001). They showed a strong understanding of the relative importance of specific risk factors, and most were confident in their ability to estimate coronary risk. While doctors accurately estimated the relative risk of a specific patient (compared with the average adult) they systematically overestimated the absolute baseline risk of developing coronary disease and the risk reductions associated with specific interventions. CONCLUSIONS--Despite guidelines on targeting patients at high risk of coronary disease, accurate assessment of coronary risk remains difficult for many doctors. Additional strategies must be developed to help doctors better assess their patients' coronary risk. PMID:7728035

  16. Quantitative single-photon emission computed tomography/computed tomography for technetium pertechnetate thyroid uptake measurement

    PubMed Central

    Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2016-01-01

    Abstract Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function by thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent of thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake value (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were the highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). TUS significantly overestimated the %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because other 99mTcO4 sources in addition to thyroid, such as salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas %thyroid uptake, SUVmean and SUVmax from the SPECT/CT were associated with the functional status of thyroid. Conclusions: Quantitative SPECT/CT is more accurate than conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139
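The two quantities compared in this study can be computed as below; the 185 MBq injected activity matches the protocol, while the tissue concentration, organ activity, and body weight are hypothetical:

```python
def suv(concentration_kbq_ml, injected_mbq, weight_kg):
    """Body-weight standardized uptake value: tissue activity concentration
    divided by injected activity per gram of body mass (1 mL ~ 1 g)."""
    # kBq/mL divided by (MBq * 1000 kBq) / (kg * 1000 g) = dimensionless
    return concentration_kbq_ml / (injected_mbq * 1000.0 / (weight_kg * 1000.0))

def percent_uptake(organ_activity_mbq, injected_mbq):
    """%thyroid uptake: decay-corrected organ activity over injected dose."""
    return 100.0 * organ_activity_mbq / injected_mbq

# 185 MBq injected (as in the study); the other numbers are invented
s = suv(50.0, 185.0, 70.0)
u = percent_uptake(3.7, 185.0)   # 2.0 %
```

A SPECT/CT-based %thyroid uptake counts only activity inside the segmented thyroid, which is why it avoids the salivary-gland contamination that inflates the probe-based TUS result.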

  17. Combined visual and semi-quantitative assessment of 123I-FP-CIT SPECT for the diagnosis of dopaminergic neurodegenerative diseases.

    PubMed

    Ueda, Jun; Yoshimura, Hajime; Shimizu, Keiji; Hino, Megumu; Kohara, Nobuo

    2017-07-01

Visual and semi-quantitative assessments of 123I-FP-CIT single-photon emission computed tomography (SPECT) are useful for the diagnosis of dopaminergic neurodegenerative diseases (dNDD), including Parkinson's disease, dementia with Lewy bodies, progressive supranuclear palsy, multiple system atrophy, and corticobasal degeneration. However, the diagnostic value of combined visual and semi-quantitative assessment in dNDD remains unclear. Among 239 consecutive patients with a newly diagnosed possible parkinsonian syndrome who underwent 123I-FP-CIT SPECT in our medical center, 114 patients with a disease duration less than 7 years were diagnosed as dNDD with the established criteria or as non-dNDD according to clinical judgment. We retrospectively examined their clinical characteristics and visual and semi-quantitative assessments of 123I-FP-CIT SPECT. The striatal binding ratio (SBR) was used as a semi-quantitative measure of 123I-FP-CIT SPECT. We calculated the sensitivity and specificity of visual assessment alone, semi-quantitative assessment alone, and combined visual and semi-quantitative assessment for the diagnosis of dNDD. SBR was correlated with visual assessment. Some dNDD patients with a normal visual assessment had an abnormal SBR, and vice versa. There was no statistically significant difference between the sensitivity of diagnosis with visual assessment alone and semi-quantitative assessment alone (91.2 vs. 86.8%, respectively, p = 0.29). Combined visual and semi-quantitative assessment demonstrated superior sensitivity (96.7%) to visual assessment (p = 0.03) or semi-quantitative assessment (p = 0.003) alone, with equal specificity. Visual and semi-quantitative assessments of 123I-FP-CIT SPECT are helpful for the diagnosis of dNDD, and combined visual and semi-quantitative assessment shows superior sensitivity with equal specificity.
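One simple way to combine the two assessments is to call a scan abnormal when either the visual read or the SBR is abnormal, which raises sensitivity. The booleans below are invented to illustrate the computation; the paper does not state that this exact OR rule was used:

```python
def sensitivity_specificity(pred_pos, truth_pos):
    """Sensitivity and specificity from parallel boolean lists."""
    tp = sum(p and t for p, t in zip(pred_pos, truth_pos))
    tn = sum((not p) and (not t) for p, t in zip(pred_pos, truth_pos))
    pos = sum(truth_pos)
    neg = len(truth_pos) - pos
    return tp / pos, tn / neg

# Hypothetical reads: abnormal visual assessment, abnormal SBR, true dNDD
visual = [True, False, True, False, False]
sbr    = [False, True, True, False, False]
truth  = [True, True, True, False, False]

combined = [v or s for v, s in zip(visual, sbr)]   # abnormal if either is
sens_visual, spec_visual = sensitivity_specificity(visual, truth)
sens_combined, spec_combined = sensitivity_specificity(combined, truth)
```

In this toy data the combined rule catches the cases each single assessment misses, so its sensitivity exceeds the visual read alone; in general an OR rule can also lower specificity, which the study reports did not happen here.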

  18. Objective, Quantitative, Data-Driven Assessment of Chemical Probes.

    PubMed

    Antolin, Albert A; Tym, Joseph E; Komianou, Angeliki; Collins, Ian; Workman, Paul; Al-Lazikani, Bissan

    2018-02-15

Chemical probes are essential tools for understanding biological systems and for target validation, yet selecting probes for biomedical research is rarely based on objective assessment of all potential compounds. Here, we describe the Probe Miner: Chemical Probes Objective Assessment resource, capitalizing on the plethora of public medicinal chemistry data to empower quantitative, objective, data-driven evaluation of chemical probes. We assess >1.8 million compounds for their suitability as chemical tools against 2,220 human targets and dissect the biases and limitations encountered. Probe Miner represents a valuable resource to aid the identification of potential chemical probes, particularly when used alongside expert curation.

  19. Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.

    PubMed

    Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James

    2017-11-01

To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD) using kidney biopsy pathologic findings as reference standards. We prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild (no grade 3 and <2 of grade 2) and moderate to severe (at least 2 of grade 2 or 1 of grade 3) CKD groups. Multiparametric quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m2 using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups (all P < .01). Among quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for an estimated GFR of less than 60 mL/min/1.73 m2. Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR. The
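AUC values like those reported for PSV and EDV can in principle be reproduced from raw measurements via the Mann-Whitney identity. A self-contained sketch with invented velocity values (not the study's data):

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    P(score_pos > score_neg) + 0.5 * P(tie)."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical EDV (cm/s): mild CKD tends to preserve diastolic flow
edv_mild     = [14.0, 12.5, 9.0, 15.0]
edv_moderate = [8.0, 9.5, 7.2, 6.8]

# Probability that a mild-CKD kidney shows higher EDV than a moderate-to-severe one
a = auc(edv_mild, edv_moderate)
```

This rank-based estimator is exactly what a nonparametric ROC analysis computes, so it matches `roc_auc_score`-style tools without any binning of thresholds.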

  20. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  1. Quantitative, spectrally-resolved intraoperative fluorescence imaging

    PubMed Central

    Valdés, Pablo A.; Leblond, Frederic; Jacobs, Valerie L.; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.

    2012-01-01

    Intraoperative visual fluorescence imaging (vFI) has emerged as a promising aid to surgical guidance, but does not fully exploit the potential of the fluorescent agents that are currently available. Here, we introduce a quantitative fluorescence imaging (qFI) approach that converts spectrally-resolved data into images of absolute fluorophore concentration pixel-by-pixel across the surgical field of view (FOV). The resulting estimates are linear, accurate, and precise relative to true values, and spectral decomposition of multiple fluorophores is also achieved. Experiments with protoporphyrin IX in a glioma rodent model demonstrate in vivo quantitative and spectrally-resolved fluorescence imaging of infiltrating tumor margins for the first time. Moreover, we present images from human surgery which detect residual tumor not evident with state-of-the-art vFI. The wide-field qFI technique has broad implications for intraoperative surgical guidance because it provides near real-time quantitative assessment of multiple fluorescent biomarkers across the operative field. PMID:23152935

  2. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy, and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  3. Preanalytical variables and phosphoepitope expression in FFPE tissue: quantitative epitope assessment after variable cold ischemic time.

    PubMed

    Vassilakopoulou, Maria; Parisi, Fabio; Siddiqui, Summar; England, Allison M; Zarella, Elizabeth R; Anagnostou, Valsamo; Kluger, Yuval; Hicks, David G; Rimm, David L; Neumeister, Veronique M

    2015-03-01

Individualized targeted therapies for cancer patients require accurate and reproducible assessment of biomarkers to be able to plan treatment accordingly. Recent studies have shown highly variable effects of preanalytical variables on gene expression profiling and protein levels of different tissue types. Several publications have described protein degradation of tissue samples as a direct result of delayed formalin fixation of the tissue. Phosphorylated proteins are more labile, and epitope degradation can happen within 30 min of cold ischemic time. To address this issue, we evaluated the change in antigenicity of a series of phosphoproteins in paraffin-embedded samples from breast tumors as a function of time to formalin fixation. A tissue microarray consisting of 93 breast cancer specimens with documented time-to-fixation was used to evaluate changes in antigenicity of 12 phosphoepitopes frequently used in research settings as a function of cold ischemic time. Analysis was performed in a quantitative manner using the AQUA technology for quantitative immunofluorescence. For each marker, least squares univariate linear regression was performed and confidence intervals were computed using bootstrapping. The majority of the epitopes tested revealed changes in expression levels with increasing time to formalin fixation. Some phosphorylated proteins, such as phospho-HSP27 and phospho-S6 RP, involved in post-translational modification and stress response pathways, increased in expression or phosphorylation levels. Others (like phospho-AKT, phospho-ERK1/2, phospho-Tyrosine, phospho-MET, and others) are quite labile, and loss of antigenicity can be reported within 1-2 h of cold ischemic time. Therefore specimen collection should be closely monitored and subjected to quality control measures to ensure accurate measurement of these epitopes. However, a few phosphoepitopes (like phospho-JAK2 and phospho-ER) are sufficiently robust for routine usage in companion
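The per-marker analysis (a least-squares slope with a bootstrap confidence interval) can be sketched as follows on synthetic data; the decay rate and noise level are invented, not measured values:

```python
import numpy as np

def bootstrap_slope_ci(x, y, n_boot=2000, alpha=0.05, seed=0):
    """Least-squares slope with a percentile bootstrap confidence interval."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    slope = np.polyfit(x, y, 1)[0]
    boots = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(x), len(x))       # resample cases with replacement
        boots.append(np.polyfit(x[idx], y[idx], 1)[0])
    lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return slope, (lo, hi)

# Synthetic marker: immunofluorescence signal decaying with cold ischemic time
rng = np.random.default_rng(1)
t = rng.uniform(0.0, 4.0, 60)                       # hours to formalin fixation
signal = 10.0 - 1.5 * t + rng.normal(0.0, 0.5, 60)  # AQUA-like score, arbitrary units
slope, (lo, hi) = bootstrap_slope_ci(t, signal)
```

A confidence interval excluding zero flags a marker whose antigenicity changes significantly with time to fixation, which is how labile epitopes would be identified.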

  4. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  5. Hydrogen quantitative risk assessment workshop proceedings.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG] and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk-Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen-Specific QRA Toolkit.

  6. Quantitative Assessment of Fat Infiltration in the Rotator Cuff Muscles using water-fat MRI

    PubMed Central

    Nardo, Lorenzo; Karampinos, Dimitrios C.; Lansdown, Drew A.; Carballido-Gamio, Julio; Lee, Sonia; Maroldi, Roberto; Ma, C. Benjamin; Link, Thomas M.; Krug, Roland

    2013-01-01

    Purpose To evaluate a chemical shift-based fat quantification technique in the rotator cuff muscles in comparison with the semi-quantitative Goutallier fat infiltration classification (GC) and to assess their relationship with clinical parameters. Materials and Methods The shoulders of 57 patients were imaged using a 3T MR scanner. The rotator cuff muscles were assessed for fat infiltration using GC by two radiologists and an orthopedic surgeon. Sequences included oblique-sagittal T1-, T2- and proton density-weighted fast spin echo, and six-echo gradient echo. The iterative decomposition of water and fat with echo asymmetry and least-squares estimation (IDEAL) was used to measure fat fraction. Pain and range of motion of the shoulder were recorded. Results Fat fraction values were significantly correlated with GC grades (p< 0.0001, kappa>0.9) showing consistent increase with GC grades (grade=0, 0%–5.59%; grade=1, 1.1%–9.70%; grade=2, 6.44%–14.86%; grade=3, 15.25%–17.77%; grade=4, 19.85%–29.63%). A significant correlation between fat infiltration of the subscapularis muscle quantified with IDEAL versus a) deficit in internal rotation (Spearman Rank Correlation Coefficient=0.39, 95% CI 0.13–0.60, p<0.01) and b) pain (Spearman Rank Correlation coefficient=0.313, 95% CI 0.049–0.536, p=0.02) was found but was not seen between the clinical parameters and GC grades. Additionally, only quantitative fat infiltration measures of the supraspinatus muscle were significantly correlated with a deficit in abduction (Spearman Rank Correlation Coefficient=0.45, 95% CI 0.20–0.60, p<0.01). Conclusion We concluded that an accurate and highly reproducible fat quantification in the rotator cuff muscles using water-fat MRI techniques is possible and significantly correlates with shoulder pain and range of motion. PMID:24115490

  7. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

Rib movement during respiration is one of the diagnostic criteria for pulmonary impairments. In general, rib movement is assessed with fluoroscopy. However, the shadows of lung vessels and bronchi overlapping the ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called the "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). A bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images and assembled into velocity maps. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movement: limited rib movement appeared as reduced velocity vectors and left-right asymmetric distributions on the vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
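
The record does not specify how the velocity vectors were computed; a minimal block-matching sketch, with hypothetical frames and block/search sizes, illustrates the general idea of per-region displacement between consecutive frames:

```python
import numpy as np

def block_velocity(prev, curr, y, x, size=8, search=4):
    """Displacement (dy, dx) of the block at (y, x) minimizing the SSD
    between the previous and current frame."""
    ref = prev[y:y + size, x:x + size]
    best, best_d = None, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = curr[y + dy:y + dy + size, x + dx:x + dx + size]
            if cand.shape != ref.shape:
                continue  # candidate window fell off the image edge
            ssd = np.sum((cand - ref) ** 2)
            if best is None or ssd < best:
                best, best_d = ssd, (dy, dx)
    return best_d

# Synthetic check: shift a random texture down by 2 pixels.
rng = np.random.default_rng(1)
frame0 = rng.random((64, 64))
frame1 = np.roll(frame0, 2, axis=0)
print(block_velocity(frame0, frame1, 20, 20))  # → (2, 0)
```

Repeating this over a grid of blocks yields the kind of velocity map the abstract describes.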

  8. Quantitation of specific binding ratio in 123I-FP-CIT SPECT: accurate processing strategy for cerebral ventricular enlargement with use of 3D-striatal digital brain phantom.

    PubMed

    Furuta, Akihiro; Onishi, Hideo; Amijima, Hizuru

    2018-06-01

This study aimed to evaluate the effect of ventricular enlargement on the specific binding ratio (SBR) and to validate the cerebrospinal fluid (CSF)-Mask algorithm for quantitative SBR assessment of 123 I-FP-CIT single-photon emission computed tomography (SPECT) images with the use of a 3D-striatum digital brain (SDB) phantom. Ventricular enlargement was simulated by three-dimensional extensions in a 3D-SDB phantom comprising segments representing the striatum, ventricle, brain parenchyma, and skull bone. The Evans Index (EI) was measured in 3D-SDB phantom images of an enlarged ventricle. Projection data sets were generated from the 3D-SDB phantoms with blurring, scatter, and attenuation. Images were reconstructed using the ordered subset expectation maximization (OSEM) algorithm and corrected for attenuation, scatter, and resolution recovery. We bundled DaTView (Southampton method) with the CSF-Mask processing software and assessed the SBR over a range of CSF-Mask coefficients (f factors). Phantom simulations used true SBR values of 1, 2, 3, 4, and 5. Measured SBRs were underestimated by more than 50% relative to the true values as EI increased, and this trend was most pronounced at low SBR. Processing with the CSF-Mask at an f factor of 1.0 corrected the roughly 20% underestimation and brought the measured SBR closer to the true values despite the increase in EI. A linear regression function relating the EI to the optimal f factor (y = -3.53x + 1.95; r = 0.95) was derived by minimizing the root-mean-square error. Processing with CSF-Mask generates accurate quantitative SBR from dopamine transporter SPECT images of patients with ventricular enlargement.
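
A minimal sketch of the two quantities named above: a Southampton-style SBR (assuming a nominal 11.2 mL striatal volume and hypothetical count values) and the reported EI-to-f-factor regression. The CSF-Mask algorithm itself is not reproduced here.

```python
def sbr(total_voi_counts, voi_volume_ml, ref_conc, striatal_volume_ml=11.2):
    """Southampton-style SBR: background-corrected striatal counts divided
    by (assumed striatal volume x reference concentration)."""
    specific = total_voi_counts - ref_conc * voi_volume_ml
    return specific / (striatal_volume_ml * ref_conc)

def csf_mask_f_factor(evans_index):
    """Linear relation reported in the abstract: f = -3.53*EI + 1.95 (r = 0.95)."""
    return -3.53 * evans_index + 1.95

print(round(sbr(10000, 20.0, 100.0), 2))  # hypothetical counts and volumes
print(round(csf_mask_f_factor(0.27), 2))  # ~1.0 at a normal Evans Index
```
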

  9. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  10. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    ERIC Educational Resources Information Center

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory…

  11. Contrast-enhanced ultrasound for quantitative assessment of portal pressure in canine liver fibrosis.

    PubMed

    Zhai, Lin; Qiu, Lan-Yan; Zu, Yuan; Yan, Yan; Ren, Xiao-Zhuan; Zhao, Jun-Feng; Liu, Yu-Jiang; Liu, Ji-Bin; Qian, Lin-Xue

    2015-04-21

    To explore the feasibility of non-invasive quantitative estimation of portal venous pressure by contrast-enhanced ultrasound (CEUS) in a canine model. Liver fibrosis was established in adult canines (Beagles; n = 14) by subcutaneous injection of carbon tetrachloride (CCl4). CEUS parameters, including the area under the time-intensity curve and intensity at portal/arterial phases (Qp/Qa and Ip/Ia, respectively), were used to quantitatively assess the blood flow ratio of the portal vein/hepatic artery at multiple time points. The free portal venous pressures (FPP) were measured by a multi-channel baroreceptor using a percutaneous approach at baseline and 8, 16, and 24 wk after CCl4 injections in each canine. Liver biopsies were obtained at the end of 8, 16, and 24 wk from each animal, and the stage of the fibrosis was assessed according to the Metavir scoring system. A Pearson correlation test was performed to compare the FPP with Qp/Qa and Ip/Ia. Pathologic examination of 42 biopsies from the 14 canines at weeks 8, 16, and 24 revealed that liver fibrosis was induced by CCl4 and represented various stages of liver fibrosis, including F0 (n = 3), F1 (n = 12), F2 (n = 14), F3 (n = 11), and F4 (n = 2). There were significant differences in the measurements of Qp/Qa (19.85 ± 3.30 vs 10.43 ± 1.21, 9.63 ± 1.03, and 8.77 ± 0.96) and Ip/Ia (1.77 ± 0.37 vs 1.03 ± 0.12, 0.83 ± 0.10, and 0.69 ± 0.13) between control and canine fibrosis at 8, 16, and 24 wk, respectively (all P < 0.001). There were statistically significant negative correlations between FPP and Qp/Qa (r = -0.707, P < 0.001), and between FPP and Ip/Ia (r = -0.759, P < 0.001) in the canine fibrosis model. Prediction of elevated FPP based on Qp/Qa and Ip/Ia was highly sensitive, as assessed by the area under the receiver operating curve (0.866 and 0.895, respectively). CEUS is a potential method to accurately, but non-invasively, estimate portal venous pressure through measurement of Qp/Qa and Ip
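
The abstract's two summary statistics, a Pearson correlation and an ROC AUC, can be sketched as follows with synthetic stand-ins for FPP and Qp/Qa (not the study's data); the AUC is computed from the Mann-Whitney U statistic:

```python
import numpy as np
from scipy import stats

# Hypothetical data with a negative FPP vs. Qp/Qa relation, as reported.
rng = np.random.default_rng(2)
qp_qa = rng.uniform(8, 20, size=42)
fpp = 30 - 1.2 * qp_qa + rng.normal(0, 2, size=42)

r, p = stats.pearsonr(qp_qa, fpp)

# ROC AUC for predicting elevated FPP from (low) Qp/Qa, via the
# equivalence AUC = U / (n1 * n2).
elevated = fpp > np.median(fpp)
u, _ = stats.mannwhitneyu(-qp_qa[elevated], -qp_qa[~elevated])
auc = u / (elevated.sum() * (~elevated).sum())
print(f"r = {r:.2f}, AUC = {auc:.2f}")
```
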

  12. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

The steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study, we propose a method that can automatically detect and exclude regions with many fat droplets by using feature values of color, shape, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.
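
A hedged sketch of one ingredient of such a pipeline: thresholding plus a simple circularity test on a synthetic image. The paper's full method also uses color and nuclear-arrangement features, which are omitted here.

```python
import numpy as np
from scipy import ndimage

def droplet_mask(gray, thresh=0.8, min_circularity=0.6):
    """Keep bright, roughly circular connected components (droplet candidates)."""
    binary = gray > thresh
    labels, n = ndimage.label(binary)
    keep = np.zeros_like(binary)
    for i in range(1, n + 1):
        region = labels == i
        area = region.sum()
        # Crude perimeter estimate: boundary pixels removed by erosion.
        perim = area - ndimage.binary_erosion(region).sum()
        circularity = 4 * np.pi * area / max(perim, 1) ** 2
        if circularity >= min_circularity:
            keep |= region
    return keep

# Synthetic image: one bright disk (droplet-like) and one thin bright line.
yy, xx = np.mgrid[:64, :64]
img = np.zeros((64, 64))
img[(yy - 20) ** 2 + (xx - 20) ** 2 < 36] = 1.0  # disk, radius 6
img[50, 5:60] = 1.0                               # line (not droplet-like)
mask = droplet_mask(img)
```

On this toy input the disk is retained and the line is rejected by the circularity test.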

  13. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
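
The noise-to-slope ratio (NSR) figure of merit described above can be sketched directly: for a linear measurement model measured = a·true + b + noise(sd), NSR = sd / a, and the method with the smaller NSR is the more precise. The parameter values below are illustrative, not from the paper.

```python
def noise_to_slope_ratio(slope_a, noise_sd):
    """NSR: a method's noise standard deviation divided by its slope
    against the (unknown) true values; smaller means more precise."""
    return noise_sd / slope_a

m1 = noise_to_slope_ratio(slope_a=0.95, noise_sd=0.10)  # method 1
m2 = noise_to_slope_ratio(slope_a=0.60, noise_sd=0.10)  # method 2
print(m1 < m2)  # method 1 ranks as the more precise method
```

In the NGS technique, a and sd are themselves estimated from patient data without ground truth; only the ratio is needed for ranking.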

  15. Quantitative dose-response assessment of inhalation exposures to toxic air pollutants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarabek, A.M.; Foureman, G.L.; Gift, J.S.

    1997-12-31

Implementation of the 1990 Clean Air Act Amendments, including evaluation of residual risks, requires accurate human health risk estimates of both acute and chronic inhalation exposures to toxic air pollutants. The U.S. Environmental Protection Agency's National Center for Environmental Assessment, Research Triangle Park, NC, has a research program that addresses several key issues for development of improved quantitative approaches for dose-response assessment. This paper describes three projects underway in the program. Project A describes a Bayesian approach that was developed to base dose-response estimates on combined data sets and that expresses these estimates as probability density functions. A categorical regression model has been developed that allows for the combination of all available acute data, with toxicity expressed as severity categories (e.g., mild, moderate, severe), and with both duration and concentration as governing factors. Project C encompasses two refinements to uncertainty factors (UFs) often applied to extrapolate dose-response estimates from laboratory animal data to human equivalent concentrations. Traditional UFs have been based on analyses of oral administration and may not be appropriate for extrapolation of inhalation exposures. Refinement of the UF applied to account for the use of subchronic rather than chronic data was based on an analysis of data from inhalation exposures (Project C-1). Mathematical modeling using the BMD approach was used to calculate the dose-response estimates for comparison between the subchronic and chronic data so that the estimates were not subject to dose-spacing or sample size variability. The second UF that was refined for extrapolation of inhalation data was the adjustment for the use of a LOAEL rather than a NOAEL (Project C-2).

  16. Quantitative falls risk estimation through multi-sensor assessment of standing balance.

    PubMed

    Greene, Barry R; McGrath, Denise; Walsh, Lorcan; Doheny, Emer P; McKeown, David; Garattini, Chiara; Cunningham, Clodagh; Crosby, Lisa; Caulfield, Brian; Kenny, Rose A

    2012-12-01

    Falls are the most common cause of injury and hospitalization and one of the principal causes of death and disability in older adults worldwide. Measures of postural stability have been associated with the incidence of falls in older adults. The aim of this study was to develop a model that accurately classifies fallers and non-fallers using novel multi-sensor quantitative balance metrics that can be easily deployed into a home or clinic setting. We compared the classification accuracy of our model with an established method for falls risk assessment, the Berg balance scale. Data were acquired using two sensor modalities--a pressure sensitive platform sensor and a body-worn inertial sensor, mounted on the lower back--from 120 community dwelling older adults (65 with a history of falls, 55 without, mean age 73.7 ± 5.8 years, 63 female) while performing a number of standing balance tasks in a geriatric research clinic. Results obtained using a support vector machine yielded a mean classification accuracy of 71.52% (95% CI: 68.82-74.28) in classifying falls history, obtained using one model classifying all data points. Considering male and female participant data separately yielded classification accuracies of 72.80% (95% CI: 68.85-77.17) and 73.33% (95% CI: 69.88-76.81) respectively, leading to a mean classification accuracy of 73.07% in identifying participants with a history of falls. Results compare favourably to those obtained using the Berg balance scale (mean classification accuracy: 59.42% (95% CI: 56.96-61.88)). Results from the present study could lead to a robust method for assessing falls risk in both supervised and unsupervised environments.
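
The classification step can be sketched with a support vector machine and cross-validation, as in the study; the two features below are synthetic stand-ins for the platform and inertial-sensor balance metrics, with the study's 65/55 faller/non-faller split.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical balance metrics: fallers tend to show larger sway.
rng = np.random.default_rng(3)
sway_area = np.concatenate([rng.normal(2.0, 0.5, 65),   # fallers
                            rng.normal(1.0, 0.5, 55)])  # non-fallers
sway_vel = np.concatenate([rng.normal(3.0, 0.7, 65),
                           rng.normal(1.8, 0.7, 55)])
X = np.column_stack([sway_area, sway_vel])
y = np.r_[np.ones(65), np.zeros(55)]  # 1 = history of falls

acc = cross_val_score(SVC(kernel="rbf", C=1.0), X, y, cv=5).mean()
print(f"mean CV accuracy: {acc:.1%}")
```
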

  17. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: 1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; 2) measurement and estimation of exposures for better extrapolation to humans; and 3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to use in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  18. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  19. 76 FR 19311 - Update of the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-07

    ... the 2003 Interagency Quantitative Assessment of the Relative Risk to Public Health From Foodborne... quantitative targets established in ``Healthy People 2010.'' In 2005, FoodNet data showed 0.30 L. monocytogenes... 4). In 2003, FDA and FSIS published a quantitative assessment of the relative risk to public health...

  20. A general method for bead-enhanced quantitation by flow cytometry

    PubMed Central

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that adding known quantities of polystyrene fluorescence standardization beads to samples permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
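
The arithmetic behind bead-enhanced absolute quantitation is straightforward: the fraction of spiked beads that appears in the acquired events scales the observed cell count up to the whole sample. A sketch with hypothetical numbers (the paper's exact formula is not quoted in this record):

```python
def absolute_cell_count(cells_acquired, beads_acquired,
                        beads_added, sample_volume_ul):
    """Cells per microliter: acquired cells scaled by the inverse of the
    fraction of spiked beads that was acquired."""
    cells_total = cells_acquired * (beads_added / beads_acquired)
    return cells_total / sample_volume_ul

# 5,000 CD4+ events and 2,500 of 10,000 spiked beads acquired, in 100 uL.
print(absolute_cell_count(5000, 2500, 10000, 100))  # → 200.0 cells/uL
```
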

  1. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  2. Optimization of Dual-Energy Xenon-CT for Quantitative Assessment of Regional Pulmonary Ventilation

    PubMed Central

    Fuld, Matthew K.; Halaweish, Ahmed; Newell, John D.; Krauss, Bernhard; Hoffman, Eric A.

    2013-01-01

    Objective Dual-energy X-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study we seek to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. Materials and Methods The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon-oxygen gas mixtures (0, 20, 25, 33, 50, 66, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved three-material decomposition calibration parameters. Additionally, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine in order to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Results Attenuation curves for xenon were obtained from the syringe test objects and were used to develop improved three-material decomposition parameters (HU enhancement per percent xenon: Within the chest phantom: 2.25 at 80kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; In open air: 2.5 at 80kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally non-dependent portion of the airway tree test-object, while not affecting quantitation of xenon in the three-material decomposition DECT. 
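
Inverting the calibration reported above gives percent xenon from a measured HU enhancement; a minimal sketch using the quoted chest-phantom coefficients (the enhancement value below is hypothetical):

```python
# HU enhancement per percent xenon, chest-phantom values from the abstract
# (140 kVp is with tin filtration).
HU_PER_PERCENT_XE = {80: 2.25, 100: 1.70, 140: 0.76}

def percent_xenon(delta_hu, kvp):
    """Estimated xenon concentration (%) from the HU enhancement at one kVp."""
    return delta_hu / HU_PER_PERCENT_XE[kvp]

# A 34 HU enhancement at 100 kVp corresponds to ~20% xenon.
print(round(percent_xenon(34.0, 100), 1))  # → 20.0
```

In the actual three-material decomposition, the dual-energy measurement resolves xenon without assuming a single-kVp baseline; this inversion only illustrates the scale of the calibration.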

  3. Accurate Determination of Tunneling-Affected Rate Coefficients: Theory Assessing Experiment.

    PubMed

    Zuo, Junxiang; Xie, Changjian; Guo, Hua; Xie, Daiqian

    2017-07-20

    The thermal rate coefficients of a prototypical bimolecular reaction are determined on an accurate ab initio potential energy surface (PES) using ring polymer molecular dynamics (RPMD). It is shown that quantum effects such as tunneling and zero-point energy (ZPE) are of critical importance for the HCl + OH reaction at low temperatures, while the heavier deuterium substitution renders tunneling less facile in the DCl + OH reaction. The calculated RPMD rate coefficients are in excellent agreement with experimental data for the HCl + OH reaction in the entire temperature range of 200-1000 K, confirming the accuracy of the PES. On the other hand, the RPMD rate coefficients for the DCl + OH reaction agree with some, but not all, experimental values. The self-consistency of the theoretical results thus allows a quality assessment of the experimental data.
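
Rate coefficients over a temperature range are commonly summarized by an Arrhenius fit; a sketch with illustrative values (not the paper's RPMD results), recovering A and Ea from a linear fit of ln k vs 1/T. Strong tunneling at low temperature would appear as curvature on such a plot.

```python
import numpy as np

R = 8.314  # J mol^-1 K^-1
T = np.array([200.0, 300.0, 500.0, 700.0, 1000.0])

# Synthetic rate coefficients obeying k = A * exp(-Ea / (R T)).
A_true, Ea_true = 1e-11, 6.0e3  # cm^3 molecule^-1 s^-1, J/mol (hypothetical)
k = A_true * np.exp(-Ea_true / (R * T))

slope, intercept = np.polyfit(1.0 / T, np.log(k), 1)
Ea_fit, A_fit = -slope * R, np.exp(intercept)
print(f"Ea = {Ea_fit:.0f} J/mol, A = {A_fit:.2e}")
```
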

  4. Short Course Introduction to Quantitative Mineral Resource Assessments

    USGS Publications Warehouse

    Singer, Donald A.

    2007-01-01

    This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems so that consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers evaluate the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or similar decision-maker. Some of the tools and models presented here will be useful for selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments where general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of tonnages and grades of well-explored deposits serve as models of grades and tonnages of undiscovered deposits, and number of undiscovered deposits are estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: The kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral

  5. High Resolution Peripheral Quantitative Computed Tomography for Assessment of Bone Quality

    NASA Astrophysics Data System (ADS)

    Kazakia, Galateia

    2014-03-01

    The study of bone quality is motivated by the high morbidity, mortality, and societal cost of skeletal fractures. Over 10 million people are diagnosed with osteoporosis in the US alone, suffering 1.5 million osteoporotic fractures and costing the health care system over 17 billion annually. Accurate assessment of fracture risk is necessary to ensure that pharmacological and other interventions are appropriately administered. Currently, areal bone mineral density (aBMD) based on 2D dual-energy X-ray absorptiometry (DXA) is used to determine osteoporotic status and predict fracture risk. Though aBMD is a significant predictor of fracture risk, it does not completely explain bone strength or fracture incidence. The major limitation of aBMD is the lack of 3D information, which is necessary to distinguish between cortical and trabecular bone and to quantify bone geometry and microarchitecture. High resolution peripheral quantitative computed tomography (HR-pQCT) enables in vivo assessment of volumetric BMD within specific bone compartments as well as quantification of geometric and microarchitectural measures of bone quality. HR-pQCT studies have documented that trabecular bone microstructure alterations are associated with fracture risk independent of aBMD.... Cortical bone microstructure - specifically porosity - is a major determinant of strength, stiffness, and fracture toughness of cortical tissue and may further explain the aBMD-independent effect of age on bone fragility and fracture risk. The application of finite element analysis (FEA) to HR-pQCT data permits estimation of patient-specific bone strength, shown to be associated with fracture incidence independent of aBMD. This talk will describe the HR-pQCT scanner, established metrics of bone quality derived from HR-pQCT data, and novel analyses of bone quality currently in development. Cross-sectional and longitudinal HR-pQCT studies investigating the impact of aging, disease, injury, gender, race, and

  6. Assessing covariate balance when using the generalized propensity score with quantitative or continuous exposures.

    PubMed

    Austin, Peter C

    2018-01-01

Propensity score methods are increasingly being used to estimate the effects of treatments and exposures when using observational data. The propensity score was initially developed for use with binary exposures (e.g., active treatment vs. control). The generalized propensity score is an extension of the propensity score for use with quantitative exposures (e.g., dose or quantity of medication, income, years of education). A crucial component of any propensity score analysis is that of balance assessment. This entails assessing the degree to which conditioning on the propensity score (via matching, weighting, or stratification) has balanced measured baseline covariates between exposure groups. Methods for balance assessment have been well described and are frequently implemented when using the propensity score with binary exposures. However, there is a paucity of information on how to assess baseline covariate balance when using the generalized propensity score. We describe how methods based on the standardized difference can be adapted for use with quantitative exposures when using the generalized propensity score. We also describe a method based on assessing the correlation between the quantitative exposure and each covariate in the sample when weighted using generalized propensity score-based weights. We conducted a series of Monte Carlo simulations to evaluate the performance of these methods. We also compared two different methods of estimating the generalized propensity score: ordinary least squares regression and the covariate balancing propensity score method. We illustrate the application of these methods using data on patients hospitalized with a heart attack with the quantitative exposure being creatinine level.
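
The weighted-correlation diagnostic described above can be sketched as follows. The data and the stabilized weights (built here from assumed normal models for the marginal and conditional exposure densities) are hypothetical; in practice the conditional model comes from the estimated generalized propensity score.

```python
import numpy as np

def weighted_corr(x, y, w):
    """Correlation of x and y under normalized weights w."""
    w = w / w.sum()
    mx, my = np.sum(w * x), np.sum(w * y)
    cov = np.sum(w * (x - mx) * (y - my))
    return cov / np.sqrt(np.sum(w * (x - mx) ** 2) * np.sum(w * (y - my) ** 2))

rng = np.random.default_rng(4)
covariate = rng.normal(size=500)
exposure = 0.5 * covariate + rng.normal(size=500)  # confounded exposure

unweighted = weighted_corr(exposure, covariate, np.ones(500))

# Stabilized weights f(E) / f(E | X) under normal models (known here,
# estimated in practice); these should attenuate the association.
ec = exposure - exposure.mean()
resid = exposure - 0.5 * covariate
w = np.exp(0.5 * resid**2 / resid.var() - 0.5 * ec**2 / exposure.var())

weighted = weighted_corr(exposure, covariate, w)
print(f"unweighted r = {unweighted:.2f}, weighted r = {weighted:.2f}")
```

After adequate weighting, the exposure-covariate correlation should be near zero, which is the balance criterion the abstract describes.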

  7. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  8. Contrast-enhanced ultrasound for quantitative assessment of portal pressure in canine liver fibrosis

    PubMed Central

    Zhai, Lin; Qiu, Lan-Yan; Zu, Yuan; Yan, Yan; Ren, Xiao-Zhuan; Zhao, Jun-Feng; Liu, Yu-Jiang; Liu, Ji-Bin; Qian, Lin-Xue

    2015-01-01

    AIM: To explore the feasibility of non-invasive quantitative estimation of portal venous pressure by contrast-enhanced ultrasound (CEUS) in a canine model. METHODS: Liver fibrosis was established in adult canines (Beagles; n = 14) by subcutaneous injection of carbon tetrachloride (CCl4). CEUS parameters, including the area under the time-intensity curve and intensity at portal/arterial phases (Qp/Qa and Ip/Ia, respectively), were used to quantitatively assess the blood flow ratio of the portal vein/hepatic artery at multiple time points. The free portal venous pressures (FPP) were measured by a multi-channel baroreceptor using a percutaneous approach at baseline and 8, 16, and 24 wk after CCl4 injections in each canine. Liver biopsies were obtained at the end of 8, 16, and 24 wk from each animal, and the stage of the fibrosis was assessed according to the Metavir scoring system. A Pearson correlation test was performed to compare the FPP with Qp/Qa and Ip/Ia. RESULTS: Pathologic examination of 42 biopsies from the 14 canines at weeks 8, 16, and 24 revealed that liver fibrosis was induced by CCl4 and represented various stages of liver fibrosis, including F0 (n = 3), F1 (n = 12), F2 (n = 14), F3 (n = 11), and F4 (n = 2). There were significant differences in the measurements of Qp/Qa (19.85 ± 3.30 vs 10.43 ± 1.21, 9.63 ± 1.03, and 8.77 ± 0.96) and Ip/Ia (1.77 ± 0.37 vs 1.03 ± 0.12, 0.83 ± 0.10, and 0.69 ± 0.13) between control and canine fibrosis at 8, 16, and 24 wk, respectively (all P < 0.001). There were statistically significant negative correlations between FPP and Qp/Qa (r = -0.707, P < 0.001), and between FPP and Ip/Ia (r = -0.759, P < 0.001) in the canine fibrosis model. Prediction of elevated FPP based on Qp/Qa and Ip/Ia was highly sensitive, as assessed by the area under the receiver operating curve (0.866 and 0.895, respectively). CONCLUSION: CEUS is a potential method to accurately, but non-invasively, estimate portal venous pressure through

  9. Home Circadian Phase Assessments with Measures of Compliance Yield Accurate Dim Light Melatonin Onsets.

    PubMed

    Burgess, Helen J; Wyatt, James K; Park, Margaret; Fogg, Louis F

    2015-06-01

There is a need for accurate assessment of circadian phase outside of the clinic/laboratory, particularly with the gold standard dim light melatonin onset (DLMO). We tested a novel kit designed to assist in saliva sampling at home for later determination of the DLMO. The home kit includes objective measures of compliance with the requirements for dim light and half-hourly saliva sampling. Thirty-five healthy adults (age 21-62 y) were randomized to one of two 10-day protocols, conducted in the laboratory or in participants' homes. Each protocol consisted of two back-to-back home and laboratory phase assessments in counterbalanced order, separated by a 5-day break. Most participants received at least one 30-sec epoch of light > 50 lux during the home phase assessments (average light intensity 4.5 lux), but on average for < 9 min of the required 8.5 h. Most participants collected every saliva sample within 5 min of the scheduled time. Ninety-two percent of home DLMOs were not affected by light > 50 lux or sampling errors. There was no significant difference between the home and laboratory DLMOs (P > 0.05); on average the home DLMOs occurred 9.6 min before the laboratory DLMOs. The home DLMOs were highly correlated with the laboratory DLMOs (r = 0.91, P < 0.001). Participants were reasonably compliant with the home phase assessment procedures. The good agreement between the home and laboratory dim light melatonin onsets (DLMOs) demonstrates that including objective measures of light exposure and sample timing during home saliva sampling can lead to accurate home DLMOs. Circadian Phase Assessments at Home, http://clinicaltrials.gov/show/NCT01487252, NCT01487252. © 2015 Associated Professional Sleep Societies, LLC.

  10. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.

Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early-stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance the contrast of early-stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early-stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad-spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
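At its core, the image-analysis step described above reduces to thresholding stain contrast and measuring the covered area. Below is a minimal sketch of that idea, with a synthetic array standing in for a photograph of a stained coupon; the threshold value and the simulated image are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Hypothetical 8-bit grayscale photo of a stained coupon:
# bright background, darker pixels where stained biomass accumulates.
rng = np.random.default_rng(1)
img = rng.integers(180, 255, size=(200, 200)).astype(np.uint8)  # clean surface
img[40:90, 50:130] = rng.integers(30, 90, size=(50, 80))        # stained patch

def fouling_fraction(image, threshold=128):
    """Fraction of pixels darker than the threshold (stained area)."""
    return float((image < threshold).mean())

print(f"fouled area: {fouling_fraction(img):.1%}")  # -> fouled area: 10.0%
```

A field implementation would add illumination correction and an adaptive threshold, but the output, fractional surface coverage, is the same kind of quantity the paper correlates with cell density.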

  11. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; ...

    2015-12-07

Here, the accumulation of bacteria in surface-attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early-stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance the contrast of early-stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early-stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad-spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  12. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data.

    PubMed

    Artico, Sinara; Nardeli, Sarah M; Brilhante, Osmundo; Grossi-de-Sa, Maria Fátima; Alves-Ferreira, Marcio

    2010-03-21

Normalization to reference genes, or housekeeping genes, yields more accurate and reliable results from reverse transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments; thus, selecting suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out in plants, and qPCR studies of important crops such as cotton have therefore been hampered by the lack of suitable reference genes. Using two distinct algorithms, implemented in geNorm and NormFinder, we assessed the expression of nine candidate reference genes in cotton: GhACT4, GhEF1alpha5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhbetaTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development, and the floral verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of the cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. We tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene expression measures in
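geNorm ranks candidate reference genes by a stability measure M, the average standard deviation of pairwise log-expression ratios across samples: a gene whose ratio to every other candidate is constant is a good reference. The sketch below is a simplified version of that calculation on simulated data (gene count, sample layout, and values are hypothetical, and geNorm's iterative exclusion of the least stable gene is omitted).

```python
import numpy as np

def genorm_m(log_expr):
    """Simplified geNorm-style stability measure M per candidate gene.

    log_expr: (genes x samples) array of log2 expression values.
    M[j] = mean over the other genes k of the standard deviation of the
    log ratio (gene j - gene k) across samples; lower M = more stable.
    """
    n = log_expr.shape[0]
    m = np.empty(n)
    for j in range(n):
        ratios = [np.std(log_expr[j] - log_expr[k], ddof=1)
                  for k in range(n) if k != j]
        m[j] = np.mean(ratios)
    return m

rng = np.random.default_rng(2)
samples = 23                                      # as in the study's sample set
stable = rng.normal(0.0, 0.1, size=(2, samples))  # two stable candidates
unstable = rng.normal(0.0, 1.0, size=(1, samples))  # condition-dependent gene
expr = np.vstack([stable, unstable])
print(genorm_m(expr))  # the last gene should show the largest (worst) M
```

The genes with the lowest M across an experimental set would then be adopted as the normalization pair, which is the logic behind the GhUBQ14/GhPP2A1 recommendation above.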

  13. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    PubMed Central

    2010-01-01

Background Normalization to reference genes, or housekeeping genes, yields more accurate and reliable results from reverse transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments; thus, selecting suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out in plants, and qPCR studies of important crops such as cotton have therefore been hampered by the lack of suitable reference genes. Results Using two distinct algorithms, implemented in geNorm and NormFinder, we assessed the expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development, and the floral verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of the cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion We tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene

  14. Assessing Quantitative Resistance against Leptosphaeria maculans (Phoma Stem Canker) in Brassica napus (Oilseed Rape) in Young Plants

    PubMed Central

    Huang, Yong-Ju; Qi, Aiming; King, Graham J.; Fitt, Bruce D. L.

    2014-01-01

    Quantitative resistance against Leptosphaeria maculans in Brassica napus is difficult to assess in young plants due to the long period of symptomless growth of the pathogen from the appearance of leaf lesions to the appearance of canker symptoms on the stem. By using doubled haploid (DH) lines A30 (susceptible) and C119 (with quantitative resistance), quantitative resistance against L. maculans was assessed in young plants in controlled environments at two stages: stage 1, growth of the pathogen along leaf veins/petioles towards the stem by leaf lamina inoculation; stage 2, growth in stem tissues to produce stem canker symptoms by leaf petiole inoculation. Two types of inoculum (ascospores; conidia) and three assessment methods (extent of visible necrosis; symptomless pathogen growth visualised using the GFP reporter gene; amount of pathogen DNA quantified by PCR) were used. In stage 1 assessments, significant differences were observed between lines A30 and C119 in area of leaf lesions, distance grown along veins/petioles assessed by visible necrosis or by viewing GFP and amount of L. maculans DNA in leaf petioles. In stage 2 assessments, significant differences were observed between lines A30 and C119 in severity of stem canker and amount of L. maculans DNA in stem tissues. GFP-labelled L. maculans spread more quickly from the stem cortex to the stem pith in A30 than in C119. Stem canker symptoms were produced more rapidly by using ascospore inoculum than by using conidial inoculum. These results suggest that quantitative resistance against L. maculans in B. napus can be assessed in young plants in controlled conditions. Development of methods to phenotype quantitative resistance against plant pathogens in young plants in controlled environments will help identification of stable quantitative resistance for control of crop diseases. PMID:24454767

  15. Are general surgeons able to accurately self-assess their level of technical skills?

    PubMed

    Rizan, C; Ansell, J; Tilston, T W; Warren, N; Torkington, J

    2015-11-01

    Self-assessment is a way of improving technical capabilities without the need for trainer feedback. It can identify areas for improvement and promote professional medical development. The aim of this review was to identify whether self-assessment is an accurate form of technical skills appraisal in general surgery. The PubMed, MEDLINE(®), Embase(™) and Cochrane databases were searched for studies assessing the reliability of self-assessment of technical skills in general surgery. For each study, we recorded the skills assessed and the evaluation methods used. Common endpoints between studies were compared to provide recommendations based on the levels of evidence. Twelve studies met the inclusion criteria from 22,292 initial papers. There was no level 1 evidence published. All papers compared the correlation between self-appraisal versus an expert score but differed in the technical skills assessment and the evaluation tools used. The accuracy of self-assessment improved with increasing experience (level 2 recommendation), age (level 3 recommendation) and the use of video playback (level 3 recommendation). Accuracy was reduced by stressful learning environments (level 2 recommendation), lack of familiarity with assessment tools (level 3 recommendation) and in advanced surgical procedures (level 3 recommendation). Evidence exists to support the reliability of self-assessment of technical skills in general surgery. Several variables have been shown to affect the accuracy of self-assessment of technical skills. Future work should focus on evaluating the reliability of self-assessment during live operating procedures.

  16. Quantitative multiparametric MRI assessment of glioma response to radiotherapy in a rat model.

    PubMed

    Hong, Xiaohua; Liu, Li; Wang, Meiyun; Ding, Kai; Fan, Ying; Ma, Bo; Lal, Bachchu; Tyler, Betty; Mangraviti, Antonella; Wang, Silun; Wong, John; Laterra, John; Zhou, Jinyuan

    2014-06-01

    The inability of structural MRI to accurately measure tumor response to therapy complicates care management for patients with gliomas. The purpose of this study was to assess the potential of several noninvasive functional and molecular MRI biomarkers for the assessment of glioma response to radiotherapy. Fourteen U87 tumor-bearing rats were irradiated using a small-animal radiation research platform (40 or 20 Gy), and 6 rats were used as controls. MRI was performed on a 4.7 T animal scanner, preradiation treatment, as well as at 3, 6, 9, and 14 days postradiation. Image features of the tumors, as well as tumor volumes and animal survival, were quantitatively compared. Structural MRI showed that all irradiated tumors still grew in size during the initial days postradiation. The apparent diffusion coefficient (ADC) values of tumors increased significantly postradiation (40 and 20 Gy), except at day 3 postradiation, compared with preradiation. The tumor blood flow decreased significantly postradiation (40 and 20 Gy), but the relative blood flow (tumor vs contralateral) did not show a significant change at most time points postradiation. The amide proton transfer weighted (APTw) signals of the tumor decreased significantly at all time points postradiation (40 Gy), and also at day 9 postradiation (20 Gy). The blood flow and APTw maps demonstrated tumor features that were similar to those seen on gadolinium-enhanced T1-weighted images. Tumor ADC, blood flow, and APTw were all useful imaging biomarkers by which to predict glioma response to radiotherapy. The APTw signal was most promising for early response assessment in this model. © The Author(s) 2013. Published by Oxford University Press on behalf of the Society for Neuro-Oncology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Personal Exposure Monitoring Wearing Protocol Compliance: An Initial Assessment of Quantitative Measurements

    EPA Science Inventory

    Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...

  18. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system that includes a causation index, an inherent risk index, a consequence index, and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis, and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method, the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion, and unconfined vapor cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.

  19. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for
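The sensitivity, specificity, and area-under-the-curve figures above come from standard ROC analysis against the angiographic reference. Below is a self-contained sketch using the rank-sum formulation of AUC on simulated endocardial/epicardial flow ratios; the data, the score encoding, and the 50%-of-epicardial threshold handling are illustrative assumptions, not the study's pipeline.

```python
import numpy as np

def roc_auc(scores, labels):
    """AUC via the rank-sum (Mann-Whitney) formulation.
    scores: higher = more abnormal; labels: 1 = significant stenosis."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    gt = (pos[:, None] > neg[None, :]).sum()   # correctly ranked pairs
    eq = (pos[:, None] == neg[None, :]).sum()  # ties count half
    return (gt + 0.5 * eq) / (len(pos) * len(neg))

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity at a fixed decision threshold."""
    pred = scores >= threshold
    sens = (pred & (labels == 1)).sum() / (labels == 1).sum()
    spec = (~pred & (labels == 0)).sum() / (labels == 0).sum()
    return sens, spec

# Simulated endocardial-to-epicardial flow ratios: stenosed territories
# show reduced subendocardial flow, normal territories do not.
rng = np.random.default_rng(3)
ratio = np.concatenate([rng.normal(0.35, 0.10, 30),   # >= 70% stenosis
                        rng.normal(0.90, 0.10, 37)])  # no significant stenosis
labels = np.array([1] * 30 + [0] * 37)
score = 1 - ratio  # low endo/epi flow ratio -> high abnormality score

print(roc_auc(score, labels))
print(sens_spec(score, labels, threshold=0.5))  # ratio < 50% of epicardial flow
```

In the study itself the threshold was chosen from the ROC curve; the simulation just shows how a flow-ratio criterion maps onto sensitivity/specificity and AUC.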

  20. Optimization of dual-energy xenon-computed tomography for quantitative assessment of regional pulmonary ventilation.

    PubMed

    Fuld, Matthew K; Halaweish, Ahmed F; Newell, John D; Krauss, Bernhard; Hoffman, Eric A

    2013-09-01

    Dual-energy x-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study, we sought to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon/oxygen gas mixtures (0%, 20%, 25%, 33%, 50%, 66%, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved 3-material decomposition calibration parameters. In addition, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Attenuation curves for xenon were obtained from the syringe test-objects and were used to develop improved 3-material decomposition parameters (Hounsfield unit enhancement per percentage xenon: within the chest phantom, 2.25 at 80 kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; in open air, 2.5 at 80 kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally nondependent portion of the airway tree test-object, while not affecting the quantitation of xenon in the 3-material decomposition DECT. The mixture 40% Xe/40% He/20% O2
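The calibration slopes quoted above (HU enhancement per percent xenon at each tube voltage) imply a simple linear inversion from measured HU enhancement to regional xenon concentration. A sketch under that linear assumption follows; the function and dictionary names are ours, the slopes are the chest-phantom values from the abstract, and a real implementation would also handle intermediate kVp settings and the phantom vs. open-air difference.

```python
# Chest-phantom calibration slopes from the abstract: HU enhancement per
# percent xenon, keyed by tube voltage in kVp (140 kVp with tin filtration).
SLOPE_HU_PER_PCT_XE = {80: 2.25, 100: 1.70, 140: 0.76}

def xenon_percent(hu_enhancement: float, kvp: int) -> float:
    """Invert the linear calibration: regional xenon concentration in percent."""
    return hu_enhancement / SLOPE_HU_PER_PCT_XE[kvp]

# e.g. a 45 HU enhancement at 80 kVp corresponds to 45 / 2.25 = 20% xenon
print(xenon_percent(45.0, 80))  # -> 20.0
```

Regional ventilation maps are then obtained by applying this inversion voxel-wise to the xenon-enhancement image produced by the three-material decomposition.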

  1. Quantitative and Qualitative Assessment of Pulmonary Emphysema with T2-Weighted PROPELLER MRI in a High-Risk Population Compared to Low-Dose CT.

    PubMed

    Meier-Schroers, Michael; Sprinkart, Alois Martin; Becker, Manuel; Homsi, Rami; Thomas, Daniel

    2018-03-07

The aim of this study was to determine the suitability of T2-weighted PROPELLER MRI for the assessment of pulmonary emphysema. 60 participants in a lung cancer screening program (30 subjects with pulmonary emphysema and 30 control subjects without emphysema) were included in this retrospective study. All subjects were examined with low-dose CT (LDCT) and MRI within the screening program. The use of a T2-weighted PROPELLER sequence for the assessment of emphysema was analyzed and correlated with the results of LDCT. The presence and extent of pulmonary emphysema were first assessed qualitatively using a three-point score, and then quantitatively with a semi-automated software program to obtain emphysema indices. All 30 cases with pulmonary emphysema were accurately detected by MRI. There were 3 cases with emphysema according to MRI without emphysematous changes on LDCT (false-positive results). The qualitative scores as well as the emphysema indices were significantly higher in the emphysema group than in the control group for both MRI and LDCT (p < 0.001). Both the scores and the indices correlated significantly between MRI and LDCT (qualitative score of severity: r = 0.912/p < 0.001 in the emphysema group and r = 0.668/p < 0.001 in the control group; emphysema index: r = 0.960/p < 0.001 in the emphysema group and r = 0.746/p < 0.001 in the control group). The presence and extent of pulmonary emphysema may thus be assessed qualitatively and quantitatively by T2-weighted PROPELLER MRI with very good correlation to LDCT. · T2-weighted PROPELLER MRI may be suitable for the assessment of pulmonary emphysema. · There was significant correlation between MRI and LDCT regarding qualitative scores and quantitative emphysema indices, with correlation coefficients for different subgroups ranging from r = 0.668 to r = 0.960. · T2-weighted PROPELLER MRI may have the potential to be used for follow-up examinations in

  2. Novel quantitative assessment of metamorphopsia in maculopathy.

    PubMed

    Wiecek, Emily; Lashkari, Kameran; Dakin, Steven C; Bex, Peter

    2014-11-18

Patients with macular disease often report experiencing metamorphopsia (visual distortion). Although metamorphopsia is typically measured with Amsler charts, more quantitative assessments of perceived distortion are desirable to effectively monitor the presence, progression, and remediation of visual impairment. Participants with binocular (n = 33) and monocular (n = 50) maculopathy across seven disease groups, and control participants (n = 10) with no identifiable retinal disease, completed a modified Amsler grid assessment (presented on a computer screen with eye tracking to ensure fixation compliance) and two novel assessments to measure metamorphopsia in the central 5° of the visual field. A total of 81% (67/83) of participants completed a hyperacuity task in which they aligned eight dots in the shape of a square, and 64% (32/50) of participants with monocular distortion completed a spatial alignment task using dichoptic stimuli. Ten controls completed all tasks. Horizontal and vertical distortion magnitudes were calculated for each of the three assessments. Distortion magnitudes were significantly higher in patients than in controls in all assessments. There was no significant difference in the magnitude of distortion across different macular diseases. There were no significant correlations between the overall magnitudes of distortion among any of the three measures, and no significant correlations in localized measures of distortion. These three alternative quantifications of monocular spatial distortion in the central visual field thus generated uncorrelated estimates of visual distortion. It is therefore unlikely that metamorphopsia is caused solely by retinal displacement; it instead likely involves additional top-down information, knowledge about the scene, and perhaps cortical reorganization. Copyright 2015 The Association for Research in Vision and Ophthalmology, Inc.

  3. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  4. Detailed behavioral assessment promotes accurate diagnosis in patients with disorders of consciousness

    PubMed Central

    Gilutz, Yael; Lazary, Avraham; Karpin, Hana; Vatine, Jean-Jacques; Misha, Tamar; Fortinsky, Hadassah; Sharon, Haggai

    2015-01-01

Introduction: The awareness level of patients with disorders of consciousness (DOC) is assessed on the basis of exhibited behaviors. However, since motor signs of awareness (i.e., non-reflex motor responses) can be very subtle, differentiating the vegetative from the minimally conscious state (a distinction that is itself not clear-cut) is often challenging. Even a careful clinician relying on standardized scales may arrive at a wrong diagnosis. Aim: To report our experience in tackling this problem with two in-house assessment procedures developed at Reuth Rehabilitation Hospital, and to demonstrate their clinical significance by reviewing two cases. Methods: (1) Reuth DOC Response Assessment (RDOC-RA), administered in addition to the standardized tools, which emphasizes the importance of assessing a wide range of motor responses. In our experience, in some patients the only evidence of awareness may be a private, specific movement that is not assessed by standard assessment tools. (2) Reuth DOC Periodic Intervention Model (RDOC-PIM): the current literature regarding assessment and diagnosis in DOC refers mostly to the acute phase of up to 1 year post-injury. However, we have found major changes in responsiveness occurring 1 year or more post-injury in many patients. We therefore conduct periodic assessments at predetermined time points to ensure that patients are not misdiagnosed and that neurological changes are not overlooked. Results: In the first case, the RDOC-RA promoted a more accurate diagnosis than that based on standardized scales alone. The second case shows how the RDOC-PIM allowed us to recognize late recovery and prompted reinstatement of treatment with good results. Conclusion: Adding a detailed periodic assessment of DOC patients to existing scales can yield critical information, promoting better diagnosis, treatment, and clinical outcomes. We discuss the implications of this observation for the future development and validation of assessment tools in DOC patients

  5. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult, if not impossible, to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
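
    One of the metrology quantities named in this abstract, repeatability, has a standard estimator that is easy to sketch. Below is a minimal, hypothetical illustration (not taken from the QIBA document itself): the repeatability coefficient RC = 2.77 × wSD, computed from invented test-retest measurements.

```python
# Repeatability coefficient (RC) for a quantitative imaging biomarker.
# Illustrative sketch: RC = 2.77 * wSD, where wSD is the within-subject
# standard deviation; for two replicates per subject,
# wSD^2 = sum(d_i^2) / (2n). The measurements below are invented.
import math

def repeatability_coefficient(test, retest):
    """RC for paired replicates: wSD^2 = sum(d_i^2) / (2n)."""
    if len(test) != len(retest) or not test:
        raise ValueError("need paired, non-empty replicate lists")
    n = len(test)
    wsd2 = sum((a - b) ** 2 for a, b in zip(test, retest)) / (2 * n)
    return 2.77 * math.sqrt(wsd2)

# Hypothetical tumor-volume measurements (mL) on two visits:
visit1 = [10.2, 15.1, 7.9, 22.4, 12.0]
visit2 = [10.8, 14.6, 8.3, 21.7, 12.5]
rc = repeatability_coefficient(visit1, visit2)
```

    Under this convention, a change between two scans larger than RC is unlikely (at the 95% level) to be measurement noise alone.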

  6. Quantitating Organoleptic Volatile Phenols in Smoke-Exposed Vitis vinifera Berries.

    PubMed

    Noestheden, Matthew; Thiessen, Katelyn; Dennis, Eric G; Tiet, Ben; Zandberg, Wesley F

    2017-09-27

    Accurate methods for quantitating volatile phenols (i.e., guaiacol, syringol, 4-ethylphenol, etc.) in smoke-exposed Vitis vinifera berries prior to fermentation are needed to predict the likelihood of perceptible smoke taint following vinification. Reported here is a complete, cross-validated analytical workflow to accurately quantitate free and glycosidically bound volatile phenols in smoke-exposed berries using liquid-liquid extraction, acid-mediated hydrolysis, and gas chromatography-tandem mass spectrometry. The reported workflow addresses critical gaps in existing methods for volatile phenols that impact quantitative accuracy, most notably the effect of injection port temperature and the variability in acid-mediated hydrolytic procedures currently used. Addressing these deficiencies will help the wine industry make accurate, informed decisions when producing wines from smoke-exposed berries.

  7. Quantitative Assessment of Breast Cosmetic Outcome After Whole-Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, Jay P.; Lei, Xiudong; Huang, Sheng-Cheng

    Purpose: To measure, by quantitative analysis of digital photographs, breast cosmetic outcome within the setting of a randomized trial of conventionally fractionated (CF) and hypofractionated (HF) whole-breast irradiation (WBI), to identify how quantitative cosmesis metrics were associated with patient- and physician-reported cosmesis and whether they differed by treatment arm. Methods and Materials: From 2011 to 2014, 287 women aged ≥40 with ductal carcinoma in situ or early invasive breast cancer were randomized to HF-WBI (42.56 Gy/16 fractions [fx] + 10-12.5 Gy/4-5 fx boost) or CF-WBI (50 Gy/25 fx + 10-14 Gy/5-7 fx). At 1 year after treatment we collected digital photographs, patient-reported cosmesis using the Breast Cancer Treatment and Outcomes Scale, and physician-reported cosmesis using the Radiation Therapy Oncology Group scale. Six quantitative measures of breast symmetry, labeled M1-M6, were calculated from anteroposterior digital photographs. For each measure, values closer to 1 imply greater symmetry, and values closer to 0 imply greater asymmetry. Associations between M1-M6 and patient- and physician-reported cosmesis and treatment arm were evaluated using the Kruskal-Wallis test. Results: Among 245 evaluable patients, patient-reported cosmesis was strongly associated with M1 (vertical symmetry measure) (P<.01). Physician-reported cosmesis was similarly correlated with M1 (P<.01) and also with M2 (vertical symmetry, P=.01) and M4 (horizontal symmetry, P=.03). At 1 year after treatment, HF-WBI resulted in better values of M2 (P=.02) and M3 (P<.01) than CF-WBI; treatment arm was not significantly associated with M1, M4, M5, or M6 (P≥.12). Conclusions: Quantitative assessment of breast photographs reveals similar to improved cosmetic outcome with HF-WBI compared with CF-WBI 1 year after treatment. Assessing cosmetic outcome using these measures could be useful for future comparative effectiveness studies and outcome reporting.

  8. Quantitative Assessment of Breast Cosmetic Outcome After Whole-Breast Irradiation.

    PubMed

    Reddy, Jay P; Lei, Xiudong; Huang, Sheng-Cheng; Nicklaus, Krista M; Fingeret, Michelle C; Shaitelman, Simona F; Hunt, Kelly K; Buchholz, Thomas A; Merchant, Fatima; Markey, Mia K; Smith, Benjamin D

    2017-04-01

    To measure, by quantitative analysis of digital photographs, breast cosmetic outcome within the setting of a randomized trial of conventionally fractionated (CF) and hypofractionated (HF) whole-breast irradiation (WBI), to identify how quantitative cosmesis metrics were associated with patient- and physician-reported cosmesis and whether they differed by treatment arm. From 2011 to 2014, 287 women aged ≥40 with ductal carcinoma in situ or early invasive breast cancer were randomized to HF-WBI (42.56 Gy/16 fractions [fx] + 10-12.5 Gy/4-5 fx boost) or CF-WBI (50 Gy/25 fx + 10-14 Gy/5-7 fx). At 1 year after treatment we collected digital photographs, patient-reported cosmesis using the Breast Cancer Treatment and Outcomes Scale, and physician-reported cosmesis using the Radiation Therapy Oncology Group scale. Six quantitative measures of breast symmetry, labeled M1-M6, were calculated from anteroposterior digital photographs. For each measure, values closer to 1 imply greater symmetry, and values closer to 0 imply greater asymmetry. Associations between M1-M6 and patient- and physician-reported cosmesis and treatment arm were evaluated using the Kruskal-Wallis test. Among 245 evaluable patients, patient-reported cosmesis was strongly associated with M1 (vertical symmetry measure) (P<.01). Physician-reported cosmesis was similarly correlated with M1 (P<.01) and also with M2 (vertical symmetry, P=.01) and M4 (horizontal symmetry, P=.03). At 1 year after treatment, HF-WBI resulted in better values of M2 (P=.02) and M3 (P<.01) than CF-WBI; treatment arm was not significantly associated with M1, M4, M5, or M6 (P≥.12). Quantitative assessment of breast photographs reveals similar to improved cosmetic outcome with HF-WBI compared with CF-WBI 1 year after treatment. Assessing cosmetic outcome using these measures could be useful for future comparative effectiveness studies and outcome reporting. Copyright © 2016 Elsevier Inc. All rights reserved.
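
    The Kruskal-Wallis test used throughout this study reduces to a simple rank computation. A pure-Python sketch follows; the symmetry values and category groupings are invented, not the trial's data, and tied values are ignored for brevity.

```python
# Kruskal-Wallis H statistic, as used to relate the symmetry measures
# M1-M6 to cosmesis categories. Pure-Python sketch; assumes all values
# are distinct (no tie correction).
def kruskal_h(*groups):
    """H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1), ranks from 1."""
    pooled = sorted(v for g in groups for v in g)
    rank = {v: i + 1 for i, v in enumerate(pooled)}
    n = len(pooled)
    h = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Hypothetical M1 (vertical symmetry) values per cosmesis category:
excellent = [0.95, 0.92, 0.90, 0.88]
good      = [0.85, 0.80, 0.78]
fair_poor = [0.60, 0.55, 0.50]
H = kruskal_h(excellent, good, fair_poor)  # compare to chi-square, df = 2
```

    Here H ≈ 8.02 exceeds the 5% chi-square critical value for 2 degrees of freedom (5.99), so these toy groups would be declared significantly different; `scipy.stats.kruskal` gives the same statistic with tie handling.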

  9. A Quantitative Needs Assessment Technique for Cross-Cultural Work Adjustment Training.

    ERIC Educational Resources Information Center

    Selmer, Lyn

    2000-01-01

    A study of 67 Swedish expatriate bosses and 104 local Hong Kong middle managers tested a quantitative needs assessment technique measuring work values. Two-thirds of middle managers' work values were not correctly estimated by their bosses, especially instrumental values (pay, benefits, security, working hours and conditions), indicating a need…

  10. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
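
    The pure-standard (external) calibration approach mentioned above amounts to fitting and then inverting a straight line. A minimal sketch, with invented intensities and a hypothetical unknown sample:

```python
# External calibration as described for ICP-MS: fit a line to standards
# of known concentration, then invert it for an unknown sample.
# All intensities below are invented for illustration.
def linear_fit(x, y):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical calibration standards: concentration (ug/L) vs counts/s,
# i.e. 1000 counts per ug/L on top of a 50 counts/s blank.
conc      = [0.0, 1.0, 5.0, 10.0, 20.0]
intensity = [50.0, 1050.0, 5050.0, 10050.0, 20050.0]
a, b = linear_fit(conc, intensity)
unknown = (8050.0 - a) / b  # invert the calibration for a sample reading
```

    Matrix-matched standards and certified reference materials follow the same arithmetic; they differ only in how closely the standards mimic the sample matrix, which is what controls bias.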

  11. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in the event of an accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage to reduce the number of gas pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity, and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, risk sources of third party damage are identified exactly; then the weight of each factor is determined via an improved AHP; finally, the importance of each factor is calculated with a fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage of natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
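
    The two-step AHP-plus-FCE model described here can be sketched compactly. The pairwise comparison matrix, factor names, and membership grades below are invented for illustration; the row-geometric-mean weights are a common approximation to the AHP principal eigenvector, not necessarily the paper's "improved AHP".

```python
# AHP weights (geometric-mean approximation) combined with a fuzzy
# comprehensive evaluation B = w . R. All numbers are invented examples.
import math

def ahp_weights(pairwise):
    """Approximate AHP priority vector via row geometric means."""
    gm = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    s = sum(gm)
    return [g / s for g in gm]

def fuzzy_evaluate(weights, membership):
    """Weighted-average fuzzy composition over the risk grades."""
    grades = len(membership[0])
    return [sum(w * row[j] for w, row in zip(weights, membership))
            for j in range(grades)]

# Three hypothetical third-party-damage factors (excavation activity,
# pipeline marking, patrol frequency) on the 1-9 Saaty scale:
A = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 2.0],
     [1 / 5, 1 / 2, 1.0]]
w = ahp_weights(A)
# Membership of each factor in the grades (low, medium, high risk):
R = [[0.1, 0.3, 0.6],
     [0.3, 0.5, 0.2],
     [0.6, 0.3, 0.1]]
B = fuzzy_evaluate(w, R)  # overall membership in each risk grade
```

    With these invented numbers the "high risk" grade receives the largest overall membership, which is how the maximum-membership rule would rank the pipeline segment.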

  12. Improvement of medical content in the curriculum of biomedical engineering based on assessment of students outcomes.

    PubMed

    Abdulhay, Enas; Khnouf, Ruba; Haddad, Shireen; Al-Bashir, Areen

    2017-08-04

    Improvement of medical content in Biomedical Engineering curricula based on a qualitative assessment process, or on a comparison with another high-standard program, has been approached by a number of studies. However, quantitative assessment tools have not been emphasized. Quantitative assessment tools can be more accurate and robust for challenging multidisciplinary fields like Biomedical Engineering, which mixes elements of biomedicine with aspects of technology. The major limitations of previous research are a heavy dependence on surveys or purely qualitative approaches, as well as the absence of a strong focus on medical outcomes without implicit confusion with the technical ones. The proposed work presents the development and evaluation of an accurate and robust quantitative approach to the improvement of the medical content in the challenging multidisciplinary BME curriculum. The work presents quantitative assessment tools and the subsequent improvement of curriculum medical content, applied as an illustrative example to the ABET (Accreditation Board for Engineering and Technology, USA) accredited biomedical engineering (BME) department at Jordan University of Science and Technology. The quantitative results of assessment of curriculum/course, capstone, exit exam, and course assessment by student (CAS), as well as of surveys filled in by alumni, seniors, employers, and training supervisors, were first mapped to the expected students' outcomes related to the medical field (SOsM). The collected data were then analyzed and discussed to find curriculum weak points by tracking shortcomings in the degree of achievement of every outcome. Finally, actions were taken to fill the gaps in the curriculum; these actions were also mapped to the students' medical outcomes (SOsM). Weighted averages of the obtained quantitative values, mapped to SOsM, accurately indicated the achievement levels of all outcomes as well as the necessary improvements to be performed in the curriculum.

  13. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data, we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
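
    The distribution-fitting and bootstrap steps described above can be sketched briefly. A minimal example, assuming an exponential model for invented storage-time answers; the actual survey fitted and compared several candidate distributions per question.

```python
# Fit a distribution to survey answers and bootstrap the uncertainty of
# the fitted parameter. Exponential model and data are invented.
import random

random.seed(7)
storage_days = [1, 2, 2, 3, 4, 4, 5, 7, 9, 14]  # hypothetical answers

def exp_rate(sample):
    """Maximum-likelihood rate of an exponential: 1 / sample mean."""
    return len(sample) / sum(sample)

point = exp_rate(storage_days)
# Nonparametric bootstrap: resample respondents with replacement.
reps = sorted(exp_rate(random.choices(storage_days, k=len(storage_days)))
              for _ in range(1000))
ci_low, ci_high = reps[24], reps[974]  # approximate 95% percentile interval
```

    The percentile interval (indices 25 and 975 of the 1000 sorted replicates) is the simplest bootstrap summary; a risk model would then propagate the whole replicate set rather than just the endpoints.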

  14. Clinical assessment is an accurate predictor of which patients will need septoplasty.

    PubMed

    Sedaghat, Ahmad R; Busaba, Nicolas Y; Cunningham, Michael J; Kieff, David A

    2013-01-01

    Septoplasty is a frequently performed surgical procedure with the most common indication being nasal airway obstruction. Almost universally, health insurance companies mandate a trial of medical therapy consisting of intranasal corticosteroids prior to performance of septoplasty regardless of clinical assessment. Evidence for this requirement is lacking. We sought to evaluate the initial clinical assessment as a predictor of response to this mandated trial of medical treatment. Retrospective review of prospectively collected data on 137 consecutive patients who presented with symptoms of nasal obstruction and a deviated nasal septum on physical examination. Patients were placed into one of three cohorts based on prediction of 1) failure of medical therapy with subsequent septoplasty, 2) success of medical therapy without subsequent septoplasty, or 3) unable to make a prediction. Patients from each cohort were assessed for subsequent response to medical therapy and ultimate need for septoplasty. Overall clinical assessment had a sensitivity of 86.9%, specificity of 91.8%, positive predictive value of 93.6%, and negative predictive value of 96.4% for detecting/predicting need for septoplasty. The accuracy of the overall clinical assessment is considerably better than severe deviation at any one septal anatomical site. Of patients whose response to medical therapy could not be predicted, 61.3% failed medical therapy and needed surgery; this is statistically equivalent to a 50/50 distribution between either needing septoplasty or not. Clinical assessment at initial presentation of patients with nasal obstruction and deviated septum is highly accurate in predicting which patients will need septoplasty. Copyright © 2012 The American Laryngological, Rhinological, and Otological Society, Inc.
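
    The four accuracy figures quoted in this abstract follow directly from a 2×2 confusion table. A small sketch with invented counts (the abstract does not give the raw table):

```python
# Sensitivity, specificity, PPV and NPV from a 2x2 confusion table.
# The counts below are hypothetical, not the study's data.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# e.g. predicting "will need septoplasty" against the observed outcome:
m = diagnostic_metrics(tp=80, fp=5, fn=10, tn=40)
```

    Note that sensitivity and specificity are properties of the assessment itself, while PPV and NPV also depend on how common surgery turns out to be in the cohort.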

  15. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    USDA-ARS?s Scientific Manuscript database

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  16. Quantitative 3D breast magnetic resonance imaging fibroglandular tissue analysis and correlation with qualitative assessments: a feasibility study.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-04-01

    The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB-approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm which combines region-based active contours and a level set approach. FGT (%) was calculated as: [segmented volume of FGT (mm³)/segmented volume of whole breast (mm³)] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT correlated with the calculated mean quantitative FGT (%) of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed with data derived from breast MRI and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable tool in clinical use by providing computer generated standardized
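
    The FGT (%) formula above is a simple volume ratio. A minimal sketch, assuming hypothetical voxel counts and voxel spacing (the segmentation step itself is out of scope here):

```python
# FGT (%) from segmentation volumes, per the abstract's formula:
# FGT% = FGT volume (mm^3) / whole-breast volume (mm^3) * 100.
# Voxel counts and spacing are invented for illustration.
def volume_mm3(n_voxels, spacing_mm=(0.9, 0.9, 2.0)):
    """Convert a voxel count to mm^3 given (x, y, z) voxel spacing."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return n_voxels * voxel_mm3

def fgt_percent(fgt_voxels, breast_voxels, spacing_mm=(0.9, 0.9, 2.0)):
    return (100.0 * volume_mm3(fgt_voxels, spacing_mm)
            / volume_mm3(breast_voxels, spacing_mm))

pct = fgt_percent(fgt_voxels=120_000, breast_voxels=650_000)
```

    Because both volumes are measured on the same image, the voxel size cancels; it only matters when reporting absolute volumes in mm³.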

  17. The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses

    NASA Astrophysics Data System (ADS)

    Kirkman, Thomas W.; Jensen, Ellen

    2016-06-01

    The innumeracy of American students and adults is a much lamented educational problem. The quantitative reasoning skills of college students may be particularly addressed and improved in "general education" science courses like Astro 101. Demonstrating improvement requires a standardized instrument. Among the non-proprietary instruments, the Quantitative Literacy and Reasoning Assessment (QLRA) [1] and the Quantitative Reasoning for College Science (QuaRCS) Assessment [2] stand out. Follette et al. developed the QuaRCS in the context of Astro 101 at the University of Arizona. We report on QuaRCS results in different contexts: pre-med physics and pre-nursing microbiology at a liberal arts college. We report on the mismatch between students' contemporaneous reports of a question's difficulty and the actual probability of success. We report correlations between QuaRCS and other assessments of overall student performance in the class. We report differences in attitude towards mathematics in these two different but health-related student populations. [1] QLRA, Gaze et al., 2014, DOI: http://dx.doi.org/10.5038/1936-4660.7.2.4 [2] QuaRCS, Follette et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2

  18. Addressing variability in the acoustic startle reflex for accurate gap detection assessment.

    PubMed

    Longenecker, Ryan J; Kristaponyte, Inga; Nelson, Gregg L; Young, Jesse W; Galazyuk, Alexander V

    2018-06-01

    The acoustic startle reflex (ASR) is subject to substantial variability. This inherent variability consequently shapes the conclusions drawn from gap-induced prepulse inhibition of the acoustic startle reflex (GPIAS) assessments. Recent studies have cast doubt on the efficacy of this methodology as it pertains to tinnitus assessment, partially due to variability within and between data sets. The goal of this study was to examine the variance associated with several common data collection variables and data analyses, with the aim of improving GPIAS reliability. To study this, GPIAS tests were conducted in adult male and female CBA/CaJ mice. Factors such as inter-trial interval, circadian rhythm, sex differences, and sensory adaptation were each evaluated. We then examined various data analysis factors which influence GPIAS assessment. Gap-induced facilitation, data processing options, and assessments of tinnitus were studied. We found that the startle reflex is highly variable in CBA/CaJ mice, but that this variability can be minimized by certain data collection factors. We also found that careful consideration of temporal fluctuations of the ASR, and controlling for facilitation, can lead to more accurate GPIAS results. This study provides a guide for reducing variance in the GPIAS methodology, thereby improving the diagnostic power of the test. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Understanding outbreaks of waterborne infectious disease: quantitative microbial risk assessment vs. epidemiology

    USDA-ARS?s Scientific Manuscript database

    Drinking water contaminated with microbial pathogens can cause outbreaks of infectious disease, and these outbreaks are traditionally studied using epidemiologic methods. Quantitative microbial risk assessment (QMRA) can predict – and therefore help prevent – such outbreaks, but it has never been r...

  20. QUANTITATIVE ASSESSMENT OF CORAL DISEASES IN THE FLORIDA KEYS: STRATEGY AND METHODOLOGY

    EPA Science Inventory

    Most studies of coral disease have focused on the incidence of a single disease within a single location. Our overall objective is to use quantitative assessments to characterize annual patterns in the distribution and frequency of scleractinian and gorgonian coral diseases over ...

  1. The Novel Quantitative Technique for Assessment of Gait Symmetry Using Advanced Statistical Learning Algorithm

    PubMed Central

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with the different statistical distributions of gait variables from the left and right lower limbs; that is, discriminating the small difference in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method, designed around an advanced statistical learning algorithm (a support vector machine for binary classification), was adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs that were missed by the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in the elderly in clinical diagnosis. PMID:25705672

  2. The novel quantitative technique for assessment of gait symmetry using advanced statistical learning algorithm.

    PubMed

    Wu, Jianning; Wu, Bin

    2015-01-01

    The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, based on the assumption that the degree of intrinsic change in the dynamical system of gait is associated with the different statistical distributions of gait variables from the left and right lower limbs; that is, discriminating the small difference in similarity between the lower limbs is treated as recognizing their different probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method, designed around an advanced statistical learning algorithm (a support vector machine for binary classification), was adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs that were missed by the traditional symmetry index method for gait. The proposed algorithm could become an effective tool for the early identification of gait asymmetry in the elderly in clinical diagnosis.
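
    The binary left-versus-right classification set-up described here can be illustrated with a toy linear classifier. The paper uses a support vector machine; the perceptron below is a deliberately simplified stand-in, trained on invented gait features, only to show the shape of the problem.

```python
# Binary classification of left/right gait feature vectors with a
# minimal perceptron (a simplified stand-in for the paper's SVM).
# Features and labels below are invented.
def train_perceptron(X, y, epochs=50, lr=0.1):
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):  # yi in {-1, +1}: left vs right limb
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) <= 0:
                w = [wj + lr * yi * xj for wj, xj in zip(w, xi)]
                b += lr * yi
    return w, b

def predict(w, b, xi):
    return 1 if sum(wj * xj for wj, xj in zip(w, xi)) + b > 0 else -1

# Hypothetical (peak force, stance time) features; +1 = right, -1 = left.
X = [[1.0, 0.9], [1.1, 1.0], [0.9, 1.1], [0.4, 0.3], [0.5, 0.2], [0.3, 0.4]]
y = [1, 1, 1, -1, -1, -1]
w, b = train_perceptron(X, y)
acc = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
```

    In the paper's framing, a classifier that separates left from right limbs easily indicates asymmetric gait, whereas near-chance accuracy suggests the two limbs' distributions are indistinguishable, i.e. symmetric gait.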

  3. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.

  4. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204

  5. Quantitative motor assessment of muscular weakness in myasthenia gravis: a pilot study.

    PubMed

    Hoffmann, Sarah; Siedler, Jana; Brandt, Alexander U; Piper, Sophie K; Kohler, Siegfried; Sass, Christian; Paul, Friedemann; Reilmann, Ralf; Meisel, Andreas

    2015-12-23

    Muscular weakness in myasthenia gravis (MG) is commonly assessed using Quantitative Myasthenia Gravis Score (QMG). More objective and quantitative measures may complement the use of clinical scales and might detect subclinical affection of muscles. We hypothesized that muscular weakness in patients with MG can be quantified with the non-invasive Quantitative Motor (Q-Motor) test for Grip Force Assessment (QGFA) and Involuntary Movement Assessment (QIMA) and that pathological findings correlate with disease severity as measured by QMG. This was a cross-sectional pilot study investigating patients with confirmed diagnosis of MG. Data was compared to healthy controls (HC). Subjects were asked to lift a device (250 and 500 g) equipped with electromagnetic sensors that measured grip force (GF) and three-dimensional changes in position and orientation. These were used to calculate the position index (PI) and orientation index (OI) as measures for involuntary movements due to muscular weakness. Overall, 40 MG patients and 23 HC were included. PI and OI were significantly higher in MG patients for both weights in the dominant and non-dominant hand. Subgroup analysis revealed that patients with clinically ocular myasthenia gravis (OMG) also showed significantly higher values for PI and OI in both hands and for both weights. Disease severity correlates with QIMA performance in the non-dominant hand. Q-Motor tests and particularly QIMA may be useful objective tools for measuring motor impairment in MG and seem to detect subclinical generalized motor signs in patients with OMG. Q-Motor parameters might serve as sensitive endpoints for clinical trials in MG.

  6. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  7. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971
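    The external-calibration approach mentioned above reduces to fitting a line through the signals of standards and inverting it for unknowns; a minimal sketch (the standard concentrations and ion counts below are hypothetical):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a*x + b for a calibration line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibration standards: concentration (ug/L) vs ion counts.
conc = [0.0, 1.0, 5.0, 10.0]
counts = [50, 1050, 5050, 10050]
slope, intercept = linear_fit(conc, counts)

# Quantify an unknown sample from its measured signal.
unknown_counts = 3050
unknown_conc = (unknown_counts - intercept) / slope
```

    Matrix-matched standards and certified reference materials refine this same arithmetic by making the calibration line's matrix resemble the sample's.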

  8. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of available approaches allowing a quantitative analysis of the performance of the human movement control system, the clinical assessment and diagnostic approach to fall risk still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits of these scales, i.e. limited granularity and limited inter-/intra-examiner reliability, to obtain objective scores and more detailed information allowing fall risk to be predicted. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained good accuracy (~82%) and especially high sensitivity (~83%).
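    The classification step can be sketched with a minimal nearest-centroid classifier; the feature values, labels, and train/test split below are hypothetical stand-ins, not the study's Kinect parameters or classifier:

```python
import math

def centroid(rows):
    """Mean of each feature across rows."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid_fit(X, y):
    """Return one centroid per class label."""
    return {label: centroid([x for x, t in zip(X, y) if t == label])
            for label in set(y)}

def predict(model, x):
    """Assign the label of the closest centroid."""
    return min(model, key=lambda label: math.dist(x, model[label]))

# Hypothetical balance features (e.g. sway amplitude, hold duration);
# label 1 = "at risk", 0 = "not at risk".
X_train = [[0.2, 30], [0.3, 28], [1.1, 12], [1.3, 10]]
y_train = [0, 0, 1, 1]
model = nearest_centroid_fit(X_train, y_train)

X_test = [[0.25, 29], [1.2, 11], [0.9, 14]]
y_test = [0, 1, 1]
pred = [predict(model, x) for x in X_test]

accuracy = sum(p == t for p, t in zip(pred, y_test)) / len(y_test)
# Sensitivity = true positives / all actual positives.
positives = [i for i, t in enumerate(y_test) if t == 1]
sensitivity = sum(pred[i] == 1 for i in positives) / len(positives)
```

    The study's reported accuracy (~82%) and sensitivity (~83%) would be computed from held-out predictions in exactly this way, whatever classifier is used.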

  9. Comparison of semi-quantitative and quantitative dynamic contrast-enhanced MRI evaluations of vertebral marrow perfusion in a rat osteoporosis model.

    PubMed

    Zhu, Jingqi; Xiong, Zuogang; Zhang, Jiulong; Qiu, Yuyou; Hua, Ting; Tang, Guangyu

    2017-11-14

    This study aims to investigate the technical feasibility of semi-quantitative and quantitative dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in the assessment of longitudinal changes of marrow perfusion in a rat osteoporosis model, using bone mineral density (BMD) measured by micro-computed tomography (micro-CT) and histopathology as the gold standards. Fifty rats were randomly assigned to the control group (n=25) and the ovariectomy (OVX) group, whose bilateral ovaries were excised (n=25). Semi-quantitative and quantitative DCE-MRI, micro-CT, and histopathological examinations were performed on lumbar vertebrae at baseline and 3, 6, 9, and 12 weeks after operation. The differences between the two groups in terms of the semi-quantitative DCE-MRI parameter (maximum enhancement, Emax), quantitative DCE-MRI parameters (volume transfer constant, Ktrans; interstitial volume, Ve; and efflux rate constant, Kep), micro-CT parameter (BMD), and histopathological parameter (microvessel density, MVD) were compared at each of the time points using an independent-sample t test. The differences in these parameters between baseline and the other time points in each group were assessed via Bonferroni's multiple comparison test. A Pearson correlation analysis was applied to assess the relationships between the DCE-MRI, micro-CT, and histopathological parameters. In the OVX group, the Emax values decreased significantly compared with those of the control group at weeks 6 and 9 (p=0.003 and 0.004, respectively). The Ktrans values decreased significantly compared with those of the control group from week 3 (p<0.05). However, the Ve values decreased significantly only at week 9 (p=0.032), and no difference in Kep was found between the two groups. The BMD values of the OVX group decreased significantly compared with those of the control group from week 3 (p<0.05). Transmission electron microscopy showed tighter gaps between vascular endothelial cells with swollen mitochondria.
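    The Pearson correlation step used to relate the DCE-MRI, micro-CT, and histopathological parameters can be sketched in a few lines; the paired Ktrans and microvessel-density values below are hypothetical, not the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired values: Ktrans (min^-1) vs microvessel density
# (vessels per high-power field) for five animals.
ktrans = [0.12, 0.10, 0.09, 0.07, 0.05]
mvd = [42, 39, 35, 30, 24]
r = pearson_r(ktrans, mvd)
```

    A value of r near 1 would indicate that the perfusion parameter tracks vessel density closely across animals.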

  10. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  11. Methods of quantitative risk assessment: The case of the propellant supply system

    NASA Astrophysics Data System (ADS)

    Merz, H. A.; Bienz, A.

    1984-08-01

    As a consequence of the disastrous accident in Lapua (Finland) in 1976, where an explosion in a cartridge loading facility killed 40 and injured more than 70 persons, efforts were undertaken to examine and improve the safety of such installations. An ammunition factory in Switzerland considered the replacement of the manual supply of propellant hoppers by a new pneumatic supply system. This would reduce the maximum quantity of propellant in the hoppers to a level, where an accidental ignition would no longer lead to a detonation, and this would drastically limit the effects on persons. A quantitative risk assessment of the present and the planned supply system demonstrated that, in this particular case, the pneumatic supply system would not reduce the risk enough to justify the related costs. In addition, it could be shown that the safety of the existing system can be improved more effectively by other safety measures at considerably lower costs. Based on this practical example, the advantages of a strictly quantitative risk assessment for the safety planning in explosives factories are demonstrated. The methodological background of a risk assessment and the steps involved in the analysis are summarized. In addition, problems of quantification are discussed.

  12. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boquerón ...

  13. [A quantitative risk assessment model of salmonella on carcass in poultry slaughterhouse].

    PubMed

    Zhang, Yu; Chen, Yuzhen; Hu, Chunguang; Zhang, Huaning; Bi, Zhenwang; Bi, Zhenqiang

    2015-05-01

    To construct a quantitative risk assessment model of Salmonella on carcasses in a poultry slaughterhouse and to find effective interventions to reduce Salmonella contamination, we constructed a modular process risk model (MPRM) from evisceration to chilling in an Excel sheet, using the process parameters in poultry and the Salmonella concentration surveillance data of Jinan in 2012. The MPRM was simulated with @Risk software. The concentration of Salmonella on carcasses after chilling, as calculated by the model, was 1.96 MPN/g. The sensitivity analysis indicated that the correlation coefficients of the concentration of Salmonella after defeathering and in the chilling pool were 0.84 and 0.34, respectively, making these the primary factors determining the concentration of Salmonella on carcasses after chilling. The study provides a quantitative assessment model structure for Salmonella on carcasses in poultry slaughterhouses. Risk managers could control the contamination of Salmonella on carcasses after chilling by reducing the concentration of Salmonella after defeathering and in the chilling pool.
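    The modular-process idea, a concentration propagated stage by stage with stochastic stage effects, can be sketched as a small Monte Carlo simulation; the stage names, distributions, and parameters below are hypothetical, not the study's @Risk model:

```python
import random
import statistics

random.seed(7)

def simulate_carcass_concentration(n_iter=10_000):
    """Monte Carlo sketch of a modular process risk model: the log10
    concentration after each stage is the previous value plus a
    normally distributed stage effect (hypothetical parameters)."""
    results = []
    for _ in range(n_iter):
        log_c = random.gauss(2.0, 0.5)    # after defeathering (log10 MPN/g)
        log_c += random.gauss(-0.8, 0.3)  # evisceration/washing reduction
        log_c += random.gauss(-0.9, 0.4)  # chilling reduction
        results.append(10 ** log_c)
    return statistics.mean(results), statistics.median(results)

mean_final, median_final = simulate_carcass_concentration()
```

    Sensitivity analysis then amounts to correlating each stage's sampled value with the final concentration across iterations, which is how the study identifies defeathering and the chilling pool as the dominant factors.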

  14. Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk
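    The ranking step described above follows the classic FMEA pattern of a risk priority number: the product of probability, severity, and detection-difficulty scores. A minimal sketch (the failure modes and scores below are hypothetical, not the model's actual inputs):

```python
# Hypothetical failure modes:
# (name, probability 1-10, severity 1-10, detection difficulty 1-10)
failure_modes = [
    ("wellbore casing leak", 4, 9, 6),
    ("caprock fracture",     2, 10, 8),
    ("pipeline corrosion",   6, 5, 3),
]

def risk_priority(mode):
    """FMEA-style risk priority number:
    probability x severity x detection difficulty."""
    _, p, s, d = mode
    return p * s * d

# Highest-priority risks first.
ranked = sorted(failure_modes, key=risk_priority, reverse=True)
```

    The QFMEA model additionally weighs the potential for fatalities and attaches cost estimates to each mode, but the prioritization arithmetic is of this multiplicative form.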

  15. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161

  16. Assessment of Scientific Literacy: Development and Validation of the Quantitative Assessment of Socio-Scientific Reasoning (QuASSR)

    ERIC Educational Resources Information Center

    Romine, William L.; Sadler, Troy D.; Kinslow, Andrew T.

    2017-01-01

    We describe the development and validation of the Quantitative Assessment of Socio-scientific Reasoning (QuASSR) in a college context. The QuASSR contains 10 polytomous, two-tiered items crossed between two scenarios, and is based on theory suggesting a four-pronged structure for SSR (complexity, perspective taking, inquiry, and skepticism). In…

  17. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens.

    PubMed

    Larson, Jeffrey S; Goodman, Laurie J; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C; Cook, Jennifer W; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D B; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J; Whitcomb, Jeannette M

    2010-06-28

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7-10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH).

  18. Stroke onset time estimation from multispectral quantitative magnetic resonance imaging in a rat model of focal permanent cerebral ischemia.

    PubMed

    McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A

    2016-08-01

    Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times alone and in combination to provide estimates of stroke onset time in a rat model of permanent focal cerebral ischemia and map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4 T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion, allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate with an uncertainty of ±25 min. At all time-points, regions with elevated relaxation times were smaller than areas with Dav-defined ischemia. Stroke onset time can be determined by quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.
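    The onset-time logic amounts to inverting the growth of the relaxation-time difference between ischemic and contralateral tissue; a deliberately simplified sketch assuming linear growth (the growth rate and ΔT2 value below are hypothetical, not the study's fitted time courses):

```python
def estimate_onset_minutes(delta_t2_ms, growth_rate_ms_per_min):
    """Invert an assumed linear growth of the T2 difference between
    ischemic and contralateral tissue to estimate time since onset."""
    return delta_t2_ms / growth_rate_ms_per_min

# Hypothetical measurement: a 6 ms ΔT2 with an assumed growth rate of
# 0.05 ms/min implies roughly two hours since occlusion.
elapsed_min = estimate_onset_minutes(6.0, 0.05)
```

    Combining several such estimators (ΔT1, ΔT2, f1, f2, V(Overlap)) is what narrows the uncertainty, with the study reporting ±25 min for the combined V(Overlap) measure.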

  19. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used. Copyright © 2016 Elsevier Inc. All rights reserved.
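    The arithmetic behind skin-sensitization QRA divides a no-expected-sensitization-induction level (NESIL) by sensitization assessment factors (SAFs) to obtain an acceptable exposure level (AEL), then compares it with the consumer exposure level (CEL). A minimal sketch (the NESIL, SAF, and CEL values below are hypothetical):

```python
def acceptable_exposure_level(nesil, inter_individual=10, matrix=1, use=1):
    """AEL = NESIL divided by the product of sensitization assessment
    factors (inter-individual variability, product matrix, use pattern)."""
    return nesil / (inter_individual * matrix * use)

nesil = 250.0  # hypothetical NESIL, ug/cm2/day
ael = acceptable_exposure_level(nesil, inter_individual=10, matrix=3, use=3)

cel = 1.5      # hypothetical consumer exposure level, ug/cm2/day
acceptable = cel <= ael
```

    The review's two criteria map directly onto this computation: rigorous application governs the SAF choices on the hazard side, and adequate exposure assessment governs the CEL on the other.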

  20. Quantitative analysis for peripheral vascularity assessment based on clinical photoacoustic and ultrasound images

    NASA Astrophysics Data System (ADS)

    Murakoshi, Dai; Hirota, Kazuhiro; Ishii, Hiroyasu; Hashimoto, Atsushi; Ebata, Tetsurou; Irisawa, Kaku; Wada, Takatsugu; Hayakawa, Toshiro; Itoh, Kenji; Ishihara, Miya

    2018-02-01

    Photoacoustic (PA) imaging technology is expected to be applied to the clinical assessment of peripheral vascularity. We started a clinical evaluation with the prototype PA imaging system we recently developed. The prototype PA imaging system comprised an in-house Q-switched Alexandrite laser emitting short laser pulses at a 750 nm wavelength, a handheld ultrasound transducer with integrated illumination optics, and signal processing for PA image reconstruction implemented in the clinical ultrasound (US) system. For the purpose of quantitative assessment of PA images, an image analysis function was developed and applied to clinical PA images. In this function, vascularity, derived from the PA signal intensity falling within a prescribed threshold range, was defined as a numerical index of vessel fulfillment and calculated for a prescribed region of interest (ROI). The skin surface was automatically detected by utilizing the B-mode image acquired simultaneously with the PA image. The skin-surface position is utilized to place the ROI objectively while avoiding unwanted signals such as artifacts caused by melanin pigment in the epidermal layer, which absorbs the laser emission and generates strong PA signals. Scanned sets of multiple images were supported for 3D viewing. PA images of several fingers of patients with systemic sclerosis (SSc) were quantitatively assessed. Since the artifact region is trimmed off in the PA images, the visibility of vessels with rather low PA signal intensity on the 3D projection image was enhanced and the reliability of the quantitative analysis was improved.
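    The vascularity index described above can be sketched as the fraction of ROI pixels, placed below the detected skin surface, whose PA intensity clears a threshold; the image values, surface row, and threshold below are hypothetical:

```python
# Hypothetical PA intensity image (rows x cols); higher = stronger signal.
image = [
    [9, 9, 9, 9],   # strong surface/melanin artifact row
    [1, 5, 6, 1],
    [0, 7, 2, 0],
    [1, 1, 6, 1],
]

def vascularity_index(img, surface_row, threshold):
    """Fraction of ROI pixels (below the detected skin surface) whose
    PA intensity is at or above the threshold."""
    roi = [px for row in img[surface_row + 1:] for px in row]
    return sum(px >= threshold for px in roi) / len(roi)

vi = vascularity_index(image, surface_row=0, threshold=5)
```

    Excluding the rows at and above the detected surface is what trims off the epidermal melanin artifact before the index is computed.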

  1. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    USDA-ARS?s Scientific Manuscript database

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  2. Application of a Multiplex Quantitative PCR to Assess Prevalence and Intensity Of Intestinal Parasite Infections in a Controlled Clinical Trial

    PubMed Central

    Llewellyn, Stacey; Inpankaew, Tawin; Nery, Susana Vaz; Gray, Darren J.; Verweij, Jaco J.; Clements, Archie C. A.; Gomes, Santina J.; Traub, Rebecca; McCarthy, James S.

    2016-01-01

    Background: Accurate quantitative assessment of infection with soil-transmitted helminths and protozoa is key to the interpretation of epidemiologic studies of these parasites, as well as for monitoring large-scale treatment efficacy and effectiveness studies. As morbidity and transmission of helminth infections are directly related to both the prevalence and intensity of infection, there is a particular need for improved techniques for assessment of infection intensity for both purposes. The current study aimed to evaluate two multiplex PCR assays to determine the prevalence and intensity of intestinal parasite infections, and to compare them to standard microscopy. Methodology/Principal Findings: Faecal samples were collected from a total of 680 people, originating from rural communities in Timor-Leste (467 samples) and Cambodia (213 samples). DNA was extracted from stool samples and subjected to two multiplex real-time PCR reactions, the first targeting Necator americanus, Ancylostoma spp., Ascaris spp., and Trichuris trichiura; and the second Entamoeba histolytica, Cryptosporidium spp., Giardia duodenalis, and Strongyloides stercoralis. Samples were also subjected to sodium nitrate flotation for identification and quantification of STH eggs, and zinc sulphate centrifugal flotation for detection of protozoan parasites. Higher parasite prevalence was detected by multiplex PCR (hookworms 2.9 times higher, Ascaris 1.2, Giardia 1.6), along with superior polyparasitism detection, with this effect magnified as the number of parasites present increased (one: 40.2% vs. 38.1%, two: 30.9% vs. 12.9%, three: 7.6% vs. 0.4%, four: 0.4% vs. 0%). Although all STH-positive samples were low-intensity infections by microscopy as defined by WHO guidelines, the DNA load detected by multiplex PCR suggested higher-intensity infections. Conclusions/Significance: Multiplex PCR, in addition to superior sensitivity, enabled more accurate determination of infection intensity for Ascaris, hookworms and
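    The prevalence comparisons above reduce to simple paired proportions; a sketch with hypothetical positive counts chosen only to illustrate the arithmetic (the record does not give the raw counts):

```python
def prevalence(positives, total):
    """Proportion of tested samples that are positive."""
    return positives / total

# Hypothetical counts from paired testing of the same 680 stool samples.
pcr_pos, micro_pos, n = 203, 70, 680
prevalence_ratio = prevalence(pcr_pos, n) / prevalence(micro_pos, n)
```

    Because both assays test the same denominator, the prevalence ratio collapses to the ratio of positive counts, which is how a figure like "hookworms 2.9 times higher" is obtained.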

  3. Generalized PSF modeling for optimized quantitation in PET imaging.

    PubMed

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF
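    The contrast recovery coefficient assessed above measures how much of the true tumour-to-background contrast survives reconstruction; a minimal sketch (the SUV values and true ratio below are hypothetical):

```python
def contrast_recovery_coefficient(measured_tumor, measured_bkg, true_ratio):
    """CRC = (measured tumour/background - 1) / (true tumour/background - 1).
    CRC = 1 means contrast is fully recovered; < 1 means partial-volume loss."""
    return (measured_tumor / measured_bkg - 1) / (true_ratio - 1)

# Hypothetical phantom: true tumour:background ratio of 4:1,
# reconstructed mean SUVs of 3.1 (tumour) and 1.0 (background).
crc = contrast_recovery_coefficient(3.1, 1.0, 4.0)
```

    An overestimated PSF kernel sharpening the tumour edge pushes the measured ratio, and hence the CRC, upward, which is the mechanism behind the improved contrast recovery reported above.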

  4. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  5. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  6. Laser gingival retraction: a quantitative assessment.

    PubMed

    Krishna Ch, Vamsi; Gupta, Nidhi; Reddy, K Mahendranadh; Sekhar, N Chandra; Aditya, Venkata; Reddy, G V K Mohan

    2013-08-01

    Proper gingival retraction improves the prognosis of crowns and bridges with subgingival finish lines. The use of lasers assists the operator in achieving proper retraction with good clinical results. The present study was intended to quantitatively assess the amount of lateral gingival retraction achieved using diode lasers. The study was carried out on 20 patients attending a dental institution who had undergone root canal treatment and were indicated for crown fabrication. Gingival retraction was carried out on 20 teeth and elastomeric impressions were obtained. Models retrieved from the impressions were sectioned, and the lateral distance between the finish line and the marginal gingiva was measured using a toolmaker's microscope. Retraction was measured in the mid-buccal, mesio-buccal and disto-buccal regions. The values obtained were used to calculate the mean lateral retraction in microns. Mean retraction values of 399.5 μm, 445.5 μm and 422.5 μm were obtained in the mid-buccal, mesio-buccal and disto-buccal regions, respectively. The gingival retraction achieved was close to the thickness of the sulcular epithelium and greater than the minimum required retraction of 200 μm.

  7. Development of CD3 cell quantitation algorithms for renal allograft biopsy rejection assessment utilizing open source image analysis software.

    PubMed

    Moon, Andres; Smith, Geoffrey H; Kong, Jun; Rogers, Thomas E; Ellis, Carla L; Farris, Alton B Brad

    2018-02-01

    Renal allograft rejection diagnosis depends on assessment of parameters such as interstitial inflammation; however, studies have shown interobserver variability regarding interstitial inflammation assessment. Since automated image analysis quantitation can be reproducible, we devised customized analysis methods for CD3+ T-cell staining density as a measure of rejection severity and compared them with established commercial methods along with visual assessment. Renal biopsy CD3 immunohistochemistry slides (n = 45), including renal allografts with various degrees of acute cellular rejection (ACR), were scanned for whole slide images (WSIs). Inflammation was quantitated in the WSIs using pathologist visual assessment, commercial algorithms (Aperio nuclear algorithm for CD3+ cells/mm² and Aperio positive pixel count algorithm), and customized open source algorithms developed in ImageJ with thresholding/positive pixel counting (custom CD3+%) and identification of pixels fulfilling "maxima" criteria for CD3 expression (custom CD3+ cells/mm²). Based on visual inspections of "markup" images, CD3 quantitation algorithms produced adequate accuracy. Additionally, CD3 quantitation algorithms correlated between each other and also with visual assessment in a statistically significant manner (r = 0.44 to 0.94, p = 0.003 to < 0.0001). Methods for assessing inflammation suggested a progression through the tubulointerstitial ACR grades, with statistically different results in borderline versus other ACR types, in all but the custom methods. Assessment of CD3-stained slides using various open source image analysis algorithms presents salient correlations with established methods of CD3 quantitation. These analysis techniques are promising and highly customizable, providing a form of on-slide "flow cytometry" that can facilitate additional diagnostic accuracy in tissue-based assessments.
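    The thresholding/positive-pixel-counting approach behind the custom CD3+% metric can be sketched on a toy image tile; the pixel values and threshold below are hypothetical, not the study's ImageJ settings:

```python
# Hypothetical 8-bit staining-intensity values for one image tile
# (0 = unstained; higher = stronger CD3 staining).
tile = [
    [0, 0, 120, 200],
    [0, 180, 210, 0],
    [0, 0, 0, 150],
]

def positive_pixel_percent(img, threshold=100):
    """Percentage of pixels at or above the staining threshold (CD3+%)."""
    pixels = [px for row in img for px in row]
    return 100.0 * sum(px >= threshold for px in pixels) / len(pixels)

cd3_percent = positive_pixel_percent(tile)
```

    The companion cells/mm² metric instead counts local intensity maxima (one per stained cell) and divides by the tissue area, trading pixel coverage for an object count.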

  8. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Today, standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After numerous kinetic studies, the PE company found a linear relation between the initial template number and the cycle number at which the accumulating fluorescent product becomes detectable. They therefore developed a quantitative PCR technique for use in the PE7700 and PE5700. But the error of this technique is too great to satisfy the needs of biotechnology development and clinical research; a better quantitative PCR technique is needed. The mathematical model submitted here draws on the achievements of related fields and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity) and the initial template number and other reaction conditions, and can accurately reflect the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this functional relation: the accumulated PCR product quantity can be related back to the initial template number. Using this model for quantitative PCR analysis, the result error depends only on the accuracy of the fluorescence intensity measurement, i.e. on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the quantitative result accuracy will exceed 99%. The result error differs markedly between analysis methods even under the same conditions on the same instrument; using this PCR quantitative analysis system to process the data gives results about 80 times more accurate than the CT method.
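    For reference, the conventional CT method the authors benchmark against inverts the exponential amplification model N_threshold = N0·(1+E)^Ct to recover the initial template number N0; a minimal sketch with hypothetical run values:

```python
def initial_template(ct, threshold_copies, efficiency=1.0):
    """Invert the exponential PCR model
    N_threshold = N0 * (1 + E)^Ct to estimate the initial
    template number N0 (E = 1 means perfect doubling per cycle)."""
    return threshold_copies / (1 + efficiency) ** ct

# Hypothetical run: the fluorescence threshold is crossed at cycle 20,
# corresponding to 1e9 product copies, with perfect efficiency (E = 1).
n0 = initial_template(ct=20, threshold_copies=1e9, efficiency=1.0)
```

    The model proposed in the record replaces this single-point inversion with a full functional relation between fluorescence and template number, which is where its claimed accuracy advantage comes from.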

  9. Quantitative Evaluation of MODIS Fire Radiative Power Measurement for Global Smoke Emissions Assessment

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Ellison, Luke

    2011-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and their emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications. 
In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate

  10. Assessing the performance of quantitative image features on early stage prediction of treatment effectiveness for ovary cancer patients: a preliminary investigation

    NASA Astrophysics Data System (ADS)

    Zargari, Abolfazl; Du, Yue; Thai, Theresa C.; Gunderson, Camille C.; Moore, Kathleen; Mannel, Robert S.; Liu, Hong; Zheng, Bin; Qiu, Yuchen

    2018-02-01

The objective of this study is to investigate the performance of global and local features in characterizing highly heterogeneous metastatic tumours, in order to accurately predict treatment effectiveness in advanced-stage ovarian cancer patients. To achieve this, a quantitative image analysis scheme was developed to estimate a total of 103 features from three groups: shape and density, wavelet, and Gray Level Difference Method (GLDM) features. Shape and density features are global features, applied directly to the entire target image; wavelet and GLDM features are local features, applied to divided blocks of the target image. To assess performance, the new scheme was applied to a retrospective dataset of 120 patients with recurrent, high-grade ovarian cancer. The results indicate that the three best-performing features are skewness, root mean square (RMS), and the mean of local GLDM texture, underscoring the importance of integrating local features. In addition, the average prediction performance is comparable across the three categories. This investigation concludes that local features contain at least as much tumour heterogeneity information as global features, which may help improve the predictive performance of quantitative image markers for the diagnosis and prognosis of ovarian cancer patients.

  11. Performance Evaluation and Quantitative Accuracy of Multipinhole NanoSPECT/CT Scanner for Theranostic Lu-177 Imaging

    NASA Astrophysics Data System (ADS)

    Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung

    2018-06-01

SPECT plays an important role in peptide-receptor-targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate, because reliable assessment of tumor uptake and tumor-to-normal-tissue ratios can only be performed on quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point-source and uniform-phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity, and calibration factor for the mouse whole-body standard aperture. We also performed the experiments with Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177, with and without a uniform background, was comparable to that of Tc-99m in the axial, radial, and tangential directions. The system sensitivity measured for Lu-177 was almost three times lower than that of Tc-99m.

  12. Quantitative assessment model for gastric cancer screening

    PubMed Central

    Chen, Kun; Yu, Wei-Ping; Song, Liang; Zhu, Yi-Min

    2005-01-01

AIM: To set up a mathematical model for gastric cancer screening and to evaluate its function in mass screening for gastric cancer. METHODS: A case-control study was carried out in 66 patients and 198 normal people, and the risk and protective factors for gastric cancer were determined, including heavy manual work; foods such as small yellow-fin tuna, dried small shrimps, squills, and crabs; mothers suffering from gastric diseases; spouse alive; and use of refrigerators and hot food. Following principles and methods of probability and fuzzy mathematics, a quantitative assessment model was established as follows: first, we selected the factors that were statistically significant and calculated a weight coefficient for each by two different methods; second, the population space was divided into a gastric cancer fuzzy subset and a non-gastric-cancer fuzzy subset, a mathematical model was established for each subset, and we obtained a mathematical expression for the attribute degree (AD). RESULTS: Based on data from 63 patients and 693 normal people, the AD of each subject was calculated. Considering sensitivity and specificity, the thresholds of the calculated AD values were set at 0.20 and 0.17, respectively. With these thresholds, the sensitivity and specificity of the quantitative model were about 69% and 63%. Moreover, statistical testing showed that the identification outcomes of the two calculation methods were identical (P>0.05). CONCLUSION: The validity of this method is satisfactory. It is convenient, feasible, and economic, and can be used to determine individual and population risks of gastric cancer.
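The screening rule described above, computing a weighted attribute degree and comparing it against a cutoff, can be sketched as follows. The factor names and weights are entirely hypothetical; the paper's fitted coefficients and exact fuzzy-membership expression are not given in the abstract:

```python
def attribute_degree(present_factors, weights):
    """Toy attribute-degree (AD) score: a weight-normalized sum over
    the statistically significant factors a subject presents.
    `present_factors` is a set of factor names; `weights` maps each
    factor to a hypothetical weight coefficient."""
    total = sum(weights.values())
    return sum(w for f, w in weights.items() if f in present_factors) / total

# Hypothetical weights, for illustration only.
WEIGHTS = {"factor_a": 2.0, "factor_b": 3.0, "factor_c": 1.0}

def high_risk(ad, cutoff=0.20):
    """Classify a subject using an AD cutoff; the abstract reports
    cutoffs of 0.20 and 0.17 for its two weighting methods."""
    return ad >= cutoff
```

In the paper, the cutoff is chosen by trading off sensitivity (about 69%) against specificity (about 63%) on the study population.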

  13. [Application of three risk assessment models in occupational health risk assessment of dimethylformamide].

    PubMed

    Wu, Z J; Xu, B; Jiang, H; Zheng, M; Zhang, M; Zhao, W J; Cheng, J

    2016-08-20

Objective: To investigate the application of the United States Environmental Protection Agency (EPA) inhalation risk assessment model, the Singapore semi-quantitative risk assessment model, and the occupational hazards risk assessment index method to occupational health risk in enterprises using dimethylformamide (DMF) in an area of Jiangsu, China, and to put forward related risk control measures. Methods: Industries involving DMF exposure in Jiangsu Province were chosen as the evaluation objects in 2013, and the three risk assessment models were used in the evaluation. EPA inhalation risk assessment model: HQ = EC/RfC; Singapore semi-quantitative risk assessment model: Risk = (HR × ER)^(1/2); occupational hazards risk assessment index = 2^(health effect level) × 2^(exposure ratio) × operation condition level. Results: The hazard quotients (HQ>1) from the EPA inhalation risk assessment model suggested that all the workshops (dry method, wet method, and printing) and work positions (pasting, burdening, unreeling, rolling, assisting) were high risk. The Singapore semi-quantitative risk assessment model indicated that the risk levels of the dry method, wet method, and printing workshops were 3.5 (high), 3.5 (high), and 2.8 (general), and that the position risk levels of pasting, burdening, unreeling, rolling, and assisting were 4 (high), 4 (high), 2.8 (general), 2.8 (general), and 2.8 (general). The occupational hazards risk assessment index method gave position risk indices for pasting, burdening, unreeling, rolling, and assisting of 42 (high), 33 (high), 23 (middle), 21 (middle), and 22 (middle). The results of the Singapore semi-quantitative risk assessment model and the occupational hazards risk assessment index method were similar, while the EPA inhalation risk assessment model indicated that all workshops and positions were high risk. Conclusion: The occupational hazards risk assessment index method fully considers health effects, exposure, and operating conditions
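The three models quoted above are simple enough to express directly. A sketch using the abstract's formulas; the rating scales and units (exposure concentration, reference concentration, hazard/exposure ratings, condition levels) are those of each method and are not reproduced here:

```python
import math

def epa_hazard_quotient(ec, rfc):
    """EPA inhalation model: HQ = EC / RfC, where EC is the exposure
    concentration and RfC the reference concentration. HQ > 1 flags
    high risk."""
    return ec / rfc

def singapore_risk(hazard_rating, exposure_rating):
    """Singapore semi-quantitative model: Risk = sqrt(HR * ER)."""
    return math.sqrt(hazard_rating * exposure_rating)

def hazard_index(health_level, exposure_ratio, operation_level):
    """Occupational hazards risk assessment index:
    2^(health effect level) * 2^(exposure ratio) * operation level."""
    return 2 ** health_level * 2 ** exposure_ratio * operation_level
```

For example, a hazard rating of 4 with an exposure rating of 3 yields a Singapore risk level of about 3.5, matching the "high" workshop levels reported above.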

  14. Quantitative LC-MS of polymers: determining accurate molecular weight distributions by combined size exclusion chromatography and electrospray mass spectrometry with maximum entropy data processing.

    PubMed

    Gruendling, Till; Guilhaus, Michael; Barner-Kowollik, Christopher

    2008-09-15

We report on the successful application of size exclusion chromatography (SEC) combined with electrospray ionization mass spectrometry (ESI-MS) and refractive index (RI) detection for the determination of accurate molecular weight distributions of synthetic polymers, corrected for chromatographic band broadening. The presented method makes use of the ability of ESI-MS to accurately depict the peak profiles and retention volumes of individual oligomers eluting from the SEC column, whereas quantitative information on the absolute concentration of oligomers is obtained from the RI detector only. A computational algorithm based on the maximum entropy principle is used to process the data gained by both detectors, yielding an accurate molecular weight distribution corrected for chromatographic band broadening. Poly(methyl methacrylate) standards with molecular weights up to 10 kDa serve as model compounds. Molecular weight distributions (MWDs) obtained by the maximum entropy procedure are compared to MWDs calculated by a conventional calibration of the SEC retention time axis with peak retention data obtained from the mass spectrometer. The comparison showed that, for the employed chromatographic system, distributions below 7 kDa were only weakly influenced by chromatographic band broadening, whereas the maximum entropy algorithm could successfully correct the MWD of a 10 kDa standard for band broadening effects. Molecular weight averages were between 5 and 14% lower than the manufacturer-stated data obtained by classical means of calibration. The presented method demonstrates a consistent approach for analyzing data obtained by coupling mass spectrometric detectors and concentration-sensitive detectors to polymer liquid chromatography.

  15. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  16. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.

  17. Tomosynthesis can facilitate accurate measurement of joint space width under the condition of the oblique incidence of X-rays in patients with rheumatoid arthritis.

    PubMed

    Ono, Yohei; Kashihara, Rina; Yasojima, Nobutoshi; Kasahara, Hideki; Shimizu, Yuka; Tamura, Kenichi; Tsutsumi, Kaori; Sutherland, Kenneth; Koike, Takao; Kamishima, Tamotsu

    2016-06-01

Accurate evaluation of joint space width (JSW) is important in the assessment of rheumatoid arthritis (RA). In clinical radiography of bilateral hands, the oblique incidence of X-rays is unavoidable, which may cause perceptual or measurement errors in JSW. The objective of this study was to examine whether tomosynthesis, a recently developed modality, can facilitate a more accurate evaluation of JSW than radiography under the condition of oblique incidence of X-rays. We investigated quantitative errors derived from the oblique incidence of X-rays by imaging phantoms simulating various finger joint spaces on radiographs and tomosynthesis images. We then compared the qualitative results of the modified total Sharp score for a total of 320 joints from 20 patients with RA between these modalities. The quantitative error was prominent when the location of the phantom was shifted along the JSW direction. Modified total Sharp scores of tomosynthesis images were significantly higher than those of radiography; that is, JSW was regarded as narrower in tomosynthesis than in radiography when finger joints were located where oblique incidence of X-rays is expected in the JSW direction. Tomosynthesis can therefore facilitate accurate evaluation of JSW in the finger joints of patients with RA, even with oblique incidence of X-rays. Accurate evaluation of JSW is necessary for the management of patients with RA, and through phantom and clinical studies we demonstrate that tomosynthesis may achieve it more accurately.

  18. Standardizing evaluation of pQCT image quality in the presence of subject movement: qualitative versus quantitative assessment.

    PubMed

    Blew, Robert M; Lee, Vinson R; Farr, Joshua N; Schiferl, Daniel J; Going, Scott B

    2014-02-01

Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale and (2) establish a quantitative motion assessment methodology. Scans were performed on 506 healthy girls (9-13 years) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n = 46) was examined to determine %Move's impact on bone parameters. Agreement between measurers was strong (intraclass correlation coefficient = 0.732 for tibia, 0.812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat and no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement showed significant differences in the >25% grouping. A pQCT visual inspection scale can be a reliable metric of image quality, but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat.
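The %Move metric and the 25% repeat rule described above can be sketched as follows. The abstract defines %Move only as a ratio of movement to limb size, so the exact form below (artifact extent over limb width, as a percentage) is an assumption:

```python
def percent_move(artifact_extent_mm, limb_width_mm):
    """%Move: movement artifact extent expressed as a percentage of
    limb size. The abstract gives only the ratio definition; taking
    artifact extent over limb width is assumed here."""
    return 100.0 * artifact_extent_mm / limb_width_mm

def needs_repeat(pmove, cutoff=25.0):
    """Decision rule suggested by the study's data: repeat a
    diaphyseal scan when %Move exceeds 25%."""
    return pmove > cutoff
```

Because the cutoff is a fixed percentage of limb size, the same rule can be applied across laboratories and scanner operators without visual rating.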

  19. Standardizing Evaluation of pQCT Image Quality in the Presence of Subject Movement: Qualitative vs. Quantitative Assessment

    PubMed Central

    Blew, Robert M.; Lee, Vinson R.; Farr, Joshua N.; Schiferl, Daniel J.; Going, Scott B.

    2013-01-01

    Purpose Peripheral quantitative computed tomography (pQCT) is an essential tool for assessing bone parameters of the limbs, but subject movement and its impact on image quality remains a challenge to manage. The current approach to determine image viability is by visual inspection, but pQCT lacks a quantitative evaluation. Therefore, the aims of this study were to (1) examine the reliability of a qualitative visual inspection scale, and (2) establish a quantitative motion assessment methodology. Methods Scans were performed on 506 healthy girls (9–13yr) at diaphyseal regions of the femur and tibia. Scans were rated for movement independently by three technicians using a linear, nominal scale. Quantitatively, a ratio of movement to limb size (%Move) provided a measure of movement artifact. A repeat-scan subsample (n=46) was examined to determine %Move’s impact on bone parameters. Results Agreement between measurers was strong (ICC = .732 for tibia, .812 for femur), but greater variability was observed in scans rated 3 or 4, the delineation between repeat or no repeat. The quantitative approach found ≥95% of subjects had %Move <25%. Comparison of initial and repeat scans by groups above and below 25% initial movement, showed significant differences in the >25% grouping. Conclusions A pQCT visual inspection scale can be a reliable metric of image quality but technicians may periodically mischaracterize subject motion. The presented quantitative methodology yields more consistent movement assessment and could unify procedure across laboratories. Data suggest a delineation of 25% movement for determining whether a diaphyseal scan is viable or requires repeat. PMID:24077875

  20. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
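The pairwise rank agreement used above (Kendall's tau) can be computed without any statistics package. A minimal sketch for tie-free rankings, such as a strict ordering of transillumination images by a reviewer versus the quantitative image-analysis ordering:

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall rank correlation: (concordant - discordant) pairs
    divided by n(n-1)/2. Assumes no ties in either ranking; tau is
    1.0 for identical orderings and -1.0 for reversed ones."""
    n = len(x)
    s = sum(1 if (x[i] - x[j]) * (y[i] - y[j]) > 0 else -1
            for i, j in combinations(range(n), 2))
    return s / (n * (n - 1) / 2)
```

The study's average tau of 0.83 between the quantitative ordering and individual reviewers means the large majority of image pairs were placed in the same relative order by both.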

  1. Image change detection using paradoxical theory for patient follow-up quantitation and therapy assessment.

    PubMed

    David, Simon; Visvikis, Dimitris; Quellec, Gwénolé; Le Rest, Catherine Cheze; Fernandez, Philippe; Allard, Michèle; Roux, Christian; Hatt, Mathieu

    2012-09-01

    In clinical oncology, positron emission tomography (PET) imaging can be used to assess therapeutic response by quantifying the evolution of semi-quantitative values such as standardized uptake value, early during treatment or after treatment. Current guidelines do not include metabolically active tumor volume (MATV) measurements and derived parameters such as total lesion glycolysis (TLG) to characterize the response to the treatment. To achieve automatic MATV variation estimation during treatment, we propose an approach based on the change detection principle using the recent paradoxical theory, which models imprecision, uncertainty, and conflict between sources. It was applied here simultaneously to pre- and post-treatment PET scans. The proposed method was applied to both simulated and clinical datasets, and its performance was compared to adaptive thresholding applied separately on pre- and post-treatment PET scans. On simulated datasets, the adaptive threshold was associated with significantly higher classification errors than the developed approach. On clinical datasets, the proposed method led to results more consistent with the known partial responder status of these patients. The method requires accurate rigid registration of both scans which can be obtained only in specific body regions and does not explicitly model uptake heterogeneity. In further investigations, the change detection of intra-MATV tracer uptake heterogeneity will be developed by incorporating textural features into the proposed approach.

  2. The Incremental Value of Subjective and Quantitative Assessment of 18F-FDG PET for the Prediction of Pathologic Complete Response to Preoperative Chemoradiotherapy in Esophageal Cancer.

    PubMed

    van Rossum, Peter S N; Fried, David V; Zhang, Lifei; Hofstetter, Wayne L; van Vulpen, Marco; Meijer, Gert J; Court, Laurence E; Lin, Steven H

    2016-05-01

    A reliable prediction of a pathologic complete response (pathCR) to chemoradiotherapy before surgery for esophageal cancer would enable investigators to study the feasibility and outcome of an organ-preserving strategy after chemoradiotherapy. So far no clinical parameters or diagnostic studies are able to accurately predict which patients will achieve a pathCR. The aim of this study was to determine whether subjective and quantitative assessment of baseline and postchemoradiation (18)F-FDG PET can improve the accuracy of predicting pathCR to preoperative chemoradiotherapy in esophageal cancer beyond clinical predictors. This retrospective study was approved by the institutional review board, and the need for written informed consent was waived. Clinical parameters along with subjective and quantitative parameters from baseline and postchemoradiation (18)F-FDG PET were derived from 217 esophageal adenocarcinoma patients who underwent chemoradiotherapy followed by surgery. The associations between these parameters and pathCR were studied in univariable and multivariable logistic regression analysis. Four prediction models were constructed and internally validated using bootstrapping to study the incremental predictive values of subjective assessment of (18)F-FDG PET, conventional quantitative metabolic features, and comprehensive (18)F-FDG PET texture/geometry features, respectively. The clinical benefit of (18)F-FDG PET was determined using decision-curve analysis. A pathCR was found in 59 (27%) patients. A clinical prediction model (corrected c-index, 0.67) was improved by adding (18)F-FDG PET-based subjective assessment of response (corrected c-index, 0.72). This latter model was slightly improved by the addition of 1 conventional quantitative metabolic feature only (i.e., postchemoradiation total lesion glycolysis; corrected c-index, 0.73), and even more by subsequently adding 4 comprehensive (18)F-FDG PET texture/geometry features (corrected c-index, 0
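The corrected c-index reported above is a concordance measure; for a binary endpoint such as pathCR it coincides with the area under the ROC curve. A minimal sketch of the plain (uncorrected) statistic, not the authors' bootstrap-corrected pipeline:

```python
from itertools import product

def c_index(scores, labels):
    """Concordance index for a binary outcome: the fraction of
    (positive, negative) pairs in which the positive case (here,
    pathCR) received the higher predicted score; ties count 0.5."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p, n in product(pos, neg))
    return wins / (len(pos) * len(neg))
```

A value of 0.5 corresponds to chance discrimination; the gain from 0.67 (clinical model) to 0.72-0.73 (adding PET parameters) reported above reflects more pathCR/non-pathCR pairs being ordered correctly.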

  3. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  4. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    NASA Astrophysics Data System (ADS)

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J.

    2008-08-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with that obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle.
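The nonlinear least-squares fit described above needs a forward model for shear wave speed as a function of frequency. The abstract does not name its model; the Voigt form below is a common choice in sonoelastography and is shown only as an illustrative sketch of the dispersion relation being fitted:

```python
import math

def voigt_shear_speed(freq_hz, mu, eta, rho=1000.0):
    """Shear wave phase speed (m/s) under the Voigt model (assumed
    here, not stated in the abstract):
        c(w) = sqrt( 2(mu^2 + w^2 eta^2) /
                     ( rho (mu + sqrt(mu^2 + w^2 eta^2)) ) )
    mu: shear modulus (Pa); eta: shear viscosity (Pa*s);
    rho: tissue density (kg/m^3)."""
    w = 2.0 * math.pi * freq_hz
    m = math.hypot(mu, w * eta)  # sqrt(mu^2 + w^2 * eta^2)
    return math.sqrt(2.0 * m * m / (rho * (mu + m)))
```

At zero frequency the expression reduces to sqrt(mu/rho), the purely elastic speed; with nonzero viscosity the speed rises with frequency, and fitting measured speeds across frequencies yields the frequency-independent mu and eta estimates the abstract describes.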

  5. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent present in the lungs. Several measures have been introduced for the quantification of the extent of disease directly from CT data in order to add to the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and the fractal dimension to visual grade in order to evaluate the predictability of radiologist visual scoring of emphysema from low-dose CT scans through quantitative scores, in order to determine which measures can be useful as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole lung scans. In addition, a visual grade of each section was also given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved from using quantitative score to predict visual grade, with 73% if mild and moderate cases are considered as a single class.
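Two of the measures compared above have simple standard definitions; the study's exact variants may differ (e.g., in the HU threshold used), so the values below are the conventional forms only:

```python
def emphysema_measures(hu_values, threshold=-950):
    """Standard CT densitometry scores over lung voxels:
    - emphysema index: percentage of voxels below a HU threshold
      (commonly -950 HU at full inspiration);
    - mean lung density (MLD): average HU over the lung field."""
    n = len(hu_values)
    index = 100.0 * sum(1 for v in hu_values if v < threshold) / n
    mld = sum(hu_values) / n
    return index, mld
```

Both scores are computed per lung region (whole lung, single lung, or thirds) in the study, which is why each of the 148 scans contributes nine regional values per measure.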

  6. A Preliminary Quantitative Comparison of Vibratory Amplitude Using Rigid and Flexible Stroboscopic Assessment.

    PubMed

    Hosbach-Cannon, Carly J; Lowell, Soren Y; Kelley, Richard T; Colton, Raymond H

    2016-07-01

    The purpose of this study was to establish preliminary, quantitative data on amplitude of vibration during stroboscopic assessment in healthy speakers with normal voice characteristics. Amplitude of vocal fold vibration is a core physiological parameter used in diagnosing voice disorders, yet quantitative data are lacking to guide the determination of what constitutes normal vibratory amplitude. Eleven participants were assessed during sustained vowel production using rigid and flexible endoscopy with stroboscopy. Still images were extracted from digital recordings of a sustained /i/ produced at a comfortable pitch and loudness, with F0 controlled so that levels were within ±15% of each participant's comfortable mean level as determined from connected speech. Glottal width (GW), true vocal fold (TVF) length, and TVF width were measured from still frames representing the maximum open phase of the vibratory cycle. To control for anatomic and magnification differences across participants, GW was normalized to TVF length. GW as a ratio of TVF width was also computed for comparison with prior studies. Mean values and standard deviations were computed for the normalized measures. Paired t tests showed no significant differences between rigid and flexible endoscopy methods. Interrater and intrarater reliability values for raw measurements were found to be high (0.89-0.99). These preliminary quantitative data may be helpful in determining normality or abnormality of vocal fold vibration. Results indicate that quantified amplitude of vibration is similar between endoscopic methods, a clinically relevant finding for individuals performing and interpreting stroboscopic assessments. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  7. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

Awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). SR has been a topic of growing interest in many enterprises since the 1950s, and its implementation and assessment are nowadays supported by international standards. There is a tendency to extend its scope to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. The model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data, and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to environmental research centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. © 2013 Elsevier B.V. All rights reserved.

  8. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in formalin-fixed, paraffin-embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. H2T measurements are accurate across a broad dynamic range (2-3 logs), as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity approximately 7-10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels, and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods, which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  9. Magnetic Resonance Imaging of Intracranial Hypotension: Diagnostic Value of Combined Qualitative Signs and Quantitative Metrics.

    PubMed

    Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi

The aim of this study was to investigate whether combining quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratio) with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargement, and tonsillar herniation) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curves were used to evaluate the diagnostic performance of the quantitative metrics and qualitative signs, and optimum cutoff values of the quantitative metrics for the diagnosis of IH were determined by ROC analysis. Combined ROC analysis was then performed for combinations of quantitative metrics and qualitative signs; diagnostic accuracy, sensitivity, specificity, and positive and negative predictive values were calculated, and the best-performing combination was identified. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH than in the control group (P < 0.001), the mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). Among qualitative signs, the highest individual discriminative power was dural enhancement, with an area under the ROC curve (AUC) of 0.838. Among quantitative metrics, the highest individual discriminative power was MPD, with an AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by combining dural enhancement, venous engorgement, and MPD, with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had a diagnostic accuracy of 100% for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be achieved by combining quantitative metrics with qualitative signs.
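The per-metric AUC values above come from standard ROC analysis. As a generic illustration (not the study's data), AUC can be computed directly as a Mann-Whitney probability: the chance that a randomly chosen patient scores higher than a randomly chosen control. The MPD values below are invented:

```python
def auc(pos_scores, neg_scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case scores higher than a randomly chosen
    negative one (ties count as half)."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical mamillopontine distances (mm) for IH patients and controls.
# Lower MPD suggests IH, so scores are negated ("higher = more IH-like").
ih_mpd      = [2.1, 2.8, 3.0, 3.9, 4.2]
control_mpd = [4.0, 5.1, 5.5, 6.2, 6.8]
mpd_auc = auc([-x for x in ih_mpd], [-x for x in control_mpd])  # 24/25 = 0.96
```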

  10. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  11. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.

  12. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
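The paper's least-square-estimation models are not reproduced in the abstract; as a generic sketch of the idea, an inertial-sensor amplitude can be calibrated against a reference tracker with an ordinary least-squares line fit, y ≈ a·x + b. The paired values below are invented for illustration:

```python
# Invented paired amplitude measurements (arbitrary units).
xs = [0.5, 1.1, 1.9, 2.6, 3.4]   # inertial-sensor tremor amplitude
ys = [0.6, 1.3, 2.0, 2.9, 3.6]   # electromagnetic-tracker reference amplitude

# Closed-form simple linear regression (normal equations for one predictor).
n = len(xs)
sx, sy = sum(xs), sum(ys)
sxx = sum(x * x for x in xs)
sxy = sum(x * y for x, y in zip(xs, ys))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
b = (sy - a * sx) / n                          # intercept
```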

  13. Quantitative and qualitative 5-aminolevulinic acid–induced protoporphyrin IX fluorescence in skull base meningiomas

    PubMed Central

    Bekelis, Kimon; Valdés, Pablo A.; Erkmen, Kadir; Leblond, Frederic; Kim, Anthony; Wilson, Brian C.; Harris, Brent T.; Paulsen, Keith D.; Roberts, David W.

    2011-01-01

    Object Complete resection of skull base meningiomas provides patients with the best chance for a cure; however, surgery is frequently difficult given the proximity of lesions to vital structures, such as cranial nerves, major vessels, and venous sinuses. Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative assessment of protoporphyrin IX (PpIX) fluorescence following the exogenous administration of 5-aminolevulinic acid (ALA) has demonstrated utility in malignant glioma resection but limited use in meningiomas. Here the authors demonstrate the use of ALA-induced PpIX fluorescence guidance in resecting a skull base meningioma and elaborate on the advantages and disadvantages provided by both quantitative and qualitative fluorescence methodologies in skull base meningioma resection. Methods A 52-year-old patient with a sphenoid wing WHO Grade I meningioma underwent tumor resection as part of an institutional review board–approved prospective study of fluorescence-guided resection. A surgical microscope modified for fluorescence imaging was used for the qualitative assessment of visible fluorescence, and an intraoperative probe for in situ fluorescence detection was utilized for quantitative measurements of PpIX. The authors assessed the detection capabilities of both the qualitative and quantitative fluorescence approaches. Results The patient harboring a sphenoid wing meningioma with intraorbital extension underwent radical resection of the tumor with both visibly and nonvisibly fluorescent regions. The patient underwent a complete resection without any complications. Some areas of the tumor demonstrated visible fluorescence. The quantitative probe detected neoplastic tissue better than the qualitative modified surgical microscope. The intraoperative probe was particularly useful in areas that did not reveal visible fluorescence, and tissue from these areas was confirmed as tumor following histopathological

  14. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  15. Quantitative Assessment of Liver Fat with Magnetic Resonance Imaging and Spectroscopy

    PubMed Central

    Reeder, Scott B.; Cruite, Irene; Hamilton, Gavin; Sirlin, Claude B.

    2011-01-01

Hepatic steatosis is characterized by abnormal and excessive accumulation of lipids within hepatocytes. It is an important feature of diffuse liver disease, and the histological hallmark of non-alcoholic fatty liver disease (NAFLD). Other conditions associated with steatosis include alcoholic liver disease, viral hepatitis, HIV and genetic lipodystrophies, cystic fibrosis liver disease, and hepatotoxicity from various therapeutic agents. Liver biopsy, the current clinical gold standard for assessment of liver fat, is invasive, suffers from sampling error, is not optimal for screening, monitoring, or clinical decision making, and is not well-suited for many types of research studies. Non-invasive methods that accurately and objectively quantify liver fat are needed. Ultrasound (US) and computed tomography (CT) can be used to assess liver fat but have limited accuracy as well as other limitations. Magnetic resonance (MR) techniques can decompose the liver signal into its fat and water signal components and therefore assess liver fat more directly than CT or US. Most MR techniques measure the signal fat-fraction (the fraction of the liver MR signal attributable to liver fat), which may be confounded by numerous technical and biological factors and may not reliably reflect fat content. By addressing the factors that confound the signal fat-fraction, advanced MR techniques measure the proton density fat-fraction (the fraction of the liver proton density attributable to liver fat), which is a fundamental tissue property and a direct measure of liver fat content. These advanced techniques show promise for accurate fat quantification and are likely to be commercially available soon. PMID:22025886
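The signal fat-fraction described above is simply the fat signal over the total (fat + water) signal; the proton density fat-fraction applies the same ratio only after the confounders (T1 bias, T2* decay, spectral complexity, etc.) have been corrected. A trivial sketch with invented signal magnitudes:

```python
# Invented decomposed signal magnitudes from a water-fat separation.
fat_signal, water_signal = 180.0, 820.0

# Signal fat-fraction: fraction of total MR signal attributable to fat.
# It equals the proton density fat-fraction only once confounding factors
# have been removed from both signal estimates.
signal_ff = fat_signal / (fat_signal + water_signal)  # 0.18
```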

  16. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  17. Quantitative assessment of paretic limb dexterity and interlimb coordination during bilateral arm rehabilitation training.

    PubMed

    Xu, Chang; Li, Siyi; Wang, Kui; Hou, Zengguang; Yu, Ningbo

    2017-07-01

In neuro-rehabilitation after stroke, conventional constraint-induced movement therapy (CIMT) is well accepted. Existing bilateral training paradigms mostly involve mirrored, symmetrical motion. However, complementary bilateral movements are dominantly involved in activities of daily living (ADLs), so functional bilateral therapies may bring better skill transfer from training to daily life; neurophysiological evidence for this is also growing. In this work, we first introduce our bilateral arm training system, realized with a haptic interface and a motion sensor, together with the tasks designed to train both the manipulation function of the paretic arm and the coordination of the bilateral upper limbs. We then propose quantitative measures for functional assessment of complementary bilateral training performance, including kinematic behavior indices, smoothness, submovement, and bimanual coordination. Finally, we describe experiments with healthy subjects and the results with respect to these quantitative measures. The feasibility and sensitivity of the proposed indices were evaluated by comparing unilateral and bilateral training outcomes. The proposed bilateral training system and tasks, as well as the quantitative measures, were demonstrated to be effective for training and assessment of unilateral and bilateral arm function.

  18. A quantitative assessment of alkaptonuria: testing the reliability of two disease severity scoring systems.

    PubMed

    Cox, Trevor F; Ranganath, Lakshminarayan

    2011-12-01

Alkaptonuria (AKU) is caused by excessive accumulation of homogentisic acid (HGA) in body fluids due to deficiency of the enzyme homogentisate dioxygenase; conversion of HGA to a polymeric melanin-like pigment, a process known as ochronosis, in turn leads to varied clinical manifestations. A potential treatment, the drug nitisinone, which decreases the formation of HGA, is available. However, successful demonstration of its efficacy in modifying the natural history of AKU requires an effective quantitative assessment tool. We describe two potential tools for quantifying disease burden in AKU. The first scores clinical features, combining clinical assessments, investigations, and questionnaires, and was applied in 15 patients with AKU. The second is a scoring system based only on questionnaire items, used in 44 people with AKU. Statistical analyses were carried out on the two patient datasets to assess the tools; these included calculation of Cronbach's alpha, multidimensional scaling, and simple linear regression analysis. We conclude that there is good evidence that the tools could be adopted for AKU assessment, perhaps with further refinement before use in the practical setting of a clinical trial.
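Cronbach's alpha, used above to assess the scoring systems, measures the internal consistency of a multi-item score: the item count scaled by one minus the ratio of summed item variances to the variance of the total score. A minimal sketch with invented item scores, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency.
    items: one list per questionnaire item, each with one score per patient."""
    k, n = len(items), len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    # Per-patient total score across all items.
    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(it) for it in items) / var(totals))

# Invented severity scores: 3 questionnaire items, 4 patients.
alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 3], [2, 2, 3, 4]])
```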

  19. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  20. QUANTITATIVE ASSESSMENT OF INTEGRATED PHRENIC NERVE ACTIVITY

    PubMed Central

    Nichols, Nicole L.; Mitchell, Gordon S.

    2016-01-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1G93A Taconic rat groups (an ALS model). Meta-analysis results indicate: 1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; 2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ~1.0; and 3) consistently reduced activity in end-stage SOD1G93A rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. PMID:26724605

  1. Quantitative assessment of integrated phrenic nerve activity.

    PubMed

    Nichols, Nicole L; Mitchell, Gordon S

    2016-06-01

    Integrated electrical activity in the phrenic nerve is commonly used to assess within-animal changes in phrenic motor output. Because of concerns regarding the consistency of nerve recordings, activity is most often expressed as a percent change from baseline values. However, absolute values of nerve activity are necessary to assess the impact of neural injury or disease on phrenic motor output. To date, no systematic evaluations of the repeatability/reliability have been made among animals when phrenic recordings are performed by an experienced investigator using standardized methods. We performed a meta-analysis of studies reporting integrated phrenic nerve activity in many rat groups by the same experienced investigator; comparisons were made during baseline and maximal chemoreceptor stimulation in 14 wild-type Harlan and 14 Taconic Sprague Dawley groups, and in 3 pre-symptomatic and 11 end-stage SOD1(G93A) Taconic rat groups (an ALS model). Meta-analysis results indicate: (1) consistent measurements of integrated phrenic activity in each sub-strain of wild-type rats; (2) with bilateral nerve recordings, left-to-right integrated phrenic activity ratios are ∼1.0; and (3) consistently reduced activity in end-stage SOD1(G93A) rats. Thus, with appropriate precautions, integrated phrenic nerve activity enables robust, quantitative comparisons among nerves or experimental groups, including differences caused by neuromuscular disease. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which 269 subjects were studied, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  3. Bayes' theorem and quantitative risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, the QRA results must be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. The paper therefore argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
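The Bayesian updating the paper advocates can be shown in miniature: posterior weight on each hypothesis is proportional to prior times likelihood. The hypotheses, priors, and evidence below are invented, with a Poisson likelihood chosen as a common model for event counts in risk analysis:

```python
import math

# Invented example: three candidate facility failure rates (events/year)
# with the analyst's prior weights, updated on the evidence
# "2 events observed in 10 years" via a Poisson likelihood.
rates  = [0.05, 0.2, 0.5]
priors = [0.5, 0.3, 0.2]
k, t = 2, 10.0

def poisson_like(lam, k, t):
    """P(k events in time t | rate lam) under a Poisson process."""
    mu = lam * t
    return mu ** k * math.exp(-mu) / math.factorial(k)

# Bayes' theorem: posterior ∝ likelihood × prior, then normalize.
unnorm = [p * poisson_like(r, k, t) for r, p in zip(rates, priors)]
posterior = [u / sum(unnorm) for u in unnorm]
```

Here the evidence shifts most of the weight onto the middle hypothesis (0.2 events/year), whose expected count over 10 years best matches the 2 observed events.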

  4. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures, and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performance of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect the quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and a rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true
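The core of the experimental-null idea can be sketched briefly: quantify the same sample in two technical replicates, form per-protein log2 ratios (which should all represent "no change"), and read a ratio cutoff off the tails of that empirical null distribution. The values below are simulated, not from the paper:

```python
import random

# Simulated replicate-vs-replicate log2 ratios standing in for the
# experimentally measured null distribution (all "no change" by design).
random.seed(1)
null_log2 = [random.gauss(0.0, 0.25) for _ in range(5000)]

def percentile(xs, q):
    """Linear-interpolated percentile, q in [0, 1]."""
    s = sorted(xs)
    i = q * (len(s) - 1)
    lo, hi = int(i), min(int(i) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (i - lo)

# Empirical 95% band of the null: a case-vs-control protein ratio outside
# [cut_lo, cut_hi] exceeds the observed technical variability and is a
# candidate true change.
cut_lo = percentile(null_log2, 0.025)
cut_hi = percentile(null_log2, 0.975)
```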

  5. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.

  6. Development and Validation of Broad-Range Qualitative and Clade-Specific Quantitative Molecular Probes for Assessing Mercury Methylation in the Environment.

    PubMed

    Christensen, Geoff A; Wymore, Ann M; King, Andrew J; Podar, Mircea; Hurt, Richard A; Santillan, Eugenio U; Soren, Ally; Brandt, Craig C; Brown, Steven D; Palumbo, Anthony V; Wall, Judy D; Gilmour, Cynthia C; Elias, Dwayne A

    2016-10-01

    Two genes, hgcA and hgcB, are essential for microbial mercury (Hg) methylation. Detection and estimation of their abundance, in conjunction with Hg concentration, bioavailability, and biogeochemistry, are critical in determining potential hot spots of methylmercury (MeHg) generation in at-risk environments. We developed broad-range degenerate PCR primers spanning known hgcAB genes to determine the presence of both genes in diverse environments. These primers were tested against an extensive set of pure cultures with published genomes, including 13 Deltaproteobacteria, nine Firmicutes, and nine methanogenic Archaea genomes. A distinct PCR product at the expected size was confirmed for all hgcAB(+) strains tested via Sanger sequencing. Additionally, we developed clade-specific degenerate quantitative PCR (qPCR) primers that targeted hgcA for each of the three dominant Hg-methylating clades. The clade-specific qPCR primers amplified hgcA from 64%, 88%, and 86% of tested pure cultures of Deltaproteobacteria, Firmicutes, and Archaea, respectively, and were highly specific for each clade. Amplification efficiencies and detection limits were quantified for each organism. Primer sensitivity varied among species based on sequence conservation. Finally, to begin to evaluate the utility of our primer sets in nature, we tested hgcA and hgcAB recovery from pure cultures spiked into sand and soil. The novel quantitative molecular tools designed in this study will allow for more accurate identification and quantification of the individual Hg-methylating groups of microorganisms in the environment. The resulting data will be essential in developing accurate and robust predictive models of Hg methylation potential, ideally integrating the geochemistry of Hg methylation with the microbiology and genetics of hgcAB. IMPORTANCE: The neurotoxin methylmercury (MeHg) poses a serious risk to human health. MeHg production in nature is associated with anaerobic microorganisms. The recent

  7. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been done with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. 
Each of the sequential steps in QRA is discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data

  8. Quantitative assessment of fat infiltration in the rotator cuff muscles using water-fat MRI.

    PubMed

    Nardo, Lorenzo; Karampinos, Dimitrios C; Lansdown, Drew A; Carballido-Gamio, Julio; Lee, Sonia; Maroldi, Roberto; Ma, C Benjamin; Link, Thomas M; Krug, Roland

    2014-05-01

    To evaluate a chemical shift-based fat quantification technique in the rotator cuff muscles in comparison with the semiquantitative Goutallier fat infiltration classification (GC) and to assess their relationship with clinical parameters. The shoulders of 57 patients were imaged using a 3T MR scanner. The rotator cuff muscles were assessed for fat infiltration using GC by two radiologists and an orthopedic surgeon. Sequences included oblique-sagittal T1-, T2-, and proton density-weighted fast spin echo, and six-echo gradient echo. The iterative decomposition of water and fat with echo asymmetry and least-squares estimation (IDEAL) was used to measure fat fraction. Pain and range of motion of the shoulder were recorded. Fat fraction values were significantly correlated with GC grades (P < 0.0001, κ >0.9) showing consistent increase with GC grades (grade = 0, 0%-5.59%; grade = 1, 1.1%-9.70%; grade = 2, 6.44%-14.86%; grade = 3, 15.25%-17.77%; grade = 4, 19.85%-29.63%). A significant correlation between fat infiltration of the subscapularis muscle quantified with IDEAL versus 1) deficit in internal rotation (Spearman Rank Correlation Coefficient [SRC] = 0.39, 95% confidence interval [CI] 0.13-0.60, P < 0.01) and 2) pain (SRC coefficient = 0.313, 95% CI 0.049-0.536, P = 0.02) was found but was not seen between the clinical parameters and GC grades. Additionally, only quantitative fat infiltration measures of the supraspinatus muscle were significantly correlated with a deficit in abduction (SRC coefficient = 0.45, 95% CI 0.20-0.60, P < 0.01). An accurate and highly reproducible fat quantification in the rotator cuff muscles using water-fat magnetic resonance imaging (MRI) techniques is possible and significantly correlates with shoulder pain and range of motion. Copyright © 2013 Wiley Periodicals, Inc.

  9. Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories

    NASA Astrophysics Data System (ADS)

    Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly

    The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In this setting, successful project health checks and quality monitoring of software processes require strong project management skills, well-built onshore-offshore coordination, and often regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories, to facilitate CMMI implementation and reduce its cost for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss future directions of this work in progress.

  10. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations which occurred through the initiation stage of carcinogenesis. For the establishment of points of departure (PoD) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed at low doses. Moreover, treatment with DEN at low doses had no effect on the development of GST-P positive foci in the liver. These data on PoDs for the markers contribute to understanding whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to use in low dose-response assessment must be decided on the basis of scientific judgment. © The Author 2015.

  11. Fluorescence correlation spectroscopy analysis for accurate determination of proportion of doubly labeled DNA in fluorescent DNA pool for quantitative biochemical assays.

    PubMed

    Hou, Sen; Sun, Lili; Wieczorek, Stefan A; Kalwarczyk, Tomasz; Kaminski, Tomasz S; Holyst, Robert

    2014-01-15

    Fluorescent double-stranded DNA (dsDNA) molecules labeled at both ends are commonly produced by annealing complementary single-stranded DNA (ssDNA) molecules, each labeled with a fluorescent dye at the same (3' or 5') end. Because the labeling efficiency of ssDNA is smaller than 100%, the resulting dsDNA carry two dyes, one dye, or none. Existing methods are insufficient to measure the percentage of the doubly labeled dsDNA component in a fluorescent DNA sample; it is difficult even to distinguish the doubly labeled component from the singly labeled one. Accurate measurement of the percentage of such doubly labeled dsDNA is a critical prerequisite for quantitative biochemical measurements, and has puzzled scientists for decades. We established a fluorescence correlation spectroscopy (FCS) system to measure the percentage of doubly labeled dsDNA (PDL) in the total fluorescent dsDNA pool. The method is based on comparative analysis of the given sample and a reference dsDNA sample prepared by adding a known amount of unlabeled ssDNA to the original ssDNA solution. From FCS autocorrelation functions, we obtain the number of fluorescent dsDNA molecules in the focal volume of the confocal microscope and PDL. We also calculate the labeling efficiency of the ssDNA. The method requires a minimal amount of material: samples with DNA concentrations in the nanomolar range and volumes of tens of microliters. We verified the method by using the restriction enzyme HindIII to cleave the fluorescent dsDNA; the kinetics of the reaction depend strongly on PDL, a critical parameter for quantitative biochemical measurements. Copyright © 2013 Elsevier B.V. All rights reserved.
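Under random annealing of strands that each carry a dye with probability p (the ssDNA labeling efficiency), the expected PDL follows from simple combinatorics. This closed form is implied by, not quoted from, the abstract:

```python
def pdl(p):
    """Expected fraction of doubly labeled dsDNA within the *fluorescent* pool,
    for random annealing of strands each labeled with probability p."""
    doubly = p * p                    # both strands carry a dye
    singly = 2.0 * p * (1.0 - p)      # exactly one strand carries a dye
    return doubly / (doubly + singly) # simplifies to p / (2 - p)

# e.g. a 90% ssDNA labeling efficiency gives a PDL of only ~82%,
# which is why measuring PDL directly matters for quantitative assays
```

The nonlinearity (PDL = p/(2-p) rather than p) is the reason a seemingly high labeling efficiency still leaves a substantial singly labeled background.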

  12. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2018-01-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks because of its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help to actively deal with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude. The probability of occurrence can be calculated with the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method was found feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase of 88.3%, the maximum yield decrease and its probability were 3.5 and 64.6%, respectively, and the risk was 2.2%. For the maximum precipitation decrease of 35.2%, the maximum yield decrease and its probability were 14.1 and 56.1%, respectively, and the risk was 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow markedly.
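The risk definition above (degree of loss times probability of occurrence) can be checked directly against the abstract's own figures:

```python
def climate_risk(degree_of_loss, probability):
    """Risk = degree of loss x probability of occurrence (IPCC-style definition)."""
    return degree_of_loss * probability

# Reproducing the combined temperature-and-precipitation case reported above:
# 17.6% maximum yield decrease occurring with probability 53.4%
risk = climate_risk(0.176, 0.534)   # ~0.094, i.e. the 9.4% risk reported
```

The same two-factor product also reproduces the single-factor cases (0.035 x 0.646 for temperature, 0.141 x 0.561 for precipitation).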

  13. Promoting the safety performance of industrial radiography using a quantitative assessment system.

    PubMed

    Kardan, M R; Mianji, F A; Rastkhah, N; Babakhani, A; Azad, S Borhan

    2006-12-01

    The increasing number of industrial radiographers and their considerable occupational exposure have been among the main concerns of the Iran Nuclear Regulatory Authority (INRA) in recent years. In 2002, a quantitative system for evaluating the safety performance of licensees, together with a complementary enforcement system, was introduced by the National Radiation Protection Department (NRPD). Each parameter of the practice is given a weighting factor according to its importance to safety. Licensees are assessed quantitatively by summing their scores using prepared tables. Implementing this system of evaluation showed a considerable decrease in deficiencies across the various centres. The tables are updated regularly based on findings during inspections. This system is used alongside enforcement to promote safety performance and to strengthen the safety culture in industrial radiography.
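The weighted scoring described can be sketched generically; the parameter names and weights below are hypothetical illustrations, not INRA's actual tables:

```python
def safety_score(scores, weights):
    """Weighted safety score: each practice parameter's compliance score is
    scaled by a weighting factor reflecting its importance to safety, then summed."""
    assert len(scores) == len(weights)
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical parameters: source security, survey-meter use, personnel dosimetry
# (compliance scored 0/1, weights reflect safety importance)
score = safety_score([1, 0, 1], [5, 3, 2])
```

Summing weighted per-parameter scores gives a single comparable number per licensee, which is what makes trend tracking and enforcement thresholds possible.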

  14. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malik, Afshan N., E-mail: afshan.malik@kcl.ac.uk; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana

    2011-08-19

    Highlights: → Mitochondrial dysfunction is central to many diseases of oxidative stress. → 95% of the mitochondrial genome is duplicated in the nuclear genome. → Dilution of untreated genomic DNA leads to dilution bias. → Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved, and we have identified that current methods have at least one of the following three problems: (1) as much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) use of regions from genes such as β-actin and 18S rRNA, which are repetitive and/or highly variable, for qPCR of the nuclear genome leads to errors; and (3) the size difference of mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method using unique regions in the human mitochondrial genome not duplicated in the nuclear genome, a unique single-copy region in the nuclear genome, and template treatment to remove dilution bias, to accurately quantify MtDNA from human samples.
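The Mt/N ratio itself is conventionally computed from the difference in qPCR quantification cycles between the two assays. A minimal sketch, assuming both assays target unique single-copy regions and share the same amplification efficiency (the function name and Ct values are illustrative, not from the paper):

```python
def mt_to_n_ratio(ct_mito, ct_nuclear, efficiency=1.0):
    """Relative mitochondrial DNA content (Mt/N) from real-time qPCR Ct values.

    Assumes equal amplification efficiency E for both assays, where the fold
    change per cycle is (1 + E); E = 1.0 corresponds to perfect doubling.
    """
    return (1.0 + efficiency) ** (ct_nuclear - ct_mito)

# Illustrative: the mitochondrial target crosses threshold 8 cycles before the
# nuclear target, i.e. roughly 2**8 = 256 mtDNA copies per nuclear genome copy
ratio = mt_to_n_ratio(ct_mito=18.0, ct_nuclear=26.0)
```

The paper's three pitfalls all corrupt the Ct values feeding this formula: pseudogene co-amplification lowers ct_mito artificially, variable nuclear targets distort ct_nuclear, and dilution bias shifts the two Cts unequally.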

  15. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study was to quantitatively assess myocardial perfusion by a first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, their limitations are recognized, and new techniques are still needed. Experiments were performed on five closed-chest swine. Balloon catheters were placed in the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT has a good correlation (y = 0.07245 + 0.09963x, r2 = 0.898) with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provided excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, with an Area Under the Curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.

  16. Daily FOUR score assessment provides accurate prognosis of long-term outcome in out-of-hospital cardiac arrest.

    PubMed

    Weiss, N; Venot, M; Verdonk, F; Chardon, A; Le Guennec, L; Llerena, M C; Raimbourg, Q; Taldir, G; Luque, Y; Fagon, J-Y; Guerot, E; Diehl, J-L

    2015-05-01

    The accurate prediction of outcome after out-of-hospital cardiac arrest (OHCA) is of major importance. The recently described Full Outline of UnResponsiveness (FOUR) score is well adapted to mechanically ventilated patients and does not depend on verbal response. Our aim was to evaluate the ability of the FOUR score, assessed by intensivists, to accurately predict outcome in OHCA. We prospectively identified patients admitted for OHCA with a Glasgow Coma Scale below 8. Neurological assessment was performed daily. Outcome was evaluated at 6 months using the Glasgow-Pittsburgh Cerebral Performance Categories (GP-CPC). Eighty-five patients were included. At 6 months, 19 patients (22%) had a favorable outcome (GP-CPC 1-2) and 66 (78%) an unfavorable outcome (GP-CPC 3-5). Compared to both brainstem responses at day 3 and the evolution of the Glasgow Coma Scale, the evolution of the FOUR score over the first three days predicted unfavorable outcome more precisely. Thus, absence of improvement or worsening of the FOUR score from day 1 to day 3 had 0.88 (0.79-0.97) specificity, 0.71 (0.66-0.76) sensitivity, 0.94 (0.84-1.00) PPV and 0.54 (0.49-0.59) NPV for predicting unfavorable outcome. Similarly, a FOUR brainstem response of 0 at day 3 had 0.94 (0.89-0.99) specificity, 0.60 (0.50-0.70) sensitivity, 0.96 (0.92-1.00) PPV and 0.47 (0.37-0.57) NPV for predicting unfavorable outcome. The absence of improvement or worsening of the FOUR score from day 1 to day 3, evaluated by intensivists, provides an accurate prognosis of poor neurological outcome in OHCA. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  17. Quantitative assessment of the multivalent protein-carbohydrate interactions on silicon.

    PubMed

    Yang, Jie; Chazalviel, Jean-Noël; Siriwardena, Aloysius; Boukherroub, Rabah; Ozanam, François; Szunerits, Sabine; Gouget-Laemmel, Anne Chantal

    2014-10-21

    A key challenge in the development of glycan arrays is fabricating the sensing interface reliably, so that the protein-carbohydrate interaction of interest can be analyzed sensitively, accurately, and reproducibly. This goal is complicated for glycan arrays because surface sugar density can dramatically influence the strength and mode of interaction of the sugar ligand with lectin partners at any interface. In this Article, we describe the preparation of carboxydecyl-terminated crystalline silicon (111) surfaces onto which are grafted either mannosyl moieties or a mixture of mannose and spacer alcohol molecules to provide "diluted" surfaces. The fabrication of the silicon surfaces was achieved efficiently through a strategy involving a "click" coupling step. The interactions of these newly fabricated glycan interfaces with the lectin Lens culinaris were characterized using quantitative infrared (IR) spectroscopy in the attenuated total reflection (ATR) geometry. The density of mannose probes and lectin targets was precisely determined for the first time with the aid of dedicated IR calibration experiments, allowing interpretation of the distribution of mannose and its multivalent binding with lectins. These experimental findings were accounted for by numerical simulations of lectin adsorption.

  18. A review of quantitative structure-property relationships for the fate of ionizable organic chemicals in water matrices and identification of knowledge gaps.

    PubMed

    Nolte, Tom M; Ragas, Ad M J

    2017-03-22

    Many organic chemicals are ionizable by nature. After use and release into the environment, various fate processes determine their concentrations, and hence exposure to aquatic organisms. In the absence of suitable data, such fate processes can be estimated using Quantitative Structure-Property Relationships (QSPRs). In this review we compiled available QSPRs from the open literature and assessed their applicability to ionizable organic chemicals. Using quantitative and qualitative criteria we selected the 'best' QSPRs for sorption, (a)biotic degradation, and bioconcentration. The results indicate that many suitable QSPRs exist, but some critical knowledge gaps remain. Specifically, future work should focus on developing QSPR models for biodegradation in wastewater and sediment systems, for direct photolysis, and for reaction with singlet oxygen and other reactive intermediates. Adequate QSPRs for bioconcentration in fish exist, but more accurate assessments can be achieved using physiologically based toxicokinetic (PBTK) models. No adequate QSPRs exist for bioconcentration in non-fish species. Because of the high variability of chemical and biological species as well as environmental conditions in QSPR datasets, accurate predictions for specific systems and inter-dataset conversions are problematic, and standardization is needed. For all QSPR endpoints, additional data are needed to extend the chemical space currently covered and to accurately characterize the test systems used.

  19. Quantitative Microbial Risk Assessment of Pharmaceutical Products.

    PubMed

    Eissa, Mostafa Essam

    2017-01-01

    Monitoring of microbiological quality in the pharmaceutical industry is an important criterion that is required to justify safe product release to the drug market. Good manufacturing practice and efficient control of the bioburden level of product components are critical parameters that influence the microbiological cleanliness of medicinal products. However, because microbial dispersion through the samples follows a Poisson distribution, the probability of detecting microbiologically defective samples decreases as the mean number of defective units per batch (λ) decreases. When a dose-response model of infection (P_inf) for a specific objectionable microbe is integrated with a contamination module, the overall probability of infection from a single batch of pharmaceutical product can be estimated. Combining P_inf with the detectability of the test (P_det) yields a value that can be used as a quantitative measure of the possibility of passing contaminated batch units of product with a certain load of a specific pathogen, and infecting the final consumer, without detection in the firm. The simulation study can be used to assess the risk of contamination and infection from objectionable microorganisms for sterile and non-sterile products. LAY ABSTRACT: Microbial contamination of pharmaceutical products is a global problem that may lead to infection and possibly death. While reputable pharmaceutical companies strive to deliver microbiologically safe products, it would be helpful to apply an assessment system for the current risk associated with pharmaceutical batches delivered to the drug market. The current methodology may also be helpful in determining the degree of improvement or deterioration along the batch processing flow until reaching the final consumer. 
Moreover, the present system is flexible and can be applied to other industries such as food, cosmetics, or medical devices manufacturing and processing fields to assess the microbiological risk of
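The interplay of λ, P_det, and P_inf described above can be sketched as follows; the functional forms and sampling plan are illustrative assumptions, not the authors' exact model:

```python
import math

def p_detect(mean_defective_per_sample, n_samples):
    """Chance that at least one of n tested samples is found defective,
    with defects per sample Poisson-distributed with the given mean."""
    p_positive = 1.0 - math.exp(-mean_defective_per_sample)  # P(count >= 1)
    return 1.0 - (1.0 - p_positive) ** n_samples

def undetected_infection_risk(p_inf, mean_defective_per_sample, n_samples):
    """Probability the batch escapes detection AND infects a consumer."""
    return (1.0 - p_detect(mean_defective_per_sample, n_samples)) * p_inf
```

The Poisson structure makes the core problem visible: as the mean contamination per sample falls, p_detect collapses toward zero faster than p_inf does, so rare contamination can slip through release testing.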

  20. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    PubMed

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

    The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved by using different soap formulations after diaper changing, using a quantitative microbial risk assessment approach. To achieve this, a probability-of-infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (4% chlorhexidine), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those changing diapers of babies with asymptomatic shigellosis who used the non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorhexidine-containing soap a risk between 43/100,000 and 51/100, and those who used the 1.5% triclosan soap a risk between 21/100,000 and 43/100. Adequate washing of hands after diapering reduces risk, and the risk can be lowered by a further 20% by the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.
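The probability-of-infection model is not specified in the abstract; a common single-hit exponential form, combined with a log-reduction term for washing, can sketch the idea. All parameter values below are hypothetical illustrations, not the study's fitted values:

```python
import math

def p_infection(dose, r):
    """Single-hit exponential dose-response model: P = 1 - exp(-r * dose)."""
    return 1.0 - math.exp(-r * dose)

def dose_after_wash(initial_dose, log10_reduction):
    """Pathogens remaining on hands after a wash achieving a given log reduction."""
    return initial_dose * 10.0 ** (-log10_reduction)

# Hypothetical numbers: 1e4 organisms transferred during diapering, r = 1e-3,
# plain soap achieving 2 logs of removal versus antibacterial soap at 2.5 logs
p_plain = p_infection(dose_after_wash(1e4, 2.0), 1e-3)
p_anti = p_infection(dose_after_wash(1e4, 2.5), 1e-3)
```

Comparing soap formulations then reduces to comparing the log reductions they achieve, exactly the kind of exposure-assessment input the study varies.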

  1. Computer-Assisted Quantitative Assessment of Prostatic Calcifications in Patients with Chronic Prostatitis.

    PubMed

    Boltri, Matteo; Magri, Vittorio; Montanari, Emanuele; Perletti, Gianpaolo; Trinchieri, Alberto

    2018-04-26

    The aim of this study was to develop a quantitative assessment of prostatic calcifications at prostatic ultrasound examination using an image analyzer. A group of 82 patients was evaluated by medical history, physical examination, and transrectal ultrasound examination. Patients had a urethral swab, a 4-specimen study, and culture of the seminal fluid. Patients were classified according to the National Institute of Diabetes and Digestive and Kidney Diseases/National Institutes of Health (NIDDK/NIH) criteria. Subjective symptoms were scored by the Chronic Prostatitis Symptom Index (CPSI) questionnaire. Ultrasound images were analyzed with the digital processing software ImageJ to quantitatively assess the presence of calcifications. Computer-assessed calcified areas were significantly higher in chronic bacterial prostatitis (n = 18; group II; 6.76 ± 8.09%) than in the chronic pelvic pain syndrome groups IIIa (n = 26; 2.07 ± 1.01%) and IIIb (n = 38; 2.31 ± 2.18%). The area of calcification of the prostate was significantly related to the CPSI score for the micturition domain (r = 0.278, p = 0.023), Prostatic Specific Antigen values (r = 0.341, p = 0.005), postvoiding residual urine (r = 0.262, p = 0.032), total prostate volume (r = 0.592, p = 0.000), and adenoma volume (r = 0.593, p = 0.000). The presence of calcifications is more frequently observed in patients with chronic bacterial prostatitis and is related to urinary symptoms. © 2018 S. Karger AG, Basel.

  2. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients

    PubMed Central

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-01-01

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have the following two drawbacks: (1) they are susceptible to subjective factors; (2) they only have several rating levels and are influenced by a ceiling effect, making it impossible to exactly detect any further improvement in the movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, there are many data that need to be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess the upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. It also indicated that the proposed system can not only reduce the amount of data during the sampling and transmission processes, but also, the reconstructed accelerometer signals can be used for quantitative assessment without any loss of useful information. PMID:26861337
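The abstract reports that compressed accelerometer signals can be reconstructed without loss of useful information. The recovery step can be illustrated with a generic greedy sparse solver; the paper's actual reconstruction algorithm and parameters are not given here, so the orthogonal matching pursuit and the synthetic sparse signal below are illustrative stand-ins:

```python
import numpy as np

def omp(Phi, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = Phi @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        # pick the dictionary column most correlated with the current residual
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
n, m, k = 256, 96, 5                 # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = Phi @ x_true                     # m << n compressed measurements
x_hat = omp(Phi, y, k)
```

The m/n ratio here (96/256) mirrors the paper's claim that the compressed signal is shorter than 1/3 of the raw signal; in the wearable setting the sensor transmits y instead of the full signal, and reconstruction happens on the computer.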

  3. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-02-05

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have the following two drawbacks: (1) they are susceptible to subjective factors; (2) they only have several rating levels and are influenced by a ceiling effect, making it impossible to exactly detect any further improvement in the movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, there are many data that need to be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess the upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. It also indicated that the proposed system can not only reduce the amount of data during the sampling and transmission processes, but also, the reconstructed accelerometer signals can be used for quantitative assessment without any loss of useful information.

  4. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical when large numbers of peptides are targeted and when high flexibility in peptide selection is desired. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  5. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis methods, and the metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarker Alliance (QIBA), together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted, or combined. PMID:24919831
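One standard metrology quantity behind the repeatability area named above is the repeatability coefficient, RC = 2.77 times the within-subject standard deviation. A minimal sketch, assuming a simple paired test-retest design (this is the general metrological formula, not a calculation quoted from the paper):

```python
import math

def repeatability_coefficient(test, retest):
    """RC = 2.77 * within-subject SD, estimated from paired test-retest scans.

    95% of repeat differences on the same subject are expected within +/- RC;
    wSD**2 is estimated as sum(d**2) / (2 * n) over paired differences d.
    """
    diffs = [a - b for a, b in zip(test, retest)]
    wsd = math.sqrt(sum(d * d for d in diffs) / (2.0 * len(diffs)))
    return 2.77 * wsd
```

Reporting RC in the measurement's own units is what makes repeatability results from different QIB studies directly comparable, the core aim of the framework described above.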

  6. Characteristics of liver fibrosis with different etiologies using a fully quantitative fibrosis assessment tool.

    PubMed

    Wu, Q; Zhao, X; You, H

    2017-05-18

    This study aimed to test the diagnostic performance of a fully quantitative fibrosis assessment tool for liver fibrosis in patients with chronic hepatitis B (CHB), primary biliary cirrhosis (PBC) and non-alcoholic steatohepatitis (NASH). A total of 117 patients with liver fibrosis were included in this study, including 50 patients with CHB, 49 patients with PBC and 18 patients with NASH. All patients underwent liver biopsy (LB). Fibrosis stages were assessed by two experienced pathologists. Histopathological images of LB slices were processed by second harmonic generation (SHG)/two-photon excited fluorescence (TPEF) microscopy without staining, using a system called qFibrosis (quantitative fibrosis). Altogether, 101 quantitative features of the SHG/TPEF images were acquired. The parameters of aggregated collagen in portal, septal and fibrillar areas increased significantly with stages of liver fibrosis in PBC and CHB (P<0.05), but the same was not found for parameters of distributed collagen (P>0.05). There was a significant correlation between parameters of aggregated collagen in portal, septal and fibrillar areas and stages of liver fibrosis from CHB and PBC (P<0.05), but no correlation was found between the distributed collagen parameters and the stages of liver fibrosis from those patients (P>0.05). There was no significant correlation between NASH parameters and stages of fibrosis (P>0.05). For CHB and PBC patients, the highest correlation was between septal parameters and fibrosis stages, the second highest was between portal parameters and fibrosis stages, and the lowest was between fibrillar parameters and fibrosis stages. For PBC, the correlation between septal parameters and stages was significantly higher than that of the parameters of the other two areas (P<0.05). The qFibrosis candidate parameters based on CHB were also applicable for quantitative analysis of liver fibrosis in PBC patients. Different parameters should be selected for liver

  7. Characteristics of liver fibrosis with different etiologies using a fully quantitative fibrosis assessment tool

    PubMed Central

    Wu, Q.; Zhao, X.; You, H.

    2017-01-01

    This study aimed to test the diagnostic performance of a fully quantitative fibrosis assessment tool for liver fibrosis in patients with chronic hepatitis B (CHB), primary biliary cirrhosis (PBC) and non-alcoholic steatohepatitis (NASH). A total of 117 patients with liver fibrosis were included in this study, including 50 patients with CHB, 49 patients with PBC and 18 patients with NASH. All patients underwent liver biopsy (LB). Fibrosis stages were assessed by two experienced pathologists. Histopathological images of LB slices were processed by second harmonic generation (SHG)/two-photon excited fluorescence (TPEF) microscopy without staining, using a system called qFibrosis (quantitative fibrosis). Altogether, 101 quantitative features of the SHG/TPEF images were acquired. The parameters of aggregated collagen in portal, septal and fibrillar areas increased significantly with stages of liver fibrosis in PBC and CHB (P<0.05), but the same was not found for parameters of distributed collagen (P>0.05). There was a significant correlation between parameters of aggregated collagen in portal, septal and fibrillar areas and stages of liver fibrosis from CHB and PBC (P<0.05), but no correlation was found between the distributed collagen parameters and the stages of liver fibrosis from those patients (P>0.05). There was no significant correlation between NASH parameters and stages of fibrosis (P>0.05). For CHB and PBC patients, the highest correlation was between septal parameters and fibrosis stages, the second highest was between portal parameters and fibrosis stages, and the lowest was between fibrillar parameters and fibrosis stages. For PBC, the correlation between septal parameters and stages was significantly higher than that of the parameters of the other two areas (P<0.05). The qFibrosis candidate parameters based on CHB were also applicable for quantitative analysis of liver fibrosis in PBC patients. Different parameters should be selected for liver

  8. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating developing COAs using a small sample. We review the benefits such methods are purported to offer, from both practical and statistical standpoints, and detail several problematic areas, both practical and theoretical, in applying quantitative methods, including Rasch-consistent methods, to small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  9. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses.

    PubMed

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. © 2016 K. Hoffman, S. Leupen, et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  10. Reference charts for young stands — a quantitative methodology for assessing tree performance

    Treesearch

    Lance A. Vickers; David R. Larsen; Benjamin O. Knapp; John M. Kabrick; Daniel C. Dey

    2017-01-01

    Reference charts have long been used in the medical field for quantitative clinical assessment of juvenile development by plotting distribution quantiles for a selected attribute (e.g., height) against age for specified peer populations. We propose that early stand dynamics is an area of study that could benefit from the descriptions and analyses offered by similar...
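The chart construction described above reduces to computing quantiles of the chosen attribute within each age class. A hedged sketch with hypothetical seedling heights:

```python
import statistics

def chart_row(values):
    """10th, 50th and 90th percentiles for one age class: the three curves a
    juvenile-stand reference chart would plot against age."""
    q = statistics.quantiles(values, n=10, method="inclusive")
    return q[0], q[4], q[8]

# Hypothetical seedling heights (m) measured in 3- and 5-year-old stands.
chart = {3: chart_row([0.4, 0.6, 0.7, 0.9, 1.1, 1.3, 1.4, 1.6, 1.8, 2.0]),
         5: chart_row([1.0, 1.3, 1.6, 1.9, 2.1, 2.4, 2.6, 2.9, 3.2, 3.5])}
```

Plotting each percentile across age classes yields the reference curves against which an individual stand's trajectory can be compared.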

  11. Current Strategies for Quantitating Fibrosis in Liver Biopsy

    PubMed Central

    Wang, Yan; Hou, Jin-Lin

    2015-01-01

    Objective: The present mini-review updates the progress in liver biopsy-based methodologies. Data Sources: Articles on liver fibrosis, liver biopsy or fibrosis assessment published in high-impact peer-reviewed journals from 1980 to 2014. Study Selection: Key articles were selected mainly according to their levels of relevance to this topic and citations. Results: With the recent, mounting progress in chronic liver disease therapeutics comes a pressing need for precise, accurate, and dynamic assessment of hepatic fibrosis and cirrhosis in individual patients. Histopathological information is recognized as the most valuable data for fibrosis assessment. Conventional categorical histology systems describe changes in fibrosis patterns in liver tissue, but the simplified ordinal scores they assign cannot reflect fibrosis dynamics with sufficient precision and reproducibility. Morphometric assessment by computer-assisted digital image analysis, such as collagen proportionate area (CPA), measures the amount of fibrosis in a tissue section as a continuous variable and has shown independent diagnostic value for assessing advanced or late-stage fibrosis. Because of its sensitivity to sampling variance, morphometric measurement is most reliable as a statistical parameter in large-cohort studies. Combining state-of-the-art imaging technology with fundamental principles of tissue engineering, structure-based quantitation was recently introduced with a novel proof-of-concept tool, qFibrosis. qFibrosis not only outperformed CPA in accurately and reproducibly differentiating adjacent stages of fibrosis, but also showed promise for facilitating analysis of fibrotic regression and cirrhosis sub-staging. Conclusions: With input from multidisciplinary innovation, liver biopsy assessment as a new “gold standard” is anticipated to substantially support the accelerated progress of
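The CPA statistic mentioned above is simply the fraction of the tissue-section area occupied by stained collagen. A minimal sketch with toy binary masks (the segmentation step is assumed to have already produced them):

```python
def collagen_proportionate_area(collagen_mask, tissue_mask):
    """CPA: collagen pixels divided by tissue pixels, both inputs given as
    binary per-pixel masks (nested lists of 0/1)."""
    collagen = sum(sum(row) for row in collagen_mask)
    tissue = sum(sum(row) for row in tissue_mask)
    return collagen / tissue

# Toy 4x4 section: all pixels are tissue, three are segmented as collagen.
tissue = [[1] * 4 for _ in range(4)]
collagen = [[1, 0, 0, 0],
            [0, 1, 0, 0],
            [0, 0, 1, 0],
            [0, 0, 0, 0]]
cpa = collagen_proportionate_area(collagen, tissue)
```

Because CPA is continuous, it can track small changes in fibrosis amount that an ordinal stage cannot.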

  12. A Framework for Quantitative Assessment of Impacts Related to Energy and Mineral Resource Development

    DOE PAGES

    Haines, Seth S.; Diffendorfer, Jay E.; Balistrieri, Laurie; ...

    2013-05-15

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example, one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. In conclusion, the framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.

  13. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example, one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
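The propagation step described in these records can be sketched with a small Monte Carlo loop; every distribution and coefficient below is illustrative, not a value from the assessment:

```python
import math
import random

def habitat_impact_percentiles(n_trials=20000, seed=42):
    """Monte Carlo sketch of the framework: sample each uncertain input,
    propagate it through an assumed resource-to-impact relationship, and
    report the 5th/50th/95th percentiles of the impact distribution."""
    rng = random.Random(seed)
    impacts = []
    for _ in range(n_trials):
        gas_bcf = rng.lognormvariate(math.log(500.0), 0.4)  # recoverable gas volume
        wells = gas_bcf / rng.uniform(2.0, 4.0)             # bcf produced per well
        impacts.append(wells * rng.uniform(1.0, 3.0))       # ha habitat per well pad
    impacts.sort()
    return (impacts[n_trials // 20], impacts[n_trials // 2],
            impacts[n_trials - n_trials // 20 - 1])

p5, p50, p95 = habitat_impact_percentiles()
```

The percentile spread is the point of the method: it carries the input uncertainty through to the impact estimate rather than reporting a single number.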

  14. Quantitative analysis of a scar's pliability, perfusion and metrology

    NASA Astrophysics Data System (ADS)

    Gonzalez, Mariacarla; Sevilla, Nicole; Chue-Sang, Joseph; Ramella-Roman, Jessica C.

    2017-02-01

    The primary effect of scarring is the loss of function in the affected area. Scarring also leads to physical and psychological problems that can be devastating to the patient's life. Currently, scar assessment is highly subjective and physician dependent. The examination relies on the expertise of the physician to determine the characteristics of the scar by touch and visual examination using the Vancouver scar scale (VSS), which categorizes scars by pigmentation, pliability, height and vascularity. In order to establish diagnostic guidelines for scar formation, a quantitative, accurate assessment method needs to be developed. An instrument capable of measuring all categories was developed; three of the aforementioned parameters are explored here. Pliability is assessed with a durometer, which measures a surface's resistance to permanent indentation, chosen for its simplicity and quantitative output. Height and vascularity are assessed, respectively, with a profilometry system that maps the scar in three dimensions and with laser speckle imaging (LSI), which captures dynamic changes in perfusion. Gelatin phantoms were utilized to measure pliability. Finally, dynamic changes in skin perfusion of volunteers' forearms undergoing pressure cuff occlusion were measured, along with incisional scars.

  15. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling:• SDMProjectBuilder (which includes the Microbial Source Module as part...

  16. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  17. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remains a major challenge because of a lack of data on past events and causal factors, and because of the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and floods, in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.

  18. PET optimization for improved assessment and accurate quantification of ⁹⁰Y-microsphere biodistribution after radioembolization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martí-Climent, Josep M., E-mail: jmmartic@unav.es; Prieto, Elena; Elosúa, César

    2014-09-15

    Purpose: ⁹⁰Y-microspheres are widely used for the radioembolization of metastatic liver cancer or hepatocellular carcinoma and there is a growing interest in imaging ⁹⁰Y-microspheres with PET. The aim of this study is to evaluate the performance of a current generation PET/CT scanner for ⁹⁰Y imaging and to optimize the PET protocol to improve the assessment and the quantification of ⁹⁰Y-microsphere biodistribution after radioembolization. Methods: Data were acquired on a Biograph mCT-TrueV scanner with time of flight (TOF) and point spread function (PSF) modeling. Spatial resolution was measured with a ⁹⁰Y point source. Sensitivity was evaluated using the NEMA 70 cm line source filled with ⁹⁰Y. To evaluate the count rate performance, ⁹⁰Y vials with activity ranging from 3.64 to 0.035 GBq were measured in the center of the field of view (CFOV). The energy spectrum was evaluated. Image quality with different reconstructions was studied using the Jaszczak phantom containing six hollow spheres (diameters: 31.3, 28.1, 21.8, 16.1, 13.3, and 10.5 mm), filled with a 207 kBq/ml ⁹⁰Y concentration and a 5:1 sphere-to-background ratio. Acquisition time was adjusted to simulate the quality of a realistic clinical PET acquisition of a patient treated with SIR-Spheres®. The developed methodology was applied to ten patients after SIR-Spheres® treatment, acquiring a 10 min per bed PET. Results: The energy spectrum showed the ⁹⁰Y bremsstrahlung radiation. The ⁹⁰Y transverse resolution, with filtered backprojection reconstruction, was 4.5 mm in the CFOV and degraded to 5.0 mm at 10 cm off-axis. ⁹⁰Y absolute sensitivity was 0.40 kcps/MBq in the center of the field of view. The tendency of true and random rates as a function of the ⁹⁰Y activity could be accurately described using linear and quadratic models, respectively. Phantom studies demonstrated that, due to low count statistics in ⁹⁰Y
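The linear and quadratic count-rate models reported above can be fit by simple least squares; the through-origin forms and the synthetic data below are assumptions of this sketch:

```python
def fit_count_rates(activity_gbq, trues_kcps, randoms_kcps):
    """Least-squares fits of the model forms the study reports: trues vary
    linearly with activity (T = a*A) and randoms quadratically (R = b*A**2).
    Forcing both fits through the origin is an assumption of this sketch."""
    a = (sum(t * x for t, x in zip(trues_kcps, activity_gbq))
         / sum(x * x for x in activity_gbq))
    b = (sum(r * x * x for r, x in zip(randoms_kcps, activity_gbq))
         / sum(x ** 4 for x in activity_gbq))
    return a, b

# Synthetic rates lying exactly on the fitted forms, for illustration only.
acts = [0.5, 1.0, 2.0, 3.0]
a, b = fit_count_rates(acts, [0.4 * x for x in acts], [0.1 * x * x for x in acts])
```

With real decay-series measurements, the fitted coefficients let the protocol predict trues and randoms at any administered activity.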

  19. Accurate atomistic first-principles calculations of electronic stopping

    DOE PAGES

    Schleife, André; Kanai, Yosuke; Correa, Alfredo A.

    2015-01-20

    In this paper, we show that atomistic first-principles calculations based on real-time propagation within time-dependent density functional theory are capable of accurately describing electronic stopping of light projectile atoms in metal hosts over a wide range of projectile velocities. In particular, we employ a plane-wave pseudopotential scheme to solve time-dependent Kohn-Sham equations for representative systems of H and He projectiles in crystalline aluminum. This approach to simulate nonadiabatic electron-ion interaction provides an accurate framework that allows for quantitative comparison with experiment without introducing ad hoc parameters such as effective charges, or assumptions about the dielectric function. Finally, our work clearly shows that this atomistic first-principles description of electronic stopping is able to disentangle contributions due to tightly bound semicore electrons and geometric aspects of the stopping geometry (channeling versus off-channeling) in a wide range of projectile velocities.

  20. Generation of accurate peptide retention data for targeted and data independent quantitative LC-MS analysis: Chromatographic lessons in proteomics.

    PubMed

    Krokhin, Oleg V; Spicer, Vic

    2016-12-01

    The emergence of data-independent quantitative LC-MS/MS analysis protocols further highlights the importance of high-quality reproducible chromatographic procedures. Knowing, controlling and being able to predict the effect of multiple factors that alter peptide RP-HPLC separation selectivity is critical for successful data collection for the construction of ion libraries. Proteomic researchers have often regarded RP-HPLC as a "black box", while a vast amount of research on peptide separation is readily available. In addition to obvious parameters, such as the type of ion-pairing modifier, stationary phase and column temperature, we describe the "mysterious" effects of gradient slope, column size and flow rate on peptide separation selectivity. Retention time variations due to these parameters are governed by the linear solvent strength (LSS) theory on a peptide level by the value of its slope S in the basic LSS equation, a parameter that can be accurately predicted. Thus, the application of shallower gradients, higher flow rates, or smaller columns will each increase the relative retention of peptides with higher S-values (long species with multiple positively charged groups). Simultaneous changes to these parameters that each drive shifts in separation selectivity in the same direction should be avoided. The unification of terminology represents another pressing issue in this field of applied proteomics that should be addressed to facilitate further progress. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
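The selectivity effect described above follows directly from the basic LSS relation log10 k = log10 kw - S·φ: for the same change in organic fraction φ, a peptide with a larger slope S loses retention faster. A small sketch with hypothetical peptide parameters:

```python
def retention_factor(log_kw, S, phi):
    """Basic LSS relation: log10(k) = log10(kw) - S * phi, with phi the
    organic-modifier fraction and S the peptide-specific slope."""
    return 10.0 ** (log_kw - S * phi)

# Two hypothetical peptides with equal kw but different S values; compare how
# much retention each loses when phi rises from 0.25 to 0.30.
drop_high_S = retention_factor(10.0, 40.0, 0.25) / retention_factor(10.0, 40.0, 0.30)
drop_low_S = retention_factor(10.0, 25.0, 0.25) / retention_factor(10.0, 25.0, 0.30)
```

Because the high-S peptide's retention falls faster with φ, any change that alters the effective gradient steepness (slope, flow rate, column volume) shifts its position relative to low-S peptides.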

  1. Quantitative Assessment of Combination Antimicrobial Therapy against Multidrug-Resistant Acinetobacter baumannii

    PubMed Central

    Lim, Tze-Peng; Ledesma, Kimberly R.; Chang, Kai-Tai; Hou, Jing-Guo; Kwa, Andrea L.; Nikolaou, Michael; Quinn, John P.; Prince, Randall A.; Tam, Vincent H.

    2008-01-01

    Treatment of multidrug-resistant bacterial infections poses a therapeutic challenge to clinicians; combination therapy is often the only viable option for multidrug-resistant infections. A quantitative method was developed to assess the combined killing abilities of antimicrobial agents. Time-kill studies (TKS) were performed using a multidrug-resistant clinical isolate of Acinetobacter baumannii with escalating concentrations of cefepime (0 to 512 mg/liter), amikacin (0 to 256 mg/liter), and levofloxacin (0 to 64 mg/liter). The bacterial burden data in single and combined (two of the three agents with clinically achievable concentrations in serum) TKS at 24 h were mathematically modeled to provide an objective basis for comparing various antimicrobial agent combinations. Synergy and antagonism were defined as interaction indices of <1 and >1, respectively. A hollow-fiber infection model (HFIM) simulating various clinical (fluctuating concentrations over time) dosing exposures was used to selectively validate our quantitative assessment of the combined killing effect. Model fits in all single-agent TKS were satisfactory (r2 > 0.97). An enhanced combined overall killing effect was seen in the cefepime-amikacin combination (interaction index, 0.698; 95% confidence interval [CI], 0.675 to 0.722) and the cefepime-levofloxacin combination (interaction index, 0.929; 95% CI, 0.903 to 0.956), but no significant difference in the combined overall killing effect for the levofloxacin-amikacin combination was observed (interaction index, 0.994; 95% CI, 0.982 to 1.005). These assessments were consistent with observations in HFIM validation studies. Our method could be used to objectively rank the combined killing activities of two antimicrobial agents when used together against a multidrug-resistant A. baumannii isolate. It may offer better insights into the effectiveness of various antimicrobial combinations and warrants further investigations. PMID:18505848

  2. Comparative study of quantitative phase imaging techniques for refractometry of optical fibers

    NASA Astrophysics Data System (ADS)

    de Dorlodot, Bertrand; Bélanger, Erik; Bérubé, Jean-Philippe; Vallée, Réal; Marquet, Pierre

    2018-02-01

    The refractive index difference profile of optical fibers is the key design parameter because it determines, among other properties, the insertion losses and propagating modes. Therefore, an accurate refractive index profiling method is of paramount importance to their development and optimization. Quantitative phase imaging (QPI) is one of the available tools to retrieve structural characteristics of optical fibers, including the refractive index difference profile. QPI has the advantage of being non-destructive, and several different QPI methods have been developed over the last decades. Here, we present a comparative study of three different available QPI techniques, namely the transport-of-intensity equation, quadriwave lateral shearing interferometry and digital holographic microscopy. To assess the accuracy and precision of these QPI techniques, quantitative phase images of the core of a well-characterized optical fiber were acquired with each of them, and a robust image-processing procedure was applied to extract the refractive index difference profiles. Although the raw images from all three QPI methods suffered from different shortcomings, our automated image-processing pipeline successfully corrected them. After this treatment, all three QPI techniques yielded accurate, reliable and mutually consistent refractive index difference profiles in agreement with the accuracy and precision of the refracted near-field benchmark measurement.

  3. MO-DE-303-03: Session on quantitative imaging for assessment of tumor response to radiation therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowen, S.

    This session will focus on quantitative imaging for assessment of tumor response to radiation therapy. This is a technically challenging method to translate to practice in radiation therapy. In the new era of precision medicine, however, delivering the right treatment, to the right patient, and at the right time, can positively impact treatment choices and patient outcomes. Quantitative imaging provides the spatial sensitivity required by radiation therapy for precision medicine that is not available by other means. In this Joint ESTRO-AAPM Symposium, three leading-edge investigators will present specific motivations for quantitative imaging biomarkers in radiation therapy of esophageal, head and neck, locally advanced non-small cell lung cancer, and hepatocellular carcinoma. Experiences with the use of dynamic contrast enhanced (DCE) MRI, diffusion-weighted (DW) MRI, PET/CT, and SPECT/CT will be presented. Issues covered will include: response prediction, dose-painting, timing between therapy and imaging, within-therapy biomarkers, confounding effects, normal tissue sparing, dose-response modeling, and association with clinical biomarkers and outcomes. Current information will be presented from investigational studies and clinical practice. Learning Objectives: Learn motivations for the use of quantitative imaging biomarkers for assessment of response to radiation therapy Review the potential areas of application in cancer therapy Examine the challenges for translation, including imaging confounds and paucity of evidence to date Compare exemplary examples of the current state of the art in DCE-MRI, DW-MRI, PET/CT and SPECT/CT imaging for assessment of response to radiation therapy Van der Heide: Research grants from the Dutch Cancer Society and the European Union (FP7) Bowen: RSNA Scholar grant.

  4. Quantitative photoacoustic assessment of ex-vivo lymph nodes of colorectal cancer patients

    NASA Astrophysics Data System (ADS)

    Sampathkumar, Ashwin; Mamou, Jonathan; Saegusa-Beercroft, Emi; Chitnis, Parag V.; Machi, Junji; Feleppa, Ernest J.

    2015-03-01

    Staging of cancers and selection of appropriate treatment requires histological examination of multiple dissected lymph nodes (LNs) per patient, so that a staggering number of nodes require histopathological examination, and the finite resources of pathology facilities create a severe processing bottleneck. Histologically examining the entire 3D volume of every dissected node is not feasible, and therefore, only the central region of each node is examined histologically, which results in severe sampling limitations. In this work, we assess the feasibility of using quantitative photoacoustics (QPA) to overcome the limitations imposed by current procedures and eliminate the resulting undersampling in node assessments. QPA is emerging as a new hybrid modality that assesses tissue properties and classifies tissue type based on multiple estimates derived from spectrum analysis of photoacoustic (PA) radiofrequency (RF) data and from statistical analysis of envelope-signal data derived from the RF signals. Our study seeks to use QPA to distinguish cancerous from non-cancerous regions of dissected LNs and hence serve as a reliable means of imaging and detecting small but clinically significant cancerous foci that would be missed by current methods. Dissected lymph nodes were placed in a water bath and PA signals were generated using a wavelength-tunable (680-950 nm) laser. A 26-MHz, f-2 transducer was used to sense the PA signals. We present an overview of our experimental setup; provide a statistical analysis of multi-wavelength classification parameters (mid-band fit, slope, intercept) obtained from the PA signal spectrum generated in the LNs; and compare QPA performance with our established quantitative ultrasound (QUS) techniques in distinguishing metastatic from non-cancerous tissue in dissected LNs. QPA-QUS methods offer a novel general means of tissue typing and evaluation in a broad range of disease-assessment applications, e.g., cardiac, intravascular
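The spectrum-analysis parameters named above (mid-band fit, slope, intercept) come from a linear regression of the calibrated power spectrum over the analysis bandwidth. A minimal sketch with a synthetic spectrum:

```python
def spectral_parameters(freqs_mhz, spectrum_db):
    """Linear regression of a calibrated power spectrum over the analysis
    bandwidth: returns spectral slope (dB/MHz), intercept (dB at 0 MHz)
    and mid-band fit (the regression value at the band center)."""
    n = len(freqs_mhz)
    fm = sum(freqs_mhz) / n
    sm = sum(spectrum_db) / n
    slope = (sum((f - fm) * (s - sm) for f, s in zip(freqs_mhz, spectrum_db))
             / sum((f - fm) ** 2 for f in freqs_mhz))
    intercept = sm - slope * fm
    midband = intercept + slope * (freqs_mhz[0] + freqs_mhz[-1]) / 2.0
    return slope, intercept, midband

# Synthetic spectrum lying exactly on a line: s = -0.5 * f - 20 dB.
freqs = [15.0, 17.5, 20.0, 22.5, 25.0]
slope, intercept, midband = spectral_parameters(
    freqs, [-0.5 * f - 20.0 for f in freqs])
```

Each parameter triple, computed per wavelength and per region, becomes a feature vector for classifying tissue as metastatic or non-cancerous.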

  5. Quantitative MRI assessments of white matter in children treated for acute lymphoblastic leukemia

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Helton, Kathleen J.; Li, Chin-Shang; Pui, Ching-Hon

    2005-04-01

    The purpose of this study was to use objective quantitative MR imaging methods to prospectively assess changes in the physiological structure of white matter during the temporal evolution of leukoencephalopathy (LE) in children treated for acute lymphoblastic leukemia. The longitudinal incidence, extent (proportion of white matter affected), and intensity (elevation of T1 and T2 relaxation rates) of LE were evaluated for 44 children. A combined imaging set consisting of T1, T2, PD, and FLAIR MR images and white matter, gray matter and CSF a priori maps from a spatially normalized atlas were analyzed with a neural network segmentation based on a Kohonen Self-Organizing Map (SOM). Quantitative T1 and T2 relaxation maps were generated using a nonlinear parametric optimization procedure to fit the corresponding multi-exponential models. A Cox proportional-hazards regression was performed to estimate the effect of intravenous methotrexate (IV-MTX) exposure on the development of LE, followed by a generalized linear model to predict the probability of LE in new patients. Additional independent-samples t-tests were performed to assess differences in quantitative measures of extent and intensity at four different points in therapy. Higher doses and more courses of IV-MTX placed patients at a higher risk of developing LE and were associated with more intense changes affecting more of the white matter volume; many of the changes resolved after completion of therapy. The impact of these changes on neurocognitive functioning and quality of life in survivors remains to be determined.
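
    The quantitative relaxation maps referred to above come from fitting relaxation models voxel by voxel. As a simplified illustration (a mono-exponential model rather than the multi-exponential models used in the study; names are assumptions), T2 can be estimated from multi-echo signal intensities by a log-linear least-squares fit:

```python
import math

# Fit S(TE) = S0 * exp(-TE / T2) by linear regression on ln(S) vs. TE.
# te_ms: echo times in ms; signal: measured intensities (all > 0).
def fit_t2(te_ms, signal):
    y = [math.log(s) for s in signal]
    n = len(te_ms)
    mx = sum(te_ms) / n
    my = sum(y) / n
    slope = sum((t - mx) * (v - my) for t, v in zip(te_ms, y)) / \
            sum((t - mx) ** 2 for t in te_ms)
    t2 = -1.0 / slope                  # ms
    s0 = math.exp(my - slope * mx)     # extrapolated signal at TE = 0
    return t2, s0
```

    A true multi-exponential fit requires nonlinear optimization, but the log-linear version above conveys the idea of converting signal decay into a quantitative relaxation time.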

  6. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
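
    The combination of a Markov chain with a time-dependent event tree can be sketched as follows: scenario states evolve under a transition matrix, and the occurrence probability of each scenario at a given time is read off the state vector. The states and transition probabilities below are purely illustrative, not taken from the article:

```python
# States: 0 = fire suppressed, 1 = fire growing, 2 = untenable conditions.
# Each row gives transition probabilities for one time interval (rows sum to 1).
P = [
    [1.0, 0.0, 0.0],   # suppressed is absorbing
    [0.3, 0.5, 0.2],   # growing: suppressed / keeps growing / becomes untenable
    [0.0, 0.0, 1.0],   # untenable is absorbing
]

def step(p, P):
    """One Markov step: p(t+1)[j] = sum_i p(t)[i] * P[i][j]."""
    return [sum(p[i] * P[i][j] for i in range(len(p))) for j in range(len(P[0]))]

p = [0.0, 1.0, 0.0]    # fire has just ignited and is growing
for _ in range(10):    # evolve over ten time intervals
    p = step(p, P)
```

    The resulting vector `p` gives the time-dependent occurrence probability of each scenario branch, which is then paired with the consequence distributions described above.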

  7. Accurate phase measurements for thick spherical objects using optical quadrature microscopy

    NASA Astrophysics Data System (ADS)

    Warger, William C., II; DiMarzio, Charles A.

    2009-02-01

    In vitro fertilization (IVF) procedures have resulted in the birth of over three million babies since 1978. Yet the live birth rate in the United States was only 34% in 2005, with 32% of the successful pregnancies resulting in multiple births. These multiple pregnancies were directly attributed to the transfer of multiple embryos to increase the probability that a single, healthy embryo was included. Current viability markers used for IVF, such as the cell number, symmetry, size, and fragmentation, are analyzed qualitatively with differential interference contrast (DIC) microscopy. However, this method is not ideal for quantitative measures beyond the 8-cell stage of development because the cells overlap and obstruct the view within and below the cluster of cells. We have developed the phase-subtraction cell-counting method that uses the combination of DIC and optical quadrature microscopy (OQM) to count the number of cells accurately in live mouse embryos beyond the 8-cell stage. We have also created a preliminary analysis to measure the cell symmetry, size, and fragmentation quantitatively by analyzing the relative dry mass from the OQM image in conjunction with the phase-subtraction count. In this paper, we will discuss the characterization of OQM with respect to measuring the phase accurately for spherical samples that are much larger than the depth of field. Once fully characterized and verified with human embryos, this methodology could provide the means for a more accurate method to score embryo viability.

  8. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use sugar nectar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.

  9. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    PubMed Central

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-01-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen. PMID:27090437
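
    The link between the measured shear-wave velocity and the reported shear modulus is the standard elastic relation G = ρv² for a shear wave in a homogeneous medium; a one-line sketch (the density value is illustrative for whole blood, not taken from the paper):

```python
# Convert a measured shear-wave propagation velocity (m/s) into a shear
# modulus (Pa) via G = rho * v^2; rho defaults to a typical whole-blood density.
def shear_modulus(velocity_m_s, density_kg_m3=1060.0):
    return density_kg_m3 * velocity_m_s ** 2
```

    As the clot stiffens, the shear-wave velocity rises, so tracking G over time yields the reaction time and clot-formation kinetics mentioned above.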

  10. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-04-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen.

  11. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    PubMed

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity surface area (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method comprising semiquantitative and quantitative assessment of valvular regurgitation was used as a reference method, including ERO area by 2D PISA for assigning severity of regurgitation grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves compared to 78% by the PISA method. When both methods were feasible and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA P<.001). In valves with multiple jets, however, 3D VCA had a better correlation to assigned severity (ANOVA P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in determination of regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.

  12. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the study of semi-quantitative assessment of pulmonary perfusion acquired from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The automatic analysis approach proposed is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function for the assessment of the local hemodynamic system parameters, i.e., mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution method implements here a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, to be considered at least as a preliminary diagnostic step given the wide availability of the technique and its non-invasive nature.
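
    The truncated-SVD deconvolution step can be sketched as follows (an illustrative implementation, not the authors' code; the singular-value threshold is a typical choice): the arterial input function (AIF) defines a lower-triangular convolution matrix, which is inverted with small singular values suppressed to stabilize the estimate of the tissue residue function.

```python
import numpy as np

# Deconvolve tissue = (A @ residue), where A is the lower-triangular
# Toeplitz matrix built from the sampled AIF: A[i, j] = aif[i - j] * dt.
# Singular values below sv_thresh * s_max are discarded (the "truncation").
def tsvd_deconvolve(aif, tissue, dt, sv_thresh=0.1):
    n = len(aif)
    A = np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                  for i in range(n)]) * dt
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > sv_thresh * s[0], 1.0 / s, 0.0)
    return Vt.T @ (s_inv * (U.T @ tissue))
```

    From the recovered residue function one can then read off the hemodynamic parameters: its peak scales with blood flow, its integral with blood volume, and their ratio gives the mean transit time.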

  13. Multicenter trial of the proficiency of smart quantitative sensation tests.

    PubMed

    Dyck, Peter J; Argyros, Barbara; Russell, James W; Gahnstrom, Linde E; Nalepa, Susan; Albers, James W; Lodermeier, Karen A; Zafft, Andrew J; Dyck, P James B; Klein, Christopher J; Litchy, William J; Davies, Jenny L; Carter, Rickey E; Melton, L Joseph

    2014-05-01

    We assessed proficiency (accuracy and intra- and intertest reproducibility) of smart quantitative sensation tests (smart QSTs) in subjects without and with diabetic sensorimotor polyneuropathy (DSPN). Technologists from 3 medical centers using different but identical QSTs independently assessed 6 modalities of sensation of the foot (or leg) twice in patients without (n = 6) and with (n = 6) DSPN using smart computer assisted QSTs. Low rates of test abnormalities were observed in health and high rates in DSPN. Very high intraclass correlations were obtained between continuous measures of QSTs and neuropathy signs, symptoms, or nerve conductions (NCs). No significant intra- or intertest differences were observed. These results provide proof of concept that smart QSTs provide accurate assessment of sensation loss without intra- or intertest differences useful for multicenter trials. Smart technology makes possible efficient testing of body surface area sensation loss in symmetric length-dependent sensorimotor polyneuropathies. Copyright © 2013 Wiley Periodicals, Inc.

  14. Identification of internal control genes for quantitative expression analysis by real-time PCR in bovine peripheral lymphocytes.

    PubMed

    Spalenza, Veronica; Girolami, Flavia; Bevilacqua, Claudia; Riondato, Fulvio; Rasero, Roberto; Nebbia, Carlo; Sacchi, Paola; Martin, Patrice

    2011-09-01

    Gene expression studies in blood cells, particularly lymphocytes, are useful for monitoring potential exposure to toxicants or environmental pollutants in humans and livestock species. Quantitative PCR is the method of choice for obtaining accurate quantification of mRNA transcripts, although variations in the amount of starting material, enzymatic efficiency, and the presence of inhibitors can lead to evaluation errors. As a result, normalization of data is of crucial importance. The most common approach is the use of endogenous reference genes as an internal control, whose expression should ideally not vary among individuals and under different experimental conditions. The accurate selection of reference genes is therefore an important step in interpreting quantitative PCR studies. Since no systematic investigation in bovine lymphocytes has been performed, the aim of the present study was to assess the expression stability of seven candidate reference genes in circulating lymphocytes collected from 15 dairy cows. Following the characterization by flow cytometric analysis of the cell populations obtained from blood through a density gradient procedure, three popular software packages were used to evaluate the gene expression data. The results showed that two genes are sufficient for normalization of quantitative PCR studies in cattle lymphocytes and that YWHAZ, S24 and PPIA are the most stable genes. Copyright © 2010 Elsevier Ltd. All rights reserved.

  15. Anatomic and Quantitative Temporal Bone CT for Preoperative Assessment of Branchio-Oto-Renal Syndrome.

    PubMed

    Ginat, D T; Ferro, L; Gluth, M B

    2016-12-01

    We describe the temporal bone computed tomography (CT) findings of an unusual case of branchio-oto-renal syndrome with ectopic ossicles that are partially located in the middle cranial fossa. We also describe quantitative temporal bone CT assessment pertaining to cochlear implantation in the setting of anomalous cochlear anatomy associated with this syndrome.

  16. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking the mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype, which has been developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and high degree of agreement with manual scores with small average absolute and mixed errors.
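
    The multiset-based tokenization and similarity idea behind the SCCS technique can be illustrated with a toy tokenizer and overlap score (hypothetical names and scoring, in the spirit of the technique; the published method also involves document modelling and weighting):

```python
from collections import Counter

# Split an equation into operand/operator tokens ("2x+4=10" -> 2x, +, 4, =, 10).
def tokenize(equation):
    tokens, run = [], ""
    for ch in equation.replace(" ", ""):
        if ch.isalnum() or ch == ".":
            run += ch
        else:
            if run:
                tokens.append(run)
                run = ""
            tokens.append(ch)
    if run:
        tokens.append(run)
    return tokens

# Multiset overlap between expected and submitted equations, in [0, 1]:
# a quantitative partial-credit score rather than a binary equivalent/not.
def similarity(expected, response):
    a, b = Counter(tokenize(expected)), Counter(tokenize(response))
    overlap = sum((a & b).values())
    total = max(sum(a.values()), sum(b.values()))
    return overlap / total if total else 1.0
```

    An identical response scores 1.0, while a structurally different but partially overlapping step receives a fractional score, which is the kind of quantitative feedback the technique aims at.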

  17. Quantitative 3D analysis of bone in hip osteoarthritis using clinical computed tomography.

    PubMed

    Turmezei, Tom D; Treece, Graham M; Gee, Andrew H; Fotiadou, Anastasia F; Poole, Kenneth E S

    2016-07-01

    To assess the relationship between proximal femoral cortical bone thickness and radiological hip osteoarthritis using quantitative 3D analysis of clinical computed tomography (CT) data. Image analysis was performed on clinical CT imaging data from 203 female volunteers with a technique called cortical bone mapping (CBM). Colour thickness maps were created for each proximal femur. Statistical parametric mapping was performed to identify statistically significant differences in cortical bone thickness that corresponded with the severity of radiological hip osteoarthritis. Kellgren and Lawrence (K&L) grade, minimum joint space width (JSW) and a novel CT-based osteophyte score were also blindly assessed from the CT data. For each increase in K&L grade, cortical thickness increased by up to 25 % in distinct areas of the superolateral femoral head-neck junction and superior subchondral bone plate. For increasing severity of CT osteophytes, the increase in cortical thickness was more circumferential, involving a wider portion of the head-neck junction, with up to a 7 % increase in cortical thickness per increment in score. Results were not significant for minimum JSW. These findings indicate that quantitative 3D analysis of the proximal femur can identify changes in cortical bone thickness relevant to structural hip osteoarthritis. • CT is being increasingly used to assess bony involvement in osteoarthritis • CBM provides accurate and reliable quantitative analysis of cortical bone thickness • Cortical bone is thicker at the superior femoral head-neck with worse osteoarthritis • Regions of increased thickness co-locate with impingement and osteophyte formation • Quantitative 3D bone analysis could enable clinical disease prediction and therapy development.

  18. The quantitative assessment of epicardial fat distribution on human hearts: Implications for epicardial electrophysiology.

    PubMed

    Mattson, Alexander R; Soto, Mario J; Iaizzo, Paul A

    2018-07-01

    Epicardial electrophysiological procedures rely on dependable interfacing with the myocardial tissue. For example, epicardial pacing systems must generate sustainable chronic pacing capture, while epicardial ablations must effectively deliver energy to the target hyper-excitable myocytes. The human heart has a significant adipose layer which may impede epicardial procedures. The objective of this study was to quantitatively assess the relative location of epicardial adipose on the human heart, to define locations where epicardial therapies might be performed successfully. We studied perfusion-fixed human hearts (n = 105) in multiple isolated planes including: left ventricular margin, diaphragmatic surface, and anterior right ventricle. Relative adipose distribution was quantitatively assessed via planar images, using a custom-generated image analysis algorithm. In these specimens, 76.7 ± 13.8% of the left ventricular margin, 72.7 ± 11.3% of the diaphragmatic surface, and 92.1 ± 8.7% of the anterior right margin were covered with superficial epicardial adipose layers. Percent adipose coverage significantly increased with age (P < 0.001) and history of coronary artery disease (P < 0.05). No significant relationships were identified between relative percent adipose coverage and gender, body weight or height, BMI, history of hypertension, and/or history of congestive heart failure. Additionally, we describe two-dimensional probability distributions of epicardial adipose coverage for each of the three analysis planes. In this study, we detail the quantitative assessment and probabilistic mapping of the distribution of superficial epicardial adipose on the adult human heart. These findings have implications relative to performing epicardial procedures and/or designing procedures or tools to successfully perform such treatments. Clin. Anat. 31:661-666, 2018. © 2018 Wiley Periodicals, Inc.

  19. Quantitative instruments used to assess children's sense of smell: a review article.

    PubMed

    Moura, Raissa Gomes Fonseca; Cunha, Daniele Andrade; Gomes, Ana Carolina de Lima Gusmão; Silva, Hilton Justino da

    2014-01-01

    To systematically gather from the available literature the quantitative instruments used to assess the sense of smell in studies carried out with children. The present study included a survey in the Pubmed and Bireme platforms and in the databases of MedLine, Lilacs, regional SciELO and Web of Science, followed by selection and critical analysis of the articles found and chosen. We selected original articles related to the topic in question, conducted only with children, in Portuguese, English, and Spanish. We excluded studies addressing other phases of human development, exclusively or concurrently with the pediatric population; studies on animals; literature review articles; dissertations; book chapters; case study articles; and editorials. A book report protocol was created for this study, including the following information: author, department, year, location, population/sample, age, purpose of the study, methods, and main results. We found 8,451 articles by typing keywords and identifiers. Out of this total, 5,928 were excluded by the title, 2,366 by the abstract, and 123 after we read the full text. Thus, 34 articles were selected, of which 28 were repeated in the databases, for a total of 6 articles analyzed in this review. We observed a lack of standardization of the quantitative instruments used to assess children's sense of smell, with great variability in the methodology of the tests, which reduces the effectiveness and reliability of the results.

  20. Bone-marrow densitometry: Assessment of marrow space of human vertebrae by single energy high resolution-quantitative computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peña, Jaime A.; Damm, Timo; Bastgen, Jan

    Purpose: Accurate noninvasive assessment of vertebral bone marrow fat fraction is important for diagnostic assessment of a variety of disorders and therapies known to affect marrow composition. Moreover, it provides a means to correct fat-induced bias of single energy quantitative computed tomography (QCT) based bone mineral density (BMD) measurements. The authors developed new segmentation and calibration methods to obtain quantitative surrogate measures of marrow-fat density in the axial skeleton. Methods: The authors developed and tested two high resolution-QCT (HR-QCT) based methods which permit segmentation of bone voids in between trabeculae, hypothesizing that they are representative of bone marrow space. The methods permit calculation of marrow content in units of mineral equivalent marrow density (MeMD). The first method is based on global thresholding and peeling (GTP) to define a volume of interest away from the transition between trabecular bone and marrow. The second method, morphological filtering (MF), uses spherical elements of different radii (0.1–1.2 mm) and automatically places them in between trabeculae to identify regions with large trabecular interspace, the bone-void space. To determine their performance, data were compared ex vivo to high-resolution peripheral CT (HR-pQCT) images as the gold-standard. The performance of the methods was tested on a set of excised human vertebrae with intact bone marrow tissue representative of an elderly population with low BMD. Results: 86% (GTP) and 87% (MF) of the voxels identified as true marrow space on HR-pQCT images were correctly identified on HR-QCT images and thus these volumes of interest can be considered to be representative of true marrow space. Within this volume, MeMD was estimated with residual errors of 4.8 mg/cm³ corresponding to accuracy errors in fat fraction on the order of 5% both for GTP and MF methods. Conclusions: The GTP and MF methods on HR-QCT images permit
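
    The global thresholding and peeling (GTP) idea can be sketched on a toy 2D image (illustrative only; the study operates on 3D HR-QCT volumes): voxels below a mineral-density threshold are marrow candidates, and eroding ("peeling") the mask keeps the volume of interest away from the partial-volume-affected bone-marrow transition.

```python
# Toy GTP: threshold a 2D density image, then erode the marrow mask so
# it stays clear of the bone-marrow boundary. Threshold/peel counts are
# illustrative parameters, not values from the paper.
def gtp_marrow_mask(image, bone_thresh, peels=1):
    rows, cols = len(image), len(image[0])
    mask = [[image[r][c] < bone_thresh for c in range(cols)]
            for r in range(rows)]
    for _ in range(peels):
        mask = [[mask[r][c]
                 and all(0 <= r + dr < rows and 0 <= c + dc < cols
                         and mask[r + dr][c + dc]
                         for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                 for c in range(cols)] for r in range(rows)]
    return mask
```

    The MF variant replaces the uniform peel with spherical structuring elements of several radii, keeping only regions that can contain a whole sphere between trabeculae.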

  1. QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.

    PubMed

    Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus

    2018-03-01

    Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of low concentrated molecules with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improvement of accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB₁ < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated on the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
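
    A heavily simplified steady-state form of the QUESP idea (full-saturation limit; not the refined expressions derived in the paper, and all symbols and values below are assumptions) shows how the exchange rate k can be recovered from CEST measurements at several saturation powers:

```python
# Simplified steady-state CEST effect (proton transfer ratio) as a function
# of saturation amplitude w1 = gamma*B1, plus a grid-search fit of the
# exchange rate k. f = labile-proton fraction, r1w = water R1 (1/s),
# w1 in rad/s and k in 1/s; all parameter values are illustrative.
def w1_from_b1(b1_microtesla):
    return 267.513 * b1_microtesla          # gamma * B1 for protons, rad/s

def ptr(w1, k, f, r1w):
    alpha = w1 ** 2 / (w1 ** 2 + k ** 2)    # labeling efficiency
    return f * k * alpha / (r1w + f * k * alpha)

def fit_k(w1_list, ptr_list, f, r1w, k_grid):
    def sse(k):
        return sum((ptr(w1, k, f, r1w) - p) ** 2
                   for w1, p in zip(w1_list, ptr_list))
    return min(k_grid, key=sse)
```

    The varying-power dependence enters through the labeling efficiency α: at low power (γB₁ < k) the measured effect is strongly attenuated, which is exactly the regime where the paper's refined equations matter most.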

  2. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating polarization radar to enhance the wingbeat frequency method of identification, are also presented.

  3. Quantitative Assessment of Transportation Network Vulnerability with Dynamic Traffic Simulation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shekar, Venkateswaran; Fiondella, Lance; Chatterjee, Samrat

    Transportation networks are critical to the social and economic function of nations. Given the continuing increase in the populations of cities throughout the world, the criticality of transportation infrastructure is expected to increase. Thus, it is ever more important to mitigate congestion as well as to assess the impact disruptions would have on individuals who depend on transportation for their work and livelihood. Moreover, several government organizations are responsible for ensuring transportation networks are available despite the constant threat of natural disasters and terrorist activities. Most of the previous transportation network vulnerability research has been performed in the context of static traffic models, many of which are formulated as traditional optimization problems. However, transportation networks are dynamic because their usage varies over time. Thus, more appropriate methods to characterize the vulnerability of transportation networks should consider their dynamic properties. This paper presents a quantitative approach to assess the vulnerability of a transportation network to disruptions with methods from traffic simulation. Our approach can prioritize the critical links over time and is generalizable to the case where both link and node disruptions are of concern. We illustrate the approach through a series of examples. Our results demonstrate that the approach provides quantitative insight into the time varying criticality of links. Such an approach could be used as the objective function of less traditional optimization methods that use simulation and other techniques to evaluate the relative utility of a particular network defense to reduce vulnerability and increase resilience.
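
    A static, single-snapshot version of link criticality (illustrative only; the paper's contribution is precisely the dynamic, simulation-based extension of this idea) ranks each link by the travel-time increase its removal causes between an origin and a destination:

```python
import heapq

# Dijkstra shortest travel time on a directed graph given as
# {node: [(neighbor, travel_time), ...]}.
def shortest_time(graph, src, dst):
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

# Criticality of each link = increase in shortest travel time when that
# link is disrupted (removed). A dynamic assessment would repeat this
# with time-varying link costs from traffic simulation.
def link_criticality(graph, src, dst):
    base = shortest_time(graph, src, dst)
    scores = {}
    for u, edges in graph.items():
        for v, w in edges:
            cut = {a: [(b, c) for b, c in e if (a, b) != (u, v)]
                   for a, e in graph.items()}
            scores[(u, v)] = shortest_time(cut, src, dst) - base
    return scores
```

    Links whose removal leaves travel time unchanged score zero; links on the only fast path score highest and are the natural candidates for hardening.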

  4. Quantitative phase imaging for enhanced assessment of optomechanical cancer cell properties

    NASA Astrophysics Data System (ADS)

    Kastl, Lena; Kemper, Björn; Schnekenburger, Jürgen

    2018-02-01

    Optical cell stretching provides label-free investigations of cells by measuring their biomechanical properties based on deformability determination in a fiber optical two-beam trap. However, the stretching forces in this two-beam laser trap depend on the optical properties of the investigated specimen. Therefore, we characterized in parallel four cancer cell lines with varying degree of differentiation utilizing quantitative phase imaging (QPI) and optical cell stretching. The QPI data allowed enhanced assessment of the mechanical cell properties measured with the optical cell stretcher and demonstrates the high potential of cell phenotyping when both techniques are combined.

  5. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, the environment, the economy and society. This paper deals with drought risk assessment, which is the first step, designed to find out what the problems are, and comprises three distinct stages, namely risk identification, risk estimation and risk evaluation. Although risk management is not covered in this paper, there should be a fourth stage to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: This step involves drought quantification and monitoring based on the remotely sensed RDI and the extraction of several features, such as severity, duration, areal extent, and onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: This step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: This step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece.
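    The RDI's initial value is simply the ratio of accumulated precipitation to accumulated potential evapotranspiration over a period, commonly standardized across years after a log transform. A minimal sketch; the yearly P and PET totals below are made up for illustration.

```python
import math

def rdi_alpha(precip_mm, pet_mm):
    """Initial RDI value for one period: total precipitation / total PET."""
    return sum(precip_mm) / sum(pet_mm)

def standardized_rdi(alphas):
    """Standardize ln(alpha) over a series of years (RDI_st).

    Negative values indicate drier-than-normal years; the more negative,
    the more severe the drought.
    """
    ys = [math.log(a) for a in alphas]
    mean = sum(ys) / len(ys)
    std = math.sqrt(sum((y - mean) ** 2 for y in ys) / len(ys))
    return [(y - mean) / std for y in ys]

# Hypothetical (wet-season, dry-season) P and PET totals in mm, four years.
years = [([420, 60], [510, 190]), ([390, 55], [520, 180]),
         ([210, 30], [560, 220]), ([460, 80], [500, 170])]
alphas = [rdi_alpha(p, pet) for p, pet in years]
rdi_st = standardized_rdi(alphas)
```

    The third (dry) year comes out with a clearly negative RDI_st, which is the kind of signal the severity/duration/extent features above are built on.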

  6. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as accurately and in as much detail as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along the front, back and thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness so that it will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
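    As a much-reduced 2D analogue of the elliptic Fourier approach (the study itself uses a 3D extension), a closed outline can be described by the low-order FFT coefficients of its complex coordinates, normalized so that position and scale drop out. The sampled ellipse below is a stand-in for a torso outline, not data from the study.

```python
import numpy as np

def fourier_descriptors(points, n_harmonics=8):
    """Fourier shape descriptors of a closed 2D outline.

    The outline is treated as a complex signal z = x + iy; dropping the
    DC term removes position, dividing by the first harmonic removes scale.
    """
    z = points[:, 0] + 1j * points[:, 1]
    coeffs = np.fft.fft(z) / len(z)
    coeffs[0] = 0.0                                       # position invariance
    return coeffs[1:n_harmonics + 1] / np.abs(coeffs[1])  # scale invariance

# Stand-in outline: an ellipse sampled at 128 points.
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
outline = np.column_stack([3 * np.cos(t), 1.5 * np.sin(t)])
d = fourier_descriptors(outline)
```

    Translating and uniformly rescaling the outline leaves the descriptors unchanged, which is what makes coefficients of this kind usable for comparing body shapes across individuals with multivariate statistics.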

  7. Quantitative assessment of Cerenkov luminescence for radioguided brain tumor resection surgery

    NASA Astrophysics Data System (ADS)

    Klein, Justin S.; Mitchell, Gregory S.; Cherry, Simon R.

    2017-05-01

    Cerenkov luminescence imaging (CLI) is a developing imaging modality that detects radiolabeled molecules via the visible light emitted during the radioactive decay process. We used a Monte Carlo-based computer simulation to quantitatively compare CLI with direct detection of the ionizing radiation itself as an intraoperative imaging tool for assessment of brain tumor margins. Our brain tumor model consisted of a 1 mm spherical tumor remnant embedded up to 5 mm in depth below the surface of normal brain tissue. Tumor-to-background contrast ratios ranging from 2:1 to 10:1 were considered. We quantified all decay signals (e±, gamma photons, Cerenkov photons) reaching the surface of the brain volume. CLI proved to be the most sensitive method for detecting the tumor volume in both imaging and non-imaging strategies, as assessed by contrast-to-noise ratio and by the receiver operating characteristic output of a channelized Hotelling observer.

  8. Quantitative assessment of arm tremor in people with neurological disorders.

    PubMed

    Jeonghee Kim; Parnell, Claire; Wichmann, Thomas; DeWeerth, Stephen P

    2016-08-01

    Abnormal oscillatory movement (i.e., tremor) is usually evaluated qualitatively by clinicians and quantified with subjective scoring methods. These methods are often inaccurate. We utilized a quantitative, standardized task based on Fitts' law to assess the performance of arm movement with tremor, in which participants controlled a gyration mouse on a computer. The experiment included center-out tapping (COT) and rectangular track navigation (RTN) tasks. We report the results of a pilot study in which we measured the performance of healthy participants in whom tremor was simulated by imposing oscillatory movements on the arm with a vibration motor. We compared their movement speed and accuracy with and without the artificial "tremor." We found that the artificial tremor significantly reduced path efficiency for both tasks (COT: 56.8 vs. 46.2%, p < 0.05; RTN: 94.2 vs. 67.4%, p < 0.05), and we were able to distinguish the presence of tremor. From these results, we expect to be able to quantify the severity of tremor and the effectiveness of therapy for tremor patients.
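    Fitts-style tasks score pointing performance from target distance and width, and path efficiency compares the straight-line endpoint distance with the path actually traveled. A small sketch; the Shannon formulation of the index of difficulty and the zigzag "tremor" path are illustrative assumptions, not the study's exact protocol.

```python
import math

def index_of_difficulty(distance, width):
    """Fitts' index of difficulty in bits (Shannon formulation)."""
    return math.log2(distance / width + 1)

def path_efficiency(path):
    """Straight-line endpoint distance as a percentage of traveled path length."""
    straight = math.dist(path[0], path[-1])
    traveled = sum(math.dist(p, q) for p, q in zip(path, path[1:]))
    return 100.0 * straight / traveled

# A direct cursor trajectory vs. an oscillatory, tremor-like one to the same target.
direct = [(0, 0), (50, 0), (100, 0)]
tremor = [(0, 0), (25, 8), (50, -8), (75, 8), (100, 0)]
```

    The oscillatory path covers more ground for the same net displacement, so its efficiency drops, mirroring the COT/RTN differences reported above.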

  9. Development of quantitative analysis method for stereotactic brain image: assessment of reduced accumulation in extent and severity using anatomical segmentation.

    PubMed

    Mizumura, Sunao; Kumita, Shin-ichiro; Cho, Keiichi; Ishihara, Makiko; Nakajo, Hidenobu; Toba, Masahiro; Kumazaki, Tatsuo

    2003-06-01

    With visual assessment by three-dimensional (3D) brain image analysis methods that use a stereotactic brain coordinate system, such as 3D stereotactic surface projections (3D-SSP) and statistical parametric mapping, it is difficult to quantitatively assess anatomical information and the extent of an abnormal region. In this study, we devised a method to quantitatively assess local abnormal findings by segmenting a brain map according to anatomical structure. Using quantitative local abnormality assessment with this method, we studied the characteristics of the distribution of reduced blood flow in cases with dementia of the Alzheimer type (DAT). From twenty-five cases with DAT (mean age, 68.9 years), all of whom were diagnosed with probable Alzheimer's disease based on NINCDS-ADRDA criteria, we collected I-123 iodoamphetamine SPECT data. A 3D brain map generated with the 3D-SSP program was compared with data from a control group of 20 cases age-matched to the subject cases. To study local abnormalities on the 3D images, we divided the whole brain into 24 segments based on anatomical classification. For each segment, we assessed the extent of the abnormal region (the proportion of coordinates with a Z-value exceeding the threshold, among all coordinates within the segment) and its severity (the average Z-value of the coordinates whose Z-value exceeded the threshold). This method clarified the orientation and expansion of reduced accumulation by classifying stereotactic brain coordinates according to anatomical structure, and was considered useful for quantitatively grasping the distribution of abnormalities in the brain and changes in that distribution.
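    The two segment-level measures reduce to a few lines of array code: extent is the fraction of a segment's coordinates whose Z-value exceeds the threshold, and severity is the mean Z-value of those supra-threshold coordinates. A sketch with a synthetic Z-map; the array shape, segment geometry, and threshold are illustrative.

```python
import numpy as np

def extent_and_severity(z_map, segment_mask, z_threshold=2.0):
    """Extent and severity of reduced accumulation within one anatomical segment.

    extent   : fraction of the segment's voxels with Z > threshold
    severity : mean Z of those supra-threshold voxels (0 if none exceed it)
    """
    z = z_map[segment_mask]
    abnormal = z[z > z_threshold]
    extent = abnormal.size / z.size
    severity = float(abnormal.mean()) if abnormal.size else 0.0
    return extent, severity

# Synthetic 8x8 Z-map with one "segment" covering the left half of the slice.
z_map = np.zeros((8, 8))
z_map[2:4, 0:2] = 3.5                      # a focal abnormality, Z = 3.5
segment = np.zeros((8, 8), dtype=bool)
segment[:, :4] = True                      # 32 voxels belong to this segment
extent, severity = extent_and_severity(z_map, segment)
```

    Here 4 of the segment's 32 voxels exceed the threshold, giving an extent of 0.125 and a severity of 3.5; applied per anatomical segment, this yields exactly the two descriptors the abstract defines.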

  10. A Machine Learned Classifier That Uses Gene Expression Data to Accurately Predict Estrogen Receptor Status

    PubMed Central

    Bastani, Meysam; Vos, Larissa; Asgarian, Nasimeh; Deschenes, Jean; Graham, Kathryn; Mackey, John; Greiner, Russell

    2013-01-01

    Background Selecting the appropriate treatment for breast cancer requires accurately determining the estrogen receptor (ER) status of the tumor. However, the standard method for determining this status, immunohistochemical analysis of formalin-fixed, paraffin-embedded samples, suffers from numerous technical and reproducibility issues. Assessment of ER status based on RNA expression can provide more objective, quantitative and reproducible test results. Methods To learn a parsimonious RNA-based classifier of hormone receptor status, we applied a machine learning tool to a training dataset of gene expression microarray data obtained from 176 frozen breast tumors, whose ER status was determined by applying ASCO-CAP guidelines to standardized immunohistochemical testing of formalin-fixed tumor samples. Results This produced a three-gene classifier that can predict the ER status of a novel tumor, with a cross-validation accuracy of 93.17±2.44%. When applied to an independent validation set and to four other public databases, some on different platforms, this classifier obtained over 90% accuracy in each. In addition, we found that this prediction rule separated the patients' recurrence-free survival curves with a hazard ratio lower than the one based on the IHC analysis of ER status. Conclusions Our efficient and parsimonious classifier lends itself to highly accurate and low-cost RNA-based assessments of ER status, suitable for routine high-throughput clinical use. This analytic method provides a proof-of-principle that may be applicable to developing effective RNA-based tests for other biomarkers and conditions. PMID:24312637
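    The idea of a parsimonious expression-based classifier can be illustrated with a nearest-centroid rule over three genes, evaluated by leave-one-out cross-validation. Everything below (gene means, sample counts, the classification rule) is synthetic; the study's actual classifier was learned with a different tool from 176 real microarray profiles.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log2 expression for three genes in ER+ and ER- tumors (made up):
# genes 1 and 3 separate the classes, gene 2 is uninformative.
er_pos = rng.normal(loc=[8.0, 6.5, 7.0], scale=0.5, size=(40, 3))
er_neg = rng.normal(loc=[4.0, 6.5, 3.5], scale=0.5, size=(40, 3))
X, y = np.vstack([er_pos, er_neg]), np.array([1] * 40 + [0] * 40)

def predict(train_X, train_y, x):
    """Nearest-centroid rule: assign x to the class with the closer mean profile."""
    c0 = train_X[train_y == 0].mean(axis=0)
    c1 = train_X[train_y == 1].mean(axis=0)
    return int(np.linalg.norm(x - c1) < np.linalg.norm(x - c0))

# Leave-one-out cross-validation: hold each tumor out, train on the rest.
hits = sum(predict(np.delete(X, i, axis=0), np.delete(y, i), X[i]) == y[i]
           for i in range(len(y)))
accuracy = hits / len(y)
```

    With class means this well separated, the rule is essentially perfect; real expression data are far noisier, which is why a carefully learned three-gene classifier reporting ~93% cross-validated accuracy is a strong result.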

  11. N-Nitroso compounds: Assessing agreement between food frequency questionnaires and 7-day food records

    USDA-ARS?s Scientific Manuscript database

    N-nitroso compounds are recognized as important dietary carcinogens. Accurate assessment of N-nitroso intake is fundamental to advancing research regarding its role with cancer. Previous studies have not used a quantitative database to estimate the intake of these compounds in a US population. To ad...

  12. Clinical value of patient-specific three-dimensional printing of congenital heart disease: Quantitative and qualitative assessments

    PubMed Central

    Lau, Ivan Wen Wen; Liu, Dongting; Xu, Lei; Fan, Zhanming

    2018-01-01

    Objective Current diagnostic assessment tools remain suboptimal in demonstrating complex morphology of congenital heart disease (CHD). This limitation has posed several challenges in preoperative planning, communication in medical practice, and medical education. This study aims to investigate the dimensional accuracy and the clinical value of 3D printed model of CHD in the above three areas. Methods Using cardiac computed tomography angiography (CCTA) data, a patient-specific 3D model of a 20-month-old boy with double outlet right ventricle was printed in Tango Plus material. Pearson correlation coefficient was used to evaluate correlation of the quantitative measurements taken at analogous anatomical locations between the CCTA images pre- and post-3D printing. Qualitative analysis was conducted by distributing surveys to six health professionals (two radiologists, two cardiologists and two cardiac surgeons) and three medical academics to assess the clinical value of the 3D printed model in these three areas. Results Excellent correlation (r = 0.99) was noted in the measurements between CCTA and 3D printed model, with a mean difference of 0.23 mm. Four out of six health professionals found the model to be useful in facilitating preoperative planning, while all of them thought that the model would be invaluable in enhancing patient-doctor communication. All three medical academics found the model to be helpful in teaching, and thought that the students will be able to learn the pathology quicker with better understanding. Conclusion The complex cardiac anatomy can be accurately replicated in flexible material using 3D printing technology. 3D printed heart models could serve as an excellent tool in facilitating preoperative planning, communication in medical practice, and medical education, although further studies with inclusion of more clinical cases are needed. PMID:29561912

  13. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org). Copyright 2010 Elsevier B.V. All rights reserved.
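    The core of a two-dimensional (second-order) Monte-Carlo simulation is a matrix whose rows are draws of uncertain parameters and whose columns are draws of inter-individual variability; summarizing across each dimension separately keeps the two apart, as mc2d does. A numpy sketch with a wholly hypothetical exponential dose-response model (not the E. coli O157:H7 model of the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
N_UNC, N_VAR = 200, 1000        # outer (uncertainty) and inner (variability) sizes

# Uncertainty: the mean log10 contamination per serving is imperfectly known.
mu = rng.normal(2.0, 0.3, size=N_UNC)

# Variability: serving-to-serving contamination, conditional on each mu draw.
log10_dose = mu[:, None] + rng.normal(0.0, 0.8, size=(N_UNC, N_VAR))

# Hypothetical exponential dose-response; per-serving probability of illness.
r = 1e-3
risk = 1.0 - np.exp(-r * 10.0 ** log10_dose)

# Average over variability (columns), then express uncertainty (rows) as a band.
mean_risk = risk.mean(axis=1)
lo, median, hi = np.percentile(mean_risk, [2.5, 50, 97.5])
```

    The resulting statement, "the mean risk is `median`, with a 95% uncertainty interval (`lo`, `hi`)", is exactly the kind of separated summary a second-order simulation produces: variability is integrated out per row, while the spread across rows reflects parameter uncertainty alone.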

  14. A Multicenter Trial of the Proficiency of Smart Quantitative Sensation Tests

    PubMed Central

    Dyck, Peter J.; Argyros, Barbara; Russell, James W.; Gahnstrom, Linde E.; Nalepa, Susan; Albers, James W.; Lodermeier, Karen A.; Zafft, Andrew J.; Dyck, P. James B.; Klein, Christopher J.; Litchy, William J.; Davies, Jenny L.; Carter, Rickey E.; Melton, L. Joseph

    2014-01-01

    Introduction We assessed the proficiency (accuracy and intra- and inter-test reproducibility) of smart quantitative sensation tests (smart QSTs) in subjects without and with diabetic polyneuropathy (DSPN). Methods Technologists from 3 medical centers, using separate but identical QST systems, independently assessed 6 modalities of foot (or leg) sensation twice in patients without (n = 6) and with (n = 6) DSPN using smart computer-assisted QSTs. Results Low rates of test abnormalities were observed in health and high rates in DSPN. Very high intra-class correlations were obtained between continuous measures of QSTs and neuropathy signs, symptoms, or nerve conductions (NCs). No significant intra- or inter-test differences were observed. Discussion These results provide proof of concept that smart QSTs provide accurate assessment of sensation loss without intra- or inter-test differences, useful for multicenter trials. Smart technology makes possible efficient testing of body surface area sensation loss in symmetric length-dependent sensorimotor polyneuropathies. PMID:23929701

  15. Dynamic and accurate assessment of acetaminophen-induced hepatotoxicity by integrated photoacoustic imaging and mechanistic biomarkers in vivo.

    PubMed

    Brillant, Nathalie; Elmasry, Mohamed; Burton, Neal C; Rodriguez, Josep Monne; Sharkey, Jack W; Fenwick, Stephen; Poptani, Harish; Kitteringham, Neil R; Goldring, Christopher E; Kipar, Anja; Park, B Kevin; Antoine, Daniel J

    2017-10-01

    The prediction and understanding of acetaminophen (APAP)-induced liver injury (APAP-ILI) and the response to therapeutic interventions is complex. This is due in part to the sensitivity and specificity limitations of currently used assessment techniques. Here we sought to determine the utility of integrating translational non-invasive photoacoustic imaging of liver function with mechanistic circulating biomarkers of hepatotoxicity and histological assessment, to facilitate a more accurate and precise characterization of APAP-ILI and of the efficacy of therapeutic intervention. Perturbation of liver function and cellular viability was assessed in C57BL/6J male mice by indocyanine green (ICG) clearance (multispectral optoacoustic tomography, MSOT) and by measurement of mechanistic (miR-122, HMGB1) and established (ALT, bilirubin) circulating biomarkers in response to acetaminophen and its treatment with acetylcysteine (NAC) in vivo. We utilised a 60% partial hepatectomy model, a situation of defined hepatic functional mass loss, as a comparator for the acetaminophen-induced changes. Integration of these mechanistic markers correlated with histological features of APAP hepatotoxicity in a time-dependent manner. They accurately reflected the onset of and recovery from hepatotoxicity compared with traditional biomarkers, and also reported the efficacy of NAC with high sensitivity. ICG clearance kinetics correlated with histological scores for acute liver damage for APAP (i.e. 3 h timepoint; r=0.90, P<0.0001) and with elevations in both of the mechanistic biomarkers, miR-122 (e.g. 6 h timepoint; r=0.70, P=0.005) and HMGB1 (e.g. 6 h timepoint; r=0.56, P=0.04). For the first time, we report the utility of this non-invasive longitudinal imaging approach to provide direct visualisation of liver function coupled with mechanistic biomarkers, in the same animal, allowing the investigation of the toxicological and pharmacological aspects of APAP-ILI and hepatic regeneration. Copyright © 2017

  16. Introduction of an automated user-independent quantitative volumetric magnetic resonance imaging breast density measurement system using the Dixon sequence: comparison with mammographic breast density assessment.

    PubMed

    Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja

    2015-02-01

    The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system on the basis of magnetic resonance imaging (MRI) using the Dixon technique, as well as to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants were subjected to BD assessment with MRI using a Dixon-technique sequence (repetition time/echo times, 6 milliseconds/2.45 milliseconds/2.67 milliseconds; 1-mm isotropic resolution; acquisition time, 3 minutes 38 seconds). To test the reproducibility, a second MRI was performed after patient repositioning. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was a nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998) with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than quantitative and qualitative MG BD assessment (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment through different levels of BD.

  17. Assessing Student Status and Progress in Science Reasoning and Quantitative Literacy at a Very Large Undergraduate Institution

    NASA Astrophysics Data System (ADS)

    Donahue, Megan; Kaplan, J.; Ebert-May, D.; Ording, G.; Melfi, V.; Gilliland, D.; Sikorski, A.; Johnson, N.

    2009-01-01

    The typical large liberal-arts, tier-one research university requires all of its graduates to achieve some minimal standard of quantitative literacy and scientific reasoning skills. But how do we know that what we are doing, as instructors and as a university, is working the way we think it should? At Michigan State University, a cross-disciplinary team of scientists, statisticians, and teacher-education experts has begun a large-scale investigation of student mastery of quantitative and scientific skills, beginning with an assessment of 3,000 freshmen before they start their university careers. We will describe the process we used for developing and testing an instrument and for expanding faculty involvement and input on high-level goals. For this limited presentation, we will confine the discussion mainly to the scientific reasoning perspective, but we will briefly mention some intriguing observations regarding quantitative literacy as well. This project represents the beginning of long-term, longitudinal tracking of the progress of students at our institution. We will discuss preliminary results of our 2008 assessment of incoming freshmen at Michigan State, and where we plan to go from here. We acknowledge local support from the Quality Fund of the Office of the Provost at MSU. We also acknowledge the Center for Assessment at James Madison University and the NSF for their support at the very beginning of our work.

  18. Dynamic of grassland vegetation degradation and its quantitative assessment in the northwest China

    NASA Astrophysics Data System (ADS)

    Zhou, Wei; Gang, Chengcheng; Zhou, Liang; Chen, Yizhao; Li, Jianlong; Ju, Weimin; Odeh, Inakwu

    2014-02-01

    Grasslands, one of the most widespread land cover types in China, are of great importance to natural environmental protection and socioeconomic development. An accurate quantitative assessment of the effects of inter-annual climate change and human activities on grassland productivity has great theoretical significance to understanding the driving mechanisms of grassland degradation. Net primary productivity (NPP) was selected as an indicator for analyzing grassland vegetation dynamics from 2001 to 2010. Potential NPP and the difference between potential NPP and actual NPP were used to represent the effects of climate and human factors, respectively, on grassland degradation. The results showed that 61.49% of grassland areas underwent degradation, whereas only 38.51% exhibited restoration. In addition, 65.75% of grassland degradation was caused by human activities whereas 19.94% was caused by inter-annual climate change. By contrast, 32.32% of grassland restoration was caused by human activities, whereas 56.56% was caused by climatic factors. Therefore, inter-annual climate change is the primary cause of grassland restoration, whereas human activities are the primary cause of grassland degradation. Grassland dynamics and the relative roles of climate and human factors in grassland degradation and restoration varied greatly across the five provinces studied. The contribution of human activities to grassland degradation was greater than that of climate change in all five provinces. Three outcomes were observed in grassland restoration: First, the contribution of climate to grassland restoration was greater than that of human activities, particularly in Qinghai, Inner Mongolia, and Xinjiang. Second, the contribution of human activities to grassland restoration was greater than that of climate in Gansu. Third, the two factors almost equally contributed to grassland restoration in Tibet. 
Therefore, the effectiveness of ecological restoration programs should be enhanced.
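    The attribution logic above rests on comparing trends in actual NPP (ANPP, driven by climate plus people) and potential NPP (PNPP, climate only), with human-induced NPP defined as HNPP = PNPP - ANPP. A minimal per-pixel decision rule in this spirit; this is a common simplification, and the paper's exact criteria may differ.

```python
def attribute_change(anpp_slope, pnpp_slope):
    """Attribute a pixel's grassland change to climate or human activity.

    anpp_slope : trend of actual NPP (negative = degradation, positive = restoration)
    pnpp_slope : trend of potential, climate-only NPP
    Degradation while PNPP also falls points to climate; degradation while
    PNPP rises means human pressure is offsetting a favorable climate
    (HNPP = PNPP - ANPP is growing). Restoration mirrors this logic.
    """
    if anpp_slope < 0:                                   # degradation
        return "climate" if pnpp_slope < 0 else "human"
    return "climate" if pnpp_slope > 0 else "human"      # restoration
```

    Applied over a decade of NPP maps, counting pixels in each category yields percentage breakdowns of the kind reported above (e.g., human activities driving most degradation while climate drives most restoration).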

  19. Assessment of fat and lean mass by quantitative magnetic resonance: a future technology of body composition research?

    PubMed

    Bosy-Westphal, Anja; Müller, Manfred J

    2015-09-01

    For the assessment of energy balance or monitoring of therapeutic interventions, there is a need for noninvasive and highly precise methods of body composition analysis that are able to accurately measure small changes in fat and fat-free mass (FFM). The use of quantitative magnetic resonance (QMR) for measurement of body composition has long been established in animal studies. There are, however, only a few human studies that examine the validity of this method. These studies have consistently shown a high precision of QMR and only a small underestimation of fat mass by QMR when compared with a 4-compartment model as a reference. An underestimation of fat mass by QMR is also supported by the comparison between measured energy balance (as a difference between energy intake and energy expenditure) and energy balance predicted from changes in fat mass and FFM. Fewer calories were lost and gained as fat mass compared with the value expected from measured energy balance. Current evidence in healthy humans has shown that QMR is a valid and precise method for noninvasive measurement of body composition. Contrary to standard reference methods, such as densitometry and dual X-ray absorptiometry, QMR results are independent of FFM hydration. However, despite a high accuracy and a low minimal detectable change, underestimation of fat mass by QMR is possible and limits the use of this method for quantification of energy balance.

  20. Co-Teaching in Middle School Classrooms: Quantitative Comparative Study of Special Education Student Assessment Performance

    ERIC Educational Resources Information Center

    Reese, De'borah Reese

    2017-01-01

    The purpose of this quantitative comparative study was to determine the existence or nonexistence of performance pass rate differences of special education middle school students on standardized assessments between pre and post co-teaching eras disaggregated by subject area and school. Co-teaching has altered classroom environments in many ways.…

  1. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database, was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data

  2. Determination of exposure multiples of human metabolites for MIST assessment in preclinical safety species without using reference standards or radiolabeled compounds.

    PubMed

    Ma, Shuguang; Li, Zhiling; Lee, Keun-Joong; Chowdhury, Swapan K

    2010-12-20

    A simple, reliable, and accurate method was developed for the quantitative assessment of metabolite coverage in preclinical safety species by mixing equal volumes of human plasma with blank plasma of the animal species, and vice versa, followed by analysis using high-resolution, full-scan, accurate-mass spectrometry. This approach provided results comparable (within ±15%) to those obtained from regulated bioanalysis and did not require synthetic standards or radiolabeled compounds. In addition, both qualitative and quantitative data were obtained for all metabolites from a single LC-MS analysis; therefore, the coverage of any metabolite of interest can be obtained.

  3. Quantitative Phase Imaging in a Volume Holographic Microscope

    NASA Astrophysics Data System (ADS)

    Waller, Laura; Luo, Yuan; Barbastathis, George

    2010-04-01

    We demonstrate a method for quantitative phase imaging in a Volume Holographic Microscope (VHM) from a single exposure, describe the properties of the system, and show experimental results. The VHM system uses a multiplexed volume hologram (VH) to laterally separate images from different focal planes. This 3D intensity information is then used to solve the transport-of-intensity equation (TIE) and recover phase quantitatively. We discuss the modifications to the technique that were made in order to give accurate results.
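    Under a uniform in-focus intensity I0, the TIE reduces to a Poisson equation, k dI/dz = -I0 ∇²φ, which inverts directly in Fourier space. The sketch below checks such a solver against a synthetic Gaussian phase; the grid, wavelength, and forward model are assumed for the self-test, and the VHM-specific parts (hologram multiplexing, lateral plane separation) are not modeled.

```python
import numpy as np

def laplacian_symbol(n, dx):
    """FFT multiplier representing the 2D Laplacian on an n x n periodic grid."""
    f = np.fft.fftfreq(n, d=dx)
    return -(2 * np.pi) ** 2 * (f[:, None] ** 2 + f[None, :] ** 2)

def tie_phase(dIdz, I0, k, dx):
    """Recover phase from dI/dz assuming uniform intensity I0 (FFT Poisson solve)."""
    lap = laplacian_symbol(dIdz.shape[0], dx)
    lap[0, 0] = 1.0                        # dodge division by zero at DC
    phi_hat = np.fft.fft2(-k * dIdz / I0) / lap
    phi_hat[0, 0] = 0.0                    # phase is only recovered up to a constant
    return np.fft.ifft2(phi_hat).real

# Synthetic test: a Gaussian phase bump on a 64x64 grid (all parameters assumed).
n, dx, I0 = 64, 0.1, 1.0
k = 2 * np.pi / 0.5                        # wavenumber for an assumed 0.5 um wavelength
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x, indexing="ij")
phi_true = np.exp(-(X ** 2 + Y ** 2))

# Forward model: an axial intensity derivative consistent with the uniform-intensity TIE.
lap_phi = np.fft.ifft2(laplacian_symbol(n, dx) * np.fft.fft2(phi_true)).real
dIdz = -I0 / k * lap_phi
phi_rec = tie_phase(dIdz, I0, k, dx)
```

    The recovered map matches the true phase up to the unrecoverable constant offset, which is the expected behavior of any TIE solver: absolute phase is lost, spatial phase structure is not.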

  4. Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study

    PubMed Central

    Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger

    2015-01-01

    Study design Animal experimental study Objective To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods We developed an algorithm to subtract out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their amount and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results NP voxel count measurements showed strong correlations to qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001) and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3. The mean T2-RTs for the punctured groups did not show a statistically significant difference from those of healthy IVDs (63.55ms ±5.88ms month 1 and 62.61ms ±5.02ms) at either time point. Conclusion The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle puncture model. The mean NP T2-RT does not change significantly in needle

  5. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg⁻¹, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg⁻¹, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg⁻¹, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  6. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log₁₀ units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.
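The cross-validated Q² reported above is conventionally computed from out-of-fold predictions as 1 − PRESS/TSS; the following is a generic sketch of that statistic, not the authors' code.

```python
def q2(y_true, y_pred_cv):
    """Cross-validated Q^2 = 1 - PRESS/TSS, where y_pred_cv holds
    out-of-fold (cross-validation) predictions, e.g. predicted log10
    toxicity values, and y_true the corresponding observed values."""
    press = sum((t - p) ** 2 for t, p in zip(y_true, y_pred_cv))
    mean = sum(y_true) / len(y_true)
    tss = sum((t - mean) ** 2 for t in y_true)
    return 1.0 - press / tss
```

A model that merely predicts the training mean scores Q² = 0, so the reported 0.25-0.45 indicates a modest but real predictive signal.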

  7. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

    This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm³) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm³. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm³. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm³ will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. 
Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0

  8. Multifunctional Skin-like Electronics for Quantitative, Clinical Monitoring of Cutaneous Wound Healing

    PubMed Central

    Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R. Chad; Bonifas, Andrew P.; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A.; Huang, Yonggang; West, Dennis P.; Paller, Amy S.; Alam, Murad

    2014-01-01

    Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here we report a skin-like electronics platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of ‘epidermal’ electronics system in a realistic, clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management. PMID:24668927

  9. Multifunctional skin-like electronics for quantitative, clinical monitoring of cutaneous wound healing

    DOE PAGES

    Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; ...

    2014-03-26

    Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. In this paper, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of “epidermal” electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. Finally, the results have the potential to address important unmet needs in chronic wound management.

  10. Multifunctional skin-like electronics for quantitative, clinical monitoring of cutaneous wound healing.

    PubMed

    Hattori, Yoshiaki; Falgout, Leo; Lee, Woosik; Jung, Sung-Young; Poon, Emily; Lee, Jung Woo; Na, Ilyoun; Geisler, Amelia; Sadhwani, Divya; Zhang, Yihui; Su, Yewang; Wang, Xiaoqi; Liu, Zhuangjian; Xia, Jing; Cheng, Huanyu; Webb, R Chad; Bonifas, Andrew P; Won, Philip; Jeong, Jae-Woong; Jang, Kyung-In; Song, Young Min; Nardone, Beatrice; Nodzenski, Michael; Fan, Jonathan A; Huang, Yonggang; West, Dennis P; Paller, Amy S; Alam, Murad; Yeo, Woon-Hong; Rogers, John A

    2014-10-01

    Non-invasive, biomedical devices have the potential to provide important, quantitative data for the assessment of skin diseases and wound healing. Traditional methods either rely on qualitative visual and tactile judgments of a professional and/or data obtained using instrumentation with forms that do not readily allow intimate integration with sensitive skin near a wound site. Here, an electronic sensor platform that can softly and reversibly laminate perilesionally at wounds to provide highly accurate, quantitative data of relevance to the management of surgical wound healing is reported. Clinical studies on patients using thermal sensors and actuators in fractal layouts provide precise time-dependent mapping of temperature and thermal conductivity of the skin near the wounds. Analytical and simulation results establish the fundamentals of the sensing modalities, the mechanics of the system, and strategies for optimized design. The use of this type of "epidermal" electronics system in a realistic clinical setting with human subjects establishes a set of practical procedures in disinfection, reuse, and protocols for quantitative measurement. The results have the potential to address important unmet needs in chronic wound management. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Quantitative and qualitative assessment of the bovine abortion surveillance system in France.

    PubMed

    Bronner, Anne; Gay, Emilie; Fortané, Nicolas; Palussière, Mathilde; Hendrikx, Pascal; Hénaux, Viviane; Calavas, Didier

    2015-06-01

    Bovine abortion is the main clinical sign of bovine brucellosis, a disease of which France has been declared officially free since 2005. To ensure the early detection of any brucellosis outbreak, event-driven surveillance relies on the mandatory notification of bovine abortions and the brucellosis testing of aborting cows. However, the under-reporting of abortions appears frequent. Our objectives were to assess the aptitude of the bovine abortion surveillance system to detect each and every bovine abortion and to identify factors influencing the system's effectiveness. We evaluated five attributes defined by the U.S. Centers for Disease Control with a method suited to each attribute: (1) data quality was studied quantitatively and qualitatively, as this factor considerably influences data analysis and results; (2) sensitivity and representativeness were estimated using a unilist capture-recapture approach to quantify the surveillance system's effectiveness; (3) acceptability and simplicity were studied through qualitative interviews of actors in the field, given that the surveillance system relies heavily on abortion notifications by farmers and veterinarians. Our analysis showed that (1) data quality was generally satisfactory even though some errors might be due to actors' lack of awareness of the need to collect accurate data; (2) from 2006 to 2011, the mean annual sensitivity - i.e. the proportion of farmers who reported at least one abortion out of all those who detected such events - was around 34%, but was significantly higher in dairy than beef cattle herds (highlighting a lack of representativeness); (3) overall, the system's low sensitivity was related to its low acceptability and lack of simplicity. This study showed that, in contrast to policy-makers, most farmers and veterinarians perceived the risk of a brucellosis outbreak as negligible. 
They did not consider sporadic abortions as a suspected case of brucellosis and usually reported abortions only to

  12. Evaluation of coronary stenosis with the aid of quantitative image analysis in histological cross sections.

    PubMed

    Dulohery, Kate; Papavdi, Asteria; Michalodimitrakis, Manolis; Kranioti, Elena F

    2012-11-01

    Coronary artery atherosclerosis is a hugely prevalent condition in the Western world and is often encountered during autopsy. Atherosclerotic plaques can cause luminal stenosis which, if above a significant level (75%), is considered to contribute to the cause of death. Stenosis can be estimated macroscopically by the forensic pathologist at the time of autopsy or by microscopic examination. This study compares macroscopic estimation with quantitative microscopic image analysis, with a particular focus on the assessment of significant stenosis (>75%). A total of 131 individuals were analysed. The sample consists of an atherosclerotic group (n=122) and a control group (n=9). The results of the two methods were significantly different from each other (p=0.001), and the macroscopic method gave a greater percentage stenosis by an average of 3.5%. Histological examination of coronary artery stenosis also yielded a different result for significant stenosis in 11.5% of cases. The differences were attributed to underestimation by histological quantitative image analysis, overestimation by gross examination, or a combination of both. The underestimation may stem from tissue shrinkage during processing of the histological specimens. The overestimation in the macroscopic assessment can be attributed to the lumen shape, to observer error, or to a possible bias toward diagnosing coronary disease when no other cause of death is apparent. The results indicate that the macroscopic estimation is open to more biases and that histological quantitative image analysis gives a precise assessment only of ex vivo stenosis. Once tissue shrinkage, if any, is accounted for, histological quantitative image analysis will yield a more accurate assessment of in vivo stenosis. It may then be considered a complementary tool for the examination of coronary stenosis. Copyright © 2012 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
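Percent luminal stenosis reduces to a ratio of cross-sectional areas; the sketch below adds an optional shrinkage correction of the kind the abstract discusses. The function name and the shrinkage factor are hypothetical, not values from the study.

```python
def percent_stenosis(lumen_area, reference_lumen_area, shrinkage_factor=1.0):
    """Percent stenosis = 100 * (1 - patent lumen / reference lumen).
    shrinkage_factor (> 1) optionally scales the measured lumen area to
    compensate for histological tissue shrinkage (assumed value)."""
    corrected = lumen_area * shrinkage_factor
    return 100.0 * (1.0 - corrected / reference_lumen_area)
```

A result above 75 would count as significant stenosis in the study's terms; note how a plausible shrinkage correction can move a case across that threshold.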

  13. Magnetic resonance imaging assessment of the rotator cuff: is it really accurate?

    PubMed

    Wnorowski, D C; Levinsohn, E M; Chamberlain, B C; McAndrew, D L

    1997-12-01

    Magnetic resonance imaging (MRI) is used increasingly for evaluating the rotator cuff. This study of 39 shoulders (38 patients) compared the accuracy of MRI interpretation of rotator cuff integrity by a group of community hospital radiologists (clinical community scenario, CCS) with that of a musculoskeletal radiologist (experienced specialist scenario, ESS), relative to arthroscopy. For the CCS subgroup, the sensitivity, specificity, positive predictive value (PV), negative PV, and accuracy for partial tears were 0%, 68%, 0%, 82%, and 59%, respectively; for complete tears, 56%, 73%, 36%, 86%, and 69%; and for all tears combined, 85%, 52%, 50%, 87%, and 64%. For the ESS subgroup, the values for partial tears were 20%, 88%, 20%, 88%, and 79%; for complete tears, 78%, 83%, 58%, 92%, and 82%; and for all tears, 71%, 71%, 59%, 81%, and 71%. We concluded that MRI assessment of the rotator cuff was not accurate relative to arthroscopy. MRI was most helpful if the result was negative, and MRI diagnosis of partial tear was of little value. Considering the high cost of shoulder MRI, this study has significant implications for the evaluation of patients with possible rotator cuff pathology.
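All of the accuracy statistics quoted above derive from a 2×2 confusion table against the arthroscopic reference; a minimal generic sketch:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, and accuracy from a 2x2
    confusion table (tp/fp/fn/tn counts; reference standard here
    would be arthroscopy). Guards avoid division by zero."""
    return {
        "sensitivity": tp / (tp + fn) if tp + fn else 0.0,
        "specificity": tn / (tn + fp) if tn + fp else 0.0,
        "ppv": tp / (tp + fp) if tp + fp else 0.0,
        "npv": tn / (tn + fn) if tn + fn else 0.0,
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }
```

The zero-sensitivity, zero-PPV figures for CCS partial tears correspond to a table with tp = 0: no partial tear was correctly called.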

  14. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment

    PubMed Central

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A.; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

    We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species, using a UPLC-UV-MS/MS method. Ten of 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4′-MDI = 0.52 to 140.1 pg/mg) and (2,4′-MDI = 0.01 to 4.48 pg/mg). The 4,4′-MDI species had the highest measured concentration (280 pg/mg). Commonly used medical devices/products contain low, but measurable concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation. PMID:27773989

  15. Quantitation of hepatitis B virus DNA in plasma using a sensitive cost-effective "in-house" real-time PCR assay.

    PubMed

    Daniel, Hubert Darius J; Fletcher, John G; Chandy, George M; Abraham, Priya

    2009-01-01

    Sensitive nucleic acid testing for the detection and accurate quantitation of hepatitis B virus (HBV) is necessary to reduce transmission through blood and blood products and for monitoring patients on antiviral therapy. The aim of this study is to standardize an "in-house" real-time HBV polymerase chain reaction (PCR) for accurate quantitation and screening of HBV. The "in-house" real-time assay was compared with a commercial assay using 30 chronically infected individuals and 70 blood donors who are negative for hepatitis B surface antigen, hepatitis C virus (HCV) antibody and human immunodeficiency virus (HIV) antibody. Further, 30 HBV-genotyped samples were tested to evaluate the "in-house" assay's capacity to detect genotypes prevalent among individuals attending this tertiary care hospital. The lower limit of detection of this "in-house" HBV real-time PCR was assessed against the WHO international standard and found to be 50 IU/mL. The interassay and intra-assay coefficient of variation (CV) of this "in-house" assay ranged from 1.4% to 9.4% and 0.0% to 2.3%, respectively. Virus loads as estimated with this "in-house" HBV real-time assay correlated well with the commercial artus HBV RG PCR assay (r = 0.95, P < 0.0001). This assay can be used for the detection and accurate quantitation of HBV viral loads in plasma samples. This assay can be employed for the screening of blood donations and can potentially be adapted to a multiplex format for simultaneous detection of HBV, HIV and HCV to reduce the cost of testing in blood banks.
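The inter- and intra-assay reproducibility figures above are percent coefficients of variation across replicate measurements; a generic sketch (not the authors' code) of that calculation:

```python
import statistics

def percent_cv(replicates):
    """Percent coefficient of variation: 100 * SD / mean of replicate
    measurements (e.g., repeated viral-load determinations of the
    same sample). Uses the sample standard deviation."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)
```

A CV below a few percent, as reported for the intra-assay case, means repeat runs of the same specimen agree closely relative to their mean.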

  16. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    ERIC Educational Resources Information Center

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has exponentially increased. This quantitative research project used a pretest/posttest design and reviewed how an educational electronic documentation system helped nursing students to identify the accurate "related to" statement of the nursing diagnosis for the patient in the case…

  17. Monitoring and quantitative assessment of tumor burden using in vivo bioluminescence imaging

    NASA Astrophysics Data System (ADS)

    Chen, Chia-Chi; Hwang, Jeng-Jong; Ting, Gann; Tseng, Yun-Long; Wang, Shyh-Jen; Whang-Peng, Jaqueline

    2007-02-01

    In vivo bioluminescence imaging (BLI) is a sensitive imaging modality that is rapid and accessible, and may comprise an ideal tool for evaluating tumor growth. In this study, the kinetics of tumor growth were assessed in a C26 colon carcinoma-bearing BALB/c mouse model. The ability of BLI to noninvasively quantitate the growth of subcutaneous tumors was evaluated using C26 cells genetically engineered to stably express firefly luciferase and herpes simplex virus type-1 thymidine kinase (C26/tk-luc). A good correlation (R²=0.998) of photon emission to cell number was found in vitro. Tumor burden and tumor volume were monitored in vivo over time by quantitation of photon emission using a Xenogen IVIS 50 and standard external caliper measurement, respectively. At various time intervals, tumor-bearing mice were imaged to determine the correlation of in vivo BLI to tumor volume. A correlation of BLI to tumor volume was observed when tumor volume was smaller than 1000 mm³ (R²=0.907). Gamma scintigraphy combined with [¹³¹I]FIAU was another imaging modality used to verify the previous results. In conclusion, this study showed that bioluminescence imaging is a powerful and quantitative tool for directly monitoring tumor growth in vivo. The dual-reporter-gene-transfected tumor-bearing animal model can be applied in evaluating the efficacy of newly developed anti-cancer drugs.
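The R² values relating photon emission to cell number or tumor volume come from simple least-squares line fits; an illustrative, self-contained computation (not the study's analysis script):

```python
def r_squared(x, y):
    """Coefficient of determination for the least-squares line
    y = a + b*x (e.g., x = cell number or tumor volume, y = photon
    flux). Returns 1 - SS_res / SS_tot."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx                      # slope
    a = my - b * mx                    # intercept
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot
```

An R² of 0.998, as in the in vitro calibration, means the fitted line leaves essentially no unexplained variance in photon counts.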

  18. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Microbial ...

  19. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field-of-view (FoV) scanning, often relying on mechanical translation, which not only slows down the measurement but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. To achieve highly accurate quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE for large-FoV scanning controlled by on/off state coding of the DMD. Measurements were implemented using biological samples as well as a USAF resolution target, proving the high resolution of quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed that the DMD-based PIE technique provides a potential solution for medical observation and measurement.

  20. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to identify the quantitative standards for assessing upset recovery performance. This review covers current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, recovery time, and whether that input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle; control wheel/sidestick movement resulting in pitch and roll; and inputs to the throttle and rudder. In addition, airplane state measures such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum airspeed, maximum bank angle, and maximum g loading are reviewed as well.

  1. Highly accurate quantitative spectroscopy of massive stars in the Galaxy

    NASA Astrophysics Data System (ADS)

    Nieva, María-Fernanda; Przybilla, Norbert

    2017-11-01

    Achieving high accuracy and precision in stellar parameter and chemical composition determinations is challenging in massive star spectroscopy. On one hand, the target selection for an unbiased sample build-up is complicated by several types of peculiarities that can occur in individual objects. On the other hand, composite spectra are often not recognized as such even at medium-high spectral resolution and typical signal-to-noise ratios, even though multiplicity among massive stars is widespread. In particular, surveys that produce large amounts of automatically reduced data are prone to overlooking details that become hazardous for analysis with techniques developed under a set of standard assumptions applicable to the spectrum of a single star. Much larger systematic errors than anticipated may therefore result when the true nature of the investigated objects goes unrecognized, or much smaller sample sizes than initially planned when it is recognized. Further factors to be taken care of are the multiple steps from the choice of instrument, through the details of the data reduction chain, to the choice of modelling code, input data, analysis technique, and the selection of the spectral lines to be analyzed. Only by avoiding all these possible pitfalls can a precise and accurate characterization of the stars in terms of fundamental parameters and chemical fingerprints be achieved, forming the basis for further investigations regarding e.g. stellar structure and evolution or the chemical evolution of the Galaxy. The scope of the present work is to provide the massive-star and other astrophysical communities with criteria to evaluate the quality of spectroscopic investigations of massive stars before interpreting them in a broader context. The discussion is guided by our experience gained over more than a decade of studies in massive star spectroscopy, ranging from the simplest single objects to multiple systems.

  2. Quantitative prediction of phase transformations in silicon during nanoindentation

    NASA Astrophysics Data System (ADS)

    Zhang, Liangchi; Basak, Animesh

    2013-08-01

    This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge in a nanoindentation are formulated as a function of pop-out size and the depth of the nanoindentation impression. This simple formula enables fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which has been impossible before.

  3. Quantitative Microbial Risk Assessment and Infectious Disease Transmission Modeling of Waterborne Enteric Pathogens.

    PubMed

    Brouwer, Andrew F; Masters, Nina B; Eisenberg, Joseph N S

    2018-04-20

    Waterborne enteric pathogens remain a global health threat. Increasingly, quantitative microbial risk assessment (QMRA) and infectious disease transmission modeling (IDTM) are used to assess waterborne pathogen risks and evaluate mitigation. These modeling efforts, however, have largely been conducted independently for different purposes and in different settings. In this review, we examine the settings where each modeling strategy is employed. QMRA research has focused on food contamination and recreational water in high-income countries (HICs) and drinking water and wastewater in low- and middle-income countries (LMICs). IDTM research has focused on large outbreaks (predominately LMICs) and vaccine-preventable diseases (LMICs and HICs). Human ecology determines the niches that pathogens exploit, leading researchers to focus on different risk assessment research strategies in different settings. To enhance risk modeling, QMRA and IDTM approaches should be integrated to include dynamics of pathogens in the environment and pathogen transmission through populations.
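A standard QMRA building block of the kind this review surveys is the dose-response model; the single-parameter exponential form is the classic example. The sketch below is generic, and the infectivity parameter value used in the usage note is illustrative, not pathogen data from the review.

```python
import math

def exponential_dose_response(mean_dose, r):
    """Exponential QMRA dose-response model: probability of infection
    after ingesting a mean dose of organisms, where r is the
    per-organism probability of initiating infection (0 < r <= 1)."""
    return 1.0 - math.exp(-r * mean_dose)
```

For example, with a hypothetical r of 0.5, a mean dose of 10 organisms gives an infection probability above 0.99, while a dose of zero gives exactly zero.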

  4. Rapid and Accurate Evaluation of the Quality of Commercial Organic Fertilizers Using Near Infrared Spectroscopy

    PubMed Central

    Wang, Chang; Huang, Chichao; Qian, Jian; Xiao, Jian; Li, Huan; Wen, Yongli; He, Xinhua; Ran, Wei; Shen, Qirong; Yu, Guanghui

    2014-01-01

    The composting industry has been growing rapidly in China because of a boom in the animal industry. Therefore, a rapid and accurate assessment of the quality of commercial organic fertilizers is of the utmost importance. In this study, a novel technique that combines near infrared (NIR) spectroscopy with partial least squares (PLS) analysis is developed for rapidly and accurately assessing the quality of commercial organic fertilizers. A total of 104 commercial organic fertilizers were collected from full-scale compost factories in Jiangsu Province, east China. In general, the NIR-PLS technique showed accurate predictions of the total organic matter, water soluble organic nitrogen, pH, and germination index; less accurate results for the moisture, total nitrogen, and electrical conductivity; and the least accurate results for water soluble organic carbon. Our results suggested the combined NIR-PLS technique could be applied as a valuable tool to rapidly and accurately assess the quality of commercial organic fertilizers. PMID:24586313

  5. Rapid and accurate evaluation of the quality of commercial organic fertilizers using near infrared spectroscopy.

    PubMed

    Wang, Chang; Huang, Chichao; Qian, Jian; Xiao, Jian; Li, Huan; Wen, Yongli; He, Xinhua; Ran, Wei; Shen, Qirong; Yu, Guanghui

    2014-01-01

    The composting industry has been growing rapidly in China because of a boom in the animal industry. Therefore, a rapid and accurate assessment of the quality of commercial organic fertilizers is of the utmost importance. In this study, a novel technique that combines near infrared (NIR) spectroscopy with partial least squares (PLS) analysis is developed for rapidly and accurately assessing the quality of commercial organic fertilizers. A total of 104 commercial organic fertilizers were collected from full-scale compost factories in Jiangsu Province, east China. In general, the NIR-PLS technique showed accurate predictions of the total organic matter, water soluble organic nitrogen, pH, and germination index; less accurate results for the moisture, total nitrogen, and electrical conductivity; and the least accurate results for water soluble organic carbon. Our results suggested the combined NIR-PLS technique could be applied as a valuable tool to rapidly and accurately assess the quality of commercial organic fertilizers.

  6. Arctic Stratospheric Temperature In The Winters 1999/2000 and 2000/2001: A Quantitative Assessment and Microphysical Implications

    NASA Astrophysics Data System (ADS)

    Buss, S.; Wernli, H.; Peter, T.; Kivi, R.; Bui, T. P.; Kleinböhl, A.; Schiller, C.

    Stratospheric winter temperatures play a key role in the chain of microphysical and chemical processes that lead to the formation of polar stratospheric clouds (PSCs), chlorine activation and eventually to stratospheric ozone depletion. Here the temperature conditions during the Arctic winters 1999/2000 and 2000/2001 are quantitatively investigated using observed profiles of water vapour and nitric acid, and temperatures from high-resolution radiosondes and aircraft observations, global ECMWF and UKMO analyses, and mesoscale model simulations over Scandinavia and Greenland. The ECMWF model resolves parts of the gravity wave activity and generally agrees well with the observations. However, for the very cold temperatures near the ice frost point, the ECMWF analyses have a warm bias of 1-6 K compared to radiosondes. For the mesoscale model HRM, this bias is generally reduced due to a more accurate representation of gravity waves. Quantitative estimates of the impact of the mesoscale temperature perturbations indicate that over Scandinavia and Greenland the wave-induced stratospheric cooling (as simulated by the HRM) only moderately affects the estimated chlorine activation and homogeneous NAT particle formation, but strongly enhances the potential for ice formation.

  7. Zebrafish Caudal Fin Angiogenesis Assay—Advanced Quantitative Assessment Including 3-Way Correlative Microscopy

    PubMed Central

    Correa Shokiche, Carlos; Schaad, Laura; Triet, Ramona; Jazwinska, Anna; Tschanz, Stefan A.; Djonov, Valentin

    2016-01-01

    Background Researchers evaluating angiomodulating compounds as a part of scientific projects or pre-clinical studies are often confronted with limitations of applied animal models. The rough and insufficient early-stage compound assessment without reliable quantification of the vascular response counts, at least partially, to the low transition rate to clinics. Objective To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data of drug effects in a non-biased and time-efficient way. Approach & Results Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters including “graph energy” and “distance to farthest node”. The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The employment of a reference point (vascular parameters prior amputation) is unique for the model and crucial for a proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo-imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. Conclusions The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well-suitable for basic research and preclinical investigations. PMID:26950851
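Skelios is the authors' custom software, so its internals are not available; the sketch below only illustrates what a "distance to farthest node" metric plausibly means on a skeletonized vessel network: the longest shortest-path distance from a reference node in a weighted graph of branch points. The toy network and segment lengths are invented for illustration.

```python
# Sketch: "distance to farthest node" on a skeletonized vessel graph, computed
# as the maximum Dijkstra shortest-path length from a reference (root) node.
# The graph below is a made-up example, not data from the zebrafish study.
import heapq

# edges: (branch point, branch point, segment length in micrometres)
edges = [
    ("root", "a", 40.0), ("a", "b", 25.0), ("a", "c", 30.0),
    ("b", "d", 15.0), ("c", "e", 20.0), ("e", "f", 10.0),
]
adj = {}
for u, v, w in edges:
    adj.setdefault(u, []).append((v, w))
    adj.setdefault(v, []).append((u, w))

def dijkstra(source):
    """Shortest-path length from source to every reachable node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

dist = dijkstra("root")
farthest = max(dist, key=dist.get)
print(farthest, dist[farthest])  # prints: f 100.0
```

A growing, well-connected regenerating network would show this distance increasing over time, which is consistent with the abstract's use of the metric as a maturation indicator.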

  8. Detection and quantitation of trace phenolphthalein (in pharmaceutical preparations and in forensic exhibits) by liquid chromatography-tandem mass spectrometry, a sensitive and accurate method.

    PubMed

    Sharma, Kakali; Sharma, Shiba P; Lahiri, Sujit C

    2013-01-01

    Phenolphthalein, an acid-base indicator and laxative, is important as a constituent of widely used weight-reducing multicomponent food formulations. Phenolphthalein is a useful reagent in forensic science for the identification of blood stains of suspected victims and for apprehending erring officials accepting bribes in graft or trap cases. The pink-colored alkaline hand washes originating from phenolphthalein-smeared notes can easily be determined spectrophotometrically. But in many cases the colored solution turns colorless with time, which renders the genuineness of bribe cases doubtful to the judiciary. Until now, no method has been known for the detection and identification of phenolphthalein in colorless forensic exhibits with positive proof. Liquid chromatography-tandem mass spectrometry was found to be a highly sensitive and accurate method capable of detecting and quantifying trace phenolphthalein in commercial formulations and colorless forensic exhibits with positive proof. The detection limit of phenolphthalein was found to be 1.66 pg/μL (i.e., ng/mL), and the calibration curve shows good linearity (r(2) = 0.9974). © 2012 American Academy of Forensic Sciences.
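The linearity figure the abstract quotes (r² = 0.9974) comes from an ordinary least-squares fit of detector response against standard concentrations. A minimal sketch of that calibration check, with invented standard values (not the paper's data):

```python
# Sketch of an LC-MS/MS calibration-linearity check: fit a straight line to
# (concentration, peak area) standards and report r^2. Values are illustrative.
import numpy as np

conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])              # standard concentrations
area = np.array([118.0, 239.0, 601.0, 1187.0, 2410.0, 5980.0])  # detector response

slope, intercept = np.polyfit(conc, area, 1)      # least-squares line
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)
print(f"r^2 = {r2:.4f}")
```

An unknown's concentration is then read back as `(measured_area - intercept) / slope`, valid only inside the calibrated range.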

  9. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength-dependent variables; even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexed imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
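The core of any such spectral calculation is an overlap integral: the detected signal is proportional to the wavelength-wise product of the relevant spectra, integrated over wavelength. The sketch below is an assumed simplification of that idea (emission × filter transmission × detector efficiency); all spectra are made-up Gaussians, not Semrock data, and SearchLight's actual model [1] includes more terms.

```python
# Sketch of a spectral overlap integral: collected signal ~ integral over
# wavelength of (fluorophore emission) x (filter transmission) x (detector QE).
# All curves below are invented for illustration.
import numpy as np

wl = np.linspace(400.0, 700.0, 601)                  # wavelength grid, 0.5 nm steps
gauss = lambda mu, sigma: np.exp(-0.5 * ((wl - mu) / sigma) ** 2)

emission = 0.9 * gauss(520.0, 20.0)                  # fluorophore emission spectrum
em_filter = np.where(np.abs(wl - 525.0) < 25.0, 0.95, 1e-6)  # 500-550 nm bandpass
detector_qe = 0.7                                    # flat quantum efficiency

dx = wl[1] - wl[0]
signal = detector_qe * np.sum(emission * em_filter) * dx  # overlap integral (relative units)
print(f"relative collected signal = {signal:.1f}")
```

Bleed-through analysis is the same integral computed against the *wrong* channel's filter; the ratio of the two integrals quantifies channel cross-talk.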

  10. Radiomics biomarkers for accurate tumor progression prediction of oropharyngeal cancer

    NASA Astrophysics Data System (ADS)

    Hadjiiski, Lubomir; Chan, Heang-Ping; Cha, Kenny H.; Srinivasan, Ashok; Wei, Jun; Zhou, Chuan; Prince, Mark; Papagerakis, Silvana

    2017-03-01

    Accurate tumor progression prediction for oropharyngeal cancers is crucial for identifying patients who would best be treated with optimized treatment and therefore minimize the risk of under- or over-treatment. An objective decision support system that can merge the available radiomics, histopathologic and molecular biomarkers in a predictive model based on statistical outcomes of previous cases and machine learning may assist clinicians in making more accurate assessment of oropharyngeal tumor progression. In this study, we evaluated the feasibility of developing individual and combined predictive models based on quantitative image analysis from radiomics, histopathology and molecular biomarkers for oropharyngeal tumor progression prediction. With IRB approval, 31, 84, and 127 patients with head and neck CT (CT-HN), tumor tissue microarrays (TMAs) and molecular biomarker expressions, respectively, were collected. For 8 of the patients all 3 types of biomarkers were available and they were sequestered in a test set. The CT-HN lesions were automatically segmented using our level sets based method. Morphological, texture and molecular based features were extracted from CT-HN and TMA images, and selected features were merged by a neural network. The classification accuracy was quantified using the area under the ROC curve (AUC). Test AUCs of 0.87, 0.74, and 0.71 were obtained with the individual predictive models based on radiomics, histopathologic, and molecular features, respectively. Combining the radiomics and molecular models increased the test AUC to 0.90. Combining all 3 models increased the test AUC further to 0.94. This preliminary study demonstrates that the individual domains of biomarkers are useful and the integrated multi-domain approach is most promising for tumor progression prediction.
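The abstract's evaluation step (per-domain AUCs, then higher AUCs for merged models) can be sketched with `roc_auc_score` on simulated scores. The study merged features with a neural network; the simple additive score fusion below is a stand-in for illustration only, and all numbers are simulated, not the study's patients.

```python
# Sketch: quantify each predictive model by ROC AUC, then check whether fusing
# scores from two domains improves it. Scores are simulated stand-ins.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=200)                   # 1 = tumor progressed (simulated)
radiomics = y + rng.normal(scale=0.8, size=200)    # noisy single-domain scores
molecular = y + rng.normal(scale=1.0, size=200)

auc_rad = roc_auc_score(y, radiomics)
auc_mol = roc_auc_score(y, molecular)
auc_fused = roc_auc_score(y, radiomics + molecular)  # naive additive fusion
print(f"radiomics {auc_rad:.2f}, molecular {auc_mol:.2f}, fused {auc_fused:.2f}")
```

When the domains carry partly independent information, fused scores tend to raise the AUC, which is the pattern the study reports (0.87 and 0.71 individually, 0.90 combined).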

  11. Validity of using a 3-dimensional PET scanner during inhalation of 15O-labeled oxygen for quantitative assessment of regional metabolic rate of oxygen in man

    NASA Astrophysics Data System (ADS)

    Hori, Yuki; Hirano, Yoshiyuki; Koshino, Kazuhiro; Moriguchi, Tetsuaki; Iguchi, Satoshi; Yamamoto, Akihide; Enmi, Junichiro; Kawashima, Hidekazu; Zeniya, Tsutomu; Morita, Naomi; Nakagawara, Jyoji; Casey, Michael E.; Iida, Hidehiro

    2014-09-01

    Use of 15O labeled oxygen (15O2) and positron emission tomography (PET) allows quantitative assessment of the regional metabolic rate of oxygen (CMRO2) in vivo, which is essential to understanding the pathological status of patients with cerebral vascular and neurological disorders. The method has, however, been challenging, when a 3D PET scanner is employed, largely attributed to the presence of gaseous radioactivity in the trachea and the inhalation system, which results in a large amount of scatter and random events in the PET assessment. The present study was intended to evaluate the adequacy of using a recently available commercial 3D PET scanner in the assessment of regional cerebral radioactivity distribution during an inhalation of 15O2. Systematic experiments were carried out on a brain phantom. Experiments were also performed on a healthy volunteer following a recently developed protocol for simultaneous assessment of CMRO2 and cerebral blood flow, which involves sequential administration of 15O2 and C15O2. A particular intention was to evaluate the adequacy of the scatter-correction procedures. The phantom experiment demonstrated that errors were within 3% at the practically maximum radioactivity in the face mask, with the greatest radioactivity in the lung. The volunteer experiment demonstrated that the counting rate was at peak during the 15O gas inhalation period, within a verified range. Tomographic images represented good quality over the entire FOV, including the lower part of the cerebral structures and the carotid artery regions. The scatter-correction procedures appeared to be important, particularly in the process to compensate for the scatter originating outside the FOV. Reconstructed images dramatically changed if the correction was carried out using inappropriate procedures. This study demonstrated that accurate reconstruction could be obtained when the scatter compensation was appropriately carried out. This study also suggested the

  12. Agreement between quantitative microbial risk assessment and epidemiology at low doses during waterborne outbreaks of protozoan disease

    USDA-ARS?s Scientific Manuscript database

    Quantitative microbial risk assessment (QMRA) is a valuable complement to epidemiology for understanding the health impacts of waterborne pathogens. The approach works by extrapolating available data in two ways. First, dose-response data are typically extrapolated from feeding studies, which use ...

  13. Stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA) for quantitative nanoscale assessment of spatial protein organization.

    PubMed

    Veeraraghavan, Rengasayee; Gourdie, Robert G

    2016-11-07

    The spatial association between proteins is crucial to understanding how they function in biological systems. Colocalization analysis of fluorescence microscopy images is widely used to assess this. However, colocalization analysis performed on two-dimensional images with diffraction-limited resolution merely indicates that the proteins are within 200-300 nm of each other in the xy-plane and within 500-700 nm of each other along the z-axis. Here we demonstrate a novel three-dimensional quantitative analysis applicable to single-molecule positional data: stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA). This method offers significant advantages: 1) STORM imaging affords 20-nm resolution in the xy-plane and <50 nm along the z-axis; 2) STORM-RLA provides a quantitative assessment of the frequency and degree of overlap between clusters of colabeled proteins; and 3) STORM-RLA also calculates the precise distances between both overlapping and nonoverlapping clusters in three dimensions. Thus STORM-RLA represents a significant advance in the high-throughput quantitative assessment of the spatial organization of proteins. © 2016 Veeraraghavan and Gourdie. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
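One computation STORM-RLA performs, per the abstract, is measuring precise 3-D distances between clusters of the two labelled proteins. A plausible sketch with a k-d tree nearest-neighbour query follows; the cluster centroids and the 50 nm "overlap" radius are invented for illustration, not the published analysis parameters.

```python
# Sketch: for each cluster centroid of protein A, find the 3-D distance to the
# nearest cluster of protein B, and the fraction within an assumed 50 nm radius.
# Coordinates are synthetic, in nanometres.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
clusters_a = rng.uniform(0, 1000, size=(50, 3))   # protein A cluster centroids (nm)
clusters_b = rng.uniform(0, 1000, size=(60, 3))   # protein B cluster centroids (nm)

tree = cKDTree(clusters_b)
dist, idx = tree.query(clusters_a, k=1)           # nearest-neighbour distance per A cluster
overlap_frac = np.mean(dist < 50.0)               # assumed association radius
print(f"median NN distance = {np.median(dist):.1f} nm, fraction < 50 nm = {overlap_frac:.2f}")
```

Because STORM localizes to ~20 nm laterally and <50 nm axially, distances at this scale are meaningful, unlike diffraction-limited colocalization.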

  14. Assessment of tennis elbow using the Marcy Wedge-Pro.

    PubMed Central

    Smith, R W; Mani, R; Cawley, M I; Englisch, W; Eckenberger, P

    1993-01-01

    The Marcy Wedge-Pro (MWP), a device used in training by tennis players, was employed in the assessment of tennis elbow. The MWP was used to measure the ability of patients to perform wrist extension exercises, since pain resulting from this specific activity is a prominent symptom of the condition. The MWP results were compared with clinical measures and found to accurately identify patients who responded to treatment (P < 0.05). This study illustrates the potential of the MWP to assess tennis elbow quantitatively. Images Figure 1 PMID:8130959

  15. Noninvasive Assessment of Biochemical and Mechanical Properties of Lumbar Discs Through Quantitative Magnetic Resonance Imaging in Asymptomatic Volunteers.

    PubMed

    Foltz, Mary H; Kage, Craig C; Johnson, Casey P; Ellingson, Arin M

    2017-11-01

    Intervertebral disc degeneration is a prevalent phenomenon associated with back pain. It is of critical clinical interest to discriminate disc health and identify early stages of degeneration. Traditional clinical T2-weighted magnetic resonance imaging (MRI), assessed using the Pfirrmann classification system, is subjective and fails to adequately capture initial degenerative changes. Emerging quantitative MRI techniques offer a solution. Specifically, T2* mapping images water mobility in the macromolecular network, and our preliminary ex vivo work shows high predictability of the disc's glycosaminoglycan content (s-GAG) and residual mechanics. The present study expands upon this work to predict the biochemical and biomechanical properties in vivo and assess their relationship with both age and Pfirrmann grade. Eleven asymptomatic subjects (range: 18-62 yrs) were enrolled and imaged using a 3T MRI scanner. T2-weighted images (Pfirrmann grade) and quantitative T2* maps (predict s-GAG and residual stress) were acquired. Surface maps based on the distribution of these properties were generated and integrated to quantify the surface volume. Correlational analyses were conducted to establish the relationship between each metric of disc health derived from the quantitative T2* maps with both age and Pfirrmann grade, where an inverse trend was observed. Furthermore, the nucleus pulposus (NP) signal in conjunction with volumetric surface maps provided the ability to discern differences during initial stages of disc degeneration. This study highlights the ability of T2* mapping to noninvasively assess the s-GAG content, residual stress, and distributions throughout the entire disc, which may provide a powerful diagnostic tool for disc health assessment.
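T2* mapping rests on a standard signal model: multi-echo gradient-echo signal decays as S(TE) = S0·exp(−TE/T2*), so a log-linear least-squares fit per voxel recovers T2*. The echo times and noise-free signal below are invented to demonstrate the fit, not parameters from this study.

```python
# Sketch of per-voxel T2* estimation: fit log(signal) vs echo time; the slope
# is -1/T2*. Echo times and the true T2* are illustrative values.
import numpy as np

te = np.array([4.0, 9.0, 14.0, 19.0, 24.0])       # echo times (ms)
t2star_true, s0_true = 30.0, 1000.0
signal = s0_true * np.exp(-te / t2star_true)      # idealised noise-free decay

slope, intercept = np.polyfit(te, np.log(signal), 1)  # log-linear fit
t2star = -1.0 / slope
print(f"fitted T2* = {t2star:.1f} ms")
```

In degenerated discs the glycosaminoglycan-bound water pool shrinks, shortening the fitted T2* values, which is the basis for the inverse trend with Pfirrmann grade the study reports.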

  16. Quantitative assessment of ischemia and reactive hyperemia of the dermal layers using multi-spectral imaging on the human arm

    NASA Astrophysics Data System (ADS)

    Kainerstorfer, Jana M.; Amyot, Franck; Demos, Stavros G.; Hassan, Moinuddin; Chernomordik, Victor; Hitzenberger, Christoph K.; Gandjbakhche, Amir H.; Riley, Jason D.

    2009-07-01

    Quantitative assessment of skin chromophores in a non-invasive fashion is often desirable. Pixel-wise assessment of blood volume and blood oxygenation is especially beneficial for improved diagnostics. We utilized a multi-spectral imaging system for acquiring diffuse reflectance images of healthy volunteers' lower forearms. Ischemia and reactive hyperemia were introduced by occluding the upper arm with a pressure cuff for 5 min at 180 mmHg. Multi-spectral images were taken every 30 s before, during and after occlusion. Image reconstruction for blood volume and blood oxygenation was performed using a two-layered skin model. As the images were taken in a non-contact way, strong artifacts related to the shape (curvature) of the arms were observed, making reconstruction of optical/physiological parameters highly inaccurate. We developed a curvature correction method, which is based on extracting the curvature directly from the acquired intensity images and does not require any additional measurements on the imaged object. The effectiveness of the algorithm was demonstrated on reconstruction results of blood volume and blood oxygenation for in vivo data during occlusion of the arm. Pixel-wise assessment of blood volume and blood oxygenation was made possible over the entire image area, and comparison of occlusion effects between veins and surrounding skin was performed. Induced ischemia during occlusion and reactive hyperemia afterwards were observed and quantitatively assessed. Furthermore, the influence of epidermal thickness on reconstruction results was evaluated, and the need for exact knowledge of this parameter for fully quantitative assessment was pointed out.
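The per-pixel reconstruction of blood volume and oxygenation from multi-wavelength reflectance typically reduces to a linear unmixing of oxy- and deoxy-haemoglobin absorption. This is an assumed simplification of the study's two-layer model: the "extinction" matrix and concentrations below are illustrative numbers, not tabulated haemoglobin constants.

```python
# Sketch of spectral unmixing at one pixel: attenuation at several wavelengths
# modelled as a linear mix of HbO2 and Hb absorption, solved by least squares.
# Extinction values and concentrations are illustrative only.
import numpy as np

# columns: [HbO2, Hb] extinction at 4 wavelengths (arbitrary units)
E = np.array([[0.8, 1.6],
              [1.0, 1.0],
              [1.4, 0.7],
              [1.9, 0.4]])
c_true = np.array([0.06, 0.02])          # "true" oxy / deoxy concentrations (a.u.)
attenuation = E @ c_true                 # idealised noise-free measurement

c, *_ = np.linalg.lstsq(E, attenuation, rcond=None)
blood_volume = c.sum()                   # total haemoglobin (proxy for blood volume)
so2 = c[0] / blood_volume                # oxygen saturation = HbO2 / total Hb
print(f"SO2 = {so2:.2f}")
```

During cuff occlusion the fitted SO2 falls and total haemoglobin rises distally; reactive hyperemia shows as an SO2 overshoot after release, matching the time course the study images.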

  17. Investigation of the feasibility of non-invasive optical sensors for the quantitative assessment of dehydration.

    PubMed

    Visser, Cobus; Kieser, Eduard; Dellimore, Kiran; van den Heever, Dawie; Smith, Johan

    2017-10-01

    This study explores the feasibility of prospectively assessing infant dehydration using four non-invasive, optical sensors based on the quantitative and objective measurement of various clinical markers of dehydration. The sensors were investigated to objectively and unobtrusively assess the hydration state of an infant based on the quantification of capillary refill time (CRT), skin recoil time (SRT), skin temperature profile (STP) and skin tissue hydration by means of infrared spectrometry (ISP). To evaluate the performance of the sensors a clinical study was conducted on a cohort of 10 infants (aged 6-36 months) with acute gastroenteritis. High sensitivity and specificity were exhibited by the sensors, in particular the STP and SRT sensors, when combined into a fusion regression model (sensitivity: 0.90, specificity: 0.78). The SRT and STP sensors and the fusion model all outperformed the commonly used "gold standard" clinical dehydration scales including the Gorelick scale (sensitivity: 0.56, specificity: 0.56), CDS scale (sensitivity: 1.0, specificity: 0.2) and WHO scale (sensitivity: 0.13, specificity: 0.79). These results suggest that objective and quantitative assessment of infant dehydration may be possible using the sensors investigated. However, further evaluation of the sensors on a larger sample population is needed before deploying them in a clinical setting. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
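The sensitivity and specificity figures quoted for each sensor and scale come from a standard 2×2 confusion table. A minimal sketch of that arithmetic, with invented counts chosen only to reproduce numbers of the same magnitude as the fused model's (0.90, 0.78):

```python
# Sketch: sensitivity and specificity from a 2x2 confusion table.
# Counts are invented for illustration, not the study's cohort of 10 infants.
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# e.g. 9 of 10 dehydrated cases flagged, 7 of 9 non-dehydrated cases cleared
sensitivity, specificity = sens_spec(tp=9, fn=1, tn=7, fp=2)
print(sensitivity, specificity)
```

Note the trade-off visible in the clinical scales the study compares: the CDS scale's perfect sensitivity (1.0) comes with very poor specificity (0.2), while the WHO scale shows the reverse pattern.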

  18. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  19. PBPK Models, BBDR Models, and Virtual Tissues: How Will They Contribute to the Use of Toxicity Pathways in Risk Assessment?

    EPA Science Inventory

    Accuracy in risk assessment, which is desirable in order to ensure protection of the public health while avoiding over-regulation of economically-important substances, requires quantitatively accurate, in vivo descriptions of dose-response and time-course behaviors. This level of...

  20. Quantitative meta-analytic approaches for the analysis of animal toxicology and epidemiologic data in human health risk assessments

    EPA Science Inventory

    Often, human health risk assessments have relied on qualitative approaches for hazard identification to integrate evidence across multiple studies to conclude whether particular hazards exist. However, quantitative approaches for evidence integration, including the application o...

  1. Cross-sectional evaluation of electrical impedance myography and quantitative ultrasound for the assessment of Duchenne muscular dystrophy in a clinical trial setting.

    PubMed

    Rutkove, Seward B; Geisbush, Tom R; Mijailovic, Aleksandar; Shklyar, Irina; Pasternak, Amy; Visyak, Nicole; Wu, Jim S; Zaidman, Craig; Darras, Basil T

    2014-07-01

    Electrical impedance myography and quantitative ultrasound are two noninvasive, painless, and effort-independent approaches for assessing neuromuscular disease. Both techniques have potential to serve as useful biomarkers in clinical trials in Duchenne muscular dystrophy. However, their comparative sensitivity to disease status and how they relate to one another are unknown. We performed a cross-sectional analysis of electrical impedance myography and quantitative ultrasound in 24 healthy boys and 24 with Duchenne muscular dystrophy, aged 2 to 14 years with trained research assistants performing all measurements. Three upper and three lower extremity muscles were studied unilaterally in each child, and the data averaged for each individual. Both electrical impedance myography and quantitative ultrasound differentiated healthy boys from those with Duchenne muscular dystrophy (P < 0.001 for both). Quantitative ultrasound values correlated with age in Duchenne muscular dystrophy boys (rho = 0.45; P = 0.029), whereas electrical impedance myography did not (rho = -0.31; P = 0.14). However, electrical impedance myography phase correlated with age in healthy boys (rho = 0.51; P = 0.012), whereas quantitative ultrasound did not (rho = -0.021; P = 0.92). In Duchenne muscular dystrophy boys, electrical impedance myography phase correlated with the North Star Ambulatory Assessment (rho = 0.65; P = 0.022); quantitative ultrasound revealed a near-significant association (rho = -0.56; P = 0.060). The two technologies trended toward a moderate correlation with one another in the Duchenne muscular dystrophy cohort but not in the healthy group (rho = -0.40; P = 0.054 and rho = -0.32; P = 0.13, respectively). Electrical impedance myography and quantitative ultrasound are complementary modalities for the assessment of boys with Duchenne muscular dystrophy; further study and application of these two modalities alone or in combination in a longitudinal fashion are warranted. 
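The associations throughout this abstract are reported as Spearman rank correlations (rho) with p-values. A minimal sketch of that statistic with `scipy.stats.spearmanr`, on simulated data (the cohort's measurements are not public):

```python
# Sketch: Spearman's rank correlation between age and a simulated quantitative
# ultrasound value, as reported (rho, P) in the abstract. Data are synthetic.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(4)
age = rng.uniform(2, 14, size=24)                  # ages of 24 boys (years)
qus = 0.5 * age + rng.normal(scale=2.0, size=24)   # simulated ultrasound measure

rho, p = spearmanr(age, qus)
print(f"rho = {rho:.2f}, P = {p:.3f}")
```

Because Spearman correlation uses ranks, it captures monotonic trends without assuming linearity, which suits ordinal clinical scores like the North Star Ambulatory Assessment.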

  2. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical garden.…

  3. Dynamic assessment of narrative ability in English accurately identifies language impairment in English language learners.

    PubMed

    Peña, Elizabeth D; Gillam, Ronald B; Bedore, Lisa M

    2014-12-01

    To assess the identification accuracy of dynamic assessment (DA) of narrative ability in English for children learning English as a 2nd language. A DA task was administered to 54 children: 18 Spanish-English-speaking children with language impairment (LI); 18 age-, sex-, IQ- and language experience-matched typical control children; and an additional 18 age- and language experience-matched comparison children. A variety of quantitative and qualitative measures were collected in the pretest phase, the mediation phase, and the posttest phase of the study. Exploratory discriminant analysis was used to determine the set of measures that best differentiated among this group of children with and without LI. A combination of examiner ratings of modifiability (compliance, metacognition, and task orientation), DA story scores (setting, dialogue, and complexity of vocabulary), and ungrammaticality (derived from the posttest narrative sample) classified children with 80.6% to 97.2% accuracy. DA conducted in English provides a systematic means for measuring learning processes and learning outcomes, resulting in a clinically useful procedure for identifying LIs in bilingual children who are in the process of learning English as a second language.
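The abstract's exploratory discriminant analysis can be sketched with scikit-learn's `LinearDiscriminantAnalysis` on simulated measures. The three feature columns and group means below are invented stand-ins for the study's modifiability ratings, DA story scores, and ungrammaticality measure.

```python
# Sketch: linear discriminant analysis separating LI from typically developing
# children using a few dynamic-assessment measures. All scores are simulated.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(5)
# columns: modifiability rating, DA story score, ungrammaticality proportion
li = rng.normal([2.0, 5.0, 0.30], 0.5, size=(18, 3))   # language impairment group
td = rng.normal([3.5, 8.0, 0.10], 0.5, size=(18, 3))   # typical comparison group
X = np.vstack([li, td])
y = np.array([1] * 18 + [0] * 18)

lda = LinearDiscriminantAnalysis().fit(X, y)
accuracy = lda.score(X, y)                              # resubstitution accuracy
print(f"classification accuracy = {accuracy:.1%}")
```

Resubstitution accuracy, as here, is optimistic for small samples; cross-validation would give a fairer counterpart to the 80.6%-97.2% range the study reports.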

  4. Quantitative assessment of airborne exposures generated during common cleaning tasks: a pilot study

    PubMed Central

    2010-01-01

    Background A growing body of epidemiologic evidence suggests an association between exposure to cleaning products and asthma and other respiratory disorders. Thus far, these studies have conducted only limited quantitative exposure assessments. Exposures from cleaning products are difficult to measure because they are complex mixtures of chemicals with a range of physicochemical properties, thus requiring multiple measurement techniques. We conducted a pilot exposure assessment study to identify methods for assessing short term, task-based airborne exposures and to quantitatively evaluate airborne exposures associated with cleaning tasks simulated under controlled work environment conditions. Methods Sink, mirror, and toilet bowl cleaning tasks were simulated in a large ventilated bathroom and a small unventilated bathroom using a general purpose, a glass, and a bathroom cleaner. All tasks were performed for 10 minutes. Airborne total volatile organic compounds (TVOC) generated during the tasks were measured using a direct reading instrument (DRI) with a photoionization detector. Volatile organic ingredients of the cleaning mixtures were assessed utilizing an integrated sampling and analytic method, EPA TO-17. Ammonia air concentrations were also measured with an electrochemical sensor embedded in the DRI. Results Average TVOC concentrations calculated for 10-minute tasks ranged 0.02-6.49 ppm and the highest peak concentrations observed ranged 0.14-11 ppm. TVOC time-concentration profiles indicated that exposures above background level remained present for about 20 minutes after cessation of the tasks. Among several targeted VOC compounds from the cleaning mixtures, only 2-BE was detectable with the EPA method. The ten-minute average 2-BE concentrations ranged 0.30-21 ppm between tasks. The DRI underestimated 2-BE exposures compared to the results from the integrated method. The highest concentration of ammonia of 2.8 ppm occurred during mirror cleaning

  5. Semi-quantitative analysis of salivary gland scintigraphy in Sjögren's syndrome diagnosis: a first-line tool.

    PubMed

    Angusti, Tiziana; Pilati, Emanuela; Parente, Antonella; Carignola, Renato; Manfredi, Matteo; Cauda, Simona; Pizzigati, Elena; Dubreuil, Julien; Giammarile, Francesco; Podio, Valerio; Skanjeti, Andrea

    2017-09-01

    The aim of this study was the assessment of semi-quantified salivary gland dynamic scintigraphy (SGdS) parameters, independently and in an integrated way, in order to predict primary Sjögren's syndrome (pSS). Forty-six consecutive patients (41 females; age 61 ± 11 years) with sicca syndrome were studied by SGdS after injection of 200 MBq of pertechnetate. In sixteen patients, pSS was diagnosed according to American-European Consensus Group criteria (AECGc). Semi-quantitative parameters (uptake (UP) and excretion fraction (EF)) were obtained for each gland. ROC curves were used to determine the best cut-off value. The area under the curve (AUC) was used to estimate the accuracy of each semi-quantitative analysis. To assess the correlation between scintigraphic results and disease severity, semi-quantitative parameters were plotted versus the Sjögren's syndrome disease activity index (ESSDAI). A nomogram was built to perform an integrated evaluation of all the scintigraphic semi-quantitative data. Both UP and EF of the salivary glands were significantly lower in pSS patients compared to those in non-pSS patients (p < 0.001). The ROC curves showed significantly large AUCs for both parameters (p < 0.05). Parotid UP and submandibular EF, assessed by univariate and multivariate logistic regression, showed a significant and independent correlation with pSS diagnosis (p < 0.05). No correlation was found between SGdS semi-quantitative parameters and ESSDAI. The proposed nomogram's accuracy was 87%. SGdS is an accurate and reproducible tool for the diagnosis of pSS. ESSDAI was not shown to be correlated with SGdS data. SGdS should be the first-line imaging technique in patients with suspected pSS.
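The "best cut-off value" step from a ROC curve is commonly done by maximising the Youden index (sensitivity + specificity − 1); the abstract does not name the criterion, so that choice is an assumption here, and all uptake values below are simulated, not patient data.

```python
# Sketch: choose a diagnostic cut-off from a ROC curve via the Youden index.
# Simulated gland-uptake scores; lower uptake is assumed to indicate pSS.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)
pss = rng.normal(0.8, 0.3, size=16)        # simulated uptake, pSS patients (lower)
non_pss = rng.normal(1.5, 0.3, size=30)    # simulated uptake, non-pSS patients
scores = np.concatenate([pss, non_pss])
labels = np.concatenate([np.ones(16), np.zeros(30)])  # 1 = pSS

# low uptake means "positive", so rank by negated score
fpr, tpr, thresholds = roc_curve(labels, -scores)
youden = tpr - fpr
best_cutoff = -thresholds[np.argmax(youden)]
print(f"best uptake cut-off = {best_cutoff:.2f}, Youden index = {youden.max():.2f}")
```

The same per-parameter cut-offs can then be combined into a nomogram-style integrated score, as the study does across glands and parameters.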

  6. Assessment of acute myocarditis by cardiac magnetic resonance imaging: Comparison of qualitative and quantitative analysis methods.

    PubMed

    Imbriaco, Massimo; Nappi, Carmela; Puglia, Marta; De Giorgi, Marco; Dell'Aversana, Serena; Cuocolo, Renato; Ponsiglione, Andrea; De Giorgi, Igino; Polito, Maria Vincenza; Klain, Michele; Piscione, Federico; Pace, Leonardo; Cuocolo, Alberto

    2017-10-26

    To compare cardiac magnetic resonance (CMR) qualitative and quantitative analysis methods for the noninvasive assessment of myocardial inflammation in patients with suspected acute myocarditis (AM). A total of 61 patients with suspected AM underwent coronary angiography and CMR. Qualitative analysis was performed applying Lake-Louise Criteria (LLC), followed by quantitative analysis based on the evaluation of edema ratio (ER) and global relative enhancement (RE). Diagnostic performance was assessed for each method by measuring the area under the curves (AUC) of the receiver operating characteristic analyses. The final diagnosis of AM was based on symptoms and signs suggestive of cardiac disease, evidence of myocardial injury as defined by electrocardiogram changes, elevated troponin I, exclusion of coronary artery disease by coronary angiography, and clinical and echocardiographic follow-up at 3 months after admission to the chest pain unit. In all patients, coronary angiography did not show significant coronary artery stenosis. Troponin I levels and creatine kinase were higher in patients with AM compared to those without (both P < .001). There were no significant differences among LLC, T2-weighted short inversion time inversion recovery (STIR) sequences, early (EGE), and late (LGE) gadolinium-enhancement sequences for diagnosis of AM. The AUC for qualitative (T2-weighted STIR 0.92, EGE 0.87 and LGE 0.88) and quantitative (ER 0.89 and global RE 0.80) analyses were also similar. Qualitative and quantitative CMR analysis methods show similar diagnostic accuracy for the diagnosis of AM. These findings suggest that a simplified approach using a shortened CMR protocol including only T2-weighted STIR sequences might be useful to rule out AM in patients with acute coronary syndrome and normal coronary angiography.

  7. Quantitative characterization of metastatic disease in the spine. Part I. Semiautomated segmentation using atlas-based deformable registration and the level set method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardisty, M.; Gordon, L.; Agarwal, P.

    2007-08-15

    Quantitative assessment of metastatic disease in bone is often considered immeasurable and, as such, patients with skeletal metastases are often excluded from clinical trials. In order to effectively quantify the impact of metastatic tumor involvement in the spine, accurate segmentation of the vertebra is required. Manual segmentation can be accurate but involves extensive and time-consuming user interaction. Potential solutions to automating segmentation of metastatically involved vertebrae are demons deformable image registration and level set methods. The purpose of this study was to develop a semiautomated method to accurately segment tumor-bearing vertebrae using the aforementioned techniques. By maintaining the morphology of an atlas, the demons-level set composite algorithm was able to accurately differentiate between trans-cortical tumors and surrounding soft tissue of identical intensity. The algorithm successfully segmented both the vertebral body and trabecular centrum of tumor-involved and healthy vertebrae. This work validates our approach as equivalent in accuracy to an experienced user.

  8. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... on the sources of L. monocytogenes contamination, the effects of individual manufacturing and/or... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2012-N-1182] Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

  9. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  10. Global and local health burden trade-off through the hybridisation of quantitative microbial risk assessment and life cycle assessment to aid water management.

    PubMed

    Kobayashi, Yumi; Peters, Greg M; Ashbolt, Nicholas J; Heimersson, Sara; Svanström, Magdalena; Khan, Stuart J

    2015-08-01

    Life cycle assessment (LCA) and quantitative risk assessment (QRA) are commonly used to evaluate potential human health impacts associated with proposed or existing infrastructure and products. Each approach has a distinct objective and, consequently, their conclusions may be inconsistent or contradictory. It is proposed that the integration of elements of QRA and LCA may provide a more holistic approach to health impact assessment. Here we examine the possibility of merging LCA assessed human health impacts with quantitative microbial risk assessment (QMRA) for waterborne pathogen impacts, expressed with the common health metric, disability adjusted life years (DALYs). The example of a recent large-scale water recycling project in Sydney, Australia was used to identify and demonstrate the potential advantages and current limitations of this approach. A comparative analysis of two scenarios - with and without the development of this project - was undertaken for this purpose. LCA and QMRA were carried out independently for the two scenarios to compare human health impacts, as measured by DALYs lost per year. LCA results suggested that construction of the project would lead to an increased number of DALYs lost per year, while estimated disease burden resulting from microbial exposures indicated that it would result in the loss of fewer DALYs per year than the alternative scenario. By merging the results of the LCA and QMRA, we demonstrate the advantages in providing a more comprehensive assessment of human disease burden for the two scenarios, in particular, the importance of considering the results of both LCA and QRA in a comparative assessment of decision alternatives to avoid problem shifting. The application of DALYs as a common measure between the two approaches was found to be useful for this purpose. Copyright © 2015 Elsevier Ltd. All rights reserved.
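
The hybridisation described above rests on DALYs being a common, additive metric across the two frameworks. A minimal sketch with invented scenario numbers (not the study's values):

```python
def total_dalys(lca_dalys, qmra_dalys):
    # Both terms are disability adjusted life years lost per year; the
    # common metric is what makes them directly additive.
    return lca_dalys + qmra_dalys

# Hypothetical scenario values (DALYs lost per year):
with_project = total_dalys(lca_dalys=5.0, qmra_dalys=0.2)      # construction burden dominates
without_project = total_dalys(lca_dalys=3.5, qmra_dalys=2.5)   # pathogen burden dominates
print(with_project < without_project)  # -> True: combined metric reverses the LCA-only ranking
```

This is the problem-shifting point made in the abstract: either component alone can rank the scenarios differently than their sum.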

  11. Sensitive Quantitative Assessment of Balance Disorders

    NASA Technical Reports Server (NTRS)

    Paloski, William H.

    2007-01-01

    Computerized dynamic posturography (CDP) has become a standard technique for objectively quantifying balance control performance, diagnosing the nature of functional impairments underlying balance disorders, and monitoring clinical treatment outcomes. We have long used CDP protocols to assess recovery of sensory-motor function in astronauts following space flight. The most reliable indicators of post-flight crew performance are the sensory organization tests (SOTs), particularly SOTs 5 and 6, which are sensitive to changes in availability and/or utilization of vestibular cues. We have noted, however, that some astronauts exhibiting obvious signs of balance impairment after flight are able to score within clinical norms on these tests, perhaps as a result of adopting competitive strategies or by their natural skills at substituting alternate sensory information sources. This insensitivity of the CDP protocol could underestimate the degree of impairment and, perhaps, lead to premature release of those crewmembers to normal duties. To improve the sensitivity of the CDP protocol we have introduced static and dynamic head tilt SOT trials into our protocol. The pattern of postflight recovery quantified by the enhanced CDP protocol appears to more aptly track the re-integration of sensory-motor function, with recovery time increasing as the complexity of the sensory-motor/biomechanical task increases. The new CDP protocol therefore seems more suitable for monitoring post-flight sensory-motor recovery and for indicating to crewmembers and flight surgeons fitness for return to duty and/or activities of daily living. There may be classes of patients (e.g., athletes, pilots) having motivation and/or performance characteristics similar to astronauts whose sensory-motor treatment outcomes would also be more accurately monitored using the enhanced CDP protocol. Furthermore, the enhanced protocol may be useful in early detection of age-related balance disorders.

  12. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    PubMed

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies however primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic

  13. Quantitative Assessment of Commutability for Clinical Viral Load Testing Using a Digital PCR-Based Reference Standard

    PubMed Central

    Tang, L.; Sun, Y.; Buelow, D.; Gu, Z.; Caliendo, A. M.; Pounds, S.

    2016-01-01

    Given recent advances in the development of quantitative standards, particularly WHO international standards, efforts to better understand the commutability of reference materials have been made. Existing approaches in evaluating commutability include prediction intervals and correspondence analysis; however, the results obtained from existing approaches may be ambiguous. We have developed a “deviation-from-ideal” (DFI) approach to evaluate commutability of standards and applied it to the assessment of Epstein-Barr virus (EBV) load testing in four quantitative PCR assays, treating digital PCR as a reference assay. We then discuss advantages and limitations of the DFI approach as well as experimental design to best evaluate the commutability of an assay in practice. PMID:27076654

  14. Laparoscopic training using a quantitative assessment and instructional system.

    PubMed

    Yamaguchi, T; Nakamura, R

    2018-04-28

    Laparoscopic surgery requires complex surgical skills; hence, surgeons require regular training to improve their surgical techniques. The quantitative assessment of a surgeon's skills and the provision of feedback are important processes for conducting effective training. The aim of this study was to develop an inexpensive training system that provides automatic technique evaluation and feedback. We detected the instrument using image processing of commercial web camera images and calculated the motion analysis parameters (MAPs) of the instrument to quantify performance features. Upon receiving the results, we developed a method of evaluating the surgeon's skill level. The feedback system was developed using MAPs-based radar charts and scores for determining the skill level. These methods were evaluated using the videos of 38 surgeons performing a suturing task. There were significant differences in MAPs among surgeons; therefore, MAPs can be effectively used to quantify a surgeon's performance features. The results of skill evaluation and feedback differed greatly between skilled and unskilled surgeons, and it was possible to indicate points of improvement for the procedure performed in this study. Furthermore, the results obtained for certain novice surgeons were similar to those obtained for skilled surgeons. This system can be used to assess the skill level of surgeons, independent of the years of experience, and provide an understanding of the individual's current surgical skill level effectively. We conclude that our system is useful as an inexpensive laparoscopic training system that might aid in skill improvement.
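
The motion analysis parameters (MAPs) are not specified in this record; below is a hedged sketch of two typical instrument-motion metrics (path length and mean speed) computed from tracked tip positions. The track data and metric names are illustrative assumptions, not the authors' definitions:

```python
import math

def motion_parameters(points, dt):
    """points: [(x, y), ...] instrument-tip positions sampled every dt seconds."""
    steps = [math.dist(a, b) for a, b in zip(points, points[1:])]
    path_length = sum(steps)                       # total distance travelled
    mean_speed = path_length / (dt * len(steps))   # distance / elapsed time
    return {"path_length": path_length, "mean_speed": mean_speed}

# Illustrative track: two straight 5-unit moves sampled at 2 Hz.
maps = motion_parameters([(0, 0), (3, 4), (6, 8)], dt=0.5)
print(maps)  # path_length 10.0, mean_speed 10.0
```

Skilled surgeons typically show shorter paths and smoother speed profiles for the same task, which is what such parameters are meant to capture.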

  15. Comparison of PIV with 4D-Flow in a physiological accurate flow phantom

    NASA Astrophysics Data System (ADS)

    Sansom, Kurt; Balu, Niranjan; Liu, Haining; Aliseda, Alberto; Yuan, Chun; Canton, Maria De Gador

    2016-11-01

    Validation of 4D MRI flow sequences with planar particle image velocimetry (PIV) is performed in a physiologically-accurate flow phantom. A patient-specific phantom of a carotid artery is connected to a pulsatile flow loop to simulate the 3D unsteady flow in the cardiovascular anatomy. Cardiac-cycle-synchronized MRI provides time-resolved 3D blood velocity measurements in a clinical tool that is promising but lacks a robust validation framework. PIV at three different Reynolds numbers (540, 680, and 815, chosen as ±20% of the average velocity from the patient-specific CCA waveform) and four different Womersley numbers (3.30, 3.68, 4.03, and 4.35, chosen to reflect a physiological range of heart rates) is compared to 4D-MRI measurements. An accuracy assessment of raw velocity measurements and a comparison of estimated and measurable flow parameters such as wall shear stress, fluctuating velocity rms, and Lagrangian particle residence time will be presented, with justification for their biomechanical relevance to the pathophysiology of arterial disease: atherosclerosis and intimal hyperplasia. Lastly, the framework is applied to a new 4D-Flow MRI sequence and post-processing techniques to provide a quantitative assessment with the benchmarked data. Department of Education GAANN Fellowship.
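
The Reynolds and Womersley numbers used to set the flow conditions follow from their standard definitions; the fluid and geometry values below (carotid-like diameter, kinematic viscosity, heart rate) are assumptions for illustration, not taken from the study:

```python
import math

def reynolds(mean_velocity, diameter, nu):
    # Re = U * D / nu, with nu the kinematic viscosity in m^2/s.
    return mean_velocity * diameter / nu

def womersley(radius, heart_rate_hz, nu):
    # alpha = R * sqrt(omega / nu), with omega = 2 * pi * f.
    return radius * math.sqrt(2 * math.pi * heart_rate_hz / nu)

# Assumed values: D = 6 mm, nu = 3.3e-6 m^2/s, 60 beats per minute.
print(round(reynolds(0.30, 0.006, 3.3e-6)))     # within the reported 540-815 range
print(round(womersley(0.003, 1.0, 3.3e-6), 2))  # within the reported 3.30-4.35 range
```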

  16. Characterization of 3-Dimensional PET Systems for Accurate Quantification of Myocardial Blood Flow.

    PubMed

    Renaud, Jennifer M; Yip, Kathy; Guimond, Jean; Trottier, Mikaël; Pibarot, Philippe; Turcotte, Eric; Maguire, Conor; Lalonde, Lucille; Gulenchyn, Karen; Farncombe, Troy; Wisenberg, Gerald; Moody, Jonathan; Lee, Benjamin; Port, Steven C; Turkington, Timothy G; Beanlands, Rob S; deKemp, Robert A

    2017-01-01

    Three-dimensional (3D) mode imaging is the current standard for PET/CT systems. Dynamic imaging for quantification of myocardial blood flow with short-lived tracers, such as 82Rb-chloride, requires accuracy to be maintained over a wide range of isotope activities and scanner counting rates. We proposed new performance standard measurements to characterize the dynamic range of PET systems for accurate quantitative imaging. 82Rb or 13N-ammonia (1,100-3,000 MBq) was injected into the heart wall insert of an anthropomorphic torso phantom. A decaying isotope scan was obtained over 5 half-lives on 9 different 3D PET/CT systems and 1 3D/2-dimensional PET-only system. Dynamic images (28 × 15 s) were reconstructed using iterative algorithms with all corrections enabled. Dynamic range was defined as the maximum activity in the myocardial wall with less than 10% bias, from which corresponding dead-time, counting rates, and/or injected activity limits were established for each scanner. Scatter correction residual bias was estimated as the maximum cavity blood-to-myocardium activity ratio. Image quality was assessed via the coefficient of variation measuring nonuniformity of the left ventricular myocardium activity distribution. Maximum recommended injected activity/body weight, peak dead-time correction factor, counting rates, and residual scatter bias for accurate cardiac myocardial blood flow imaging were 3-14 MBq/kg, 1.5-4.0, 22-64 Mcps singles and 4-14 Mcps prompt coincidence counting rates, and 2%-10% on the investigated scanners. Nonuniformity of the myocardial activity distribution varied from 3% to 16%. Accurate dynamic imaging is possible on the 10 3D PET systems if the maximum injected MBq/kg values are respected to limit peak dead-time losses during the bolus first-pass transit. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
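
The dynamic-range definition used here (maximum activity with less than 10% bias) can be sketched directly; the decaying-scan numbers below are invented for illustration:

```python
def dynamic_range_limit(activities_mbq, measured, true):
    """Maximum activity whose relative bias stays under 10%.

    At high activity, dead-time losses depress the measured values,
    so the highest activities typically fail the bias criterion."""
    ok = [a for a, m, t in zip(activities_mbq, measured, true)
          if abs(m - t) / t < 0.10]
    return max(ok)

limit = dynamic_range_limit(
    activities_mbq=[3000, 2000, 1000, 500],
    measured=[2400, 1750, 950, 490],   # invented values with dead-time losses
    true=[3000, 2000, 1000, 500])
print(limit)  # -> 1000
```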

  17. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5×10⁶ km² at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models, as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.
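
How the three parts combine into a resource estimate can be sketched by Monte Carlo: sample a deposit count from the part-3 estimate, then draw each deposit's tonnage from the part-2 grade and tonnage model. All numbers below are invented for illustration:

```python
import random

random.seed(1)  # reproducible illustration

def simulate_endowment(n_deposits_pmf, tonnage_sampler, trials=10_000):
    """Part 3 (deposit-count pmf) x Part 2 (tonnage model) -> endowment distribution."""
    outcomes, probs = zip(*n_deposits_pmf.items())
    totals = []
    for _ in range(trials):
        n = random.choices(outcomes, weights=probs)[0]
        totals.append(sum(tonnage_sampler() for _ in range(n)))
    return totals

# Invented estimates: 0-3 undiscovered deposits; lognormal tonnage model.
totals = simulate_endowment({0: 0.3, 1: 0.4, 2: 0.2, 3: 0.1},
                            lambda: random.lognormvariate(2.0, 1.0))
totals.sort()
print(totals[len(totals) // 2])  # median undiscovered tonnage
```

The full distribution (not just a point estimate) is what lets such assessments represent uncertainty explicitly.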

  18. Current status of accurate prognostic awareness in advanced/terminally ill cancer patients: Systematic review and meta-regression analysis.

    PubMed

    Chen, Chen Hsiu; Kuo, Su Ching; Tang, Siew Tzuh

    2017-05-01

    No systematic meta-analysis is available on the prevalence of cancer patients' accurate prognostic awareness and differences in accurate prognostic awareness by publication year, region, assessment method, and service received. To examine the prevalence of advanced/terminal cancer patients' accurate prognostic awareness and differences in accurate prognostic awareness by publication year, region, assessment method, and service received. Systematic review and meta-analysis. MEDLINE, Embase, The Cochrane Library, CINAHL, and PsycINFO were systematically searched on accurate prognostic awareness in adult patients with advanced/terminal cancer (1990-2014). Pooled prevalences were calculated for accurate prognostic awareness by a random-effects model. Differences in weighted estimates of accurate prognostic awareness were compared by meta-regression. In total, 34 articles were retrieved for systematic review and meta-analysis. At best, only about half of advanced/terminal cancer patients accurately understood their prognosis (49.1%; 95% confidence interval: 42.7%-55.5%; range: 5.4%-85.7%). Accurate prognostic awareness was independent of service received and publication year, but highest in Australia, followed by East Asia, North America, and southern Europe and the United Kingdom (67.7%, 60.7%, 52.8%, and 36.0%, respectively; p = 0.019). Accurate prognostic awareness was higher by clinician assessment than by patient report (63.2% vs 44.5%, p < 0.001). Less than half of advanced/terminal cancer patients accurately understood their prognosis, with significant variations by region and assessment method. Healthcare professionals should thoroughly assess advanced/terminal cancer patients' preferences for prognostic information and engage them in prognostic discussion early in the cancer trajectory, thus facilitating their accurate prognostic awareness and the quality of end-of-life care decision-making.
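
A pooled prevalence under a random-effects model, as used in this meta-analysis, can be sketched with the DerSimonian-Laird estimator. The study counts below are invented, not drawn from the 34 included articles:

```python
def pooled_prevalence(events, totals):
    """DerSimonian-Laird random-effects pooled proportion."""
    p = [e / n for e, n in zip(events, totals)]
    v = [pi * (1 - pi) / n for pi, n in zip(p, totals)]     # within-study variance
    w = [1 / vi for vi in v]                                 # fixed-effect weights
    p_fe = sum(wi * pi for wi, pi in zip(w, p)) / sum(w)
    q = sum(wi * (pi - p_fe) ** 2 for wi, pi in zip(w, p))   # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(p) - 1)) / c)                  # between-study variance
    w_re = [1 / (vi + tau2) for vi in v]                     # random-effects weights
    return sum(wi * pi for wi, pi in zip(w_re, p)) / sum(w_re)

# Invented studies: 30/60, 55/100, and 10/40 patients with accurate awareness.
print(round(pooled_prevalence([30, 55, 10], [60, 100, 40]), 3))
```

As heterogeneity (tau²) grows, the weights move toward equal weighting, which is why random-effects pooled estimates can differ noticeably from fixed-effect ones.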

  19. Systems Toxicology: From Basic Research to Risk Assessment

    PubMed Central

    2014-01-01

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment. PMID:24446777

  20. Systems toxicology: from basic research to risk assessment.

    PubMed

    Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C

    2014-03-17

    Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.

  1. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer. -from Author

  2. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles was determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data was pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory

  3. The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.

    PubMed

    Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan

    2018-06-01

    Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software codes developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
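
The paper applies the Rényi divergence to spatial localisation data; the underlying formula for discrete distributions is straightforward, and a minimal sketch (not the authors' cluster-radius pipeline) is:

```python
import math

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) for discrete distributions p, q."""
    assert alpha > 0 and alpha != 1, "alpha = 1 is the Kullback-Leibler limit"
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

print(renyi_divergence([0.5, 0.5], [0.5, 0.5], alpha=2))       # identical -> 0.0
print(renyi_divergence([0.9, 0.1], [0.5, 0.5], alpha=2) > 0)   # distinct -> positive
```

Varying alpha changes how strongly the divergence weights regions where p and q disagree, which is what makes it tunable for cluster analysis.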

  4. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. Firstly, a median filter with a 3 × 3 window is applied, and the locations of hair pixels on the skin are then accurately detected according to the gray mean value and color information. Bilinear interpolation is used to modify the gray values of the hair pixels in order to eliminate the negative effect of noise and fine hairs on the texture. After the above pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On the basis of this, the four characteristic parameters, including the second moment, contrast, entropy and correlation, and their mean value are calculated at 45° intervals. The quantitative evaluation model of skin texture based on GLCM is established, which can calculate the comprehensive parameters of skin condition. Experiments show that evaluations of skin condition made with this method agree both with biochemical skin evaluation indicators and with human visual assessment. This method overcomes the biochemical evaluation method's shortcomings of skin damage and long waiting times, as well as the subjectivity and fuzziness of visual evaluation, achieving non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can quantitatively evaluate subtle improvements in skin condition after using skin care products or undergoing beauty treatments.
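
A minimal sketch of the GLCM features named above (second moment, contrast, entropy, correlation) for a single horizontal offset; the method averages such features over directions at 45° intervals. The 4-level image below is invented:

```python
import math

def glcm(img, dx, dy, levels):
    """Normalised grey-level co-occurrence matrix for one offset (dx, dy)."""
    m = [[0.0] * levels for _ in range(levels)]
    rows, cols = len(img), len(img[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[img[r][c]][img[r2][c2]] += 1
    total = sum(map(sum, m))
    return [[v / total for v in row] for row in m]

def glcm_features(p):
    n = len(p)
    pairs = [(i, j) for i in range(n) for j in range(n)]
    asm = sum(p[i][j] ** 2 for i, j in pairs)                  # angular second moment
    contrast = sum((i - j) ** 2 * p[i][j] for i, j in pairs)
    entropy = -sum(p[i][j] * math.log(p[i][j]) for i, j in pairs if p[i][j] > 0)
    mu_i = sum(i * p[i][j] for i, j in pairs)
    mu_j = sum(j * p[i][j] for i, j in pairs)
    var_i = sum((i - mu_i) ** 2 * p[i][j] for i, j in pairs)
    var_j = sum((j - mu_j) ** 2 * p[i][j] for i, j in pairs)
    cov = sum((i - mu_i) * (j - mu_j) * p[i][j] for i, j in pairs)
    corr = cov / math.sqrt(var_i * var_j) if var_i > 0 and var_j > 0 else 1.0
    return asm, contrast, entropy, corr

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
asm, contrast, entropy, corr = glcm_features(glcm(img, dx=1, dy=0, levels=4))
print(round(asm, 3), round(contrast, 3))  # -> 0.167 0.583
```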

  5. Quantitative Assessment of Motor and Sensory/Motor Acquisition in Handicapped and Nonhandicapped Infants and Young Children. Volume IV: Application of the Procedures.

    ERIC Educational Resources Information Center

    Guess, Doug; And Others

    Three studies that applied quantitative procedures to measure motor and sensory/motor acquisition among handicapped and nonhandicapped infants and children are presented. In addition, a study concerning the replication of the quantitative procedures for assessing rolling behavior is described in a fourth article. The first study, by C. Janssen,…

  6. The value of assessing pulmonary venous flow velocity for predicting severity of mitral regurgitation: A quantitative assessment integrating left ventricular function

    NASA Technical Reports Server (NTRS)

    Pu, M.; Griffin, B. P.; Vandervoort, P. M.; Stewart, W. J.; Fan, X.; Cosgrove, D. M.; Thomas, J. D.

    1999-01-01

    Although alteration in pulmonary venous flow has been reported to relate to mitral regurgitant severity, it is also known to vary with left ventricular (LV) systolic and diastolic dysfunction. There are few data relating pulmonary venous flow to quantitative indexes of mitral regurgitation (MR). The object of this study was to assess quantitatively the accuracy of pulmonary venous flow for predicting MR severity by using transesophageal echocardiographic measurement in patients with variable LV dysfunction. This study consisted of 73 patients undergoing heart surgery with mild to severe MR. Regurgitant orifice area (ROA), regurgitant stroke volume (RSV), and regurgitant fraction (RF) were obtained by quantitative transesophageal echocardiography and proximal isovelocity surface area. Both left and right upper pulmonary venous flow velocities were recorded and their patterns classified by the ratio of systolic to diastolic velocity: normal (≥1), blunted (<1), and systolic reversal (<0). Twenty-three percent of patients had discordant patterns between the left and right veins. When the most abnormal patterns either in the left or right vein were used for analysis, the ratio of peak systolic to diastolic flow velocity was negatively correlated with ROA (r = -0.74, P <.001), RSV (r = -0.70, P <.001), and RF (r = -0.66, P <.001) calculated by the Doppler thermodilution method; values were r = -0.70, r = -0.67, and r = -0.57, respectively (all P <.001), for indexes calculated by the proximal isovelocity surface area method. The sensitivity, specificity, and predictive values of the reversed pulmonary venous flow pattern for detecting a large ROA (>0.3 cm(2)) were 69%, 98%, and 97%, respectively. The sensitivity, specificity, and predictive values of the normal pulmonary venous flow pattern for detecting a small ROA (<0.3 cm(2)) were 60%, 96%, and 94%, respectively. However, the blunted pattern had low sensitivity (22%), specificity (61%), and predictive values (30
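
The reported sensitivity, specificity, and predictive value follow from a 2×2 confusion matrix; the counts below are invented to approximate the reported 69%/98%/97% figures for the reversed-flow pattern:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard test-performance measures from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)   # reversed pattern present, large ROA present
    specificity = tn / (tn + fp)   # pattern absent, large ROA absent
    ppv = tp / (tp + fp)           # positive predictive value
    return sensitivity, specificity, ppv

# Invented counts for reversed flow vs. ROA > 0.3 cm^2:
sens, spec, ppv = diagnostic_metrics(tp=18, fp=1, fn=8, tn=46)
print(round(sens, 2), round(spec, 2), round(ppv, 2))  # -> 0.69 0.98 0.95
```

The high specificity with modest sensitivity is the pattern the abstract describes: reversed flow is a strong rule-in sign but misses many large orifices.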

  7. Composition and Quantitation of Microalgal Lipids by ERETIC 1H NMR Method

    PubMed Central

    Nuzzo, Genoveffa; Gallo, Carmela; d’Ippolito, Giuliana; Cutignano, Adele; Sardo, Angela; Fontana, Angelo

    2013-01-01

    Accurate characterization of biomass constituents is a crucial aspect of research in the biotechnological application of natural products. Here we report an efficient, fast and reproducible method for the identification and quantitation of fatty acids and complex lipids (triacylglycerols, glycolipids, phospholipids) in microalgae under investigation for the development of functional health products (probiotics, food ingredients, drugs, etc.) or third generation biofuels. The procedure consists of extraction of the biological matrix by a modified Folch method and direct analysis of the resulting material by proton nuclear magnetic resonance (1H NMR). The protocol uses a reference electronic signal as external standard (ERETIC method) and allows assessment of total lipid content, saturation degree and class distribution in both high throughput screening of algal collections and metabolic analysis during genetic or culturing studies. As proof of concept, the methodology was applied to the analysis of three microalgal species (Thalassiosira weissflogii, Cyclotella cryptica and Nannochloropsis salina) which differ drastically in the qualitative and quantitative composition of their fatty acid-based lipids. PMID:24084790
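    The ERETIC quantitation step reduces to scaling the analyte integral against the synthetic reference signal. A minimal sketch with hypothetical names and values (real protocols also correct for pulse length and receiver gain, omitted here):

```python
def eretic_concentration(area_analyte, area_eretic, conc_eretic_equiv, n_protons):
    """Analyte concentration from a 1H NMR integral referenced to the
    ERETIC electronic signal, normalized per contributing proton.
    conc_eretic_equiv is the concentration equivalent assigned to the
    reference signal during calibration (assumed known)."""
    return conc_eretic_equiv * (area_analyte / area_eretic) / n_protons

# e.g. a methyl resonance (3 protons) integrating 6x the reference signal
c = eretic_concentration(area_analyte=6.0, area_eretic=1.0,
                         conc_eretic_equiv=1.0, n_protons=3)  # in mM
```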

  8. Elementary Writing Assessment Platforms: A Quantitative Examination of Online versus Offline Writing Performance of Fifth-Grade Students

    ERIC Educational Resources Information Center

    Heath, Vickie L.

    2013-01-01

    This quantitative study explored if significant differences exist between how fifth-grade students produce a written response to a narrative prompt using online versus offline writing platforms. The cultural and social trend of instructional and assessment writing paradigms in education is shifting to online writing platforms (National Assessment…

  9. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning and Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  10. Closing the Loop: Involving Faculty in the Assessment of Scientific and Quantitative Reasoning Skills of Biology Majors

    ERIC Educational Resources Information Center

    Hurney, Carol A.; Brown, Justin; Griscom, Heather Peckham; Kancler, Erika; Wigtil, Clifton J.; Sundre, Donna

    2011-01-01

    The development of scientific and quantitative reasoning skills in undergraduates majoring in science, technology, engineering, and mathematics (STEM) is an objective of many courses and curricula. The Biology Department at James Madison University (JMU) assesses these essential skills in graduating biology majors by using a multiple-choice exam…

  11. Supramolecular assembly affording a ratiometric two-photon fluorescent nanoprobe for quantitative detection and bioimaging.

    PubMed

    Wang, Peng; Zhang, Cheng; Liu, Hong-Wen; Xiong, Mengyi; Yin, Sheng-Yan; Yang, Yue; Hu, Xiao-Xiao; Yin, Xia; Zhang, Xiao-Bing; Tan, Weihong

    2017-12-01

    Fluorescence quantitative analyses for vital biomolecules are in great demand in biomedical science owing to their unique detection advantages with rapid, sensitive, non-damaging and specific identification. However, available fluorescence strategies for quantitative detection are usually hard to design and achieve. Inspired by supramolecular chemistry, a two-photon-excited fluorescent supramolecular nanoplatform (TPSNP) was designed for quantitative analysis with three parts: host molecules (β-CD polymers), a guest fluorophore of sensing probes (Np-Ad) and a guest internal reference (NpRh-Ad). In this strategy, the TPSNP possesses the merits of (i) improved water-solubility and biocompatibility; (ii) increased tissue penetration depth for bioimaging by two-photon excitation; (iii) quantitative and tunable assembly of functional guest molecules to obtain optimized detection conditions; (iv) a common approach to avoid the limitation of complicated design by adjustment of sensing probes; and (v) accurate quantitative analysis by virtue of reference molecules. As a proof-of-concept, we utilized the two-photon fluorescent probe NHS-Ad-based TPSNP-1 to realize accurate quantitative analysis of hydrogen sulfide (H2S), with high sensitivity and good selectivity in live cells, deep tissues and ex vivo-dissected organs, suggesting that the TPSNP is an ideal quantitative indicator for clinical samples. What's more, TPSNP will pave the way for designing and preparing advanced supramolecular sensors for biosensing and biomedicine.
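    The ratiometric readout itself is simple: the analyte-sensitive probe intensity is divided by the internal-reference intensity, which cancels fluctuations in probe loading and excitation power, and concentration is then read off a prior calibration. A hedged sketch (the linear calibration form and all numbers are hypothetical, not from the paper):

```python
def ratiometric_h2s(i_probe, i_ref, slope, intercept):
    """Estimate [H2S] from the probe/reference intensity ratio using a
    previously fitted linear calibration: ratio = slope*[H2S] + intercept."""
    ratio = i_probe / i_ref
    return (ratio - intercept) / slope

# Illustrative values: ratio of 1.5 on a calibration with slope 0.05/uM
conc = ratiometric_h2s(i_probe=3.0, i_ref=2.0, slope=0.05, intercept=0.5)
```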

  12. OMICS DATA IN THE QUALITATIVE AND QUANTITATIVE CHARACTERIZATION OF THE MODE OF ACTION IN SUPPORT OF IRIS ASSESSMENTS

    EPA Science Inventory

    Knowledge and information generated using new tools/methods collectively called "Omics" technologies could have a profound effect on qualitative and quantitative characterizations of human health risk assessments.

    The suffix "Omics" is a descriptor used for a series of e...

  13. Quantitative computed tomography assessment of transfusional iron overload.

    PubMed

    Wood, John C; Mo, Ashley; Gera, Aakansha; Koh, Montre; Coates, Thomas; Gilsanz, Vicente

    2011-06-01

    Quantitative computed tomography (QCT) has been proposed for iron quantification for more than 30 years; however, there has been little clinical validation. We compared liver attenuation by QCT with magnetic resonance imaging (MRI)-derived estimates of liver iron concentration (LIC) in 37 patients with transfusional siderosis. MRI and QCT measurements were performed as clinically indicated for monitoring of LIC and vertebral bone density, respectively, over a 6-year period. The mean time difference between QCT and MRI studies was 14 d, with 25 studies performed on the same day. For liver attenuation outside the normal range, attenuation values rose linearly with LIC (r² = 0.94). However, intersubject variability in intrinsic liver attenuation prevented quantitation of LIC <8 mg/g dry weight of liver, and was the dominant source of measurement uncertainty. Calculated QCT and MRI accuracies were equivalent for LIC values approaching 22 mg/g dry weight, with QCT having superior performance at higher LICs. Although not suitable for monitoring patients with good iron control, QCT may nonetheless represent a viable technique for liver iron quantitation in patients with moderate to severe iron overload in regions where MRI resources are limited, because of its low cost, availability, and high throughput. © 2011 Blackwell Publishing Ltd.
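    The reported linear relation between attenuation and LIC is an ordinary least-squares fit. A self-contained sketch of such a fit and its r² (the data points below are invented for illustration, not the study's measurements):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope, intercept and coefficient of determination."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - slope * x - intercept) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

# hypothetical LIC (mg/g dry weight) vs. liver attenuation (HU) pairs
lic = [8, 12, 16, 22]
hu = [70, 95, 118, 160]
slope, intercept, r2 = linear_fit(lic, hu)
```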

  14. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  15. Quantitative assessment of locomotive syndrome by the loco-check questionnaire in older Japanese females

    PubMed Central

    Noge, Sachiko; Ohishi, Tatsuo; Yoshida, Takuya; Kumagai, Hiromichi

    2017-01-01

    [Purpose] Locomotive syndrome (LS) is a condition by which older people may require care service because of problems with locomotive organs. This study examined whether the loco-check, a 7-item questionnaire, is useful for quantitatively assessing the severity of LS. [Subjects and Methods] Seventy-one community dwelling Japanese females aged 64–96 years (81.7 ± 8.0 years) participated in this study. The associations of the loco-check with thigh muscle mass measured by X-ray CT, physical performance, nutritional status, and quality of life (QOL) were investigated. [Results] The results showed that the number of times that “yes” was selected in the loco-check was significantly correlated with thigh muscle mass, major measures of physical performance, nutritional status, and QOL. This number was also significantly larger in the participants experiencing falling, fracture, and lumbar pain than in those without these episodes. [Conclusion] These results suggest that the loco-check might be useful for quantitatively evaluating LS. PMID:28932003

  16. An electronic portfolio for quantitative assessment of surgical skills in undergraduate medical education.

    PubMed

    Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero

    2013-05-06

    We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summated to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of which 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. 
For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved

  17. Quantitative risk assessment using empirical vulnerability functions from debris flow event reconstruction

    NASA Astrophysics Data System (ADS)

    Luna, Byron Quan; Blahut, Jan; Camera, Corrado; van Westen, Cees; Sterlacchini, Simone; Apuani, Tiziana; Akbas, Sami

    2010-05-01

    For a quantitative risk assessment framework it is essential to assess not only the hazardous process itself but also to analyse its consequences. This quantitative assessment should include the expected monetary losses as the product of the probability of occurrence of a hazard of a given magnitude and its vulnerability. A quantifiable integrated approach to both hazard and risk is becoming required practice in risk reduction management. Dynamic run-out models for debris flows can calculate physical outputs (extension, depths, velocities, impact pressures) and determine the zones where the elements at risk could suffer an impact. These results are then applied to vulnerability and risk calculations. The risk assessment was conducted in the Valtellina Valley, a typical alpine valley in northern Italy (Lombardy Region). On 13th July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of the valley between Morbegno and Berbenno. One of the largest debris flows occurred in Selvetta. The debris flow event was reconstructed after extensive fieldwork and interviews with local inhabitants and civil protection teams. Also within the Valtellina valley, between the 22nd and the 23rd of May 1983, two debris flows occurred in Tresenda (Teglio municipality), causing casualties and considerable economic damage. At the same location, on the 26th of November 2002, another debris flow caused significant damage. To quantify a new scenario, the results obtained from the Selvetta event were applied in Tresenda. The Selvetta and Tresenda events were modelled with the FLO2D program. FLO2D is an Eulerian formulation with a finite-difference numerical scheme that requires the specification of an input hydrograph. The internal stresses are isotropic and the basal shear stresses are calculated using a quadratic model. The significance of
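    The quantitative risk definition used here, expected loss as the product of occurrence probability, vulnerability, and value of the element at risk, can be sketched directly (the numbers are illustrative, not from the Valtellina study):

```python
def expected_loss(p_occurrence, vulnerability, exposed_value):
    """Expected monetary loss = P(hazard of given magnitude)
    x vulnerability (0..1, e.g. from an empirical vulnerability
    function of flow depth or impact pressure) x element value."""
    return p_occurrence * vulnerability * exposed_value

# hypothetical building in the modelled debris-flow runout zone:
# 1% annual probability, 40% damage ratio, 250 000 EUR replacement value
loss = expected_loss(p_occurrence=0.01, vulnerability=0.4, exposed_value=250_000)
```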

  18. Quantitative versus semiquantitative MR imaging of cartilage in blood-induced arthritic ankles: preliminary findings.

    PubMed

    Doria, Andrea S; Zhang, Ningning; Lundin, Bjorn; Hilliard, Pamela; Man, Carina; Weiss, Ruth; Detzler, Gary; Blanchette, Victor; Moineddin, Rahim; Eckstein, Felix; Sussman, Marshall S

    2014-05-01

    Recent advances in hemophilia prophylaxis have raised the need for accurate noninvasive methods for assessment of early cartilage damage in maturing joints to guide initiation of prophylaxis. Such methods can be either semiquantitative or quantitative. Whereas semiquantitative scores are less time-consuming to perform than quantitative methods, they are prone to subjective interpretation. Our objective was to test the feasibility of a manual segmentation and quantitative methodology for cross-sectional evaluation of articular cartilage status in growing ankles of children with blood-induced arthritis, as compared with a semiquantitative scoring system and clinical-radiographic constructs. Twelve boys, 11 with hemophilia (A, n = 9; B, n = 2) and 1 with von Willebrand disease (median age: 13; range: 6-17), underwent physical examination and MRI at 1.5 T. Two radiologists semiquantitatively scored the MRIs for cartilage pathology (surface erosions, cartilage loss), blinded to clinical information. An experienced operator applied a validated quantitative 3-D MRI method to determine the percentage area of denuded bone (dAB) and the cartilage thickness (ThCtAB) in the joints' MRIs. Quantitative and semiquantitative MRI methods and clinical-radiographic constructs (Hemophilia Joint Health Score [HJHS], Pettersson radiograph scores) were compared. Moderate correlations were noted between erosions and dAB (r = 0.62, P = 0.03) in the talus but not in the distal tibia (P > 0.05). Whereas substantial to high correlations (r range: 0.70-0.94, P < 0.05) were observed between erosions, cartilage loss, HJHS and Pettersson scores both at the distal tibia and talus levels, moderate to borderline-substantial (r range: 0.55-0.61, P < 0.05) correlations were noted between dAB/ThCtAB and clinical-radiographic constructs. Whereas the semiquantitative method of assessing cartilage status is closely associated with clinical-radiographic scores in cross-sectional studies

  19. The Moment of Learning: Quantitative Analysis of Exemplar Gameplay Supports CyGaMEs Approach to Embedded Assessment

    ERIC Educational Resources Information Center

    Reese, Debbie Denise; Tabachnick, Barbara G.

    2010-01-01

    In this paper, the authors summarize a quantitative analysis demonstrating that the CyGaMEs toolset for embedded assessment of learning within instructional games measures growth in conceptual knowledge by quantifying player behavior. CyGaMEs stands for Cyberlearning through GaME-based, Metaphor Enhanced Learning Objects. Some scientists of…

  20. ADVANCING EPA WETLAND SCIENCE: DEVELOPING TOOLS FOR QUANTITATIVE ASSESSMENT OF WETLAND FUNCTION AND CONDITION AT THE REGIONAL LEVEL

    EPA Science Inventory

    The EPA Office of Water has recognized a critical need for tribes, states and federal agencies to be able to quantitatively assess the condition of the nations wetland resources. Currently, greater than 85% of states, tribes, and territories are lacking even rudimentary biologic...

  1. Development of an exposure measurement database on five lung carcinogens (ExpoSYN) for quantitative retrospective occupational exposure assessment.

    PubMed

    Peters, Susan; Vermeulen, Roel; Olsson, Ann; Van Gelder, Rainer; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Williams, Nick; Woldbæk, Torill; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Dahmann, Dirk; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2012-01-01

    SYNERGY is a large pooled analysis of case-control studies on the joint effects of occupational carcinogens and smoking in the development of lung cancer. A quantitative job-exposure matrix (JEM) will be developed to assign exposures to five major lung carcinogens [asbestos, chromium, nickel, polycyclic aromatic hydrocarbons (PAH), and respirable crystalline silica (RCS)]. We assembled an exposure database, called ExpoSYN, to enable such a quantitative exposure assessment. Existing exposure databases were identified and European and Canadian research institutes were approached to identify pertinent exposure measurement data. Results of individual air measurements were entered in anonymized form according to a standardized protocol. The ExpoSYN database currently includes 356 551 measurements from 19 countries. In total, 140 666 personal and 215 885 stationary data points were available. Measurements were distributed over the five agents as follows: RCS (42%), asbestos (20%), chromium (16%), nickel (15%), and PAH (7%). The measurement data cover the time period from 1951 to present. However, only a small portion of measurements (1.4%) were performed prior to 1975. The major contributing countries for personal measurements were Germany (32%), UK (22%), France (14%), and Norway and Canada (both 11%). ExpoSYN is a unique occupational exposure database with measurements from 18 European countries and Canada covering a time period of >50 years. This database will be used to develop a country-, job-, and time period-specific quantitative JEM. This JEM will enable data-driven quantitative exposure assessment in a multinational pooled analysis of community-based lung cancer case-control studies.

  2. Tandem mass spectrometry measurement of the collision products of carbamate anions derived from CO2 capture sorbents: paving the way for accurate quantitation.

    PubMed

    Jackson, Phil; Fisher, Keith J; Attalla, Moetaz Ibrahim

    2011-08-01

    The reaction between CO2 and aqueous amines to produce a charged carbamate product plays a crucial role in post-combustion capture chemistry when primary and secondary amines are used. In this paper, we report the low energy negative-ion CID results for several anionic carbamates derived from primary and secondary amines commonly used as post-combustion capture solvents. The study was performed using the modern equivalent of a triple quadrupole instrument equipped with a T-wave collision cell. Deuterium labeling of 2-aminoethanol (1,1,2,2-d4-2-aminoethanol) and computations at the M06-2X/6-311++G(d,p) level were used to confirm the identity of the fragmentation products for 2-hydroxyethylcarbamate (derived from 2-aminoethanol), in particular the ions CN−, NCO− and facile neutral losses of CO2 and water; there is precedent for the latter in condensed phase isocyanate chemistry. The fragmentations of 2-hydroxyethylcarbamate were generalized for carbamate anions derived from other capture amines, including ethylenediamine, diethanolamine, and piperazine. We also report unequivocal evidence for the existence of carbamate anions derived from sterically hindered amines (Tris(2-hydroxymethyl)aminomethane and 2-methyl-2-aminopropanol). For the suite of carbamates investigated, diagnostic losses include the decarboxylation product (−CO2, 44 mass units), loss of 46 mass units and the fragments NCO− (m/z 42) and CN− (m/z 26). We also report low energy CID results for the dicarbamate dianion (−O2CNHC2H4NHCO2−) commonly encountered in CO2 capture solutions utilizing ethylenediamine. Finally, we demonstrate a promising ion chromatography-MS based procedure for the separation and quantitation of aqueous anionic carbamates, which is based on the reported CID findings. The availability of accurate quantitation methods for ionic CO2 capture products could lead to dynamic operational tuning of CO2 capture-plants and, thus, cost-savings via real

  3. Quantitative photoacoustic characterization of blood clot in blood: A mechanobiological assessment through spectral information

    NASA Astrophysics Data System (ADS)

    Biswas, Deblina; Vasudevan, Srivathsan; Chen, George C. K.; Sharma, Norman

    2017-02-01

    Formation of blood clots, called thrombi, can occur due to hyper-coagulation of blood. Thrombi moving through blood vessels can impede blood flow, an important factor in many critical diseases such as deep vein thrombosis and heart attack. Understanding the mechanical properties of clot formation is vital for assessing the severity of thrombosis and for proper treatment. However, the biomechanics of thrombi is little known to clinicians and not well investigated. Photoacoustic (PA) spectral response, a non-invasive technique, is proposed to investigate the mechanism of blood clot formation through elasticity and also to differentiate clots from blood. A distinct shift (increase) of the dominant frequency of the PA response during clot formation is reported. In addition, quantitative differentiation of blood clots from blood has been achieved through parameters such as the dominant frequency and spectral energy of the PA spectral response. A nearly twofold increase in dominant frequency was found in blood clots compared to blood. Significant changes in energy also help to quantitatively differentiate clots from blood. Our results reveal that the increase in density during clot formation is reflected in the PA spectral response, a significant step towards understanding the mechanobiology of thrombus formation. Hence, the proposed tool, in addition to detecting thrombus formation, could reveal mechanical properties of the sample through quantitative photoacoustic spectral parameters.
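    The two spectral parameters used for differentiation, dominant frequency and spectral energy, can be extracted from a digitized PA time trace with a plain FFT. A minimal sketch (the synthetic signal and sampling rate are hypothetical, not from the study):

```python
import numpy as np

def pa_spectral_params(signal, fs):
    """Dominant frequency (Hz) and total spectral energy of a
    photoacoustic time trace via the real FFT."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    dominant = freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin
    energy = float(np.sum(spectrum ** 2))
    return dominant, energy

# synthetic 2 MHz tone sampled at 50 MHz
fs = 50e6
t = np.arange(1024) / fs
f_dom, energy = pa_spectral_params(np.sin(2 * np.pi * 2e6 * t), fs)
```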

  4. Exposure assessment in investigations of waterborne illness: a quantitative estimate of measurement error

    PubMed Central

    Jones, Andria Q; Dewey, Catherine E; Doré, Kathryn; Majowicz, Shannon E; McEwen, Scott A; Waltner-Toews, David

    2006-01-01

    Background Exposure assessment is typically the greatest weakness of epidemiologic studies of disinfection by-products (DBPs) in drinking water, which largely stems from the difficulty in obtaining accurate data on individual-level water consumption patterns and activity. Thus, surrogate measures for such waterborne exposures are commonly used. Little attention, however, has been directed towards formal validation of these measures. Methods We conducted a study in the City of Hamilton, Ontario (Canada) in 2001–2002, to assess the accuracy of two surrogate measures of home water source: (a) urban/rural status as assigned using residential postal codes, and (b) mapping of residential postal codes to municipal water systems within a Geographic Information System (GIS). We then assessed the accuracy of a commonly-used surrogate measure of an individual's actual drinking water source, namely, their home water source. Results The surrogates for home water source provided good classification of residents served by municipal water systems (approximately 98% predictive value), but did not perform well in classifying those served by private water systems (average: 63.5% predictive value). More importantly, we found that home water source was a poor surrogate measure of the individuals' actual drinking water source(s), being associated with high misclassification errors. Conclusion This study demonstrated substantial misclassification errors associated with a surrogate measure commonly used in studies of drinking water disinfection byproducts. Further, the limited accuracy of two surrogate measures of an individual's home water source warrants caution in their use in exposure classification methodology. While these surrogates are inexpensive and convenient, they should not be substituted for direct collection of accurate data pertaining to the subjects' waterborne disease exposure. In instances where such surrogates must be used, estimation of the misclassification and its

  5. Assessment of Renal Hemodynamics and Oxygenation by Simultaneous Magnetic Resonance Imaging (MRI) and Quantitative Invasive Physiological Measurements.

    PubMed

    Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas

    2016-01-01

    In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T2* and T2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not yet been systematically examined and remain to be established. For this purpose, the development of an integrative multimodality approach is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T2* (T2) mapping and MR-based kidney size measurements. Standardized reversible (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used for detailing the relation between the MR-PHYSIOL parameters, in particular between renal T2* and tissue oxygenation.

  6. Enantioselective reductive transformation of climbazole: A concept towards quantitative biodegradation assessment in anaerobic biological treatment processes.

    PubMed

    Brienza, Monica; Chiron, Serge

    2017-06-01

    An efficient chiral liquid chromatography-high resolution mass spectrometry analytical method has been validated for the determination of climbazole (CBZ) enantiomers in wastewater and sludge, with quantification limits below 1 ng/L and 2 ng/g, respectively. On the basis of this newly developed analytical method, the stereochemistry of CBZ was investigated over time in biotic and sterile sludge batch experiments under anoxic dark and light conditions and during wastewater biological treatment by subsurface flow constructed wetlands. CBZ stereoselective degradation was exclusively observed under biotic conditions, confirming the specificity of enantiomeric fraction variations to biodegradation processes. Abiotic CBZ enantiomerization was insignificant at circumneutral pH, and CBZ was always biotransformed into CBZ-alcohol due to the specific and enantioselective reduction of the ketone function of CBZ into a secondary alcohol function. This transformation was almost quantitative and biodegradation gave a good first-order kinetic fit for both enantiomers. The possibility of applying the Rayleigh equation to enantioselective CBZ biodegradation processes was investigated. The results of enantiomeric enrichment allowed for a quantitative assessment of in situ biodegradation processes due to a good fit (R² > 0.96) of the anoxic/anaerobic CBZ biodegradation to the Rayleigh dependency in all the biotic microcosms, and the approach was also applied in subsurface flow constructed wetlands. This work extended the concept of applying the Rayleigh equation towards quantitative biodegradation assessment of organic contaminants to enantioselective processes operating under anoxic/anaerobic conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
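    Applying the Rayleigh model to enantioselective degradation means relating the change in enantiomeric ratio to the fraction of substrate remaining. A hedged sketch of the inversion step (the enantioselectivity factor and ratio values below are hypothetical, not values from the paper):

```python
def biodegraded_fraction(er_t, er_0, alpha):
    """Extent of biodegradation B = 1 - f from the Rayleigh relation
    R_t / R_0 = f ** (alpha - 1), where R is the enantiomeric ratio,
    f the remaining substrate fraction, and alpha the enantioselectivity
    factor (alpha != 1)."""
    f_remaining = (er_t / er_0) ** (1.0 / (alpha - 1.0))
    return 1.0 - f_remaining

# enantiomeric ratio dropped to 80% of its initial value; assumed alpha = 1.2
b = biodegraded_fraction(er_t=0.8, er_0=1.0, alpha=1.2)
```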

  7. The metatarsosesamoid joint: an in vitro 3D quantitative assessment.

    PubMed

    Jamal, Bilal; Pillai, Anand; Fogg, Quentin; Kumar, Senthil

    2015-03-01

    The anatomy of the first metatarsophalangeal (MTP) joint, particularly the metatarsosesamoid articulation, remains poorly understood. Our goal was to quantitatively define the excursion of the sesamoids. Seven cadavers were dissected to assess the articulating surfaces throughout a normal range of motion. The dissections were digitally reconstructed in various positions using a MicroScribe. For the first MTP joint, excursion averaged 14.7 mm for the tibial sesamoid in the sagittal plane and 7.5 mm for the fibular sesamoid. The sesamoids also moved from medial to lateral when the joint was dorsiflexed. In the maximally dorsiflexed joint, excursion averaged 2.8 mm for the tibial sesamoid and 3.5 mm for the fibular sesamoid. Hallucal sesamoids appear to have differential tracking: the tibial sesamoid has greater longitudinal excursion; the fibular sesamoid has greater lateral excursion. The anatomical data will interest those involved with the design of an effective hallux arthroplasty. Copyright © 2014 European Foot and Ankle Society. Published by Elsevier Ltd. All rights reserved.

  8. Development and validation of broad-range qualitative and clade-specific quantitative molecular probes for assessing mercury methylation in the environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christensen, Geoff A.; Wymore, Ann M.; King, Andrew J.

    Two genes, hgcA and hgcB, are essential for microbial mercury (Hg)-methylation. Detection and estimation of their abundance, in conjunction with Hg concentration, bioavailability and biogeochemistry, is critical in determining potential hot spots of methylmercury (MeHg) generation in at-risk environments. We developed broad-range degenerate PCR primers spanning known hgcAB genes to determine the presence of both genes in diverse environments. These primers were tested against an extensive set of pure cultures with published genomes, including 13 Deltaproteobacteria, nine Firmicutes, and nine methanogenic Archaea. A distinct PCR product at the expected size was confirmed for all hgcAB+ strains tested via Sanger sequencing. Additionally, we developed clade-specific degenerate quantitative (qPCR) primers that targeted hgcA for each of the three dominant Hg-methylating clades. The clade-specific qPCR primers amplified hgcA from 64%, 88% and 86% of tested pure cultures of Deltaproteobacteria, Firmicutes and Archaea, respectively, and were highly specific for each clade. Amplification efficiencies and detection limits were quantified for each organism. Primer sensitivity varied among species based on sequence conservation. Finally, to begin to evaluate the utility of our primer sets in nature, we tested hgcA and hgcAB recovery from pure cultures spiked into sand and soil. These novel quantitative molecular tools designed in this study will allow for more accurate identification and quantification of the individual Hg-methylating groups of microorganisms in the environment. The resulting data will be essential in developing accurate and robust predictive models of Hg-methylation potential, ideally integrating the geochemistry of Hg methylation with the microbiology and genetics of hgcAB.
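    The amplification efficiencies quantified for each organism are conventionally derived from the slope of a standard curve (Cq versus log10 template copies). A minimal sketch of that relation (the slope values are illustrative):

```python
def qpcr_efficiency(slope):
    """Per-cycle amplification efficiency from the slope of a
    Cq vs. log10(copies) standard curve; a slope of -3.32
    corresponds to ~100% efficiency (perfect doubling)."""
    return 10 ** (-1.0 / slope) - 1.0

eff_ideal = qpcr_efficiency(-3.32)   # close to 1.0, i.e. ~100% efficient
eff_poor = qpcr_efficiency(-3.6)     # a shallower slope gives lower efficiency
```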

  10. Epidemiological survey of quantitative ultrasound in risk assessment of falls in middle-aged and elderly people.

    PubMed

    Ou, Ling-Chun; Sun, Zih-Jie; Chang, Yin-Fan; Chang, Chin-Sung; Chao, Ting-Hsing; Kuo, Po-Hsiu; Lin, Ruey-Mo; Wu, Chih-Hsing

    2013-01-01

    The risk assessment of falls is important but still unsatisfactory and time-consuming. Our objective was to assess quantitative ultrasound (QUS) in the risk assessment of falls. This epidemiological cross-sectional study was conducted from March 2009 to February 2010 as a community survey by a medical center. Participants were a systematic sample of 1,200 community-dwelling people (524 men, 676 women) aged 40 years and over in Yunlin County, Mid-Taiwan. Structured questionnaires covering socioeconomic status, living status, smoking and drinking habits, exercise and medical history were completed. Quantitative ultrasound at the non-dominant distal radial area (QUS-R) and the left calcaneal area (QUS-C) was measured. The overall prevalence of falls was 19.8%. In men, the factors independently associated with falls were age (OR: 1.04; 95%CI: 1.01~1.06), fracture history (OR: 1.89; 95%CI: 1.12~3.19), osteoarthritis history (OR: 3.66; 95%CI: 1.15~11.64) and speed of sound (OR: 0.99; 95%CI: 0.99~1.00; p<0.05) by QUS-R. In women, the factors independently associated with falls were current drinking (OR: 3.54; 95%CI: 1.35~9.31) and broadband ultrasound attenuation (OR: 0.98; 95%CI: 0.97~0.99; p<0.01) by QUS-C. The cutoffs at -2.5 < T-score < -1 derived using QUS-R (OR: 2.85; 95%CI: 1.64~4.96; p<0.01) in men or T-score ≦ -2.5 derived using QUS-C (OR: 2.72; 95%CI: 1.42~5.21; p<0.01) in women showed an independent association with falls. The lowest T-score derived using either QUS-R or QUS-C was also an independent factor for falls in both men (OR: 2.13; 95%CI: 1.03~4.43; p<0.05) and women (OR: 2.36; 95%CI: 1.13~4.91; p<0.05). Quantitative ultrasound, measured either at the radial or calcaneal area, is a convenient tool with which to assess the risk of falls in middle-aged and elderly people.
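
    The odds ratios and confidence intervals quoted above come from logistic regression: OR = exp(β) and the Wald 95% CI is exp(β ± 1.96·SE). A sketch of that relation; the coefficient and standard error below are illustrative values chosen to reproduce the reported current-drinking estimate, not the study's actual fit:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Odds ratio and Wald 95% confidence interval from a fitted
    logistic-regression coefficient and its standard error."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# beta and SE are illustrative, chosen to match the reported drinking OR
or_est, ci_lo, ci_hi = odds_ratio_ci(math.log(3.54), 0.493)
print(round(or_est, 2), round(ci_lo, 2), round(ci_hi, 2))
```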

  11. Treating knee pain: history taking and accurate diagnoses.

    PubMed

    Barratt, Julian

    2010-07-01

    Prompt and effective diagnosis and treatment for common knee problems depend on practitioners' ability to distinguish between traumatic and inflammatory knee conditions. This article aims to enable practitioners to make accurate assessments, carry out knee examinations and undertake selected special tests as necessary before discharging or referring patients.

  12. Quantitative echocardiographic measures in the assessment of single ventricle function post-Fontan: Incorporation into routine clinical practice.

    PubMed

    Rios, Rodrigo; Ginde, Salil; Saudek, David; Loomba, Rohit S; Stelter, Jessica; Frommelt, Peter

    2017-01-01

    Quantitative echocardiographic measurements of single ventricular (SV) function have not been incorporated into routine clinical practice. A clinical protocol, which included quantitative measurements of SV deformation (global circumferential and longitudinal strain and strain rate), standard deviation of time to peak systolic strain, myocardial performance index (MPI), dP/dT from an atrioventricular valve regurgitant jet, and superior mesenteric artery resistance index, was instituted for all patients with a history of Fontan procedure undergoing echocardiography. All measures were performed in real time during clinically indicated studies and were included in clinical reports. A total of 100 consecutive patients (mean age = 11.95±6.8 years, range 17 months-31.3 years) completed the protocol between September 1, 2014 and April 29, 2015. Deformation measures were completed in 100% of the studies, MPI in 93%, dP/dT in 55%, and superior mesenteric artery Doppler in 82%. The studies were reviewed to assess for efficiency in completing the protocol. The average time for image acquisition was 27.4±8.8 minutes (range 10-62 minutes). The average time to perform deformation measures was 10.8±5.5 minutes (range 5-35 minutes), and the time from the beginning of imaging to report completion was 53.4±13.7 minutes (range 27-107 minutes). There was excellent inter-observer reliability when deformation indices were blindly repeated. Patients with a single left ventricle had significantly higher circumferential strain and strain rate, longitudinal strain and strain rate, and dP/dT compared to those with a single right ventricle. There were no differences in quantitative indices of ventricular function between patients <10 vs. >10 years post-Fontan. Advanced quantitative assessment of SV function post-Fontan can be consistently and efficiently performed in real time during clinically indicated echocardiograms with excellent reliability. © 2016, Wiley Periodicals, Inc.

  13. Can cancer researchers accurately judge whether preclinical reports will reproduce?

    PubMed Central

    Mandel, David R.; Kimmelman, Jonathan

    2017-01-01

    There is vigorous debate about the reproducibility of research findings in cancer biology. Whether scientists can accurately assess which experiments will reproduce original findings is important to determining the pace at which science self-corrects. We collected forecasts from basic and preclinical cancer researchers on the first 6 replication studies conducted by the Reproducibility Project: Cancer Biology (RP:CB) to assess the accuracy of expert judgments on specific replication outcomes. On average, researchers forecasted a 75% probability of replicating the statistical significance and a 50% probability of replicating the effect size, yet none of these studies successfully replicated on either criterion (for the 5 studies with results reported). Accuracy was related to expertise: experts with higher h-indices were more accurate, whereas experts with more topic-specific expertise were less accurate. Our findings suggest that experts, especially those with specialized knowledge, were overconfident about the RP:CB replicating individual experiments within published reports; researcher optimism likely reflects a combination of overestimating the validity of original studies and underestimating the difficulties of repeating their methodologies. PMID:28662052
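
    One standard way to score probabilistic forecasts like these against binary replication outcomes is the Brier score, the mean squared difference between forecast probability and outcome. The abstract does not name its scoring rule, so the calculation below is purely illustrative of how badly a uniform 75% forecast fares when 0 of 5 studies replicate:

```python
def brier_score(probabilities, outcomes):
    """Mean squared difference between forecast probabilities and the
    binary outcomes (0 = failed to replicate, 1 = replicated)."""
    return sum((p - o) ** 2
               for p, o in zip(probabilities, outcomes)) / len(outcomes)

# Average 75% forecast of replicating significance vs. 0/5 studies replicating
forecasts = [0.75] * 5
observed = [0, 0, 0, 0, 0]
print(brier_score(forecasts, observed))   # 0.5625 (0.0 would be perfect)
```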

  14. Facial asymmetry quantitative evaluation in oculoauriculovertebral spectrum.

    PubMed

    Manara, Renzo; Schifano, Giovanni; Brotto, Davide; Mardari, Rodica; Ghiselli, Sara; Gerunda, Antonio; Ghirotto, Cristina; Fusetti, Stefano; Piacentile, Katherine; Scienza, Renato; Ermani, Mario; Martini, Alessandro

    2016-03-01

    Facial asymmetries in oculoauriculovertebral spectrum (OAVS) patients might require surgical corrections that are mostly based on a qualitative approach and the surgeon's experience. The present study aimed to develop a quantitative 3D CT imaging-based procedure suitable for maxillo-facial surgery planning in OAVS patients. Thirteen OAVS patients (mean age 3.5 ± 4.0 years; range 0.2-14.2, 6 females) and 13 controls (mean age 7.1 ± 5.3 years; range 0.6-15.7, 5 females) who underwent head CT examination were retrospectively enrolled. Eight bilateral anatomical facial landmarks were defined on 3D CT images (porion, orbitale, most anterior point of the frontozygomatic suture, most superior point of the temporozygomatic suture, most posterior-lateral point of the maxilla, gonion, condylion, mental foramen), and distance from orthogonal planes (in millimeters) was used to evaluate the asymmetry on each axis and to calculate a global asymmetry index for each anatomical landmark. Mean asymmetry values and relative confidence intervals were obtained from the control group. OAVS patients showed 2.5 ± 1.8 landmarks above the confidence interval when considering the global asymmetry values; 12 patients (92%) showed at least one pathologically asymmetric landmark. Considering each axis, the mean number of pathologically asymmetric landmarks increased to 5.5 ± 2.6 (p = 0.002) and all patients presented at least one significant landmark asymmetry. Modern CT-based 3D reconstructions allow accurate assessment of facial bone asymmetries in patients affected by OAVS. Evaluation as a global score and along different orthogonal axes provides precise quantitative data suitable for maxillo-facial surgical planning. CT-based 3D reconstruction might thus allow a quantitative approach to the planning and follow-up of maxillo-facial surgery in OAVS patients.
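
    The per-axis and global asymmetry measures can be sketched for a single bilateral landmark pair: mirror one side across the midsagittal plane, take the absolute per-axis differences, and combine them into a single index. The Euclidean-norm combination below is an assumption for illustration, not necessarily the paper's exact formula:

```python
import math

def landmark_asymmetry(left, right):
    """Per-axis and global asymmetry for one bilateral landmark pair.

    Coordinates are (x, y, z) with the midsagittal plane at x = 0; the
    right-side landmark is mirrored (x -> -x) before comparison. Per-axis
    asymmetry is the absolute difference on each axis; the global index
    here is their Euclidean norm (an illustrative assumption).
    """
    mirrored = (-right[0], right[1], right[2])
    per_axis = tuple(abs(l - m) for l, m in zip(left, mirrored))
    return per_axis, math.sqrt(sum(d * d for d in per_axis))

# Hypothetical landmark coordinates in millimetres
per_axis, global_idx = landmark_asymmetry((42.0, 10.0, -5.0), (-40.0, 13.0, -5.0))
print(per_axis, round(global_idx, 2))   # (2.0, 3.0, 0.0) and sqrt(13) ~ 3.61
```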

  15. Purity assessment of organic calibration standards using a combination of quantitative NMR and mass balance.

    PubMed

    Davies, Stephen R; Jones, Kai; Goldys, Anna; Alamgir, Mahuiddin; Chan, Benjamin K H; Elgindy, Cecile; Mitchell, Peter S R; Tarrant, Gregory J; Krishnaswami, Maya R; Luo, Yawen; Moawad, Michael; Lawes, Douglas; Hook, James M

    2015-04-01

    Quantitative NMR spectroscopy (qNMR) has been examined for purity assessment using a range of organic calibration standards of varying structural complexities, certified using the traditional mass balance approach. Demonstrated equivalence between the two independent purity values confirmed the accuracy of qNMR and highlighted the benefit of using both methods in tandem to minimise the potential for hidden bias, thereby conferring greater confidence in the overall purity assessment. A comprehensive approach to purity assessment is detailed, utilising, where appropriate, multiple peaks in the qNMR spectrum, chosen on the basis of scientific reason and statistical analysis. Two examples are presented in which differences between the purity assignment by qNMR and mass balance are addressed in different ways depending on the requirement of the end user, affording fit-for-purpose calibration standards in a cost-effective manner.
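
    The qNMR purity assignment rests on the proportionality between integrated signal area per proton and molar amount: with an internal standard, analyte purity follows from peak areas, proton counts, molar masses, weighed masses, and the standard's certified purity. A sketch of that standard relation (the numbers are illustrative, not from the paper):

```python
def qnmr_purity(i_a, i_std, n_a, n_std, m_a, m_std, w_a, w_std, p_std):
    """Analyte purity (mass fraction) by internal-standard qNMR:
    the ratio of area-per-proton signals scaled by molar masses,
    weighed masses and the certified purity of the standard."""
    return (i_a / i_std) * (n_std / n_a) * (m_a / m_std) * (w_std / w_a) * p_std

# Illustrative numbers only (hypothetical analyte vs. a certified standard)
p = qnmr_purity(i_a=1.000, i_std=0.500, n_a=2, n_std=2,
                m_a=232.2, m_std=116.1, w_a=40.0, w_std=10.0, p_std=0.999)
print(round(p, 3))
```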

  16. Quantitative assessment of developmental levels in overarm throwing using wearable inertial sensing technology.

    PubMed

    Grimpampi, Eleni; Masci, Ilaria; Pesce, Caterina; Vannozzi, Giuseppe

    2016-09-01

    Motor proficiency in childhood has recently been recognised as a public health determinant, having a potential impact on the physical activity level and possible sedentary behaviour of the child later in life. Among fundamental motor skills, ballistic skills assessment based on in-field quantitative observations is increasingly needed in the motor development community. The aim of this study was to propose an in-field quantitative approach to identify different developmental levels in overarm throwing. Fifty-eight children aged 5-10 years performed an overarm throwing task while wearing three inertial sensors located at the wrist, trunk and pelvis and were then categorised using a developmental sequence of overarm throwing. A set of biomechanical parameters was defined and analysed using multivariate statistics to evaluate whether they can serve as developmental indicators. Trunk and pelvis angular velocities and time durations before ball release showed increasing or decreasing trends with increasing developmental level. Significant differences between developmental level pairs were observed for selected biomechanical parameters. The results support the suitability and feasibility of objective developmental measures in ecological learning contexts, suggesting their potential to support motor learning experiences in educational and youth sports training settings.

  17. Quantitative measurement of marginal disintegration of ceramic inlays.

    PubMed

    Hayashi, Mikako; Tsubakimoto, Yuko; Takeshige, Fumio; Ebisu, Shigeyuki

    2004-01-01

    The objectives of this study were to establish a method for quantitative measurement of marginal change in ceramic inlays and to clarify their marginal disintegration in vivo. An accurate CCD optical laser scanner system was used for morphological measurement of the marginal change of ceramic inlays. The accuracy of the CCD measurement was assessed by comparing it with microscopic measurement. Replicas of 15 premolars restored with Class II ceramic inlays at the time of placement and eight years after restoration were used for morphological measurement by means of the CCD laser scanner system. Occlusal surfaces of the restored teeth were scanned and cross-sections of marginal areas were computed with software. Marginal change was defined as the area enclosed by two profiles obtained by superimposing two cross-sections of the same location at the two different times, expressed as the maximum depth and mean area of the enclosed region. The accuracy of this method of measurement was 4.3 ± 3.2 μm in distance and 2.0 ± 0.6% in area. Quantitative marginal changes over the eight-year period were 10 × 10 μm in depth and 50 × 10³ μm² in area at the functional cusp area, and 7 × 10 μm in depth and 28 × 10³ μm² in area at the non-functional cusp area. Marginal disintegration at the functional cusp area was significantly greater than at the non-functional cusp area (Wilcoxon signed-ranks test, p < 0.05). This study constitutes a quantitative measurement of in vivo deterioration in marginal adaptation of ceramic inlays and indicates that occlusal force may accelerate marginal disintegration.
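
    The marginal-change metric, the area enclosed between two superimposed cross-section profiles plus the maximum depth, can be sketched with simple trapezoidal integration of the height difference. This is a minimal illustration of the metric's definition, not the authors' measurement software:

```python
def marginal_change(xs, profile_t0, profile_t8):
    """Enclosed area and maximum depth between two superimposed
    cross-section profiles (trapezoidal integration of the height
    difference between placement and follow-up)."""
    diffs = [a - b for a, b in zip(profile_t0, profile_t8)]
    area = sum((diffs[i] + diffs[i + 1]) / 2.0 * (xs[i + 1] - xs[i])
               for i in range(len(xs) - 1))
    return area, max(diffs)

xs = [0.0, 10.0, 20.0, 30.0, 40.0]      # lateral position, micrometres
t0 = [0.0, 0.0, 0.0, 0.0, 0.0]          # profile at placement
t8 = [0.0, -4.0, -10.0, -4.0, 0.0]      # profile eight years later
area, depth = marginal_change(xs, t0, t8)
print(area, depth)   # 180.0 square micrometres enclosed, 10.0 micrometres deep
```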

  18. Quantitative MRI and strength measurements in the assessment of muscle quality in Duchenne muscular dystrophy.

    PubMed

    Wokke, B H; van den Bergen, J C; Versluis, M J; Niks, E H; Milles, J; Webb, A G; van Zwet, E W; Aartsma-Rus, A; Verschuuren, J J; Kan, H E

    2014-05-01

    The purpose of this study was to assess leg muscle quality and give a detailed description of leg muscle involvement in a series of Duchenne muscular dystrophy patients using quantitative MRI and strength measurements. Fatty infiltration, as well as total and contractile (not fatty infiltrated) cross sectional areas of various leg muscles were determined in 16 Duchenne patients and 11 controls (aged 8-15). To determine specific muscle strength, four leg muscle groups (quadriceps femoris, hamstrings, anterior tibialis and triceps surae) were measured and related to the amount of contractile tissue. In patients, the quadriceps femoris showed decreased total and contractile cross sectional area, attributable to muscle atrophy. The total, but not the contractile, cross sectional area of the triceps surae was increased in patients, corresponding to hypertrophy. Specific strength decreased in all four muscle groups of Duchenne patients, indicating reduced muscle quality. This suggests that muscle hypertrophy and fatty infiltration are two distinct pathological processes, differing between muscle groups. Additionally, the quality of remaining muscle fibers is severely reduced in the legs of Duchenne patients. The combination of quantitative MRI and quantitative muscle testing could be a valuable outcome parameter in longitudinal studies and in the follow-up of therapeutic effects. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. An analytic model for accurate spring constant calibration of rectangular atomic force microscope cantilevers.

    PubMed

    Li, Rui; Ye, Hongfei; Zhang, Weisheng; Ma, Guojun; Su, Yewang

    2015-10-29

    Spring constant calibration of the atomic force microscope (AFM) cantilever is of fundamental importance for quantifying the force between the AFM cantilever tip and the sample. Calibration within the framework of thin plate theory undoubtedly has higher accuracy and broader scope than calibration within the well-established beam theory. However, accurate analytic determination of the constant based on thin plate theory has been perceived as an extremely difficult problem. In this paper, we implement thin plate theory-based analytic modeling of the static behavior of rectangular AFM cantilevers, which reveals that the three-dimensional effect and the Poisson effect play important roles in accurate determination of the spring constants. A quantitative scaling law is found: the normalized spring constant depends only on the Poisson's ratio, normalized dimension and normalized load coordinate. Both the literature and our refined finite element model validate the present results. The developed model is expected to serve as the benchmark for accurate calibration of rectangular AFM cantilevers.
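
    For context, the well-established beam-theory baseline the paper improves upon is the Euler-Bernoulli formula k = Ewt³/(4L³); the plate-theory model adds the three-dimensional and Poisson corrections this formula omits. A sketch of the baseline with typical (illustrative) silicon cantilever dimensions:

```python
def beam_spring_constant(E, w, t, L):
    """Euler-Bernoulli beam-theory spring constant of a rectangular
    cantilever, k = E*w*t**3 / (4*L**3); the paper's plate-theory model
    adds the three-dimensional and Poisson corrections this omits."""
    return E * w * t ** 3 / (4.0 * L ** 3)

# Typical silicon lever (illustrative): E = 169 GPa, 30 um x 2 um x 200 um
k = beam_spring_constant(E=169e9, w=30e-6, t=2e-6, L=200e-6)
print(k)   # on the order of 1 N/m
```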

  20. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    PubMed

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and
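
    The net reclassification improvement quoted above credits upward risk-category moves for patients who had events and downward moves for those who did not. A sketch of the categorical NRI calculation; the reclassification counts below are hypothetical, not taken from the study:

```python
def net_reclassification_improvement(events_up, events_down, n_events,
                                     nonevents_up, nonevents_down, n_nonevents):
    """Categorical NRI: risk-category moves upward are credited for
    patients with events, moves downward for patients without events."""
    return ((events_up - events_down) / n_events
            + (nonevents_down - nonevents_up) / n_nonevents)

# Hypothetical reclassification counts for adding quantitative ischemic burden
nri = net_reclassification_improvement(events_up=8, events_down=4, n_events=52,
                                       nonevents_up=20, nonevents_down=30,
                                       n_nonevents=343)
print(round(nri, 3))
```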

  1. Assessing Microneurosurgical Skill with Medico-Engineering Technology.

    PubMed

    Harada, Kanako; Morita, Akio; Minakawa, Yoshiaki; Baek, Young Min; Sora, Shigeo; Sugita, Naohiko; Kimura, Toshikazu; Tanikawa, Rokuya; Ishikawa, Tatsuya; Mitsuishi, Mamoru

    2015-10-01

    Most methods currently used to assess surgical skill are rather subjective or not adequate for microneurosurgery. Objective and quantitative microneurosurgical skill assessment systems capable of accurate measurements are necessary for the further development of microneurosurgery. Infrared optical motion-tracking markers, an inertial measurement unit, and strain gauges were mounted on tweezers to measure many parameters related to instrument manipulation. We then recorded the activity of 23 neurosurgeons. The task completion time, tool path, and needle-gripping force were evaluated for three stitches made in an anastomosis of 0.7-mm artificial blood vessels. Videos of the activity were evaluated by three blinded expert surgeons. Surgeons who had recently performed many bypass procedures demonstrated better skills. These skilled surgeons performed the anastomosis in a shorter time, with a shorter tool path, and with less force when extracting the needle. These results show the potential contribution of the system to microsurgical skill assessment. Quantitative and detailed analysis of surgical tasks helps surgeons better understand the key features of the required skills. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
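
    The "tool path" metric is simply the cumulative 3-D distance travelled by the tracked instrument tip between consecutive motion-capture samples. A minimal sketch (the sample coordinates are illustrative):

```python
import math

def tool_path_length(positions):
    """Total 3-D path length of the tracked instrument tip: the sum of
    straight-line distances between consecutive motion-capture samples."""
    return sum(math.dist(p, q) for p, q in zip(positions, positions[1:]))

# Illustrative samples (millimetres): two perpendicular 5 mm and 12 mm moves
track = [(0.0, 0.0, 0.0), (3.0, 4.0, 0.0), (3.0, 4.0, 12.0)]
print(tool_path_length(track))   # 5.0 + 12.0 = 17.0
```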

  2. Quantitative self-assembly prediction yields targeted nanomedicines

    NASA Astrophysics Data System (ADS)

    Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.

    2018-02-01

    Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.

  3. Smile line assessment comparing quantitative measurement and visual estimation.

    PubMed

    Van der Geld, Pieter; Oosterveld, Paul; Schols, Jan; Kuijpers-Jagtman, Anne Marie

    2011-02-01

    Esthetic analysis of dynamic functions such as spontaneous smiling is feasible by using digital videography and computer measurement for lip line height and tooth display. Because quantitative measurements are time-consuming, digital videography and semiquantitative (visual) estimation according to a standard categorization are more practical for regular diagnostics. Our objective in this study was to compare 2 semiquantitative methods with quantitative measurements for reliability and agreement. The faces of 122 male participants were individually registered by using digital videography. Spontaneous and posed smiles were captured. On the records, maxillary lip line heights and tooth display were digitally measured on each tooth and also visually estimated according to 3-grade and 4-grade scales. Two raters were involved. An error analysis was performed. Reliability was established with kappa statistics. Interexaminer and intraexaminer reliability values were high, with median kappa values from 0.79 to 0.88. Agreement of the 3-grade scale estimation with quantitative measurement showed higher median kappa values (0.76) than the 4-grade scale estimation (0.66). Differentiating high and gummy smile lines (4-grade scale) resulted in greater inaccuracies. The estimation of a high, average, or low smile line for each tooth showed high reliability close to quantitative measurements. Smile line analysis can be performed reliably with a 3-grade scale (visual) semiquantitative estimation. For a more comprehensive diagnosis, additional measuring is proposed, especially in patients with disproportional gingival display. Copyright © 2011 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
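
    The kappa values reported above measure agreement corrected for chance. A sketch of Cohen's kappa from an inter-rater confusion matrix; the counts below are illustrative, not the study's data:

```python
def cohens_kappa(confusion):
    """Cohen's kappa from a square inter-rater confusion matrix:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    total = sum(sum(row) for row in confusion)
    observed = sum(confusion[i][i] for i in range(len(confusion))) / total
    row_tot = [sum(row) for row in confusion]
    col_tot = [sum(row[j] for row in confusion) for j in range(len(confusion))]
    expected = sum(r * c for r, c in zip(row_tot, col_tot)) / total ** 2
    return (observed - expected) / (1.0 - expected)

# Two raters grading 100 smile lines on a 3-grade scale (illustrative counts)
m = [[25, 5, 0],
     [5, 30, 5],
     [0, 5, 25]]
print(round(cohens_kappa(m), 3))   # substantial agreement, ~0.7
```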

  4. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power, for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays, for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high-dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question: what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low-dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  5. Quantitative assessment of cancer cell morphology and motility using telecentric digital holographic microscopy and machine learning.

    PubMed

    Lam, Van K; Nguyen, Thanh C; Chung, Byung M; Nehmetallah, George; Raub, Christopher B

    2018-03-01

    The noninvasive, fast acquisition of quantitative phase maps using digital holographic microscopy (DHM) allows tracking of rapid cellular motility on transparent substrates. On two-dimensional surfaces in vitro, MDA-MB-231 cancer cells assume several morphologies related to the mode of migration and substrate stiffness, relevant to mechanisms of cancer invasiveness in vivo. The quantitative phase information from DHM may accurately classify adhesive cancer cell subpopulations with clinical relevance. To test this, cells from the invasive breast cancer MDA-MB-231 cell line were cultured on glass, tissue-culture treated polystyrene, and collagen hydrogels, and imaged with DHM followed by epifluorescence microscopy after staining F-actin and nuclei. Trends in cell phase parameters were tracked on the different substrates, during cell division, and during matrix adhesion, relating them to F-actin features. Support vector machine learning algorithms were trained and tested using parameters from holographic phase reconstructions and cell geometric features from conventional phase images, and used to distinguish between elongated and rounded cell morphologies. DHM was able to distinguish between elongated and rounded morphologies of MDA-MB-231 cells with 94% accuracy, compared to 83% accuracy using cell geometric features from conventional brightfield microscopy. This finding indicates the potential of DHM to detect and monitor cancer cell morphologies relevant to cell cycle phase status, substrate adhesion, and motility. © 2017 International Society for Advancement of Cytometry.
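
    The elongated-vs-rounded classification step can be sketched with a tiny linear classifier trained by the SVM hinge update (unit margin, no regularisation). This is a minimal, dependency-free stand-in for the study's support vector machine, and the two features and their values are hypothetical:

```python
def train_hinge_classifier(samples, labels, epochs=2000, lr=0.1):
    """Linear classifier trained with the SVM hinge update (unit margin,
    no regularisation) -- a minimal stand-in for a support vector
    machine, not the authors' implementation."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):        # labels are -1 or +1
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) < 1:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Hypothetical per-cell features: (aspect ratio, mean phase thickness)
X = [(3.0, 0.8), (2.5, 0.9), (2.8, 0.7),     # elongated -> +1
     (1.1, 1.6), (1.0, 1.8), (1.2, 1.5)]     # rounded   -> -1
y = [1, 1, 1, -1, -1, -1]
w, b = train_hinge_classifier(X, y)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1 for x in X]
print(preds)
```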

  6. Quantitative fluorescence tomography using a trimodality system: in vivo validation

    PubMed Central

    Lin, Yuting; Barber, William C.; Iwanczyk, Jan S.; Roeck, Werner W.; Nalcioglu, Orhan; Gulsen, Gultekin

    2010-01-01

    A fully integrated trimodality fluorescence, diffuse optical, and x-ray computed tomography (FT/DOT/XCT) system for small animal imaging is reported in this work. The main purpose of this system is to obtain quantitatively accurate fluorescence concentration images using a multimodality approach. XCT offers anatomical information, while DOT provides the necessary background optical property map to improve FT image accuracy. The quantitative accuracy of this trimodality system is demonstrated in vivo. In particular, we show that a 2-mm-diam fluorescence inclusion located 8 mm deep in a nude mouse can only be localized when functional a priori information from DOT is available. However, the error in the recovered fluorophore concentration is nearly 87%. On the other hand, the fluorophore concentration can be accurately recovered within 2% error when both DOT functional and XCT structural a priori information are utilized together to guide and constrain the FT reconstruction algorithm. PMID:20799770

  7. Machine Learning of Accurate Energy-Conserving Molecular Force Fields

    NASA Astrophysics Data System (ADS)

    Chmiela, Stefan; Tkatchenko, Alexandre; Sauceda, Huziel; Poltavsky, Igor; Schütt, Kristof; Müller, Klaus-Robert; GDML Collaboration

    Efficient and accurate access to the Born-Oppenheimer potential energy surface (PES) is essential for long time scale molecular dynamics (MD) simulations. Using conservation of energy - a fundamental property of closed classical and quantum mechanical systems - we develop an efficient gradient-domain machine learning (GDML) approach to construct accurate molecular force fields using a restricted number of samples from ab initio MD (AIMD) trajectories. The GDML implementation is able to reproduce global potential-energy surfaces of intermediate-size molecules with an accuracy of 0.3 kcal/mol for energies and 1 kcal/mol/Å for atomic forces using only 1000 conformational geometries for training. We demonstrate this accuracy for AIMD trajectories of molecules including benzene, toluene, naphthalene, malonaldehyde, ethanol, uracil, and aspirin. The challenge of constructing conservative force fields is accomplished in our work by learning in a Hilbert space of vector-valued functions that obey the law of energy conservation. The GDML approach enables quantitative MD simulations for molecules at a fraction of the cost of explicit AIMD calculations, thereby allowing the construction of efficient force fields with the accuracy and transferability of high-level ab initio methods.
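
    The consistency GDML builds in by learning in the gradient domain is F = -dE/dx: predicted forces are the exact negative gradient of one energy model. A one-dimensional finite-difference check of that relation on a toy harmonic potential (purely illustrative, not the GDML kernel machinery):

```python
def numeric_force(energy, x, h=1e-6):
    """Force as the negative centred finite difference of a 1-D energy;
    F = -dE/dx is the consistency a conservative force field must obey."""
    return -(energy(x + h) - energy(x - h)) / (2.0 * h)

# Toy harmonic potential E(x) = 0.5*k*x^2, so the exact force is -k*x
k = 4.0
E = lambda x: 0.5 * k * x * x
print(numeric_force(E, 1.5))   # close to the analytic value -6.0
```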

  8. Laboratory assessment of Activated Protein C Resistance/Factor V-Leiden and performance characteristics of a new quantitative assay.

    PubMed

    Amiral, Jean; Vissac, Anne Marie; Seghatchian, Jerard

    2017-12-01

therapies, and has no interference with lupus anticoagulant (LA). This new assay for Factor V-Leiden can be easily used in any coagulation laboratory, is performed as a single test, and is quantitative. The assay is robust and accurate, with good intra-assay (<3%) and inter-assay (<5%) variability. It contributes to solving most of the laboratory issues faced when testing Factor V-Leiden. Quantitation of Factor V-Leiden could contribute to a better assessment of thrombotic risk in affected patients, as this complication is primarily associated with, and caused by, the presence of a defined amount of FVa. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. A Quantitative Climate-Match Score for Risk-Assessment Screening of Reptile and Amphibian Introductions

    NASA Astrophysics Data System (ADS)

    van Wilgen, Nicola J.; Roura-Pascual, Núria; Richardson, David M.

    2009-09-01

    Assessing climatic suitability provides a good preliminary estimate of the invasive potential of a species to inform risk assessment. We examined two approaches for bioclimatic modeling for 67 reptile and amphibian species introduced to California and Florida. First, we modeled the worldwide distribution of the biomes found in the introduced range to highlight similar areas worldwide from which invaders might arise. Second, we modeled potentially suitable environments for species based on climatic factors in their native ranges, using three sources of distribution data. Performance of the three datasets and both approaches were compared for each species. Climate match was positively correlated with species establishment success (maximum predicted suitability in the introduced range was more strongly correlated with establishment success than mean suitability). Data assembled from the Global Amphibian Assessment through NatureServe provided the most accurate models for amphibians, while ecoregion data compiled by the World Wide Fund for Nature yielded models which described reptile climatic suitability better than available point-locality data. We present three methods of assigning a climate-match score for use in risk assessment using both the mean and maximum climatic suitabilities. Managers may choose to use different methods depending on the stringency of the assessment and the available data, facilitating higher resolution and accuracy for herpetofaunal risk assessment. Climate-matching has inherent limitations and other factors pertaining to ecological interactions and life-history traits must also be considered for thorough risk assessment.

  10. Genomic Quantitative Genetics to Study Evolution in the Wild.

    PubMed

    Gienapp, Phillip; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R; Sork, Victoria L; Csilléry, Katalin

    2017-12-01

Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic revolution opened up the possibility of obtaining the realized proportion of genome shared among individuals in natural populations of virtually any species, promising (more) accurate estimates of quantitative genetic parameters. Such a 'genomic' quantitative genetics approach relies on fewer assumptions, offers greater methodological flexibility, and is thus expected to greatly enhance our understanding of evolution in natural populations, for example, in the context of adaptation to environmental change, eco-evolutionary dynamics, and biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. A quantitative risk assessment for the safety of carcase storage systems for scrapie infected farms.

    PubMed

    Adkin, A; Jones, D L; Eckford, R L; Edwards-Jones, G; Williams, A P

    2014-10-01

To determine the risk associated with the use of carcase storage vessels on a scrapie infected farm. A stochastic quantitative risk assessment was developed to determine the rate of accumulation and fate of scrapie in a novel low-input storage system. For an example farm infected with classical scrapie, a mean of 10^3.6 Ovine Oral ID50s was estimated to accumulate annually. Research indicates that the degradation of any prions present may range from insignificant to a magnitude of one or two logs over several months of storage. For infected farms, the likely partitioning of remaining prion into the sludge phase would necessitate the safe operation and removal of resulting materials from these systems. If complete mixing could be assumed, on average, the concentrations of infectivity are estimated to be slightly lower than that measured in placenta from infected sheep at lambing. This is the first quantitative assessment of the scrapie risk associated with fallen stock on farm and provides guidance to policy makers on the safety of one type of storage system and the relative risk when compared to other materials present on an infected farm. © 2014 Crown Copyright. Journal of Applied Microbiology © 2014 Society for Applied Microbiology. This article is published with the permission of the Controller of HMSO and the Queen's Printer for Scotland.
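The accumulation-and-degradation reasoning above can be sketched as a small Monte Carlo model. The distributions below (a log-normal spread around the reported mean of 10^3.6 ID50s, and a uniform 0-2 log reduction during storage) are illustrative assumptions, not the authors' fitted model inputs:

```python
import numpy as np

rng = np.random.default_rng(42)
n_iter = 10_000

# Annual accumulated infectivity, in log10 Ovine Oral ID50s. The central
# value 3.6 comes from the abstract; the spread (0.3) is an assumption.
log_accum = rng.normal(loc=3.6, scale=0.3, size=n_iter)

# Prion degradation over several months of storage: "insignificant to
# a magnitude of one or two logs", modeled here as uniform on [0, 2] logs.
log_reduction = rng.uniform(0.0, 2.0, size=n_iter)

# Residual infectivity per iteration, and its mean on the log scale
residual_id50 = 10 ** (log_accum - log_reduction)
mean_log_residual = np.mean(log_accum - log_reduction)
```

This kind of stochastic bookkeeping is what lets the assessment report a distribution of residual infectivity rather than a single point estimate.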

  12. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

In Argentina, there are three known species of genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and it is recognized as the main cause of human trichinellosis by the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with greater impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10⁻⁶ and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (PPinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allowed assessing the impact of different factors influencing the risk of acquiring trichinellosis. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. © 2015 Blackwell Verlag GmbH.

  13. IWGT report on quantitative approaches to genotoxicity risk assessment II. Use of point-of-departure (PoD) metrics in defining acceptable exposure limits and assessing human risk

    EPA Science Inventory

    This is the second of two reports from the International Workshops on Genotoxicity Testing (IWGT) Working Group on Quantitative Approaches to Genetic Toxicology Risk Assessment (the QWG). The first report summarized the discussions and recommendations of the QWG related to the ne...

  14. Quantitative 3-D imaging topogrammetry for telemedicine applications

    NASA Technical Reports Server (NTRS)

    Altschuler, Bruce R.

    1994-01-01

    precision micro-sewing machines, splice neural connections with laser welds, micro-bore through constricted vessels, and computer combine ultrasound, microradiography, and 3-D mini-borescopes to quickly assess and trace vascular problems in situ. The spatial relationships between organs, robotic arms, and end-effector diagnostic, manipulative, and surgical instruments would be constantly monitored by the robot 'brain' using inputs from its multiple 3-D quantitative 'eyes' remote sensing, as well as by contact and proximity force measuring devices. Methods to create accurate and quantitative 3-D topograms at continuous video data rates are described.

  15. Quantitative assessment of desertification in south of Iran using MEDALUS method.

    PubMed

    Sepehr, A; Hassanli, A M; Ekhtesasi, M R; Jamali, J B

    2007-11-01

    The main aim of this study was the quantitative assessment of desertification process in the case study area of the Fidoye-Garmosht plain (Southern Iran). Based on the MEDALUS approach and the characteristics of study area a regional model developed using GIS. Six main factors or indicators of desertification including: soil, climate, erosion, plant cover, groundwater and management were considered for evaluation. Then several sub-indicators affecting the quality of each main indicator were identified. Based on the MEDALUS approach, each sub-indicator was quantified according to its quality and given a weighting of between 1.0 and 2.0. ArcGIS 9 was used to analyze and prepare the layers of quality maps using the geometric mean to integrate the individual sub-indicator maps. In turn the geometric mean of all six quality maps was used to generate a single desertification status map. Results showed that 12% of the area is classified as very severe, 81% as severe and 7% as moderately affected by desertification. In addition the plant cover and groundwater indicators were the most important factors affecting desertification process in the study area. The model developed may be used to assess desertification process and distinguish the areas sensitive to desertification in the study region and in regions with the similar characteristics.
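The MEDALUS aggregation step described above — score each sub-indicator between 1.0 (best) and 2.0 (worst), combine sub-indicators by geometric mean into quality maps, then take the geometric mean across the six quality maps — can be sketched with toy raster layers. The 2×2 grids are hypothetical stand-ins for the study's GIS layers:

```python
import numpy as np

# Six MEDALUS quality layers (soil, climate, erosion, plant cover,
# groundwater, management), each cell scored between 1.0 and 2.0.
layers = np.array([
    [[1.2, 1.8], [1.5, 2.0]],   # soil quality
    [[1.1, 1.6], [1.4, 1.9]],   # climate quality
    [[1.3, 1.7], [1.6, 2.0]],   # erosion
    [[1.0, 1.9], [1.2, 2.0]],   # plant cover
    [[1.1, 1.8], [1.3, 2.0]],   # groundwater
    [[1.2, 1.5], [1.4, 1.8]],   # management
])

# Desertification status map = geometric mean of the six quality maps,
# mirroring how the sub-indicator maps are combined in the paper.
status = np.prod(layers, axis=0) ** (1.0 / layers.shape[0])
```

The geometric mean keeps the combined score inside the 1.0-2.0 range and penalizes a cell whenever any single indicator is poor, which is the rationale for its use in MEDALUS.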

  16. Quantitative assessment of joint position sense recovery in subacute stroke patients: a pilot study.

    PubMed

    Kattenstroth, Jan-Christoph; Kalisch, Tobias; Kowalewski, Rebecca; Tegenthoff, Martin; Dinse, Hubert R

    2013-11-01

    To assess joint position sense performance in subacute stroke patients using a novel quantitative assessment. Proof-of-principle pilot study with a group of subacute stroke patients. Assessment at baseline and after 2 weeks of intervention. Additional data for a healthy age-matched control group. Ten subacute stroke patients (aged 65.41 years (standard deviation 2.5), 4 females, 2.3 weeks (standard deviation 0.2)) post-stroke receiving in-patient standard rehabilitation and repetitive electrical stimulation of the affected hand. Joint position sense was assessed based on the ability of correctly perceiving the opening angles of the finger joints. Patients had to report size differences of polystyrene balls of various sizes, whilst the balls were enclosed simultaneously by the affected and the non-affected hands. A total of 21 pairwise size comparisons was used to quantify joint position performance. After 2 weeks of therapeutic intervention a significant improvement in joint position sense performance was observed; however, the performance level was still below that of a healthy control group. The results indicate high feasibility and sensitivity of the joint position test in subacute stroke patients. Testing allowed quantification of both the deficit and the rehabilitation outcome.
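The 21 pairwise size comparisons reported above are consistent with seven ball sizes (C(7,2) = 21); the number of sizes, the diameters, and the scoring rule below are our assumptions for illustration, since the abstract reports only the total count:

```python
from itertools import combinations

# Hypothetical polystyrene ball diameters (mm); 7 sizes give C(7,2) = 21 pairs
ball_sizes_mm = [30, 35, 40, 45, 50, 55, 60]
pairs = list(combinations(ball_sizes_mm, 2))

def jps_score(responses):
    """Fraction of the 21 size comparisons answered correctly.
    responses[i] is the size the patient judged larger for pair i."""
    correct = sum(resp == max(a, b)
                  for (a, b), resp in zip(pairs, responses))
    return correct / len(pairs)

# A patient who always identifies the truly larger ball scores 1.0
perfect = jps_score([max(a, b) for a, b in pairs])
```

Scoring all pairwise comparisons, rather than a single threshold, is what gives the test its quantitative resolution for tracking rehabilitation outcome.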

  17. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity

    PubMed Central

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A.; Bradford, William D.; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S.; Li, Rong

    2015-01-01

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein−based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. PMID:25823586

  18. Quantitative assessment of cancer cell morphology and movement using telecentric digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Nguyen, Thanh C.; Nehmetallah, George; Lam, Van; Chung, Byung Min; Raub, Christopher

    2017-02-01

Digital holographic microscopy (DHM) provides label-free and real-time quantitative phase information relevant to the analysis of dynamic biological systems. A DHM based on telecentric configuration optically mitigates phase aberrations due to the microscope objective and linear high frequency fringes due to the reference beam, thus minimizing the digital aberration correction needed for distortion-free 3D reconstruction. The purpose of this work is to quantitatively assess growth and migratory behavior of invasive cancer cells using a telecentric DHM system. Together, the height and lateral shape features of individual cells, determined from time-lapse series of phase reconstructions, should reveal aspects of cell migration, cell-matrix adhesion, and cell cycle phase transitions. To test this, MDA-MB-231 breast cancer cells were cultured on collagen-coated or un-coated glass, and 3D holograms were reconstructed over 2 hours. Cells on collagen-coated glass had an average 14% larger spread area than cells on uncoated glass (n=18-22 cells/group). The spread area of cells on uncoated glass was 15-21% larger than that of cells seeded on collagen hydrogels (n=18-22 cells/group). Pre-mitotic cell rounding was observed, with average phase height increasing 57% over 10 minutes. Following cell division, phase height decreased linearly (R²=0.94) to 58% of the original height pre-division. Phase objects consistent with lamellipodia were apparent from the reconstructions at the leading edge of migrating cells. These data demonstrate the ability to track quantitative phase parameters and relate them to cell morphology during cell migration and division on adherent substrates, using telecentric DHM. The technique enables future studies of cell-matrix interactions relevant to cancer.
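One standard way such phase reconstructions map to cell height in transmission DHM is h = φλ / (2π·Δn), where Δn is the refractive-index contrast between cell and medium. The wavelength and index values below are assumed for illustration, not taken from this paper:

```python
import numpy as np

wavelength = 633e-9          # m, HeNe illumination (assumption)
n_cell, n_medium = 1.38, 1.335   # typical cell/medium indices (assumption)

def phase_to_height(phi):
    """Convert reconstructed phase (rad) to physical height (m)."""
    return phi * wavelength / (2 * np.pi * (n_cell - n_medium))

# Because the conversion is linear in phase, the reported 57% rise in
# mean phase during pre-mitotic rounding implies a 57% rise in height.
phi0 = 2.0                   # rad, hypothetical mean cell phase
h0 = phase_to_height(phi0)
h1 = phase_to_height(phi0 * 1.57)
rel_increase = (h1 - h0) / h0
```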

  19. Time-Accurate Numerical Simulations of Synthetic Jet in Quiescent Air

    NASA Technical Reports Server (NTRS)

    Rupesh, K-A. B.; Ravi, B. R.; Mittal, R.; Raju, R.; Gallas, Q.; Cattafesta, L.

    2007-01-01

    The unsteady evolution of a three-dimensional synthetic jet into quiescent air is studied by time-accurate numerical simulations using a second-order accurate mixed explicit-implicit fractional step scheme on Cartesian grids. Both two-dimensional and three-dimensional calculations of the synthetic jet are carried out at a Reynolds number (based on average velocity during the discharge phase of the cycle V(sub j), and jet width d) of 750 and Stokes number of 17.02. The results obtained are assessed against PIV and hotwire measurements provided for the NASA LaRC workshop on CFD validation of synthetic jets.

  20. In Vivo Quantitative Ultrasound Imaging and Scatter Assessments.

    NASA Astrophysics Data System (ADS)

    Lu, Zheng Feng

    There is evidence that "instrument independent" measurements of ultrasonic scattering properties would provide useful diagnostic information that is not available with conventional ultrasound imaging. This dissertation is a continuing effort to test the above hypothesis and to incorporate quantitative ultrasound methods into clinical examinations for early detection of diffuse liver disease. A well-established reference phantom method was employed to construct quantitative ultrasound images of tissue in vivo. The method was verified by extensive phantom tests. A new method was developed to measure the effective attenuation coefficient of the body wall. The method relates the slope of the difference between the echo signal power spectrum from a uniform region distal to the body wall and the echo signal power spectrum from a reference phantom to the body wall attenuation. The accuracy obtained from phantom tests suggests further studies with animal experiments. Clinically, thirty-five healthy subjects and sixteen patients with diffuse liver disease were studied by these quantitative ultrasound methods. The average attenuation coefficient in normals agreed with previous investigators' results; in vivo backscatter coefficients agreed with the results from normals measured by O'Donnell. Strong discriminating power (p < 0.001) was found for both attenuation and backscatter coefficients between fatty livers and normals; a significant difference (p < 0.01) was observed in the backscatter coefficient but not in the attenuation coefficient between cirrhotic livers and normals. An in vivo animal model of steroid hepatopathy was used to investigate the system sensitivity in detecting early changes in canine liver resulting from corticosteroid administration. 
The average attenuation coefficient slope increased from 0.7 dB/cm/MHz in controls to 0.82 dB/cm/MHz (at 6 MHz) in treated animals on day 14 into the treatment, and the backscatter coefficient was 26 × 10⁻⁴ cm⁻¹ sr⁻¹

  1. Accurate and efficient spin integration for particle accelerators

    DOE PAGES

    Abell, Dan T.; Meiser, Dominic; Ranjbar, Vahid H.; ...

    2015-02-01

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.
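The quaternion representation of spin rotations mentioned above can be sketched generically: two successive element rotations compose into a single unit quaternion, and because the product of unit quaternions has unit norm, the spin magnitude is conserved exactly. This is a generic illustration, not the GPUSPINTRACK implementation:

```python
import numpy as np

def quat_mul(q1, q2):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def rotation_quat(axis, angle):
    """Unit quaternion for a rotation by `angle` about `axis`."""
    axis = np.asarray(axis, dtype=float)
    axis /= np.linalg.norm(axis)
    return np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))

def rotate_spin(spin, q):
    """Rotate a spin vector: s' = q s q*."""
    p = np.concatenate(([0.0], spin))
    w, x, y, z = q
    q_conj = np.array([w, -x, -y, -z])
    return quat_mul(quat_mul(q, p), q_conj)[1:]

# Compose two element rotations into one quaternion (q1 applied first)
q1 = rotation_quat([0, 0, 1], np.pi / 2)   # 90 degrees about z
q2 = rotation_quat([1, 0, 0], np.pi / 2)   # 90 degrees about x
q_total = quat_mul(q2, q1)

s0 = np.array([1.0, 0.0, 0.0])
s1 = rotate_spin(s0, q_total)              # x -> y -> z
```

Composing many element quaternions before applying them once per turn is also what makes the method attractive for data-parallel GPU tracking.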

  2. Quantitative characterization of surface topography using spectral analysis

    NASA Astrophysics Data System (ADS)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
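A minimal version of the PSD computation discussed above, for a 1D line scan, is sketched below with a Parseval-style normalization so the PSD integrates back to the height variance. As the article notes, normalization conventions vary between groups; this is one self-consistent choice:

```python
import numpy as np

def psd_1d(heights, dx):
    """One-sided 1D power spectral density of a line scan.
    heights: surface heights (m); dx: pixel spacing (m).
    Returns wavevectors q (rad/m) and PSD C(q)."""
    n = len(heights)
    h_q = np.fft.rfft(heights - heights.mean())
    C = (dx / n) * np.abs(h_q) ** 2
    C[1:-1] *= 2.0                      # fold in negative frequencies
    q = 2 * np.pi * np.fft.rfftfreq(n, d=dx)
    return q, C

# Synthetic test profile: a single cosine of known amplitude, so the
# PSD should show one peak and recover the height variance exactly.
dx = 1e-9
n = 4096
x = np.arange(n) * dx
profile = 2e-9 * np.cos(2 * np.pi * 16 * x / (n * dx))   # 16 cycles

q, C = psd_1d(profile, dx)

# Parseval check: integrating the PSD over q recovers the variance
dq = q[1] - q[0]
var_psd = np.sum(C) * dq / (2 * np.pi)
var_direct = np.var(profile)
```

The Parseval check is a useful sanity test of whatever normalization convention one adopts before comparing PSDs across instruments or scan sizes.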

  3. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

    Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100 per cent = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparative years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
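The paper's equation, P/T × 100 per cent with both areas in pixels², reduces to counting mask pixels once the two regions are outlined. The binary masks below are synthetic stand-ins for outlines traced in Image J on a video-otoscope still:

```python
import numpy as np

# Illustrative segmentation masks (True = pixel inside the traced region)
membrane = np.zeros((100, 100), dtype=bool)
membrane[10:90, 10:90] = True            # entire tympanic membrane,
                                         # including the perforation
perforation = np.zeros_like(membrane)
perforation[40:60, 40:60] = True         # the perforation itself

P = perforation.sum()                    # perforation area (pixels^2)
T = membrane.sum()                       # total membrane area (pixels^2)
percentage_perforation = P / T * 100     # the paper's P/T x 100 per cent
```

Expressing the perforation as a percentage of the whole membrane is what makes results comparable between patients, since absolute pixel areas depend on otoscope magnification.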

  4. Using pseudoalignment and base quality to accurately quantify microbial community composition

    PubMed Central

    Novembre, John

    2018-01-01

    Pooled DNA from multiple unknown organisms arises in a variety of contexts, for example microbial samples from ecological or human health research. Determining the composition of pooled samples can be difficult, especially at the scale of modern sequencing data and reference databases. Here we propose a novel method for taxonomic profiling in pooled DNA that combines the speed and low-memory requirements of k-mer based pseudoalignment with a likelihood framework that uses base quality information to better resolve multiply mapped reads. We apply the method to the problem of classifying 16S rRNA reads using a reference database of known organisms, a common challenge in microbiome research. Using simulations, we show the method is accurate across a variety of read lengths, with different length reference sequences, at different sample depths, and when samples contain reads originating from organisms absent from the reference. We also assess performance in real 16S data, where we reanalyze previous genetic association data to show our method discovers a larger number of quantitative trait associations than other widely used methods. We implement our method in the software Karp, for k-mer based analysis of read pools, to provide a novel combination of speed and accuracy that is uniquely suited for enhancing discoveries in microbial studies. PMID:29659582
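The base-quality idea above — a matching base is correct with probability 1−e while a mismatch is explained as a sequencing error with probability e/3, where e = 10^(−Q/10) — can be sketched as a per-reference log-likelihood. This is a generic illustration of the principle, not Karp's exact model:

```python
import math

def read_likelihood(read, quals, ref):
    """log P(read | ref) from per-base Phred qualities: a match is a
    correct call with prob 1 - e; a mismatch is a miscall with prob e/3
    (each of the 3 wrong bases equally likely), where e = 10**(-Q/10)."""
    logp = 0.0
    for base, q, ref_base in zip(read, quals, ref):
        e = 10 ** (-q / 10)
        logp += math.log(1 - e) if base == ref_base else math.log(e / 3)
    return logp

# A multiply-mapped read is resolved by comparing likelihoods across
# candidate references rather than discarding or splitting it.
read  = "ACGTACGT"
quals = [30] * 8                      # Phred 30: e = 0.001 per base
ref_a = "ACGTACGT"                    # perfect match
ref_b = "ACGTACGA"                    # one mismatch at a Q30 base
better = read_likelihood(read, quals, ref_a) > read_likelihood(read, quals, ref_b)
```

Low-quality mismatches are penalized only mildly, so reads are not misassigned because of probable sequencing errors — the behavior the abstract credits for improved accuracy.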

  5. Utility of high-resolution accurate MS to eliminate interferences in the bioanalysis of ribavirin and its phosphate metabolites.

    PubMed

    Wei, Cong; Grace, James E; Zvyaga, Tatyana A; Drexler, Dieter M

    2012-08-01

    The polar nucleoside drug ribavirin (RBV) combined with IFN-α is a front-line treatment for chronic hepatitis C virus infection. RBV acts as a prodrug and exerts its broad antiviral activity primarily through its active phosphorylated metabolite ribavirin 5´-triphosphate (RTP), and also possibly through ribavirin 5´-monophosphate (RMP). To study RBV transport, diffusion, metabolic clearance and its impact on drug-metabolizing enzymes, a LC-MS method is needed to simultaneously quantify RBV and its phosphorylated metabolites (RTP, ribavirin 5´-diphosphate and RMP). In a recombinant human UGT1A1 assay, the assay buffer components uridine and its phosphorylated derivatives are isobaric with RBV and its phosphorylated metabolites, leading to significant interference when analyzed by LC-MS with the nominal mass resolution mode. Presented here is a LC-MS method employing LC coupled with full-scan high-resolution accurate MS analysis for the simultaneous quantitative determination of RBV, RMP, ribavirin 5´-diphosphate and RTP by differentiating RBV and its phosphorylated metabolites from uridine and its phosphorylated derivatives by accurate mass, thus avoiding interference. The developed LC-high-resolution accurate MS method allows for quantitation of RBV and its phosphorylated metabolites, eliminating the interferences from uridine and its phosphorylated derivatives in recombinant human UGT1A1 assays.

  6. Current Understanding of the Pathophysiology of Myocardial Fibrosis and Its Quantitative Assessment in Heart Failure

    PubMed Central

    Liu, Tong; Song, Deli; Dong, Jianzeng; Zhu, Pinghui; Liu, Jie; Liu, Wei; Ma, Xiaohai; Zhao, Lei; Ling, Shukuan

    2017-01-01

    Myocardial fibrosis is an important part of cardiac remodeling that leads to heart failure and death. Myocardial fibrosis results from increased myofibroblast activity and excessive extracellular matrix deposition. Various cells and molecules are involved in this process, providing targets for potential drug therapies. Currently, the main detection methods of myocardial fibrosis rely on serum markers, cardiac magnetic resonance imaging, and endomyocardial biopsy. This review summarizes our current knowledge regarding the pathophysiology, quantitative assessment, and novel therapeutic strategies of myocardial fibrosis. PMID:28484397

  7. Quantitative assessment of background parenchymal enhancement in breast magnetic resonance images predicts the risk of breast cancer.

    PubMed

    Hu, Xiaoxin; Jiang, Luan; Li, Qiang; Gu, Yajia

    2017-02-07

    The objective of this study was to evaluate the association between the quantitative assessment of background parenchymal enhancement rate (BPER) and breast cancer. From 14,033 consecutive patients who underwent breast MRI in our center, we randomly selected 101 normal controls. Then, we selected 101 women with benign breast lesions and 101 women with breast cancer who were matched for age and menstruation status. We evaluated BPER at early (2 minutes), medium (4 minutes) and late (6 minutes) enhanced time phases of breast MRI for quantitative assessment. Odds ratios (ORs) for risk of breast cancer were calculated using the receiver operating characteristic curve. The BPER increased in a time-dependent manner after enhancement in both premenopausal and postmenopausal women. Premenopausal women had higher BPER than postmenopausal women at early, medium and late enhanced phases. In the normal population, the OR for probability of breast cancer for premenopausal women with high BPER was 4.1 (95% CI: 1.7-9.7) and 4.6 (95% CI: 1.7-12.0) for postmenopausal women. The OR of breast cancer morbidity in premenopausal women with high BPER was 2.6 (95% CI: 1.1-6.4) and 2.8 (95% CI: 1.2-6.1) for postmenopausal women. The BPER was found to be a predictive factor of breast cancer morbidity. Different time phases should be used to assess BPER in premenopausal and postmenopausal women.

  8. Fast and accurate determination of arsenobetaine in fish tissues using accelerated solvent extraction and HPLC-ICP-MS determination.

    PubMed

    Wahlen, Raimund

    2004-04-01

    A high-performance liquid chromatography-inductively coupled plasma-mass spectrometry (HPLC-ICP-MS) method has been developed for the fast and accurate analysis of arsenobetaine (AsB) in fish samples extracted by accelerated solvent extraction. The combined extraction and analysis approach is validated using certified reference materials for AsB in fish and during a European intercomparison exercise with a blind sample. Up to six species of arsenic (As) can be separated and quantitated in the extracts within a 10-min isocratic elution. The method is optimized so as to minimize time-consuming sample preparation steps and allow for automated extraction and analysis of large sample batches. A comparison of standard addition and external calibration show no significant difference in the results obtained, which indicates that the LC-ICP-MS method is not influenced by severe matrix effects. The extraction procedure can process up to 24 samples in an automated manner, yet the robustness of the developed HPLC-ICP-MS approach is highlighted by the capability to run more than 50 injections per sequence, which equates to a total run-time of more than 12 h. The method can therefore be used to rapidly and accurately assess the proportion of nontoxic AsB in fish samples with high total As content during toxicological screening studies.

  9. Rapid and accurate prediction of degradant formation rates in pharmaceutical formulations using high-performance liquid chromatography-mass spectrometry.

    PubMed

    Darrington, Richard T; Jiao, Jim

    2004-04-01

    Rapid and accurate stability prediction is essential to pharmaceutical formulation development. Commonly used stability prediction methods include monitoring parent drug loss at intended storage conditions or initial rate determination of degradants under accelerated conditions. Monitoring parent drug loss at the intended storage condition does not provide a rapid and accurate stability assessment because often <0.5% drug loss is all that can be observed in a realistic time frame, while the accelerated initial rate method in conjunction with extrapolation of rate constants using the Arrhenius or Eyring equations often introduces large errors in shelf-life prediction. In this study, the shelf life prediction of a model pharmaceutical preparation utilizing sensitive high-performance liquid chromatography-mass spectrometry (LC/MS) to directly quantitate degradant formation rates at the intended storage condition is proposed. This method was compared to traditional shelf life prediction approaches in terms of time required to predict shelf life and associated error in shelf life estimation. Results demonstrated that the proposed LC/MS method using initial rates analysis provided significantly improved confidence intervals for the predicted shelf life and required less overall time and effort to obtain the stability estimation compared to the other methods evaluated. Copyright 2004 Wiley-Liss, Inc. and the American Pharmacists Association.
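The initial-rates logic above can be sketched as a zero-order fit: quantitate the degradant directly at the intended storage condition by LC/MS, fit its formation rate, and extrapolate to a specification limit. The data points, kinetics assumption, and 0.5% limit below are all hypothetical:

```python
import numpy as np

# Hypothetical degradant levels (% of label claim) measured by LC/MS
# at the intended storage condition over the first two months.
t_days = np.array([0, 7, 14, 28, 56])
degradant_pct = np.array([0.000, 0.011, 0.021, 0.044, 0.088])

# Zero-order initial-rate fit: degradant grows linearly at early times
rate, intercept = np.polyfit(t_days, degradant_pct, 1)   # % per day

# Shelf life = time for the degradant to reach the specification limit
spec_limit = 0.5
shelf_life_days = (spec_limit - intercept) / rate
```

Because the rate is measured at the actual storage temperature, no Arrhenius/Eyring extrapolation is needed, which is the source of the tighter confidence intervals the abstract reports.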

  10. Quantitative wound healing measurement and monitoring system based on an innovative 3D imaging system

    NASA Astrophysics Data System (ADS)

    Yi, Steven; Yang, Arthur; Yin, Gongjie; Wen, James

    2011-03-01

    In this paper, we report a novel three-dimensional (3D) wound imaging system (hardware and software) under development at Technest Inc. System design is aimed to perform accurate 3D measurement and modeling of a wound and track its healing status over time. Accurate measurement and tracking of wound healing enables physicians to assess, document, improve, and individualize the treatment plan given to each wound patient. In current wound care practices, physicians often visually inspect or roughly measure the wound to evaluate the healing status. This is not an optimal practice since human vision lacks precision and consistency. In addition, quantifying slow or subtle changes through perception is very difficult. As a result, an instrument that quantifies both skin color and geometric shape variations would be particularly useful in helping clinicians to assess healing status and judge the effect of hyperemia, hematoma, local inflammation, secondary infection, and tissue necrosis. Once fully developed, our 3D imaging system will have several unique advantages over traditional methods for monitoring wound care: (a) Non-contact measurement; (b) Fast and easy to use; (c) up to 50 micron measurement accuracy; (d) 2D/3D Quantitative measurements;(e) A handheld device; and (f) Reasonable cost (< $1,000).

  11. A low-cost tracked C-arm (TC-arm) upgrade system for versatile quantitative intraoperative imaging.

    PubMed

    Amiri, Shahram; Wilson, David R; Masri, Bassam A; Anglin, Carolyn

    2014-07-01

    C-arm fluoroscopy is frequently used in clinical applications as a low-cost and mobile real-time qualitative assessment tool. C-arms, however, are not widely accepted for applications involving quantitative assessments, mainly due to the lack of reliable and low-cost position tracking methods, as well as adequate calibration and registration techniques. The solution suggested in this work is a tracked C-arm (TC-arm) which employs a low-cost sensor tracking module that can be retrofitted to any conventional C-arm for tracking the individual joints of the device. Registration and offline calibration methods were developed that allow accurate tracking of the gantry and determination of the exact intrinsic and extrinsic parameters of the imaging system for any acquired fluoroscopic image. The performance of the system was evaluated in comparison to an Optotrak[Formula: see text] motion tracking system and by a series of experiments on accurately built ball-bearing phantoms. Accuracies of the system were determined for 2D-3D registration, three-dimensional landmark localization, and for generating panoramic stitched views in simulated intraoperative applications. The system was able to track the center point of the gantry with an accuracy of [Formula: see text] mm or better. Accuracies of 2D-3D registrations were [Formula: see text] mm and [Formula: see text]. Three-dimensional landmark localization had an accuracy of [Formula: see text] of the length (or [Formula: see text] mm) on average, depending on whether the landmarks were located along, above, or across the table. The overall accuracies of the two-dimensional measurements conducted on stitched panoramic images of the femur and lumbar spine were 2.5 [Formula: see text] 2.0 % [Formula: see text] and [Formula: see text], respectively. The TC-arm system has the potential to achieve sophisticated quantitative fluoroscopy assessment capabilities using an existing C-arm imaging system. 
This technology may be useful to

  12. A novel anthropomorphic flow phantom for the quantitative evaluation of prostate DCE-MRI acquisition techniques

    NASA Astrophysics Data System (ADS)

    Knight, Silvin P.; Browne, Jacinta E.; Meaney, James F.; Smith, David S.; Fagan, Andrew J.

    2016-10-01

    A novel anthropomorphic flow phantom device has been developed, which can be used for quantitatively assessing the ability of magnetic resonance imaging (MRI) scanners to accurately measure signal/concentration time-intensity curves (CTCs) associated with dynamic contrast-enhanced (DCE) MRI. Modelling of the complex pharmacokinetics of contrast agents as they perfuse through the tumour capillary network has shown great promise for cancer diagnosis and therapy monitoring. However, clinical adoption has been hindered by methodological problems, resulting in a lack of consensus regarding the most appropriate acquisition and modelling methodology to use and a consequent wide discrepancy in published data. A heretofore overlooked source of such discrepancy may arise from measurement errors of tumour CTCs deriving from the imaging pulse sequence itself, while the effects on the fidelity of CTC measurement of using rapidly accelerated sequences such as parallel imaging and compressed sensing remain unknown. The present work aimed to investigate these features by developing a test device in which ‘ground truth’ CTCs were generated and presented to the MRI scanner for measurement, thereby allowing an assessment of the ability of the DCE-MRI protocol to accurately measure this curve shape. The device comprised a four-pump flow system wherein CTCs derived from prior patient prostate data were produced in measurement chambers placed within the imaged volume. The ground truth was determined as the mean of repeat measurements using an MRI-independent, custom-built optical imaging system. In DCE-MRI experiments, significant discrepancies between the ground truth and measured CTCs were found for both tumorous and healthy tissue-mimicking curve shapes. Pharmacokinetic modelling revealed errors in measured K^trans, v_e and k_ep values of up to 42%, 31%, and 50% respectively, following a simple variation of the parallel imaging factor and number of signal averages in the acquisition
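The pharmacokinetic quantities reported (K^trans, v_e, k_ep) come from fitting models such as the standard Tofts model to measured CTCs; a minimal forward simulation of that model is sketched below, with an assumed arterial input function (AIF) and illustrative parameter values rather than the study's data:

```python
import numpy as np

t = np.linspace(0.0, 300.0, 601)          # time, s
dt = t[1] - t[0]

# Assumed bi-exponential AIF with bolus arrival at t = 10 s.
tau = np.clip(t - 10.0, 0.0, None)
Cp = 5.0 * (np.exp(-0.008 * tau) - np.exp(-0.05 * tau))

Ktrans = 0.25 / 60.0                      # transfer constant, 1/s (0.25 /min)
ve = 0.30                                 # extravascular extracellular fraction
kep = Ktrans / ve                         # by definition, kep = Ktrans / ve

# Standard Tofts tissue curve:
#   Ct(t) = Ktrans * integral_0^t Cp(s) * exp(-kep * (t - s)) ds
# evaluated here as a discrete convolution.
Ct = Ktrans * np.convolve(Cp, np.exp(-kep * t))[: t.size] * dt

print(f"peak tissue concentration: {Ct.max():.3f} (a.u.)")
```

Fitting this forward model to a distorted measured CTC is what propagates sequence-induced curve-shape errors into the K^trans, v_e and k_ep errors the abstract quantifies.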

  13. 75 FR 9488 - Basel Comprehensive Quantitative Impact Study

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-02

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Basel Comprehensive Quantitative Impact... Quantitative Impact Study. OMB Number: 1550-0NEW. Form Numbers: N/A. Regulation requirement: 12 CFR Part 567... Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to assess...

  14. Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.

    PubMed

    Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y

    2015-11-01

    Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method for this purpose is still needed. We performed a computer image analysis (CIA) of psoriatic area using the Image J freeware to determine whether this method could be used for objective evaluation of psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method) or manual selection (M-method, gold standard). The results assessed by the three methods were analyzed, with reliability and affecting factors evaluated. Both the T- and E-methods correlated strongly with the M-method, with the T-method showing a slightly stronger correlation. Both the T- and E-methods had good consistency between the evaluators. All three methods were able to detect the change in the psoriatic area after treatment, while the E-method tended to overestimate. The CIA with Image J freeware is reliable and practicable in quantitatively assessing the lesional area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  15. Importance of accurately assessing biomechanics of the cornea.

    PubMed

    Roberts, Cynthia J

    2016-07-01

    This article summarizes the state-of-the-art in clinical corneal biomechanics, including procedures in which biomechanics play a role, and the clinical consequences in terms of error in estimating intraocular pressure (IOP). Corneal biomechanical response to refractive surgery can be categorized into either stable alteration of surface shape and thus visual outcome, or unstable biomechanical decompensation. The stable response is characterized by central flattening and peripheral steepening that is potentiated in a stiffer cornea. Two clinical devices for assessing corneal biomechanics do not yet measure classic biomechanical properties, but rather provide assessment of corneal deformation response. Biomechanical parameters are a function of IOP, and both the cornea and sclera become stiffer as IOP increases. Any assessment of biomechanical parameters must include IOP, and one value of stiffness does not adequately characterize a cornea. Corneal biomechanics plays a role in the outcomes of any procedure in which lamellae are transected. Once the corneal structure has been altered in a manner that includes central thinning, IOP measurements with applanation tonometry are likely not valid, and other technologies should be used.

  16. Quantitative Imaging in Cancer Clinical Trials

    PubMed Central

    Yankeelov, Thomas E.; Mankoff, David A.; Schwartz, Lawrence H.; Lieberman, Frank S.; Buatti, John M.; Mountz, James M.; Erickson, Bradley J.; Fennessy, Fiona M.M.; Huang, Wei; Kalpathy-Cramer, Jayashree; Wahl, Richard L.; Linden, Hannah M.; Kinahan, Paul; Zhao, Binsheng; Hylton, Nola M.; Gillies, Robert J.; Clarke, Laurence; Nordstrom, Robert; Rubin, Daniel L.

    2015-01-01

    As anti-cancer therapies designed to target specific molecular pathways have been developed, it has become critical to develop methods to assess the response induced by such agents. While traditional, anatomic CT and MRI exams are useful in many settings, there is increasing evidence that these methods cannot answer the fundamental biological and physiological questions essential for assessment and, eventually, prediction of treatment response in the clinical trial setting, especially in the critical period soon after treatment is initiated. To optimally apply advances in quantitative imaging methods to trials of targeted cancer therapy, new infrastructure improvements are needed that incorporate these emerging techniques into the settings where they are most likely to have impact. In this review, we first elucidate the needs for therapeutic response assessment in the era of molecularly targeted therapy and describe how quantitative imaging can most effectively provide scientifically and clinically relevant data. We then describe the tools and methods required to apply quantitative imaging and provide concrete examples of work making these advances practically available for routine application in clinical trials. We conclude by proposing strategies to surmount barriers to wider incorporation of these quantitative imaging methods into clinical trials and, eventually, clinical practice. Our goal is to encourage and guide the oncology community to deploy standardized quantitative imaging techniques in clinical trials to further personalize care for cancer patients, and to provide a more efficient path for the development of improved targeted therapies. PMID:26773162

  17. Quantitative Assessment of RNA-Protein Interactions with High Throughput Sequencing - RNA Affinity Profiling (HiTS-RAP)

    PubMed Central

    Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.

    2016-01-01

    Because RNA-protein interactions play a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time), including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment. PMID:26182240

  18. A quantitative comparison of transesophageal and epicardial color Doppler echocardiography in the intraoperative assessment of mitral regurgitation.

    PubMed

    Kleinman, J P; Czer, L S; DeRobertis, M; Chaux, A; Maurer, G

    1989-11-15

    Epicardial and transesophageal color Doppler echocardiography are both widely used for the intraoperative assessment of mitral regurgitation (MR); however, it has not been established whether grading of regurgitation is comparable when evaluated by these 2 techniques. MR jet size was quantitatively compared in 29 hemodynamically and temporally matched open-chest epicardial and transesophageal color Doppler echocardiography studies from 22 patients (18 with native and 4 with porcine mitral valves) scheduled to undergo mitral valve repair or replacement. Jet area, jet length and left atrial area were analyzed. Comparison of jet area measurements as assessed by epicardial and transesophageal color flow mapping revealed an excellent correlation between the techniques (r = 0.95, p < 0.001). Epicardial and transesophageal jet length measurements were also similar (r = 0.77, p < 0.001). Left atrial area could not be measured in 18 transesophageal studies (62%) due to foreshortening, and in 5 epicardial studies (17%) due to poor image resolution. Acoustic interference with left atrial and color flow mapping signals was noted in all patients with mitral valve prostheses when imaged by epicardial echocardiography, but this did not occur with transesophageal imaging. Thus, in patients undergoing valve repair or replacement, transesophageal and epicardial color flow mapping provide similar quantitative assessment of MR jet size. Jet area to left atrial area ratios have limited applicability in transesophageal color flow mapping, due to foreshortening of the left atrial borders in transesophageal views. Transesophageal color flow mapping may be especially useful in assessing dysfunctional mitral prostheses due to the lack of left atrial acoustic interference.

  19. Inter- and intra-observer agreement of BI-RADS-based subjective visual estimation of amount of fibroglandular breast tissue with magnetic resonance imaging: comparison to automated quantitative assessment.

    PubMed

    Wengert, G J; Helbich, T H; Woitek, R; Kapetas, P; Clauser, P; Baltzer, P A; Vogl, W-D; Weber, M; Meyer-Baese, A; Pinker, Katja

    2016-11-01

    To evaluate the inter-/intra-observer agreement of BI-RADS-based subjective visual estimation of the amount of fibroglandular tissue (FGT) with magnetic resonance imaging (MRI), and to investigate whether FGT assessment benefits from an automated, observer-independent, quantitative MRI measurement by comparing both approaches. Eighty women with no imaging abnormalities (BI-RADS 1 and 2) were included in this institutional review board (IRB)-approved prospective study. All women underwent un-enhanced breast MRI. Four radiologists independently assessed FGT with MRI by subjective visual estimation according to BI-RADS. Automated observer-independent quantitative measurement of FGT with MRI was performed using a previously described measurement system. Inter-/intra-observer agreements of qualitative and quantitative FGT measurements were assessed using Cohen's kappa (k). Inexperienced readers achieved moderate inter-/intra-observer agreement and experienced readers a substantial inter- and perfect intra-observer agreement for subjective visual estimation of FGT. Practice and experience reduced observer-dependency. Automated observer-independent quantitative measurement of FGT was successfully performed and revealed only fair to moderate agreement (k = 0.209-0.497) with subjective visual estimations of FGT. Subjective visual estimation of FGT with MRI shows moderate intra-/inter-observer agreement, which can be improved by practice and experience. Automated observer-independent quantitative measurements of FGT are necessary to allow a standardized risk evaluation. • Subjective FGT estimation with MRI shows moderate intra-/inter-observer agreement in inexperienced readers. • Inter-observer agreement can be improved by practice and experience. • Automated observer-independent quantitative measurements can provide reliable and standardized assessment of FGT with MRI.
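The agreement statistic used throughout this record, Cohen's kappa, can be computed from scratch in a few lines; the two readers' BI-RADS-style category assignments below are invented purely for illustration:

```python
# Hypothetical FGT categories (a-d) assigned by two readers to 12 cases.
reader1 = ["a", "b", "b", "c", "d", "a", "b", "c", "c", "d", "a", "b"]
reader2 = ["a", "b", "c", "c", "d", "a", "b", "b", "c", "d", "b", "b"]

categories = sorted(set(reader1) | set(reader2))
n = len(reader1)

# Observed agreement: fraction of cases with identical assignments.
po = sum(r1 == r2 for r1, r2 in zip(reader1, reader2)) / n

# Chance agreement: sum over categories of the product of the two
# readers' marginal frequencies for that category.
pe = sum((reader1.count(c) / n) * (reader2.count(c) / n) for c in categories)

kappa = (po - pe) / (1 - pe)
print(f"observed = {po:.2f}, expected = {pe:.2f}, kappa = {kappa:.2f}")
```

On the conventional Landis-Koch scale, values in the 0.41-0.60 range are "moderate" and 0.61-0.80 "substantial", which is the vocabulary the abstract uses.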

  20. Quantitation of spatially-localized proteins in tissue samples using MALDI-MRM imaging.

    PubMed

    Clemis, Elizabeth J; Smith, Derek S; Camenzind, Alexander G; Danell, Ryan M; Parker, Carol E; Borchers, Christoph H

    2012-04-17

    MALDI imaging allows the creation of a "molecular image" of a tissue slice. This image is reconstructed from the ion abundances in spectra obtained while rastering the laser over the tissue. These images can then be correlated with tissue histology to detect potential biomarkers of, for example, aberrant cell types. MALDI, however, is known to have problems with ion suppression, making it difficult to correlate measured ion abundance with concentration. It would be advantageous to have a method which could provide more accurate protein concentration measurements, particularly for screening applications or for precise comparisons between samples. In this paper, we report the development of a novel MALDI imaging method for the localization and accurate quantitation of proteins in tissues. This method involves optimization of in situ tryptic digestion, followed by reproducible and uniform deposition of an isotopically labeled standard peptide from a target protein onto the tissue, using an aerosol-generating device. Data are acquired by MALDI multiple reaction monitoring (MRM) mass spectrometry (MS), and accurate peptide quantitation is determined from the ratio of MRM transitions for the endogenous unlabeled proteolytic peptides to the corresponding transitions from the applied isotopically labeled standard peptides. In a parallel experiment, the quantity of the labeled peptide applied to the tissue was determined using a standard curve generated from MALDI time-of-flight (TOF) MS data. This external calibration curve was then used to determine the quantity of endogenous peptide in a given area. All standard curves generated by this method had coefficients of determination greater than 0.97. These proof-of-concept experiments using MALDI MRM-based imaging show the feasibility for the precise and accurate quantitation of tissue protein concentrations over 2 orders of magnitude, while maintaining the spatial localization information for the proteins.
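The two quantitation steps described (an external calibration curve for the labeled standard, then the light/heavy MRM transition ratio for the endogenous peptide) can be sketched numerically as follows; all response values and amounts are invented for illustration:

```python
import numpy as np

# Step 1: external calibration curve relating spiked labeled-peptide
# amount (fmol) to MALDI-TOF response (hypothetical numbers).
spiked_fmol = np.array([10, 25, 50, 100, 250, 500], dtype=float)
tof_response = np.array([1.1e3, 2.6e3, 5.3e3, 1.05e4, 2.61e4, 5.2e4])

# Linear least-squares fit and coefficient of determination R^2.
slope, intercept = np.polyfit(spiked_fmol, tof_response, 1)
pred = slope * spiked_fmol + intercept
ss_res = np.sum((tof_response - pred) ** 2)
ss_tot = np.sum((tof_response - tof_response.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

# Amount of labeled standard actually deposited on this tissue area,
# read back off the curve from its observed TOF response (assumed value).
labeled_fmol = (9.8e3 - intercept) / slope

# Step 2: the endogenous (light) / labeled (heavy) MRM transition ratio
# directly scales the known labeled amount.
light_over_heavy = 0.42
endogenous_fmol = light_over_heavy * labeled_fmol

print(f"R^2 = {r2:.4f}, labeled ~ {labeled_fmol:.0f} fmol, "
      f"endogenous ~ {endogenous_fmol:.0f} fmol")
```

Because the light/heavy ratio is taken within the same spectrum, ion suppression affects both peptides similarly and largely cancels, which is the core advantage the abstract claims.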

  1. Quantitative Assessment of the Safety Benefits Associated with Increasing Clinical Peanut Thresholds Through Immunotherapy.

    PubMed

    Baumert, Joseph L; Taylor, Steve L; Koppelman, Stef J

    Peanut immunotherapy studies are conducted with the aim to decrease the sensitivity of patients to peanut exposure with the outcome evaluated by testing the threshold for allergic response in a double-blind placebo-controlled food challenge. The clinical relevance of increasing this threshold is not well characterized. We aimed to quantify the clinical benefit of an increased threshold for peanut-allergic patients. Quantitative risk assessment was performed by matching modeled exposure to peanut protein with individual threshold levels. Exposure was modeled by pairing US consumption data for various food product categories with potential contamination levels of peanut that have been demonstrated to be present on occasion in such food products. Cookies, ice cream, doughnuts/snack cakes, and snack chip mixes were considered in the risk assessment. Increasing the baseline threshold before immunotherapy from 100 mg or less peanut protein to 300 mg peanut protein postimmunotherapy reduces the risk of experiencing an allergic reaction by more than 95% for all 4 food product categories that may contain trace levels of peanut residue. Further increase in the threshold to 1000 mg of peanut protein had an additional quantitative benefit in risk reduction for all patients reacting to 300 mg or less at baseline. We conclude that achieving thresholds of 300 mg and 1000 mg of peanut protein by peanut immunotherapy is clinically relevant, and that the risk for peanut-allergic patients who have achieved this increased threshold to experience an allergic reaction is reduced in a clinically meaningful way. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
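The exposure-vs-threshold matching described above can be illustrated with a toy Monte Carlo sketch; the serving-size and contamination distributions below are invented assumptions, not the study's US consumption data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical per-eating-occasion model for one product category:
# serving size (g) and trace peanut-protein contamination (mg/g).
serving_g = rng.lognormal(mean=np.log(40), sigma=0.4, size=n)
contam_mg_per_g = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)
dose_mg = serving_g * contam_mg_per_g   # mg peanut protein per occasion

def reaction_risk(threshold_mg):
    """Fraction of eating occasions whose dose exceeds the threshold."""
    return np.mean(dose_mg > threshold_mg)

risk_before = reaction_risk(100.0)   # baseline challenge threshold
risk_after = reaction_risk(300.0)    # post-immunotherapy threshold
reduction = 1 - risk_after / risk_before

print(f"risk at 100 mg: {risk_before:.4f}, at 300 mg: {risk_after:.4f}, "
      f"relative reduction: {reduction:.0%}")
```

With right-skewed contamination distributions like this one, most of the probability mass sits well below a few hundred milligrams, which is why even a modest threshold increase yields the large relative risk reductions the study reports.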

  2. Assessment of the ion-trap mass spectrometer for routine qualitative and quantitative analysis of drugs of abuse extracted from urine.

    PubMed

    Vorce, S P; Sklerov, J H; Kalasinsky, K S

    2000-10-01

    The ion-trap mass spectrometer (MS) has been available as a detector for gas chromatography (GC) for nearly two decades. However, it still occupies a minor role in forensic toxicology drug-testing laboratories. Quadrupole MS instruments make up the majority of GC detectors used in drug confirmation. This work addresses the use of these two MS detectors, comparing the ion ratio precision and quantitative accuracy for the analysis of different classes of abused drugs extracted from urine. Urine specimens were prepared at five concentrations each for amphetamine (AMP), methamphetamine (METH), benzoylecgonine (BZE), delta9-carboxy-tetrahydrocannabinol (delta9-THCCOOH), phencyclidine (PCP), morphine (MOR), codeine (COD), and 6-acetylmorphine (6-AM). Concentration ranges for AMP, METH, BZE, delta9-THCCOOH, PCP, MOR, COD, and 6-AM were 50-2500, 50-5000, 15-800, 1.5-65, 1-250, 500-32000, 250-21000, and 1.5-118 ng/mL, respectively. Sample extracts were injected into a GC-quadrupole MS operating in selected ion monitoring (SIM) mode and a GC-ion-trap MS operating in either selected ion storage (SIS) or full scan (FS) mode. Precision was assessed by the evaluation of five ion ratios for n = 15 injections at each concentration using a single-point calibration. Precision measurements for SIM ion ratios provided coefficients of variation (CV) between 2.6 and 9.8% for all drugs. By comparison, the SIS and FS data yielded CV ranges of 4.0-12.8% and 4.0-11.2%, respectively. The total ion ratio failure rates were 0.2% (SIM), 0.7% (SIS), and 1.2% (FS) for the eight drugs analyzed. Overall, the SIS mode produced stable, comparable mean ratios over the concentration ranges examined, but had greater variance within batch runs. Examination of postmortem and quality-control samples produced forensically accurate quantitation by SIS when compared to SIM. Furthermore, sensitivity of FS was equivalent to SIM for all compounds examined except for 6-AM.

  3. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is used more frequently in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast axial DCE-MRI images (i.e., non-contrast) using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images taken for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ-value of 0.66 (p < 0.0001)]. Our method may be useful within precision medicine for monitoring high-risk populations.
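Steps (b) and (c) of the pipeline can be sketched as follows, using a from-scratch Otsu threshold on synthetic voxel intensities standing in for real pre-contrast DCE-MRI data:

```python
import numpy as np

# Synthetic segmented-breast voxels: bright fat and darker fibroglandular
# tissue (arbitrary units); purely illustrative, not patient data.
rng = np.random.default_rng(1)
fatty = rng.normal(200.0, 15.0, size=7000)
dense = rng.normal(80.0, 15.0, size=3000)
voxels = np.concatenate([fatty, dense])

def otsu_threshold(values, nbins=256):
    """Threshold maximizing between-class variance of the histogram."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    w = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, nbins):
        w0, w1 = w[:i].sum(), w[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (w[:i] * centers[:i]).sum() / w0
        mu1 = (w[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

t = otsu_threshold(voxels)
density = np.mean(voxels < t)   # dense tissue is the low-intensity class here
print(f"threshold = {t:.1f}, volumetric density = {density:.2f}")
```

In the real pipeline the same ratio is taken over 3D voxels after fuzzy c-means has removed non-breast tissue; here the segmentation step is assumed already done.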

  4. [The development of a computer model in the quantitative assessment of thallium-201 myocardial scintigraphy].

    PubMed

    Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A

    1993-05-01

    Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by a visual analysis or by a quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Because of the large amount of data involved, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data. We used a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusion; comparison of different scintigrams for the same patient. Our software is made up of 4 sections: numeric analysis, descriptive analysis, automatic conclusion, clinical remarks. We introduced into the computer system appropriate information, "logical paths", that use "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), considering our uptake cutoff and finally obtaining the automatic conclusions. For these reasons, our computer-based system could be considered a real "expert system".

  5. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk Assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques like Bayesian Networks because of their capability of (1) combining in a probabilistic framework different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we compared Bayesian Networks with other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction is made between a purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great use in informing product safety design regulation. Data from the European Registry of Foreign Bodies Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offered both ease of interpretation and accuracy in making predictions, even if simpler models like logistic regression still performed well. © The Author(s) 2013.

  6. Residual eDNA detection sensitivity assessed by quantitative real-time PCR in a river ecosystem.

    PubMed

    Balasingham, Katherine D; Walter, Ryan P; Heath, Daniel D

    2017-05-01

    Several studies have demonstrated that environmental DNA (eDNA) can be used to detect the presence of aquatic species, days to weeks after the target species has been removed. However, most studies used eDNA analysis in lentic systems (ponds or lakes), or in controlled laboratory experiments. While eDNA degrades rapidly in all aquatic systems, it also undergoes dilution effects and physical destruction in flowing systems, complicating detection in rivers. However, some eDNA (i.e. residual eDNA) can be retained in aquatic systems, even those subject to high flow regimes. Our goal was to determine residual eDNA detection sensitivity using quantitative real-time polymerase chain reaction (qRT-PCR), in a flowing, uncontrolled river after the eDNA source was removed from the system; we repeated the experiment over 2 years. Residual eDNA had the strongest signal strength at the original source site and was detectable there up to 11.5 h after eDNA source removal. Residual eDNA signal strength decreased as sampling distance downstream from the eDNA source site increased, and was no longer detectable at the source site 48 h after the eDNA source water was exhausted in both experiments. This experiment shows that residual eDNA sampled in surface water can be mapped quantitatively using qRT-PCR, which allows a more accurate spatial identification of the target species location in lotic systems, and relative residual eDNA signal strength may allow the determination of the timing of the presence of target species. © 2016 John Wiley & Sons Ltd.

  7. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.

  8. Quantitation of influenza virus using field flow fractionation and multi-angle light scattering for quantifying influenza A particles

    PubMed Central

    Bousse, Tatiana; Shore, David A.; Goldsmith, Cynthia S.; Hossain, M. Jaber; Jang, Yunho; Davis, Charles T.; Donis, Ruben O.; Stevens, James

    2017-01-01

    Recent advances in instrumentation and data analysis in field flow fractionation and multi-angle light scattering (FFF-MALS) have enabled greater use of this technique to characterize and quantitate viruses. In this study, the FFF-MALS technique was applied to the characterization and quantitation of type A influenza virus particles to assess its usefulness for vaccine preparation. The use of FFF-MALS for quantitation and measurement of control particles provided data accurate to within 5% of known values, reproducible with a coefficient of variation of 1.9%. The methods, sensitivity and limit of detection were established by analyzing different volumes of purified virus, which produced a linear regression with an R2 of 0.99. FFF-MALS was further applied to detect and quantitate influenza virus in the supernatant of infected MDCK cells and allantoic fluids of infected eggs. FFF fractograms of the virus present in these different fluids revealed similar distributions of monomeric and oligomeric virions. However, the monomer fraction of cell-grown virus had greater size variety. Notably, β-propiolactone (BPL) inactivation of influenza viruses did not influence any of the FFF-MALS measurements. Quantitation analysis by FFF-MALS was compared to infectivity assays and real-time RT-PCR (qRT-PCR), and the limitations of each assay were discussed. PMID:23916678

  9. A clinically applicable non-invasive method to quantitatively assess the visco-hyperelastic properties of human heel pad, implications for assessing the risk of mechanical trauma.

    PubMed

    Behforootan, Sara; Chatzistergos, Panagiotis E; Chockalingam, Nachiappan; Naemi, Roozbeh

    2017-04-01

    Pathological conditions such as diabetic foot and plantar heel pain are associated with changes in the mechanical properties of plantar soft tissue. However, the causes and implications of these changes are not yet fully understood, mainly because accurate assessment of the mechanical properties of plantar soft tissue in the clinic remains extremely challenging. The aim of this study was to develop a clinically viable, non-invasive method of assessing the mechanical properties of the heel pad, and to investigate how the heel pad's non-linear mechanical behaviour affects its ability to uniformly distribute foot-ground contact loads, with implications for overloading. An automated custom device for ultrasound indentation was developed along with custom algorithms for automated subject-specific modeling of the heel pad. Non-time-dependent and time-dependent material properties were inverse engineered from the results of quasi-static indentation and stress relaxation tests, respectively. The validity of the calculated coefficients was assessed for five healthy participants. The implications of altered mechanical properties for the heel pad's ability to uniformly distribute plantar loading were also investigated in a parametric analysis. The subject-specific heel pad models, with coefficients calculated from quasi-static indentation and stress relaxation, were able to accurately simulate dynamic indentation: the average error in the predicted forces at maximum deformation was only 6.6±4.0%. When the inverse engineered coefficients were used to simulate the first instance of heel strike, the error in terms of peak plantar pressure was 27%. The parametric analysis indicated that the heel pad's ability to uniformly distribute plantar loads is influenced both by its overall deformability and by its stress-strain behaviour. When overall deformability stays constant, changes in stress/strain behaviour leading to a more "linear" mechanical response appear to improve the heel pad's ability to uniformly distribute plantar loads.

  10. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the north-eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. To determine debris flow intensities, we used a linear relationship found between back-calibrated, physically based FLO-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This allowed us to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve developed specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with the building area and number of floors.

  11. Is photometry an accurate and reliable method to assess boar semen concentration?

    PubMed

    Camus, A; Camugli, S; Lévêque, C; Schmitt, E; Staub, C

    2011-02-01

    Sperm concentration assessment is a key point to ensure an appropriate sperm number per dose in species subjected to artificial insemination (AI). The aim of the present study was to evaluate the accuracy and reliability of two commercially available photometers, AccuCell™ and AccuRead™, pre-calibrated for boar semen, in comparison to UltiMate™ boar version 12.3D, NucleoCounter SP100 and the Thoma hemacytometer. For each type of instrument, concentration was measured on 34 boar semen samples in quadruplicate, and agreement between measurements and instruments was evaluated. Accuracy for both photometers was expressed as the mean percentage difference from the general mean: -0.6% for AccuCell™ and 0.5% for AccuRead™, with no significant differences found among instruments or measurement replicates. Repeatability was 1.8% for AccuCell™ and 3.2% for AccuRead™. Differences between instruments were low (confidence interval 3%) except when the hemacytometer was used as the reference. Even though the hemacytometer is considered the gold standard worldwide, it was the most variable instrument (confidence interval 7.1%). The conclusion is that routine photometric measurement of raw semen concentration is reliable, accurate and precise using AccuRead™ or AccuCell™. There are multiple steps in semen processing that can induce sperm loss and therefore increase differences between theoretical and real sperm numbers in doses. Potential biases that depend on the workflow, but not on the initial photometric measure of semen concentration, are discussed. Copyright © 2011 Elsevier Inc. All rights reserved.
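    The accuracy statistic above, each instrument's mean expressed as a percentage difference from the general mean across all instruments, can be sketched as follows. The instrument names and readings are illustrative stand-ins, not the study's data:

```python
import statistics

def percent_diff_to_general_mean(readings_by_instrument):
    """Return {instrument: % difference of its mean from the grand mean}."""
    all_values = [v for vals in readings_by_instrument.values() for v in vals]
    grand_mean = statistics.fmean(all_values)
    return {
        name: (statistics.fmean(vals) - grand_mean) / grand_mean * 100.0
        for name, vals in readings_by_instrument.items()
    }

# Quadruplicate readings (x10^6 sperm/mL), made up for illustration.
readings = {
    "photometer_A": [301, 298, 305, 300],
    "photometer_B": [304, 306, 299, 303],
    "hemacytometer": [290, 312, 285, 310],
}
diffs = percent_diff_to_general_mean(readings)
print({k: round(v, 2) for k, v in diffs.items()})
```

    With equal replicate counts per instrument, the differences sum to zero by construction, so the statistic isolates each instrument's bias relative to the pooled consensus.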

  12. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent, and the warnings may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut and products that had peanut listed as a minor ingredient, 8.6% and 37.5%, respectively, contained detectable levels of peanut (>2.5 ppm whole peanut). Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory-labeled nutrition bars contained the highest levels of peanut, and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed that the risk of a reaction to peanut-allergic consumers from advisory-labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to help determine when advisory labeling is most appropriate. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes, and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3 ki/ki, Isl2-EphA3 ki/+, ephrin-A2,A3,A5 triple knock-out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software that we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  14. Quantitative Assessment of Molecular Dynamics Sampling for Flexible Systems.

    PubMed

    Nemec, Mike; Hoffmann, Daniel

    2017-02-14

    Molecular dynamics (MD) simulation is a natural method for the study of flexible molecules but at the same time is limited by the large size of the conformational space of these molecules. We ask by how much the MD sampling quality for flexible molecules can be improved by two means: the use of diverse sets of trajectories starting from different initial conformations to detect deviations between samples, and sampling with enhanced methods such as accelerated MD (aMD) or scaled MD (sMD) that distort the energy landscape in controlled ways. To this end, we test the effects of these approaches on MD simulations of two flexible biomolecules in aqueous solution, Met-Enkephalin (5 amino acids) and HIV-1 gp120 V3 (a cycle of 35 amino acids). We assess the convergence of the sampling quantitatively with known, extensive measures of cluster number N_c and cluster distribution entropy S_c, and with two new quantities, conformational overlap O_conf and density overlap O_dens, both conveniently ranging from 0 to 1. These new overlap measures quantify self-consistency of sampling in multitrajectory MD experiments, a necessary condition for converged sampling. A comprehensive assessment of sampling quality of MD experiments identifies the combination of diverse trajectory sets and aMD as the most efficient approach among those tested. However, analysis of O_dens between conventional and aMD trajectories also reveals that we have not completely corrected aMD sampling for the distorted energy landscape. Moreover, for V3, the courses of N_c and O_dens indicate that much higher resources than those generally invested today will probably be needed to achieve convergence. The comparative analysis also shows that conventional MD simulations with insufficient sampling can easily be misinterpreted as being converged.
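    The density-overlap idea, checking whether independent trajectories sample the same conformational density, can be illustrated with a one-dimensional histogram overlap that is 1 for identical samples and 0 for disjoint ones. The paper's O_dens is defined over high-dimensional conformational densities, so this is only a toy analogue of the concept:

```python
import random

def density_overlap(sample_a, sample_b, bins=20, lo=0.0, hi=1.0):
    """Sum of per-bin minima of two normalized histograms (range [0, 1])."""
    width = (hi - lo) / bins

    def hist(sample):
        counts = [0] * bins
        for x in sample:
            idx = min(int((x - lo) / width), bins - 1)  # clamp x == hi into last bin
            counts[idx] += 1
        return [c / len(sample) for c in counts]

    ha, hb = hist(sample_a), hist(sample_b)
    return sum(min(a, b) for a, b in zip(ha, hb))

random.seed(0)
run_1 = [random.random() for _ in range(5000)]
run_2 = [random.random() for _ in range(5000)]
print(round(density_overlap(run_1, run_2), 2))    # high: same underlying distribution
print(density_overlap([0.1] * 100, [0.9] * 100))  # 0.0: disjoint samples
```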

  15. Cerebral Microbleeds: Burden Assessment by Using Quantitative Susceptibility Mapping

    PubMed Central

    Liu, Tian; Surapaneni, Krishna; Lou, Min; Cheng, Liuquan; Spincemaille, Pascal

    2012-01-01

    Purpose: To assess quantitative susceptibility mapping (QSM) for reducing the inconsistency of standard magnetic resonance (MR) imaging sequences in measurements of cerebral microbleed burden. Materials and Methods: This retrospective study was HIPAA compliant and institutional review board approved. Ten patients (5.6%) were selected from among 178 consecutive patients suspected of having experienced a stroke who were imaged with a multiecho gradient-echo sequence at 3.0 T and who had cerebral microbleeds on T2*-weighted images. QSM was performed for various ranges of echo time by using both the magnitude and phase components in the morphology-enabled dipole inversion method. Cerebral microbleed size was measured by two neuroradiologists on QSM images, T2*-weighted images, susceptibility-weighted (SW) images, and R2* maps calculated by using different echo times. The sum of susceptibility over a region containing a cerebral microbleed was also estimated on QSM images as its total susceptibility. Measurement differences were assessed by using the Student t test and the F test; P < .05 was considered to indicate a statistically significant difference. Results: When echo time was increased from approximately 20 to 40 msec, the measured cerebral microbleed volume increased by mean factors of 1.49 ± 0.86 (standard deviation), 1.64 ± 0.84, 2.30 ± 1.20, and 2.30 ± 1.19 for QSM, R2*, T2*-weighted, and SW images, respectively (P < .01). However, the measured total susceptibility with QSM did not show significant change over echo time (P = .31), and the variation was significantly smaller than any of the volume increases (P < .01 for each). Conclusion: The total susceptibility of a cerebral microbleed measured by using QSM is a physical property that is independent of echo time. © RSNA, 2011 PMID:22056688
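    The study's core finding, that an integrated quantity (total susceptibility) is stable while a thresholded volume grows with echo time, can be illustrated with a toy 1-D "microbleed" profile: blooming spreads the signal, inflating the apparent volume while preserving the integral. The numbers below are illustrative only, not QSM physics:

```python
def above_threshold_volume(profile, thr):
    """Count voxels whose value exceeds the threshold (a 'measured volume')."""
    return sum(1 for v in profile if v > thr)

def total_signal(profile):
    """Integrated signal over the region, independent of any threshold."""
    return sum(profile)

sharp = [0.0, 0.2, 3.0, 5.0, 3.0, 0.2, 0.0]    # short echo time: compact lesion
bloomed = [0.3, 1.2, 2.4, 3.6, 2.4, 1.2, 0.3]  # long echo time: same total, spread out

print(above_threshold_volume(sharp, 1.0), above_threshold_volume(bloomed, 1.0))  # volume inflates
print(total_signal(sharp), total_signal(bloomed))  # integral is (numerically) unchanged
```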

  16. Matrix Effects in Quantitative Assessment of Pharmaceutical Tablets Using Transmission Raman and Near-Infrared (NIR) Spectroscopy.

    PubMed

    Sparén, Anders; Hartman, Madeleine; Fransson, Magnus; Johansson, Jonas; Svensson, Olof

    2015-05-01

    Raman spectroscopy can be an alternative to near-infrared spectroscopy (NIR) for nondestructive quantitative analysis of solid pharmaceutical formulations. Compared with NIR spectra, Raman spectra have much better selectivity, but subsampling has always been an issue for quantitative assessment. Raman spectroscopy in transmission mode has reduced this issue, since a large volume of the sample is measured. The sample matrix, such as the particle size of the drug substance in a tablet, may affect the Raman signal. In this work, matrix effects in transmission NIR and Raman spectroscopy were systematically investigated for a solid pharmaceutical formulation. Tablets were manufactured according to an experimental design, varying the particle size of the drug substance (DS), the particle size of the filler, the compression force, and the content of drug substance. All factors were varied at two levels plus a center point, except the drug substance content, which was varied at five levels. Six tablets from each experimental point were measured with transmission NIR and Raman spectroscopy, and the DS concentration was determined for a third of those tablets. Principal component analysis of the NIR and Raman spectra showed that the drug substance content and particle size, the particle size of the filler, and the compression force affected both NIR and Raman spectra. For quantitative assessment, orthogonal partial least squares regression was applied. All factors varied in the experimental design influenced the prediction of the DS content to some extent, both for NIR and Raman spectroscopy, with the particle size of the filler having the largest effect. When all matrix variations were included in the multivariate calibrations, however, good predictions of all types of tablets were obtained for both NIR and Raman spectroscopy. The prediction error using transmission Raman spectroscopy was about 30% lower than that obtained with transmission NIR spectroscopy.

  17. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    NASA Astrophysics Data System (ADS)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

    Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information on tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions, and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography, are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm⁻² has been performed. Beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for the range of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from a slope of unity when the measured iodine projected thickness is compared with the actual value. Scatter correction before analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses of less than 3 mg cm⁻² within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information on iodinated contrast media can be used to indirectly measure tumour microvessel density and to determine contrast uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to in situ x-ray biopsy and to assessment of the oncolytic effect of anticancer agents is foreseeable.
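    The dual-energy extraction of iodine projected thickness can be sketched as a two-material decomposition: under idealized assumptions (monoenergetic beams, no scatter, no beam hardening), the log-attenuations at two energies form a 2x2 linear system in the projected thicknesses of tissue and iodine. The attenuation coefficients below are placeholders for illustration, not the values used in the paper:

```python
def iodine_projected_thickness(logatt_lo, logatt_hi, mu):
    """Solve the 2x2 system mu @ [t_tissue, t_iodine] = logatt for t_iodine.

    mu = ((mu_tissue_lo, mu_iodine_lo), (mu_tissue_hi, mu_iodine_hi)),
    mass attenuation coefficients in cm^2/g; log-attenuations are ln(I0/I).
    Returns the projected iodine thickness in g/cm^2 via Cramer's rule.
    """
    (a, b), (c, d) = mu
    return (a * logatt_hi - c * logatt_lo) / (a * d - b * c)

# Placeholder coefficients: iodine attenuates far more strongly than soft
# tissue at mammographic energies, which keeps the system well conditioned.
mu = ((0.25, 30.0),   # low-energy beam:  (tissue, iodine)
      (0.20, 12.0))   # high-energy beam: (tissue, iodine)

# Forward-simulate a phantom of known composition, then invert.
t_tissue, t_iodine = 4.0, 0.003  # g/cm^2 ground truth
logatt_lo = mu[0][0] * t_tissue + mu[0][1] * t_iodine
logatt_hi = mu[1][0] * t_tissue + mu[1][1] * t_iodine
print(round(iodine_projected_thickness(logatt_lo, logatt_hi, mu), 6))  # recovers 0.003
```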

  18. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment rolls, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  19. Quantitative detection of astaxanthin and canthaxanthin in Atlantic salmon by resonance Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Ermakov, Igor V.; Ermakova, Maia R.; Gellermann, Werner

    2006-02-01

    The two major carotenoid species found in salmonid muscle tissue are astaxanthin and canthaxanthin. They are taken up from fish feed and are responsible for the attractive red-orange color of salmon fillet. Since carotenoids are powerful antioxidants and biomarkers of nutrient consumption, they are thought to indicate fish health and resistance to disease in fish farm environments. A rapid, accurate, quantitative optical technique for measuring carotenoid content in salmon tissue is therefore of economic interest. We demonstrate fast, selective, quantitative detection of astaxanthin and canthaxanthin in salmon muscle tissue using resonance Raman spectroscopy. By analyzing the strong Raman signals originating from the carbon-carbon double bond stretch vibrations of the carotenoid molecules under blue laser excitation, we are able to quantitatively characterize the concentrations of carotenoids in salmon muscle tissue. To validate the technique, we compared Raman data with absorption measurements of carotenoid extracts in acetone. A close correspondence was observed between the absorption spectra of the tissue extract in acetone and of a pure astaxanthin solution, and the Raman results show a linear dependence between Raman and absorption data. The proposed technique holds promise as a method for rapid screening of carotenoid levels in fish muscle tissue and may be attractive to the fish farm industry for assessing the dietary status of salmon, the risk of infectious disease, and product quality.

  20. Quantitative Assessment of Regional Wall Motion Abnormalities Using Dual-Energy Digital Subtraction Intravenous Ventriculography

    NASA Astrophysics Data System (ADS)

    McCollough, Cynthia H.

    Healthy portions of the left ventricle (LV) can often compensate for regional dysfunction, thereby masking regional disease when global indices of LV function are employed. Thus, quantitation of regional function provides a more useful method of assessing LV function, especially in diseases that have regional effects, such as coronary artery disease. This dissertation studied the ability of a phase-matched dual-energy digital subtraction angiography (DE-DSA) technique to quantitate changes in regional LV systolic volume. The potential benefits and a theoretical description of the DE imaging technique are detailed. A correlated noise reduction algorithm is also presented which raises the signal-to-noise ratio of DE images by a factor of 2-4. Ten open-chest dogs were instrumented with transmural ultrasonic crystals to assess regional LV function in terms of systolic normalized-wall-thickening rate (NWTR) and percent-systolic-thickening (PST). A pneumatic occluder was placed on the left-anterior-descending (LAD) coronary artery to temporarily reduce myocardial blood flow, thereby changing regional LV function in the LAD bed. DE-DSA intravenous left ventriculograms were obtained at control and at four levels of graded myocardial ischemia, as determined by reductions in PST. Phase-matched images displaying changes in systolic contractile function were created by subtracting an end-systolic (ES) control image from ES images acquired at each level of myocardial ischemia. The resulting wall-motion difference signal (WMD), which represents a change in regional systolic volume between the control and ischemic states, was quantitated by videodensitometry and compared with changes in NWTR and PST. Regression analysis of 56 data points from 10 animals shows a linear relationship between WMD and both NWTR and PST: WMD = -2.46 NWTR + 13.9, r = 0.64, p < 0.001; WMD = -2.11 PST + 18.4, r = 0.54, p < 0.001.
Thus, changes in regional ES LV volume between rest and ischemic states, as

  1. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study.

    PubMed

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R

    2017-01-01

    Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of various bone pathologies such as osteoporosis. Axial transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g. geometry, porosity, anisotropy etc.) on the measured wave propagation. In order to address this entanglement problem, we therefore propose a systematic bottom-up approach, in which only one bone property is varied, before addressing interaction effects. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity as well as individual pore characteristics using specifically designed cortical bone phantoms. Fourteen bone phantoms were designed with varying pore size, axial pore number, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured using laser sintering, measured using axial-transmission acoustics and analysed using a full-wave approach. Experimental results were compared to theoretical predictions based on a modified Timoshenko theory. A clear dependence of phase velocity on frequency and porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by between 2-5 m/s per percent of additional porosity, which corresponds to -0.5% to -1.0% of wave speed. While the change in phase velocity due to axial pore number was consistent with the results due to pore size and radial pore number, the relative uncertainties for the estimates were too high to draw any conclusions for this parameter. This work has shown the

  2. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study

    PubMed Central

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R.

    2017-01-01

    Purpose: Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of various bone pathologies such as osteoporosis. Axial transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g. geometry, porosity, anisotropy etc.) on the measured wave propagation. In order to address this entanglement problem, we therefore propose a systematic bottom-up approach, in which only one bone property is varied, before addressing interaction effects. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity as well as individual pore characteristics using specifically designed cortical bone phantoms. Materials and methods: Fourteen bone phantoms were designed with varying pore size, axial pore number, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured using laser sintering, measured using axial-transmission acoustics and analysed using a full-wave approach. Experimental results were compared to theoretical predictions based on a modified Timoshenko theory. Results: A clear dependence of phase velocity on frequency and porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by between 2–5 m/s per percent of additional porosity, which corresponds to -0.5% to -1.0% of wave speed.
While the change in phase velocity due to axial pore number was consistent with the results due to pore size and radial pore number, the relative uncertainties for the estimates were too high to draw any conclusions for this

  3. Assessing Pharmacy Students’ Ability to Accurately Measure Blood Pressure Using a Blood Pressure Simulator Arm

    PubMed Central

    Bryant, Ginelle A.; Haack, Sally L.; North, Andrew M.

    2013-01-01

    Objective. To compare student accuracy in measuring normal and high blood pressures using a simulator arm. Methods. In this prospective, single-blind study involving third-year pharmacy students, simulator arms were programmed with prespecified normal and high blood pressures. Students measured preset normal and high diastolic and systolic blood pressures using a crossover design. Results. One hundred sixteen students completed both blood pressure measurements. There was a significant difference between the accuracy of high systolic blood pressure (HSBP) measurement and normal systolic blood pressure (NSBP) measurement (mean HSBP difference 8.4 ± 10.9 mmHg vs NSBP 3.6 ± 6.4 mmHg; p<0.001). However, there was no difference between the accuracy of high diastolic blood pressure (HDBP) measurement and normal diastolic blood pressure (NDBP) measurement (mean HDBP difference 6.8 ± 9.6 mmHg vs. mean NDBP difference 4.6 ± 4.5 mmHg; p=0.089). Conclusions. Pharmacy students may need additional instruction and experience with taking high blood pressure measurements to ensure they are able to accurately assess this important vital sign. PMID:23788809

  4. Assessing pharmacy students' ability to accurately measure blood pressure using a blood pressure simulator arm.

    PubMed

    Bottenberg, Michelle M; Bryant, Ginelle A; Haack, Sally L; North, Andrew M

    2013-06-12

    To compare student accuracy in measuring normal and high blood pressures using a simulator arm. In this prospective, single-blind study involving third-year pharmacy students, simulator arms were programmed with prespecified normal and high blood pressures. Students measured preset normal and high diastolic and systolic blood pressures using a crossover design. One hundred sixteen students completed both blood pressure measurements. There was a significant difference between the accuracy of high systolic blood pressure (HSBP) measurement and normal systolic blood pressure (NSBP) measurement (mean HSBP difference 8.4 ± 10.9 mmHg vs NSBP 3.6 ± 6.4 mmHg; p<0.001). However, there was no difference between the accuracy of high diastolic blood pressure (HDBP) measurement and normal diastolic blood pressure (NDBP) measurement (mean HDBP difference 6.8 ± 9.6 mmHg vs. mean NDBP difference 4.6 ± 4.5 mmHg; p=0.089). Pharmacy students may need additional instruction and experience with taking high blood pressure measurements to ensure they are able to accurately assess this important vital sign.

  5. A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.

    PubMed

    Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R

    2011-10-01

    It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including the effective population size (density) at the expansion frontier. While our experimental system is a simplification of natural microbial communities, we argue that it constitutes proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.

  6. Assessing the use of Quantitative Light-induced Fluorescence-Digital as a clinical plaque assessment.

    PubMed

    Han, Sun-Young; Kim, Bo-Ra; Ko, Hae-Youn; Kwon, Ho-Keun; Kim, Baek-Il

    2016-03-01

    The aims of this study were to compare the red fluorescent plaque (RF plaque) area detected by Quantitative Light-induced Fluorescence-Digital (QLF-D) with the disclosed plaque area from two-tone disclosure, and to assess the bacterial composition of the RF plaque by real-time PCR. Fifty healthy subjects were included and 600 facial surfaces of their anterior teeth were examined. QLF-D images were taken on two separate occasions (before and after disclosing), and the RF plaque area was calculated based on the Plaque Percent Index (PPI). After disclosing, the stained plaque area was analyzed to investigate its relationship with the RF plaque area, evaluated using Pearson correlation and the paired t-test. RF and non-red fluorescent (non-RF) plaque samples were then obtained from the same subject for real-time PCR testing; in total, 10 plaque samples were compared for the proportions of six bacterial species using the Wilcoxon signed rank test. In the paired t-test, the blue-staining plaque area (9.3±9.2) was similar to the RF plaque area (9.1±14.9, p=0.80) at ΔR20; however, the red-staining plaque area (31.6±20.9) differed significantly from the RF plaque area (p<0.0001). In addition, the proportions of Prevotella intermedia and Streptococcus anginosus were substantially higher in the RF plaque than in the non-RF plaque (p<0.05). The plaque assessment method using QLF-D has the potential to detect mature plaque, and the RF plaque area corresponded to the blue-staining area of two-tone disclosure. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Readability of Wikipedia Pages on Autoimmune Disorders: Systematic Quantitative Assessment

    PubMed Central

    Bragazzi, Nicola Luigi; Brigo, Francesco; Sharif, Kassem; Amital, Howard; McGonagle, Dennis; Shoenfeld, Yehuda; Adawi, Mohammad

    2017-01-01

    Background In the era of new information and communication technologies, the Internet is being increasingly accessed for health-related information. Indeed, recently published patient surveys of people with autoimmune disorders confirmed that the Internet was reported as one of the most important health information sources. Wikipedia, a free online encyclopedia launched in 2001, is generally one of the most visited websites worldwide and is often consulted for health-related information. Objective The main objective of this investigation was to quantitatively assess whether the Wikipedia pages related to autoimmune disorders can be easily accessed by patients and their families, in terms of readability. Methods We obtained and downloaded a list of autoimmune disorders from the American Autoimmune Related Diseases Association (AARDA) website. We analyzed Wikipedia articles for their overall level of readability with 6 different quantitative readability scales: (1) the Flesch Reading Ease, (2) the Gunning Fog Index, (3) the Coleman-Liau Index, (4) the Flesch-Kincaid Grade Level, (5) the Automated Readability Index (ARI), and (6) the Simple Measure of Gobbledygook (SMOG). Further, we investigated the correlation between readability and clinical, pathological, and epidemiological parameters. Moreover, each Wikipedia article was also assessed according to its content, breaking down the readability indices by the main topic of each part (namely, pathogenesis, treatment, diagnosis, and prognosis, plus a section containing paragraphs not falling into any of the previous categories). Results We retrieved 134 diseases from the AARDA website. The Flesch Reading Ease yielded a mean score of 24.34 (SD 10.73), indicating that the sites were very difficult to read and best understood by university graduates, while mean Gunning Fog Index and ARI scores were 16.87 (SD 2.03) and 14.06 (SD 2.12), respectively. The Coleman-Liau Index and the Flesch-Kincaid Grade Level yielded mean scores of 14
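The readability indices used in the study above are closed-form functions of simple token counts. As a sketch, three of them can be written directly from their standard published coefficients (the hard, error-prone part in practice is the tokenizer producing the syllable and complex-word counts, which these functions simply take as inputs):

```python
def flesch_reading_ease(words, sentences, syllables):
    """Higher = easier; scores below ~30 are 'very difficult'
    (best understood by university graduates)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def gunning_fog(words, sentences, complex_words):
    """Estimates the years of formal schooling needed to read the text;
    complex_words = words with three or more syllables."""
    return 0.4 * ((words / sentences) + 100.0 * (complex_words / words))

def automated_readability_index(characters, words, sentences):
    """ARI: grade level from character and word lengths."""
    return 4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43
```

For example, a passage of 100 words in 5 sentences with 150 syllables scores 59.635 on the Flesch Reading Ease, far above the mean of 24.34 reported for the autoimmune-disorder pages.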

  8. The use of mode of action information in risk assessment: quantitative key events/dose-response framework for modeling the dose-response for key events.

    PubMed

    Simon, Ted W; Simons, S Stoney; Preston, R Julian; Boobis, Alan R; Cohen, Samuel M; Doerrer, Nancy G; Fenner-Crisp, Penelope A; McMullin, Tami S; McQueen, Charlene A; Rowlands, J Craig

    2014-08-01

    The HESI RISK21 project formed the Dose-Response/Mode-of-Action Subteam to develop strategies for using all available data (in vitro, in vivo, and in silico) to advance the next generation of chemical risk assessments. A goal of the Subteam is to enhance the existing Mode of Action/Human Relevance Framework and Key Events/Dose Response Framework (KEDRF) to make the best use of quantitative dose-response and timing information for Key Events (KEs). The resulting Quantitative Key Events/Dose-Response Framework (Q-KEDRF) provides a structured quantitative approach for systematic examination of the dose-response and timing of KEs resulting from a dose of a bioactive agent that causes a potential adverse outcome. Two concepts, Associative Events and Modulating Factors, are described as aids to increasing the understanding of mode of action. These concepts are illustrated in two case studies: (1) cholinesterase inhibition by the pesticide chlorpyrifos, which illustrates the necessity of considering quantitative dose-response information when assessing the effect of a Modulating Factor, that is, enzyme polymorphisms in humans; and (2) estrogen-induced uterotrophic responses in rodents, which demonstrate how quantitative dose-response modeling for KEs, the understanding of temporal relationships between KEs, and a counterfactual examination of hypothesized KEs can determine whether they are Associative Events or true KEs.

  9. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  10. Fertility preservation: a pilot study to assess previsit patient knowledge quantitatively.

    PubMed

    Balthazar, Ursula; Fritz, Marc A; Mersereau, Jennifer E

    2011-05-01

    To provide a quantitative assessment of patient knowledge about fertility and fertility preservation treatment options before the initial fertility preservation consultation at a university-based fertility preservation center. Prospective pilot survey containing 13 items assessing patient knowledge about fertility preservation, including the available treatment options and their requirements, success rates, and associated risks. University-based IVF center. Women aged 18 to 41 years with illnesses requiring treatments posing a serious threat to future fertility who were referred for fertility preservation consultation between April 2009 and June 2010. None. Knowledge score. Forty-one eligible patients were identified, and all completed surveys before their consultation. A knowledge score was generated for each patient with 1 point awarded for each correct answer. Overall, patients had poor previsit fertility preservation knowledge (mean score 5.9±2.7). Higher knowledge scores were correlated with personal experience with infertility and previous exposure to fertility preservation treatment information. There was no correlation between knowledge score and age, relationship status, pregnancy history, education, or income. Patients seen for fertility preservation consultation at our university-based center generally tend to be in their early 30s, white, well educated, and married. Previsit knowledge about fertility preservation treatment options was poor and did not correlate with age, education, and relationship status. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  11. Application of High-Performance Liquid Chromatography Coupled with Linear Ion Trap Quadrupole Orbitrap Mass Spectrometry for Qualitative and Quantitative Assessment of Shejin-Liyan Granule Supplements.

    PubMed

    Gu, Jifeng; Wu, Weijun; Huang, Mengwei; Long, Fen; Liu, Xinhua; Zhu, Yizhun

    2018-04-11

    A method for high-performance liquid chromatography coupled with linear ion trap quadrupole Orbitrap high-resolution mass spectrometry (HPLC-LTQ-Orbitrap MS) was developed and validated for the qualitative and quantitative assessment of Shejin-liyan Granule. According to the fragmentation mechanisms and high-resolution MS data, 54 compounds, including fourteen isoflavones, eleven lignans, eight flavonoids, six physalins, six organic acids, four triterpenoid saponins, two xanthones, two alkaloids, and one licorice coumarin, were identified or tentatively characterized. In addition, ten of the representative compounds (matrine, galuteolin, tectoridin, iridin, arctiin, tectorigenin, glycyrrhizic acid, irigenin, arctigenin, and irisflorentin) were quantified using the validated HPLC-LTQ-Orbitrap MS method. The method validation showed good linearity, with coefficients of determination (r²) above 0.9914 for all analytes. The accuracy of the intra- and inter-day variation of the investigated compounds was 95.0-105.0%, and the precision values were less than 4.89%. The mean recoveries and reproducibilities of each analyte were 95.1-104.8%, with relative standard deviations below 4.91%. The method successfully quantified the ten compounds in Shejin-liyan Granule, and the results show that the method is accurate, sensitive, and reliable.
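Two of the validation statistics reported above, spike recovery and relative standard deviation, are straightforward to compute. The functions below use the generic textbook definitions, not necessarily this study's exact validation protocol:

```python
from statistics import mean, stdev

def recovery_percent(measured_spiked, measured_unspiked, added):
    """Spike recovery: (amount found minus background) over amount
    added, expressed as a percentage; ~100% indicates no matrix loss."""
    return 100.0 * (measured_spiked - measured_unspiked) / added

def rsd_percent(values):
    """Relative standard deviation (precision) of replicate
    measurements, as a percentage of the mean."""
    return 100.0 * stdev(values) / mean(values)
```

For instance, replicates of 9.8, 10.0, and 10.2 give an RSD of 2.0%, comfortably inside the <4.91% reported for the ten analytes.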

  12. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.

  13. Quantitative Assessment the Relationship between p21 rs1059234 Polymorphism and Cancer Risk.

    PubMed

    Huang, Yong-Sheng; Fan, Qian-Qian; Li, Chuang; Nie, Meng; Quan, Hong-Yang; Wang, Lin

    2015-01-01

    p21 is a cyclin-dependent kinase inhibitor that can arrest cell proliferation and serve as a tumor suppressor. Although many studies have assessed the relationship between the p21 rs1059234 polymorphism and various cancer risks, no definite conclusion on this association has been reached. To derive a more precise quantitative assessment of the relationship, a large-scale meta-analysis of 5,963 cases and 8,405 controls from 16 eligible published case-control studies was performed. Our analysis suggested that rs1059234 was not associated with overall cancer risk under either the dominant model [(T/T+C/T) vs C/C, OR=1.00, 95% CI: 0.84-1.18] or the recessive model [T/T vs (C/C+C/T), OR=1.03, 95% CI: 0.93-1.15]. However, further stratified analysis showed that rs1059234 was significantly associated with the risk of squamous cell carcinoma of the head and neck (SCCHN). Thus, larger-scale primary studies are still required to further evaluate the interaction of the p21 rs1059234 polymorphism and cancer risk in specific cancer subtypes.
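The pooled odds ratios above come from standard meta-analytic pooling of per-study 2x2 tables. A minimal sketch of fixed-effect, inverse-variance pooling (the abstract does not state which pooling model was used for each analysis, so treat this as a generic illustration):

```python
import math

def log_or_and_se(a, b, c, d):
    """Log odds ratio and its standard error from a 2x2 table:
    a, b = exposed/unexposed cases; c, d = exposed/unexposed controls."""
    lor = math.log((a * d) / (b * c))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return lor, se

def pooled_or_fixed(studies):
    """Fixed-effect (inverse-variance) pooled OR with a 95% CI.
    `studies` is an iterable of (a, b, c, d) tuples."""
    num = den = 0.0
    for a, b, c, d in studies:
        lor, se = log_or_and_se(a, b, c, d)
        w = 1.0 / se ** 2          # weight = inverse variance
        num += w * lor
        den += w
    m, se_pooled = num / den, math.sqrt(1.0 / den)
    return (math.exp(m),
            math.exp(m - 1.96 * se_pooled),
            math.exp(m + 1.96 * se_pooled))
```

Two identical null studies pool to OR = 1.0 with a confidence interval straddling 1, the same qualitative pattern as the dominant-model result above.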

  14. An approach to quantitative sustainability assessment in the early stages of process design.

    PubMed

    Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio

    2008-06-15

    A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.

  15. Dynamic gadolinium-enhanced magnetic resonance imaging allows accurate assessment of the synovial inflammatory activity in rheumatoid arthritis knee joints: a comparison with synovial histology.

    PubMed

    Axelsen, M B; Stoltenberg, M; Poggenborg, R P; Kubassova, O; Boesen, M; Bliddal, H; Hørslev-Petersen, K; Hanson, L G; Østergaard, M

    2012-03-01

    To determine whether dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) evaluated using semi-automatic image processing software can accurately assess synovial inflammation in rheumatoid arthritis (RA) knee joints. In 17 RA patients undergoing knee surgery, the average grade of histological synovial inflammation was determined from four biopsies obtained during surgery. A preoperative series of T(1)-weighted dynamic fast low-angle shot (FLASH) MR images was obtained. Parameters characterizing contrast uptake dynamics, including the initial rate of enhancement (IRE), were generated by the software in three different areas: (I) the entire slice (Whole slice); (II) a manually outlined region of interest (ROI) drawn quickly around the joint, omitting large artefacts such as blood vessels (Quick ROI); and (III) a manually outlined ROI following the synovial capsule of the knee joint (Precise ROI). Intra- and inter-reader agreement was assessed using the intra-class correlation coefficient (ICC). The IRE from the Quick ROI and the Precise ROI correlated strongly with the grade of histological inflammation (Spearman's correlation coefficient (rho) = 0.70, p = 0.001 and rho = 0.74, p = 0.001, respectively). Intra- and inter-reader ICCs were very high (0.93-1.00). No Whole slice parameters correlated with histology. DCE-MRI provides fast and accurate assessment of synovial inflammation in RA patients. Manual outlining of the joint to omit large artefacts is necessary.
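The initial rate of enhancement (IRE) is a heuristic summary of the contrast-uptake curve. One common definition, assumed here because the abstract does not spell out the software's exact formula, is the maximum baseline-normalized slope of the dynamic signal:

```python
import numpy as np

def initial_rate_of_enhancement(signal, t_baseline=3):
    """IRE for one voxel or ROI-mean time course: the maximum
    frame-to-frame increase after contrast arrival, as a percentage of
    the pre-contrast baseline per frame. `signal` is a 1D array over
    the dynamic series; the first `t_baseline` frames are pre-contrast.
    This is an illustrative definition, not the vendor's exact one."""
    signal = np.asarray(signal, dtype=float)
    baseline = signal[:t_baseline].mean()
    # Start the slope search at the last baseline frame so the first
    # post-contrast jump is included.
    slopes = np.diff(signal[t_baseline - 1:])
    return 100.0 * slopes.max() / baseline
```

On a synthetic uptake curve that plateaus after a steep rise, the function returns the steepest segment of the rise, which is the quantity the ROI comparison above ranks across readers.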

  16. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
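The quantitation rests on a simple Poisson argument: randomly placed strand breaks (or alkali-labile lesions) reduce the number-average fragment length, so the lesion frequency is estimated from the difference of reciprocal number-average lengths between treated and control DNA. A minimal sketch of that arithmetic (the gel-to-length-distribution step, the hard part in practice, is assumed done):

```python
def lesions_per_mb(len_treated_kb, len_control_kb):
    """Poisson estimate of lesion frequency from the number-average
    fragment lengths (in kb) of treated vs. control DNA run on a
    denaturing (alkaline) gel:
        phi = 1/Ln(treated) - 1/Ln(control)   [lesions per kb]
    converted here to lesions per Mb."""
    phi_per_kb = 1.0 / len_treated_kb - 1.0 / len_control_kb
    return 1000.0 * phi_per_kb
```

For example, if treatment halves the number-average length from 50 kb to 25 kb, the estimate is 20 lesions per Mb; the subtraction of the control term removes background breakage present before treatment.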

  17. Accurate FRET Measurements within Single Diffusing Biomolecules Using Alternating-Laser Excitation

    PubMed Central

    Lee, Nam Ki; Kapanidis, Achillefs N.; Wang, You; Michalet, Xavier; Mukhopadhyay, Jayanta; Ebright, Richard H.; Weiss, Shimon

    2005-01-01

    Fluorescence resonance energy transfer (FRET) between a donor (D) and an acceptor (A) at the single-molecule level currently provides qualitative information about distance, and quantitative information about kinetics of distance changes. Here, we used the sorting ability of confocal microscopy equipped with alternating-laser excitation (ALEX) to measure accurate FRET efficiencies and distances from single molecules, using corrections that account for cross-talk terms that contaminate the FRET-induced signal, and for differences in the detection efficiency and quantum yield of the probes. ALEX yields accurate FRET independent of instrumental factors, such as excitation intensity or detector alignment. Using DNA fragments, we showed that ALEX-based distances agree well with predictions from a cylindrical model of DNA; ALEX-based distances fit better to theory than distances obtained at the ensemble level. Distance measurements within transcription complexes agreed well with ensemble-FRET measurements, and with structural models based on ensemble-FRET and x-ray crystallography. ALEX can benefit structural analysis of biomolecules, especially when such molecules are inaccessible to conventional structural methods due to heterogeneity or transient nature. PMID:15653725
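The corrections described above enter through a standard corrected-efficiency expression: the sensitized acceptor emission is purged of donor leakage and direct acceptor excitation, and a gamma factor balances the probes' detection efficiencies and quantum yields. A sketch of that expression (argument and coefficient names here are illustrative, not the paper's notation):

```python
def accurate_fret(F_Dex_Aem, F_Dex_Dem, F_Aex_Aem, leakage, direct, gamma):
    """Corrected single-molecule FRET efficiency.
    F_Dex_Aem: acceptor emission under donor excitation (raw FRET signal)
    F_Dex_Dem: donor emission under donor excitation
    F_Aex_Aem: acceptor emission under acceptor excitation (ALEX channel)
    leakage:   fraction of donor emission leaking into the acceptor channel
    direct:    direct acceptor excitation, scaled to F_Aex_Aem
    gamma:     detection-efficiency / quantum-yield correction factor."""
    # Pure sensitized emission after removing the two cross-talk terms
    F = F_Dex_Aem - leakage * F_Dex_Dem - direct * F_Aex_Aem
    return F / (F + gamma * F_Dex_Dem)
```

With no cross-talk and gamma = 1 the expression reduces to the familiar proximity ratio; note how the ALEX-only channel F_Aex_Aem is what makes the direct-excitation correction possible.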

  18. Turning education into action: Impact of a collective social education approach to improve nurses' ability to recognize and accurately assess delirium in hospitalized older patients.

    PubMed

    Travers, Catherine; Henderson, Amanda; Graham, Fred; Beattie, Elizabeth

    2018-03-01

    Although cognitive impairment, including dementia and delirium, is common in older hospital patients, it is not well recognized or managed by hospital staff, potentially resulting in adverse events. This paper describes a collective social education approach to improving nurses' knowledge of delirium and their screening for it, and reports on its impact. Thirty-four experienced nurses from six hospital wards became Cognition Champions (CogChamps) to lead their wards in a collective social education process about cognitive impairment and the assessment of delirium. At the outset, the CogChamps were provided with comprehensive education about dementia and delirium from a multidisciplinary team of clinicians. Their knowledge was assessed to ascertain that they had the requisite understanding to engage in education as a collective social process, namely, with each other and their local teams. They then developed ward-specific Action Plans in collaboration with their teams, aimed at educating ward nurses and evaluating their ability to accurately assess and care for patients with delirium. The plans were implemented over five months. The broader nursing teams' knowledge was assessed, together with their ability to accurately assess patients for delirium. Each ward implemented its Action Plan to varying degrees; key achievements included the education of a majority of ward nurses about delirium and the certification of the majority as competent to assess patients for delirium using the Confusion Assessment Method. Two wards collected pre- and post-audit data that demonstrated a substantial improvement in delirium screening rates. The education process led by CogChamps and supported by educators and clinical experts provides an example of successfully educating nurses about delirium and improving delirium screening rates. ACTRN 12617000563369. Copyright © 2018 Elsevier Ltd. All rights reserved.

  19. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    PubMed

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppresses CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  20. Quantitative benefit-harm assessment for setting research priorities: the example of roflumilast for patients with COPD.

    PubMed

    Puhan, Milo A; Yu, Tsung; Boyd, Cynthia M; Ter Riet, Gerben

    2015-07-02

    When faced with uncertainties about the effects of medical interventions, regulatory agencies, guideline developers, clinicians, and researchers commonly ask for more research, and in particular for more randomized trials. The conduct of additional randomized trials is, however, sometimes not the most efficient way to reduce uncertainty. Instead, approaches such as value of information analysis should be used to prioritize research that will most likely reduce uncertainty and inform decisions. In situations where additional research for specific interventions needs to be prioritized, we propose the use of quantitative benefit-harm assessments that illustrate how the benefit-harm balance may change as a consequence of additional research. The example of roflumilast for patients with chronic obstructive pulmonary disease shows that additional research on patient preferences (e.g., how important are exacerbations relative to psychiatric harms?) or outcome risks (e.g., what is the incidence of psychiatric outcomes in patients with chronic obstructive pulmonary disease without treatment?) is sometimes more valuable than additional randomized trials. We propose that quantitative benefit-harm assessments have the potential to explore the impact of additional research and to identify research priorities. Our approach may be seen as another type of value of information analysis and as a useful way to stimulate specific new research that has the potential to change current estimates of the benefit-harm balance and decision making.

  1. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

    Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ): heuristic, semi-quantitative, quantitative, probabilistic, and multi-criteria decision-making approaches. However, no single method is universally accepted for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare the results in order to find the best-suited model. This paper reviews research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches likewise play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote Sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful for detecting, mapping, and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advances in geospatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.

  2. Assessing signal-to-noise in quantitative proteomics: multivariate statistical analysis in DIGE experiments.

    PubMed

    Friedman, David B

    2012-01-01

    All quantitative proteomics experiments measure variation between samples. When performing large-scale experiments that involve multiple conditions or treatments, the experimental design should include the appropriate number of individual biological replicates from each condition to enable the distinction between a relevant biological signal from technical noise. Multivariate statistical analyses, such as principal component analysis (PCA), provide a global perspective on experimental variation, thereby enabling the assessment of whether the variation describes the expected biological signal or the unanticipated technical/biological noise inherent in the system. Examples will be shown from high-resolution multivariable DIGE experiments where PCA was instrumental in demonstrating biologically significant variation as well as sample outliers, fouled samples, and overriding technical variation that would not be readily observed using standard univariate tests.
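PCA of a replicate-by-feature abundance matrix is compactly done with an SVD of the mean-centered data. A minimal, generic sketch (not DIGE-specific; spot-volume normalization and log transformation would precede this in a real workflow):

```python
import numpy as np

def pca(X):
    """PCA via SVD of the mean-centered matrix X (samples x features).
    Returns (scores, var): sample coordinates on the principal
    components, and the fraction of total variance each PC explains."""
    Xc = X - X.mean(axis=0)                  # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U * S                           # PC coordinates per sample
    var = S ** 2 / (S ** 2).sum()            # variance explained per PC
    return scores, var
```

On synthetic data where one group of replicates is offset along a single feature, PC1 captures the group difference and dominates the variance, which is exactly how outliers and overriding technical variation show up in the DIGE examples described above.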

  3. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  4. Approaches to advancing quantitative human health risk assessment of environmental chemicals in the post-genomic era.

    PubMed

    Chiu, Weihsueh A; Euling, Susan Y; Scott, Cheryl Siegel; Subramaniam, Ravi P

    2013-09-15

    The contribution of genomics and associated technologies to human health risk assessment for environmental chemicals has focused largely on elucidating mechanisms of toxicity, as discussed in other articles in this issue. However, there is interest in moving beyond hazard characterization to making more direct impacts on quantitative risk assessment (QRA)--i.e., the determination of toxicity values for setting exposure standards and cleanup values. We propose that the evolution of QRA of environmental chemicals in the post-genomic era will involve three, somewhat overlapping phases in which different types of approaches begin to mature. The initial focus (in Phase I) has been and continues to be on "augmentation" of weight of evidence--using genomic and related technologies qualitatively to increase the confidence in and scientific basis of the results of QRA. Efforts aimed towards "integration" of these data with traditional animal-based approaches, in particular quantitative predictors, or surrogates, for the in vivo toxicity data to which they have been anchored are just beginning to be explored now (in Phase II). In parallel, there is a recognized need for "expansion" of the use of established biomarkers of susceptibility or risk of human diseases and disorders for QRA, particularly for addressing the issues of cumulative assessment and population risk. Ultimately (in Phase III), substantial further advances could be realized by the development of novel molecular and pathway-based biomarkers and statistical and in silico models that build on anticipated progress in understanding the pathways of human diseases and disorders. Such efforts would facilitate a gradual "reorientation" of QRA towards approaches that more directly link environmental exposures to human outcomes. Published by Elsevier Inc.

  5. Quantitative somatosensory testing of the penis: optimizing the clinical neurological examination.

    PubMed

    Bleustein, Clifford B; Eckholdt, Haftan; Arezzo, Joseph C; Melman, Arnold

    2003-06-01

    Quantitative somatosensory testing, including vibration, pressure, spatial perception, and thermal thresholds of the penis, has demonstrated neuropathy in patients with a history of erectile dysfunction of all etiologies. We evaluated which measurement of neurological function of the penis was best at predicting erectile dysfunction and examined the impact of measurement location on quantitative somatosensory testing. A total of 107 patients were evaluated. All patients were required to complete the erectile function domain of the International Index of Erectile Function (IIEF) questionnaire, of whom 24 had no complaints of erectile dysfunction and scored within the "normal" range on the IIEF. Patients were subsequently tested on the ventral middle penile shaft, proximal dorsal midline penile shaft, and glans penis (with foreskin retracted) for vibration, pressure, spatial perception, and warm and cold thermal thresholds. Mixed-models repeated-measures analysis of variance controlling for age, diabetes, and hypertension revealed that the method of measurement (quantitative somatosensory testing) was predictive of IIEF score (F = 209, df = 4,1315, p <0.001), while the site of measurement on the penis was not. To determine the best method of measurement, we used hierarchical regression, which revealed that the warm thermal threshold was the best predictor of erectile dysfunction (pseudo R(2) = 0.19, p <0.0007). There was no significant improvement in predicting erectile dysfunction when another test was added. Using 37°C or greater as the warm thermal threshold yielded a sensitivity of 88.5%, specificity of 70.0%, and positive predictive value of 85.5%. Quantitative somatosensory testing using warm thermal threshold measurements taken at the glans penis can be used alone to assess the neurological status of the penis. Warm thermal thresholds alone offer a quick, noninvasive, and accurate method of evaluating penile neuropathy in an office setting.
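The sensitivity, specificity, and positive predictive value quoted for the 37°C cutoff are the usual 2x2-table quantities obtained by dichotomizing the threshold measurement. As a sketch (the counts in the test are made up for illustration, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, and positive predictive value from
    the 2x2 table produced by dichotomizing a continuous test (e.g.,
    warm threshold >= cutoff) against a reference diagnosis."""
    sensitivity = tp / (tp + fn)   # detected among the diseased
    specificity = tn / (tn + fp)   # cleared among the healthy
    ppv = tp / (tp + fp)           # diseased among test positives
    return sensitivity, specificity, ppv
```

Note that PPV, unlike sensitivity and specificity, depends on the prevalence in the tested sample, which is why the 85.5% figure is specific to this referral population.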

  6. Assessing the properties of internal standards for quantitative matrix-assisted laser desorption/ionization mass spectrometry of small molecules.

    PubMed

    Sleno, Lekha; Volmer, Dietrich A

    2006-01-01

    Growing interest in the ability to conduct quantitative assays for small molecules by matrix-assisted laser desorption/ionization (MALDI) has been the driving force for several recent studies. This present work includes the investigation of internal standards for these analyses using a high-repetition rate MALDI triple quadrupole instrument. Certain physicochemical properties are assessed for predicting possible matches for internal standards for different small molecules. The importance of similar molecular weight of an internal standard to its analyte is seen through experiments with a series of acylcarnitines, having a fixed charge site and growing alkyl chain length. Both acetyl- and hexanoyl-carnitine were systematically assessed with several other acylcarnitine compounds as internal standards. The results clearly demonstrate that closely matched molecular weights between analyte and internal standard are essential for acceptable quantitation results. Using alpha-cyano-4-hydroxycinnamic acid as the organic matrix, the similarities between analyte and internal standard remain the most important parameter and not necessarily their even distribution within the solid sample spot. Several 4-quinolone antibiotics as well as a diverse group of pharmaceutical drugs were tested as internal standards for the 4-quinolone, ciprofloxacin. Quantitative results were shown using the solution-phase properties, log D and pKa, of these molecules. Their distribution coefficients, log D, are demonstrated as a fundamental parameter for similar crystallization patterns of analyte and internal standard. In the end, it was also possible to quantify ciprofloxacin using a drug from a different compound class, namely quinidine, having a similar log D value as the analyte. Copyright 2006 John Wiley & Sons, Ltd.
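
    The abstract's two selection criteria, closely matched molecular weight and similar log D, can be sketched as a simple candidate-ranking step. The compound values are approximate and for illustration only, and the weighting scheme is an assumption, not from the paper:

```python
def rank_internal_standards(analyte, candidates, w_mw=1.0, w_logd=1.0):
    """Rank candidate internal standards by similarity to the analyte.
    Scores combine the relative molecular-weight difference and the
    log D difference; smaller is better. Weights are tunable assumptions."""
    def score(c):
        d_mw = abs(c["mw"] - analyte["mw"]) / analyte["mw"]
        d_logd = abs(c["logd"] - analyte["logd"])
        return w_mw * d_mw + w_logd * d_logd
    return sorted(candidates, key=score)

# Illustrative, approximate values only:
ciprofloxacin = {"name": "ciprofloxacin", "mw": 331.3, "logd": -1.1}
candidates = [
    {"name": "quinidine", "mw": 324.4, "logd": -1.0},
    {"name": "ibuprofen", "mw": 206.3, "logd": 1.4},
]
best = rank_internal_standards(ciprofloxacin, candidates)[0]["name"]
```

With these values the ranking favors quinidine, mirroring the paper's finding that a different compound class can serve as internal standard when log D is well matched.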

  7. Quantitative Imaging with a Mobile Phone Microscope

    PubMed Central

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  8. Optimization of metabolite basis sets prior to quantitation in magnetic resonance spectroscopy: an approach based on quantum mechanics

    NASA Astrophysics Data System (ADS)

    Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
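
    The core of the correction step, shifting a simulated basis signal to maximize its normalized cross-correlation with the measured signal, can be sketched on discretized spectra. This integer-shift version is a simplification of the continuous chemical-shift optimization in QM-QUEST:

```python
def best_shift(signal, basis, max_shift):
    """Find the integer shift of `basis` that maximizes normalized
    cross-correlation with `signal` -- a discrete analogue of the
    chemical-shift correction step."""
    def ncc(a, b):
        n = min(len(a), len(b))
        a, b = a[:n], b[:n]
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
        den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
        return num / den if den else 0.0
    scores = {}
    for s in range(-max_shift, max_shift + 1):
        # Positive s slides the signal left relative to the basis; negative s, right.
        scores[s] = ncc(signal[s:], basis) if s >= 0 else ncc(signal, basis[-s:])
    return max(scores, key=scores.get)

# A peak at index 12 in the measured signal vs. index 10 in the basis metabolite:
signal = [0.0] * 30
signal[12] = 1.0
basis = [0.0] * 30
basis[10] = 1.0
shift = best_shift(signal, basis, max_shift=5)
```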

  9. Quantitative analysis of binary polymorphs mixtures of fusidic acid by diffuse reflectance FTIR spectroscopy, diffuse reflectance FT-NIR spectroscopy, Raman spectroscopy and multivariate calibration.

    PubMed

    Guo, Canyong; Luo, Xuefang; Zhou, Xiaohua; Shi, Beijia; Wang, Juanjuan; Zhao, Jinqi; Zhang, Xiaoxia

    2017-06-05

    Vibrational spectroscopic techniques such as infrared, near-infrared and Raman spectroscopy have become popular in detecting and quantifying polymorphism of pharmaceuticals since they are fast and non-destructive. This study assessed the ability of three vibrational spectroscopic techniques combined with multivariate analysis to quantify a low-content undesired polymorph within a binary polymorphic mixture. Partial least squares (PLS) regression and support vector machine (SVM) regression were employed to build quantitative models. Fusidic acid, a steroidal antibiotic, was used as the model compound. It was found that PLS regression performed slightly better than SVM regression in all three spectroscopic techniques. Root mean square errors of prediction (RMSEP) ranged from 0.48% to 1.17% for diffuse reflectance FTIR spectroscopy, 1.60% to 1.93% for diffuse reflectance FT-NIR spectroscopy, and 1.62% to 2.31% for Raman spectroscopy. The results indicate that diffuse reflectance FTIR spectroscopy offers significant advantages in providing accurate measurement of polymorphic content in the fusidic acid binary mixtures, while Raman spectroscopy is the least accurate technique for quantitative analysis of polymorphs. Copyright © 2017 Elsevier B.V. All rights reserved.
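
    The RMSEP figures used to compare the calibration models are computed as below; the predicted and reference polymorph contents are hypothetical:

```python
def rmsep(predicted, reference):
    """Root mean square error of prediction, as used to compare the
    PLS and SVM calibration models (same %-content units as the inputs)."""
    n = len(predicted)
    return (sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n) ** 0.5

# Hypothetical predicted vs. reference polymorph contents (% w/w):
pred = [2.1, 4.9, 10.3, 19.6]
ref = [2.0, 5.0, 10.0, 20.0]
err = rmsep(pred, ref)
```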

  10. Facile and quantitative electrochemical detection of yeast cell apoptosis

    NASA Astrophysics Data System (ADS)

    Yue, Qiulin; Xiong, Shiquan; Cai, Dongqing; Wu, Zhengyan; Zhang, Xin

    2014-03-01

    An electrochemical method based on square wave anodic stripping voltammetry (SWASV) was developed to detect the apoptosis of yeast cells conveniently and quantitatively through the high affinity between Cu2+ and phosphatidylserine (PS) translocated from the inner to the outer plasma membrane of the apoptotic cells. The combination of negatively charged PS and Cu2+ could decrease the electrochemical response of Cu2+ on the electrode. The results showed that the apoptotic rates of cells could be detected quantitatively through the variations of peak currents of Cu2+ by SWASV, and agreed well with those obtained through traditional flow cytometry detection. This work thus may provide a novel, simple, immediate and accurate detection method for cell apoptosis.

  11. Synthesis of Survey Questions That Accurately Discriminate the Elements of the TPACK Framework

    ERIC Educational Resources Information Center

    Jaikaran-Doe, Seeta; Doe, Peter Edward

    2015-01-01

    A number of validated survey instruments for assessing technological pedagogical content knowledge (TPACK) do not accurately discriminate between the seven elements of the TPACK framework particularly technological content knowledge (TCK) and technological pedagogical knowledge (TPK). By posing simple questions that assess technological,…

  12. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem services flow provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services. The types of ecosystem services flow were reclassified. Using entropy theory, the degree of disorder and the development trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, suggesting a high degree of disorder, with the system close to the verge of non-health. The system entropy reached its maximum value three times, while the mean annual variation of the system entropy value increased gradually over three periods, indicating that human activities had negative effects on the urban ecosystem. Entropy flow reached its minimum value in 2007, implying that environmental quality was best that year. The determination coefficient for the fitting function of total permanent population in Beijing and urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with total permanent population.
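
    A minimal sketch of the entropy calculation underlying this style of assessment, using the normalized Shannon entropy common in entropy-weight methods; the exact aggregation the authors use may differ:

```python
from math import log

def entropy_value(proportions):
    """Normalized Shannon entropy of an indicator's proportions across years:
    1.0 = maximally disordered, 0.0 = fully ordered. A standard step in
    entropy-weight assessments."""
    k = 1.0 / log(len(proportions))
    return -k * sum(p * log(p) for p in proportions if p > 0)

# Hypothetical shares of one indicator over four years:
e_uniform = entropy_value([0.25, 0.25, 0.25, 0.25])  # uniform shares: maximal disorder
e_skewed = entropy_value([0.7, 0.1, 0.1, 0.1])       # concentrated shares: lower entropy
```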

  13. Quantitative Ultrasound Assessment of Duchenne Muscular Dystrophy Using Edge Detection Analysis.

    PubMed

    Koppaka, Sisir; Shklyar, Irina; Rutkove, Seward B; Darras, Basil T; Anthony, Brian W; Zaidman, Craig M; Wu, Jim S

    2016-09-01

    The purpose of this study was to investigate the ability of quantitative ultrasound (US) using edge detection analysis to assess patients with Duchenne muscular dystrophy (DMD). After Institutional Review Board approval, US examinations with fixed technical parameters were performed unilaterally in 6 muscles (biceps, deltoid, wrist flexors, quadriceps, medial gastrocnemius, and tibialis anterior) in 19 boys with DMD and 21 age-matched control participants. The muscles of interest were outlined by a tracing tool, and the upper third of the muscle was used for analysis. Edge detection values for each muscle were quantified by the Canny edge detection algorithm and then normalized to the number of edge pixels in the muscle region. The edge detection values were extracted at multiple sensitivity thresholds (0.01-0.99) to determine the optimal threshold for distinguishing DMD from normal. Area under the receiver operating curve values were generated for each muscle and averaged across the 6 muscles. The average age in the DMD group was 8.8 years (range, 3.0-14.3 years), and the average age in the control group was 8.7 years (range, 3.4-13.5 years). For edge detection, a Canny threshold of 0.05 provided the best discrimination between DMD and normal (area under the curve, 0.96; 95% confidence interval, 0.84-1.00). According to a Mann-Whitney test, edge detection values were significantly different between DMD and controls (P < .0001). Quantitative US imaging using edge detection can distinguish patients with DMD from healthy controls at low Canny thresholds, at which discrimination of small structures is best. Edge detection by itself or in combination with other tests can potentially serve as a useful biomarker of disease progression and effectiveness of therapy in muscle disorders.
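
    The normalized edge-pixel count can be illustrated with a simplified gradient-threshold detector standing in for the full Canny pipeline (which adds Gaussian smoothing, non-maximum suppression, and hysteresis); the image and threshold below are toy values:

```python
def edge_density(image, threshold):
    """Fraction of pixels whose gradient magnitude exceeds `threshold` --
    a simplified stand-in for the normalized Canny edge-pixel count."""
    rows, cols = len(image), len(image[0])
    edges = 0
    for i in range(rows - 1):
        for j in range(cols - 1):
            gx = image[i][j + 1] - image[i][j]   # horizontal forward difference
            gy = image[i + 1][j] - image[i][j]   # vertical forward difference
            if (gx * gx + gy * gy) ** 0.5 > threshold:
                edges += 1
    return edges / ((rows - 1) * (cols - 1))

# A toy "muscle region" with one bright vertical stripe:
img = [[0, 0, 1, 0] for _ in range(4)]
density = edge_density(img, threshold=0.5)
```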

  14. Micro-anatomical quantitative optical imaging: toward automated assessment of breast tissues.

    PubMed

    Dobbs, Jessica L; Mueller, Jenna L; Krishnamurthy, Savitri; Shin, Dongsuk; Kuerer, Henry; Yang, Wei; Ramanujam, Nirmala; Richards-Kortum, Rebecca

    2015-08-20

    Pathologists currently diagnose breast lesions through histologic assessment, which requires fixation and tissue preparation. The diagnostic criteria used to classify breast lesions are qualitative and subjective, and inter-observer discordance has been shown to be a significant challenge in the diagnosis of selected breast lesions, particularly for borderline proliferative lesions. Thus, there is an opportunity to develop tools to rapidly visualize and quantitatively interpret breast tissue morphology for a variety of clinical applications. Toward this end, we acquired images of freshly excised breast tissue specimens from a total of 34 patients using confocal fluorescence microscopy and proflavine as a topical stain. We developed computerized algorithms to segment and quantify nuclear and ductal parameters that characterize breast architectural features. A total of 33 parameters were evaluated and used as input to develop a decision tree model to classify benign and malignant breast tissue. Benign features were classified in tissue specimens acquired from 30 patients and malignant features were classified in specimens from 22 patients. The decision tree model that achieved the highest accuracy for distinguishing between benign and malignant breast features used the following parameters: standard deviation of inter-nuclear distance and number of duct lumens. The model achieved 81 % sensitivity and 93 % specificity, corresponding to an area under the curve of 0.93 and an overall accuracy of 90 %. The model classified IDC and DCIS with 92 % and 96 % accuracy, respectively. The cross-validated model achieved 75 % sensitivity and 93 % specificity and an overall accuracy of 88 %. These results suggest that proflavine staining and confocal fluorescence microscopy combined with image analysis strategies to segment morphological features could potentially be used to quantitatively diagnose freshly obtained breast tissue at the point of care without the need for

  15. Quantitation of dissolved gas content in emulsions and in blood using mass spectrometric detection

    PubMed Central

    Grimley, Everett; Turner, Nicole; Newell, Clayton; Simpkins, Cuthbert; Rodriguez, Juan

    2011-01-01

    Quantitation of dissolved gases in blood or in other biological media is essential for understanding the dynamics of metabolic processes. Current detection techniques, while enabling rapid and convenient assessment of dissolved gases, provide only direct information on the partial pressure of gases dissolved in the aqueous fraction of the fluid. The more relevant quantity known as gas content, which refers to the total amount of the gas in all fractions of the sample, can be inferred from those partial pressures, but only indirectly through mathematical modeling. Here we describe a simple mass spectrometric technique for rapid and direct quantitation of gas content for a wide range of gases. The technique is based on a mass spectrometer detector that continuously monitors gases that are rapidly extracted from samples injected into a purge vessel. The accuracy and sample processing speed of the system is demonstrated with experiments that reproduce within minutes literature values for the solubility of various gases in water. The capability of the technique is further demonstrated through accurate determination of O2 content in a lipid emulsion and in whole blood, using as little as 20 μL of sample. The approach to gas content quantitation described here should greatly expand the range of animals and conditions that may be used in studies of metabolic gas exchange, and facilitate the development of artificial oxygen carriers and resuscitation fluids. PMID:21497566
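
    The distinction the authors draw between partial pressure and gas content is captured by the classical whole-blood oxygen-content relation, in which the dissolved term (what partial-pressure sensors report) is a small fraction of the total; the coefficients are standard textbook values, not from this paper:

```python
def o2_content(hb_g_dl, sao2, pao2_mmhg):
    """Classical oxygen content of whole blood (mL O2 / dL):
    hemoglobin-bound term plus the dissolved term that partial-pressure
    methods capture directly. Illustrates why content != partial pressure."""
    bound = 1.34 * hb_g_dl * sao2      # ~1.34 mL O2 per g of saturated hemoglobin
    dissolved = 0.003 * pao2_mmhg      # solubility ~0.003 mL O2 / dL / mmHg
    return bound + dissolved

# Typical arterial values: Hb 15 g/dL, 98% saturation, PaO2 100 mmHg:
cao2 = o2_content(15.0, 0.98, 100.0)
```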

  16. Quantitative safety assessment of air traffic control systems through system control capacity

    NASA Astrophysics Data System (ADS)

    Guo, Jingjing

    Quantitative Safety Assessments (QSA) are essential to safety benefit verification and regulations of developmental changes in safety critical systems like the Air Traffic Control (ATC) systems. Effectiveness of the assessments is particularly desirable today in the safe implementations of revolutionary ATC overhauls like NextGen and SESAR. QSA of ATC systems is, however, challenged by system complexity and lack of accident data. Extending from the idea "safety is a control problem" in the literature, this research proposes to assess system safety from the control perspective, through quantifying a system's "control capacity". A system's safety performance correlates to this "control capacity" in the control of "safety critical processes". To examine this idea in QSA of the ATC systems, a Control-capacity Based Safety Assessment Framework (CBSAF) is developed, which includes two control capacity metrics and a procedural method. The two metrics are Probabilistic System Control-capacity (PSC) and Temporal System Control-capacity (TSC); each addresses an aspect of a system's control capacity. The procedural method consists of three general stages: I) identification of safety critical processes, II) development of system control models and III) evaluation of system control capacity. The CBSAF was tested in two case studies. The first one assesses an en-route collision avoidance scenario and compares three hypothetical configurations. The CBSAF was able to capture the uncoordinated behavior between two means of control, as was observed in a historic midair collision accident. The second case study compares CBSAF with an existing risk based QSA method in assessing the safety benefits of introducing a runway incursion alert system. Similar conclusions are reached between the two methods, while the CBSAF has the advantage of simplicity and provides a new control-based perspective and interpretation to the assessments. The case studies are intended to investigate the

  17. Hydrologic connectivity: Quantitative assessments of hydrologic-enforced drainage structures in an elevation model

    USGS Publications Warehouse

    Poppenga, Sandra K.; Worstell, Bruce B.

    2016-01-01

    Elevation data derived from light detection and ranging present challenges for hydrologic modeling as the elevation surface includes bridge decks and elevated road features overlaying culvert drainage structures. In reality, water is carried through these structures; however, in the elevation surface these features impede modeled overland surface flow. Thus, a hydrologically-enforced elevation surface is needed for hydrodynamic modeling. In the Delaware River Basin, hydrologic-enforcement techniques were used to modify elevations to simulate how constructed drainage structures allow overland surface flow. By calculating residuals between unfilled and filled elevation surfaces, artificially pooled depressions that formed upstream of constructed drainage structure features were defined, and elevation values were adjusted by generating transects at the location of the drainage structures. An assessment of each hydrologically-enforced drainage structure was conducted using field-surveyed culvert and bridge coordinates obtained from numerous public agencies, but it was discovered the disparate drainage structure datasets were not comprehensive enough to assess all remotely located depressions in need of hydrologic-enforcement. Alternatively, orthoimagery was interpreted to define drainage structures near each depression, and these locations were used as reference points for a quantitative hydrologic-enforcement assessment. The orthoimagery-interpreted reference points resulted in a larger corresponding sample size than the assessment between hydrologic-enforced transects and field-surveyed data. This assessment demonstrates the viability of rules-based hydrologic-enforcement that is needed to achieve hydrologic connectivity, which is valuable for hydrodynamic models in sensitive coastal regions. Hydrologic-enforced elevation data are also essential for merging with topographic/bathymetric elevation data that extend over vulnerable urbanized areas and dynamic coastal
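
    The residual between filled and unfilled elevation surfaces can be illustrated in one dimension, where a pit-filling pass raises each cell to its lowest escape barrier; nonzero residuals flag depressions pooled behind road or bridge features. This is a simplified stand-in for the 2-D DEM workflow:

```python
def fill_depressions_1d(elev):
    """Fill pits in a 1-D elevation profile: each cell is raised to the
    lower of its highest barriers to the left and right. The residual
    (filled - raw) marks artificially pooled depressions."""
    n = len(elev)
    left, right = [0] * n, [0] * n
    m = elev[0]
    for i in range(n):                 # running maximum from the left
        m = max(m, elev[i])
        left[i] = m
    m = elev[-1]
    for i in range(n - 1, -1, -1):     # running maximum from the right
        m = max(m, elev[i])
        right[i] = m
    return [max(e, min(l, r)) for e, l, r in zip(elev, left, right)]

# A road embankment (height 5) over a culvert creates a pit upstream:
profile = [3, 4, 2, 5, 1]
filled = fill_depressions_1d(profile)
residual = [f - e for f, e in zip(filled, profile)]
```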

  18. Seeing and Being Seen: Predictors of Accurate Perceptions about Classmates’ Relationships

    PubMed Central

    Neal, Jennifer Watling; Neal, Zachary P.; Cappella, Elise

    2015-01-01

    This study examines predictors of observer accuracy (i.e. seeing) and target accuracy (i.e. being seen) in perceptions of classmates’ relationships in a predominantly African American sample of 420 second through fourth graders (ages 7 – 11). Girls, children in higher grades, and children in smaller classrooms were more accurate observers. Targets (i.e. pairs of children) were more accurately observed when they occurred in smaller classrooms of higher grades and involved same-sex, high-popularity, and similar-popularity children. Moreover, relationships between pairs of girls were more accurately observed than relationships between pairs of boys. As a set, these findings suggest the importance of both observer and target characteristics for children’s accurate perceptions of classroom relationships. Moreover, the substantial variation in observer accuracy and target accuracy has methodological implications for both peer-reported assessments of classroom relationships and the use of stochastic actor-based models to understand peer selection and socialization processes. PMID:26347582

  19. Efficient quantitative assessment of facial paralysis using iris segmentation and active contour-based key points detection with hybrid classifier.

    PubMed

    Barbosa, Jocelyn; Lee, Kyubum; Lee, Sunwon; Lodhi, Bilal; Cho, Jae-Gu; Seo, Woo-Keun; Kang, Jaewoo

    2016-03-12

    Facial palsy or paralysis (FP) is a symptom involving the loss of voluntary muscle movement on one side of the human face, which can be devastating for patients. Traditional methods are solely dependent on the clinician's judgment and are therefore time consuming and subjective in nature. Hence, a quantitative assessment system is invaluable for physicians to begin the rehabilitation process, yet producing a reliable and robust method is challenging and still underway. We introduce a novel approach for quantitative assessment of facial paralysis that tackles the classification problem for FP type and degree of severity. Specifically, a novel method of quantitative assessment is presented: an algorithm that extracts the human iris and detects facial landmarks, and a hybrid approach combining rule-based and machine learning algorithms to analyze and prognosticate facial paralysis using the captured images. A method combining the optimized Daugman's algorithm and a Localized Active Contour (LAC) model is proposed to efficiently extract the iris and facial landmarks (key points). To improve the performance of LAC, appropriate parameters of the initial evolving curve for facial feature segmentation are automatically selected. The symmetry score is measured by the ratio between features extracted from the two sides of the face. Hybrid classifiers (i.e., rule-based with regularized logistic regression) were employed for discriminating healthy and unhealthy subjects, FP type classification, and facial paralysis grading based on the House-Brackmann (H-B) scale. Quantitative analysis was performed to evaluate the performance of the proposed approach. Experiments show that the proposed method demonstrates its efficiency. Facial movement feature extraction on facial images based on iris segmentation and LAC-based key point detection, along with a hybrid classifier, provides a more efficient way of addressing the classification problem of facial palsy type and degree
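
    The symmetry score described above is a ratio of paired features; a toy sketch with hypothetical House-Brackmann cut-offs (the paper instead learns its grading with the hybrid classifier):

```python
def symmetry_score(left_feature, right_feature):
    """Ratio of a facial-movement feature measured on the two sides
    of the face; 1.0 = perfect symmetry."""
    lo, hi = sorted([left_feature, right_feature])
    return lo / hi if hi else 1.0

def hb_grade(score, cutoffs=(0.9, 0.7, 0.5, 0.3, 0.1)):
    """Map a symmetry score onto House-Brackmann grades I-VI.
    The cut-offs here are illustrative assumptions only."""
    for grade, cut in enumerate(cutoffs, start=1):
        if score >= cut:
            return grade
    return 6

grade = hb_grade(symmetry_score(0.42, 0.80))  # a moderately asymmetric case
```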

  20. Digital pathology and image analysis for robust high-throughput quantitative assessment of Alzheimer disease neuropathologic changes.

    PubMed

    Neltner, Janna Hackett; Abner, Erin Lynn; Schmitt, Frederick A; Denison, Stephanie Kay; Anderson, Sonya; Patel, Ela; Nelson, Peter T

    2012-12-01

    Quantitative neuropathologic methods provide information that is important for both research and clinical applications. The technologic advancement of digital pathology and image analysis offers new solutions to enable valid quantification of pathologic severity that is reproducible between raters regardless of experience. Using an Aperio ScanScope XT and its accompanying image analysis software, we designed algorithms for quantitation of amyloid and tau pathologies on 65 β-amyloid (6F/3D antibody) and 48 phospho-tau (PHF-1)-immunostained sections of human temporal neocortex. Quantitative digital pathologic data were compared with manual pathology counts. There were excellent correlations between manually counted and digitally analyzed neuropathologic parameters (R² = 0.56-0.72). Data were highly reproducible among 3 participants with varying degrees of expertise in neuropathology (intraclass correlation coefficient values, >0.910). Digital quantification also provided additional parameters, including average plaque area, which shows statistically significant differences when samples are stratified according to apolipoprotein E allele status (average plaque area, 380.9 μm² in apolipoprotein E ε4 carriers vs 274.4 μm² for noncarriers; p < 0.001). Thus, digital pathology offers a rigorous and reproducible method for quantifying Alzheimer disease neuropathologic changes and may provide additional insights into morphologic characteristics that were previously more challenging to assess because of technical limitations.
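
    The manual-versus-digital agreement reported as R² can be reproduced from paired measurements; the counts and burden values below are hypothetical:

```python
def r_squared(x, y):
    """Coefficient of determination from the Pearson correlation, as used
    to compare manual pathology counts with digital image-analysis output."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov * cov / (vx * vy)

# Hypothetical manual counts vs. digital % burden for five sections:
manual = [12, 30, 45, 51, 80]
digital = [0.8, 2.1, 3.0, 3.9, 5.6]
r2 = r_squared(manual, digital)
```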