Science.gov

Sample records for accurate quantitative assessment

  1. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro-computed tomography (μCT) and histomorphometry were combined, integrating the best features of both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered the modalities using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, qualitative and quantitative histomorphometric bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants. PMID:27153828

  2. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  3. Accurate and easy-to-use assessment of contiguous DNA methylation sites based on proportion competitive quantitative-PCR and lateral flow nucleic acid biosensor.

    PubMed

    Xu, Wentao; Cheng, Nan; Huang, Kunlun; Lin, Yuehe; Wang, Chenguang; Xu, Yuancong; Zhu, Longjiao; Du, Dan; Luo, Yunbo

    2016-06-15

    Many types of diagnostic technologies have been reported for DNA methylation, but they require a standard curve for quantification or show only moderate accuracy. Moreover, most technologies have difficulty providing information on the level of methylation at specific contiguous multi-sites, let alone offering easy-to-use detection that eliminates labor-intensive procedures. We have addressed these limitations and report here a cascade strategy that combines proportion competitive quantitative PCR (PCQ-PCR) and a lateral flow nucleic acid biosensor (LFNAB), resulting in accurate and easy-to-use assessment. The P16 gene, a well-studied tumor suppressor gene with specific multi-methylated sites, was used as the model target DNA sequence. First, PCQ-PCR provided amplification products with an accurate proportion of multi-methylated sites following the principle of proportionality, and double-labeled duplex DNA was synthesized. Then, an LFNAB strategy was employed for amplified signal detection via immune affinity recognition, and the exact level of site-specific methylation could be determined from the relative intensity of the test line and internal reference line. This combination yielded recoveries greater than 94%, a highly satisfactory result for DNA methylation assessment. Moreover, the developed cascade is simple, sensitive, and low-cost. Therefore, as a universal platform for the detection of contiguous multi-sites of DNA methylation without external standards or expensive instrumentation, this PCQ-PCR-LFNAB cascade method shows great promise for the point-of-care diagnosis of cancer risk and therapeutics. PMID:26914373

  4. Quantitative Assessment of Protein Structural Models by Comparison of H/D Exchange MS Data with Exchange Behavior Accurately Predicted by DXCOREX

    NASA Astrophysics Data System (ADS)

    Liu, Tong; Pantazatos, Dennis; Li, Sheng; Hamuro, Yoshitomo; Hilser, Vincent J.; Woods, Virgil L.

    2012-01-01

    Peptide amide hydrogen/deuterium exchange mass spectrometry (DXMS) data are often used to qualitatively support models of protein structure. We have developed and validated a method (DXCOREX) by which exchange data can be used to quantitatively assess the accuracy of three-dimensional (3-D) models of protein structure. The method utilizes the COREX algorithm to predict a protein's amide hydrogen exchange rates by reference to a hypothesized structure, and these values are used to generate a virtual data set (deuteron incorporation per peptide) that can be quantitatively compared with the deuteration level of the peptide probes measured by hydrogen exchange experimentation. The accuracy of DXCOREX was established in studies performed with 13 proteins for which both high-resolution structures and experimental data were available. The DXCOREX-calculated and experimental data for each protein were highly correlated. We then employed correlation analysis of DXCOREX-calculated versus DXMS experimental data to assess the accuracy of a recently proposed structural model for the catalytic domain of a Ca2+-independent phospholipase A2. The model's calculated exchange behavior was highly correlated with the experimental exchange results available for the protein, supporting the accuracy of the proposed model. This method of analysis will substantially increase the precision with which experimental hydrogen exchange data can help decipher challenging questions regarding protein structure and dynamics.
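    The correlation analysis described above amounts to comparing two per-peptide deuteration vectors. A minimal sketch in Python (the deuteration values below are hypothetical, not data from the study):

```python
import numpy as np

# Hypothetical per-peptide deuteration levels (deuterons incorporated),
# not data from the study: DXCOREX-predicted vs. DXMS-measured.
predicted = np.array([2.1, 4.8, 1.3, 6.0, 3.5])
measured = np.array([2.4, 4.5, 1.1, 6.3, 3.8])

# Pearson correlation between the virtual and experimental data sets;
# values near 1 support the hypothesized structure.
r = np.corrcoef(predicted, measured)[0, 1]
```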

  5. Groundtruth approach to accurate quantitation of fluorescence microarrays

    SciTech Connect

    Mascio-Kegelmeyer, L; Tomascik-Cheeseman, L; Burnett, M S; van Hummelen, P; Wyrobek, A J

    2000-12-01

    To more accurately measure fluorescent signals from microarrays, we calibrated our acquisition and analysis systems by using ground-truth samples comprised of known quantities of red and green gene-specific DNA probes hybridized to cDNA targets. We imaged the slides with a full-field, white light CCD imager and analyzed them with our custom analysis software. Here we compare, for multiple genes, results obtained with and without preprocessing (alignment, color crosstalk compensation, dark-field subtraction, and integration time). We also evaluate the accuracy of various image processing and analysis techniques (background subtraction, segmentation, quantitation and normalization). This methodology calibrates and validates our system for accurate quantitative measurement of microarrays. Specifically, we show that preprocessing the images produces results significantly closer to the known ground-truth for these samples.
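    Color crosstalk compensation of the kind listed above is, at its core, a small linear inversion. A minimal sketch, with a hypothetical 2x2 crosstalk matrix and intensities (not the paper's calibration values):

```python
import numpy as np

# Hypothetical 2x2 crosstalk matrix: column k gives the fraction of dye k's
# emission that lands in each of the (red, green) acquisition channels.
crosstalk = np.array([[1.00, 0.08],   # red channel:  all of dye 1 + 8% of dye 2
                      [0.05, 1.00]])  # green channel: 5% of dye 1 + all of dye 2

def compensate(raw):
    """Recover true dye intensities from raw channel intensities
    (after dark-field subtraction) by inverting the crosstalk matrix."""
    return np.linalg.solve(crosstalk, raw)

true_signal = np.array([100.0, 40.0])   # hypothetical true (dye 1, dye 2) signal
raw = crosstalk @ true_signal           # what the CCD imager would record
recovered = compensate(raw)
```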

  6. Fast and Accurate Detection of Multiple Quantitative Trait Loci

    PubMed Central

    Nettelblad, Carl; Holmgren, Sverker

    2013-01-01

    We present a new computational scheme that enables efficient and reliable quantitative trait loci (QTL) scans for experimental populations. Using a standard brute-force exhaustive search effectively prohibits accurate QTL scans involving more than two loci from being performed in practice, at least if permutation testing is used to determine significance. More elaborate global optimization approaches, for example DIRECT, have previously been applied to QTL search problems, and dramatic speedups have been reported for high-dimensional scans. However, since a heuristic termination criterion must be used in these types of algorithms, the accuracy of the optimization process cannot be guaranteed. Indeed, earlier results show that a small bias in the significance thresholds is sometimes introduced. Our new optimization scheme, PruneDIRECT, is based on an analysis leading to a computable (Lipschitz) bound on the slope of a transformed objective function. The bound is derived for both infinite- and finite-size populations. Introducing a Lipschitz bound in DIRECT leads to an algorithm related to classical Lipschitz optimization. Regions in the search space can be permanently excluded (pruned) during the optimization process. Heuristic termination criteria can thus be avoided. Hence, PruneDIRECT has a well-defined error bound and can in practice be guaranteed to be equivalent to a corresponding exhaustive search. We present simulation results showing that for simultaneous mapping of three QTL using permutation testing, PruneDIRECT is typically more than 50 times faster than exhaustive search. The speedup is higher for stronger QTL. This could be used to quickly detect strong candidate eQTL networks. PMID:23919387
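    The pruning idea can be illustrated with a one-dimensional toy maximizer: given a Lipschitz constant for the objective, any interval whose best attainable value cannot beat the incumbent is excluded for good, so no heuristic stopping rule is needed. This is only a sketch of the principle, not the PruneDIRECT algorithm itself:

```python
def lipschitz_prune_search(f, lo, hi, lipschitz, tol=1e-3):
    """Branch-and-prune maximization of f on [lo, hi], assuming f is
    Lipschitz with constant `lipschitz` (a stand-in for the transformed
    QTL objective; all names here are illustrative).

    An interval [a, b] with centre value f(c) can be permanently excluded
    once f(c) + lipschitz * (b - a) / 2 cannot exceed the incumbent."""
    best_x, best_val = lo, f(lo)
    intervals = [(lo, hi)]
    while intervals:
        a, b = intervals.pop()
        c = 0.5 * (a + b)
        fc = f(c)
        if fc > best_val:
            best_x, best_val = c, fc
        # Upper bound on f anywhere in [a, b]; prune when it cannot improve
        if fc + lipschitz * (b - a) / 2 <= best_val or (b - a) < tol:
            continue
        intervals += [(a, c), (c, b)]
    return best_x, best_val

# Toy objective with maximum 1.0 at x = 0.3; |f'| <= 2 on [0, 1]
best_x, best_val = lipschitz_prune_search(lambda t: 1.0 - (t - 0.3) ** 2,
                                          0.0, 1.0, 2.0)
```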

  7. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  8. A quantitative phosphorus loss assessment tool for agricultural fields

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Conservation and nutrient management planners need an assessment tool to accurately predict phosphorus (P) loss from agricultural lands. Available tools are either qualitative indices with limited capability to quantify offsite water quality impacts or prohibitively complex quantitative process-bas...

  9. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare and devastating disease associated with high morbidity and mortality. It is characterized by systemic medial calcification of the arteries, yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which include a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and whose color fields are analyzed. In total, 24 colors are printed on the pad. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed regarding geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation yield 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will support a better understanding of this rare but fatal disease.
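    The least-squares affine color transform can be sketched as follows: each of the 24 color fields contributes one equation relating measured RGB to reference RGB, and the affine map is recovered with an ordinary least-squares solve. The matrices below are synthetic stand-ins, not the pad's actual colors:

```python
import numpy as np

# Synthetic stand-ins: reference RGB values of the 24 printed color fields
# and the distorted values a camera might record (a known affine distortion).
rng = np.random.default_rng(0)
reference = rng.uniform(0.0, 255.0, size=(24, 3))
distortion = np.array([[0.90, 0.05, 0.00],
                       [0.00, 1.10, 0.02],
                       [0.03, 0.00, 0.95]])
offset = np.array([4.0, -2.0, 1.5])
measured = reference @ distortion.T + offset   # 'photographed' pad colors

# Least-squares fit of the affine map measured -> reference:
# reference ≈ [measured | 1] @ M, with M a 4x3 coefficient matrix.
X = np.hstack([measured, np.ones((24, 1))])
M, *_ = np.linalg.lstsq(X, reference, rcond=None)

corrected = X @ M                              # calibrated colors
err = np.abs(corrected - reference).max()      # residual of the calibration
```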

  10. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  11. Quantitative assessment of growth plate activity

    SciTech Connect

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

    In the immature skeleton, the physis or growth plate is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone-seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer-acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint, and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique was able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through quantitative analysis of serial studies.

  12. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  13. Designer cantilevers for even more accurate quantitative measurements of biological systems with multifrequency AFM

    NASA Astrophysics Data System (ADS)

    Contera, S.

    2016-04-01

    Multifrequency excitation/monitoring of cantilevers has made it possible both to achieve fast, relatively simple, nanometre-resolution quantitative mapping of the mechanical properties of biological systems in solution using atomic force microscopy (AFM), and to achieve single-molecule-resolution detection with nanomechanical biosensors. A recent paper by Penedo et al [2015 Nanotechnology 26 485706] has made a significant contribution by developing simple methods to improve the signal-to-noise ratio in liquid environments by selectively enhancing cantilever modes, which will lead to even more accurate quantitative measurements.

  14. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  15. Quantitative assessment of scientific quality

    NASA Astrophysics Data System (ADS)

    Heinzl, Harald; Bloching, Philipp

    2012-09-01

    Scientific publications, authors, and journals are commonly evaluated with quantitative bibliometric measures. Frequently used measures will be reviewed and their strengths and weaknesses highlighted. Reflections about the conditions for a new, research-paper-specific measure will be presented.

  16. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system, termed PROBDIST, supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.

  17. Can blind persons accurately assess body size from the voice?

    PubMed

    Pisanski, Katarzyna; Oleszkiewicz, Anna; Sorokowska, Agnieszka

    2016-04-01

    Vocal tract resonances provide reliable information about a speaker's body size that human listeners use for biosocial judgements as well as speech recognition. Although humans can accurately assess men's relative body size from the voice alone, how this ability is acquired remains unknown. In this study, we test the prediction that accurate voice-based size estimation is possible without prior audiovisual experience linking low frequencies to large bodies. Ninety-one healthy congenitally or early blind, late blind and sighted adults (aged 20-65) participated in the study. On the basis of vowel sounds alone, participants assessed the relative body sizes of male pairs of varying heights. Accuracy of voice-based body size assessments significantly exceeded chance and did not differ among participants who were sighted, or congenitally blind or who had lost their sight later in life. Accuracy increased significantly with relative differences in physical height between men, suggesting that both blind and sighted participants used reliable vocal cues to size (i.e. vocal tract resonances). Our findings demonstrate that prior visual experience is not necessary for accurate body size estimation. This capacity, integral to both nonverbal communication and speech perception, may be present at birth or may generalize from broader cross-modal correspondences. PMID:27095264

  18. Quantitative assessment of fluorescent proteins.

    PubMed

    Cranfill, Paula J; Sell, Brittney R; Baird, Michelle A; Allen, John R; Lavagnino, Zeno; de Gruiter, H Martijn; Kremers, Gert-Jan; Davidson, Michael W; Ustione, Alessandro; Piston, David W

    2016-07-01

    The advent of fluorescent proteins (FPs) for genetic labeling of molecules and cells has revolutionized fluorescence microscopy. Genetic manipulations have created a vast array of bright and stable FPs spanning blue to red spectral regions. Common to autofluorescent FPs is their tight β-barrel structure, which provides the rigidity and chemical environment needed for effectual fluorescence. Despite the common structure, each FP has unique properties. Thus, there is no single 'best' FP for every circumstance, and each FP has advantages and disadvantages. To guide decisions about which FP is right for a given application, we have quantitatively characterized the brightness, photostability, pH stability and monomeric properties of more than 40 FPs to enable straightforward and direct comparison between them. We focus on popular and/or top-performing FPs in each spectral region. PMID:27240257

  19. Quantitative assessment of skin aging.

    PubMed

    Lévêque, J L

    2001-11-01

    Noninvasive methods have allowed physicians to give an objective description of aged skin in terms of functional and esthetic properties. The relative influence of environment (mainly sun) on the true aging process can be assessed through the obtained data. It is also possible to measure the efficacy of topical preparations (cosmetics or drugs) designed for treating the various cutaneous aging marks. PMID:11535423

  20. Accurate and molecular-size-tolerant NMR quantitation of diverse components in solution

    PubMed Central

    Okamura, Hideyasu; Nishimura, Hiroshi; Nagata, Takashi; Kigawa, Takanori; Watanabe, Takashi; Katahira, Masato

    2016-01-01

    Determining the amount of each component of interest in a mixture is a fundamental first step in characterizing the nature of the solution and to develop possible means of utilization of its components. Similarly, determining the composition of units in complex polymers, or polymer mixtures, is crucial. Although NMR is recognized as one of the most powerful methods to achieve this and is widely used in many fields, variation in the molecular sizes or the relative mobilities of components skews quantitation due to the size-dependent decay of magnetization. Here, a method to accurately determine the amount of each component by NMR was developed. This method was validated using a solution that contains biomass-related components in which the molecular sizes greatly differ. The method is also tolerant of other factors that skew quantitation such as variation in the one-bond C–H coupling constant. The developed method is the first and only way to reliably overcome the skewed quantitation caused by several different factors to provide basic information on the correct amount of each component in a solution. PMID:26883279

  1. Quantitation and accurate mass analysis of pesticides in vegetables by LC/TOF-MS.

    PubMed

    Ferrer, Imma; Thurman, E Michael; Fernández-Alba, Amadeo R

    2005-05-01

    A quantitative method consisting of solvent extraction followed by liquid chromatography/time-of-flight mass spectrometry (LC/TOF-MS) analysis was developed for the identification and quantitation of three chloronicotinyl pesticides (imidacloprid, acetamiprid, thiacloprid) commonly used on salad vegetables. Accurate mass measurements within 3 ppm error were obtained for all the pesticides studied in various vegetable matrixes (cucumber, tomato, lettuce, pepper), which allowed an unequivocal identification of the target pesticides. Calibration curves covering 2 orders of magnitude were linear over the concentration range studied, thus showing the quantitative ability of TOF-MS as a monitoring tool for pesticides in vegetables. Matrix effects were also evaluated using matrix-matched standards, showing no significant interferences between matrixes and clean extracts. Intraday reproducibility was 2-3% relative standard deviation (RSD) and interday values were 5% RSD. The precision (standard deviation) of the mass measurements was evaluated and was less than 0.23 mDa between days. Detection limits of the chloronicotinyl insecticides in salad vegetables ranged from 0.002 to 0.01 mg/kg. These concentrations are equal to or better than the EU directives for controlled pesticides in vegetables, showing that LC/TOF-MS analysis is a powerful tool for the identification of pesticides in vegetables. Robustness and applicability of the method were validated through the analysis of market vegetable samples. Concentrations found in these samples were in the range of 0.02-0.17 mg/kg of vegetable. PMID:15859598
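    The quoted "within 3 ppm" mass accuracy follows the usual definition of relative mass error. A minimal sketch (the m/z values below are illustrative, not measurements from the study):

```python
def mass_error_ppm(measured_mz, theoretical_mz):
    """Relative mass error in parts per million (ppm)."""
    return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

# Illustrative values for an [M+H]+ ion near m/z 256:
# a 0.5 mDa deviation corresponds to roughly 2 ppm, within a 3 ppm window.
err = mass_error_ppm(256.0601, 256.0596)
```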

  2. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization-sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, etc. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images introduces a noise bias that offsets the result from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we developed a maximum a-posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of true local retardation, which is proportional to birefringence, under a specific set of measurements of the birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and
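    MAP estimation over a pre-computed distribution can be sketched as follows; a toy Gaussian likelihood stands in for the Monte-Carlo-derived PDF of the JM-OCT model, a flat prior is assumed, and all grids and values are illustrative:

```python
import numpy as np

# Hypothetical stand-in for the pre-computed table P(measured | true) on a
# retardation grid, as a Monte-Carlo simulation of the forward model would give.
true_grid = np.linspace(0.0, np.pi, 64)       # candidate true retardations (rad)
measured_grid = np.linspace(0.0, np.pi, 64)   # measured retardation bins (rad)
sigma = 0.2                                   # measurement spread at a fixed SNR
likelihood = np.exp(-(measured_grid[None, :] - true_grid[:, None]) ** 2
                    / (2.0 * sigma ** 2))
likelihood /= likelihood.sum(axis=1, keepdims=True)  # normalize each row

def map_retardation(measured_value):
    """MAP estimate of the true retardation for one measurement,
    assuming a flat prior over the true-retardation grid."""
    j = int(np.abs(measured_grid - measured_value).argmin())  # nearest bin
    return float(true_grid[likelihood[:, j].argmax()])

est = map_retardation(1.0)  # close to 1.0 rad for this well-behaved toy PDF
```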

  3. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually while standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding of soil functioning. VSA is often regarded as subjective, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to misinterpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. 
For the reproducibility study, a group of 9 soil scientists and 7

  4. The accurate assessment of small-angle X-ray scattering data

    SciTech Connect

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; Matsui, Tsutomu; Weiss, Thomas M.; Martel, Anne; Snell, Edward H.

    2015-01-01

    A set of quantitative techniques is suggested for assessing SAXS data quality. These are applied in the form of a script, SAXStats, to a test set of 27 proteins, showing that these techniques are more sensitive than manual assessment of data quality. Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. The studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  5. Integrated regional assessment: qualitative and quantitative issues

    SciTech Connect

    Malone, Elizabeth L.

    2009-11-19

    Qualitative and quantitative issues are particularly significant in integrated regional assessment. This chapter examines the terms “qualitative” and “quantitative” separately and in relation to one another, along with a discussion of the degree of interdependence or overlap between the two. Strategies for integrating the two general approaches often produce uneasy compromises. However, integrated regional assessment provides opportunities for strong collaborations in addressing specific problems in specific places.

  6. Quantitative Assessment of Image Retrieval Effectiveness.

    ERIC Educational Resources Information Center

    Smith, John R.

    2001-01-01

    Examines the problems of developing a framework and testbed for quantitative assessment of image retrieval effectiveness. To better harness the extensive research on content-based retrieval and improve capabilities of image retrieval systems, this article advocates the establishment of common image retrieval testbeds consisting of standardized…

  7. Bright-field quantitative phase microscopy (BFQPM) for accurate phase imaging using conventional microscopy hardware

    NASA Astrophysics Data System (ADS)

    Jenkins, Micah; Gaylord, Thomas K.

    2015-03-01

    Most quantitative phase microscopy methods require the use of custom-built or modified microscopic configurations which are not typically available to most bio/pathologists. There are, however, phase retrieval algorithms which utilize defocused bright-field images as input data and are therefore implementable in existing laboratory environments. Among these, deterministic methods such as those based on inverting the transport-of-intensity equation (TIE) or a phase contrast transfer function (PCTF) are particularly attractive due to their compatibility with Köhler illuminated systems and numerical simplicity. Recently, a new method has been proposed, called multi-filter phase imaging with partially coherent light (MFPI-PC), which alleviates the inherent noise/resolution trade-off in solving the TIE by utilizing a large number of defocused bright-field images spaced equally about the focal plane. Despite greatly improving the state-of-the-art, the method has many shortcomings including the impracticality of high-speed acquisition, inefficient sampling, and attenuated response at high frequencies due to aperture effects. In this report, we present a new method, called bright-field quantitative phase microscopy (BFQPM), which efficiently utilizes a small number of defocused bright-field images and recovers frequencies out to the partially coherent diffraction limit. The method is based on a noise-minimized inversion of a PCTF derived for each finite defocus distance. We present simulation results which indicate nanoscale optical path length sensitivity and improved performance over MFPI-PC. We also provide experimental results imaging live bovine mesenchymal stem cells at sub-second temporal resolution. In all, BFQPM enables fast and accurate phase imaging with unprecedented spatial resolution using widely available bright-field microscopy hardware.

  8. CT-Analyst: fast and accurate CBR emergency assessment

    NASA Astrophysics Data System (ADS)

    Boris, Jay; Fulton, Jack E., Jr.; Obenschain, Keith; Patnaik, Gopal; Young, Theodore, Jr.

    2004-08-01

    An urban-oriented emergency assessment system for airborne Chemical, Biological, and Radiological (CBR) threats, called CT-Analyst and based on new principles, gives greater accuracy and much greater speed than possible with current alternatives. This paper explains how this has been done. The increased accuracy derives from detailed, three-dimensional CFD computations including solar heating, buoyancy, complete building geometry specification, trees, wind fluctuations, and particle and droplet distributions (as appropriate). This paper shows how a finite number of such computations for a given area can be extended to all wind directions and speeds, and all likely sources and source locations, using a new data structure called Dispersion Nomographs. Finally, we demonstrate a portable, entirely graphical software tool called CT-Analyst that embodies this new, high-resolution technology and runs effectively on small personal computers. Real-time users don't have to wait for results because accurate answers are available with near-zero latency (that is, 10-20 scenarios per second). Entire sequences of cases (e.g. a continuously changing source location or wind direction) can be computed and displayed as continuous-action movies. Since the underlying database has been precomputed, the door is wide open for important new real-time, zero-latency functions such as sensor data fusion, backtracking to an unknown source location, and even evacuation route planning. Extensions of the technology to sensor location optimization, buildings, tunnels, and integration with other advanced technologies, e.g. micrometeorology or detailed wind field measurements, are discussed briefly.

  9. Multiobjective optimization in quantitative structure-activity relationships: deriving accurate and interpretable QSARs.

    PubMed

    Nicolotti, Orazio; Gillet, Valerie J; Fleming, Peter J; Green, Darren V S

    2002-11-01

    Deriving quantitative structure-activity relationship (QSAR) models that are accurate, reliable, and easily interpretable is a difficult task. In this study, two new methods have been developed that aim to find useful QSAR models that represent an appropriate balance between model accuracy and complexity. Both methods are based on genetic programming (GP). The first method, referred to as genetic QSAR (or GPQSAR), uses a penalty function to control model complexity. GPQSAR is designed to derive a single linear model that represents an appropriate balance between the variance and the number of descriptors selected for the model. The second method, referred to as multiobjective genetic QSAR (MoQSAR), is based on multiobjective GP and represents a new way of thinking of QSAR. Specifically, QSAR is considered as a multiobjective optimization problem that comprises a number of competitive objectives. Typical objectives include model fitting, the total number of terms, and the occurrence of nonlinear terms. MoQSAR results in a family of equivalent QSAR models where each QSAR represents a different tradeoff in the objectives. A practical consideration often overlooked in QSAR studies is the need for the model to promote an understanding of the biochemical response under investigation. To accomplish this, chemically intuitive descriptors are needed but do not always give rise to statistically robust models. This problem is addressed by the addition of a further objective, called chemical desirability, that aims to reward models that consist of descriptors that are easily interpretable by chemists. GPQSAR and MoQSAR have been tested on various data sets including the Selwood data set and two different solubility data sets. The study demonstrates that the MoQSAR method is able to find models that are at least as good as models derived using standard statistical approaches and also yields models that allow a medicinal chemist to trade statistical robustness for chemical…

  10. Validation of biological markers for quantitative risk assessment.

    PubMed Central

    Schulte, P; Mazzuckelli, L F

    1991-01-01

    The evaluation of biological markers is recognized as necessary to the future of toxicology, epidemiology, and quantitative risk assessment. For biological markers to become widely accepted, their validity must be ascertained. This paper explores the range of considerations that compose the concept of validity as it applies to the evaluation of biological markers. Three broad categories of validity (measurement, internal study, and external) are discussed in the context of evaluating data for use in quantitative risk assessment. Particular attention is given to the importance of measurement validity in the consideration of whether to use biological markers in epidemiologic studies. The concepts developed in this presentation are applied to examples derived from the occupational environment. In the first example, measurement of bromine release as a marker of ethylene dibromide toxicity is shown to be of limited use in constructing an accurate quantitative assessment of the risk of developing cancer as a result of long-term, low-level exposure. This example is compared to data obtained from studies of ethylene oxide, in which hemoglobin alkylation is shown to be a valid marker of both exposure and effect. PMID:2050067

  11. Can clinicians accurately assess esophageal dilation without fluoroscopy?

    PubMed

    Bailey, A D; Goldner, F

    1990-01-01

    This study questioned whether clinicians could determine the success of esophageal dilation accurately without the aid of fluoroscopy. Twenty patients were enrolled with the diagnosis of distal esophageal stenosis, including benign peptic stricture (17), Schatzki's ring (2), and squamous cell carcinoma of the esophagus (1). Dilation attempts using only Maloney dilators were monitored fluoroscopically by the principal investigator, the physician and patient being unaware of the findings. Physicians then predicted whether or not their dilations were successful, and they examined various features to determine their usefulness in predicting successful dilation. They were able to predict successful dilation accurately in 97% of the cases studied; however, their predictions of unsuccessful dilation were correct only 60% of the time. Features helpful in predicting passage included easy passage of the dilator (98%) and the patient feeling the dilator in the stomach (95%). Excessive resistance suggesting unsuccessful passage was an unreliable feature and was often due to the dilator curling in the stomach. When Maloney dilators are used to dilate simple distal strictures, if the physician predicts successful passage, he is reliably accurate without the use of fluoroscopy; however, if unsuccessful passage is suspected, fluoroscopy must be used for confirmation. PMID:2210278

  12. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163
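    A minimal sketch of the binary-imaging idea, counting myelinated fibers as connected components of a thresholded bitplane, is shown below with a toy image and a plain flood fill. This is illustrative only, not the authors' pipeline:

```python
# Toy binary image: 1 = myelinated-fiber pixel after thresholding one bitplane.
# A real pipeline derives this from a stained nerve cross-section; this array
# and the plain flood fill below are purely illustrative.
binary = [
    [0, 1, 1, 0, 0, 0, 0],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 0, 0, 1, 1],
    [0, 0, 1, 0, 0, 0, 0],
    [0, 0, 1, 0, 0, 0, 0],
]

def component_sizes(img):
    """Sizes of 4-connected components of 1-pixels (each component = one fiber)."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not seen[r][c]:
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1  # pixel count stands in for cross-sectional area
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sizes

sizes = component_sizes(binary)
print(len(sizes), sorted(sizes))  # 3 fibers with pixel areas [2, 4, 4]
```

From the per-component sizes one can then derive fiber counts, density, and diameter distributions, as the abstract describes.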

  13. Increasing Accurate Preference Assessment Implementation through Pyramidal Training

    ERIC Educational Resources Information Center

    Pence, Sacha T.; St. Peter, Claire C.; Tetreault, Allison S.

    2012-01-01

    Preference assessments directly evaluate items that may serve as reinforcers, and their implementation is an important skill for individuals who work with children. This study examined the effectiveness of pyramidal training on teachers' implementation of preference assessments. During Experiment 1, 3 special education teachers taught 6 trainees…

  14. The accurate assessment of small-angle X-ray scattering data

    DOE PAGES

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; Matsui, Tsutomu; Weiss, Thomas M.; Martel, Anne; Snell, Edward H.

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  15. The accurate assessment of small-angle X-ray scattering data

    SciTech Connect

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; Matsui, Tsutomu; Weiss, Thomas M.; Martel, Anne; Snell, Edward H.

    2015-01-23

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. Studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  16. The accurate assessment of small-angle X-ray scattering data

    PubMed Central

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; Matsui, Tsutomu; Weiss, Thomas M.; Martel, Anne; Snell, Edward H.

    2015-01-01

    Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. The studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality. PMID:25615859

  17. Internal Medicine Residents Do Not Accurately Assess Their Medical Knowledge

    ERIC Educational Resources Information Center

    Jones, Roger; Panda, Mukta; Desbiens, Norman

    2008-01-01

    Background: Medical knowledge is essential for appropriate patient care; however, the accuracy of internal medicine (IM) residents' assessment of their medical knowledge is unknown. Methods: IM residents predicted their overall percentile performance 1 week (on average) before and after taking the in-training exam (ITE), an objective and well…

  18. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

    The steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect, and exclude regions with many fat droplets by using the feature values of colors, shapes and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.

  19. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG], and Liquefied Natural Gas [LNG]) and set priorities for “Version 1” of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  20. Mass Spectrometry Provides Accurate and Sensitive Quantitation of A2E

    PubMed Central

    Gutierrez, Danielle B.; Blakeley, Lorie; Goletz, Patrice W.; Schey, Kevin L.; Hanneken, Anne; Koutalos, Yiannis; Crouch, Rosalie K.; Ablonczy, Zsolt

    2010-01-01

    Orange autofluorescence from lipofuscin in the lysosomes of the retinal pigment epithelium (RPE) is a hallmark of aging in the eye. One of the major components of lipofuscin is A2E, the levels of which increase with age and in pathologic conditions, such as Stargardt disease or age-related macular degeneration. In vitro studies have suggested that A2E is highly phototoxic and, more specifically, that A2E and its oxidized derivatives contribute to RPE damage and subsequent photoreceptor cell death. To date, absorption spectroscopy has been the primary method to identify and quantitate A2E. Here, a new mass spectrometric method was developed for the specific detection of low levels of A2E and compared to a traditional method of analysis. The new mass spectrometry method allows the detection and quantitation of approximately 10,000-fold less A2E than absorption spectroscopy and the detection and quantitation of low levels of oxidized A2E, with localization of the oxidation sites. This study suggests that identification and quantitation of A2E from tissue extracts by chromatographic absorption spectroscopy overestimates the amount of A2E. This mass spectrometry approach makes it possible to detect low levels of A2E and its oxidized metabolites with greater accuracy than traditional methods, thereby facilitating a more exact analysis of bis-retinoids in animal models of inherited retinal degeneration as well as in normal and diseased human eyes. PMID:20931136

  1. Quantitative nonlinear optical assessment of atherosclerosis progression in rabbits.

    PubMed

    Mostaço-Guidolin, Leila B; Kohlenberg, Elicia K; Smith, Michael; Hewko, Mark; Major, Arkady; Sowa, Michael G; Ko, Alex C-T

    2014-07-01

    Quantification of atherosclerosis has been a challenging task owing to its complex pathology. In this study, we validated a quantitative approach for assessing atherosclerosis progression in a rabbit model using a numerical matrix, optical index for plaque burden, derived directly from the nonlinear optical microscopic images captured on the atherosclerosis-affected blood vessel. A positive correlation between this optical index and the severity of atherosclerotic lesions, represented by the age of the rabbits, was established based on data collected from 21 myocardial infarction-prone Watanabe heritable hyperlipidemic rabbits with age ranging between new-born and 27 months old. The same optical index also accurately identified high-risk locations for atherosclerotic plaque formation along the entire aorta, which was validated by immunohistochemical fluorescence imaging. PMID:24892226

  2. More accurate assessment of stenotic lesions in percutaneous transluminal angioplasty.

    PubMed

    Janevski, B K; Breslau, P J; Jorning, P J

    1986-01-01

    Eighty patients underwent percutaneous transluminal dilatation and recanalisation of atheromatous lesions of the arteries of the lower extremities in the University Hospital of Maastricht between 1980 and 1984. Of the 80 attempted procedures on the iliac and femoro-popliteal tracts, 71 (89%) were technically possible and were considered initially successful. In all cases of iliac artery lesions a retrograde arteriogram was performed prior to PTA. Intra-arterial pressure measurements at rest and after hyperemia were used for exact assessment of the hemodynamic significance of the stenosis before and after PTA. All patients successfully treated by angioplasty were followed up. The early hemodynamic success rate of PTA was 90 per cent for iliac lesions and 83 per cent for the femoro-popliteal segment. There was no morbidity or mortality. The cumulative 3-year patency rate for both segments was 74 per cent. PMID:2943842

  3. Quantitative Methods for Assessing Drug Synergism

    PubMed Central

    2011-01-01

    Two or more drugs that individually produce overtly similar effects will sometimes display greatly enhanced effects when given in combination. When the combined effect is greater than that predicted by their individual potencies, the combination is said to be synergistic. A synergistic interaction allows the use of lower doses of the combination constituents, a situation that may reduce adverse reactions. Drug combinations are quite common in the treatment of cancers, infections, pain, and many other diseases and situations. The determination of synergism is a quantitative pursuit that involves a rigorous demonstration that the combination effect is greater than that which is expected from the individual drug’s potencies. The basis of that demonstration is the concept of dose equivalence, which is discussed here and applied to an experimental design and data analysis known as isobolographic analysis. That method, and a related method of analysis that also uses dose equivalence, are presented in this brief review, which provides the mathematical basis for assessing synergy and an optimization strategy for determining the dose combination. PMID:22737266
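    The dose-equivalence reasoning summarized above underlies the standard combination (interaction) index under Loewe additivity, a quantity closely related to isobolographic analysis, though not necessarily the exact analysis of this review. A minimal sketch with hypothetical doses:

```python
def combination_index(d1, d2, D1, D2):
    """Loewe-additivity combination index at a fixed effect level.

    D1, D2: doses of each drug alone producing the chosen effect.
    d1, d2: doses of the two drugs in a combination producing the same effect.
    CI < 1 indicates synergy, CI = 1 additivity, CI > 1 antagonism.
    """
    return d1 / D1 + d2 / D2

# Hypothetical: each drug alone needs 10 mg/kg for a 50% effect, but
# 3 mg/kg of each suffices in combination.
ci = combination_index(3.0, 3.0, 10.0, 10.0)
print(ci)  # 0.6 -> synergistic
```

A CI below 1 corresponds to a combination point lying under the line of additivity on an isobologram.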

  4. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10 km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best-estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs is also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the…
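    The risk measure described in the abstract, consequence times probability range summed over exceedance-probability estimates and time, can be sketched as follows. The variable names and numbers are illustrative, not GT-Mod inputs or outputs:

```python
# Sketch of the risk measure described in the abstract: consequence C (the
# deviation of LCOE from the best estimate) times the probability range dP,
# summed over all exceedance-probability estimates n and time steps t.
# Variable names and the numbers below are illustrative, not GT-Mod output.
def total_risk(C, dP):
    return sum(
        C[n][t] * dP[n][t]
        for n in range(len(C))
        for t in range(len(C[n]))
    )

# Two exceedance-probability estimates over three time steps (made-up values).
C = [[0.5, 0.4, 0.3],
     [1.2, 1.0, 0.8]]
dP = [[0.10, 0.10, 0.10],
      [0.05, 0.05, 0.05]]

print(round(total_risk(C, dP), 4))  # 0.27
```
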

  5. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

    This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm3) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm3. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm3. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm3 will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0…

  6. Quantitative Assessment of Autistic Symptomatology in Preschoolers

    ERIC Educational Resources Information Center

    Pine, Elyse; Luby, Joan; Abbacchi, Anna; Constantino, John N.

    2006-01-01

    Given a growing emphasis on early intervention for children with autism, valid quantitative tools for measuring treatment response are needed. The Social Responsiveness Scale (SRS) is a brief (15-20 minute) quantitative measure of autistic traits in 4-to 18-year-olds, for which a version for 3-year-olds was recently developed. We obtained serial…

  7. Sensitive Quantitative Assessment of Balance Disorders

    NASA Technical Reports Server (NTRS)

    Paloski, Willilam H.

    2007-01-01

    Computerized dynamic posturography (CDP) has become a standard technique for objectively quantifying balance control performance, diagnosing the nature of functional impairments underlying balance disorders, and monitoring clinical treatment outcomes. We have long used CDP protocols to assess recovery of sensory-motor function in astronauts following space flight. The most reliable indicators of post-flight crew performance are the sensory organization tests (SOTs), particularly SOTs 5 and 6, which are sensitive to changes in availability and/or utilization of vestibular cues. We have noted, however, that some astronauts exhibiting obvious signs of balance impairment after flight are able to score within clinical norms on these tests, perhaps as a result of adopting competitive strategies or by their natural skills at substituting alternate sensory information sources. This insensitivity of the CDP protocol could underestimate the degree of impairment and, perhaps, lead to premature release of those crewmembers to normal duties. To improve the sensitivity of the CDP protocol we have introduced static and dynamic head tilt SOT trials into our protocol. The pattern of postflight recovery quantified by the enhanced CDP protocol appears to more aptly track the re-integration of sensory-motor function, with recovery time increasing as the complexity of the sensory-motor/biomechanical task increases. The new CDP protocol therefore seems more suitable for monitoring post-flight sensory-motor recovery and for indicating to crewmembers and flight surgeons fitness for return to duty and/or activities of daily living. There may be classes of patients (e.g., athletes, pilots) having motivation and/or performance characteristics similar to astronauts whose sensory-motor treatment outcomes would also be more accurately monitored using the enhanced CDP protocol. Furthermore, the enhanced protocol may be useful in early detection of age-related balance disorders.

  8. Quantitative ultrasound assessment of cervical microstructure.

    PubMed

    Feltovich, Helen; Nam, Kibo; Hall, Timothy J

    2010-07-01

    The objective of this preliminary study was to determine whether quantitative ultrasound (QUS) can provide insight into, and characterization of, uterine cervical microstructure. Throughout pregnancy, cervical collagen reorganizes (from aligned and anisotropic to disorganized and isotropic) as the cervix changes in preparation for delivery. Premature changes in collagen are associated with premature birth in mammals. Because QUS is able to detect structural anisotropy/isotropy, we hypothesized that it may provide a means of noninvasively assessing cervical microstructure. Thorough study of cervical microstructure has been limited by lack of technology to detect small changes in collagen organization, which has in turn limited our ability to detect abnormal and/or premature changes in collagen that may lead to preterm birth. To determine whether QUS may be useful for detection of cervical microstructure, radiofrequency (rf) echo data were acquired from the cervices of human hysterectomy specimens (n = 10). Anisotropic acoustic propagation was assessed by controlling the transmit/receive angle between the acoustic beam and the tissue from -20 to +20 degrees. The power spectrum of the echo signals from within a region of interest was computed in order to investigate the microstructure of the tissue. An identical analysis was performed on a homogeneous phantom with spherical scatterers for system calibration. Power spectra of backscattered rf from the cervix were 6 dB higher for normal (0 degree) than steered (+/- 20 degree) beams. The spectral power for steered beams decreased monotonically (0.4 dB at +5 degrees to 3.6 dB at +20 degrees). The excess difference (compared to similar analysis for the phantom) in normally-incident (0 degree) versus steered beams is consistent with scattering from an aligned component of the cervical microstructure. Therefore, QUS appears to reliably identify an aligned component of cervical microstructure.
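
    The angle-dependent spectral comparison described above reduces to computing the average power spectrum of the rf echo lines at each steering angle and differencing them in dB. The sketch below illustrates that arithmetic on synthetic echoes (the sampling rate, tone frequency, and 2x amplitude ratio are invented for illustration, not taken from the study):

```python
import numpy as np

def power_spectrum_db(rf, fs):
    """Average power spectrum across rf echo lines (rows), in dB."""
    spec = np.abs(np.fft.rfft(rf, axis=1)) ** 2
    freqs = np.fft.rfftfreq(rf.shape[1], d=1.0 / fs)
    return freqs, 10.0 * np.log10(spec.mean(axis=0) + 1e-30)

# Synthetic echoes: the "steered" beam returns half the backscattered
# amplitude of the normal-incidence beam, i.e. a ~6 dB spectral difference.
rng = np.random.default_rng(0)
fs, n = 250e6, 2048
t = np.arange(n) / fs
f0 = 160 * fs / n                      # tone placed exactly on an FFT bin
normal = np.sin(2 * np.pi * f0 * t) + 0.05 * rng.standard_normal((8, n))
steered = 0.5 * np.sin(2 * np.pi * f0 * t) + 0.05 * rng.standard_normal((8, n))
f, p_normal = power_spectrum_db(normal, fs)
_, p_steered = power_spectrum_db(steered, fs)
print(round(p_normal[160] - p_steered[160], 1))  # ~6 dB
```

    In the study, the same difference is additionally referenced against a phantom with spherical (isotropic) scatterers, so that only the excess angle dependence is attributed to aligned tissue structure.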

  9. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  10. Quantitative spectroscopy of hot stars: accurate atomic data applied on a large scale as driver of recent breakthroughs

    NASA Astrophysics Data System (ADS)

    Przybilla, Norbert; Schaffenroth, Veronika; Nieva, Maria-Fernanda

    2015-08-01

    OB-type stars are hotbeds of non-LTE physics because their strong radiation fields drive the atmospheric plasma out of local thermodynamic equilibrium. We report on recent breakthroughs in the quantitative analysis of the optical and UV spectra of OB-type stars that were facilitated by application of accurate and precise atomic data on a large scale. An astrophysicist's dream has come true, by bringing observed and model spectra into close match over wide parts of the observed wavelength ranges. This allows tight observational constraints to be derived from OB-type stars for wide applications in astrophysics. However, despite the progress made, many details of the modelling may be improved further. We discuss atomic data needs in terms of laboratory measurements and also ab-initio calculations. Particular emphasis is given to quantitative spectroscopy in the near-IR, which will be in focus in the era of the upcoming extremely large telescopes.

  11. There's plenty of gloom at the bottom: the many challenges of accurate quantitation in size-based oligomeric separations.

    PubMed

    Striegel, André M

    2013-11-01

    There is a variety of small-molecule species (e.g., tackifiers, plasticizers, oligosaccharides) the size-based characterization of which is of considerable scientific and industrial importance. Likewise, quantitation of the amount of oligomers in a polymer sample is crucial for the import and export of substances into the USA and European Union (EU). While the characterization of ultra-high molar mass macromolecules by size-based separation techniques is generally considered a challenge, it is this author's contention that a greater challenge is encountered when trying to perform, for quantitation purposes, separations in and of the oligomeric region. The latter thesis is expounded herein, by detailing the various obstacles encountered en route to accurate, quantitative oligomeric separations by entropically dominated techniques such as size-exclusion chromatography, hydrodynamic chromatography, and asymmetric flow field-flow fractionation, as well as by methods which are, principally, enthalpically driven such as liquid adsorption and temperature gradient interaction chromatography. These obstacles include, among others, the diminished sensitivity of static light scattering (SLS) detection at low molar masses, the non-constancy of the response of SLS and of commonly employed concentration-sensitive detectors across the oligomeric region, and the loss of oligomers through the accumulation wall membrane in asymmetric flow field-flow fractionation. The battle is not lost, however, because, with some care and given a sufficient supply of sample, the quantitation of both individual oligomeric species and of the total oligomeric region is often possible. PMID:23887277

  12. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  13. Highly accurate thermal flow microsensor for continuous and quantitative measurement of cerebral blood flow.

    PubMed

    Li, Chunyan; Wu, Pei-ming; Wu, Zhizhen; Limnuson, Kanokwan; Mehan, Neal; Mozayan, Cameron; Golanov, Eugene V; Ahn, Chong H; Hartings, Jed A; Narayan, Raj K

    2015-10-01

    Cerebral blood flow (CBF) plays a critical role in the exchange of nutrients and metabolites at the capillary level and is tightly regulated to meet the metabolic demands of the brain. After major brain injury, CBF typically decreases, and supporting the injured brain with adequate CBF is a mainstay of therapy after traumatic brain injury. Quantitative and localized measurement of CBF is therefore critically important for evaluation of treatment efficacy and also for understanding of cerebral pathophysiology. Here we present an improved thermal flow microsensor and its operation, which provides higher accuracy compared to existing devices. The flow microsensor consists of three components: two stacked-up thin-film resistive elements serving as a composite heater/temperature sensor, and one remote resistive element for environmental temperature compensation. It operates in constant-temperature mode (~2 °C above the medium temperature), providing 20 ms temporal resolution. Compared to a previous thermal flow microsensor based on a self-heating and self-sensing design, the sensor presented provides at least a two-fold improvement in accuracy in the range from 0 to 200 ml/100 g/min. This is mainly achieved by using the stacked-up structure, where heating and sensing are separated to improve the temperature measurement accuracy by minimizing errors introduced by self-heating. PMID:26256480
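
    In constant-temperature operation, the electrical power needed to hold the heater a fixed overheat above the medium is a monotonic function of flow, so flow is recovered by inverting a calibration curve. A minimal King's-law-style sketch of that inversion (all constants invented for illustration, not the published device's calibration):

```python
# King's-law-style calibration for a constant-temperature thermal flow sensor.
A, B, N = 2.0, 0.8, 0.5   # illustrative coefficients; flow in ml/100 g/min
DELTA_T = 2.0             # overheat above medium temperature, deg C

def heater_power(flow):
    """Power (mW) needed to hold the heater DELTA_T above the medium."""
    return (A + B * flow ** N) * DELTA_T

def flow_from_power(power):
    """Invert the calibration to recover flow from measured power."""
    return ((power / DELTA_T - A) / B) ** (1.0 / N)

for true_flow in (0.0, 50.0, 200.0):
    print(true_flow, round(flow_from_power(heater_power(true_flow)), 6))
```

    Separating the heating and sensing films, as the abstract describes, improves the temperature term in this relation; the flow-recovery arithmetic itself is unchanged.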

  14. Quantitative calcium resistivity based method for accurate and scalable water vapor transmission rate measurement.

    PubMed

    Reese, Matthew O; Dameron, Arrelaine A; Kempe, Michael D

    2011-08-01

    The development of flexible organic light emitting diode displays and flexible thin film photovoltaic devices is dependent on the use of flexible, low-cost, optically transparent and durable barriers to moisture and/or oxygen. It is estimated that this will require high moisture barriers with water vapor transmission rates (WVTR) between 10^-4 and 10^-6 g/m^2/day. Thus there is a need to develop a relatively fast, low-cost, and quantitative method to evaluate such low permeation rates. Here, we demonstrate a method where the resistance changes of patterned Ca films, upon reaction with moisture, enable one to calculate a WVTR between 10 and 10^-6 g/m^2/day or better. Samples are configured with variable aperture size such that the sensitivity and/or measurement time of the experiment can be controlled. The samples are connected to a data acquisition system by means of individual signal cables permitting samples to be tested under a variety of conditions in multiple environmental chambers. An edge card connector is used to connect samples to the measurement wires enabling easy switching of samples in and out of test. This measurement method can be conducted with as little as 1 h of labor time per sample. Furthermore, multiple samples can be measured in parallel, making this an inexpensive and high volume method for measuring high moisture barriers. PMID:21895269
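
    The Ca-test arithmetic behind this method: water permeating the barrier reacts with the calcium trace (Ca + 2 H2O -> Ca(OH)2 + H2), thinning the conductive film, so the slope of conductance versus time is proportional to the water ingress through the aperture. A hedged sketch of that conversion (the trace geometry and measured slope below are invented example inputs; material constants are standard values):

```python
# Convert the measured slope of Ca-trace conductance vs time into a WVTR.
M_H2O = 18.015e-3        # kg/mol
M_CA = 40.078e-3         # kg/mol
RHO_CA = 1550.0          # kg/m^3
RESISTIVITY_CA = 3.4e-8  # ohm*m
N = 2                    # mol H2O consumed per mol Ca (Ca + 2 H2O -> Ca(OH)2 + H2)

def wvtr_from_conductance_slope(dG_dt, trace_length, trace_width, aperture_area):
    """WVTR (g/m^2/day) from the slope of conductance vs time (S/s)."""
    # G = w * h / (rho_el * L)  =>  dG/dt = (w / (rho_el * L)) * dh/dt
    d_thickness_dt = dG_dt * RESISTIVITY_CA * trace_length / trace_width
    ca_mass_rate = d_thickness_dt * trace_length * trace_width * RHO_CA  # kg/s
    h2o_mass_rate = ca_mass_rate * N * M_H2O / M_CA                      # kg/s
    return h2o_mass_rate / aperture_area * 1e3 * 86400                   # g/m^2/day

# Hypothetical example: 1 nS/s loss on a 10 mm x 1 mm trace, 1 cm^2 aperture.
print(wvtr_from_conductance_slope(1e-9, 0.01, 0.001, 1e-4))
```

    Varying the aperture area, as the abstract notes, trades sensitivity against measurement time without changing this conversion.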

  15. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping.

    PubMed

    Lee, Han B; Schwab, Tanya L; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L; Cervera, Roberto Lopez; McNulty, Melissa S; Bostwick, Hannah S; Clark, Karl J

    2016-06-01

    Customizable endonucleases such as transcription activator-like effector nucleases (TALENs) and clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9 (CRISPR/Cas9) enable rapid generation of mutant strains at genomic loci of interest in animal models and cell lines. With the accelerated pace of generating mutant alleles, genotyping has become a rate-limiting step to understanding the effects of genetic perturbation. Unless mutated alleles result in distinct morphological phenotypes, mutant strains need to be genotyped using standard methods in molecular biology. Classic restriction fragment length polymorphism (RFLP) or sequencing is labor-intensive and expensive. Although simpler than RFLP, current versions of allele-specific PCR may still require post-polymerase chain reaction (PCR) handling such as sequencing, or they are more expensive if allele-specific fluorescent probes are used. Commercial genotyping solutions can take weeks from assay design to result, and are often more expensive than assembling reactions in-house. Key components of commercial assay systems are often proprietary, which limits further customization. Therefore, we developed a one-step open-source genotyping method based on quantitative PCR. The allele-specific qPCR (ASQ) does not require post-PCR processing and can genotype germline mutants through either threshold cycle (Ct) or end-point fluorescence reading. ASQ utilizes allele-specific primers, a locus-specific reverse primer, universal fluorescent probes and quenchers, and hot start DNA polymerase. Individual laboratories can further optimize this open-source system as we completely disclose the sequences, reagents, and thermal cycling protocol. We have tested the ASQ protocol to genotype alleles in five different genes. ASQ showed a 98-100% concordance in genotype scoring with RFLP or Sanger sequencing outcomes. ASQ is time-saving because a single qPCR without post-PCR handling suffices to score genotypes.
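
    Scoring a genotype from Ct values amounts to comparing the threshold cycles of the two allele-specific reactions: the allele that amplifies earlier (lower Ct) is present, and comparable Ct values for both indicate a heterozygote. A minimal sketch of that decision logic (the delta and max-Ct cutoffs are illustrative assumptions, not the published assay's thresholds):

```python
def call_genotype(ct_wt, ct_mut, delta=3.0, max_ct=35.0):
    """Score a genotype from threshold cycles of two allele-specific qPCRs.

    Lower Ct means earlier, stronger amplification of that allele. The
    delta and max_ct cutoffs here are invented for illustration.
    """
    wt = ct_wt < max_ct
    mut = ct_mut < max_ct
    if wt and mut and abs(ct_wt - ct_mut) < delta:
        return "heterozygous"
    if wt and (not mut or ct_mut - ct_wt >= delta):
        return "wild-type"
    if mut and (not wt or ct_wt - ct_mut >= delta):
        return "mutant"
    return "no call"

print(call_genotype(22.1, 22.8))  # heterozygous
print(call_genotype(21.5, 36.0))  # wild-type
```

    End-point fluorescence scoring follows the same pattern with intensity thresholds in place of Ct cutoffs.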

  16. Allele-Specific Quantitative PCR for Accurate, Rapid, and Cost-Effective Genotyping

    PubMed Central

    Lee, Han B.; Schwab, Tanya L.; Koleilat, Alaa; Ata, Hirotaka; Daby, Camden L.; Cervera, Roberto Lopez; McNulty, Melissa S.; Bostwick, Hannah S.; Clark, Karl J.

    2016-01-01

    Customizable endonucleases such as transcription activator-like effector nucleases (TALENs) and clustered regularly interspaced short palindromic repeats/CRISPR-associated protein 9 (CRISPR/Cas9) enable rapid generation of mutant strains at genomic loci of interest in animal models and cell lines. With the accelerated pace of generating mutant alleles, genotyping has become a rate-limiting step to understanding the effects of genetic perturbation. Unless mutated alleles result in distinct morphological phenotypes, mutant strains need to be genotyped using standard methods in molecular biology. Classic restriction fragment length polymorphism (RFLP) or sequencing is labor-intensive and expensive. Although simpler than RFLP, current versions of allele-specific PCR may still require post-polymerase chain reaction (PCR) handling such as sequencing, or they are more expensive if allele-specific fluorescent probes are used. Commercial genotyping solutions can take weeks from assay design to result, and are often more expensive than assembling reactions in-house. Key components of commercial assay systems are often proprietary, which limits further customization. Therefore, we developed a one-step open-source genotyping method based on quantitative PCR. The allele-specific qPCR (ASQ) does not require post-PCR processing and can genotype germline mutants through either threshold cycle (Ct) or end-point fluorescence reading. ASQ utilizes allele-specific primers, a locus-specific reverse primer, universal fluorescent probes and quenchers, and hot start DNA polymerase. Individual laboratories can further optimize this open-source system as we completely disclose the sequences, reagents, and thermal cycling protocol. We have tested the ASQ protocol to genotype alleles in five different genes. ASQ showed a 98–100% concordance in genotype scoring with RFLP or Sanger sequencing outcomes. ASQ is time-saving because a single qPCR without post-PCR handling suffices to score genotypes.

  17. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
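
    The Bayesian update the paper advocates is mechanically simple: a posterior over hypotheses is the prior times the likelihood of the evidence, renormalized. A minimal sketch for a failure-frequency QRA (the hypotheses, prior, and observed data below are invented for illustration):

```python
import math

# Candidate failure frequencies (per year) and a prior over them.
freqs = [1e-4, 1e-3, 1e-2]
prior = [0.5, 0.3, 0.2]

# Evidence: 1 failure observed in 100 years; Poisson likelihood.
years, failures = 100, 1
like = [math.exp(-f * years) * (f * years) ** failures / math.factorial(failures)
        for f in freqs]

# Bayes' theorem: posterior proportional to prior * likelihood.
post_unnorm = [p * l for p, l in zip(prior, like)]
z = sum(post_unnorm)
posterior = [x / z for x in post_unnorm]
print([round(p, 3) for p in posterior])
```

    The evidence shifts weight toward the hypothesis most consistent with the observed failure count; any analyst starting from the same prior and evidence reproduces the same posterior, which is the "evidence dependent" property the paper emphasizes.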

  18. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis.

    PubMed

    Hilbert, David W; Smith, William L; Chadwick, Sean G; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E; Aguin, Tina J; Sobel, Jack D; Gygax, Scott E

    2016-04-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes of a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. Therefore, a highly accurate molecular assay for the diagnosis of BV would be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We analyzed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms for which quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV. PMID:26818677
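
    A logistic regression model of this kind maps the quantified organism loads (typically on a log10 scale) through a weighted sum and a sigmoid to a BV probability. The sketch below shows that structure only; the coefficients, intercept, and decision threshold are invented, not the published model's fitted values:

```python
import math

# Hypothetical coefficients for log10 loads of the informative organisms.
COEF = {"gvag": 0.9, "avag": 1.1, "mega": 1.3}
INTERCEPT = -8.0

def bv_probability(log10_counts):
    """Logistic model: sigmoid of a weighted sum of log10 organism loads."""
    z = INTERCEPT + sum(COEF[k] * log10_counts[k] for k in COEF)
    return 1.0 / (1.0 + math.exp(-z))

print(round(bv_probability({"gvag": 7, "avag": 6, "mega": 5}), 3))  # high load
print(round(bv_probability({"gvag": 2, "avag": 1, "mega": 0}), 3))  # low load
```

    In practice the coefficients are fit against Amsel/Nugent reference diagnoses and the probability cutoff is chosen to balance sensitivity and specificity.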

  19. Assessing Quantitative Reasoning in Young Children

    ERIC Educational Resources Information Center

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Barros, Rossana

    2015-01-01

    Before starting school, many children reason logically about concepts that are basic to their later mathematical learning. We describe a measure of quantitative reasoning that was administered to children at school entry (mean age 5.8 years) and accounted for more variance in a mathematical attainment test than general cognitive ability 16 months…

  20. Quantitative Ultrasound Assessment of the Rat Cervix

    PubMed Central

    McFarlin, Barbara L.; O’Brien, William D.; Oelze, Michael L.; Zachary, James F.; White-Traut, Rosemary C.

    2009-01-01

    Objective The purpose of this research was to detect cervical ripening with a new quantitative ultrasound technique. Methods Cervices of 13 nonpregnant and 65 timed pregnant (days 15, 17, 19, 20, and 21 of pregnancy) Sprague Dawley rats were scanned ex vivo with a 70-MHz ultrasound transducer. Ultrasound scatterer property estimates (scatterer diameter [SD], acoustic concentration [AC], and scatterer strength factor [SSF]) from the cervices were quantified and then compared to hydroxyproline and water content. Insertion loss (attenuation) was measured in 3 rats in each of the 6 groups. Discriminant analysis was used to predict gestational age group (cervical ripening) from the ultrasound variables SD, SSF, and AC. Results Differences were observed between the groups (SD, AC, and SSF; P < .0001). Quantitative ultrasound measures changed as the cervix ripened: (1) SD increased from days 15 to 21; (2) AC decreased from days 15 to 21; and (3) SSF was the greatest in the nonpregnant group and the least in the day 21 group. Cervix hydroxyproline content increased as pregnancy progressed (P < .003) and correlated with group, SD, AC, and SSF (P < .001). Discriminant analysis of ultrasound variables predicted 56.4% of gestational group assignment (P < .001) and increased to 77% within 2 days of the predicted analysis. Cervix insertion loss was greatest for the nonpregnant group and least for the day 21 group. Conclusions Quantitative ultrasound predicted cervical ripening in the rat cervix, but before use in humans, quantitative ultrasound will need to predict gestational age in the later days of gestation with more precision. PMID:16870896
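
    The group-assignment step reported above can be illustrated with a simplified discriminant rule: a new cervix's (SD, AC, SSF) vector is assigned to the gestational group whose mean it lies closest to. Real discriminant analysis pools the within-group covariance; this nearest-centroid sketch with invented group means only shows the assignment mechanics:

```python
import numpy as np

# Hypothetical group means for (SD um, AC dB, SSF); not the study's values.
means = {"day15": np.array([30.0, 60.0, 5.0]),
         "day21": np.array([45.0, 40.0, 1.0])}

def classify(x):
    """Assign a QUS feature vector to the nearest group centroid."""
    return min(means, key=lambda g: np.linalg.norm(x - means[g]))

rng = np.random.default_rng(7)
sample = means["day21"] + rng.normal(0, 2.0, 3)   # a noisy "day 21" cervix
print(classify(sample))
```

    The study's 56.4% exact-group accuracy (77% within two days) reflects how much the QUS features of adjacent gestational days overlap, which this rule makes explicit.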

  1. Simple, fast, and accurate methodology for quantitative analysis using Fourier transform infrared spectroscopy, with bio-hybrid fuel cell examples

    PubMed Central

    Mackie, David M.; Jahnke, Justin P.; Benyamin, Marcus S.; Sumner, James J.

    2016-01-01

    The standard methodologies for quantitative analysis (QA) of mixtures using Fourier transform infrared (FTIR) instruments have evolved until they are now more complicated than necessary for many users' purposes. We present a simpler methodology, suitable for widespread adoption of FTIR QA as a standard laboratory technique across disciplines by occasional users.
    • Algorithm is straightforward and intuitive, yet it is also fast, accurate, and robust.
    • Relies on component spectra, minimization of errors, and local adaptive mesh refinement.
    • Tested successfully on real mixtures of up to nine components.
    We show that our methodology is robust to challenging experimental conditions such as similar substances, component percentages differing by three orders of magnitude, and imperfect (noisy) spectra. As examples, we analyze biological, chemical, and physical aspects of bio-hybrid fuel cells. PMID:26977411
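
    The core arithmetic of component-spectrum-based FTIR QA is fitting a measured spectrum as a weighted sum of known component spectra by minimizing the residual error. The sketch below does this with ordinary least squares on synthetic Gaussian absorption bands (the band positions and fractions are invented; the paper's method adds refinements such as local adaptive mesh refinement):

```python
import numpy as np

# Synthetic component spectra: three Gaussian bands over a wavenumber axis.
wn = np.linspace(900, 1800, 500)                    # wavenumbers, cm^-1
band = lambda c, w: np.exp(-((wn - c) / w) ** 2)
components = np.column_stack([band(1050, 40), band(1400, 60), band(1650, 30)])

# A noisy "measured" mixture spectrum with known composition.
true_fracs = np.array([0.6, 0.3, 0.1])
rng = np.random.default_rng(1)
measured = components @ true_fracs + 0.01 * rng.standard_normal(wn.size)

# Least-squares unmixing, clipped to non-negative and reported as fractions.
fracs, *_ = np.linalg.lstsq(components, measured, rcond=None)
fracs = np.clip(fracs, 0, None)
fracs /= fracs.sum()
print(np.round(fracs, 2))
```

    With well-separated bands and modest noise, the recovered fractions match the true composition closely; heavily overlapping spectra and orders-of-magnitude concentration differences are what make the robustness claims above non-trivial.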

  2. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  3. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, which are mutually dependent across successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety. PMID:20055976
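
    The key point, that storage times should emerge from the queueing mechanics rather than be sampled as an independent input distribution, can be shown with a toy discrete-event model of a single storage step (all rates and the FIFO ordering policy below are illustrative assumptions, not the article's lettuce-chain model):

```python
import heapq
import random

# Toy discrete-event model of one cold-chain storage step: products arrive,
# wait on a FIFO shelf, and leave when ordered. Storage time (and hence
# microbial growth time) emerges from the queue dynamics.
random.seed(4)
events = []
t = 0.0
for _ in range(500):                       # deliveries: Poisson arrivals
    t += random.expovariate(1 / 2.0)       # mean 2.0 h between deliveries
    heapq.heappush(events, (t, "arrive"))
t = 0.0
for _ in range(500):                       # orders: Poisson demand
    t += random.expovariate(1 / 2.2)       # mean 2.2 h between orders
    heapq.heappush(events, (t, "order"))

shelf, storage_times = [], []
while events:
    now, kind = heapq.heappop(events)
    if kind == "arrive":
        shelf.append(now)
    elif shelf:                            # FIFO: oldest product ships first
        storage_times.append(now - shelf.pop(0))

print(round(sum(storage_times) / len(storage_times), 1), "h mean storage")
```

    Because deliveries here slightly outpace demand, the shelf backlog and thus the upper tail of storage times grows over the run, exactly the tail behavior that independent storage-time distributions miss.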

  4. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  5. Quantitative statistical methods for image quality assessment.

    PubMed

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  6. Quantitative Statistical Methods for Image Quality Assessment

    PubMed Central

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  7. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  8. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  9. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past relatively high asbestos concentration levels down to usually much lower concentration levels of interest today, in some cases orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
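
    The extrapolation the abstract describes is, at its simplest, a linear no-threshold model: lifetime excess risk proportional to cumulative exposure (concentration times duration). The sketch below uses an invented potency slope to show the arithmetic; the cited models additionally weight age at exposure and fiber type, so this does not reproduce their published figures:

```python
# Linear no-threshold sketch of cumulative-exposure risk extrapolation.
POTENCY = 4e-4   # lifetime excess risk per (fiber/ml x year); assumed value

def lifetime_excess_risk(concentration_f_ml, years):
    """Risk = potency slope x cumulative exposure (f/ml x years)."""
    return POTENCY * concentration_f_ml * years

worker = lifetime_excess_risk(0.5, 40)     # occupational scenario
student = lifetime_excess_risk(0.001, 6)   # school scenario
print(round(worker * 10_000, 1), "per 10,000")
print(round(student * 1_000_000, 1), "per million")
```

    The three-orders-of-magnitude gap between the two scenarios' cumulative exposures is why the abstract stresses the uncertainty of extrapolating high-exposure cohort data down to school settings.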

  10. Quantitative assessment of protein function prediction programs.

    PubMed

    Rodrigues, B N; Steffens, M B R; Raittz, R T; Santos-Weiss, I C R; Marchaukoski, J N

    2015-01-01

    Fast prediction of protein function is essential for high-throughput sequencing analysis. Bioinformatic resources provide cheaper and faster techniques for function prediction and have helped to accelerate the process of protein sequence characterization. In this study, we assessed protein function prediction programs that accept amino acid sequences as input. We analyzed the classification, equality, and similarity between programs, and, additionally, compared program performance. The following programs were selected for our assessment: Blast2GO, InterProScan, PANTHER, Pfam, and ScanProsite. This selection was based on the high number of citations (over 500), fully automatic analysis, and the possibility of returning a single best classification per sequence. We tested these programs using 12 gold standard datasets from four different sources. The gold standard classification of the databases was based on expert analysis, the Protein Data Bank, or the Structure-Function Linkage Database. We found that the miss rate among the programs is globally over 50%. Furthermore, we observed little overlap in the correct predictions from each program. Therefore, a combination of multiple types of sources and methods, including experimental data, protein-protein interaction, and data mining, may be the best way to generate more reliable predictions and decrease the miss rate. PMID:26782400

  11. SPECT-OPT multimodal imaging enables accurate evaluation of radiotracers for β-cell mass assessments

    PubMed Central

    Eter, Wael A.; Parween, Saba; Joosten, Lieke; Frielink, Cathelijne; Eriksson, Maria; Brom, Maarten; Ahlgren, Ulf; Gotthardt, Martin

    2016-01-01

    Single Photon Emission Computed Tomography (SPECT) has become a promising experimental approach to monitor changes in β-cell mass (BCM) during diabetes progression. SPECT imaging of pancreatic islets is most commonly cross-validated by stereological analysis of histological pancreatic sections after insulin staining. Typically, stereological methods do not accurately determine the total β-cell volume, which is inconvenient when correlating total pancreatic tracer uptake with BCM. Alternative methods are therefore warranted to cross-validate β-cell imaging using radiotracers. In this study, we introduce multimodal SPECT - optical projection tomography (OPT) imaging as an accurate approach to cross-validate radionuclide-based imaging of β-cells. Uptake of a promising radiotracer for β-cell imaging by SPECT, 111In-exendin-3, was measured by ex vivo SPECT and cross-evaluated by 3D quantitative OPT imaging as well as by histology in healthy and alloxan-treated Brown Norway rat pancreata. The SPECT signal was in excellent linear correlation with OPT data, as compared to histology. While histological determination of islet spatial distribution was challenging, SPECT and OPT revealed similar distribution patterns of 111In-exendin-3 and insulin-positive β-cell volumes between different pancreatic lobes, both visually and quantitatively. We propose ex vivo SPECT-OPT multimodal imaging as a highly accurate strategy for validating the performance of β-cell radiotracers. PMID:27080529

  13. Can College Students Accurately Assess What Affects Their Learning and Development?

    ERIC Educational Resources Information Center

    Bowman, Nicholas A.; Seifert, Tricia A.

    2011-01-01

    Informal (and sometimes formal) assessments in higher education often ask students how their skills or attitudes have changed as the result of engaging in a particular course or program; however, it is unclear to what extent these self-reports are accurate. Using a longitudinal sample of over 3,000 college students, we found that students were…

  14. Quantitative estimation in Health Impact Assessment: Opportunities and challenges

    SciTech Connect

    Bhatia, Rajiv; Seto, Edmund

    2011-04-15

    Health Impact Assessment (HIA) considers multiple effects of policies, programs, plans, and projects on health and thus requires the use of diverse analytic tools and sources of evidence. Quantitative estimation has desirable properties for the purposes of HIA, but adequate tools for quantification currently exist for only a limited number of health impacts and decision settings; furthermore, quantitative estimation generates thorny questions about the precision of estimates and the validity of methodological assumptions. In the United States, HIA has only recently emerged as an independent practice apart from integrated EIA, and this article aims to synthesize the experience with quantitative health effects estimation within that practice. We use examples identified through a scan of available instances of quantitative estimation in U.S. practice to illustrate methods applied in different policy settings, along with their strengths and limitations. We then discuss opportunity areas and practical considerations for the use of quantitative estimation in HIA.

  15. A simple and accurate protocol for absolute polar metabolite quantification in cell cultures using quantitative nuclear magnetic resonance.

    PubMed

    Goldoni, Luca; Beringhelli, Tiziana; Rocchia, Walter; Realini, Natalia; Piomelli, Daniele

    2016-05-15

    Absolute analyte quantification by nuclear magnetic resonance (NMR) spectroscopy is rarely pursued in metabolomics, even though this would allow researchers to compare results obtained using different techniques. Here we report on a new protocol that permits, after pH-controlled serum protein removal, the sensitive quantification (limit of detection [LOD] = 5-25 μM) of hydrophilic nutrients and metabolites in the extracellular medium of cells in cultures. The method does not require the use of databases and uses PULCON (pulse length-based concentration determination) quantitative NMR to obtain results that are significantly more accurate and reproducible than those obtained by CPMG (Carr-Purcell-Meiboom-Gill) sequence or post-processing filtering approaches. Three practical applications of the method highlight its flexibility under different cell culture conditions. We identified and quantified (i) metabolic differences between genetically engineered human cell lines, (ii) alterations in cellular metabolism induced by differentiation of mouse myoblasts into myotubes, and (iii) metabolic changes caused by activation of neurotransmitter receptors in mouse myoblasts. Thus, the new protocol offers an easily implementable, efficient, and versatile tool for the investigation of cellular metabolism and signal transduction. PMID:26898303

  16. Development and evaluation of a liquid chromatography-mass spectrometry method for rapid, accurate quantitation of malondialdehyde in human plasma.

    PubMed

    Sobsey, Constance A; Han, Jun; Lin, Karen; Swardfager, Walter; Levitt, Anthony; Borchers, Christoph H

    2016-09-01

    Malondialdehyde (MDA) is a commonly used marker of lipid peroxidation in oxidative stress. To provide a sensitive analytical method that is compatible with high throughput, we developed a multiple reaction monitoring-mass spectrometry (MRM-MS) approach using 3-nitrophenylhydrazine chemical derivatization, isotope labeling, and liquid chromatography (LC) with electrospray ionization (ESI)-tandem mass spectrometry to accurately quantify MDA in human plasma. A stable isotope-labeled internal standard was used to compensate for ESI matrix effects. The assay is linear (R² = 0.9999) over a 20,000-fold concentration range with a lower limit of quantitation of 30 fmol (on-column). Intra- and inter-run coefficients of variation (CVs) were <2% and ~10%, respectively. The derivative was stable for >36 h at 5 °C. Standards spiked into plasma had recoveries of 92-98%. When compared to a common LC-UV method, the LC-MS method found near-identical MDA concentrations. A pilot project quantifying MDA in plasma samples (n = 26) from a study of major depressive disorder with winter-type seasonal pattern (MDD-s) confirmed known associations between MDA concentrations and obesity (p < 0.02). The LC-MS method provides high sensitivity and high reproducibility for quantifying MDA in human plasma, and its simple sample preparation and rapid analysis time (5× faster than LC-UV) offer the throughput needed for large-scale clinical applications. PMID:27437618
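    The internal-standard quantification described above reduces to a peak-area ratio. A minimal sketch of that arithmetic (the function name, peak areas, and response factor are illustrative assumptions, not values from the paper):

```python
def conc_from_internal_standard(area_analyte, area_is, conc_is, response_factor=1.0):
    """Estimate an analyte concentration from a stable isotope-labeled
    internal standard (IS). Because the labeled IS co-elutes with the
    analyte, matrix-driven ionization suppression affects both peaks
    equally and cancels in the area ratio. response_factor corrects any
    residual difference in ESI response (1.0 assumed for an isotopologue).
    """
    return (area_analyte / area_is) * conc_is / response_factor

# Hypothetical peak areas for the MDA derivative and its labeled IS
mda = conc_from_internal_standard(area_analyte=8.4e5, area_is=4.2e5, conc_is=2.5)
print(mda)  # 5.0, in the same concentration units as conc_is
```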

  17. Importance of housekeeping gene selection for accurate reverse transcription-quantitative polymerase chain reaction in a wound healing model.

    PubMed

    Turabelidze, Anna; Guo, Shujuan; DiPietro, Luisa A

    2010-01-01

    Studies in the field of wound healing have utilized a variety of different housekeeping genes for reverse transcription-quantitative polymerase chain reaction (RT-qPCR) analysis. However, nearly all of these studies assume that the selected normalization gene is stably expressed throughout the course of the repair process. The purpose of our current investigation was to identify the most stable housekeeping genes for studying gene expression in mouse wound healing using RT-qPCR. To identify which housekeeping genes are optimal for studying gene expression in wound healing, we examined all articles published in Wound Repair and Regeneration that cited RT-qPCR during the period of January/February 2008 until July/August 2009. We determined that ACTβ, GAPDH, 18S, and β2M were the most frequently used housekeeping genes in human, mouse, and pig studies. We also investigated nine commonly used housekeeping genes that are not generally used in wound healing models: GUS, TBP, RPLP2, ATP5B, SDHA, UBC, CANX, CYC1, and YWHAZ. We observed that wounded and unwounded tissues have contrasting housekeeping gene expression stability. The results demonstrate that commonly used housekeeping genes must be validated as accurate normalizing genes for each individual experimental condition. PMID:20731795
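    One simple version of the validation the authors call for can be sketched by ranking candidate reference genes by the spread of their quantification cycle (Cq) values across experimental conditions; the Cq values below are invented, and dedicated tools such as geNorm or NormFinder use more elaborate pairwise stability measures:

```python
import statistics

def rank_reference_genes(cq_by_gene):
    """Rank candidate housekeeping genes by the standard deviation of
    their Cq values across samples; a smaller spread suggests more
    stable expression under the tested conditions."""
    return sorted(cq_by_gene, key=lambda g: statistics.stdev(cq_by_gene[g]))

# Hypothetical Cq values measured in wounded vs. unwounded tissue
cq = {
    "GAPDH": [18.1, 18.9, 20.2, 17.5],
    "TBP":   [24.3, 24.5, 24.4, 24.6],
    "18S":   [9.8, 11.0, 9.1, 10.6],
}
print(rank_reference_genes(cq))  # TBP first: most stable in these made-up data
```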

  18. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341
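    The copy-number arithmetic behind such a deletion assay is commonly done with the 2^-ΔΔCq method; a minimal sketch assuming ~100% amplification efficiency (the Cq values are invented, not from the study):

```python
def copy_number_ratio(cq_target_patient, cq_ref_patient,
                      cq_target_control, cq_ref_control):
    """Relative copy number of a target gene by the 2^-ddCq method.

    The Cq of the target (e.g., a gene in 1p36) is normalized to a
    reference gene in a disomic region, then compared between patient
    and a normal control. A ratio near 1.0 indicates two copies;
    a ratio near 0.5, a heterozygous deletion.
    """
    d_patient = cq_target_patient - cq_ref_patient
    d_control = cq_target_control - cq_ref_control
    return 2 ** -(d_patient - d_control)

# Hypothetical Cq values: the patient's target amplifies one cycle later
ratio = copy_number_ratio(26.0, 24.0, 25.0, 24.0)
print(round(ratio, 2))  # 0.5 -> consistent with a heterozygous deletion
```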

  19. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a range of semi-quantitative methods, to the more traditional fully quantitative. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys several risk matrix techniques, examining the uses and applicability of each. Limitations and problems of each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency versus consequences to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.

  20. Quantitative Assessment of Lung Using Hyperpolarized Magnetic Resonance Imaging

    PubMed Central

    Emami, Kiarash; Stephen, Michael; Kadlecek, Stephen; Cadman, Robert V.; Ishii, Masaru; Rizi, Rahim R.

    2009-01-01

    Improvements in the quantitative assessment of structure, function, and metabolic activity in the lung, combined with improvements in the spatial resolution of those assessments, enhance the diagnosis and evaluation of pulmonary disorders. Radiologic methods are among the most attractive techniques for the comprehensive assessment of the lung, as they allow quantitative assessment of this organ through measurements of a number of structural, functional, and metabolic parameters. Hyperpolarized nuclei magnetic resonance imaging (MRI) has opened up new territories for the quantitative assessment of lung function and structure with an unprecedented spatial resolution and sensitivity. This review article presents a survey of recent developments in the field of pulmonary imaging using hyperpolarized nuclei MRI for quantitative imaging of different aspects of the lung, as well as preclinical applications of these techniques to diagnose and evaluate specific pulmonary diseases. After presenting a brief overview of various hyperpolarization techniques, this survey divides the research activities of the field into four broad areas: lung microstructure, ventilation, oxygenation, and perfusion. Finally, it discusses the challenges currently faced by researchers in this field to translate this rich body of methodology into wider-scale clinical applications. PMID:19687215

  1. Automated and quantitative headspace in-tube extraction for the accurate determination of highly volatile compounds from wines and beers.

    PubMed

    Zapata, Julián; Mateo-Vivaracho, Laura; Lopez, Ricardo; Ferreira, Vicente

    2012-03-23

    An automatic headspace in-tube extraction (ITEX) method for the accurate determination of acetaldehyde, ethyl acetate, diacetyl, and other volatile compounds in wine and beer has been developed and validated. The method's accuracy rests on the nearly quantitative transfer of volatile compounds from the sample to the ITEX trap, and to achieve that goal most methodological aspects and parameters were carefully examined. The vial and sample sizes and the trapping materials were found to be critical due to the pernicious saturation effects of ethanol. Small 2 mL vials containing a very small amount of sample (20 μL of 1:10 diluted sample) and a trap filled with 22 mg of Bond Elut ENV resin could guarantee complete trapping of sample vapors. Complete extraction requires 100 × 0.5 mL pumping strokes at 60 °C and takes 24 min. Analytes are then desorbed at 240 °C into the GC injector under a 1:5 split ratio. The proportion of analytes finally transferred to the trap ranged from 85 to 99%. Validation showed satisfactory figures of merit: coefficients of determination were better than 0.995 in all cases, repeatability was better than 7% in all cases, and reproducibility was better than 8.3% except for acetaldehyde (13.1%). Detection limits were below the odor detection thresholds of the target compounds in wine and beer and well below their normal ranges of occurrence. Recoveries were not significantly different from 100%, except for acetaldehyde, where the method could not break some of the adducts this compound forms with sulfites; this problem was avoided by incubating the sample with glyoxal. The method can constitute a general and reliable alternative for the analysis of highly volatile compounds in other difficult matrices. PMID:22340891

  2. Quantitative Assessment of a Senge Learning Organization Intervention

    ERIC Educational Resources Information Center

    Kiedrowski, P. Jay

    2006-01-01

    Purpose: To quantitatively assess a Senge learning organization (LO) intervention to determine if it would result in improved employee satisfaction. Design/methodology/approach: A Senge LO intervention in Division 123 of Company ABC was undertaken in 2000. Three employee surveys using likert-scale questions over five years and correlation analysis…

  3. Quantitative phylogenetic assessment of microbial communities in diverse environments

    SciTech Connect

    von Mering, C.; Hugenholtz, P.; Raes, J.; Tringe, S.G.; Doerks, T.; Jensen, L.J.; Ward, N.; Bork, P.

    2007-01-01

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. Here, we use a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative and accurate picture of community composition than traditional rRNA-based approaches using polymerase chain reaction (PCR). By mapping marker genes from four diverse environmental data sets onto a reference species phylogeny, we show that certain communities evolve faster than others, determine preferred habitats for entire microbial clades, and provide evidence that such habitat preferences are often remarkably stable over time.
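    The marker-gene approach described above boils down to assigning each read's best marker hit to a clade on the reference phylogeny and normalizing the counts; a minimal sketch (read IDs and clade names are invented):

```python
from collections import Counter

def clade_profile(marker_hits):
    """Turn per-read marker-gene assignments (read -> clade) into a
    relative-abundance profile. Using protein-coding marker genes avoids
    the copy-number and PCR-primer biases of rRNA-based surveys.
    """
    counts = Counter(marker_hits.values())
    total = sum(counts.values())
    return {clade: n / total for clade, n in counts.items()}

# Hypothetical best-hit assignments of metagenomic reads to reference clades
hits = {"r1": "Proteobacteria", "r2": "Proteobacteria",
        "r3": "Firmicutes", "r4": "Actinobacteria"}
print(clade_profile(hits))
# {'Proteobacteria': 0.5, 'Firmicutes': 0.25, 'Actinobacteria': 0.25}
```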

  4. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates; of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling these errors, because deposit type is the best known predictor of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can serve as training tracts. Cover has a profound effect on uncertainty and on the methods and procedures of assessment, because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits exposed on the surface; these relationships will need to be relearned for covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types occur in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral

  5. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT), as applied to aerospace structures during the design, development, production, and operational phases, are assessed to help determine what useful quantitative and qualitative structural data may be provided, from raw materials through vehicle refurbishment. The assessment considers metal alloy systems and bonded composites presently applied in active NASA programs or that are strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information and are presented along with a description of the structures or standards from which the information was obtained. Examples of NDT technique capabilities and limitations are provided in tabular form. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Because quantitative data are sparse, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments, which increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.

  6. Application of an Effective Statistical Technique for an Accurate and Powerful Mining of Quantitative Trait Loci for Rice Aroma Trait

    PubMed Central

    Golestan Hashemi, Farahnaz Sadat; Rafii, Mohd Y.; Ismail, Mohd Razi; Mohamed, Mahmud Tengku Muda; Rahim, Harun A.; Latif, Mohammad Abdul; Aslani, Farzad

    2015-01-01

    When a phenotype of interest is associated with an external/internal covariate, covariate inclusion in quantitative trait loci (QTL) analyses can diminish residual variation and subsequently enhance the power of QTL detection. In the in vitro synthesis of 2-acetyl-1-pyrroline (2AP), the main fragrance compound in rice, the Maillard-type reaction between proline and reducing carbohydrates during thermal processing produces a roasted, popcorn-like aroma. Hence, for the first time, we included proline, an important precursor of 2AP, as a covariate in our QTL mapping analyses to precisely explore the genetic factors affecting natural variation in rice scent. Two QTLs were consequently traced on chromosomes 4 and 8, explaining 20% to 49% of the total aroma phenotypic variance. Additionally, by saturating the interval harboring the major QTL using gene-based primers, a putative allele of fgr (the major genetic determinant of fragrance) was mapped within the QTL on chromosome 8, in the interval RM223-SCU015RM (1.63 cM). These loci support previous studies of different accessions. Such QTLs can be widely used by breeders in crop improvement programs and for further fine mapping. Moreover, no previous study has simultaneously assessed the relationship among 2AP, proline, and fragrance QTLs; our findings can therefore help further the understanding of the metabolomic and genetic basis of 2AP biosynthesis in aromatic rice. PMID:26061689

  8. Quantitative computed tomography for spinal mineral assessment: current status

    NASA Technical Reports Server (NTRS)

    Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U.; Arnaud, C. D.

    1985-01-01

    Quantitative CT (QCT) is an established method for the noninvasive assessment of bone mineral content in the vertebral spongiosum and other anatomic locations. The potential strengths of QCT relative to dual photon absorptiometry (DPA) are its capability for precise three-dimensional anatomic localization providing a direct density measurement and its capability for spatial separation of highly responsive cancellous bone from less responsive cortical bone. The extraction of this quantitative information from the CT image, however, requires sophisticated calibration and positioning techniques and careful technical monitoring.

  9. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  10. Disordered Speech Assessment Using Automatic Methods Based on Quantitative Measures

    NASA Astrophysics Data System (ADS)

    Gu, Lingyun; Harris, John G.; Shrivastav, Rahul; Sapienza, Christine

    2005-12-01

    Speech quality assessment methods are necessary for evaluating and documenting treatment outcomes of patients suffering from degraded speech due to Parkinson's disease, stroke, or other disease processes. Subjective methods of speech quality assessment are more accurate and more robust than objective methods but are time-consuming and costly. We propose a novel objective measure of speech quality assessment that builds on traditional speech processing techniques such as dynamic time warping (DTW) and the Itakura-Saito (IS) distortion measure. Initial results show that our objective measure correlates well with the more expensive subjective methods.
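    The DTW alignment mentioned above can be sketched as the classic dynamic-programming recurrence; in the proposed measure, the absolute-difference cost used here would be replaced by a frame-level Itakura-Saito distortion between spectral features:

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Classic dynamic-time-warping distance between two 1-D feature
    sequences, aligning a degraded utterance with a reference so that a
    frame-level distortion measure can be accumulated along the path."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# A time-stretched copy of a sequence aligns at zero cost
print(dtw_distance([1, 2, 3], [1, 1, 2, 2, 3]))  # 0.0
```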

  11. Using fatty acids to fingerprint biofilm communities: a means to quickly and accurately assess stream quality.

    PubMed

    DeForest, Jared L; Drerup, Samuel A; Vis, Morgan L

    2016-05-01

    The assessment of lotic ecosystem quality plays an essential role to help determine the extent of environmental stress and the effectiveness of restoration activities. Methods that incorporate biological properties are considered ideal because they provide direct assessment of the end goal of a vigorous biological community. Our primary objective was to use biofilm lipids to develop an accurate biomonitoring tool that requires little expertise and time to facilitate assessment. A model was created of fatty acid biomarkers most associated with predetermined stream quality classification, exceptional warm water habitat (EWH), warm water habitat (WWH), and limited resource (LR-AMD), and validated along a gradient of known stream qualities. The fatty acid fingerprint of the biofilm community was statistically different (P = 0.03) and was generally unique to recognized stream quality. One striking difference was essential fatty acids (DHA, EPA, and ARA) were absent from LR-AMD and only recovered from WWH and EWH, 45 % more in EWH than WWH. Independently testing the model along a stream quality gradient, this model correctly categorized six of the seven sites, with no match due to low sample biomass. These results provide compelling evidence that biofilm fatty acid analysis can be a sensitive, accurate, and cost-effective biomonitoring tool. We conceive of future studies expanding this research to more in-depth studies of remediation efforts, determining the applicable geographic area for the method and the addition of multiple stressors with the possibility of distinguishing among stressors. PMID:27061804

  12. The quantitative assessment of normal canine small intestinal mucosa.

    PubMed

    Hart, I R; Kidder, D E

    1978-09-01

    Quantitative methods of assessing the architecture of the small intestinal mucosa have been applied to biopsy material from normal dogs. Mucosal samples taken from four predetermined sites show that there are significant quantitative differences between the various levels of the small bowel. Animals one year of age and older show no correlation between age or weight and mucosal dimensions. The significance of these findings in relation to the examination of biopsy material from cases of clinical small intestinal disease is discussed. PMID:364574

  13. Home Circadian Phase Assessments with Measures of Compliance Yield Accurate Dim Light Melatonin Onsets

    PubMed Central

    Burgess, Helen J.; Wyatt, James K.; Park, Margaret; Fogg, Louis F.

    2015-01-01

    Study Objectives: There is a need for the accurate assessment of circadian phase outside of the clinic/laboratory, particularly with the gold standard dim light melatonin onset (DLMO). We tested a novel kit designed to assist in saliva sampling at home for later determination of the DLMO. The home kit includes objective measures of compliance to the requirements for dim light and half-hourly saliva sampling. Design: Participants were randomized to one of two 10-day protocols. Each protocol consisted of two back-to-back home and laboratory phase assessments in counterbalanced order, separated by a 5-day break. Setting: Laboratory or participants' homes. Participants: Thirty-five healthy adults, age 21–62 y. Interventions: N/A. Measurements and Results: Most participants received at least one 30-sec epoch of light > 50 lux during the home phase assessments (average light intensity 4.5 lux), but on average for < 9 min of the required 8.5 h. Most participants collected every saliva sample within 5 min of the scheduled time. Ninety-two percent of home DLMOs were not affected by light > 50 lux or sampling errors. There was no significant difference between the home and laboratory DLMOs (P > 0.05); on average the home DLMOs occurred 9.6 min before the laboratory DLMOs. The home DLMOs were highly correlated with the laboratory DLMOs (r = 0.91, P < 0.001). Conclusions: Participants were reasonably compliant to the home phase assessment procedures. The good agreement between the home and laboratory dim light melatonin onsets (DLMOs) demonstrates that including objective measures of light exposure and sample timing during home saliva sampling can lead to accurate home DLMOs. Clinical Trial Registration: Circadian Phase Assessments at Home, http://clinicaltrials.gov/show/NCT01487252, NCT01487252. Citation: Burgess HJ, Wyatt JK, Park M, Fogg LF. Home circadian phase assessments with measures of compliance yield accurate dim light melatonin onsets. SLEEP 2015;38(6):889–897

  14. Accurate assessment of Congo basin forest carbon stocks requires forest type specific assessments

    NASA Astrophysics Data System (ADS)

    Moonen, Pieter C. J.; Van Ballaert, Siege; Verbist, Bruno; Boyemba, Faustin; Muys, Bart

    2014-05-01

carbon stocks despite poorer physical and chemical soil properties. Soil organic carbon stocks (0–100 cm) did not significantly differ between forest types and were estimated at 109 ± 35 Mg C ha⁻¹. Our results confirm recent findings of significantly lower carbon stocks in the Central Congo Basin as compared to the outer regions and of the importance of local tree height-diameter relationships for accurate carbon stock estimations.

  15. Quantitative assessment of visual behavior in disorders of consciousness.

    PubMed

    Trojano, L; Moretta, P; Loreto, V; Cozzolino, A; Santoro, L; Estraneo, A

    2012-09-01

The study of eye behavior is of paramount importance in the differential diagnosis of disorders of consciousness (DoC). In spite of this, assessment of eye movement patterns in patients with vegetative state (VS) or minimally conscious state (MCS) relies only on clinical evaluation. In this study we aimed to provide a quantitative assessment of visual tracking behavior in response to moving stimuli in DoC patients. Nine VS patients and nine MCS patients were recruited from a Neurorehabilitation Unit for patients with chronic DoC; 11 matched healthy subjects were tested as the control group. All participants underwent a quantitative evaluation of eye-tracking patterns by means of a computerized infrared eye-tracker system; stimuli were represented by a red circle or a small color picture slowly moving on a PC monitor. The proportion of on- or off-target fixations differed significantly between MCS and VS. Most importantly, the distribution of fixations on or off the target in all VS patients was at or below the chance level, whereas in the MCS group seven out of nine patients showed a proportion of on-target fixations significantly higher than the chance level. Fixation length did not differ significantly among the three groups. The present quantitative assessment of visual behavior in a tracking task demonstrated that MCS and VS patients differ in the proportion of on-target fixations. These results could have important clinical implications, since the quantitative analysis of visual behavior might provide additional elements in the differential diagnosis of DoC. PMID:22302277
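The chance-level comparison described above can be sketched as an exact one-sided binomial test of whether a patient's proportion of on-target fixations exceeds chance. The counts and the 50% chance level below are hypothetical, not data from the study (the actual chance level depends on target size and screen geometry):

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): one-sided exact test."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical patient: 70 of 100 fixations land on target, chance level 0.5
p_value = binom_sf(70, 100, 0.5)
print(p_value < 0.05)  # True: on-target fixations significantly exceed chance
```

A patient whose on-target proportion yields a p-value above 0.05 would, by this criterion, be indistinguishable from chance-level tracking.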

  16. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve, among other purposes, as an initial assessment of dermal exposure, resulting in a ranking of tasks and subsequently of jobs. DREAM consists of an inventory part and an evaluation part. Two examples of dermal exposure of workers at a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outer clothing layer as well as on the skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where, and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine the most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative dermal exposure assessment. PMID:12505908

  17. Combining qualitative and quantitative methods in assessing hospital learning environments.

    PubMed

    Chan, D S

    2001-08-01

Clinical education is a vital component in the curricula of pre-registration nursing courses and provides student nurses with the opportunity to combine cognitive, psychomotor, and affective skills. Clinical practice enables the student to develop competencies in the application of knowledge, skills, and attitudes to clinical field situations. It is, therefore, vital that the valuable clinical time be utilised effectively and productively. Nursing students' perceptions of the hospital learning environment were assessed by combining quantitative and qualitative approaches. The Clinical Learning Environment Inventory, based on the theoretical framework of learning environment studies, was developed and validated. The quantitative and qualitative findings reinforced each other. It was found that there were significant differences in students' perceptions of the actual clinical learning environment and their preferred learning environment. Generally, students preferred a more positive and favourable clinical environment than they perceived as being actually present. PMID:11470103

  18. Quantitative Assessment of Myocardial Blood Flow with SPECT.

    PubMed

    Petretta, Mario; Storto, Giovanni; Pellegrino, Teresa; Bonaduce, Domenico; Cuocolo, Alberto

    2015-01-01

    The quantitative assessment of myocardial blood flow (MBF) and coronary flow reserve (CFR) may be useful for the functional evaluation of coronary artery disease, allowing judgment of its severity, tracking of disease progression, and evaluation of the anti-ischemic efficacy of therapeutic strategies. Quantitative estimates of myocardial perfusion and CFR can be derived from single-photon emission computed tomography (SPECT) myocardial perfusion images by use of equipment, tracers, and techniques that are available in most nuclear cardiology laboratories. However, this method underestimates CFR, particularly at high flow rates. The recent introduction of cardiac-dedicated gamma cameras with solid-state detectors provides very fast perfusion imaging with improved resolution, allowing fast acquisition of serial dynamic images during the first pass of a flow agent. This new technology holds great promise for MBF and CFR quantification with dynamic SPECT. Future studies will clarify the effectiveness of dynamic SPECT flow imaging. PMID:25560327
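The CFR quantity discussed above is conventionally defined as the ratio of hyperemic (stress) to resting myocardial blood flow. A minimal sketch with illustrative flow values, not data from the paper:

```python
def coronary_flow_reserve(rest_mbf, stress_mbf):
    """CFR: ratio of hyperemic to resting myocardial blood flow (mL/min/g)."""
    if rest_mbf <= 0:
        raise ValueError("resting MBF must be positive")
    return stress_mbf / rest_mbf

# Illustrative values: rest 0.8 mL/min/g, hyperemic stress 2.4 mL/min/g
print(round(coronary_flow_reserve(0.8, 2.4), 2))  # 3.0
```

The underestimation of CFR noted in the abstract would show up here as a too-low stress MBF estimate at high flow rates, compressing the computed ratio.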

  19. Quantitative Proteome Analysis of Human Plasma Following in vivo Lipopolysaccharide Administration using 16O/18O Labeling and the Accurate Mass and Time Tag Approach

    PubMed Central

    Qian, Wei-Jun; Monroe, Matthew E.; Liu, Tao; Jacobs, Jon M.; Anderson, Gordon A.; Shen, Yufeng; Moore, Ronald J.; Anderson, David J.; Zhang, Rui; Calvano, Steve E.; Lowry, Stephen F.; Xiao, Wenzhong; Moldawer, Lyle L.; Davis, Ronald W.; Tompkins, Ronald G.; Camp, David G.; Smith, Richard D.

    2007-01-01

Identification of novel diagnostic or therapeutic biomarkers from human blood plasma would benefit significantly from quantitative measurements of the proteome constituents over a range of physiological conditions. Herein we describe an initial demonstration of proteome-wide quantitative analysis of human plasma. The approach utilizes post-digestion trypsin-catalyzed 16O/18O peptide labeling, two-dimensional liquid chromatography (LC)-Fourier transform ion cyclotron resonance (FTICR) mass spectrometry, and the accurate mass and time (AMT) tag strategy to identify and quantify peptides/proteins from complex samples. A peptide accurate mass and LC-elution time AMT tag database was initially generated using tandem mass spectrometry (MS/MS) following extensive multidimensional LC separations to provide the basis for subsequent peptide identifications. The AMT tag database contains >8,000 putative identified peptides, providing 938 confident plasma protein identifications. The quantitative approach was applied without depletion of high-abundance proteins for comparative analyses of plasma samples from an individual prior to and 9 h after lipopolysaccharide (LPS) administration. Accurate quantification of changes in protein abundance was demonstrated by both 1:1 labeling of control plasma and the comparison between the plasma samples following LPS administration. A total of 429 distinct plasma proteins were quantified from the comparative analyses, and the protein abundances for 25 proteins, including several known inflammatory response mediators, were observed to change significantly following LPS administration. PMID:15753121

  20. High-throughput and quantitative assessment of enhancer activity in mammals by CapStarr-seq.

    PubMed

    Vanhille, Laurent; Griffon, Aurélien; Maqbool, Muhammad Ahmad; Zacarias-Cabeza, Joaquin; Dao, Lan T M; Fernandez, Nicolas; Ballester, Benoit; Andrau, Jean Christophe; Spicuglia, Salvatore

    2015-01-01

Cell-type-specific regulation of gene expression requires the activation of promoters by distal genomic elements defined as enhancers. The identification and characterization of enhancers are challenging in mammals due to the complexity of their genomes. Here we develop CapStarr-seq, a novel high-throughput strategy to quantitatively assess enhancer activity in mammals. This approach couples capture of regions of interest to the previously developed STARR-seq technique. Extensive assessment of CapStarr-seq demonstrates accurate quantification of enhancer activity. Furthermore, we find that enhancer strength is associated with the binding complexity of tissue-specific transcription factors and with super-enhancers, while additive enhancer activity isolates key genes involved in cell identity and function. CapStarr-seq thus provides a fast and cost-effective approach to assess the activity of potential enhancers for a given cell type and will be helpful in decrypting transcription regulation mechanisms. PMID:25872643

  1. An Analytical Pipeline for Quantitative Characterization of Dietary Intake: Application To Assess Grape Intake.

    PubMed

    Garcia-Perez, Isabel; Posma, Joram M; Chambers, Edward S; Nicholson, Jeremy K; C Mathers, John; Beckmann, Manfred; Draper, John; Holmes, Elaine; Frost, Gary

    2016-03-23

The lack of accurate dietary assessment in free-living populations requires the discovery of new biomarkers reflecting food intake qualitatively and quantitatively, so that the effects of diet on health can be evaluated objectively. We provide a proof-of-principle for an analytical pipeline to identify quantitative dietary biomarkers. Tartaric acid was identified by nuclear magnetic resonance spectroscopy as a dose-responsive urinary biomarker of grape intake and subsequently quantified in volunteers following a series of 4-day dietary interventions incorporating 0 g/day, 50 g/day, 100 g/day, and 150 g/day of grapes in standardized diets from a randomized controlled clinical trial. The most accurate quantitative predictions of grape intake were obtained in 24 h urine samples, which showed the strongest linear relationship between grape intake and tartaric acid excretion (r² = 0.90). This new methodological pipeline for estimating nutritional intake, based on coupling dietary intake information with quantified nutritional biomarkers, was developed and validated in a controlled dietary intervention study, showing that this approach can improve the accuracy of estimating nutritional intakes. PMID:26909845
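The dose-response relationship described above amounts to a linear calibration: fit intake against biomarker excretion in the intervention samples, then invert the line for new samples. A sketch in which only the 0/50/100/150 g/day design points come from the study; the tartaric acid values and the new-sample excretion are invented for illustration:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

grape_intake = [0, 50, 100, 150]      # g/day, the study's design points
tartaric_acid = [0.1, 2.0, 4.1, 6.0]  # hypothetical 24 h urinary excretion
# Regress intake on the biomarker so new samples map directly to intake:
slope, intercept = fit_line(tartaric_acid, grape_intake)
predicted_intake = slope * 5.0 + intercept  # new sample with excretion 5.0
print(round(predicted_intake))
```

In practice the calibration quality would be reported alongside the fit, as the abstract does with its r² = 0.90.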

  2. PLIF: A rapid, accurate method to detect and quantitatively assess protein-lipid interactions.

    PubMed

    Ceccato, Laurie; Chicanne, Gaëtan; Nahoum, Virginie; Pons, Véronique; Payrastre, Bernard; Gaits-Iacovoni, Frédérique; Viaud, Julien

    2016-01-01

    Phosphoinositides are a type of cellular phospholipid that regulate signaling in a wide range of cellular and physiological processes through the interaction between their phosphorylated inositol head group and specific domains in various cytosolic proteins. These lipids also influence the activity of transmembrane proteins. Aberrant phosphoinositide signaling is associated with numerous diseases, including cancer, obesity, and diabetes. Thus, identifying phosphoinositide-binding partners and the aspects that define their specificity can direct drug development. However, current methods are costly, time-consuming, or technically challenging and inaccessible to many laboratories. We developed a method called PLIF (for "protein-lipid interaction by fluorescence") that uses fluorescently labeled liposomes and tethered, tagged proteins or peptides to enable fast and reliable determination of protein domain specificity for given phosphoinositides in a membrane environment. We validated PLIF against previously known phosphoinositide-binding partners for various proteins and obtained relative affinity profiles. Moreover, PLIF analysis of the sorting nexin (SNX) family revealed not only that SNXs bound most strongly to phosphatidylinositol 3-phosphate (PtdIns3P or PI3P), which is known from analysis with other methods, but also that they interacted with other phosphoinositides, which had not previously been detected using other techniques. Different phosphoinositide partners, even those with relatively weak binding affinity, could account for the diverse functions of SNXs in vesicular trafficking and protein sorting. Because PLIF is sensitive, semiquantitative, and performed in a high-throughput manner, it may be used to screen for highly specific protein-lipid interaction inhibitors. PMID:27025878

  3. Quantitative risk assessment in aerospace: Evolution from the nuclear industry

    SciTech Connect

    Frank, M.V.

    1996-12-31

In 1987, the National Aeronautics and Space Administration (NASA) and the aerospace industry relied on failure mode and effects analysis (FMEA) and hazards analysis as the primary tools for the safety and reliability of their systems. The FMEAs were reviewed to identify critical items using a set of qualitative criteria. Hazards and critical items judged the worst by a qualitative method were to be either eliminated by a design change or controlled by the addition of a safeguard. However, limitations of space, weight, technical feasibility, and cost frequently left critical items and hazards that could be neither eliminated nor controlled. In these situations, program management accepted the risk. How much risk was being accepted was unknown, because quantitative risk assessment methods were not used. Perhaps the greatest contribution of the nuclear industry to NASA and the aerospace industry was the introduction of modern (i.e., post-WASH-1400) quantitative risk assessment concepts and techniques. The concepts of risk assessment that have been most useful in the aerospace industry are the following: (1) combination of accident sequence diagrams, event trees, and fault trees to model scenarios and their causative factors; (2) use of Bayesian analysis of system and component failure data; (3) evaluation and presentation of uncertainties in the risk estimates.
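For independent basic events, the fault-tree modeling mentioned in item 1 reduces to simple probability algebra over AND and OR gates. A minimal sketch; the tree structure and basic-event probabilities below are invented for illustration:

```python
def and_gate(probs):
    """Top event requires all inputs to occur: product of probabilities
    (independence assumed)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):
    """Top event requires any input: complement of none occurring."""
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Hypothetical fault tree: TOP = (A AND B) OR C
p_top = or_gate([and_gate([1e-3, 2e-3]), 5e-6])
print(f"{p_top:.2e}")  # about 7.0e-06
```

Real assessments layer Bayesian updating of the basic-event probabilities and uncertainty propagation (items 2 and 3) on top of this arithmetic.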

  4. Blood pressure measurement for accurate assessment of patient status in emergency medical settings.

    PubMed

    Convertino, Victor A

    2012-06-01

Blood pressure measurements obtained with traditional sphygmomanometry are insensitive and nonspecific and can fail to provide an accurate assessment of patient status, particularly in clinical scenarios involving an acute reduction in central blood volume, such as hemorrhage or orthostatic testing. This paper reviews newly emerging monitoring technologies that are being developed and integrated to improve patient diagnosis by collecting arterial waveforms and extracting their features in real time with machine-learning algorithms. With assessment of continuous, noninvasively measured arterial waveforms, machine-learning algorithms have been developed with the capability to predict cardiovascular collapse with > 96% accuracy and a correlation of 0.89 between the predicted and actual times of cardiovascular collapse (e.g., shock, syncope), using a human model of progressive central hypovolemia. The resulting capability to obtain earlier predictions of imminent hemodynamic instability has significant implications for effective countermeasure applications by the aeromedical community. The ability to obtain real-time, continuous information about changes in the features and patterns of arterial waveforms, in addition to standard blood pressure, provides for the first time the capability to assess the status of a patient's circulating blood volume, and can be used to diagnose progression toward syncope or overt shock, or to guide fluid resuscitation. PMID:22764618

  5. Quantitative objective assessment of peripheral nociceptive C fibre function.

    PubMed Central

    Parkhouse, N; Le Quesne, P M

    1988-01-01

A technique is described for the quantitative assessment of peripheral nociceptive C fibre function by measurement of the axon reflex flare. Acetylcholine, introduced by electrophoresis, is used to stimulate a ring of nociceptive C fibre endings, at the centre of which the increase in blood flow is measured with a laser Doppler flowmeter. This flare (neurogenic vasodilatation) has been compared with mechanically or chemically stimulated non-neurogenic cutaneous vasodilatation. The flare is abolished by local anaesthetic and is absent in denervated skin. The flare has been measured on the sole of the foot of 96 healthy subjects; its size decreases with age in males, but not in females. PMID:3351528

  6. Numerical system utilising a Monte Carlo calculation method for accurate dose assessment in radiation accidents.

    PubMed

    Takahashi, F; Endo, A

    2007-01-01

A system utilising radiation transport codes has been developed to derive accurate dose distributions in a human body for radiological accidents. A suitable model is essential for a numerical analysis. Therefore, two tools were developed to set up a 'problem-dependent' input file, defining a radiation source and an exposed person, to simulate the radiation transport in an accident with the Monte Carlo calculation codes MCNP and MCNPX. For both tools, the necessary resources are defined through a dialogue method on a generally available personal computer. The tools prepare human body and source models described in the input file format of the employed Monte Carlo codes. The tools were validated for dose assessment in comparison with a past criticality accident and a hypothesized exposure. PMID:17510203
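The Monte Carlo transport idea behind codes like MCNP can be caricatured in miniature: sample exponential free paths and count photons that traverse a slab uncollided. This toy omits the scattering, energy dependence, and geometry the real codes handle; all numbers are illustrative:

```python
import math
import random

def fraction_transmitted(mu, thickness, n=100_000, seed=42):
    """Fraction of photons crossing a slab of given thickness (cm) without
    interacting, for attenuation coefficient mu (1/cm). Free path lengths
    are sampled from the exponential distribution as -ln(U) / mu."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if -math.log(rng.random()) / mu > thickness)
    return hits / n

# Analytic answer is exp(-mu * thickness) = exp(-1), roughly 0.368
print(round(fraction_transmitted(mu=0.2, thickness=5.0), 3))
```

Agreement of the sampled estimate with the closed-form attenuation law is the usual sanity check before trusting a transport setup on geometries with no analytic answer.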

  7. Short Course Introduction to Quantitative Mineral Resource Assessments

    USGS Publications Warehouse

    Singer, Donald A.

    2007-01-01

This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems, so that the consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers in evaluating the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or a similar decision-maker. Some of the tools and models presented here will be useful for the selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments, in which the general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of the tonnages and grades of well-explored deposits serve as models of the grades and tonnages of undiscovered deposits, and the number of undiscovered deposits is estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: The kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral

  8. Wavelet prism decomposition analysis applied to CARS spectroscopy: a tool for accurate and quantitative extraction of resonant vibrational responses.

    PubMed

    Kan, Yelena; Lensu, Lasse; Hehl, Gregor; Volkmer, Andreas; Vartiainen, Erik M

    2016-05-30

    We propose an approach, based on wavelet prism decomposition analysis, for correcting experimental artefacts in a coherent anti-Stokes Raman scattering (CARS) spectrum. This method allows estimating and eliminating a slowly varying modulation error function in the measured normalized CARS spectrum and yields a corrected CARS line-shape. The main advantage of the approach is that the spectral phase and amplitude corrections are avoided in the retrieved Raman line-shape spectrum, thus significantly simplifying the quantitative reconstruction of the sample's Raman response from a normalized CARS spectrum in the presence of experimental artefacts. Moreover, the approach obviates the need for assumptions about the modulation error distribution and the chemical composition of the specimens under study. The method is quantitatively validated on normalized CARS spectra recorded for equimolar aqueous solutions of D-fructose, D-glucose, and their disaccharide combination sucrose. PMID:27410113

  9. Rock Slide Risk Assessment: A Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Duzgun, H. S. B.

    2009-04-01

Rock slides can be better managed by systematic risk assessments. Any risk assessment methodology for rock slides involves the identification of rock slide risk components, which are hazard, elements at risk, and vulnerability. For a quantitative or semi-quantitative risk assessment for rock slides, a mathematical value of the risk has to be computed and evaluated. The quantitative evaluation of risk for rock slides enables comparison of the computed risk with the risk of other natural and/or human-made hazards, providing better decision support and easier communication for the decision makers. A quantitative/semi-quantitative risk assessment procedure involves: danger identification, hazard assessment, elements at risk identification, vulnerability assessment, risk computation, and risk evaluation. On the other hand, the steps of this procedure require adaptation of existing implementation methods, or development of new ones, depending on the type of landslide, data availability, investigation scale, and the nature of the consequences. In this study, a generic semi-quantitative risk assessment (SQRA) procedure for rock slides is proposed. The procedure has five consecutive stages: data collection and analyses, hazard assessment, analysis of elements at risk, vulnerability assessment, and risk assessment. The implementation of the procedure is illustrated for a single rock slide case, a rock slope in Norway. Rock slides from mountain Ramnefjell to lake Loen are considered to be one of the major geohazards in Norway. Lake Loen is located in the inner part of Nordfjord in Western Norway. Ramnefjell Mountain is heavily jointed, leading to the formation of vertical rock slices with heights between 400–450 m and widths between 7–10 m. These slices threaten the settlements around Loen Valley and tourists visiting the fjord during the summer season, as released slides have the potential to create a tsunami. In the past, several rock slides were recorded from Mountain Ramnefjell between 1905 and 1950.
Among them
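The risk computation stage described above is often operationalized, in semi-quantitative schemes, as the product of hazard probability, vulnerability, and the value of the elements at risk. A minimal sketch with invented numbers, not values from the Ramnefjell case:

```python
def rock_slide_risk(annual_hazard_prob, vulnerability, elements_value):
    """Expected annual loss: R = H * V * E, with vulnerability in [0, 1]."""
    if not 0.0 <= vulnerability <= 1.0:
        raise ValueError("vulnerability must be in [0, 1]")
    return annual_hazard_prob * vulnerability * elements_value

# Illustrative: 1% annual slide probability, 60% vulnerability,
# elements at risk valued at 5 million (arbitrary currency units)
print(round(rock_slide_risk(0.01, 0.6, 5_000_000)))  # 30000
```

Expressing the result as an expected annual loss is what makes the rock slide risk directly comparable with that of other natural or human-made hazards, as the abstract emphasizes.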

  10. Using Thoracic Ultrasonography to Accurately Assess Pneumothorax Progression During Positive Pressure Ventilation

    PubMed Central

    Lossius, Hans Morten; Wemmelund, Kristian; Stokkeland, Paal Johan; Knudsen, Lars; Sloth, Erik

    2013-01-01

    Background: Although thoracic ultrasonography accurately determines the size and extent of occult pneumothoraces (PTXs) in spontaneously breathing patients, there is uncertainty about patients receiving positive pressure ventilation. We compared the lung point (ie, the area where the collapsed lung still adheres to the inside of the chest wall) using the two modalities ultrasonography and CT scanning to determine whether ultrasonography can be used reliably to assess PTX progression in a positive-pressure-ventilated porcine model. Methods: Air was introduced in incremental steps into five hemithoraces in three intubated porcine models. The lung point was identified on ultrasound imaging and referenced against the lateral limit of the intrapleural air space identified on the CT scans. The distance from the sternum to the lung point (S-LP) was measured on the CT scans and correlated to the insufflated air volume. Results: The mean total difference between the 131 ultrasound and CT scan lung points was 6.8 mm (SD, 7.1 mm; range, 0.0-29.3 mm). A mixed-model regression analysis showed a linear relationship between the S-LP distances and the PTX volume (P < .001). Conclusions: In an experimental porcine model, we found a linear relation between the PTX size and the lateral position of the lung point. The accuracy of thoracic ultrasonography for identifying the lung point (and, thus, the PTX extent) was comparable to that of CT imaging. These clinically relevant results suggest that ultrasonography may be safe and accurate in monitoring PTX progression during positive pressure ventilation. PMID:23188058

  11. Quantitative Proteome Analysis of Human Plasma Following in vivo Lipopolysaccharide Administration using O-16/O-18 Labeling and the Accurate Mass and Time Tag Approach

    SciTech Connect

    Qian, Weijun; Monroe, Matthew E.; Liu, Tao; Jacobs, Jon M.; Anderson, Gordon A.; Shen, Yufeng; Moore, Ronald J.; Anderson, David J.; Zhang, Rui; Calvano, Steven E.; Lowry, Stephen F.; Xiao, Wenzhong; Moldawer, Lyle L.; Davis, Ronald W.; Tompkins, Ronald G.; Camp, David G.; Smith, Richard D.

    2005-05-01

Identification of novel diagnostic or therapeutic biomarkers from human blood plasma would benefit significantly from quantitative measurements of the proteome constituents over a range of physiological conditions. We describe here an initial demonstration of proteome-wide quantitative analysis of human plasma. The approach utilizes post-digestion trypsin-catalyzed 16O/18O labeling, two-dimensional liquid chromatography (LC)-Fourier transform ion cyclotron resonance (FTICR) mass spectrometry, and the accurate mass and time (AMT) tag strategy for identification and quantification of peptides/proteins from complex samples. A peptide mass and time tag database was initially generated using tandem mass spectrometry (MS/MS) following extensive multidimensional LC separations, and this database serves as a 'look-up' table for peptide identification. The mass and time tag database contains >8,000 putative identified peptides, which yielded 938 confident plasma protein identifications. The quantitative approach was applied to the comparative analyses of plasma samples from an individual prior to and 9 hours after lipopolysaccharide (LPS) administration, without depletion of high-abundance proteins. Accurate quantification of changes in protein abundance was demonstrated with both 1:1 labeling of control plasma and the comparison between the plasma samples following LPS administration. A total of 429 distinct plasma proteins were quantified from the comparative analyses, and the protein abundances for 28 proteins, including several known inflammatory response mediators, were observed to be significantly changed following LPS administration.

  12. Improved Surgical Site Infection (SSI) rate through accurately assessed surgical wounds

    PubMed Central

John, Honeymol; Nimeri, Abdelrahman; Ellahham, Samer

    2015-01-01

assignment was 36%, and the worst rates were in appendectomies (97%). Over time, our incorrect wound classification decreased to 22%, while at the same time our actual SSI wound occurrences in the department decreased from an average of six to three per month, along with our odds ratio of SSI. We followed the best practice guidelines of the ACS NSQIP. Accurate assessment of wound classification is necessary to ensure that the expected SSI rates are not falsely high if wounds are under-classified. The present study shows that accurate wound classification in contaminated and dirty wounds can lead to a lower odds ratio of SSI. PMID:26734358

  13. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    PubMed Central

    2010-01-01

Background: Normalizing to reference genes, or housekeeping genes, can produce more accurate and reliable results from reverse-transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, selection of suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out on plants. qPCR studies on important crops such as cotton have therefore been hampered by the lack of suitable reference genes. Results: By the use of two distinct algorithms, implemented by geNorm and NormFinder, we have assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development, and the floral verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested, as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion: We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene
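The stability ranking performed by geNorm and NormFinder can be caricatured with a much simpler statistic: the coefficient of variation of relative expression (2^-Cq) across samples, lower meaning more stably expressed. This is a rough proxy only, not the geNorm M-value or the NormFinder model, and the Cq values below are invented:

```python
from math import sqrt

def expression_cv(cq_values):
    """Coefficient of variation of relative expression levels (2 ** -Cq)
    across samples; lower = more stably expressed candidate gene."""
    expr = [2.0 ** (-cq) for cq in cq_values]
    mean = sum(expr) / len(expr)
    var = sum((e - mean) ** 2 for e in expr) / (len(expr) - 1)
    return sqrt(var) / mean

# Hypothetical Cq values for two candidate genes across five samples
stable_gene = expression_cv([20.1, 20.0, 20.2, 19.9, 20.1])
variable_gene = expression_cv([20.1, 22.5, 19.0, 23.2, 21.0])
print(stable_gene < variable_gene)  # True
```

The dedicated algorithms improve on this by comparing candidates pairwise (geNorm) or by modeling intra- and inter-group variation (NormFinder), which is why their rankings are preferred in practice.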

  14. Self-aliquoting microarray plates for accurate quantitative matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Pabst, Martin; Fagerer, Stephan R; Köhling, Rudolf; Küster, Simon K; Steinhoff, Robert; Badertscher, Martin; Wahl, Fabian; Dittrich, Petra S; Jefimovs, Konstantins; Zenobi, Renato

    2013-10-15

Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a fast analysis tool employed for the detection of a broad range of analytes. However, MALDI-MS has a reputation of not being suitable for quantitative analysis. Inhomogeneous analyte/matrix co-crystallization, spot-to-spot inhomogeneity, and a typically low number of replicates are the main contributing factors. Here, we present a novel MALDI sample target for quantitative MALDI-MS applications, which addresses the limitations mentioned above. The platform is based on the recently developed microarray for mass spectrometry (MAMS) technology and contains parallel lanes of hydrophilic reservoirs. Samples are not pipetted manually but deposited by dragging one or several sample droplets with a metal sliding device along these lanes. The sample is rapidly and automatically aliquoted into the sample spots due to the interplay of hydrophilic/hydrophobic interactions. With a few microliters of sample, it is possible to aliquot up to 40 replicates within seconds, each aliquot containing just 10 nL. The analyte droplet dries immediately and homogeneously, and consumption of the whole spot during MALDI-MS analysis is typically accomplished within a few seconds. We evaluated these sample targets with respect to their suitability for use with different samples and matrices. Furthermore, we tested their application for generating calibration curves of standard peptides with α-cyano-4-hydroxycinnamic acid as a matrix. For angiotensin II and [Glu1]-fibrinopeptide B, we achieved coefficients of determination (r²) greater than 0.99 without the use of internal standards. PMID:24003910
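A calibration curve like those described above is judged by its coefficient of determination. A small sketch computing r² for a least-squares line; the concentrations and signals below are invented, not data from the paper:

```python
def r_squared(xs, ys):
    """Coefficient of determination of the least-squares line through (xs, ys)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# Hypothetical peptide standards: concentration vs. mean MALDI signal
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
signal = [12.0, 25.0, 49.0, 101.0, 198.0]
print(r_squared(conc, signal) > 0.99)  # True
```

Averaging many fast 10 nL replicates per concentration, as the MAMS plates allow, is precisely what pushes the replicate means onto a line tight enough for r² > 0.99.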

  15. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measure of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  16. Performance Assessment in Fingerprinting and Multi Component Quantitative NMR Analyses.

    PubMed

    Gallo, Vito; Intini, Nicola; Mastrorilli, Piero; Latronico, Mario; Scapicchio, Pasquale; Triggiani, Maurizio; Bevilacqua, Vitoantonio; Fanizzi, Paolo; Acquotti, Domenico; Airoldi, Cristina; Arnesano, Fabio; Assfalg, Michael; Benevelli, Francesca; Bertelli, Davide; Cagliani, Laura R; Casadei, Luca; Cesare Marincola, Flaminia; Colafemmina, Giuseppe; Consonni, Roberto; Cosentino, Cesare; Davalli, Silvia; De Pascali, Sandra A; D'Aiuto, Virginia; Faccini, Andrea; Gobetto, Roberto; Lamanna, Raffaele; Liguori, Francesca; Longobardi, Francesco; Mallamace, Domenico; Mazzei, Pierluigi; Menegazzo, Ileana; Milone, Salvatore; Mucci, Adele; Napoli, Claudia; Pertinhez, Thelma; Rizzuti, Antonino; Rocchigiani, Luca; Schievano, Elisabetta; Sciubba, Fabio; Sobolev, Anatoly; Tenori, Leonardo; Valerio, Mariacristina

    2015-07-01

    An interlaboratory comparison (ILC) was organized with the aim to set up quality control indicators suitable for multicomponent quantitative analysis by nuclear magnetic resonance (NMR) spectroscopy. A total of 36 NMR data sets (corresponding to 1260 NMR spectra) were produced by 30 participants using 34 NMR spectrometers. The calibration line method was chosen for the quantification of a five-component model mixture. Results show that quantitative NMR is a robust quantification tool and that 26 out of 36 data sets resulted in statistically equivalent calibration lines for all considered NMR signals. The performance of each laboratory was assessed by means of a new performance index (named Qp-score) which is related to the difference between the experimental and the consensus values of the slope of the calibration lines. Laboratories endowed with a Qp-score falling within the suitable acceptability range are qualified to produce NMR spectra that can be considered statistically equivalent in terms of relative intensities of the signals. In addition, the specific response of nuclei to the experimental excitation/relaxation conditions was addressed by means of the parameter named NR. NR is related to the difference between the theoretical and the consensus slopes of the calibration lines and is specific for each signal produced by a well-defined set of acquisition parameters. PMID:26020452
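    The Qp-score is described as a function of the difference between each laboratory's calibration slope and the consensus slope. A toy sketch of that idea in Python, using the median slope as consensus and a robust spread for scaling; the function names, the scaling, and the slope values are assumptions for illustration, not the published formula:

```python
import statistics

def slope(xs, ys):
    """Least-squares slope of a calibration line through (xs, ys)."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)

def qp_like_scores(lab_slopes):
    """Score each laboratory by the deviation of its calibration slope
    from the consensus (median) slope, in units of the robust spread.
    This mimics the idea behind the Qp-score; the exact formula in the
    paper may differ."""
    consensus = statistics.median(lab_slopes)
    spread = statistics.median(abs(s - consensus) for s in lab_slopes) or 1.0
    return [(s - consensus) / spread for s in lab_slopes]

# Hypothetical slopes from five labs for one NMR signal
slopes = [0.98, 1.01, 1.00, 0.99, 1.35]
scores = qp_like_scores(slopes)
outliers = [i for i, z in enumerate(scores) if abs(z) > 3]
print(outliers)  # the lab with slope 1.35 stands out
```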

  17. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances of process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than one order of magnitude higher than that associated with internal failure causes. Critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment having a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes. PMID:18657068

  18. New Cardiovascular Risk Factors and Their Use for an Accurate Cardiovascular Risk Assessment in Hypertensive Patients

    PubMed Central

    TAUTU, Oana-Florentina; DARABONT, Roxana; ONCIUL, Sebastian; DEACONU, Alexandru; COMANESCU, Ioana; ANDREI, Radu Dan; DRAGOESCU, Bogdan; CINTEZA, Mircea; DOROBANTU, Maria

    2014-01-01

    Objectives: To analyze the predictive value of new cardiovascular (CV) risk factors for CV risk assessment in the adult Romanian hypertensive (HT) population. Methods: Hypertensive adults aged between 40 and 65 years, identified in the nationally representative SEPHAR II survey, were evaluated by anthropometric, BP and arterial stiffness measurements: aortic pulse wave velocity (PWVao), aortic augmentation index (AIXao), return time (RT) and central systolic blood pressure (SBPao), 12-lead ECGs and laboratory workup. Values above the 4th quartile of the mean SBP's standard deviation (s.d.) defined increased BP variability. Log(TG/HDL-cholesterol) defined the atherogenic index of plasma (AIP). Serum uric acid levels above 5.70 mg/dl for women and 7.0 mg/dl for men defined hyperuricemia (HUA). CV risk was assessed based on the SCORE chart for high CV risk countries. Binary logistic regression using a stepwise likelihood ratio method (with adjustments for major confounders and collinearity analysis) was used to validate predictors of the high and very high CV risk classes. Results: The mean SBP value of the study group was 148.46±19.61 mmHg. Over forty percent of hypertensives had a high or very high CV risk. Predictors of the high/very high CV risk category validated by regression analysis were: increased visit-to-visit BP variability (OR: 2.49; 95%CI: 1.67-3.73), PWVao (OR: 1.12; 95%CI: 1.02-1.22), RT (OR: 0.95; 95% CI: 0.93-0.98), SBPao (OR: 1.01; 95%CI: 1.01-1.03) and AIP (OR: 7.08; 95%CI: 3.91-12.82). Conclusion: The results of our study suggest that the new CV risk factors such as increased BP variability, arterial stiffness indices and AIP are useful tools for a more accurate identification of hypertensive patients at high and very high CV risk. PMID:25705267
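    Two of the quantities defined above are directly computable: the atherogenic index of plasma, AIP = log10(TG/HDL-cholesterol), and the multiplicative effect on the odds implied by a per-unit odds ratio. A small sketch (the patient values are hypothetical):

```python
import math

def atherogenic_index(tg, hdl):
    """Atherogenic index of plasma, AIP = log10(TG / HDL-cholesterol),
    as defined in the abstract; both lipids in the same units."""
    return math.log10(tg / hdl)

def odds_ratio_scaled(or_per_unit, delta):
    """Multiplicative change in odds for a `delta` change in a predictor,
    given the per-unit odds ratio from logistic regression."""
    return or_per_unit ** delta

# Hypothetical patient: TG 180 mg/dl, HDL 45 mg/dl
aip = atherogenic_index(180, 45)
print(round(aip, 3))  # log10(4) ~ 0.602

# With the reported OR of 7.08 per AIP unit, a 0.3-unit increase in AIP
# multiplies the odds of high/very high CV risk by:
print(round(odds_ratio_scaled(7.08, 0.3), 2))
```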

  19. Assessing the Reliability of Quantitative Imaging of Sm-153

    NASA Astrophysics Data System (ADS)

    Poh, Zijie; Dagan, Maáyan; Veldman, Jeanette; Trees, Brad

    2013-03-01

    Samarium-153 is used for palliation of bone metastases and has recently been investigated for therapy. Patient-specific dosing of Sm-153 is based on quantitative single-photon emission computed tomography (SPECT) and requires knowledge of the accuracy and precision of image-based estimates of the in vivo activity distribution. Physical phantom studies are useful for estimating these in simple objects, but do not model realistic activity distributions. We are using realistic Monte Carlo simulations combined with a realistic digital phantom modeling human anatomy to assess the accuracy and precision of Sm-153 SPECT. Preliminary data indicate that we can simulate projection images and reconstruct them with compensation for various physical image-degrading factors, such as attenuation and scatter in the body as well as non-idealities in the imaging system, to provide realistic SPECT images.

  20. Quantitative Elastography for Cervical Stiffness Assessment during Pregnancy

    PubMed Central

    Fruscalzo, A.; Londero, A. P.; Fröhlich, C.; Möllmann, U.; Schmitz, R.

    2014-01-01

    Aim. Feasibility and reliability of tissue Doppler imaging-(TDI-) based elastography for cervical quantitative stiffness assessment during all three trimesters of pregnancy were evaluated. Materials and Methods. Prospective case-control study including seventy-four patients recruited between the 12th and 42nd weeks of gestation. The tissue strain (TS) was measured by two independent operators as natural strain. Intra- and interoperator intraclass correlation coefficient (ICC) agreements were evaluated. Results. TS measurement was always feasible and exhibited high reliability (intraoperator ICC agreement = 0.93; interoperator ICC agreement = 0.89 and 0.93 for a single measurement and for the average of two measurements, respectively). Cervical TS also showed a significant correlation with gestational age, cervical length, and parity. Conclusions. TS measurement during pregnancy demonstrated high feasibility and reliability. Furthermore, TS significantly correlated with gestational age, cervical length, and parity. PMID:24734246
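    The reported reliability figures are intraclass correlation coefficients. Below is a sketch of the single-measure, absolute-agreement ICC(2,1) computed from the standard two-way ANOVA decomposition; the tissue-strain readings are invented for illustration, and the study's software may use a different ICC variant:

```python
import statistics

def icc_agreement(scores):
    """Single-measure, absolute-agreement ICC (two-way random effects,
    ICC(2,1)) for a subjects-by-raters table, from the classic ANOVA
    decomposition. An illustrative sketch, not the study's exact tool."""
    n = len(scores)          # subjects
    k = len(scores[0])       # raters/operators
    grand = statistics.fmean(v for row in scores for v in row)
    row_means = [statistics.fmean(row) for row in scores]
    col_means = [statistics.fmean(row[j] for row in scores) for j in range(k)]
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((v - grand) ** 2 for row in scores for v in row)
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)                 # between-subject mean square
    msc = ss_cols / (k - 1)                 # between-rater mean square
    mse = ss_err / ((n - 1) * (k - 1))      # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical tissue-strain readings by two operators on five cervices
ts = [(0.21, 0.22), (0.35, 0.33), (0.28, 0.29), (0.40, 0.41), (0.18, 0.19)]
print(round(icc_agreement(ts), 2))  # near 1 when operators closely agree
```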

  1. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and effectiveness of protection mechanisms, and the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, which is nowadays an important issue for risk analysts.
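    A quantitative risk index of the kind such tools compute can be sketched as expected loss per attack scenario, with a cost/benefit index for ranking protection upgrades. The formulation (R = P x (1 - E) x C) and all numbers below are illustrative textbook assumptions, not the paper's reference model:

```python
def scenario_risk(attack_frequency, protection_effectiveness, consequence):
    """Expected loss for one attack scenario: frequency of attempts,
    times the probability the protection system fails to stop them,
    times the consequence of a successful attack."""
    return attack_frequency * (1.0 - protection_effectiveness) * consequence

def cost_benefit_index(risk_before, risk_after, annualized_cost):
    """Return-on-investment style index: risk reduction per unit cost."""
    return (risk_before - risk_after) / annualized_cost

# Hypothetical vandalism scenario at a station (frequencies per year,
# consequences and costs in arbitrary monetary units)
r0 = scenario_risk(12.0, 0.20, 5_000)    # current protection: CCTV only
r1 = scenario_risk(12.0, 0.75, 5_000)    # CCTV + intrusion detection
print(cost_benefit_index(r0, r1, 15_000))
```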

  2. A Statistical Method for Assessing Peptide Identification Confidence in Accurate Mass and Time Tag Proteomics

    SciTech Connect

    Stanley, Jeffrey R.; Adkins, Joshua N.; Slysz, Gordon W.; Monroe, Matthew E.; Purvine, Samuel O.; Karpievitch, Yuliya V.; Anderson, Gordon A.; Smith, Richard D.; Dabney, Alan R.

    2011-07-15

    High-throughput proteomics is rapidly evolving to require high mass measurement accuracy for a variety of different applications. Increased mass measurement accuracy in bottom-up proteomics specifically allows for an improved ability to distinguish and characterize detected MS features, which may in turn be identified by, e.g., matching to entries in a database for both precursor and fragmentation mass identification methods. Many tools exist with which to score the identification of peptides from LC-MS/MS measurements or to assess matches to an accurate mass and time (AMT) tag database, but these two calculations remain distinctly unrelated. Here we present a statistical method, Statistical Tools for AMT tag Confidence (STAC), which extends our previous work incorporating prior probabilities of correct sequence identification from LC-MS/MS, as well as the quality with which LC-MS features match AMT tags, to evaluate peptide identification confidence. Compared to existing tools, we are able to obtain significantly more high-confidence peptide identifications at a given false discovery rate and additionally assign confidence estimates to individual peptide identifications. Freely available software implementations of STAC are available in both command line and as a Windows graphical application.
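    The core idea of combining a prior identification probability with the quality of an AMT tag match can be illustrated with Bayes' rule. This is a toy sketch of the concept only, with hypothetical numbers; STAC's published statistical model is more elaborate:

```python
def posterior_confidence(prior, lik_match_correct, lik_match_incorrect):
    """Bayes' rule combining a prior probability that a peptide sequence
    is correct (e.g., from LC-MS/MS scoring) with the likelihood of the
    observed mass/elution-time match quality under correct vs. incorrect
    identification. A toy illustration of the idea behind STAC, not its
    published formula."""
    num = prior * lik_match_correct
    return num / (num + (1.0 - prior) * lik_match_incorrect)

# A feature matching an AMT tag very closely in mass and elution time
# (likelihood ratio 20:1) upgrades a modest prior considerably:
print(round(posterior_confidence(0.6, 0.8, 0.04), 3))
```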

  3. Algal productivity modeling: a step toward accurate assessments of full-scale algal cultivation.

    PubMed

    Béchet, Quentin; Chambonnière, Paul; Shilton, Andy; Guizard, Guillaume; Guieysse, Benoit

    2015-05-01

    A new biomass productivity model was parameterized for Chlorella vulgaris using short-term (<30 min) oxygen productivities from algal microcosms exposed to 6 light intensities (20-420 W/m(2)) and 6 temperatures (5-42 °C). The model was then validated against experimental biomass productivities recorded in bench-scale photobioreactors operated under 4 light intensities (30.6-74.3 W/m(2)) and 4 temperatures (10-30 °C), yielding an accuracy of ± 15% over 163 days of cultivation. This modeling approach addresses major challenges associated with the accurate prediction of algal productivity at full-scale. Firstly, while most prior modeling approaches have only considered the impact of light intensity on algal productivity, the model herein validated also accounts for the critical impact of temperature. Secondly, this study validates a theoretical approach to convert short-term oxygen productivities into long-term biomass productivities. Thirdly, the experimental methodology used has the practical advantage of only requiring one day of experimental work for complete model parameterization. The validation of this new modeling approach is therefore an important step for refining feasibility assessments of algae biotechnologies. PMID:25502920
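    A productivity model of this general shape couples a saturating light response with a cardinal temperature response. The sketch below uses a Monod-type light term and the Rosso (CTMI) temperature term with hypothetical parameters; it is not the paper's actual parameterization for Chlorella vulgaris:

```python
def light_factor(irradiance, k_sat):
    """Saturating (Monod-type) light response, 0..1."""
    return irradiance / (irradiance + k_sat)

def temperature_factor(t, t_min, t_opt, t_max):
    """Cardinal temperature model with inflexion (CTMI): 1 at t_opt,
    0 outside the growth range [t_min, t_max]."""
    if t <= t_min or t >= t_max:
        return 0.0
    num = (t - t_max) * (t - t_min) ** 2
    den = (t_opt - t_min) * ((t_opt - t_min) * (t - t_opt)
           - (t_opt - t_max) * (t_opt + t_min - 2.0 * t))
    return num / den

def productivity(p_max, irradiance, k_sat, t, t_min, t_opt, t_max):
    """Biomass productivity as maximum rate scaled by light and
    temperature limitation factors."""
    return p_max * light_factor(irradiance, k_sat) * \
           temperature_factor(t, t_min, t_opt, t_max)

# Hypothetical parameters for a Chlorella-like culture
print(round(productivity(1.0, 200.0, 100.0, 25.0, 2.0, 30.0, 42.0), 3))
```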

  4. Quantitative Assessment of Eye Phenotypes for Functional Genetic Studies Using Drosophila melanogaster.

    PubMed

    Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R; Isaacs, Adrian M; Partridge, Linda; Lu, Bingwei; Kumar, Justin P; Girirajan, Santhosh

    2016-01-01

    About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292
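    A "disorderliness" measure over detected ommatidial centres can be illustrated with a simple statistic: the coefficient of variation of nearest-neighbour distances, which is near zero for a regular hexagonal lattice and grows with positional disturbance. This is an illustrative stand-in, not Flynotyper's published phenotypic score:

```python
import math
import random
import statistics

def disorder_score(centers):
    """Coefficient of variation of nearest-neighbour distances between
    ommatidial centres: ~0 for a perfectly regular lattice, larger for
    disturbed arrangements. Illustrative only; Flynotyper's phenotypic
    score is computed differently."""
    nn = []
    for i, (xi, yi) in enumerate(centers):
        d = min(math.hypot(xi - xj, yi - yj)
                for j, (xj, yj) in enumerate(centers) if j != i)
        nn.append(d)
    return statistics.pstdev(nn) / statistics.fmean(nn)

# Regular hexagonal patch vs. the same patch with positional noise
regular = [(x + 0.5 * (y % 2), y * math.sqrt(3) / 2)
           for x in range(6) for y in range(6)]
random.seed(0)
noisy = [(x + random.uniform(-0.3, 0.3), y + random.uniform(-0.3, 0.3))
         for x, y in regular]
print(disorder_score(regular) < disorder_score(noisy))  # True
```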

  5. Quantitative Assessment of Eye Phenotypes for Functional Genetic Studies Using Drosophila melanogaster

    PubMed Central

    Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S.; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R.; Isaacs, Adrian M.; Partridge, Linda; Lu, Bingwei; Kumar, Justin P.; Girirajan, Santhosh

    2016-01-01

    About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292

  6. Accurate quantitative 13C NMR spectroscopy: repeatability over time of site-specific 13C isotope ratio determination.

    PubMed

    Caytan, Elsa; Botosoa, Eliot P; Silvestre, Virginie; Robins, Richard J; Akoka, Serge; Remaud, Gérald S

    2007-11-01

    The stability over time (repeatability) for the determination of site-specific 13C/12C ratios at natural abundance by quantitative 13C NMR spectroscopy has been tested on three probes: enriched bilabeled [1,2-13C2]ethanol; ethanol at natural abundance; and vanillin at natural abundance. It is shown in all three cases that the standard deviation for a series of measurements taken every 2-3 months over periods between 9 and 13 months is equal to or smaller than the standard deviation calculated from 5-10 replicate measurements made on a single sample. The precision which can be achieved using the present analytical 13C NMR protocol is higher than the prerequisite value of 1-2 per thousand for the determination of site-specific 13C/12C ratios at natural abundance (13C-SNIF-NMR). Hence, this technique permits the discrimination of very small variations in 13C/12C ratios between carbon positions, as found in biogenic natural products. This observed stability over time in 13C NMR spectroscopy indicates that further improvements in precision will depend primarily on improved signal-to-noise ratio. PMID:17900175
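    The repeatability claim compares the spread of measurements taken months apart with the spread of same-day replicates. A trivial check in Python with invented site-specific values (per mil); only the comparison logic, not the data, reflects the paper:

```python
import statistics

# Hypothetical site-specific 13C values (per mil) for one carbon position
over_time = [27.1, 27.3, 27.0, 27.2, 27.1]          # one run every ~2-3 months
replicates = [27.2, 27.0, 27.3, 26.9, 27.1, 27.2]   # same sample, same day

sd_time = statistics.stdev(over_time)
sd_rep = statistics.stdev(replicates)
# Long-term repeatability comparable to (or better than) replicate precision
print(sd_time <= sd_rep)
```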

  7. Selection of accurate reference genes in mouse trophoblast stem cells for reverse transcription-quantitative polymerase chain reaction.

    PubMed

    Motomura, Kaori; Inoue, Kimiko; Ogura, Atsuo

    2016-06-17

    Mouse trophoblast stem cells (TSCs) form colonies of different sizes and morphologies, which might reflect their degrees of differentiation. Therefore, each colony type can have a characteristic gene expression profile; however, the expression levels of internal reference genes may also change, causing fluctuations in their estimated gene expression levels. In this study, we validated seven housekeeping genes by using a geometric averaging method and identified Gapdh as the most stable gene across different colony types. Indeed, when Gapdh was used as the reference, expression levels of Elf5, a TSC marker gene, stringently classified TSC colonies into two groups: a high expression group consisting of type 1 and 2 colonies, and a lower expression group consisting of type 3 and 4 colonies. This clustering was consistent with our putative classification of undifferentiated/differentiated colonies based on their time-dependent colony transitions. By contrast, use of an unstable reference gene (Rn18s) allowed no such clear classification. Cdx2, another TSC marker, did not show any significant colony type-specific expression pattern irrespective of the reference gene. Selection of stable reference genes for quantitative gene expression analysis might be critical, especially when cell lines consisting of heterogeneous cell populations are used. PMID:26853688
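    The "geometric averaging method" referred to here is the geNorm approach, in which a gene's stability is the average standard deviation of its pairwise log expression ratios against the other candidates. A compact sketch with invented expression values (the gene names come from the abstract, the numbers are hypothetical):

```python
import math
import statistics

def stability_m(expr):
    """geNorm-style stability measure: for each candidate reference gene,
    the average standard deviation of its log2 expression ratio against
    every other candidate across samples. Lower M = more stable.
    `expr` maps gene name -> list of expression values (one per sample)."""
    genes = list(expr)
    m = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[h])]
            sds.append(statistics.pstdev(ratios))
        m[g] = statistics.fmean(sds)
    return m

# Hypothetical expression across four colony types (arbitrary units)
expr = {
    "Gapdh": [100, 102, 98, 101],    # stable
    "Rn18s": [100, 250, 60, 400],    # unstable
    "Actb":  [50, 51, 49, 50],       # stable
}
m = stability_m(expr)
ranked = sorted(m, key=m.get)
print(ranked[0], "most stable;", ranked[-1], "least stable")
```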

  8. Selection of accurate reference genes in mouse trophoblast stem cells for reverse transcription-quantitative polymerase chain reaction

    PubMed Central

    MOTOMURA, Kaori; INOUE, Kimiko; OGURA, Atsuo

    2016-01-01

    Mouse trophoblast stem cells (TSCs) form colonies of different sizes and morphologies, which might reflect their degrees of differentiation. Therefore, each colony type can have a characteristic gene expression profile; however, the expression levels of internal reference genes may also change, causing fluctuations in their estimated gene expression levels. In this study, we validated seven housekeeping genes by using a geometric averaging method and identified Gapdh as the most stable gene across different colony types. Indeed, when Gapdh was used as the reference, expression levels of Elf5, a TSC marker gene, stringently classified TSC colonies into two groups: a high expression group consisting of type 1 and 2 colonies, and a lower expression group consisting of type 3 and 4 colonies. This clustering was consistent with our putative classification of undifferentiated/differentiated colonies based on their time-dependent colony transitions. By contrast, use of an unstable reference gene (Rn18s) allowed no such clear classification. Cdx2, another TSC marker, did not show any significant colony type-specific expression pattern irrespective of the reference gene. Selection of stable reference genes for quantitative gene expression analysis might be critical, especially when cell lines consisting of heterogeneous cell populations are used. PMID:26853688

  9. Validation of Reference Genes for Accurate Normalization of Gene Expression in Lilium davidii var. unicolor for Real Time Quantitative PCR

    PubMed Central

    Zhang, Jing; Teixeira da Silva, Jaime A.; Wang, ChunXia; Sun, HongMei

    2015-01-01

    Lilium is an important commercial flower bulb. qRT-PCR is an extremely important technique for tracking gene expression levels, and the requirement for suitable reference genes for normalization has become increasingly significant. The expression of internal control genes in living organisms varies considerably under different experimental conditions. For the economically important Lilium, only a limited number of reference genes applied in qRT-PCR have been reported to date. In this study, the expression stability of 12 candidate genes, including α-TUB, β-TUB, ACT, eIF, GAPDH, UBQ, UBC, 18S, 60S, AP4, FP, and RH2, was evaluated in a diverse set of 29 samples representing different developmental processes, three stress treatments (cold, heat, and salt) and different organs. For different organs, the combination of ACT, GAPDH, and UBQ is appropriate, whereas ACT together with AP4, or ACT along with GAPDH, is suitable for normalization of leaves and scales at different developmental stages, respectively. In leaves, scales and roots under stress treatments, FP, ACT and AP4, respectively, showed the most stable expression. This study provides a guide for the selection of a reference gene under different experimental conditions, and will benefit future, more accurate gene expression studies in a wide variety of Lilium genotypes. PMID:26509446

  10. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  11. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  12. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control side of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
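    The statistical comparison here is a paired t-test on side-to-side temperature differences. A minimal sketch; the ten difference values are invented, and the 1.96 multiplier is a normal approximation to the exact t-based 95% CI:

```python
import math
import statistics

def paired_t(diffs):
    """t statistic and mean difference with a normal-approximation 95% CI
    for paired data (operated minus control side). For small samples the
    exact CI would use a t critical value instead of 1.96."""
    n = len(diffs)
    mean = statistics.fmean(diffs)
    se = statistics.stdev(diffs) / math.sqrt(n)
    return mean / se, (mean - 1.96 * se, mean + 1.96 * se)

# Hypothetical day-2 temperature differences (deg C) for ten patients
diffs = [0.4, 0.2, 0.5, 0.1, 0.3, 0.6, 0.2, 0.4, 0.3, 0.3]
t, ci = paired_t(diffs)
print(round(t, 2), tuple(round(v, 2) for v in ci))
```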

  13. In vivo quantitative assessment of catheter patency in rats

    PubMed Central

    Yang, Jun; Maarek, Jean-Michel I; Holschneider, Daniel P

    2014-01-01

    Summary Formation of fibrin sleeves around catheter tips is a central factor in catheter failure during chronic implantation, and such tissue growth can occur despite administration of anticoagulants. We developed a novel method for monitoring catheter patency. This method recognizes the progressive nature of catheter occlusion, and tracks this process over time through measurement of changes in catheter resistance to a standardized 1 mL bolus infusion from a pressurized reservoir. Two indirect measures of catheter patency were used: (a) reservoir residual pressure and (b) reservoir discharge time. This method was applied to the study of catheter patency in rats comparing the effect of catheter material (silastic, polyurethane, Microrenathane™), lock solution (heparin, heparin/dexamethasone) and two different cannulation sites (superior vena cava via the external jugular vein, inferior vena cava via the femoral vein). Our findings reveal that application of flexible smaller-size silastic catheters and a dexamethasone lock solution resulted in prolonged catheter patency. Patency could be maintained over nine weeks with the femoral vein catheters, compared with five weeks with the external jugular vein catheters. The current method for measuring catheter patency provides a useful index for the assessment of tissue growth around the catheter tip. The method also provides an objective and quantitative way of comparing changes in catheter patency for different surgical methods and catheter types. Our method improves on the conventional method of assessing catheter occlusion by judging the ability to aspirate from the catheter. PMID:16004684

  14. The potential optical coherence tomography in tooth bleaching quantitative assessment

    NASA Astrophysics Data System (ADS)

    Ni, Y. R.; Guo, Z. Y.; Shu, S. Y.; Zeng, C. C.; Zhong, H. Q.; Chen, B. L.; Liu, Z. M.; Bao, Y.

    2011-12-01

    In this paper, we report the outcomes of a pilot study on using an OCT functional imaging method to evaluate and quantify color alteration in human teeth in vitro. Images of the dental tissues were obtained before and after treatment with 35% hydrogen peroxide using an OCT system with a 1310 nm central wavelength. One parameter for the quantification of optical properties from OCT measurements is introduced in our study: the attenuation coefficient (μ). A significant decrease of the attenuation coefficient in dentine (p < 0.001), as well as a significant increase in enamel (p < 0.001), was observed during the tooth bleaching process. From the experimental results, it is found that the attenuation coefficient could be useful for assessing color alteration of human tooth samples. OCT has the potential to become an effective tool for the assessment of tooth bleaching. Our experiment offers a new method to evaluate color change in the visible region by quantitative analysis of the infrared-region information from OCT.
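    The attenuation coefficient μ can be estimated from an OCT depth profile by a log-linear fit of a single-exponential (Beer-Lambert) decay. A sketch on synthetic data; real A-scans need windowing, noise handling, and a convention for the round-trip factor:

```python
import math
import statistics

def attenuation_coefficient(depths_mm, intensities):
    """Estimate the attenuation coefficient mu (mm^-1) by a log-linear
    least-squares fit of I(z) = I0 * exp(-mu * z) to an OCT depth
    profile. A simple single-scattering Beer-Lambert sketch; conventions
    (e.g. a round-trip factor of 2) vary between studies."""
    logs = [math.log(i) for i in intensities]
    mz = statistics.fmean(depths_mm)
    ml = statistics.fmean(logs)
    slope = sum((z - mz) * (l - ml) for z, l in zip(depths_mm, logs)) / \
            sum((z - mz) ** 2 for z in depths_mm)
    return -slope

# Synthetic A-scan decaying with mu = 2.0 mm^-1
depths = [0.1 * k for k in range(10)]
signal = [100.0 * math.exp(-2.0 * z) for z in depths]
print(round(attenuation_coefficient(depths, signal), 3))  # recovers 2.0
```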

  15. Towards Alignment Independent Quantitative Assessment of Homology Detection

    PubMed Central

    Kliger, Yossef

    2006-01-01

    Identification of homologous proteins provides a basis for protein annotation. Sequence alignment tools reliably identify homologs sharing high sequence similarity. However, identification of homologs that share low sequence similarity remains a challenge. Lowering the cutoff value could enable the identification of diverged homologs, but also introduces numerous false hits. Methods are being continuously developed to minimize this problem. Estimation of the fraction of homologs in a set of protein alignments can help in the assessment and development of such methods, and provides the users with intuitive quantitative assessment of protein alignment results. Herein, we present a computational approach that estimates the amount of homologs in a set of protein pairs. The method requires a prevalent and detectable protein feature that is conserved between homologs. By analyzing the feature prevalence in a set of pairwise protein alignments, the method can estimate the number of homolog pairs in the set independently of the alignments' quality. Using the HomoloGene database as a standard of truth, we implemented this approach in a proteome-wide analysis. The results revealed that this approach, which is independent of the alignments themselves, works well for estimating the number of homologous proteins in a wide range of homology values. In summary, the presented method can accompany homology searches and method development, provides validation to search results, and allows tuning of tools and methods. PMID:17205117
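    The estimation idea is a two-component mixture: the observed prevalence of a conserved feature in a set of pairs interpolates between its prevalence among true homologs and the background prevalence among non-homologs. A sketch with hypothetical prevalences; the clamping and the numbers are illustrative assumptions:

```python
def homolog_fraction(p_observed, p_in_homologs, p_background):
    """Estimate the fraction of truly homologous pairs in a set of
    alignments from the prevalence of a conserved feature: the observed
    prevalence is a mixture of the prevalence among homologs and the
    background prevalence among non-homologs. A sketch of the mixture
    idea described in the abstract."""
    est = (p_observed - p_background) / (p_in_homologs - p_background)
    return min(1.0, max(0.0, est))  # clamp to a valid proportion

# Feature seen in 90% of known homolog pairs, 10% of random pairs,
# and 50% of the query alignment set (hypothetical numbers):
print(round(homolog_fraction(0.50, 0.90, 0.10), 3))  # ~0.5
```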

  16. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed in as much detail and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along the front, the back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness so that it will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
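    The paper uses a 3D extension of elliptic Fourier analysis; as a simplified sketch (not the authors' exact formulation), harmonic coefficients of a closed, equally sampled 3D outline can be obtained with one DFT per coordinate:

```python
import numpy as np

def fourier_coefficients(points, n_harmonics=10):
    """Harmonic coefficients of a closed 3D outline, one DFT per coordinate.
    `points` is an (N, 3) array sampled at equal arc-length steps; the
    coefficients quantify shape independently of sampling density."""
    pts = np.asarray(points, dtype=float)
    coeffs = np.fft.rfft(pts, axis=0) / len(pts)   # per-axis spectrum
    return coeffs[1:n_harmonics + 1]               # drop DC term (centroid)

# A planar circle: all shape information sits in the first harmonic.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
circle = np.stack([np.cos(t), np.sin(t), np.zeros_like(t)], axis=1)
c = fourier_coefficients(circle, n_harmonics=3)
print(round(abs(c[0, 0]), 3))  # first-harmonic amplitude of x → 0.5
```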

  17. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    DOE PAGES Beta

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; Parks, Jerry M.

    2016-07-08

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for Co(III), Co(II), and Co(I) species, respectively, and the second model features saturation of each vacant axial coordination site on Co(II) and Co(I) species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co axial ligand binding, leading to substantial errors in predicted

  18. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid-Base and Ligand Binding Equilibria of Aquacobalamin.

    PubMed

    Johnston, Ryne C; Zhou, Jing; Smith, Jeremy C; Parks, Jerry M

    2016-08-01

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co-ligand binding equilibrium constants (Kon/off), pKas, and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for Co(III), Co(II), and Co(I) species, respectively, and the second model features saturation of each vacant axial coordination site on Co(II) and Co(I) species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. 
Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co-axial ligand binding, leading to substantial errors in predicted pKas and
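    The conversions behind the reported quantities are standard thermodynamics rather than anything specific to this paper: a computed reduction free energy maps to a potential via E = -ΔG/(nF) minus an assumed absolute SHE potential, and a deprotonation free energy maps to pKa via ΔG/(RT ln 10). A minimal sketch (the SHE value of 4.28 V is one of several literature conventions):

```python
import math

F = 96485.332   # Faraday constant, C/mol
R = 8.314462    # gas constant, J/(mol*K)
T = 298.15      # temperature, K
E_SHE_ABS = 4.28  # assumed absolute SHE potential, V (4.24-4.44 V in the literature)

def reduction_potential(delta_g_kj_mol, n_electrons=1):
    """Standard reduction potential (V vs SHE) from the computed free
    energy of reduction (kJ/mol, electron-attachment convention)."""
    return -delta_g_kj_mol * 1000.0 / (n_electrons * F) - E_SHE_ABS

def pka(delta_g_deprot_kj_mol):
    """pKa from the computed aqueous deprotonation free energy (kJ/mol)."""
    return delta_g_deprot_kj_mol * 1000.0 / (R * T * math.log(10))

# A deprotonation free energy of 40 kJ/mol corresponds to pKa ≈ 7.0
print(round(pka(40.0), 1))  # → 7.0
```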

  19. NON-INVASIVE RADIOIODINE IMAGING FOR ACCURATE QUANTITATION OF NIS REPORTER GENE EXPRESSION IN TRANSPLANTED HEARTS

    PubMed Central

    Ricci, Davide; Mennander, Ari A; Pham, Linh D; Rao, Vinay P; Miyagi, Naoto; Byrne, Guerard W; Russell, Stephen J; McGregor, Christopher GA

    2008-01-01

    Objectives We studied the concordance of transgene expression in the transplanted heart using bicistronic adenoviral vectors coding for a transgene of interest (human carcinoembryonic antigen: hCEA; beta human chorionic gonadotropin: βhCG) and for a marker imaging transgene (human sodium iodide symporter: hNIS). Methods Inbred Lewis rats were used for syngeneic heterotopic cardiac transplantation. Donor rat hearts were perfused ex vivo for 30 minutes prior to transplantation with University of Wisconsin (UW) solution (n=3), or with 10⁹ pfu/ml of adenovirus expressing hNIS (Ad-NIS; n=6), hNIS-hCEA (Ad-NIS-CEA; n=6) or hNIS-βhCG (Ad-NIS-CG; n=6). On post-operative days (POD) 5, 10 and 15, all animals underwent micro-SPECT/CT imaging of the donor hearts after tail vein injection of 1000 μCi ¹²³I, with blood sample collection for hCEA and βhCG quantification. Results Significantly higher image intensity was noted in the hearts perfused with Ad-NIS (1.1±0.2; 0.9±0.07), Ad-NIS-CEA (1.2±0.3; 0.9±0.1) and Ad-NIS-CG (1.1±0.1; 0.9±0.1) compared to the UW group (0.44±0.03; 0.47±0.06) on POD 5 and 10 (p<0.05). Serum levels of hCEA and βhCG increased in animals showing high cardiac ¹²³I uptake, but not in those with lower uptake. Above this threshold, image intensities correlated well with serum levels of hCEA and βhCG (R²=0.99 and R²=0.96, respectively). Conclusions These data demonstrate that hNIS is an excellent reporter gene for the transplanted heart. The expression level of hNIS can be accurately and non-invasively monitored by serial radioisotopic single photon emission computed tomography (SPECT) imaging. High concordance was demonstrated between imaging and soluble marker peptides at the maximum transgene expression on POD 5. PMID:17980613

  20. Precise and accurate assessment of uncertainties in model parameters from stellar interferometry. Application to stellar diameters

    NASA Astrophysics Data System (ADS)

    Lachaume, Regis; Rabus, Markus; Jordan, Andres

    2015-08-01

    In stellar interferometry, the assumption that the observables can be seen as Gaussian, independent variables is the norm. In particular, neither the optical interferometry FITS (OIFITS) format nor the most popular fitting software in the field, LITpro, offers means to specify a covariance matrix or non-Gaussian uncertainties. Interferometric observables are correlated by construction, though. Also, the calibration by an instrumental transfer function ensures that the resulting observables are not Gaussian, even if the uncalibrated ones happened to be so. While analytic frameworks have been published in the past, they are cumbersome and there is no generic implementation available. We propose here a relatively simple way of dealing with correlated errors without the need to extend the OIFITS specification or to make Gaussian assumptions. By repeatedly picking at random which interferograms and which calibrator stars are used, and what the errors on their diameters are, and performing the data processing on the bootstrapped data, we derive a sampling of p(O), the multivariate probability density function (PDF) of the observables O. The results can be stored in a normal OIFITS file. Then, given a model m with parameters P predicting observables O = m(P), we can estimate the PDF of the model parameters f(P) = p(m(P)) by using a density estimation of the observables' PDF p. With observations repeated over different baselines, on nights several days apart, and with a significant set of calibrators, systematic errors are de facto taken into account. We apply the technique to a precise and accurate assessment of stellar diameters obtained at the Very Large Telescope Interferometer with PIONIER.
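    The paper bootstraps over interferograms, calibrators and diameter errors; the generic pattern (resample with replacement, refit, collect the parameter distribution) can be sketched as follows, with a toy "fit" standing in for the real model:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_parameter(observables, fit_model, n_boot=1000):
    """Sample the distribution of a model parameter by resampling the
    observables with replacement and refitting, instead of assuming
    Gaussian, independent errors."""
    observables = np.asarray(observables)
    estimates = []
    for _ in range(n_boot):
        resample = rng.choice(observables, size=len(observables), replace=True)
        estimates.append(fit_model(resample))
    return np.asarray(estimates)

# Toy 'fit': the mean of squared-visibility measurements.
vis2 = rng.normal(0.8, 0.05, size=40)
samples = bootstrap_parameter(vis2, np.mean)
print(samples.std() < 0.05)  # bootstrap spread of the mean < raw scatter
```

The returned sample plays the role of p(O): any density estimate or percentile interval can be read off it directly, with no Gaussian assumption.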

  1. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  2. Assessment of RT-qPCR Normalization Strategies for Accurate Quantification of Extracellular microRNAs in Murine Serum

    PubMed Central

    Roberts, Thomas C.; Coenen-Stass, Anna M. L.; Wood, Matthew J. A.

    2014-01-01

    Extracellular microRNAs (miRNAs) are under investigation as minimally-invasive biomarkers for a wide range of disease conditions. We have recently shown in a mouse model of the progressive muscle-wasting condition Duchenne muscular dystrophy (DMD) that a set of highly elevated serum miRNAs reflects the regenerative status of muscle. These miRNAs are promising biomarkers for monitoring DMD disease progression and the response to experimental therapies. The gold standard miRNA detection methodology is Reverse Transcriptase-quantitative Polymerase Chain Reaction (RT-qPCR), which typically exhibits high sensitivity and wide dynamic range. Accurate determination of miRNA levels is affected by RT-qPCR normalization method and therefore selection of the optimal strategy is of critical importance. Serum miRNA abundance was measured by RT-qPCR array in 14 week old mice, and by individual RT-qPCR assays in a time course experiment spanning 48 weeks. Here we utilize these two datasets to assess the validity of three miRNA normalization strategies (a) normalization to the average of all Cq values from array experiments, (b) normalization to a stably expressed endogenous reference miRNA, and (c) normalization to an external spike-in synthetic oligonucleotide. Normalization approaches based on endogenous control miRNAs result in an under-estimation of miRNA levels by a factor of ∼2. An increase in total RNA and total miRNA was observed in dystrophic serum which may account for this systematic bias. We conclude that the optimal strategy for this model system is to normalize to a synthetic spike-in control oligonucleotide. PMID:24586621
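    Normalization to a spike-in control reduces to the standard 2^-ΔCq transformation, sketched here (the Cq values are hypothetical):

```python
def relative_level(cq_target, cq_spike):
    """Relative miRNA level normalized to a synthetic spike-in
    oligonucleotide, using the standard 2^-deltaCq transformation."""
    return 2.0 ** -(cq_target - cq_spike)

# A target miRNA detected 3 cycles earlier than the spike-in is
# ~8-fold more abundant than the spike-in reference.
print(relative_level(22.0, 25.0))  # → 8.0
```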

  3. Quantitative Assessment of Murine Articular Cartilage and Bone Using X-Ray Phase-Contrast Imaging

    PubMed Central

    Li, Jun; Yuan, Huihui; Wu, Mingshu; Dong, Linan; Zhang, Lu; Shi, Hongli; Luo, Shuqian

    2014-01-01

    Murine models for rheumatoid arthritis (RA) research can provide important insights for understanding RA pathogenesis and evaluating the efficacy of novel treatments. However, simultaneously imaging both murine articular cartilage and subchondral bone using conventional techniques is challenging because of low spatial resolution and poor soft tissue contrast. X-ray phase-contrast imaging (XPCI) is a new technique that offers high spatial resolution for the visualisation of cartilage and skeletal tissues. The purpose of this study was to utilise XPCI to observe articular cartilage and subchondral bone in a collagen-induced arthritis (CIA) murine model and quantitatively assess changes in the joint microstructure. XPCI was performed on the two treatment groups (the control group and CIA group, n = 9 per group) to monitor the progression of damage to the femur from the knee joint in a longitudinal study (at 0, 4 and 8 weeks after primary injection). For quantitative assessment, morphologic parameters were measured in three-dimensional (3D) images using appropriate image analysis software. Our results showed that the average femoral cartilage volume, surface area and thickness were significantly decreased (P<0.05) in the CIA group compared to the control group. Meanwhile, these decreases were accompanied by obvious destruction of the surface of subchondral bone and a loss of trabecular bone in the CIA group. This study confirms that XPCI technology has the ability to qualitatively and quantitatively evaluate microstructural changes in mouse joints. This technique has the potential to become a routine analysis method for accurately monitoring joint damage and comprehensively assessing treatment efficacy. PMID:25369528

  4. Quantitative assessment of computational models for retinotopic map formation.

    PubMed

    Hjorth, J J Johannes; Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2015-06-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3(ki/ki), Isl2-EphA3(ki/+), ephrin-A2,A3,A5 triple knock-out (TKO), and Math5(-/-) (Atoh7). Two models successfully reproduced the extent of the Math5(-/-) anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3(ki/+). The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. PMID:25367067

  5. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    ABSTRACT Molecular and activity‐based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity‐based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2‐EphA3 ki/ki, Isl2‐EphA3 ki/+, ephrin‐A2,A3,A5 triple knock‐out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2‐EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin‐A2,A3,A5 TKO phenotype, suggesting either an incomplete knock‐out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  6. A quantitative assessment of chemical perturbations in thermotropic cyanobiphenyls.

    PubMed

    Guerra, Sebastiano; Dutronc, Thibault; Terazzi, Emmanuel; Guénée, Laure; Piguet, Claude

    2016-05-25

    Chemical programming of the temperature domains of existence of liquid crystals is greatly desired by both academic workers and industrial partners. This contribution proposes to combine empirical approaches, which rely on systematic chemical substitutions of mesogenic molecules followed by thermal characterizations, with a rational thermodynamic assessment of the effects induced by chemical perturbations. Taking into account the similarities which exist between temperature-dependent cohesive Gibbs free energy densities (CFEDs) and pressure-temperature phase diagrams modeled with the Clapeyron equation, chemical perturbations are considered as pressure increments along phase boundaries, which control the thermotropic liquid crystalline properties. Taking the familiar calamitic amphiphilic cyanobiphenyl-type mesogens as models, the consequences of (i) methyl substitution of the aromatic polar heads and (ii) connections of bulky silyl groups at the termini of the apolar flexible alkyl chain on the melting and clearing temperatures are quantitatively analyzed. Particular efforts were focused on the translation of the thermodynamic rationalization into a predictive tool accessible to synthetic chemists mainly interested in designing liquid crystals with specific technological applications. PMID:27173940

  7. Quantitative assessment of healthy and reconstructed cleft lip using ultrasonography

    PubMed Central

    Devadiga, Sumana; Desai, Anil Kumar; Joshi, Shamsunder; Gopalakrishnan, K.

    2016-01-01

    Purpose: This study was conducted to investigate the feasibility of echographic imaging of tissue thickness in healthy and reconstructed cleft lips. Design: Prospective study. Materials and Methods: The study was conducted in the SDM Craniofacial Unit, Dharwad, and was approved by the Local Institutional Review Board. A total of 30 patients aged 4 to 25 years were enrolled, of whom 15 with postoperative unilateral cleft lip constituted the test group; the remaining 15, with no cleft deformities and no gross facial asymmetry, constituted the control group. The thicknesses of the mucosa, submucosa and muscle, and the full thickness of the upper lip, were measured on transversal ultrasonographic images at the midpoint of the philtrum, the right and left philtral ridges and the vermillion border, at 1-, 3- and 6-month intervals. Results: There was an increase in muscle thickness at the vermillion border (mean = 6.9 mm) and the philtral ridge (5.9 mm). Equal muscle thickness was found between the normal and test groups at the 6-month follow-up in a relaxed position, which was statistically significant (P = 0.0404). Conclusion: Quantitative assessment of the thickness and echo levels of various lip tissues can be achieved with proper echographic calibration. This study demonstrated the diagnostic potential of this method for the noninvasive evaluation of cleft lip reconstructions. PMID:27134448

  8. Assessment of breast tumor margins via quantitative diffuse reflectance imaging

    NASA Astrophysics Data System (ADS)

    Brown, J. Quincy; Bydlon, Torre M.; Kennedy, Stephanie A.; Geradts, Joseph; Wilke, Lee G.; Barry, William; Richards, Lisa M.; Junker, Marlee K.; Gallagher, Jennifer; Ramanujam, Nimmi

    2010-02-01

    A particular application of interest for tissue reflectance spectroscopy in the UV-visible is intraoperative detection of residual cancer at the margins of excised breast tumors, which could prevent costly and unnecessary repeat surgeries. Our multi-disciplinary group has developed an optical imaging device which is capable of surveying the entire specimen surface down to a depth of 1-2 mm, all within the short time required for intraoperative use. In an IRB-approved study, reflectance spectral images were acquired from 54 margins in 48 patients. Conversion of the spectral images to quantitative tissue parameter maps was facilitated by a fast, scalable inverse Monte Carlo model. Data from margin parameter images were reduced to image-descriptive scalar values and compared to gold-standard margin pathology. The utility of the device for classification of margins was determined via a conditional inference tree modeling approach, and was assessed both as a function of the type of disease present at the margin and as a function of the distance of disease from the tissue surface. Additionally, the influence of breast density on the diagnostic parameters, as well as the accuracy of the device, was evaluated.

  9. Quantitative assessment of computed radiography quality control parameters.

    PubMed

    Rampado, O; Isoardi, P; Ropolo, R

    2006-03-21

    Quality controls for testing the performance of computed radiography (CR) systems have been recommended by manufacturers and medical physicists' organizations. The purpose of this work was to develop a set of image processing tools for quantitative assessment of computed radiography quality control parameters. Automatic image analysis consisted in detecting phantom details, defining regions of interest and acquiring measurements. The tested performance characteristics included dark noise, uniformity, exposure calibration, linearity, low-contrast and spatial resolution, spatial accuracy, laser beam function and erasure thoroughness. CR devices from two major manufacturers were evaluated. We investigated several approaches to quantify the detector response uniformity. We developed methods to characterize the spatial accuracy and resolution properties across the entire image area, based on the Fourier analysis of the image of a fine wire mesh. The implemented methods were sensitive to local blurring and allowed us to detect a local distortion of 4% or greater in any part of an imaging plate. The obtained results showed that the developed image processing tools allow us to implement a quality control program for CR with short processing time and with absence of subjectivity in the evaluation of the parameters. PMID:16510964
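    The abstract describes Fourier analysis of a fine wire mesh image to characterize spatial accuracy; the exact pipeline is not given, but the core idea, tracking the mesh's dominant spatial frequency, whose local shifts indicate distortion, can be sketched with a hypothetical 1D profile and pixel size:

```python
import numpy as np

def mesh_frequency(profile, dx):
    """Dominant spatial frequency (lp/mm) of a 1D profile across a wire
    mesh image; local shifts of this peak indicate spatial distortion."""
    spectrum = np.abs(np.fft.rfft(profile - np.mean(profile)))
    freqs = np.fft.rfftfreq(len(profile), dx)
    return freqs[np.argmax(spectrum)]

# A 2 lp/mm mesh sampled at 0.1 mm pixels over 50 mm:
x = np.arange(0, 50, 0.1)
profile = np.sin(2 * np.pi * 2.0 * x)
print(mesh_frequency(profile, 0.1))  # → 2.0
```

Comparing the recovered frequency in sub-regions of the plate against the nominal mesh pitch is one way such a test could flag local distortion.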

  10. A Quantitative Measure of Handwriting Dysfluency for Assessing Tardive Dyskinesia

    PubMed Central

    Caligiuri, Michael P.; Teulings, Hans-Leo; Dean, Charles E.; Lohr, James B.

    2015-01-01

    Tardive dyskinesia (TD) is a movement disorder commonly associated with chronic exposure to antidopaminergic medications, which may in some cases be disfiguring and socially disabling. The consensus from a growing body of research on the incidence and prevalence of TD in the modern era of antipsychotics indicates that this disorder has not disappeared and continues to challenge the effective management of psychotic symptoms in patients with schizophrenia. A fundamental component of an effective strategy for managing TD is its reliable and accurate assessment. In the present study, we examined the clinical utility of a brief handwriting dysfluency measure for quantifying TD. Digitized samples of handwritten circles and loops were obtained from 62 psychosis patients with or without TD and from 50 healthy subjects. Two measures of dysfluent pen movements were extracted from each vertical pen stroke: normalized jerk and the number of acceleration peaks. TD patients exhibited significantly higher dysfluency scores than non-TD patients and controls. Severity of handwriting movement dysfluency was correlated with AIMS severity ratings for some tasks. The procedure yielded high degrees of test-retest reliability. These results suggest that measures of handwriting movement dysfluency may be particularly useful for objectively evaluating the efficacy of pharmacotherapeutic strategies for treating TD. PMID:25679121
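    Normalized jerk is a standard dysfluency measure in the handwriting literature; a common formulation (not necessarily the authors' exact implementation) scales the integrated squared third derivative by stroke duration and length so that the result is dimensionless:

```python
import numpy as np

def normalized_jerk(y, dt):
    """Dimensionless jerk of a 1D pen stroke (higher = less fluent).
    Uses the common normalization sqrt(0.5 * integral(j^2) * D^5 / L^2),
    with D the stroke duration and L the path length."""
    v = np.gradient(y, dt)              # velocity
    a = np.gradient(v, dt)              # acceleration
    j = np.gradient(a, dt)              # jerk (third derivative)
    duration = dt * (len(y) - 1)
    length = np.sum(np.abs(np.diff(y)))
    return np.sqrt(0.5 * np.sum(j ** 2) * dt * duration ** 5 / length ** 2)

# A smooth stroke scores lower than the same stroke with added tremor.
t = np.linspace(0, 1, 200)
smooth = np.sin(np.pi * t)
tremor = smooth + 0.01 * np.sin(40 * np.pi * t)
print(normalized_jerk(smooth, t[1]) < normalized_jerk(tremor, t[1]))  # → True
```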

  11. Quantitative assessment of the effectiveness of a rockfall warning system

    NASA Astrophysics Data System (ADS)

    Bründl, Michael; Sättele, Martina; Krautblatter, Michael; Straub, Daniel

    2016-04-01

    Rockslides and rockfalls can pose a high risk to human settlements and traffic infrastructure. In addition to structural mitigation measures like rockfall nets, warning systems are increasingly installed to reduce rockfall risks. Whereas a structured evaluation method exists for structural mitigation measures that reduce the spatial extent of events, only a few approaches to assess the effectiveness of warning systems are known. Especially for higher-magnitude rockfalls, structural mitigation measures are not effective, and reliable early warning systems will be essential in the future. In response, we developed a classification and a framework to assess the reliability and effectiveness of early warning systems (Sättele et al., 2015a; 2016). Here, we demonstrate an application for the rockfall warning system installed in Preonzo prior to a major rockfall in May 2012 (Sättele et al., 2015b). We show that it is necessary to design such a warning system as a fail-safe construction, which has to incorporate components with low failure probabilities, high redundancy, low warning thresholds, and additional control systems. With a hypothetical probabilistic analysis, we investigate the effect of the risk attitude of decision makers and of the number of sensors on the probability of detecting an event and initiating a timely evacuation, as well as on the related intervention cost. We conclude that it is possible to quantitatively assess the effectiveness of warning systems, which helps to optimize mitigation strategies against rockfall events. References Sättele, M., Bründl, M., and Straub, D.: Reliability and effectiveness of warning systems for natural hazards: concept and application to debris flow warning, Rel. Eng. Syst. Safety, 142, 192-202, 2015a. Sättele, M., Krautblatter, M., Bründl, M., and Straub, D.: Forecasting rock slope failure: How reliable and effective are warning systems?, Landslides, 605, 1-14, 2015b. Sättele, M., Bründl, M., and
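    The effect of sensor redundancy on detection probability, one ingredient of the probabilistic analysis described above, can be sketched with a simple binomial model (assuming independent sensors, which the real analysis need not):

```python
from math import comb

def detection_probability(p_sensor, n_sensors, threshold=1):
    """Probability that at least `threshold` of n independent sensors
    detect an event, each with per-sensor detection probability p_sensor.
    threshold=1 models a redundant, fail-safe 'any sensor trips' design."""
    return sum(
        comb(n_sensors, k) * p_sensor**k * (1 - p_sensor)**(n_sensors - k)
        for k in range(threshold, n_sensors + 1)
    )

# Three redundant sensors at 90% each: the system misses only 0.1% of events.
print(round(detection_probability(0.9, 3), 3))  # → 0.999
```

Raising the voting threshold trades missed events against false alarms, which is where the decision maker's risk attitude enters.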

  12. PET optimization for improved assessment and accurate quantification of ⁹⁰Y-microsphere biodistribution after radioembolization

    SciTech Connect

    Martí-Climent, Josep M. Prieto, Elena; Elosúa, César; Rodríguez-Fraile, Macarena; Domínguez-Prado, Inés; Vigil, Carmen; García-Velloso, María J.; Arbizu, Javier; Peñuelas, Iván; Richter, José A.

    2014-09-15

    Purpose: ⁹⁰Y-microspheres are widely used for the radioembolization of metastatic liver cancer or hepatocellular carcinoma, and there is growing interest in imaging ⁹⁰Y-microspheres with PET. The aim of this study is to evaluate the performance of a current-generation PET/CT scanner for ⁹⁰Y imaging and to optimize the PET protocol to improve the assessment and quantification of ⁹⁰Y-microsphere biodistribution after radioembolization. Methods: Data were acquired on a Biograph mCT-TrueV scanner with time of flight (TOF) and point spread function (PSF) modeling. Spatial resolution was measured with a ⁹⁰Y point source. Sensitivity was evaluated using the NEMA 70 cm line source filled with ⁹⁰Y. To evaluate the count rate performance, ⁹⁰Y vials with activity ranging from 3.64 to 0.035 GBq were measured in the center of the field of view (CFOV). The energy spectrum was evaluated. Image quality with different reconstructions was studied using the Jaszczak phantom containing six hollow spheres (diameters: 31.3, 28.1, 21.8, 16.1, 13.3, and 10.5 mm), filled with a 207 kBq/ml ⁹⁰Y concentration and a 5:1 sphere-to-background ratio. Acquisition time was adjusted to simulate the quality of a realistic clinical PET acquisition of a patient treated with SIR-Spheres®. The developed methodology was applied to ten patients after SIR-Spheres® treatment, acquiring a 10 min per bed PET. Results: The energy spectrum showed the ⁹⁰Y bremsstrahlung radiation. The ⁹⁰Y transverse resolution, with filtered backprojection reconstruction, was 4.5 mm in the CFOV and degraded to 5.0 mm at 10 cm off-axis. The ⁹⁰Y absolute sensitivity was 0.40 kcps/MBq in the center of the field of view. The tendencies of the true and random rates as a function of ⁹⁰Y activity could be accurately described using linear and quadratic models, respectively. Phantom studies demonstrated that, due to low count statistics in ⁹⁰Y PET
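    The linear and quadratic count-rate models mentioned in the abstract can be fitted straightforwardly; the activity and rate values below are synthetic, not the paper's data:

```python
import numpy as np

# Count rates vs activity: trues are modeled as linear in activity,
# randoms as quadratic (singles scale with activity, randoms ~ singles^2).
activity = np.array([0.25, 0.5, 1.0, 2.0, 3.0])   # GBq (synthetic)
trues = 0.4 * activity                            # kcps (synthetic)
randoms = 0.05 * activity**2                      # kcps (synthetic)

t_fit = np.polyfit(activity, trues, 1)            # [slope, intercept]
r_fit = np.polyfit(activity, randoms, 2)          # [a, b, c]
print(round(t_fit[0], 2), round(r_fit[0], 2))     # → 0.4 0.05
```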

  13. Disaster Metrics: A Proposed Quantitative Assessment Tool in Complex Humanitarian Emergencies - The Public Health Impact Severity Scale (PHISS)

    PubMed Central

    Bayram, Jamil D.; Kysia, Rashid; Kirsch, Thomas D.

    2012-01-01

    Background: Complex Humanitarian Emergencies (CHE) result in rapid degradation of population health and quickly overwhelm indigenous health resources. Numerous governmental, non-governmental, national and international organizations and agencies are involved in the assessment of post-CHE affected populations. To date, there is no entirely quantitative assessment tool conceptualized to measure the public health impact of CHE. Methods: Essential public health parameters in CHE were identified based on the Sphere Project "Minimum Standards", and scoring rubrics were proposed based on the prevailing evidence when applicable. Results: 12 quantitative parameters were identified, representing the four categories of “Minimum Standards for Disaster Response” according to the Sphere Project; health, shelter, food and nutrition, in addition to water and sanitation. The cumulative tool constitutes a quantitative scale, referred to as the Public Health Impact Severity Scale (PHISS), and the score on this scale ranges from a minimum of 0 to a maximum of 100. Conclusion: Quantitative measurement of the public health impact of CHE is germane to accurate assessment, in order to identify the scale and scope of the critical response required for the relief efforts of the affected populations. PHISS is a new conceptual metric tool, proposed to add an objective quantitative dimension to the post-CHE assessment arsenal. PHISS has not yet been validated, and studies are needed with prospective data collection to test its validity, feasibility and reliability. Citation: Bayram JD, Kysia R, Kirsch TD. Disaster Metrics: A Proposed Quantitative Assessment Tool in Complex Humanitarian Emergencies – The Public Health Impact Severity Scale (PHISS). PLOS Currents Disasters. 2012 Aug 21. doi: 10.1371/4f7b4bab0d1a3. PMID:22984643

  14. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  15. How many standard area diagram sets are needed for accurate disease severity assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standard area diagram sets (SADs) are widely used in plant pathology: a rater estimates disease severity by comparing an unknown sample to actual severities in the SADs and interpolates an estimate as accurately as possible (although some SADs have been developed for categorizing disease too). Most ...

  16. The U.S. Department of Agriculture Automated Multiple-Pass Method accurately assesses sodium intakes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accurate and practical methods to monitor sodium intake of the U.S. population are critical given current sodium reduction strategies. While the gold standard for estimating sodium intake is the 24 hour urine collection, few studies have used this biomarker to evaluate the accuracy of a dietary ins...

  17. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, they have emerged as parasites of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used, additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 × 10⁻⁴ per year (the limit set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used. PMID:19945145
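    The structure of such a Monte Carlo assessment can be sketched in a few lines. Below, the exponential dose-response model P = 1 - exp(-r·dose) with r ≈ 0.0042 (a commonly cited Cryptosporidium parameter) is combined with an assumed lognormal raw-water concentration and a treatment log-removal credit; all input values are illustrative, not the paper's:

```python
import math
import random

def annual_infection_risk(mean_oocysts_per_litre, log10_removal,
                          litres_per_day=1.0, r=0.0042,
                          days=365, iters=10_000, seed=1):
    """Monte Carlo estimate of annual Cryptosporidium infection risk.

    Daily dose = raw-water concentration x 10^(-log removal) x intake;
    daily risk uses the exponential dose-response model, then is
    compounded over a year of independent daily exposures.
    """
    rng = random.Random(seed)
    annual = []
    for _ in range(iters):
        # Lognormal variability in raw-water concentration (assumed).
        conc = rng.lognormvariate(math.log(mean_oocysts_per_litre), 0.5)
        dose = conc * 10.0 ** (-log10_removal) * litres_per_day
        p_day = 1.0 - math.exp(-r * dose)
        annual.append(1.0 - (1.0 - p_day) ** days)
    return sum(annual) / len(annual)

risk_treated = annual_infection_risk(1.0, log10_removal=3.0)
risk_failing = annual_infection_risk(1.0, log10_removal=1.0)
```

As expected, the simulated risk falls as the treatment log-removal credit rises; a treatment-failure scenario can be modelled by dropping the removal credit in a fraction of iterations.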

  18. Quantitative ultrasound (QUS) assessment of tissue properties for Achilles tendons

    NASA Astrophysics Data System (ADS)

    Du, Yi-Chun; Chen, Yung-Fu; Chen, Pei-Jarn; Lin, Yu-Ching; Chen, Tainsong; Lin, Chii-Jeng

    2007-09-01

    Quantitative ultrasound (QUS) techniques have recently been widely applied for the characterization of tissues. For example, they can be used for the quantification of Achilles tendon properties based on the broadband ultrasound attenuation (BUA) and the speed of sound (SOS) when the ultrasound wave passes through the tissues. This study develops an integrated system to investigate the properties of Achilles tendons using QUS images from UBIS 5000 (DMS, Montpellier, France) and B-mode ultrasound images from HDI 5000 (ATL, Ultramark, USA). Subjects in young (32 females and 17 males; mean age: 23.7 ± 2.0 years) and middle-aged groups (8 females and 8 males; mean age: 47.3 ± 8.5 years) were recruited and tested for this study. Only subjects who did not exercise regularly and had no record of tendon injury were studied. The results show that the BUA is significantly higher for the young group (45.2 ± 1.6 dB MHz⁻¹) than the middle-aged group (40.5 ± 1.9 dB MHz⁻¹), while the SOS is significantly lower for the young (1601.9 ± 11.2 m s⁻¹) compared to the middle-aged (1624.1 ± 8.7 m s⁻¹). On the other hand, the thicknesses of Achilles tendons for both groups (young: 4.31 ± 0.23 mm; middle-aged: 4.24 ± 0.23 mm) are very similar. For one patient who had an Achilles tendon lengthening (ATL) surgery, the thickness of the Achilles tendon increased from 4 mm to 4.33 mm after the surgery. In addition, the BUA increased by about 7.2% while the SOS decreased by about 0.6%. In conclusion, noninvasive ultrasonic assessment of Achilles tendons is useful for assisting clinical diagnosis and for the evaluation of a therapeutic regimen.
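    Both QUS metrics reduce to simple arithmetic on the through-transmission measurement: BUA is the least-squares slope of attenuation versus frequency over the broadband range, and SOS is path length divided by transit time. A sketch in plain Python (no claim is made about the UBIS 5000's internal processing):

```python
def bua_slope(freqs_mhz, atten_db):
    """Broadband ultrasound attenuation: least-squares slope, in dB/MHz,
    of attenuation against frequency over the analysis band."""
    n = len(freqs_mhz)
    mean_f = sum(freqs_mhz) / n
    mean_a = sum(atten_db) / n
    num = sum((f - mean_f) * (a - mean_a) for f, a in zip(freqs_mhz, atten_db))
    den = sum((f - mean_f) ** 2 for f in freqs_mhz)
    return num / den

def speed_of_sound(path_m, transit_s):
    """SOS through the tissue: propagation path length / transit time."""
    return path_m / transit_s

slope = bua_slope([0.2, 0.4, 0.6], [10.0, 20.0, 30.0])  # ≈ 50 dB/MHz
sos = speed_of_sound(0.004, 2.5e-6)                      # ≈ 1600 m/s
```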

  19. A Quantitative Assessment Method for Ascaris Eggs on Hands

    PubMed Central

    Jeandron, Aurelie; Ensink, Jeroen H. J.; Thamsborg, Stig M.; Dalsgaard, Anders; Sengupta, Mita E.

    2014-01-01

    The importance of hands in the transmission of soil-transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different ages, lower levels of contamination and various levels of hand cleanliness. PMID:24802859
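    The recovery rates above are the plain ratio of eggs counted after the wash to eggs seeded on the hands, expressed as a percentage:

```python
def egg_recovery_rate(eggs_recovered, eggs_seeded):
    """Percentage of seeded eggs recovered by a washing protocol."""
    if eggs_seeded <= 0:
        raise ValueError("eggs_seeded must be positive")
    return 100.0 * eggs_recovered / eggs_seeded

print(egg_recovery_rate(478, 500))  # → 95.6
```

(478/500 is an illustrative count that reproduces the 95.6% figure, not data from the study.)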

  20. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  1. QUANTITATIVE CANCER RISK ASSESSMENT METHODOLOGY USING SHORT-TERM GENETIC BIOASSAYS: THE COMPARATIVE POTENCY METHOD

    EPA Science Inventory

    Quantitative risk assessment is fraught with many uncertainties. The validity of the assumptions underlying the methods employed are often difficult to test or validate. Cancer risk assessment has generally employed either human epidemiological data from relatively high occupatio...

  2. Noninvasive Qualitative and Quantitative Assessment of Spoilage Attributes of Chilled Pork Using Hyperspectral Scattering Technique.

    PubMed

    Zhang, Leilei; Peng, Yankun

    2016-08-01

    The objective of this research was to develop a rapid noninvasive method for quantitative and qualitative determination of chilled pork spoilage. Microbiological, physicochemical, and organoleptic characteristics such as the total viable count (TVC), Pseudomonas spp., total volatile basic nitrogen (TVB-N), pH value, and color parameter L* were determined to appraise pork quality. The hyperspectral scattering characteristics of 54 meat samples were accurately fitted by a four-parameter modified Gompertz function. A support vector machine (SVM) was applied to establish a quantitative prediction model between the scattering fitting parameters and the reference values. In addition, partial least squares discriminant analysis (PLS-DA) and Bayesian analysis were utilized as classification techniques for the qualitative identification of meat spoilage. All stored chilled meat samples were classified into three grades: "fresh," "semi-fresh," and "spoiled." The Bayesian classification model was superior to PLS-DA, with an overall classification accuracy of 92.86%. The results demonstrated that the hyperspectral scattering technique combined with SVM and Bayesian analysis possesses a powerful capability for rapid, noninvasive meat spoilage assessment. PMID:27340214
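    As a sketch of the curve-fitting step, a four-parameter modified Gompertz function can be fitted to a scattering profile with SciPy (this particular parameterisation and the synthetic data are assumptions for illustration; the paper's exact functional form may differ):

```python
import numpy as np
from scipy.optimize import curve_fit

def modified_gompertz(x, alpha, beta, delta, epsilon):
    # One common four-parameter modified Gompertz form (assumed here).
    return alpha + beta * np.exp(-np.exp(-(x - delta) / epsilon))

# Synthetic "scattering profile" generated from known parameters.
x = np.linspace(0.0, 10.0, 50)
true_params = (1.0, 5.0, 4.0, 1.5)
y = modified_gompertz(x, *true_params)

# Recover the parameters from the profile; per-sample fitted parameters
# would then be the compact feature vector fed to the SVM model.
fitted, _ = curve_fit(modified_gompertz, x, y, p0=(0.5, 4.0, 3.0, 1.0))
```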

  3. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

    Adams, S.M. ); Brown, A.M. ); Goede, R.W. )

    1993-01-01

    The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina) that is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina) that receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing this index to other types of fish health measures (contaminant, bioindicator, and reproductive analyses) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as did each of the other more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.
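    The arithmetic behind such an index is deliberately simple: each organ or tissue receives a numeric value reflecting the severity of damage, and the values are summed so that means can be compared statistically across sites. A sketch with hypothetical variables and increments (the published HAI defines the actual ones):

```python
def health_assessment_index(organ_scores):
    """Sum per-organ severity values into one index (higher = poorer health)."""
    return sum(organ_scores.values())

# Hypothetical necropsy scores: 0 = normal, 10/20/30 = increasing damage.
fish = {"gills": 0, "liver": 30, "spleen": 0, "kidney": 10, "fins": 0}
print(health_assessment_index(fish))  # → 40
```

Site-level comparison is then a standard test (e.g. a t-test) on the per-fish index values from each site.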

  4. The effect of manipulated and accurate assessment feedback on the self-efficacy of dance students.

    PubMed

    García-Dantas, Ana; Quested, Eleanor

    2015-03-01

    Research undertaken with athletes has shown that lower-evaluated feedback is related to low self-efficacy levels. However, the relationship between teacher feedback and self-efficacy has not been studied in the dance setting. In sports or dance contexts, very few studies have manipulated feedback content to examine its impact on performers' self-efficacy in relation to the execution of a specific movement. Therefore, the aim of this investigation was to explore the effect of manipulated upper, lower, and accurate grade feedback on changes in dancers' self-efficacy levels for the execution of the "Zapateado" (a flamenco foot movement). Sixty-one students (56 female, 5 male, ages 13 to 22 ± 3.25 years) from a Spanish dance conservatory participated in this experimental study. They were randomly divided into four feedback groups: 1. upper-evaluated, 2. objective and informational, 3. lower-evaluated, and 4. no-feedback control. Participants performed three trials during a 1-hour session and completed questionnaires tapping self-efficacy pre-feedback and post-feedback. After each trial, teachers (who were confederates in the study) were first asked to rate their perception of each dancer's competence level at performing the movement according to conventional criteria (scores from 0 to 10). The results were then manipulated, and students were given accurate, lower-evaluated, or upper-evaluated scores. Those in the accurate feedback group reported positive change in self-efficacy, whereas those in the lower-evaluated group showed no significant change in self-efficacy during the course of the trial. Findings call into question the common perception among teachers that it can be motivating to provide students with inaccurate feedback indicating that their performance level is much better or much worse than they actually perceive it to be. Self-efficacy appears most likely to increase in students when feedback is accurate. PMID:25741781

  5. Qualitative and quantitative procedures for health risk assessment.

    PubMed

    Lohman, P H

    1999-07-16

    Numerous reactive mutagenic electrophiles are present in the environment or are formed in the human body through metabolic processes. These electrophiles can react directly with DNA and are considered to be ultimate carcinogens. In the past decades more than 200 in vitro and in vivo genotoxicity tests have been described to identify, monitor and characterize the exposure of humans to such agents. When the responses of such genotoxicity tests are quantified by a weight-of-evidence analysis, it is found that the intrinsic potency of electrophiles as mutagens does not differ much for the majority of the agents studied. Considering that under normal environmental circumstances humans are exposed to low concentrations of about a million electrophiles, the relation between exposure to such agents and adverse health effects (e.g., cancer) becomes a 'Pandora's box'. For quantitative risk assessment it is necessary not only to detect whether an agent is genotoxic, but also to understand the mechanism of its interaction with the DNA in target cells. Examples are given for a limited group of important environmental and carcinogenic agents for which such an approach is feasible. The groups identified are agents that form cross-links with DNA or are mono-alkylating agents that react with base moieties in the DNA strands. Quantitative hazard ranking of the mutagenic potency of these groups of chemicals can be performed, and there is ample evidence that such a ranking corresponds with the individual carcinogenic potency of those agents in rodents. Still, in practice, with the exception of certain occupational or accidental exposure situations, these approaches have not been successful in preventing cancer deaths in the human population. However, this is not only due to the described 'Pandora's box' situation. At least three other factors are described. Firstly, in the industrial world the medical treatment of cancer in patients

  6. How to Achieve Accurate Peer Assessment for High Value Written Assignments in a Senior Undergraduate Course

    ERIC Educational Resources Information Center

    Jeffery, Daniel; Yankulov, Krassimir; Crerar, Alison; Ritchie, Kerry

    2016-01-01

    The psychometric measures of accuracy, reliability and validity of peer assessment are critical qualities for its use as a supplement to instructor grading. In this study, we seek to determine which factors related to peer review are the most influential on these psychometric measures, with a primary focus on the accuracy of peer assessment or how…

  7. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    PubMed Central

    Hwang, Andrew B; Franc, Benjamin L; Gullberg, Grant T; Hasegawa, Bruce H

    2009-01-01

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the use of resolution
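    The attenuation magnitudes quoted above are consistent with simple narrow-beam attenuation through water. Using approximate linear attenuation coefficients for water (assumed values: roughly 0.38 cm⁻¹ at I-125's ~27-35 keV emissions and roughly 0.15 cm⁻¹ at Tc-99m's 140 keV) and a ~2 cm rat-sized path:

```python
import math

def transmitted_fraction(mu_per_cm, depth_cm):
    """Narrow-beam (Beer-Lambert) fraction of photons surviving the path."""
    return math.exp(-mu_per_cm * depth_cm)

loss_i125 = 1.0 - transmitted_fraction(0.38, 2.0)   # roughly 50% lost
loss_tc99m = 1.0 - transmitted_fraction(0.15, 2.0)  # roughly 25% lost
```

This back-of-envelope check matches the simulated ~50% (I-125) and ~25% (Tc-99m) reductions, ignoring scatter and detector geometry.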

  8. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    SciTech Connect

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H. (Joint Graduate Group in Bioengineering, University of California, San Francisco and University of California, Berkeley; Department of Radiology, University of California)

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the

  9. Tandem Mass Spectrometry Measurement of the Collision Products of Carbamate Anions Derived from CO2 Capture Sorbents: Paving the Way for Accurate Quantitation

    NASA Astrophysics Data System (ADS)

    Jackson, Phil; Fisher, Keith J.; Attalla, Moetaz Ibrahim

    2011-08-01

    The reaction between CO2 and aqueous amines to produce a charged carbamate product plays a crucial role in post-combustion capture chemistry when primary and secondary amines are used. In this paper, we report the low energy negative-ion CID results for several anionic carbamates derived from primary and secondary amines commonly used as post-combustion capture solvents. The study was performed using the modern equivalent of a triple quadrupole instrument equipped with a T-wave collision cell. Deuterium labeling of 2-aminoethanol (1,1,2,2-d4-2-aminoethanol) and computations at the M06-2X/6-311++G(d,p) level were used to confirm the identity of the fragmentation products for 2-hydroxyethylcarbamate (derived from 2-aminoethanol), in particular the ions CN-, NCO- and facile neutral losses of CO2 and water; there is precedent for the latter in condensed phase isocyanate chemistry. The fragmentations of 2-hydroxyethylcarbamate were generalized for carbamate anions derived from other capture amines, including ethylenediamine, diethanolamine, and piperazine. We also report unequivocal evidence for the existence of carbamate anions derived from sterically hindered amines (Tris(2-hydroxymethyl)aminomethane and 2-methyl-2-aminopropanol). For the suite of carbamates investigated, diagnostic losses include the decarboxylation product (-CO2, 44 mass units), loss of 46 mass units and the fragments NCO- (m/z 42) and CN- (m/z 26). We also report low energy CID results for the dicarbamate dianion (-O2CNHC2H4NHCO2-) commonly encountered in CO2 capture solutions utilizing ethylenediamine. Finally, we demonstrate a promising ion chromatography-MS based procedure for the separation and quantitation of aqueous anionic carbamates, which is based on the reported CID findings. The availability of accurate quantitation methods for ionic CO2 capture products could lead to dynamic operational tuning of CO2 capture plants and, thus, cost-savings via real-time manipulation of solvent

  10. Is objective and accurate cognitive assessment across the menstrual cycle possible? A feasibility study

    PubMed Central

    Neill, Jo; Scally, Andy; Tuffnell, Derek; Marshall, Kay

    2015-01-01

    Objectives: Variation in plasma hormone levels influences the neurobiology of brain regions involved in cognition and emotion processing. Fluctuations in hormone levels across the menstrual cycle could therefore alter cognitive performance and wellbeing; reports have provided conflicting results, however. The aim of this study was to assess whether objective assessment of cognitive performance and self-reported wellbeing during the follicular and luteal phases of the menstrual cycle is feasible and investigate the possible reasons for variation in effects previously reported. Methods: The Cambridge Neuropsychological Test Automated Battery and Edinburgh Postnatal Depression Scale were used to assess the cognitive performance and wellbeing of 12 women. Data were analysed by self-reported and hormone-estimated phases of the menstrual cycle. Results: Recruitment to the study and assessment of cognition and wellbeing was without issue. Plasma hormone and peptide estimation showed substantial individual variation and suggests inaccuracy in self-reported menstrual phase estimation. Conclusion: Objective assessment of cognitive performance and self-assessed wellbeing across the menstrual cycle is feasible. Grouping data by hormonal profile rather by self-reported phase estimation may influence phase-mediated results. Future studies should use plasma hormone and peptide profiles to estimate cycle phase and group data for analyses. PMID:26770760

  11. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    SciTech Connect

    Malik, Afshan N.; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana; Laftah, Abas; Cunningham, Phil

    2011-08-19

    Highlights: • Mitochondrial dysfunction is central to many diseases of oxidative stress. • 95% of the mitochondrial genome is duplicated in the nuclear genome. • Dilution of untreated genomic DNA leads to dilution bias. • Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved and we have identified that current methods have at least one of the following three problems: (1) As much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) use of regions from genes such as β-actin and 18S rRNA, which are repetitive and/or highly variable, for qPCR of the nuclear genome leads to errors; and (3) the size difference of mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method using unique regions in the human mitochondrial genome not duplicated in the nuclear genome, a unique single-copy region in the nuclear genome, and template treatment to remove dilution bias, to accurately quantify MtDNA from human samples.
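    Once pseudogene-free primers and template pretreatment are in place, Mt/N itself is a standard relative quantification from qPCR Ct values. A sketch assuming perfect (two-fold per cycle) amplification efficiency, with the factor of 2 reflecting the two nuclear copies per diploid genome:

```python
def mt_n_ratio(ct_mito, ct_nuclear, efficiency=2.0):
    """MtDNA copies per diploid nuclear genome from qPCR Ct values:
    2 * efficiency**(Ct_nuclear - Ct_mito)."""
    return 2.0 * efficiency ** (ct_nuclear - ct_mito)

# A mitochondrial target crossing threshold 10 cycles earlier than the
# single-copy nuclear target implies ~2^10 more mitochondrial copies.
print(mt_n_ratio(ct_mito=20.0, ct_nuclear=30.0))  # → 2048.0
```

Real assays would substitute the measured per-assay amplification efficiencies for the ideal value of 2.0.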

  12. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    PubMed Central

    2012-01-01

    Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tissue perfusion quantitatively might be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. There are two main parts in the algorithm: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple level thresholding and pixel verification techniques. Threshold levels are selected using histogram information of a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results Sublingual microcirculatory videos are recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings are analyzed visually and the functional capillary density (FCD) values calculated by the algorithm are compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD values were calculated
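    The endpoint the algorithm reports, functional capillary density, is the total length of perfused capillaries per unit image area. A sketch of the final unit conversion (µm and µm² inputs, mm/mm² output, which is one common convention; the segmentation itself is the hard part the paper automates):

```python
def functional_capillary_density(perfused_length_um, field_area_um2):
    """FCD in mm of perfused capillary per mm^2 of imaged tissue."""
    return (perfused_length_um / 1000.0) / (field_area_um2 / 1_000_000.0)

# 10,000 um (10 mm) of flowing capillary in a 1 mm^2 field of view.
print(functional_capillary_density(10_000.0, 1_000_000.0))  # → 10.0
```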

  13. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  14. Assessing association between protein truncating variants and quantitative traits

    PubMed Central

    Rivas, Manuel A.; Pirinen, Matti; Neville, Matthew J.; Gaulton, Kyle J.; Moutsianas, Loukas; Lindgren, Cecilia M.; Karpe, Fredrik; McCarthy, Mark I.; Donnelly, Peter

    2013-01-01

    Motivation: In sequencing studies of common diseases and quantitative traits, power to test rare and low frequency variants individually is weak. To improve power, a common approach is to combine statistical evidence from several genetic variants in a region. Major challenges are how to do the combining and which statistical framework to use. General approaches for testing association between rare variants and quantitative traits include aggregating genotypes and trait values, referred to as ‘collapsing’, or using a score-based variance component test. However, little attention has been paid to alternative models tailored for protein truncating variants. Recent studies have highlighted the important role that protein truncating variants, commonly referred to as ‘loss of function’ variants, may have on disease susceptibility and quantitative levels of biomarkers. We propose a Bayesian modelling framework for the analysis of protein truncating variants and quantitative traits. Results: Our simulation results show that our models have an advantage over the commonly used methods. We apply our models to sequence and exome-array data and discover strong evidence of association between low plasma triglyceride levels and protein truncating variants at APOC3 (Apolipoprotein C3). Availability: Software is available from http://www.well.ox.ac.uk/~rivas/mamba Contact: donnelly@well.ox.ac.uk PMID:23860716

  15. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography

    NASA Astrophysics Data System (ADS)

    Mobberley, Sean David

    Accurate, cross-scanner assessment of in-vivo air density used to quantitatively assess amount and distribution of emphysema in COPD subjects has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how the quantitative measures of lung density compare between dual-source and single-source scan modes. This study has sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N=6), swine (N=13: more human-like rib cage shape), a lung phantom and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU respectively. When using image data obtained in the SS mode, the air CT numbers demonstrated a consistent positive shift of up to 35 HU.
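    The tracheal-air bias check described above reduces to comparing the median HU of a segmented air region against the nominal -1000 HU of air. A minimal sketch of that comparison (function and variable names are illustrative, not from the dissertation):

```python
import numpy as np

def air_calibration_bias(hu_volume, air_mask, nominal_hu=-1000.0):
    """Median HU inside a segmented air region and its offset from nominal.

    hu_volume: array of CT numbers (HU); air_mask: boolean array of the
    same shape marking tracheal-air voxels.
    """
    air_values = hu_volume[air_mask]
    median_hu = float(np.median(air_values))
    return median_hu, median_hu - nominal_hu

# Toy volume whose air region is centred near -980 HU, i.e. a +20 HU
# positive shift of the kind reported above for the single-source mode.
rng = np.random.default_rng(0)
vol = rng.normal(-980.0, 10.0, size=(8, 8, 8))
mask = np.ones_like(vol, dtype=bool)
median_hu, bias = air_calibration_bias(vol, mask)
print(round(bias))
```

    A per-scan report of this offset is what allows the DSDE and SS modes to be compared on equal footing.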

  16. Quantitative assessment and reduction of long-term autoradiographic background

    SciTech Connect

    Traub, R.K.; Famous, L.; Krishnan, R.; Olson, K.R.

    1990-01-01

    Quantitative autoradiography can measure distribution patterns in an animal exposed to radiolabeled compounds. A comparison of autoradiographs of rat brain containing low levels of 14C showed that a highly variable background signal had been produced. This resulted in several overexposed autoradiographs which could not be quantitatively compared. The background, believed to be produced by light emanating from the phosphor coating in the X-ray cassette, was a major impediment because it hindered correct analysis of the specimen. This article details our experiments demonstrating the sources of variance contributing to background and offers methods for its reduction. We found that placement of black polyethylene plastic between the slides and phosphor in the X-ray film cassette minimized autoradiographic background and effectively eliminated the effects caused by inherently different levels of radioactivity in the glass slides.

  17. Accurate calculation of binding energies for molecular clusters - Assessment of different models

    NASA Astrophysics Data System (ADS)

    Friedrich, Joachim; Fiedler, Benjamin

    2016-06-01

    In this work we test different strategies to compute high-level benchmark energies for medium-sized molecular clusters. We use the incremental scheme to obtain CCSD(T)/CBS energies for our test set and carefully validate the accuracy for binding energies by statistical measures. The local errors of the incremental scheme are <1 kJ/mol. Since they are smaller than the basis set errors, we obtain higher total accuracy due to the applicability of larger basis sets. The final CCSD(T)/CBS benchmark values are ΔE = -278.01 kJ/mol for (H2O)10, ΔE = -221.64 kJ/mol for (HF)10, ΔE = -45.63 kJ/mol for (CH4)10, ΔE = -19.52 kJ/mol for (H2)20 and ΔE = -7.38 kJ/mol for (H2)10. Furthermore we test state-of-the-art wave-function-based and DFT methods. Our benchmark data will be very useful for critical validations of new methods. We find focal-point methods for estimating CCSD(T)/CBS energies to be highly accurate and efficient. For foQ-i3CCSD(T)-MP2/TZ we get a mean error of 0.34 kJ/mol and a standard deviation of 0.39 kJ/mol.

  18. Assessing temporal flux of plant hormones in stored processing potatoes using high definition accurate mass spectrometry

    PubMed Central

    Ordaz-Ortiz, José Juan; Foukaraki, Sofia; Terry, Leon Alexander

    2015-01-01

    Plant hormones are important molecules which at low concentration can regulate various physiological processes. Mass spectrometry has become a powerful technique for the quantification of multiple classes of plant hormones because of its high sensitivity and selectivity. We developed a new ultrahigh pressure liquid chromatography–full-scan high-definition accurate mass spectrometry method for simultaneous determination of abscisic acid and four of its metabolites (phaseic acid, dihydrophaseic acid, 7′-hydroxy-abscisic acid and abscisic acid glucose ester), the cytokinins zeatin and zeatin riboside, gibberellins (GA1, GA3, GA4 and GA7) and indole-3-acetyl-L-aspartic acid. We measured the amount of plant hormones in the flesh and skin of two processing potato cvs. Sylvana and Russet Burbank stored for up to 30 weeks at 6 °C under ambient air conditions. Herein, we report for the first time that abscisic acid glucose ester seems to accumulate in the skin of potato tubers throughout storage time. The method achieved a limit of detection of 0.22 ng g−1 dry weight and a limit of quantification of 0.74 ng g−1 dry weight (both for zeatin riboside), and was able to recover, detect and quantify a total of 12 plant hormones spiked on flesh and skin of potato tubers. In addition, the mass accuracy for all compounds (<5 ppm) was evaluated. PMID:26504563
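    The abstract reports limits of detection and quantification but not how they were derived; the common convention (ICH Q2-style) computes them from the blank or residual standard deviation and the calibration slope. A hedged sketch, with purely illustrative numbers:

```python
def lod_loq(sigma, slope):
    """ICH Q2-style limits of detection and quantification.

    LOD = 3.3*sigma/S and LOQ = 10*sigma/S, where sigma is the standard
    deviation of the blank (or of calibration residuals) and S is the
    calibration-curve slope. This is the common convention only; the
    paper's actual derivation of its 0.22 / 0.74 ng/g figures is not
    stated in the abstract.
    """
    return 3.3 * sigma / slope, 10.0 * sigma / slope

lod, loq = lod_loq(sigma=0.1, slope=1.5)   # illustrative inputs
print(round(lod, 3), round(loq, 3))  # → 0.22 0.667
```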

  19. Quantitative assessment of hemadsorption by myxoviruses: virus hemadsorption assay.

    PubMed

    Hahon, N; Booth, J A; Eckert, H L

    1973-04-01

    The standardization and quantitative evaluation of an assay for myxoviruses, based on the enumeration of individual infected clone 1-5C-4 cells manifesting hemadsorption within 24 h of infection, are described. Hemadsorption was detectable earlier than immunofluorescence in infected cells or hemagglutinins in culture medium. The relationship between virus concentration and cells exhibiting hemadsorption was linear. The assay was highly precise, sensitive, and reproducible. PMID:4349248

  20. Calibration assessment in quantitative electroencephalographic brainmapping and evoked potential studies.

    PubMed

    Richards, A K; Hamilton-Bruce, M A

    1994-09-01

    Acquisition of a Cadwell Spectrum 32 resulted in the introduction of quantitative electrophysiological brainmapping techniques in our neurophysiology laboratory. To ascertain the accuracy and consistency of our equipment, we performed the following tests: inputting a calibration signal and measuring the resultant amplitudes for quantitative electroencephalographs (qEEGs) and evoked potentials (EPs) in the mapping and standard montages, inputting a synchronous calibration signal and mapping it at varying times for qEEGs and EPs, as well as re-analysing the same electroencephalographic (EEG) epochs previously selected from 20 control subjects. QEEG amplitudes varied from -5.4% to +5.8% and EPs by 9.5% or less, and after an EP software upgrade, by 5.5% or less. QEEG voltage mapping showed variation of only one color increment across the map, which could, in our example, represent up to 25.2% of the scale used. Re-analysis of previously selected epochs yielded identical results. We have established some of the accuracy and consistency limits of the hardware and software of our system with respect to the quantitative and topographic data. We conclude that such systems need to be calibration-checked in the laboratories in which they are used, with an independent signal generator. Users also need to be aware that scaling of topographic maps could lead to erroneous conclusions, as perceived amplitude changes could affect the interpretation of both initial and serial studies. PMID:7980205

  1. Quantitative approach using multiple single parameters versus visual assessment in dobutamine stress echocardiography

    PubMed Central

    2012-01-01

    .52. Conclusions Multiple single quantitative parameters showed limited predictive ability to identify significant coronary artery stenosis. Visual assessment of DSE appears to be more accurate than single velocity and strain/strain rate markers in the diagnosis of CAD. PMID:22846395

  2. Quantitative Assessment of Countermeasure Efficacy for Long-Term Space Missions

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.

    2000-01-01

    This slide presentation reviews the development of quantitative assessments of the effectiveness of countermeasures (CM) for the effects of space travel on humans for long term space missions. An example of bone mineral density (BMD) is examined to show specific quantitative measures for failure and success.

  3. Food habits and nutritional status assessment of adolescent soccer players. A necessary and accurate approach.

    PubMed

    Iglesias-Gutiérrez, Eduardo; García-Rovés, Pablo M; Rodríguez, Carmen; Braga, Socorro; García-Zapico, Pedro; Patterson, Angeles M

    2005-02-01

    The aim of this study was to assess the food habits and nutritional status of high level adolescent soccer players (N = 33; ages 14-16 yrs) living in their home environment. Body composition (height, mass, skinfolds), biochemical and hematological parameters, performance in soccer-specific tests (sprinting, jumping, intermittent endurance), and dietary intake (weighed food intake method) and related behaviors (nutrient supplement use, daily activity profile) were assessed. Daily energy expenditure and energy intake were 12.5 MJ and 12.6 MJ, respectively. Protein (16% of energy intake; 1.9 g/kg of body mass), lipid (38%), and cholesterol (385 mg) intake were above recommendations, while carbohydrates (45%) were below. The food intake of these adolescents was based on cereals and derivates; meat, fish, and eggs; milk and dairy products; biscuits and confectionery; and oil, butter and margarine, which provided 78% of total energy intake, 85% of proteins, 64% of carbohydrates, 90% of lipids, and 47% of fiber. Although diet provided sufficient iron, 48% of individuals showed iron deficiency without anemia. Based on these results, a well designed nutrition intervention would be advisable for optimizing performance, and especially for promoting healthy eating habits in adolescent soccer players. PMID:15855680

  4. Quantitative Assessment of Neuromotor Function in Adolescents with High Functioning Autism and Asperger Syndrome

    ERIC Educational Resources Information Center

    Freitag, Christine M.; Kleser, Christina; Schneider, Marc; von Gontard, Alexander

    2007-01-01

    Background: Motor impairment in children with Asperger Syndrome (AS) or High functioning autism (HFA) has been reported previously. This study presents results of a quantitative assessment of neuromotor skills in 14-22 year old HFA/AS. Methods: 16 HFA/AS and 16 IQ-matched controls were assessed by the Zurich Neuromotor Assessment (ZNA). Results:…

  5. In Vivo Quantitative Ultrasound Imaging and Scatter Assessments.

    NASA Astrophysics Data System (ADS)

    Lu, Zheng Feng

    There is evidence that "instrument independent" measurements of ultrasonic scattering properties would provide useful diagnostic information that is not available with conventional ultrasound imaging. This dissertation is a continuing effort to test the above hypothesis and to incorporate quantitative ultrasound methods into clinical examinations for early detection of diffuse liver disease. A well-established reference phantom method was employed to construct quantitative ultrasound images of tissue in vivo. The method was verified by extensive phantom tests. A new method was developed to measure the effective attenuation coefficient of the body wall. The method relates the slope of the difference between the echo signal power spectrum from a uniform region distal to the body wall and the echo signal power spectrum from a reference phantom to the body wall attenuation. The accuracy obtained from phantom tests suggests further studies with animal experiments. Clinically, thirty-five healthy subjects and sixteen patients with diffuse liver disease were studied by these quantitative ultrasound methods. The average attenuation coefficient in normals agreed with previous investigators' results; in vivo backscatter coefficients agreed with the results from normals measured by O'Donnell. Strong discriminating power (p < 0.001) was found for both attenuation and backscatter coefficients between fatty livers and normals; a significant difference (p < 0.01) was observed in the backscatter coefficient but not in the attenuation coefficient between cirrhotic livers and normals. An in vivo animal model of steroid hepatopathy was used to investigate the system sensitivity in detecting early changes in canine liver resulting from corticosteroid administration. 
The average attenuation coefficient slope increased from 0.7 dB/cm/MHz in controls to 0.82 dB/cm/MHz (at 6 MHz) in treated animals on day 14 of treatment, and the backscatter coefficient was 26 × 10^-4 cm^-1 sr^-1.
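    The reference phantom approach above estimates effective attenuation from how fast the sample-minus-reference spectral difference (in dB) falls off with frequency. A hedged sketch of that relation; the function name, the simple linear fit, and the synthetic check are assumptions, not the dissertation's implementation:

```python
import numpy as np

def attenuation_slope_db_cm_mhz(freqs_mhz, ps_sample_db, ps_reference_db, path_cm):
    """Effective attenuation slope (dB/cm/MHz) from a spectral difference.

    Fits a line to (sample - reference) echo power spectra, in dB, versus
    frequency; the fitted slope divided by the round-trip path length
    (2 * one-way depth) gives the attenuation coefficient slope.
    """
    diff_db = np.asarray(ps_sample_db) - np.asarray(ps_reference_db)
    slope_db_per_mhz = np.polyfit(freqs_mhz, diff_db, 1)[0]
    return -slope_db_per_mhz / (2.0 * path_cm)

# Synthetic check: 0.5 dB/cm/MHz of extra attenuation over a 3 cm path.
f = np.linspace(2.0, 6.0, 50)
ref = -10.0 * np.ones_like(f)
sample = ref - 0.5 * 2.0 * 3.0 * f   # extra round-trip loss grows with f
print(round(attenuation_slope_db_cm_mhz(f, sample, ref, 3.0), 3))  # → 0.5
```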

  6. A Platform for Rapid, Quantitative Assessment of Multiple Drug Combinations Simultaneously in Solid Tumors In Vivo

    PubMed Central

    Grenley, Marc O.; Casalini, Joseph R.; Tretyak, Ilona; Ditzler, Sally H.; Thirstrup, Derek J.; Frazier, Jason P.; Pierce, Daniel W.; Carleton, Michael; Klinghoffer, Richard A.

    2016-01-01

    While advances in high-throughput screening have resulted in increased ability to identify synergistic anti-cancer drug combinations, validation of drug synergy in the in vivo setting and prioritization of combinations for clinical development remain low-throughput and resource intensive. Furthermore, there is currently no viable method for prospectively assessing drug synergy directly in human patients in order to potentially tailor therapies. To address these issues we have employed the previously described CIVO platform and developed a quantitative approach for investigating multiple combination hypotheses simultaneously in single living tumors. This platform provides a rapid, quantitative and cost effective approach to compare and prioritize drug combinations based on evidence of synergistic tumor cell killing in the live tumor context. Using a gemcitabine resistant model of pancreatic cancer, we efficiently investigated nine rationally selected Abraxane-based combinations employing only 19 xenografted mice. Among the drugs tested, the BCL2/BCLxL inhibitor ABT-263 was identified as the one agent that synergized with Abraxane® to enhance acute induction of localized apoptosis in this model of human pancreatic cancer. Importantly, results obtained with CIVO accurately predicted the outcome of systemic dosing studies in the same model where superior tumor regression induced by the Abraxane/ABT-263 combination was observed compared to that induced by either single agent. This supports expanded use of CIVO as an in vivo platform for expedited in vivo drug combination validation and sets the stage for performing toxicity-sparing drug combination studies directly in cancer patients with solid malignancies. PMID:27359113

  7. Chromatic distortion during angioscopy: assessment and correction by quantitative colorimetric angioscopic analysis.

    PubMed

    Lehmann, K G; Oomen, J A; Slager, C J; deFeyter, P J; Serruys, P W

    1998-10-01

    Angioscopy represents a diagnostic tool with the unique ability of assessing the true color of intravascular structures. Current angioscopic interpretation is entirely subjective, however, and the visual interpretation of color has been shown to be marginal at best. The quantitative colorimetric angioscopic analysis system permits the full characterization of angioscopic color using two parameters (C1 and C2), derived from a custom color coordinate system, that are independent of illuminating light intensity. Measurement variability was found to be low (coefficient of variation = 0.06-0.64%), and relatively stable colorimetric values were obtained even at the extremes of illumination power. Variability between different angioscopic catheters was good (maximum difference for C1, 0.022; for C2, 0.015). Catheter flexion did not significantly distort color transmission. Although the fiber optic illumination bundle was found to impart a slight yellow tint to objects in view (ΔC1 = 0.020, ΔC2 = 0.024, P < 0.0001) and the imaging bundle in isolation imparted a slight red tint (ΔC1 = 0.043, ΔC2 = -0.027, P < 0.0001), both of these artifacts could be corrected by proper white balancing. Finally, evaluation of regional chromatic characteristics revealed a radially symmetric and progressive blue shift in measured color when moving from the periphery to the center of an angioscopic image. An algorithm was developed that could automatically correct 93.0-94.3% of this error and provide accurate colorimetric measurements independent of spatial location within the angioscopic field. In summary, quantitative colorimetric angioscopic analysis provides objective and highly reproducible measurements of angioscopic color. This technique can correct for important chromatic distortions present in modern angioscopic systems. It can also help overcome current limitations in angioscopy research and clinical use imposed by the reliance on visual perception of color.
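    The abstract does not define C1 and C2. Normalised chromaticity is a standard example of the kind of illumination-intensity-independent colour coordinate described, and is used below purely as a stand-in: scaling all channels by the illumination level leaves the coordinates unchanged.

```python
def chromaticity(r, g, b):
    """Intensity-independent colour coordinates from RGB channel values.

    A stand-in for the paper's custom (C1, C2) system, whose definition
    is not given in the abstract: each coordinate is a channel's share of
    the total, so uniform changes in brightness cancel out.
    """
    total = r + g + b
    if total == 0:
        return 0.0, 0.0
    return r / total, g / total

dim = chromaticity(60, 30, 10)
bright = chromaticity(120, 60, 20)   # same object, twice the illumination
print(dim == bright)  # → True: the coordinates are intensity-independent
```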

  8. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  9. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  10. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling:• SDMProjectBuilder (which includes the Microbial Source Module as part...

  11. Personal Exposure Monitoring Wearing Protocol Compliance: An Initial Assessment of Quantitative Measurements

    EPA Science Inventory

    Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...

  12. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals’ Behaviour

    PubMed Central

    Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs’ behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals’ quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog’s shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  13. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour.

    PubMed

    Barnard, Shanis; Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs' behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals' quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog's shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  14. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  15. Quantitative assessment of susceptibility weighted imaging processing methods

    PubMed Central

    Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.

    2013-01-01

    Purpose To evaluate different susceptibility weighted imaging (SWI) phase processing methods and parameter selection, thereby improving understanding of potential artifacts, as well as facilitating choice of methodology in clinical settings. Materials and Methods Two major phase processing methods, Homodyne-filtering and phase unwrapping-high pass (HP) filtering, were investigated with various phase unwrapping approaches, filter sizes, and filter types. Magnitude and phase images were acquired from a healthy subject and brain injury patients on a 3T clinical Siemens MRI system. Results were evaluated based on image contrast to noise ratio and presence of processing artifacts. Results When using a relatively small filter size (32 pixels for the matrix size 512 × 512 pixels), all Homodyne-filtering methods were subject to phase errors leading to 2% to 3% masked brain area in lower and middle axial slices. All phase unwrapping-filtering/smoothing approaches demonstrated fewer phase errors and artifacts compared to the Homodyne-filtering approaches. For performing phase unwrapping, Fourier-based methods, although less accurate, were 2–4 orders of magnitude faster than the PRELUDE, Goldstein and Quality-guide methods. Conclusion Although Homodyne-filtering approaches are faster and more straightforward, phase unwrapping followed by HP filtering approaches perform more accurately in a wider variety of acquisition scenarios. PMID:24923594
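    Homodyne filtering, one of the two phase-processing routes compared above, removes slowly varying background phase by dividing the complex image by a low-resolution copy of itself built from the centre of k-space. A sketch under assumed parameters (the Hann window and the filter size are illustrative choices; the paper's small-filter case used 32 pixels for a 512 × 512 matrix):

```python
import numpy as np

def homodyne_highpass_phase(complex_image, filter_size=32):
    """High-pass-filtered phase via homodyne filtering.

    A central k-space window of `filter_size` pixels reconstructs a
    low-resolution reference image; multiplying the original by the
    reference's complex conjugate cancels slowly varying (background)
    phase, leaving the high-pass phase of interest.
    """
    ny, nx = complex_image.shape
    k = np.fft.fftshift(np.fft.fft2(complex_image))
    win = np.zeros((ny, nx))
    cy, cx, h = ny // 2, nx // 2, filter_size // 2
    win[cy - h:cy + h, cx - h:cx + h] = np.outer(
        np.hanning(filter_size), np.hanning(filter_size))
    low_res = np.fft.ifft2(np.fft.ifftshift(k * win))
    return np.angle(complex_image * np.conj(low_res))

# Uniform background phase should be removed almost entirely.
flat = np.exp(1j * 0.3) * np.ones((64, 64))
print(bool(np.abs(homodyne_highpass_phase(flat)).max() < 1e-6))  # → True
```

    The phase-unwrapping-plus-HP-filtering route replaces the division step with explicit unwrapping followed by a high-pass filter, which is why it avoids the phase errors the paper reports for small homodyne filter sizes.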

  16. Quantitative Assessment of Neurite Outgrowth in PC12 Cells

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity. In order to identify potential developmental neurotoxicants, assessment of critical neurodevelopmental processes such as neuronal differenti...

  17. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    PubMed Central

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, distributed, and consumed. PMID:9366601

  18. Purity Assessment of Aryltetralin Lactone Lignans by Quantitative 1H Nuclear Magnetic Resonance.

    PubMed

    Sun, Yan-Jun; Zhang, Yan-Li; Wang, Yu; Wang, Jun-Min; Zhao, Xuan; Gong, Jian-Hong; Gao, Wei; Guan, Yan-Bin

    2015-01-01

    In the present work, a quantitative 1H nuclear magnetic resonance (qHNMR) method was established for purity assessment of six aryltetralin lactone lignans. The validation of the method was carried out, including specificity, selectivity, linearity, accuracy, precision, and robustness. Several experimental parameters were optimized, including relaxation delay (D1), scan numbers (NS), and pulse angle. 1,4-Dinitrobenzene was used as internal standard (IS), and deuterated dimethyl sulfoxide (DMSO-d6) as the NMR solvent. The purities were calculated by the area ratios of H-2,6 from target analytes vs. aromatic protons from IS. Six aryltetralin lactone lignans (deoxypodophyllotoxin, podophyllotoxin, 4-demethylpodophyllotoxin, podophyllotoxin-7'-O-β-d-glucopyranoside, 4-demethylpodophyllotoxin-7'-O-β-d-glucopyranoside, and 6''-acetyl-podophyllotoxin-7'-O-β-d-glucopyranoside) were analyzed. The analytic results of qHNMR were further validated by high performance liquid chromatography (HPLC). Therefore, the qHNMR method was a rapid, accurate, reliable tool for monitoring the purity of aryltetralin lactone lignans. PMID:26016553
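    The purity calculation described above follows the standard qHNMR internal-standard relation. The abstract gives only which signals were integrated (H-2,6 of the analyte vs. the IS aromatic protons), so the proton counts, masses, and areas below are illustrative, not the paper's data:

```python
def qhnmr_purity(area_analyte, area_is, n_analyte, n_is,
                 mw_analyte, mw_is, mass_analyte_mg, mass_is_mg, purity_is):
    """Purity by the standard qHNMR internal-standard relation:

    P_x = (A_x/A_IS) * (N_IS/N_x) * (M_x/M_IS) * (m_IS/m_x) * P_IS,

    where A is integrated signal area, N the number of protons behind the
    signal, M the molar mass, and m the weighed mass.
    """
    return ((area_analyte / area_is) * (n_is / n_analyte)
            * (mw_analyte / mw_is) * (mass_is_mg / mass_analyte_mg)
            * purity_is)

# Illustrative numbers: a 2-proton analyte signal vs. the 4 aromatic
# protons of 1,4-dinitrobenzene (MW 168.11); analyte MW as for
# podophyllotoxin (414.4); 99.9% IS purity assumed.
p = qhnmr_purity(area_analyte=1.00, area_is=1.00, n_analyte=2, n_is=4,
                 mw_analyte=414.4, mw_is=168.11,
                 mass_analyte_mg=12.0, mass_is_mg=2.0, purity_is=0.999)
print(round(p, 3))  # → 0.821
```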

  19. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
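    The abstract does not give the algorithm's details. As a minimal stand-in, quantifying stain-enhanced surface coverage can be reduced to a thresholded pixel fraction; the function name and threshold below are illustrative assumptions:

```python
import numpy as np

def fouling_coverage(image_gray, threshold):
    """Fraction of pixels whose stain intensity meets a threshold.

    A deliberately simple proxy for the paper's unspecified algorithm:
    stained biofilm pixels contrast with the background, so surface
    coverage becomes a thresholded pixel count over the frame area.
    """
    mask = np.asarray(image_gray) >= threshold
    return float(mask.mean())

# Synthetic 10x10 frame with a 4x4 stained patch: coverage = 16/100.
frame = np.zeros((10, 10))
frame[3:7, 3:7] = 1.0
print(fouling_coverage(frame, 0.5))  # → 0.16
```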

  20. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGESBeta

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs, and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision as the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  1. Accurate assessment of the impact of salmon farming on benthic sediment enrichment using foraminiferal metabarcoding.

    PubMed

    Pochon, X; Wood, S A; Keeley, N B; Lejzerowicz, F; Esling, P; Drew, J; Pawlowski, J

    2015-11-15

    Assessing the environmental impact of salmon farms on benthic systems is traditionally undertaken using biotic indices derived from microscopic analyses of macrobenthic infaunal (MI) communities. In this study, we tested the applicability of foraminiferal-specific high-throughput sequencing (HTS) metabarcoding for monitoring these habitats. Sediment samples and physico-chemical data were collected along an enrichment gradient radiating out from three Chinook salmon (Oncorhynchus tshawytscha) farms in New Zealand. HTS of environmental DNA and RNA (eDNA/eRNA) resulted in 1,875,300 sequences that clustered into 349 Operational Taxonomic Units. Strong correlations were observed among various biotic indices calculated from MI data and normalized fourth-root transformed HTS data. Correlations were stronger using eRNA than eDNA data. Quantile regression spline analyses identified 12 key foraminiferal taxa with potential to be used as bioindicator species. This study demonstrates the considerable potential of this method for biomonitoring of fish farming and other marine industrial activities. PMID:26337228
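The "normalized fourth-root transformed HTS data" step can be sketched as below. The use of relative abundances as the normalization, and the function name, are assumptions for illustration; the abstract does not specify the exact pipeline:

```python
import numpy as np

def fourth_root_relative(counts):
    """Fourth-root transform of relative OTU abundances.

    Dividing by the sample total normalizes for sequencing depth; the
    fourth root down-weights dominant taxa before community comparison,
    a common choice in benthic index calculations.
    """
    counts = np.asarray(counts, dtype=float)
    rel = counts / counts.sum()
    return rel ** 0.25
```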

  2. Three-Dimensional Quantitative Validation of Breast Magnetic Resonance Imaging Background Parenchymal Enhancement Assessments.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-01-01

    Background parenchymal enhancement (BPE) on magnetic resonance imaging (MRI) has been proposed as a biomarker of breast cancer risk on the basis of qualitative studies, but previous BPE quantification studies lack appropriate correlation with qualitative BPE assessments. The purpose of this study is to validate our three-dimensional BPE quantification method against standardized qualitative BPE cases. An Institutional Review Board-approved study reviewed 500 consecutive MRI cases (from January 2013-December 2014) using strict inclusion criteria, and the 120 cases that best represented each of the qualitative BPE categories (minimal, mild, moderate, or marked) were selected. Blinded to the qualitative data, fibroglandular tissue contours of precontrast and postcontrast images were delineated using an in-house, proprietary segmentation algorithm. Metrics of BPE were calculated, including %BPE ([ratio of BPE volume to fibroglandular tissue volume] × 100), at multiple threshold levels to determine the optimal cutoff point for BPE quantification that best correlated with the reference qualitative BPE cases. The highest positive correlation was present at the ×1.5 precontrast average signal intensity threshold level (r = 0.84, P < 0.001). At this level, the qualitative BPE assessments of minimal, mild, moderate, and marked corresponded to mean quantitative %BPE values of 14.1% (95% CI: 10.9-17.2), 26.1% (95% CI: 22.8-29.3), 45.9% (95% CI: 40.2-51.7), and 74.0% (95% CI: 68.6-79.5), respectively. A one-way analysis of variance with post-hoc analysis showed that at the ×1.5 precontrast average signal intensity level, the quantitative %BPE measurements best differentiated the four reference qualitative BPE groups (F [3,117] = 106.8, P < 0.001). Our three-dimensional BPE quantification methodology was validated using the reference qualitative BPE cases and could become an invaluable clinical tool to more accurately assess breast cancer risk and to
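The %BPE metric defined above ([BPE volume / fibroglandular tissue volume] × 100, with enhancement thresholded at 1.5× the mean precontrast signal) can be computed directly from voxel data. The voxel-counting implementation below is a sketch; the exact details are not specified in the abstract:

```python
import numpy as np

def percent_bpe(pre, post, fgt_mask, k=1.5):
    """%BPE = 100 * (enhancing FGT voxels) / (all FGT voxels).

    A voxel counts as background parenchymal enhancement when its
    post-contrast intensity exceeds k times the mean pre-contrast
    signal within the fibroglandular tissue (FGT) mask.
    """
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    fgt = np.asarray(fgt_mask, dtype=bool)
    threshold = k * pre[fgt].mean()
    enhancing = np.count_nonzero(post[fgt] > threshold)
    return 100.0 * enhancing / np.count_nonzero(fgt)

# toy volume: uniform precontrast signal, half of the FGT enhances to 2x
pre = np.full((4, 4), 100.0)
post = np.full((4, 4), 100.0)
post[:2, :] = 200.0
mask = np.ones((4, 4), dtype=bool)
bpe = percent_bpe(pre, post, mask)  # 50.0 (%)
```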

  3. Use of human in vitro skin models for accurate and ethical risk assessment: metabolic considerations.

    PubMed

    Hewitt, Nicola J; Edwards, Robert J; Fritsche, Ellen; Goebel, Carsten; Aeby, Pierre; Scheel, Julia; Reisinger, Kerstin; Ouédraogo, Gladys; Duche, Daniel; Eilstein, Joan; Latil, Alain; Kenny, Julia; Moore, Claire; Kuehnl, Jochen; Barroso, Joao; Fautz, Rolf; Pfuhler, Stefan

    2013-06-01

    Several human skin models employing primary cells and immortalized cell lines used as monocultures or combined to produce reconstituted 3D skin constructs have been developed. Furthermore, these models have been included in European genotoxicity and sensitization/irritation assay validation projects. In order to help interpret data, Cosmetics Europe (formerly COLIPA) facilitated research projects that measured a variety of defined phase I and II enzyme activities and created a complete proteomic profile of xenobiotic metabolizing enzymes (XMEs) in native human skin and compared them with data obtained from a number of in vitro models of human skin. Here, we have summarized our findings on the current knowledge of the metabolic capacity of native human skin and in vitro models and made an overall assessment of the metabolic capacity from gene expression, proteomic expression, and substrate metabolism data. The known low expression and function of phase I enzymes in native whole skin were reflected in the in vitro models. Some XMEs in whole skin were not detected in in vitro models and vice versa, and some major hepatic XMEs such as cytochrome P450-monooxygenases were absent or measured only at very low levels in the skin. Conversely, despite varying mRNA and protein levels of phase II enzymes, functional activity of glutathione S-transferases, N-acetyltransferase 1, and UDP-glucuronosyltransferases were all readily measurable in whole skin and in vitro skin models at activity levels similar to those measured in the liver. These projects have enabled a better understanding of the contribution of XMEs to toxicity endpoints. PMID:23539547

  4. A quantitative assessment of Arctic shipping in 2010–2014

    NASA Astrophysics Data System (ADS)

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-08-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far.

  5. Quantitative Cherenkov emission spectroscopy for tissue oxygenation assessment

    PubMed Central

    Axelsson, Johan; Glaser, Adam K.; Gladstone, David J.; Pogue, Brian W.

    2012-01-01

    Measurements of Cherenkov emission in tissue during radiation therapy are shown to enable estimation of hemoglobin oxygen saturation non-invasively, through spectral fitting of the spontaneous emissions from the treated tissue. Tissue oxygenation plays a critical role in the efficacy of radiation therapy to kill tumor tissue. Yet in-vivo measurement of this has remained elusive in routine use because of the complexity of oxygen measurement techniques. There is a spectrally broad emission of Cherenkov light that is induced during the time of irradiation, and as this travels through tissue from the point of the radiation deposition, the tissue absorption and scatter impart spectral changes. These changes can be quantified by diffuse spectral fitting of the signal. Thus Cherenkov emission spectroscopy is demonstrated for the first time quantitatively in vitro and qualitatively in vivo, and has potential for real-time online tracking of tissue oxygen during radiation therapy when fully characterized and developed. PMID:22418319

  6. Quantitative Assessment of Spray Deposition with Water-Sensitive Paper

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Spray droplets, discharged from the lower six nozzles of an airblast sprayer, were sampled on pairs of absorbent filter and water-sensitive papers (WSP) at nine distances from the sprayer. Spray deposition on filter targets was measured by fluorometry, and spray distribution on WSP targets was assessed by t...

  7. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  8. Developing a Quantitative Tool for Sustainability Assessment of HEIs

    ERIC Educational Resources Information Center

    Waheed, Bushra; Khan, Faisal I.; Veitch, Brian

    2011-01-01

    Purpose: Implementation of a sustainability paradigm demands new choices and innovative ways of thinking. The main objective of this paper is to provide a meaningful sustainability assessment tool for making informed decisions, which is applied to higher education institutions (HEIs). Design/methodology/approach: The objective is achieved by…

  9. INTEGRATED QUANTITATIVE CANCER RISK ASSESSMENT OF INORGANIC ARSENIC

    EPA Science Inventory

    This paper attempts to make an integrated risk assessment of arsenic, using data on humans exposed to arsenic via inhalation and ingestion. The data useful for making an integrated analysis, and the data gaps, are discussed. Arsenic provides a rare opportunity to compare the cancer risk ...

  10. Validation of a quantitative phosphorus loss assessment tool

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Pasture Phosphorus Management Plus (PPM Plus) is a tool that allows nutrient management and conservation planners to evaluate phosphorus loss from agricultural fields. This tool is a modified version of the widely used Soil and Water Assessment Tool (SWAT) model with a vastly simplified interface. ...

  11. Influence of storage time on DNA of Chlamydia trachomatis, Ureaplasma urealyticum, and Neisseria gonorrhoeae for accurate detection by quantitative real-time polymerase chain reaction.

    PubMed

    Lu, Y; Rong, C Z; Zhao, J Y; Lao, X J; Xie, L; Li, S; Qin, X

    2016-01-01

    The shipment and storage conditions of clinical samples pose a major challenge to the detection accuracy of Chlamydia trachomatis (CT), Neisseria gonorrhoeae (NG), and Ureaplasma urealyticum (UU) when using quantitative real-time polymerase chain reaction (qRT-PCR). The aim of the present study was to explore the influence of storage time at 4°C on the DNA of these pathogens and its effect on their detection by qRT-PCR. CT-, NG-, and UU-positive genital swabs from 70 patients were collected, and DNA from all samples was extracted and divided into eight aliquots. One aliquot was immediately analyzed with qRT-PCR to assess the initial pathogen load, whereas the remaining samples were stored at 4°C and analyzed after 1, 2, 3, 7, 14, 21, and 28 days. No significant differences in CT, NG, and UU DNA loads were observed between baseline (day 0) and the subsequent time points (days 1, 2, 3, 7, 14, 21, and 28) in any of the 70 samples. Although a slight increase in DNA levels was observed at day 28 compared to day 0, paired sample t-test results revealed no significant differences between the mean DNA levels at different time points following storage at 4°C (all P>0.05). Overall, the CT, UU, and NG DNA loads from all genital swab samples were stable at 4°C over a 28-day period. PMID:27580005
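The paired sample t-test used to compare day-0 and day-28 loads amounts to a few lines of arithmetic. The data below are made up for illustration only (the abstract reports just P>0.05, not the underlying loads):

```python
import math

def paired_t(x, y):
    """Paired t statistic for matched measurements x and y."""
    d = [b - a for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)  # sample variance of diffs
    return mean / math.sqrt(var / n)

# illustrative (made-up) log10 DNA loads for the same eight swabs
day0  = [5.1, 4.8, 6.0, 5.5, 4.9, 5.7, 5.2, 5.9]
day28 = [5.2, 4.7, 6.0, 5.6, 4.9, 5.8, 5.1, 5.9]

t = paired_t(day0, day28)
# |t| below 2.365 (two-sided 5% critical value for df = 7) would be read
# as no significant change in load, matching the study's conclusion
```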

  12. Influence of storage time on DNA of Chlamydia trachomatis, Ureaplasma urealyticum, and Neisseria gonorrhoeae for accurate detection by quantitative real-time polymerase chain reaction

    PubMed Central

    Lu, Y.; Rong, C.Z.; Zhao, J.Y.; Lao, X.J.; Xie, L.; Li, S.; Qin, X.

    2016-01-01

    The shipment and storage conditions of clinical samples pose a major challenge to the detection accuracy of Chlamydia trachomatis (CT), Neisseria gonorrhoeae (NG), and Ureaplasma urealyticum (UU) when using quantitative real-time polymerase chain reaction (qRT-PCR). The aim of the present study was to explore the influence of storage time at 4°C on the DNA of these pathogens and its effect on their detection by qRT-PCR. CT-, NG-, and UU-positive genital swabs from 70 patients were collected, and DNA from all samples was extracted and divided into eight aliquots. One aliquot was immediately analyzed with qRT-PCR to assess the initial pathogen load, whereas the remaining samples were stored at 4°C and analyzed after 1, 2, 3, 7, 14, 21, and 28 days. No significant differences in CT, NG, and UU DNA loads were observed between baseline (day 0) and the subsequent time points (days 1, 2, 3, 7, 14, 21, and 28) in any of the 70 samples. Although a slight increase in DNA levels was observed at day 28 compared to day 0, paired sample t-test results revealed no significant differences between the mean DNA levels at different time points following storage at 4°C (all P>0.05). Overall, the CT, UU, and NG DNA loads from all genital swab samples were stable at 4°C over a 28-day period. PMID:27580005

  13. High Resolution Peripheral Quantitative Computed Tomography for Assessment of Bone Quality

    NASA Astrophysics Data System (ADS)

    Kazakia, Galateia

    2014-03-01

    The study of bone quality is motivated by the high morbidity, mortality, and societal cost of skeletal fractures. Over 10 million people are diagnosed with osteoporosis in the US alone, suffering 1.5 million osteoporotic fractures and costing the health care system over $17 billion annually. Accurate assessment of fracture risk is necessary to ensure that pharmacological and other interventions are appropriately administered. Currently, areal bone mineral density (aBMD) based on 2D dual-energy X-ray absorptiometry (DXA) is used to determine osteoporotic status and predict fracture risk. Though aBMD is a significant predictor of fracture risk, it does not completely explain bone strength or fracture incidence. The major limitation of aBMD is the lack of 3D information, which is necessary to distinguish between cortical and trabecular bone and to quantify bone geometry and microarchitecture. High resolution peripheral quantitative computed tomography (HR-pQCT) enables in vivo assessment of volumetric BMD within specific bone compartments as well as quantification of geometric and microarchitectural measures of bone quality. HR-pQCT studies have documented that trabecular bone microstructure alterations are associated with fracture risk independent of aBMD. Cortical bone microstructure (specifically, porosity) is a major determinant of strength, stiffness, and fracture toughness of cortical tissue and may further explain the aBMD-independent effect of age on bone fragility and fracture risk. The application of finite element analysis (FEA) to HR-pQCT data permits estimation of patient-specific bone strength, shown to be associated with fracture incidence independent of aBMD. This talk will describe the HR-pQCT scanner, established metrics of bone quality derived from HR-pQCT data, and novel analyses of bone quality currently in development. Cross-sectional and longitudinal HR-pQCT studies investigating the impact of aging, disease, injury, gender, race, and

  14. Quantitative dose-response assessment of inhalation exposures to toxic air pollutants

    SciTech Connect

    Jarabek, A.M.; Foureman, G.L.; Gift, J.S.; Guth, D.J.

    1997-12-31

    Implementation of the 1990 Clean Air Act Amendments, including evaluation of residual risks, requires accurate human health risk estimates of both acute and chronic inhalation exposures to toxic air pollutants. The U.S. Environmental Protection Agency's National Center for Environmental Assessment, Research Triangle Park, NC, has a research program that addresses several key issues for development of improved quantitative approaches for dose-response assessment. This paper describes three projects underway in the program. Project A describes a Bayesian approach that was developed to base dose-response estimates on combined data sets and that expresses these estimates as probability density functions. A categorical regression model has been developed that allows for the combination of all available acute data, with toxicity expressed as severity categories (e.g., mild, moderate, severe), and with both duration and concentration as governing factors. Project C encompasses two refinements to uncertainty factors (UFs) often applied to extrapolate dose-response estimates from laboratory animal data to human equivalent concentrations. Traditional UFs have been based on analyses of oral administration and may not be appropriate for extrapolation of inhalation exposures. Refinement of the UF applied to account for the use of subchronic rather than chronic data was based on an analysis of data from inhalation exposures (Project C-1). Mathematical modeling using the benchmark dose (BMD) approach was used to calculate the dose-response estimates for comparison between the subchronic and chronic data so that the estimates were not subject to dose-spacing or sample size variability. The second UF that was refined for extrapolation of inhalation data was the adjustment for the use of a LOAEL rather than a NOAEL (Project C-2).

  15. Automated quantitative assessment of cardiovascular magnetic resonance-derived atrioventricular junction velocities.

    PubMed

    Leng, Shuang; Zhao, Xiao-Dan; Huang, Fei-Qiong; Wong, Jia-Ing; Su, Bo-Yang; Allen, John Carson; Kassab, Ghassan S; Tan, Ru-San; Zhong, Liang

    2015-12-01

    The assessment of atrioventricular junction (AVJ) deformation plays an important role in evaluating left ventricular systolic and diastolic function in clinical practice. This study aims to demonstrate the effectiveness and consistency of cardiovascular magnetic resonance (CMR) for quantitative assessment of AVJ velocity compared with tissue Doppler echocardiography (TDE). A group of 145 human subjects comprising 21 healthy volunteers, 8 patients with heart failure (HF), 17 patients with hypertrophic cardiomyopathy, 52 patients with myocardial infarction, and 47 patients with repaired Tetralogy of Fallot were prospectively enrolled and underwent TDE and CMR scan. Six AVJ points were tracked with three CMR views. The peak systolic velocity (Sm1), diastolic velocity during early diastolic filling (Em), and late diastolic velocity during atrial contraction (Am) were extracted and analyzed. All CMR-derived septal and lateral AVJ velocities correlated well with TDE measurements (Sm1: r = 0.736; Em: r = 0.835; Am: r = 0.701; Em/Am: r = 0.691; all p < 0.001) and demonstrated excellent reproducibility [intrastudy: r = 0.921-0.991, intraclass correlation coefficient (ICC): 0.918-0.991; interstudy: r = 0.900-0.970, ICC: 0.887-0.957; all p < 0.001]. The evaluation of three-dimensional AVJ motion incorporating measurements from all views better differentiated normal and diseased states [area under the curve (AUC) = 0.918] and provided further insights into mechanical dyssynchrony diagnosis in HF patients (AUC = 0.987). These findings suggest that the CMR-based method is feasible, accurate, and consistent in quantifying the AVJ deformation, and subsequently in diagnosing systolic and diastolic cardiac dysfunction. PMID:26408537

  16. Quantitative Evaluation of MODIS Fire Radiative Power Measurement for Global Smoke Emissions Assessment

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Ellison, Luke

    2011-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and their emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications. In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate

  17. A quantitative assessment of Arctic shipping in 2010-2014.

    PubMed

    Eguíluz, Victor M; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011-2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  18. A quantitative assessment of Arctic shipping in 2010–2014

    PubMed Central

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  19. Quantitative ultrasound assessment of thermal damage in excised liver

    NASA Astrophysics Data System (ADS)

    Kemmerer, Jeremy P.; Ghoshal, Goutam; Oelze, Michael L.

    2012-10-01

    Quantitative ultrasound (QUS) is a novel approach for characterizing tissue microstructure and changes in tissue microstructure due to therapy. In this report, we discuss changes in QUS parameters in liver tissues after being exposed to thermal insult. Effective scatterer diameter (ESD) and effective acoustic concentration (EAC) from the normalized backscattered power spectrum were examined in rat liver specimens heated in a degassed saline bath. Individual liver samples were bisected, with half of each sample heated to a therapeutic temperature of 60°C for 10 minutes and the other half held at 37°C. The ultrasonic backscatter and attenuation coefficient were then estimated at 37°C from both halves. ESD was observed to decrease by an average of 34% in exposed compared to unexposed sample sections, EAC increased by 18 dB, and the attenuation coefficient increased by 70%. Histological slides from these samples indicate cell size and/or concentration may be affected by heating. This work was supported by NIH R01-EB008992.

  20. Quantitative Assessment of Cytosolic Salmonella in Epithelial Cells

    PubMed Central

    Knodler, Leigh A.; Nair, Vinod; Steele-Mortimer, Olivia

    2014-01-01

    Within mammalian cells, Salmonella enterica serovar Typhimurium (S. Typhimurium) inhabits a membrane-bound vacuole known as the Salmonella-containing vacuole (SCV). We have recently shown that wild type S. Typhimurium also colonizes the cytosol of epithelial cells. Here we sought to quantify the contribution of cytosolic Salmonella to the total population over a time course of infection in different epithelial cell lines and under conditions of altered vacuolar escape. We found that the lysosomotropic agent, chloroquine, acts on vacuolar, but not cytosolic, Salmonella. After chloroquine treatment, vacuolar bacteria are not transcriptionally active or replicative and appear degraded. Using a chloroquine resistance assay, in addition to digitonin permeabilization, we found that S. Typhimurium lyses its nascent vacuole in numerous epithelial cell lines, albeit with different frequencies, and hyper-replication in the cytosol is also widespread. At later times post-infection, cytosolic bacteria account for half of the total population in some epithelial cell lines, namely HeLa and Caco-2 C2Bbe1. Both techniques accurately measured increased vacuole lysis in epithelial cells upon treatment with wortmannin. By chloroquine resistance assay, we also determined that Salmonella pathogenicity island-1 (SPI-1), but not SPI-2, the virulence plasmid nor the flagellar apparatus, was required for vacuolar escape and cytosolic replication in epithelial cells. Together, digitonin permeabilization and the chloroquine resistance assay will be useful, complementary tools for deciphering the mechanisms of SCV lysis and Salmonella replication in the epithelial cell cytosol. PMID:24400108

  1. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    PubMed

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models for observations, conditioned on pose. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level. PMID:15376934

  2. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively characterize and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies capture changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess, and characterize muscle degeneration. PMID:27478562
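The HU-based composition analysis described above can be sketched as voxel classification into HU windows. The window boundaries below are illustrative assumptions, not the values used in the studies reviewed:

```python
import numpy as np

# Illustrative HU windows (assumed, not the papers' exact values):
WINDOWS = {
    "fat": (-200, -30),
    "connective/atrophic": (-29, 29),
    "normal muscle": (30, 150),
}

def muscle_composition(hu_voxels):
    """Percent of muscle-compartment voxels falling in each HU window."""
    hu = np.asarray(hu_voxels)
    total = hu.size
    return {name: 100.0 * np.count_nonzero((hu >= lo) & (hu <= hi)) / total
            for name, (lo, hi) in WINDOWS.items()}

comp = muscle_composition([-100, -50, 0, 10, 50, 100])
# each window receives a third of the six example voxels
```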

  3. How accurate are we at assessing others’ well-being? The example of welfare assessment in horses

    PubMed Central

    Lesimple, Clémence; Hausberger, Martine

    2014-01-01

    Healthcare practitioners such as physicians or nurses often underestimate patients’ well-being impairment (e.g., pain, anxiety), which may lead to undesirable consequences for treatment decisions. Lack of recognition/identification of signals and over-exposure are two reasons invoked, but a combination of factors may be involved. Studies of human decoding of animals’ expressions of emotions have shown that “identification” with the subject is necessary to decode the other’s internal state. In the present study we compared caretakers’ reports on the prevalence of stereotypic or abnormal repetitive behaviors (SB/ARB) with ethological observations performed by an experienced observer on the same horses, in order to test the impact of these different factors. On the one hand, a questionnaire was handed directly to the caretakers; on the other hand, the experienced observer spent 18 h observing the horses in each stable. Here we show that caretakers strongly underestimate horses’ expressions of well-being impairment. The caretakers who had a strong concern about their horses’ well-being were also those who reported SB/ARB prevalence most accurately, showing that “identification” with the subject is a primary factor in detecting signals of impaired well-being. Over-exposure also appeared to be involved, as no SB/ARB was reported in stables where most of the horses were performing these abnormal behaviors. Being surrounded by a large population of individuals expressing clear signals of impaired well-being may change professionals’ perceptions of what constitutes behaviors or expressions of well-being. These findings are of primary importance as (1) they illustrate the interest of using human-animal relationships to evaluate humans’ abilities to decode others’ states; (2) they put limitations on questionnaire-based studies of welfare. PMID:24478748

  5. Quantitative Assessment of Fat Infiltration in the Rotator Cuff Muscles using water-fat MRI

    PubMed Central

    Nardo, Lorenzo; Karampinos, Dimitrios C.; Lansdown, Drew A.; Carballido-Gamio, Julio; Lee, Sonia; Maroldi, Roberto; Ma, C. Benjamin; Link, Thomas M.; Krug, Roland

    2013-01-01

    Purpose: To evaluate a chemical shift-based fat quantification technique in the rotator cuff muscles in comparison with the semi-quantitative Goutallier fat infiltration classification (GC) and to assess their relationship with clinical parameters. Materials and Methods: The shoulders of 57 patients were imaged using a 3T MR scanner. The rotator cuff muscles were assessed for fat infiltration using GC by two radiologists and an orthopedic surgeon. Sequences included oblique-sagittal T1-, T2- and proton density-weighted fast spin echo, and six-echo gradient echo. The iterative decomposition of water and fat with echo asymmetry and least-squares estimation (IDEAL) was used to measure fat fraction. Pain and range of motion of the shoulder were recorded. Results: Fat fraction values were significantly correlated with GC grades (p< 0.0001, kappa>0.9) showing consistent increase with GC grades (grade=0, 0%–5.59%; grade=1, 1.1%–9.70%; grade=2, 6.44%–14.86%; grade=3, 15.25%–17.77%; grade=4, 19.85%–29.63%). A significant correlation between fat infiltration of the subscapularis muscle quantified with IDEAL versus a) deficit in internal rotation (Spearman Rank Correlation Coefficient=0.39, 95% CI 0.13–0.60, p<0.01) and b) pain (Spearman Rank Correlation coefficient=0.313, 95% CI 0.049–0.536, p=0.02) was found but was not seen between the clinical parameters and GC grades. Additionally, only quantitative fat infiltration measures of the supraspinatus muscle were significantly correlated with a deficit in abduction (Spearman Rank Correlation Coefficient=0.45, 95% CI 0.20–0.60, p<0.01). Conclusion: We concluded that an accurate and highly reproducible fat quantification in the rotator cuff muscles using water-fat MRI techniques is possible and significantly correlates with shoulder pain and range of motion. PMID:24115490
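
    A correlation of the kind reported here (fat fraction versus range-of-motion deficit) is a Spearman rank correlation; the patient data below are hypothetical, for illustration only:

```python
import numpy as np
from scipy import stats

# Hypothetical data: subscapularis fat fraction (%) and deficit in
# internal rotation (degrees) for eight patients.
fat_fraction = np.array([2.1, 4.5, 7.0, 9.8, 14.2, 16.5, 21.0, 25.3])
rotation_deficit = np.array([0, 5, 3, 10, 12, 18, 15, 25])

# Spearman's rho works on ranks, so it captures monotone (not just
# linear) association between fat infiltration and clinical deficit.
rho, p = stats.spearmanr(fat_fraction, rotation_deficit)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```

    With real data one would also report a confidence interval for rho, as the abstract does.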

  6. Floods characterization: from impact data to quantitative assessment

    NASA Astrophysics Data System (ADS)

    Llasat, Maria-Carmen; Gilabert, Joan; Llasat-Botija, Montserrat; Marcos, Raül; Quintana-Seguí, Pere; Turco, Marco

    2015-04-01

    This study is based on the following flood databases from Catalonia: INUNGAMA (1900-2010) which considers 372 floods (Llasat et al, 2014), PRESSGAMA (1981-2010) and HISTOGAMA (from XIV Century on) - built as part of SPHERE project and recently updated. These databases store information about flood impacts (among others) and classify them by their severity (catastrophic, extraordinary and ordinary) by means of an indicators matrix based on other studies (i.e. Petrucci et al, 2013; Llasat et al, 2013). On this research we present a comparison between flood impacts, flow data and rainfall data on a Catalan scale and particularly for the basins of Segre, Muga, Ter and Llobregat (Western Mediterranean). From a bottom-up approach, a statistical methodology has been built (trend analysis, measures of position, cumulative distribution functions and geostatistics) in order to identify quantitative thresholds that will make possible to classify the floods. The purpose of this study is to establish generic thresholds for the whole Catalan region, for this we have selected rainfall maximums of flooding episodes stored at INUNGAMA and they have been related to flood categories by boxplot diagrams. Regarding the stream flow, we have established a relation between impacts and return periods at the day when the flow is maximum. The aim is to homogenize and compare the different drainage basins and to obtain general thresholds. It is also presented detailed analyses of relations between flooding episodes, flood classification and weather typing schemes - based in Jenkinson and Collison classification (applied to the Iberian Peninsula by Spellmann, 2000). In this way it could be analyzed whether patterns for the different types of floods exist or not. Finally, this work has pointed out the need of defining a new category for the most severe episodes.

  7. Quantitative phase imaging technologies to assess neuronal activity (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Thouvenin, Olivier; Fink, Mathias; Boccara, Claude

    2016-03-01

    Active neurons tends to have a different dynamical behavior compared to resting ones. Non-exhaustively, vesicular transport towards the synapses is increased, since axonal growth becomes slower. Previous studies also reported small phase variations occurring simultaneously with the action potential. Such changes exhibit times scales ranging from milliseconds to several seconds on spatial scales smaller than the optical diffraction limit. Therefore, QPI systems are of particular interest to measure neuronal activity without labels. Here, we report the development of two new QPI systems that should enable the detection of such activity. Both systems can acquire full field phase images with a sub nanometer sensitivity at a few hundreds of frames per second. The first setup is a synchronous combination of Full Field Optical Coherence Tomography (FF-OCT) and Fluorescence wide field imaging. The latter modality enables the measurement of neurons electrical activity using calcium indicators. In cultures, FF-OCT exhibits similar features to Digital Holographic Microscopy (DHM), except from complex computational reconstruction. However, FF-OCT is of particular interest in order to measure phase variations in tissues. The second setup is based on a Quantitative Differential Interference Contrast setup mounted in an epi-illumination configuration with a spectrally incoherent illumination. Such a common path interferometer exhibits a very good mechanical stability, and thus enables the measurement of phase images during hours. Additionally, such setup can not only measure a height change, but also an optical index change for both polarization. Hence, one can measure simultaneously a phase change and a birefringence change.

  8. A quantitative epigenetic approach for the assessment of cigarette consumption

    PubMed Central

    Philibert, Robert; Hollenbeck, Nancy; Andersen, Eleanor; Osborn, Terry; Gerrard, Meg; Gibbons, Frederick X.; Wang, Kai

    2015-01-01

    Smoking is the largest preventable cause of morbidity and mortality in the world. Despite the development of numerous preventive and treatment interventions, the rate of daily smoking in the United States is still approximately 22%. Effective psychosocial interventions and pharmacologic agents exist for the prevention and treatment of smoking. Unfortunately, both approaches are hindered by our inability to accurately quantify the amount of cigarette consumption from the point of initial experimentation to the point of total dependency. Recently, we and others have demonstrated that smoking is associated with genome-wide changes in DNA methylation. However, whether this advance in basic science can be employed as a reliable assay that is useful for clinical diagnosis and treatment has not been shown. In this communication, we determine the sensitivity and specificity of five of the most consistently replicated CpG loci with respect to smoking status using data from a publicly available dataset. We show that methylation status at a CpG locus in the aryl hydrocarbon receptor repressor, cg05575921, is both sensitive and specific for smoking status in adults, with a receiver operating characteristic (ROC) area under the curve of 0.99. Given recent demonstrations that methylation at this locus reflects both intensity of smoking and the degree of smoking cessation, we conclude that a methylation-based diagnostic at this locus could have a prominent role in understanding the impact of new products, such as e-cigarettes, on initiation of cigarette smoking among adolescents, while improving the prevention and treatment of smoking and smoking-related disorders. PMID:26082730
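
    The reported performance is a standard ROC area under the curve, which for a single continuous marker equals the Mann-Whitney probability of correctly ranking a case/control pair. A minimal sketch with hypothetical methylation (beta) values, not the study's data:

```python
import numpy as np

def auc_from_scores(scores_pos, scores_neg):
    """AUC via the Mann-Whitney statistic: the probability that a randomly
    chosen smoker has a LOWER methylation value than a randomly chosen
    nonsmoker (cg05575921 is demethylated in smokers)."""
    pos = np.asarray(scores_pos, float)  # smokers
    neg = np.asarray(scores_neg, float)  # nonsmokers
    wins = sum((p < n) for p in pos for n in neg)
    ties = sum((p == n) for p in pos for n in neg)
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical beta values at cg05575921.
smokers = [0.45, 0.50, 0.55, 0.60, 0.62]
nonsmokers = [0.80, 0.85, 0.88, 0.90, 0.92]
print(auc_from_scores(smokers, nonsmokers))
```

    Here every smoker's value falls below every nonsmoker's, so the toy AUC is 1.0; the study's 0.99 corresponds to near-perfect but not complete separation.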

  9. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  11. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  12. A quantitative assessment of results with the Angelchik prosthesis.

    PubMed Central

    Wyllie, J. H.; Edwards, D. A.

    1985-01-01

    The Angelchik antireflux prosthesis was assessed in 15 unpromising patients, 12 of whom had peptic strictures of the oesophagus. Radiological techniques were used to show the effect of the device on gastro-oesophageal reflux, and on the bore and length of strictures. Twelve months later (range 6-24) most patients were well satisfied with the operation, and all considered it had been worthwhile; there was radiological evidence of reduction in reflux and remission of strictures. The device never surrounded the oesophageal sphincter; in all but 1 case it encircled a tube of stomach. PMID:4037629

  13. New Trends in Quantitative Assessment of the Corneal Barrier Function

    PubMed Central

    Guimerà, Anton; Illa, Xavi; Traver, Estefania; Herrero, Carmen; Maldonado, Miguel J.; Villa, Rosa

    2014-01-01

    The cornea is a very particular tissue due to its transparency and its barrier function, as it has to resist the daily insults of the external environment. In addition, maintenance of this barrier function is of crucial importance to ensure correct corneal homeostasis. Here, the corneal epithelial permeability has been assessed in vivo by means of non-invasive tetrapolar impedance measurements, taking advantage of the huge impact of ion fluxes on the passive electrical properties of living tissues. This has been possible by using a flexible sensor based on SU-8 photoresist. In this work, a further analysis focused on validating the presented sensor is performed by monitoring the healing process of corneas that were previously wounded. The obtained impedance measurements have been compared with the damaged area observed in corneal fluorescein staining images. The successful results confirm the feasibility of this novel method, as it represents a more sensitive in vivo and non-invasive test to assess small alterations of the epithelial permeability. It could thus be used as an excellent complement to fluorescein staining image evaluation. PMID:24841249

  14. Quantitative Assessment of Workload and Stressors in Clinical Radiation Oncology

    SciTech Connect

    Mazur, Lukasz M.; Mosaly, Prithima R.; Jackson, Marianne; Chang, Sha X.; Burkhardt, Katharin Deschesne; Adams, Robert D.; Jones, Ellen L.; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B.

    2012-08-01

    Purpose: Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Methods and Materials: Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task-Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and the Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). Results: A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P=.045).
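
    The NASA TLX composite used above is a weighted average of six subscale ratings, with weights derived from 15 pairwise comparisons (so the weights sum to 15); the ratings and weights below are hypothetical:

```python
# Hypothetical NASA TLX subscale ratings (0-100 scale).
ratings = {"mental": 70, "physical": 30, "temporal": 60,
           "performance": 40, "effort": 65, "frustration": 35}

# Hypothetical pairwise-comparison weights; the 15 comparisons
# distribute 15 "wins" across the six subscales.
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}

def nasa_tlx(ratings, weights):
    """Weighted NASA TLX composite score on the 0-100 scale."""
    assert sum(weights.values()) == 15
    return sum(ratings[k] * weights[k] for k in ratings) / 15.0

print(nasa_tlx(ratings, weights))
```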

  15. Quantitative risk assessment of FMD virus transmission via water.

    PubMed

    Schijven, Jack; Rijs, Gerard B J; de Roda Husman, Ana Maria

    2005-02-01

    Foot-and-mouth disease (FMD) is a viral disease of domesticated and wild cloven-hoofed animals. FMD virus is known to spread by direct contact between infected and susceptible animals, by animal products such as meat and milk, by the airborne route, and by mechanical transfer on people, wild animals, birds, and vehicles. During the outbreak of 2001 in the Netherlands, milk from dairy cattle was illegally discharged into the sewerage system as a consequence of the transport prohibition. This may lead to contaminated discharges of biologically treated and raw sewage into surface water that is given to cattle to drink. The objective of the present study was to assess the probability of infecting dairy cows drinking FMD virus-contaminated surface water due to illegal discharges of contaminated milk. The following data were therefore collected from the literature: FMD virus inactivation in aqueous environments, FMD virus concentrations in milk, dilution in sewage water, virus removal by sewage treatment, dilution in surface water, water consumption of cows, size of a herd in a meadow, and dose-response data for ingested FMD virus in cattle. In the case of 1.6 x 10(2) FMD virus per milliliter in milk and discharge of treated sewage into surface water, the probability of infecting a herd of cows was estimated to be 3.3 x 10(-7) to 8.5 x 10(-5), depending on dilution in the receiving surface water. In the case of discharge of raw sewage, all probabilities of infection were 100 times higher. In the case of little dilution in small rivers, the high level of 8.5 x 10(-3) is reached. For 10(4) times higher FMD virus concentrations in milk, the probabilities of infecting a herd of cows are high in the case of discharge of treated sewage (3.3 x 10(-3) to 5.7 x 10(-1)) and very high in the case of discharge of raw sewage (0.28-1.0). It can be concluded that illegal and uncontrolled discharges of contaminated milk into the sewerage system may lead to high risks to other cattle farms at 6-50 km
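
    The chain of steps in such a QMRA (concentration in milk, dilution, treatment removal, ingested dose, dose-response) can be sketched with an exponential dose-response model, a common choice in QMRA. All parameter values below are illustrative, not the study's fitted values:

```python
import numpy as np

def p_infection_exponential(dose, r):
    """Exponential dose-response model: P(infection) = 1 - exp(-r * dose)."""
    return 1.0 - np.exp(-r * dose)

def herd_infection_probability(conc_milk, dilution, removal, volume_l, n_cows, r):
    """Chain the QMRA steps: virus/mL in milk -> dilution in sewage and
    surface water -> fraction surviving treatment -> dose per cow -> herd risk."""
    conc_water = conc_milk * 1000 * removal / dilution  # virus per litre
    dose_per_cow = conc_water * volume_l
    p_cow = p_infection_exponential(dose_per_cow, r)
    # Probability that at least one cow in the herd is infected.
    return 1.0 - (1.0 - p_cow) ** n_cows

# Illustrative parameters only.
p = herd_infection_probability(conc_milk=160, dilution=1e8, removal=0.01,
                               volume_l=60, n_cows=50, r=1e-3)
print(p)
```

    Varying the dilution factor over the plausible range for the receiving water would reproduce the kind of probability interval the abstract reports.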

  16. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates mean values of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that CNG buses are more prone to fire fatality risk by 2.5 times that of diesel buses, with the bus passengers being more at risk by over two orders of magnitude. The study estimates a mean fire risk frequency of 2.2 x 10(-5) fatalities/bus per year. The 5% and 95% uncertainty bounds are 9.1 x 10(-6) and 4.0 x 10(-5), respectively. The risk result was found to be affected most by failure rates of pressure relief valves, CNG cylinders, and fuel piping. PMID:15876211

  17. Quantitative assessment of the retinal microvasculature using optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Chu, Zhongdi; Lin, Jason; Gao, Chen; Xin, Chen; Zhang, Qinqin; Chen, Chieh-Li; Roisman, Luis; Gregori, Giovanni; Rosenfeld, Philip J.; Wang, Ruikang K.

    2016-06-01

    Optical coherence tomography angiography (OCTA) is clinically useful for the qualitative assessment of the macular microvasculature. However, there is a need for comprehensive quantitative tools to help objectively analyze OCT angiograms. Few studies have reported the use of a single quantitative index to describe vessel density in OCT angiograms. In this study, we introduce a five-index quantitative analysis of OCT angiograms in an attempt to detect and assess vascular abnormalities from multiple perspectives. The indices include vessel area density, vessel skeleton density, vessel diameter index, vessel perimeter index, and vessel complexity index. We show the usefulness of the proposed indices with five illustrative cases. Repeatability is tested on both a healthy case and a stable diseased case, giving interclass coefficients smaller than 0.031. The results demonstrate that our proposed quantitative analysis may be useful as a complement to conventional OCTA for the diagnosis of disease and monitoring of treatment.
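
    Two of the five indices can be sketched from a binary vessel mask; the definitions below are simplified stand-ins for the paper's formulations:

```python
import numpy as np

def vessel_area_density(mask):
    """Vessel area density: vessel pixels / total pixels in the angiogram."""
    return float(mask.sum() / mask.size)

def vessel_perimeter_index(mask):
    """Boundary pixels (vessel pixels with at least one 4-connected
    background neighbour) per total image area -- a simplified stand-in
    for the paper's vessel perimeter index."""
    padded = np.pad(mask, 1)  # pad with background
    core = padded[1:-1, 1:-1]
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    perimeter = core & ~interior
    return float(perimeter.sum() / mask.size)

# Toy binary angiogram: a single horizontal "vessel".
mask = np.zeros((8, 8), dtype=bool)
mask[3:5, 1:7] = True
print(vessel_area_density(mask), vessel_perimeter_index(mask))
```

    The skeleton, diameter, and complexity indices additionally require a centerline (skeletonized) mask, which a morphology library would provide.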

  18. Accurate dose assessment system for an exposed person utilising radiation transport calculation codes in emergency response to a radiological accident.

    PubMed

    Takahashi, F; Shigemori, Y; Seki, A

    2009-01-01

    A system has been developed to assess the radiation dose distribution inside the body of persons exposed in a radiological accident, utilising the radiation transport calculation codes MCNP and MCNPX. The system consists mainly of two parts, a pre-processor and a post-processor for the radiation transport calculation. Programs in the pre-processor are used to set up a 'problem-dependent' input file, which defines the accident conditions and the dosimetric quantities to be estimated. The program developed for the post-processor part can effectively display dose information based upon the output file of the code. All of the programs in the dosimetry system can be executed on an ordinary personal computer and accurately give the dose profile of an exposed person in a radiological accident without complicated procedures. An experiment using a physical phantom was carried out to verify the availability of the dosimetry system with the developed programs in a gamma-ray irradiation field. PMID:19181661

  19. Quantitative assessment of the serve speed in tennis.

    PubMed

    Vaverka, Frantisek; Cernosek, Miroslav

    2016-01-01

    A method is presented for assessing the serve speeds of tennis players based on their body height. The research involved a sample of top world players (221 males and 215 females) who participated in the Grand Slam tournaments in 2008 and 2012. The method is based on a linear regression analysis of the association between the player's body height and the serve speed (fastest serve, average first-serve, and second-serve speed). The coefficient of serve speed (CSS) was calculated as the quotient of the measured serve speed and the theoretical value on the regression line for the player's body height. CSS values of >1, 1 and <1 indicate above-average, average, and below-average serve speeds, respectively, relative to top world tennis players of the same body height. The CSS adds a new element to the already existing statistics about a tennis match and provides additional information about the performance of tennis players. The CSS can be utilised, e.g., for setting the target serve speed a given player should achieve based on his/her body height, for choosing the most appropriate match strategy against a particular player, and for long-term monitoring of the effectiveness of training focused on serve speed. PMID:26879039
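
    The CSS computation (measured serve speed divided by the regression prediction for the player's height) can be sketched as follows; the player data are hypothetical, not the Grand Slam sample:

```python
import numpy as np

# Hypothetical (body height in cm, fastest serve in km/h) pairs.
height = np.array([178, 183, 185, 188, 190, 193, 196, 198], float)
speed = np.array([195, 201, 205, 208, 211, 214, 219, 221], float)

# Fit the linear regression: speed = a * height + b.
a, b = np.polyfit(height, speed, 1)

def css(player_height, player_speed):
    """Coefficient of serve speed: measured speed / regression prediction.
    CSS > 1 means the serve is faster than expected for that body height."""
    return player_speed / (a * player_height + b)

# A 185 cm player serving 212 km/h sits above the regression line.
print(css(185, 212))
```

    A player whose measured speed lies exactly on the regression line gets CSS = 1 by construction.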

  20. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  1. Validation of a quantitative phosphorus loss assessment tool.

    PubMed

    White, Michael J; Storm, Daniel E; Smolen, Michael D; Busteed, Philip R; Zhang, Hailin; Fox, Garey A

    2014-01-01

    Pasture Phosphorus Management Plus (PPM Plus) is a tool that allows nutrient management and conservation planners to evaluate phosphorus (P) loss from agricultural fields. This tool uses a modified version of the widely used Soil and Water Assessment Tool model with a vastly simplified interface. The development of PPM Plus has been fully described in previous publications; in this article we evaluate the accuracy of PPM Plus using 286 field-years of runoff, sediment, and P validation data from runoff studies at various locations in Oklahoma, Texas, Arkansas, and Georgia. Land uses include pasture, small grains, and row crops, with rainfall ranging from 630 to 1390 mm yr(-1), with and without animal manure application. PPM Plus explained 68% of the variability in total P loss, 56% of the variability in runoff, and 73% of the variability in sediment yield. An empirical model developed from these data using soil test P, total applied P, slope, and precipitation accounted for only 15% of the variability in total P loss, which implies that a process-based model is required to account for the diversity present in these data. PPM Plus is an easy-to-use conservation planning tool for P loss prediction, which, with modification, could be applicable at regional and national scales. PMID:25602555
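
    The "percentage of variability explained" figures above correspond to the coefficient of determination, R-squared, computed from observed and model-predicted values; the toy numbers below are illustrative:

```python
import numpy as np

def r_squared(observed, predicted):
    """Fraction of variability explained: 1 - SS_res / SS_tot."""
    obs = np.asarray(observed, float)
    pred = np.asarray(predicted, float)
    ss_res = np.sum((obs - pred) ** 2)   # residual sum of squares
    ss_tot = np.sum((obs - obs.mean()) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Toy observed vs. model-predicted P loss values.
obs = np.array([1.0, 2.0, 3.0, 4.0])
pred = np.array([1.1, 1.9, 3.2, 3.8])
print(r_squared(obs, pred))
```

    A perfect model gives R-squared = 1; predicting the mean everywhere gives 0.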

  2. Bioelectrical impedance is an accurate method to assess body composition in obese but not severely obese adolescents.

    PubMed

    Verney, Julien; Metz, Lore; Chaplais, Elodie; Cardenoux, Charlotte; Pereira, Bruno; Thivel, David

    2016-07-01

    The aim of this study was to compare total and segmental body composition results between bioimpedance analysis (BIA) and dual x-ray absorptiometry (DXA) scans and to test the reproducibility of BIA in obese adolescents. We hypothesized that BIA offers an accurate and reproducible method to assess body composition in adolescents with obesity. Whole-body and segmental body compositions were assessed by BIA (Tanita MC-780) and DXA (Hologic) among 138 (110 girls and 28 boys) obese adolescents (Tanner stage 3-5) aged 14 ± 1.5 years. The BIA analysis was replicated on 3 identical occasions in 32 participants to test the reproducibility of the method. Whole-body fat mass percentage was significantly higher using the BIA method compared with DXA (40.6 ± 7.8 vs 38.8 ± 4.9%, P<.001), which represents a 4.8% overestimation by the BIA technique compared with DXA. Similarly, fat mass expressed in kilograms is overestimated by 2.8% using BIA (35.8 ± 11.7 kg) compared with the DXA measure (34.3 ± 8.7 kg) (P<.001), and fat-free mass is underestimated by -6.1% using BIA (P<.001). Except for the right arm and leg percentage of fat mass, all segmental measures of body composition differ significantly between the 2 methods. The intraclass correlation coefficient and Lin's concordance coefficient showed great agreement and concordance between both methods in assessing whole-body composition. Intraclass correlation coefficients between the 3 BIA measures ranged from 0.99 to 1 for body weight, body fat, and fat-free mass. Bioimpedance analysis offers an acceptable and reproducible alternative for assessing body composition in obese adolescents, although the correlation between BIA and DXA weakens with increasing body fat; its validity remains uncertain for segmental analysis among obese youth. PMID:27333957

  3. Real Time Quantitative Radiological Monitoring Equipment for Environmental Assessment

    SciTech Connect

    John R. Giles; Lyle G. Roybal; Michael V. Carpenter

    2006-03-01

    The Idaho National Laboratory (INL) has developed a suite of systems that rapidly scan, analyze, and characterize radiological contamination in soil. These systems have been successfully deployed at several Department of Energy (DOE) laboratories and Cold War Legacy closure sites. Traditionally, these systems have been used during the characterization and remediation of radiologically contaminated soils and surfaces; however, subsequent to the terrorist attacks of September 11, 2001, the applications of these systems have expanded to include homeland security operations for first response, continuing assessment, and verification of cleanup activities in the event of the detonation of a radiological dispersal device. The core system components are a detector, a spectral analyzer, and a global positioning system (GPS). The system is computer controlled by menu-driven, user-friendly custom software designed for a technician-level operator. A wide variety of detectors have been used, including several configurations of sodium iodide (NaI) and high-purity germanium (HPGe) detectors, and a large-area proportional counter designed for the detection of x-rays from actinides such as Am-241 and Pu-238. Systems have been deployed from several platforms, including a small all-terrain vehicle (ATV), hand-pushed carts, a backpack-mounted unit, and an excavator-mounted unit used where personnel safety considerations are paramount. The INL has advanced this concept and expanded the system functionality to create an integrated, field-deployed analytical system through the use of tailored analysis and operations software. Customized, site-specific software is assembled from a supporting toolbox of algorithms that streamline the data acquisition, analysis, and reporting process. These algorithms include region-specific spectral stripping, automated energy calibration, background subtraction, activity calculations based on measured detector efficiencies, and on-line data quality checks.
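    The counts-to-activity chain sketched in the abstract (background subtraction, then division by measured detector efficiency) reduces to a short calculation. The numbers below are illustrative, not from the INL systems.

```python
# A minimal sketch of the activity calculation described above; the efficiency
# and background values are invented for illustration.
def soil_activity_bq(gross_counts, background_counts, live_time_s, efficiency):
    """Net count rate divided by absolute detection efficiency -> activity (Bq)."""
    net_rate = (gross_counts - background_counts) / live_time_s
    return net_rate / efficiency

activity = soil_activity_bq(gross_counts=12500, background_counts=2500,
                            live_time_s=100.0, efficiency=0.05)
# net rate of 100 counts/s at 5% absolute efficiency -> 2000 Bq
```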

  4. Quantitative Assessment of a Framework for Creating Anatomical Brain Networks via Global Tractography

    PubMed Central

    Li, Longchuan; Rilling, James K.; Preuss, Todd M.; Glasser, Matthew F.; Damen, Frederick W.; Hu, Xiaoping

    2012-01-01

    Interregional connections of the brain measured with diffusion tractography can be used to infer valuable information regarding both brain structure and function. However, different tractography algorithms can generate networks that exhibit different characteristics, resulting in poor reproducibility across studies. Therefore, it is important to benchmark different tractography algorithms to quantitatively assess their performance. Here we systematically evaluated a newly introduced tracking algorithm, global tractography, to derive anatomical brain networks in a fiber phantom, 2 post-mortem macaque brains, and 20 living humans, and compared the results with an established local tracking algorithm. Our results demonstrated that global tractography accurately characterized the phantom network in terms of graph-theoretic measures and significantly outperformed the local tracking approach. Results in brain tissues (post-mortem macaques and in vivo humans), however, showed that although the performance of global tractography demonstrated a trend of improvement, its results did not differ greatly from those of local tractography, possibly because of the increased fiber complexity of real tissues. When using macaque tracer-derived connections as the ground truth, we found that both global and local algorithms generated non-random patterns of false negative and false positive connections that were probably related to specific fiber systems and largely independent of the tractography algorithm or tissue type (post-mortem vs. in vivo) used in the current study. Moreover, a close examination of the transcallosal motor connections reconstructed via either global or local tractography demonstrated that the lateral transcallosal fibers in humans and macaques did not exhibit the denser homotopic connections found in primate tracer studies, indicating the need for more robust brain mapping techniques based on diffusion MRI data. PMID:22484406
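    The false-positive/false-negative comparison against a tracer-based ground truth described above amounts to an edge-wise set comparison between adjacency matrices. This toy sketch uses tiny invented networks, not the study's connectomes.

```python
import numpy as np

def fp_fn_counts(est_adj, true_adj):
    """False-positive / false-negative edge counts vs. a ground-truth network."""
    est = np.asarray(est_adj, bool)
    true = np.asarray(true_adj, bool)
    iu = np.triu_indices_from(est, k=1)        # undirected: upper triangle only
    fp = int(np.sum(est[iu] & ~true[iu]))      # tracked edge absent in tracer data
    fn = int(np.sum(~est[iu] & true[iu]))      # tracer edge missed by tractography
    return fp, fn

true_net = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]])   # tracer ground truth
tracked  = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]])   # tractography result
fp, fn = fp_fn_counts(tracked, true_net)   # one spurious edge, one missed edge
```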

  5. Reliability of Quantitative Ultrasonic Assessment of Normal-Tissue Toxicity in Breast Cancer Radiotherapy

    SciTech Connect

    Yoshida, Emi J.; Chen Hao; Torres, Mylin; Andic, Fundagul; Liu Haoyang; Chen Zhengjia; Sun, Xiaoyan; Curran, Walter J.; Liu Tian

    2012-02-01

    Purpose: We have recently reported that ultrasound imaging, together with ultrasound tissue characterization (UTC), can provide quantitative assessment of radiation-induced normal-tissue toxicity. This study's purpose is to evaluate the reliability of our quantitative ultrasound technology in assessing acute and late normal-tissue toxicity in breast cancer radiotherapy. Method and Materials: Our ultrasound technique analyzes radiofrequency echo signals and provides quantitative measures of dermal, hypodermal, and glandular tissue toxicities. To facilitate easy clinical implementation, we further refined this technique by developing a semiautomatic ultrasound-based toxicity assessment tool (UBTAT). Seventy-two ultrasound studies of 26 patients (720 images) were analyzed. Images of 8 patients were evaluated for acute toxicity (<6 months postradiotherapy) and those of 18 patients were evaluated for late toxicity (≥6 months postradiotherapy). All patients were treated according to a standard radiotherapy protocol. To assess intraobserver reliability, one observer analyzed 720 images in UBTAT and then repeated the analysis 3 months later. To assess interobserver reliability, three observers (two radiation oncologists and one ultrasound expert) each analyzed 720 images in UBTAT. An intraclass correlation coefficient (ICC) was used to evaluate intra- and interobserver reliability. Ultrasound assessment and clinical evaluation were also compared. Results: Intraobserver ICC was 0.89 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.96 for glandular tissue toxicity. Interobserver ICC was 0.78 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.94 for glandular tissue toxicity. Statistical analysis found significant changes in dermal (p < 0.0001), hypodermal (p = 0.0027), and glandular tissue (p < 0.0001) assessments in the acute toxicity group. Ultrasound measurements correlated with clinical Radiation Therapy Oncology Group (RTOG) toxicity scores of patients.
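    The ICC reliability figures quoted above can be computed from repeated readings. The sketch below implements the one-way random-effects ICC(1,1), one common variant (the study may have used a different ICC form), on invented repeat toxicity scores.

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for repeated readings of the same targets.
    ratings: array of shape (n_targets, k_repeats)."""
    r = np.asarray(ratings, float)
    n, k = r.shape
    grand = r.mean()
    msb = k * np.sum((r.mean(axis=1) - grand) ** 2) / (n - 1)          # between targets
    msw = np.sum((r - r.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))  # within
    return (msb - msw) / (msb + (k - 1) * msw)

# hypothetical repeat toxicity scores from one observer: 4 images x 2 reads
scores = [[1.0, 1.1], [2.0, 1.9], [3.2, 3.0], [0.5, 0.6]]
icc = icc_oneway(scores)   # close to 1 when repeats agree well
```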

  6. Quantitative and qualitative assessment of the bovine abortion surveillance system in France.

    PubMed

    Bronner, Anne; Gay, Emilie; Fortané, Nicolas; Palussière, Mathilde; Hendrikx, Pascal; Hénaux, Viviane; Calavas, Didier

    2015-06-01

    Bovine abortion is the main clinical sign of bovine brucellosis, a disease of which France has been declared officially free since 2005. To ensure the early detection of any brucellosis outbreak, event-driven surveillance relies on the mandatory notification of bovine abortions and the brucellosis testing of aborting cows. However, under-reporting of abortions appears frequent. Our objectives were to assess the ability of the bovine abortion surveillance system to detect each and every bovine abortion and to identify factors influencing the system's effectiveness. We evaluated five attributes defined by the U.S. Centers for Disease Control, with a method suited to each attribute: (1) data quality was studied quantitatively and qualitatively, as this factor considerably influences data analysis and results; (2) sensitivity and representativeness were estimated using a unilist capture-recapture approach to quantify the surveillance system's effectiveness; (3) acceptability and simplicity were studied through qualitative interviews of actors in the field, given that the surveillance system relies heavily on abortion notifications by farmers and veterinarians. Our analysis showed that (1) data quality was generally satisfactory, even though some errors might be due to actors' lack of awareness of the need to collect accurate data; (2) from 2006 to 2011, the mean annual sensitivity - i.e. the proportion of farmers who reported at least one abortion out of all those who detected such events - was around 34%, but was significantly higher in dairy than in beef cattle herds (highlighting a lack of representativeness); (3) overall, the system's low sensitivity was related to its low acceptability and lack of simplicity. This study showed that, in contrast to policy-makers, most farmers and veterinarians perceived the risk of a brucellosis outbreak as negligible. They did not consider sporadic abortions as suspected cases of brucellosis and usually reported abortions only to
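    The unilist capture-recapture idea behind the sensitivity estimate can be sketched with Chao's single-list estimator, used here as a stand-in (the authors' exact model may differ): herds with one vs. two notifications let us estimate how many detecting herds never reported at all. The counts are invented.

```python
# Chao's unilist lower-bound estimator: N_hat = n + f1^2 / (2*f2), where f1/f2
# are units observed exactly once/twice. Sensitivity = observed / estimated total.
def chao_sensitivity(f1, f2, n_observed):
    """Estimated reporting sensitivity from a single notification list."""
    n_total = n_observed + f1 * f1 / (2.0 * f2)
    return n_observed / n_total

# hypothetical data: 500 reporting herds, 300 with one and 90 with two notifications
sens = chao_sensitivity(f1=300, f2=90, n_observed=500)
```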

  7. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-Departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for Human Health Risk Assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches for using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that evaluate, describe, and discuss in silico, in vitro, in vivo, and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the
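    The benchmark dose (BMD) idea central to the workshop can be sketched as fitting a dose-response curve and inverting it at a fixed benchmark response (BMR). A linear model and invented data are used purely for illustration; real BMD software (e.g. PROAST or EPA BMDS) fits richer model families and reports confidence limits.

```python
import numpy as np

# hypothetical continuous endpoint (e.g. mutation frequency) vs. dose
doses    = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
response = np.array([1.0, 1.3, 1.55, 2.1, 3.0])

b, a = np.polyfit(doses, response, 1)   # response ≈ a + b*dose
bmr = 0.10                              # benchmark response: 10% over background
bmd = bmr * a / b                       # solve a + b*BMD = a*(1 + bmr)
```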

  8. Productive Factors in School Learning: A Quantitative Synthesis of National Assessment Studies.

    ERIC Educational Resources Information Center

    Borger, Jeanne B.; Walberg, Herbert J.

    To integrate findings concerning the influence of productive factors on student achievement and attitudes across various disciplines and ages, nine regression studies of National Assessment of Educational Progress samples containing a total of 15,802 students were quantitatively synthesized. Correlations and standardized regression weights for…

  9. The value of quantitative patient preferences in regulatory benefit-risk assessment

    PubMed Central

    Egbrink, Mart oude; IJzerman, Maarten

    2014-01-01

    Quantitative patient preferences are a method to involve patients in regulatory benefit-risk assessment. Assuming preferences can be elicited, there might be multiple advantages to their use. Legal, methodological, and procedural issues do, however, imply that preferences are currently at most part of the solution for how best to involve patients in regulatory decision making. Progress has recently been made on these issues.

  10. Integrating a Qualitative and Quantitative Assessment of the Quality of Academic Life: Political and Logistical Issues.

    ERIC Educational Resources Information Center

    Marshall, Catherine; And Others

    1991-01-01

    Efforts to assess quality of academic life at Vanderbilt University (Tennessee) resulted in a plan to merge qualitative and quantitative measures and uncovered political, logistical, and fiscal issues in collection and use of the two kinds of data. Although qualitative databases are costly, they are also very useful in different ways. (Author/MSE)

  11. Quantitative Assessment of Disinfectant Activity Against Listeria monocytogenes Biofilms Under Flow Conditions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Listeria monocytogenes causes high mortality in humans and is responsible for most food recalls involving bacterial contamination. Our objective was to develop methods to quantitatively assess the pathogen under flow conditions that mimic wet food processing. A reactor was used to grow the bacteria on ...

  12. QUANTITATIVE ASSESSMENT OF CORAL DISEASES IN THE FLORIDA KEYS: STRATEGY AND METHODOLOGY

    EPA Science Inventory

    Most studies of coral disease have focused on the incidence of a single disease within a single location. Our overall objective is to use quantitative assessments to characterize annual patterns in the distribution and frequency of scleractinian and gorgonian coral diseases over ...

  13. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    Reports on the benefits of collaborative learning, although numerous, lack quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impact on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  14. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical…

  15. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  16. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  17. Penumbra Pattern Assessment in Acute Stroke Patients: Comparison of Quantitative and Non-Quantitative Methods in Whole Brain CT Perfusion

    PubMed Central

    Baumann, Alena B.; Meinel, Felix G.; Helck, Andreas D.; Opherk, Christian; Straube, Andreas; Reiser, Maximilian F.; Sommer, Wieland H.

    2014-01-01

    Background and Purpose: While penumbra assessment has become an important part of clinical decision making for acute stroke patients, there is a lack of studies measuring the reliability and reproducibility of defined assessment techniques in the clinical setting. Our aim was to determine the reliability and reproducibility of different types of three-dimensional penumbra assessment methods in stroke patients who underwent whole brain CT perfusion imaging (WB-CTP). Materials and Methods: We included 29 patients with a confirmed MCA infarction who underwent initial WB-CTP with a scan coverage of 100 mm in the z-axis. Two blinded and experienced readers assessed the flow-volume mismatch twice and in two quantitative ways: performing a volumetric mismatch analysis using OsiriX imaging software (MMVOL) and visual estimation of mismatch (MMEST). Complementarily, the semiquantitative Alberta Stroke Programme Early CT Score for CT perfusion was used to define mismatch (MMASPECTS). A favorable penumbral pattern was defined by a mismatch of ≥30% in combination with a cerebral blood flow deficit of ≤90 ml and an MMASPECTS score of ≥1, respectively. Inter- and intrareader agreement was determined by kappa values and ICCs. Results: Overall, MMVOL showed considerably higher inter-/intrareader agreement (ICCs: 0.751/0.843) compared with MMEST (0.292/0.749). In the subgroup of large (≥50 mL) perfusion deficits, inter- and intrareader agreement of MMVOL was excellent (ICCs: 0.961/0.942), while MMEST interreader agreement was poor (0.415) and intrareader agreement was good (0.919). With respect to penumbra classification, MMVOL showed the highest agreement (interreader agreement: 25 agreements/4 non-agreements/κ: 0.595; intrareader agreement: 27/2/0.833), followed by MMEST (22/7/0.471; 23/6/0.577) and MMASPECTS (18/11/0.133; 21/8/0.340). Conclusion: The evaluated approach of volumetric mismatch assessment is superior to purely visual and ASPECTS penumbra pattern assessment in WB-CTP.
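    The favorable-pattern rule stated in the abstract (mismatch ≥30% combined with a perfusion-deficit volume ≤90 ml) can be written directly. Reading the flow-volume mismatch as CBF lesion vs. CBV core volume is our assumption; the volumes below are invented.

```python
# Thresholds taken from the abstract; variable naming is our interpretation.
def favorable_penumbra(cbf_deficit_ml, cbv_core_ml):
    """True if mismatch >= 30% and the CBF lesion volume is <= 90 ml."""
    mismatch_pct = 100.0 * (cbf_deficit_ml - cbv_core_ml) / cbf_deficit_ml
    return mismatch_pct >= 30.0 and cbf_deficit_ml <= 90.0

ok  = favorable_penumbra(cbf_deficit_ml=60.0, cbv_core_ml=20.0)    # 66.7% mismatch
bad = favorable_penumbra(cbf_deficit_ml=120.0, cbv_core_ml=30.0)   # volume too large
```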

  18. Accurate assessment of whole-body retention for PRRT with (177)Lu using paired measurements with external detectors.

    PubMed

    Liu, Boxue; de Blois, Erik; Breeman, Wouter A P; Konijnenberg, Mark W; Wolterbeek, Hubert T; Bode, Peter

    2015-01-01

    The aim of this study was to assess the accuracy of whole-body measurements by comparison with the urine collection method in PRRT with (177)Lu, and furthermore to develop a more accurate method based on paired measurements. Excreted samples were collected at given intervals and their activities were measured with a dose calibrator. Traditionally, whole-body activities during subsequent measurements are normalized individually to the administered activity. To correct for the activity in the bladder during the baseline measurement before the first voiding, and for activity redistribution in the patient's body during subsequent measurements, a series of paired measurements before and after each voiding was carried out. Time-dependent detector responses were derived and time-activity retentions were then determined. Compared with the results of the urine collection, whole-body activities from traditional whole-body measurements were overestimated by ca. 14% at 1 h after administration and varied randomly from -29% to 49% at 24 h. Measurement uncertainties of whole-body activities (coverage factor k = 2) ranged from ±4% at 1 h to more than ±20% at 24 h for the urine collection method, versus ±7% for paired measurements. Whole-body activities at 1 h by paired measurements were validated against measurements of the collected first urine. The new paired-measurement method matches the accuracy of the urine collection method, and exceeds it at later time points, and can therefore replace the urine approach for assessing the activity remaining in the patient's body. PMID:25771370
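    The paired-measurement idea can be illustrated in a few lines: the count drop across a voiding, divided by the activity measured in that urine sample, calibrates a time-specific detector response, which then converts counts to retained activity. Variable names and values are ours, not the authors'.

```python
# Illustrative sketch of the paired-measurement calibration described above.
def retention_fraction(counts_before, counts_after, urine_activity_mbq,
                       administered_mbq):
    """Fraction of the administered activity still in the body after a voiding."""
    response = (counts_before - counts_after) / urine_activity_mbq  # counts per MBq
    whole_body_mbq = counts_after / response
    return whole_body_mbq / administered_mbq

frac = retention_fraction(counts_before=100000, counts_after=80000,
                          urine_activity_mbq=1000.0, administered_mbq=7400.0)
```

    Because the response is re-derived at each voiding, drifts in geometry or redistribution between sessions do not accumulate, which is the advantage claimed over single-normalization whole-body counting.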

  19. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161

  1. The application of intraoperative transit time flow measurement to accurately assess anastomotic quality in sequential vein grafting

    PubMed Central

    Yu, Yang; Zhang, Fan; Gao, Ming-Xin; Li, Hai-Tao; Li, Jing-Xing; Song, Wei; Huang, Xin-Sheng; Gu, Cheng-Xiong

    2013-01-01

    OBJECTIVES Intraoperative transit time flow measurement (TTFM) is widely used to assess anastomotic quality in coronary artery bypass grafting (CABG). However, in sequential vein grafting, the flow characteristics collected by the conventional TTFM method are usually associated with total graft flow and might not accurately indicate the quality of every distal anastomosis in a sequential graft. The purpose of our study was to examine a new TTFM method that could assess the quality of each distal anastomosis in a sequential graft more reliably than the conventional TTFM approach. METHODS Two TTFM methods were tested in 84 patients who underwent sequential saphenous off-pump CABG at Beijing An Zhen Hospital between April and August 2012. In the conventional TTFM method, normal blood flow in the sequential graft was maintained during the measurement, and the flow probe was placed a few centimetres above the anastomosis to be evaluated. In the new method, blood flow in the sequential graft was temporarily reduced during the measurement by placing an atraumatic bulldog clamp on the graft a few centimetres distal to the anastomosis to be evaluated, while the position of the flow probe remained the same as in the conventional method. This new TTFM method was named flow reduction TTFM. Graft flow parameters measured by both methods were compared. RESULTS Compared with conventional TTFM, flow reduction TTFM resulted in significantly lower mean graft blood flow (P < 0.05) and, conversely, a significantly higher pulsatility index (P < 0.05). Diastolic filling was not significantly different between the two methods and was >50% in both cases. Notably, flow reduction TTFM identified two defective middle distal anastomoses that conventional TTFM failed to detect. Graft flows near the defective distal anastomoses improved substantially after revision. CONCLUSIONS In this study, we found that temporary reduction of graft flow during TTFM seemed to
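    The TTFM summary metrics compared in this study follow standard definitions: mean flow, pulsatility index PI = (Qmax − Qmin)/Qmean, and the percentage of flow occurring in diastole. The flow waveform below is invented for illustration.

```python
import numpy as np

def ttfm_metrics(flow_ml_min, diastolic_mask):
    """Mean flow, pulsatility index, and diastolic filling % from a flow trace."""
    q = np.asarray(flow_ml_min, float)
    q_mean = q.mean()
    pi = (q.max() - q.min()) / q_mean                # pulsatility index
    df = 100.0 * q[diastolic_mask].sum() / q.sum()   # % of flow in diastole
    return q_mean, pi, df

flow = [20.0, 60.0, 40.0, 30.0]                      # toy sampled waveform
mask = np.array([False, False, True, True])          # samples falling in diastole
q_mean, pi, df = ttfm_metrics(flow, mask)
```

    Reducing mean flow while the waveform amplitude persists raises PI, which is why the flow-reduction technique makes a stenotic (high-resistance) anastomosis stand out.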

  2. Photon-tissue interaction model for quantitative assessment of biological tissues

    NASA Astrophysics Data System (ADS)

    Lee, Seung Yup; Lloyd, William R.; Wilson, Robert H.; Chandra, Malavika; McKenna, Barbara; Simeone, Diane; Scheiman, James; Mycek, Mary-Ann

    2014-02-01

    In this study, we describe a direct-fit photon-tissue interaction model to quantitatively analyze reflectance spectra of biological tissue samples. The model rapidly extracts biologically relevant parameters associated with tissue optical scattering and absorption. This model was employed to analyze reflectance spectra acquired from freshly excised human pancreatic pre-cancerous tissues (intraductal papillary mucinous neoplasm [IPMN], a common precursor lesion to pancreatic cancer). Compared to previously reported models, the direct-fit model improved fit accuracy and speed. Thus, these results suggest that such models could serve as real-time, quantitative tools to characterize biological tissues assessed with reflectance spectroscopy.
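    As a toy illustration of extracting a scattering-related parameter from a reflectance spectrum, the sketch below fits a power law R(λ) ≈ a·λ^(−b) in log-log space. The model form, wavelengths, and values are simplified stand-ins, not the paper's direct-fit model.

```python
import numpy as np

lam = np.linspace(450.0, 700.0, 26)              # wavelengths, nm
r_true = 2.0 * lam ** -1.2                       # synthetic scattering power law
reflectance = r_true * (1 + 0.01 * np.sin(lam))  # small deterministic perturbation

# linear fit in log-log space recovers the scattering power b
slope, intercept = np.polyfit(np.log(lam), np.log(reflectance), 1)
b = -slope                                       # should be close to 1.2
a = np.exp(intercept)
```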

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach was proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear medicine measure parameters for which these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We built upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed that ranks the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as activity concentration values from different organs of the same patient. The proposed technique was evaluated using rigorous numerical experiments and data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately by the proposed NGS technique even when the bounds on the distribution of true values were not precisely known, making it a reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
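    Once the NGS technique has estimated each method's linear relation to the latent true values, the figure of merit reduces to a one-liner: NSR = σ/|slope|, with lower values indicating better precision. The method names and fitted values below are invented for illustration.

```python
# Noise-to-slope ratio as a precision figure of merit (lower is better).
def nsr(noise_sd, slope):
    return noise_sd / abs(slope)

# hypothetical (noise_sd, slope) pairs estimated by an NGS-style fit
methods = {"method_A": nsr(0.12, 0.95), "method_B": nsr(0.30, 0.90)}
best = min(methods, key=methods.get)   # most precise method under NSR
```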

  5. Can Community Health Workers Report Accurately on Births and Deaths? Results of Field Assessments in Ethiopia, Malawi and Mali

    PubMed Central

    Silva, Romesh; Amouzou, Agbessi; Munos, Melinda; Marsh, Andrew; Hazel, Elizabeth; Victora, Cesar; Black, Robert; Bryce, Jennifer

    2016-01-01

    Introduction Most low-income countries lack complete and accurate vital registration systems. As a result, measures of under-five mortality rates rely mostly on household surveys. In collaboration with partners in Ethiopia, Ghana, Malawi, and Mali, we assessed the completeness and accuracy of reporting of births and deaths by community-based health workers, and the accuracy of annualized under-five mortality rate estimates derived from these data. Here we report on results from Ethiopia, Malawi and Mali. Method In all three countries, community health workers (CHWs) were trained, equipped and supported to report pregnancies, births and deaths within defined geographic areas over a period of at least fifteen months. In-country institutions collected these data every month. At each study site, we administered a full birth history (FBH) or full pregnancy history (FPH) to women of reproductive age via a census of households in Mali and via household surveys in Ethiopia and Malawi. Using these FBHs/FPHs as a validation data source, we assessed the completeness of the counts of births and deaths and the accuracy of under-five, infant, and neonatal mortality rates from the community-based method against the retrospective FBH/FPH for rolling twelve-month periods. For each method we calculated total cost, average annual cost per 1,000 population, and average cost per vital event reported. Results On average, CHWs submitted monthly vital event reports for over 95 percent of catchment areas in Ethiopia and Malawi, and for 100 percent of catchment areas in Mali. The completeness of vital events reporting by CHWs varied: we estimated that 30%-90% of annualized expected births (i.e. the number of births estimated using a FPH) were documented by CHWs and 22%-91% of annualized expected under-five deaths were documented by CHWs. Resulting annualized under-five mortality rates based on the CHW vital events reporting were, on average, under-estimated by 28% in Ethiopia, 32% in

  6. Sewage sludge toxicity assessment using earthworm Eisenia fetida: can biochemical and histopathological analysis provide fast and accurate insight?

    PubMed

    Babić, S; Barišić, J; Malev, O; Klobučar, G; Popović, N Topić; Strunjak-Perović, I; Krasnići, N; Čož-Rakovac, R; Klobučar, R Sauerborn

    2016-06-01

    Sewage sludge (SS) is a complex organic by-product of wastewater treatment plants. Deposition of large amounts of SS can increase the risk of soil contamination. Therefore, there is an increasing need for fast and accurate assessment of the toxic potential of SS. Toxic effects of SS were tested on the earthworm Eisenia fetida at the tissue, subcellular, and biochemical levels. Earthworms were exposed to depot sludge (DS) at concentration ratios of 30 or 70 %, and to active sludge (AS) undiluted or diluted 10- and 100-fold. Exposure to DS lasted 24/48 h (acute), 96 h (semi-acute) or 7/14/28 days (sub-chronic); exposure to AS lasted 48 h. Toxic effects were assessed by measuring multixenobiotic resistance mechanism (MXR) activity and lipid peroxidation levels, and by observing morphological alterations and behavioural changes. Biochemical markers confirmed the presence of MXR inhibitors in the tested AS and DS and highlighted the presence of SS-induced oxidative stress. MXR inhibition and thiobarbituric acid reactive substance (TBARS) concentrations in the whole earthworm body were higher after exposure to the lower DS concentration. Furthermore, histopathological changes revealed damage to the earthworm body wall tissue layers as well as to the epithelial and chloragogen cells in the typhlosole region. These changes were proportional to the SS concentration in the tested soils and to exposure duration. The results obtained may contribute to the understanding of SS-induced toxic effects on terrestrial invertebrates exposed through soil contact and help identify defence mechanisms of earthworms. PMID:26971513

  7. Assessing exposure to allied ground troops in the Vietnam War: a quantitative evaluation of the Stellman Exposure Opportunity Index model.

    PubMed

    Ginevan, Michael E; Watkins, Deborah K; Ross, John H; O'Boyle, Randy A

    2009-06-01

    The Exposure Opportunity Index (EOI) is a proximity-based model developed to estimate relative exposure of ground troops in Vietnam to aerially applied herbicides. We conducted a detailed quantitative evaluation of the EOI model by using actual herbicide spray missions isolated in time and space. EOI scores were calculated for each of 36 hypothetical receptor location points associated with each spray mission for 30 herbicide missions for two time periods - day of herbicide application and day 2-3 post-application. Our analysis found an enormous range of EOI predictions with 500-1000-fold differences across missions directly under the flight path. This quantitative examination of the EOI suggests that extensive testing of the model's code is warranted. Researchers undertaking development of a proximity-based exposure model for epidemiologic studies of either Vietnam veterans or the Vietnamese population should conduct a thorough and realistic analysis of how precise and accurate the model results are likely to be and then assess whether the model results provide a useful basis for their planned epidemiologic studies. PMID:19278712

  8. Quantitative assessment of hydrocarbon contamination in soil using reflectance spectroscopy: a "multipath" approach.

    PubMed

    Schwartz, Guy; Ben-Dor, Eyal; Eshel, Gil

    2013-11-01

    Petroleum hydrocarbons are contaminants of great significance. The commonly used analytic method for assessing total petroleum hydrocarbons (TPH) in soil samples is based on extraction with 1,1,2-trichlorotrifluoroethane (Freon 113), a substance whose use is prohibited by the Environmental Protection Agency. During the past 20 years, a new quantitative methodology that uses the reflected radiation of solids has been widely adopted. In this approach, the reflectance across the visible to near infrared-shortwave infrared region (400-2500 nm) is modeled against constituents determined by traditional analytic chemistry methods and then used to predict unknown samples. The technology is environmentally friendly and permits rapid, cost-effective measurement of large numbers of samples, dramatically reducing chemical analytical costs and secondary pollution and enabling a new dimension of environmental monitoring. In this study we adapted this approach and developed effective steps by which hydrocarbon contamination in soils can be determined rapidly, accurately and cost-effectively solely from reflectance spectroscopy. Artificially contaminated samples were analyzed chemically and spectrally to form a database of five soils contaminated with three types of petroleum hydrocarbons (PHCs), creating 15 datasets of 48 samples each at contamination levels of 50-5000 ppm (parts per million, by weight). A brute-force preprocessing approach was used, combining eight different preprocessing techniques with all possible datasets and resulting in 120 different mutations for each dataset; the brute-force search was run on an innovative computing system developed for this study. A new model performance scoring (MPS) parameter is proposed, based on a combination of several common statistical parameters. The effect of dividing the data into training, validation and test sets on modeling accuracy is also discussed. The results of this study clearly show

  9. Quantitative Risk Assessment of CO2 Sequestration in a Commercial-Scale EOR Site

    NASA Astrophysics Data System (ADS)

    Pan, F.; McPherson, B. J. O. L.; Dai, Z.; Jia, W.; Lee, S. Y.; Ampomah, W.; Viswanathan, H. S.

    2015-12-01

    Enhanced Oil Recovery with CO2 (CO2-EOR) is perhaps the most feasible option for geologic CO2 sequestration (GCS), if only due to existing infrastructure and the economic opportunities of associated oil production. Probably the most significant source of uncertainty in CO2 storage forecasts is heterogeneity of reservoir properties. Quantification of storage forecast uncertainty is critical for accurate assessment of the risks associated with GCS in EOR fields. This study employs a response surface methodology (RSM) to quantify uncertainties of CO2 storage associated with oil production in an active CO2-EOR field. Specifically, the Morrow formation, a clastic reservoir within the Farnsworth EOR Unit (FWU) in Texas, was selected as a case study. Four uncertain parameters (i.e., independent variables) are reservoir permeability, anisotropy ratio of permeability, water-alternating-gas (WAG) time ratio, and initial oil saturation. Cumulative oil production and net CO2 injection are the output dependent variables. A 3-D FWU reservoir model, including a representative 5-spot well pattern, was constructed for CO2-oil-water multiphase flow analysis. A total of 25 permutations of 3-D reservoir simulations were executed using the Eclipse simulator. After performing stepwise regression analysis, a series of response surface models of the output variables at each step were constructed and verified using appropriate goodness-of-fit measures. The R2 values are larger than 0.9 and the NRMSE values are less than 5% between the simulated and predicted oil production and net CO2 injection, suggesting that the response surface (or proxy) models are sufficient for predicting CO2-EOR system behavior for the FWU case. Given the range of uncertainties in the independent variables, the cumulative distribution functions (CDFs) of the dependent variables were estimated using the proxy models. The predicted cumulative oil production and net CO2 injection at the 95th percentile after 5 years are about 3.65 times, and 1
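    The goodness-of-fit screening described above (R2 above 0.9, NRMSE below 5%) can be sketched in a few lines; the function and the sample values below are illustrative, not the study's data:

```python
import numpy as np

def goodness_of_fit(simulated, predicted):
    """R^2 and range-normalized RMSE (%) between simulator output and proxy-model predictions."""
    simulated = np.asarray(simulated, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    ss_res = np.sum((simulated - predicted) ** 2)
    ss_tot = np.sum((simulated - simulated.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    # RMSE normalized by the range of the simulated values, expressed in percent
    nrmse = np.sqrt(np.mean((simulated - predicted) ** 2)) / (simulated.max() - simulated.min()) * 100
    return r2, nrmse
```

    A proxy model passing both screens (R2 > 0.9 and NRMSE < 5%) would be accepted for the uncertainty propagation step.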

  10. A quantitative microbial risk assessment for meatborne Toxoplasma gondii infection in The Netherlands.

    PubMed

    Opsteegh, Marieke; Prickaerts, Saskia; Frankena, Klaas; Evers, Eric G

    2011-11-01

    Toxoplasma gondii is an important foodborne pathogen, and the cause of a high disease burden due to congenital toxoplasmosis in The Netherlands. The aim of this study was to quantify the relative contribution of sheep, beef and pork products to human T. gondii infections by Quantitative Microbial Risk Assessment (QMRA). Bradyzoite concentration and portion size data were used to estimate the bradyzoite number in infected unprocessed portions for human consumption. The reduction factors for salting, freezing and heating, estimated from published experiments in mice, were subsequently used to estimate the bradyzoite number in processed portions. A dose-response relation for T. gondii infection in mice was used to estimate the human probability of infection due to consumption of these originally infected processed portions. By multiplying these probabilities with the prevalence of T. gondii per livestock species and the number of portions consumed per year, the number of infections per year was calculated for the susceptible Dutch population and the subpopulation of susceptible pregnant women. QMRA results predict high numbers of infections per year, with beef as the most important source. Although many uncertainties were present in the data and the number of congenital infections predicted by the model was almost twenty times higher than the number estimated based on the incidence in newborns, the usefulness of the advice to thoroughly heat meat is confirmed by our results. Forty percent of all predicted infections are due to the consumption of unheated meat products, and sensitivity analysis indicates that heating temperature has the strongest influence on the predicted number of infections. The results also demonstrate that, even with a low prevalence of infection in cattle, consumption of beef remains an important source of infection. Developing this QMRA model has helped identify important knowledge gaps and resulted in the following recommendations for
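    The final multiplication step described in the abstract can be sketched as follows; the function name and all numbers are hypothetical, for illustration only:

```python
def expected_infections(p_infection_per_portion, prevalence, portions_per_year):
    """Expected annual infections from one product type: per-portion infection
    probability x fraction of portions originating from infected animals x
    number of portions consumed per year by the susceptible population."""
    return p_infection_per_portion * prevalence * portions_per_year

# Illustrative only: 1-in-10,000 per-portion risk, 2% prevalence, 10 million portions/year
n_infections = expected_infections(1e-4, 0.02, 1e7)
```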

  11. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates. PMID:27357043
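    The distribution-fitting and bootstrap steps described above can be sketched as follows, assuming hypothetical survey responses and a simple method-of-moments gamma fit (the study selected the most adequate distribution among several candidates):

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical survey responses: reported refrigerator storage times (days)
storage_days = rng.gamma(shape=2.0, scale=1.5, size=500)

# Method-of-moments fit of a gamma distribution to the responses
m, v = storage_days.mean(), storage_days.var()
shape_hat, scale_hat = m * m / v, v / m

# Bootstrap resampling to describe uncertainty in the mean storage time
boot_means = np.array([rng.choice(storage_days, size=storage_days.size,
                                  replace=True).mean() for _ in range(1000)])
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
```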

  12. Study Protocol - Accurate assessment of kidney function in Indigenous Australians: aims and methods of the eGFR Study

    PubMed Central

    2010-01-01

    Background There is an overwhelming burden of cardiovascular disease, type 2 diabetes and chronic kidney disease among Indigenous Australians. In this high-risk population, it is vital that we are able to measure kidney function accurately. Glomerular filtration rate is the best overall marker of kidney function. However, differences in body build and body composition between Indigenous and non-Indigenous Australians suggest that creatinine-based estimates of glomerular filtration rate derived for European populations may not be appropriate for Indigenous Australians. The burden of kidney disease is borne disproportionately by Indigenous Australians in central and northern Australia, and there is significant heterogeneity in body build and composition within and amongst these groups. This heterogeneity might differentially affect the accuracy of estimation of glomerular filtration rate between different Indigenous groups. By assessing kidney function in Indigenous Australians from Northern Queensland, the Northern Territory and Western Australia, we aim to determine a validated and practical measure of glomerular filtration rate suitable for use in all Indigenous Australians. Methods/Design A cross-sectional study of Indigenous Australian adults (target n = 600, 50% male) across 4 sites: Top End, Northern Territory; Central Australia; Far North Queensland; and Western Australia. The reference measure of glomerular filtration rate was the plasma disappearance rate of iohexol over 4 hours. We will compare the accuracy of the following glomerular filtration rate measures with the reference measure: the Modification of Diet in Renal Disease 4-variable formula, the Chronic Kidney Disease Epidemiology Collaboration equation, the Cockcroft-Gault formula and cystatin C-derived estimates. Detailed assessment of body build and composition was performed using anthropometric measurements, skinfold thicknesses and bioelectrical impedance; a sub-study used dual-energy X-ray absorptiometry. A
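    Of the estimating equations listed above, the Cockcroft-Gault formula is simple enough to sketch directly (serum creatinine in mg/dL, weight in kg; the 0.85 factor applies to women):

```python
def cockcroft_gault(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault formula."""
    crcl = (140 - age_years) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl
```

    Note that creatinine clearance, being weight-dependent, is exactly the kind of estimate the study expects to be sensitive to differences in body build and composition.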

  13. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

    We propose an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines by the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin obtained stochastically by ISO 11843-7:2012 was within the 95% confidence interval of the RSD obtained statistically by repetitive measurements (n = 6). Thus, our findings show that the method is applicable for estimating the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the present assessment method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. By the present repeatability assessment method, reliable measurement RSDs were obtained stochastically, and the experimental time was remarkably reduced. PMID:26353956
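    For reference, the conventional repetitive-measurement RSD that the ISO 11843-7 approach is validated against is just the sample standard deviation over the mean of replicate injections; the peak areas below are hypothetical:

```python
import numpy as np

# Hypothetical peak areas from six replicate injections (n = 6)
peak_areas = np.array([10512.0, 10478.0, 10530.0, 10495.0, 10508.0, 10521.0])

# Repeatability expressed as relative standard deviation (RSD, %)
rsd_percent = peak_areas.std(ddof=1) / peak_areas.mean() * 100
```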

  14. Rapid quantitative assessment of visible injury to vegetation and visual amenity effects of fluoride air pollution.

    PubMed

    Doley, D

    2010-01-01

    Quantitative measures of visible injury are proposed for the protection of the aesthetic acceptability and health of ecosystems. Visible indications of air pollutant injury symptoms can be assessed rapidly and economically over large areas of mixed species such as native ecosystems. Reliable indication requires close attention to the criteria for assessment, species selection, and the influence of other environmental conditions on plant response to a pollutant. The estimation of fluoride-induced visible injury in dicotyledonous species may require techniques that are more varied than the measurement of necrosis in linear-leaved monocotyledons and conifers. A scheme is described for quantitative estimates of necrosis, chlorosis and deformation of leaves using an approximately geometric series of injury categories that permits rapid and sufficiently consistent determination and recognises degrees of aesthetic offence associated with foliar injury to plants. PMID:19067198

  15. Differential Label-free Quantitative Proteomic Analysis of Shewanella oneidensis Cultured under Aerobic and Suboxic Conditions by Accurate Mass and Time Tag Approach

    SciTech Connect

    Fang, Ruihua; Elias, Dwayne A.; Monroe, Matthew E.; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D.; Callister, Stephen J.; Moore, Ronald J.; Gorby, Yuri A.; Adkins, Joshua N.; Fredrickson, Jim K.; Lipton, Mary S.; Smith, Richard D.

    2006-04-01

    We describe the application of liquid chromatography coupled to mass spectrometry (LC/MS) without the use of stable isotope labeling for differential quantitative proteomic analysis of whole-cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and sub-oxic conditions. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) was used to initially identify peptide sequences, and LC coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR) was used to confirm these identifications and measure relative peptide abundances. In total, 2,343 peptides covering 668 proteins were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly according to statistical approaches such as SAM, while another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis was transitioned from aerobic to sub-oxic conditions.

  16. Quantitative assessment of radiation force effect at the dielectric air-liquid interface

    PubMed Central

    Capeloto, Otávio Augusto; Zanuto, Vitor Santaella; Malacarne, Luis Carlos; Baesso, Mauro Luciano; Lukasievicz, Gustavo Vinicius Bassi; Bialkowski, Stephen Edward; Astrath, Nelson Guilherme Castelli

    2016-01-01

    We induce nanometer-scale surface deformation by exploiting momentum conservation of the interaction between laser light and dielectric liquids. The effect of radiation force at the air-liquid interface is quantitatively assessed for fluids with different density, viscosity and surface tension. The imparted pressure on the liquids by continuous or pulsed laser light excitation is fully described by the Helmholtz electromagnetic force density. PMID:26856622

  17. The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses

    NASA Astrophysics Data System (ADS)

    Kirkman, Thomas W.; Jensen, Ellen

    2016-06-01

    The innumeracy of American students and adults is a much-lamented educational problem. The quantitative reasoning skills of college students may be particularly addressed and improved in "general education" science courses like Astro 101. Demonstrating improvement requires a standardized instrument. Among the non-proprietary instruments, the Quantitative Literacy and Reasoning Assessment (QLRA) [1] and the Quantitative Reasoning for College Science (QuaRCS) Assessment [2] stand out. Follette et al. developed the QuaRCS in the context of Astro 101 at the University of Arizona. We report on QuaRCS results in different contexts: pre-med physics and pre-nursing microbiology at a liberal arts college. We report on the mismatch between students' contemporaneous report of a question's difficulty and the actual probability of success. We report correlations between QuaRCS and other assessments of overall student performance in the class. We report differences in attitude towards mathematics in these two different but health-related student populations. [1] QLRA, Gaze et al., 2014, DOI: http://dx.doi.org/10.5038/1936-4660.7.2.4 [2] QuaRCS, Follette et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2

  18. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; and pollen and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:27197566
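    The proposed adjustment, scaling the exposure term by a crop attractiveness factor (CAF), can be sketched as follows; the function name, the CAF definition (nectar sugar concentration relative to a reference attractive crop) and all numbers are hypothetical illustrations, not the authors' parameterization:

```python
def tier1_risk_quotient(exposure_ug_per_bee, ld50_ug_per_bee,
                        crop_sugar_pct, reference_sugar_pct):
    """Tier I risk quotient with a hypothetical crop attractiveness factor (CAF)
    scaling exposure by nectar sugar concentration relative to a reference crop."""
    caf = crop_sugar_pct / reference_sugar_pct
    return caf * exposure_ug_per_bee / ld50_ug_per_bee
```

    A crop with nectar half as sugar-rich as the reference would thus have its risk quotient halved relative to the unadjusted calculation.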

  19. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katie

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can use our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example, one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
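    A minimal sketch of the Monte Carlo propagation described here, with entirely hypothetical input distributions linking an uncertain gas resource volume to an uncertain habitat impact:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Probabilistic inputs (hypothetical): undiscovered gas volume and the habitat
# disturbed per unit of gas developed, both uncertain
gas_tcf = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)        # trillion cubic feet
acres_per_tcf = rng.triangular(left=500, mode=1000, right=2000, size=n)

# Monte Carlo propagation: the impact distribution and summary percentiles
habitat_acres = gas_tcf * acres_per_tcf
p5, p50, p95 = np.percentile(habitat_acres, [5, 50, 95])
```

    Reporting the 5th, 50th and 95th percentiles of the output, rather than a single number, is how the probabilistic framing conveys the uncertainty inherent in the inputs.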

  20. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependent failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility of, and finally assess the expected consequences of all the possible scenarios that may follow seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields, with limited effort, a preliminary quantitative assessment of the contribution of scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case studies showed that scenarios initiated by seismic events may have a significant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units. PMID:17276591
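    Fragility curves of the kind mentioned above are commonly modeled as lognormal functions of a ground-motion intensity measure; a minimal sketch with purely illustrative parameters (median capacity and dispersion are equipment-specific in practice):

```python
import math

def fragility(pga_g, median_g=0.6, beta=0.5):
    """Lognormal fragility curve: probability of damage to an equipment item
    as a function of peak ground acceleration (illustrative parameters)."""
    z = math.log(pga_g / median_g) / beta
    # Standard normal CDF evaluated via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

    By construction the damage probability is 0.5 at the median capacity and increases monotonically with shaking intensity.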

  1. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020

  2. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit.

    PubMed

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson's disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020

  3. Three-Dimensional Quantitative Assessment of Uterine Fibroid Response after Uterine Artery Embolization Using Contrast-Enhanced MR Imaging

    PubMed Central

    Chapiro, Julius; Duran, Rafael; Lin, MingDe; Werner, John D.; Wang, Zhijun; Schernthaner, Rüdiger; Savic, Lynn Jeanette; Lessne, Mark L.; Geschwind, Jean-François; Hong, Kelvin

    2015-01-01

    Purpose To evaluate the clinical feasibility and diagnostic accuracy of three-dimensional (3D) quantitative magnetic resonance (MR) imaging for the assessment of total lesion volume (TLV) and enhancing lesion volume (ELV) before and after uterine artery embolization (UAE). Materials and Methods This retrospective study included 25 patients with uterine fibroids who underwent UAE and received contrast-enhanced MR imaging before and after the procedure. TLV was calculated using a semiautomated 3D segmentation of the dominant lesion on contrast-enhanced MR imaging, and ELV was defined as voxels within TLV where the enhancement exceeded the value of a region of interest placed in hypoenhancing soft tissue (left psoas muscle). ELV was expressed in relative (% of TLV) and absolute (in cm3) metrics. Results were compared with manual measurements and correlated with symptomatic outcome using a linear regression model. Results Although 3D quantitative measurements of TLV demonstrated a strong correlation with the manual technique (R2 = 0.93), measurements of ELV after UAE showed significant disagreement between techniques (R2 = 0.72; residual standard error, 15.8). Six patients (24%) remained symptomatic and were classified as nonresponders. When stratified according to response, no difference in % ELV between responders and nonresponders was observed. When assessed using cm3 ELV, responders showed a significantly lower mean ELV compared with nonresponders (4.1 cm3 [range, 0.3–19.8 cm3] vs 77 cm3 [range, 11.91–296 cm3]; P < .01). Conclusions The use of segmentation-based 3D quantification of lesion enhancement is feasible and diagnostically accurate and could be considered as an MR imaging response marker for clinical outcome after UAE. PMID:25638750
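    The enhancing-lesion-volume definition above (voxels inside the lesion whose enhancement exceeds a reference ROI value) maps directly onto a voxel mask; a minimal sketch with a hypothetical function name and toy values:

```python
import numpy as np

def enhancing_lesion_volume(lesion_voxels, psoas_roi_mean, voxel_volume_cm3):
    """Enhancing lesion volume (ELV): voxels inside the segmented lesion whose
    enhancement exceeds the mean of a hypo-enhancing reference ROI (psoas).
    Returns ELV in absolute (cm3) and relative (% of TLV) terms."""
    enhancing = lesion_voxels > psoas_roi_mean
    abs_elv_cm3 = enhancing.sum() * voxel_volume_cm3
    rel_elv_pct = 100.0 * enhancing.mean()
    return abs_elv_cm3, rel_elv_pct
```

    The study's finding that absolute (cm3) ELV separated responders from nonresponders while relative (% of TLV) ELV did not is a reminder that the two outputs carry different information.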

  4. SWIFT-MRI imaging and quantitative assessment of IONPs in murine tumors following intra-tumor and systemic delivery

    NASA Astrophysics Data System (ADS)

    Reeves, Russell; Petryk, Alicia A.; Kastner, Elliot J.; Zhang, Jinjin; Ring, Hattie; Garwood, Michael; Hoopes, P. Jack

    2015-03-01

    Although preliminary clinical trials are ongoing, the successful use of iron-oxide magnetic nanoparticles (IONPs) for heat-based cancer treatments will depend on advancements in: 1) nanoparticle platforms, 2) delivery of a safe and effective alternating magnetic field (AMF) to the tumor, and 3) development of non-invasive, spatially accurate IONP imaging and quantification techniques. Such a technique must be able to assess tumor and normal tissue anatomy as well as IONP levels and biodistribution. Conventional CT imaging is capable of detecting and quantifying IONPs at tissue levels above 10 mg/gram; unfortunately, this level is not clinically achievable in most situations. Conventional MRI is capable of imaging IONPs at tissue levels of 0.05 mg/gm or less; however, this level is considered to be below the therapeutic threshold. We present here preliminary in vivo data demonstrating the ability of a novel MRI technique, Sweep Imaging with Fourier Transformation (SWIFT), to accurately image and quantify IONPs in tumor tissue in the therapeutic concentration range (0.1-1.0 mg/gm tissue). This ultra-short-T2 MRI method provides positive Fe contrast enhancement with a reduced signal-to-noise ratio. Additional IONP signal enhancement techniques such as inversion recovery spectroscopy and variable flip angle (VFA) are also being studied for potential optimization of SWIFT IONP imaging. Our study demonstrates the use of SWIFT to assess IONP levels and biodistribution in murine flank tumors following intra-tumoral and systemic IONP administration. ICP-MS and quantitative histological techniques are used to validate the accuracy and sensitivity of SWIFT-based IONP imaging and quantification.

  5. Quantitative assessment of historical coastal landfill contamination using in-situ field portable XRF (FPXRF)

    NASA Astrophysics Data System (ADS)

    O'Shea, Francis; Spencer, Kate; Brasington, James

    2014-05-01

    in the field to determine the presence, location and extent of the sub-surface contaminant plume. Although XRF analysis has gained acceptance in the study of in-situ metal contamination (Kalnicky and Singhvi 2001; Martin Peinado et al. 2010), field moisture content and sample heterogeneity can suppress X-ray signals. Therefore, sediment samples were also collected, returned to the laboratory and analysed by ICP-OES for comparison. Both wet and dry certified reference materials were also analysed in the laboratory using XRF and ICP-OES to observe the impact of moisture content and to produce a correction factor allowing quantitative data to be collected in the field. In-situ raw XRF data identified the location of contamination plumes in the field in agreement with ICP data, although the data were systematically suppressed compared to ICP data, under-estimating the levels of contamination. Applying a correction factor for moisture content provided accurate measurements of concentration. The use of field portable XRF with the application of a moisture content correction factor enables the rapid screening of sediment fronting coastal landfill sites, goes some way towards providing a national baseline dataset and can contribute to the development of risk assessments.
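    A minimal sketch of the moisture-correction idea, assuming (for illustration only) that signal suppression scales with the wet-mass water fraction; the study derived its actual correction factor empirically from wet and dry certified reference materials:

```python
def correct_fpxrf(raw_ppm, moisture_fraction):
    """Correct a raw in-situ FPXRF reading for moisture suppression, under the
    simplifying assumption that pore water dilutes the signal in proportion to
    the wet-mass water content (hypothetical stand-in for an empirical factor)."""
    return raw_ppm / (1.0 - moisture_fraction)
```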

  6. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI

    PubMed Central

    Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2015-01-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2–8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253

  7. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI.

    PubMed

    Klohs, Jan; Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2016-09-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2-8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253

  8. Validation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in strawberry fruits using different cultivars and osmotic stresses.

    PubMed

    Galli, Vanessa; Borowski, Joyce Moura; Perin, Ellen Cristina; Messias, Rafael da Silva; Labonde, Julia; Pereira, Ivan dos Santos; Silva, Sérgio Delmar Dos Anjos; Rombaldi, Cesar Valmor

    2015-01-10

    The increasing demand for strawberry (Fragaria×ananassa Duch) fruits is associated mainly with their sensorial characteristics and content of antioxidant compounds. Nevertheless, strawberry production has been hampered by the crop's sensitivity to abiotic stresses. Understanding the molecular mechanisms underlying the stress response is therefore of great importance to enable genetic engineering approaches aimed at improving strawberry tolerance. However, studying gene expression in strawberry requires the use of suitable reference genes. In the present study, seven traditional and novel candidate reference genes were evaluated for transcript normalization in fruits of ten strawberry cultivars and under two abiotic stresses, using RefFinder, which integrates the four major currently available software programs: geNorm, NormFinder, BestKeeper and the comparative delta-Ct method. The results indicate that expression stability depends on the experimental conditions. The candidate reference gene DBP (DNA binding protein) was considered the most suitable for normalizing expression data across strawberry cultivars and under drought stress, and the candidate reference gene HISTH4 (histone H4) was the most stable under osmotic and salt stresses. The traditional genes GAPDH (glyceraldehyde-3-phosphate dehydrogenase) and 18S (18S ribosomal RNA) were the most unstable genes in all conditions. The expression of the phenylalanine ammonia lyase (PAL) and 9-cis-epoxycarotenoid dioxygenase (NCED1) genes was used to further confirm the validated candidate reference genes, showing that use of an inappropriate reference gene may produce erroneous results. This study is the first survey of reference gene stability across strawberry cultivars and osmotic stresses and provides guidelines for obtaining more accurate RT-qPCR results in future breeding efforts. PMID:25445290
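
    One of the four algorithms RefFinder integrates, the comparative delta-Ct method, can be sketched directly: each candidate gene is ranked by the mean standard deviation of its ΔCt against every other candidate across the same samples (lower = more stable). The Ct values below are hypothetical:

```python
import numpy as np

def delta_ct_stability(ct):
    """Comparative delta-Ct gene stability (lower = more stable).

    ct -- dict mapping gene name to a list of Ct values measured on the
          same samples. A gene's stability is the mean standard deviation
          of its delta-Ct against every other candidate gene.
    """
    genes = list(ct)
    stability = {}
    for g in genes:
        sds = [np.std(np.asarray(ct[g], dtype=float) -
                      np.asarray(ct[h], dtype=float), ddof=1)
               for h in genes if h != g]
        stability[g] = float(np.mean(sds))
    return stability

# hypothetical Ct values across three samples: A and B co-vary tightly,
# C fluctuates independently, so C should rank least stable
ranks = delta_ct_stability({"A": [20, 21, 22],
                            "B": [25, 26, 27],
                            "C": [30, 30, 34]})
```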

  9. Combining quantitative and qualitative measures of uncertainty in model-based environmental assessment: the NUSAP system.

    PubMed

    van der Sluijs, Jeroen P; Craye, Matthieu; Funtowicz, Silvio; Kloprogge, Penny; Ravetz, Jerry; Risbey, James

    2005-04-01

    This article discusses recent experiences with the Numeral Unit Spread Assessment Pedigree (NUSAP) system for multidimensional uncertainty assessment, based on four case studies that vary in complexity. We show that the NUSAP method is applicable not only to relatively simple calculation schemes but also to complex models in a meaningful way and that NUSAP is useful to assess not only parameter uncertainty but also (model) assumptions. A diagnostic diagram can be used to synthesize results of quantitative analysis of parameter sensitivity and qualitative review (pedigree analysis) of parameter strength. It provides an analytic tool to prioritize uncertainties according to quantitative and qualitative insights in the limitations of available knowledge. We show that extension of the pedigree scheme to include societal dimensions of uncertainty, such as problem framing and value-laden assumptions, further promotes reflexivity and collective learning. When used in a deliberative setting, NUSAP pedigree assessment has the potential to foster a deeper social debate and a negotiated management of complex environmental problems. PMID:15876219

  10. Fibrosis assessment: impact on current management of chronic liver disease and application of quantitative invasive tools.

    PubMed

    Wang, Yan; Hou, Jin-Lin

    2016-05-01

    Fibrosis, a common pathogenic pathway of chronic liver disease (CLD), has long been indicated to be significantly and most importantly associated with severe prognosis. Nowadays, with remarkable advances in understanding and/or treatment of major CLDs such as hepatitis C, B, and nonalcoholic fatty liver disease, there is an unprecedented requirement for the diagnosis and assessment of liver fibrosis or cirrhosis in various clinical settings. Among the available approaches, liver biopsy remains the one which possibly provides the most direct and reliable information regarding fibrosis patterns and changes in the parenchyma at different clinical stages and with different etiologies. Thus, many endeavors have been undertaken for developing methodologies based on the strategy of quantitation for the invasive assessment. Here, we analyze the impact of fibrosis assessment on the CLD patient care based on the data of recent clinical studies. We discuss and update the current invasive tools regarding their technological features and potentials for the particular clinical applications. Furthermore, we propose the potential resolutions with application of quantitative invasive tools for some major issues in fibrosis assessment, which appear to be obstacles against the nowadays rapid progress in CLD medicine. PMID:26742762

  11. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-01-01

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have the following two drawbacks: (1) they are susceptible to subjective factors; (2) they only have several rating levels and are influenced by a ceiling effect, making it impossible to exactly detect any further improvement in the movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, large amounts of data must be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess the upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. These results indicate that the proposed system not only reduces the amount of data during the sampling and transmission processes, but also that the reconstructed accelerometer signals can be used for quantitative assessment without any loss of useful information. PMID:26861337
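
    The compressed sensing pipeline can be illustrated end-to-end: transmit m random projections of an n-sample frame (here m/n = 1/3, matching the compression reported above) and reconstruct with a sparse solver. Orthogonal matching pursuit is used below purely as an illustration, the signal is synthetic rather than a real accelerometer trace, and the paper's actual sparse basis and solver are not specified here.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = A @ x."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # pick the column most correlated with the current residual
        idx = int(np.argmax(np.abs(A.T @ residual)))
        support.append(idx)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
n, m, k = 120, 40, 4                           # compressed length m = n/3
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = A @ x_true                                 # the m values actually sent
x_hat = omp(A, y, k)                           # reconstruction at the receiver
```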

  12. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients

    PubMed Central

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-01-01

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have the following two drawbacks: (1) they are susceptible to subjective factors; (2) they only have several rating levels and are influenced by a ceiling effect, making it impossible to exactly detect any further improvement in the movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, large amounts of data must be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess the upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. These results indicate that the proposed system not only reduces the amount of data during the sampling and transmission processes, but also that the reconstructed accelerometer signals can be used for quantitative assessment without any loss of useful information. PMID:26861337

  13. A quantitative assessment of alkaptonuria: testing the reliability of two disease severity scoring systems.

    PubMed

    Cox, Trevor F; Ranganath, Lakshminarayan

    2011-12-01

    Alkaptonuria (AKU) is caused by excessive accumulation of homogentisic acid (HGA) in body fluids, owing to deficiency of the enzyme homogentisate dioxygenase; conversion of HGA to a polymeric melanin-like pigment, a process known as ochronosis, leads in turn to varied clinical manifestations. A potential treatment, the drug nitisinone, which decreases formation of HGA, is available. However, successful demonstration of its efficacy in modifying the natural history of AKU requires an effective quantitative assessment tool. We describe two potential tools for quantitating disease burden in AKU. The first scores clinical features, combining clinical assessments, investigations and questionnaires in 15 patients with AKU. The second is a scoring system based only on items obtained from questionnaires, used in 44 people with AKU. Statistical analyses were carried out on the two patient datasets to assess the AKU tools; these included calculation of Cronbach's alpha, multidimensional scaling and simple linear regression analysis. The conclusion was that there was good evidence that the tools could be adopted as AKU assessment tools, but perhaps with further refinement before being used in the practical setting of a clinical trial. PMID:21744089
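
    Cronbach's alpha, one of the statistics applied above, has a standard closed form: alpha = k/(k−1) · (1 − Σ item variances / variance of the total score). A minimal sketch with hypothetical questionnaire scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for internal consistency of a multi-item scale.

    items -- 2-D array-like, rows = respondents, columns = scale items.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# hypothetical scores: two perfectly consistent items give alpha = 1
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```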

  14. Quantitative MRI assessments of white matter in children treated for acute lymphoblastic leukemia

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Helton, Kathleen J.; Li, Chin-Shang; Pui, Ching-Hon

    2005-04-01

    The purpose of this study was to use objective quantitative MR imaging methods to prospectively assess changes in the physiological structure of white matter during the temporal evolution of leukoencephalopathy (LE) in children treated for acute lymphoblastic leukemia. The longitudinal incidence, extent (proportion of white matter affected), and intensity (elevation of T1 and T2 relaxation rates) of LE was evaluated for 44 children. A combined imaging set consisting of T1, T2, PD, and FLAIR MR images and white matter, gray matter and CSF a priori maps from a spatially normalized atlas were analyzed with a neural network segmentation based on a Kohonen Self-Organizing Map (SOM). Quantitative T1 and T2 relaxation maps were generated using a nonlinear parametric optimization procedure to fit the corresponding multi-exponential models. A Cox proportional regression was performed to estimate the effect of intravenous methotrexate (IV-MTX) exposure on the development of LE followed by a generalized linear model to predict the probability of LE in new patients. Additional T-tests of independent samples were performed to assess differences in quantitative measures of extent and intensity at four different points in therapy. Higher doses and more courses of IV-MTX placed patients at a higher risk of developing LE and were associated with more intense changes affecting more of the white matter volume; many of the changes resolved after completion of therapy. The impact of these changes on neurocognitive functioning and quality of life in survivors remains to be determined.

  15. The Utility of Maze Accurate Response Rate in Assessing Reading Comprehension in Upper Elementary and Middle School Students

    ERIC Educational Resources Information Center

    McCane-Bowling, Sara J.; Strait, Andrea D.; Guess, Pamela E.; Wiedo, Jennifer R.; Muncie, Eric

    2014-01-01

    This study examined the predictive utility of five formative reading measures: words correct per minute, number of comprehension questions correct, reading comprehension rate, number of maze correct responses, and maze accurate response rate (MARR). Broad Reading cluster scores obtained via the Woodcock-Johnson III (WJ III) Tests of Achievement…

  16. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204
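
    The RECIST 1.1 target-lesion logic that ePAD supports can be sketched from the published thresholds: a ≥30 % decrease in the diameter sum from baseline is a partial response, and a ≥20 % and ≥5 mm increase from the nadir is progression. This simplification ignores non-target and new lesions, which a full RECIST assessment also requires.

```python
def recist_category(baseline_sum_mm, current_sum_mm, nadir_sum_mm):
    """Simplified RECIST 1.1 response from target-lesion diameter sums."""
    if current_sum_mm == 0:
        return "CR"                      # complete response
    if (baseline_sum_mm - current_sum_mm) / baseline_sum_mm >= 0.30:
        return "PR"                      # partial response
    growth = current_sum_mm - nadir_sum_mm
    if nadir_sum_mm > 0 and growth / nadir_sum_mm >= 0.20 and growth >= 5:
        return "PD"                      # progressive disease
    return "SD"                          # stable disease
```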

  17. Quantitative assessment of colon distention for polyp detection in CT virtual colonoscopy

    NASA Astrophysics Data System (ADS)

    Van Uitert, Robert; Bitter, Ingmar; Summers, Ronald M.; Choi, J. Richard; Pickhardt, Perry J.

    2006-03-01

    Virtual colonoscopy is becoming a more prevalent way to diagnose colon cancer. One of the critical elements in detecting cancerous polyps using virtual colonoscopy, especially in conjunction with computer-aided detection of polyps, is that the colon be sufficiently distended. We have developed an automatic method to determine from a CT scan what percentage of the colon is distended by 1 cm or larger and compared our method with a radiologist's assessment of quality of the scan with respect to successful colon polyp detection. A radiologist grouped 41 CT virtual colonoscopy scans into three groups according to the degree of colonic distention, "well", "medium", and "poor". We also employed a subvoxel-accurate centerline algorithm and a subvoxel-accurate distance transform to each dataset to measure the colon distention along the centerline. To summarize the colonic distention with a single value relevant for polyp detection, the distention score, we recorded the percentage of centerline positions in which the colon distention was 1 cm or larger. We then compared the radiologist's assessment and the computed results. The sorting of all datasets according to the distention score agreed with the radiologist's assessment. The "poor" cases had a mean and standard deviation score of 78.4% +/- 5.2%, the "medium" cases measured 88.7% +/- 1.9%, and the "well" cases 98.8% +/- 1.5%. All categories were shown to be significantly different from each other using unpaired two sample t-tests. The presented colonic distention score is an accurate method for assessing the quality of colonic distention for CT colonography.
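
    The distention score reduces to a thresholded fraction along the centerline. The per-position distention values below are hypothetical; in the study they come from a subvoxel-accurate distance transform.

```python
import numpy as np

def distention_score(distention_mm, threshold_mm=10.0):
    """Percentage of centerline positions distended by >= 1 cm.

    distention_mm -- colon distention at each centerline position
                     (hypothetical values here)
    """
    d = np.asarray(distention_mm, dtype=float)
    return 100.0 * float(np.mean(d >= threshold_mm))

# three of four positions meet the 1 cm criterion -> score of 75 %
score = distention_score([12, 8, 15, 10])
```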

  18. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
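
    The multinomial implementation of the multiple-strain approach can be sketched in a few lines of Monte Carlo. The strain prevalences and the assumption that only one strain carries the enterotoxin A gene are hypothetical illustration values, not figures from the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical prevalences of three Staphylococcus aureus strain types;
# suppose only the first carries the enterotoxin A (sea) gene
strain_probs = [0.15, 0.55, 0.30]
toxigenic = np.array([True, False, False])

def toxigenic_fraction(n_cells):
    """Partition a contaminating population among strains via one
    multinomial draw and return the enterotoxigenic fraction."""
    counts = rng.multinomial(n_cells, strain_probs)
    return counts[toxigenic].sum() / n_cells

# Monte Carlo over contamination events of 10,000 cells each
fractions = [toxigenic_fraction(10_000) for _ in range(1_000)]
mean_fraction = sum(fractions) / len(fractions)
```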

  19. A Quantitative Climate-Match Score for Risk-Assessment Screening of Reptile and Amphibian Introductions

    NASA Astrophysics Data System (ADS)

    van Wilgen, Nicola J.; Roura-Pascual, Núria; Richardson, David M.

    2009-09-01

    Assessing climatic suitability provides a good preliminary estimate of the invasive potential of a species to inform risk assessment. We examined two approaches for bioclimatic modeling for 67 reptile and amphibian species introduced to California and Florida. First, we modeled the worldwide distribution of the biomes found in the introduced range to highlight similar areas worldwide from which invaders might arise. Second, we modeled potentially suitable environments for species based on climatic factors in their native ranges, using three sources of distribution data. Performance of the three datasets and both approaches were compared for each species. Climate match was positively correlated with species establishment success (maximum predicted suitability in the introduced range was more strongly correlated with establishment success than mean suitability). Data assembled from the Global Amphibian Assessment through NatureServe provided the most accurate models for amphibians, while ecoregion data compiled by the World Wide Fund for Nature yielded models which described reptile climatic suitability better than available point-locality data. We present three methods of assigning a climate-match score for use in risk assessment using both the mean and maximum climatic suitabilities. Managers may choose to use different methods depending on the stringency of the assessment and the available data, facilitating higher resolution and accuracy for herpetofaunal risk assessment. Climate-matching has inherent limitations and other factors pertaining to ecological interactions and life-history traits must also be considered for thorough risk assessment.
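
    The two summary statistics compared above reduce to the maximum and mean of predicted per-cell suitability over the introduced range; the suitability values below are hypothetical.

```python
def climate_match_scores(suitability):
    """Summarize modeled climatic suitability for a candidate species.

    suitability -- iterable of predicted suitabilities in [0, 1] over the
                   region of interest (hypothetical values here)
    Returns (max, mean); the study found maximum suitability correlated
    more strongly with establishment success than the mean.
    """
    vals = [float(v) for v in suitability]
    return max(vals), sum(vals) / len(vals)

s_max, s_mean = climate_match_scores([0.2, 0.9, 0.4])
```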

  20. Experimental assessment of bone mineral density using quantitative computed tomography in holstein dairy cows.

    PubMed

    Maetani, Ayami; Itoh, Megumi; Nishihara, Kahori; Aoki, Takahiro; Ohtani, Masayuki; Shibano, Kenichi; Kayano, Mitsunori; Yamada, Kazutaka

    2016-08-01

    The aim of this study was to assess the measurement of bone mineral density (BMD) by quantitative computed tomography (QCT), comparing the relationships of BMD between QCT and dual-energy X-ray absorptiometry (DXA) and between QCT and radiographic absorptiometry (RA) in the metacarpal bone of Holstein dairy cows (n=27). A significant positive correlation was found between QCT and DXA measurements (r=0.70, P<0.01), and a significant correlation was found between QCT and RA measurements (r=0.50, P<0.01). We conclude that QCT provides quantitative evaluation of BMD in dairy cows, because BMD measured by QCT showed positive correlations with BMD measured by the two conventional methods: DXA and RA. PMID:27075115

  1. A quantitative collagen fibers orientation assessment using birefringence measurements: calibration and application to human osteons.

    PubMed

    Spiesz, Ewa M; Kaminsky, Werner; Zysset, Philippe K

    2011-12-01

    Even though mechanical properties depend strongly on the arrangement of collagen fibers in mineralized tissues, this arrangement is not yet well resolved. Only a few semi-quantitative evaluations of the fiber arrangement in bone, such as spectroscopic techniques or circularly polarized light microscopy, are available. In this study, the out-of-plane collagen arrangement angle was calibrated to the linear birefringence of a longitudinally fibered mineralized turkey leg tendon cut at a variety of angles to the main axis. The calibration curve was applied to human cortical bone osteons to quantify the out-of-plane arrangement of collagen fibers. The proposed calibration curve is normalized to sample thickness and to the wavelength of the probing light to enable a universally applicable quantitative assessment. This approach may improve our understanding of the fibrillar structure of bone and its implications for mechanical properties. PMID:21970947

  2. Experimental assessment of bone mineral density using quantitative computed tomography in holstein dairy cows

    PubMed Central

    MAETANI, Ayami; ITOH, Megumi; NISHIHARA, Kahori; AOKI, Takahiro; OHTANI, Masayuki; SHIBANO, Kenichi; KAYANO, Mitsunori; YAMADA, Kazutaka

    2016-01-01

    The aim of this study was to assess the measurement of bone mineral density (BMD) by quantitative computed tomography (QCT), comparing the relationships of BMD between QCT and dual-energy X-ray absorptiometry (DXA) and between QCT and radiographic absorptiometry (RA) in the metacarpal bone of Holstein dairy cows (n=27). A significant positive correlation was found between QCT and DXA measurements (r=0.70, P<0.01), and a significant correlation was found between QCT and RA measurements (r=0.50, P<0.01). We conclude that QCT provides quantitative evaluation of BMD in dairy cows, because BMD measured by QCT showed positive correlations with BMD measured by the two conventional methods: DXA and RA. PMID:27075115

  3. Quantitative Assessment of Protein Interaction with Methyl-Lysine Analogues by Hybrid Computational and Experimental Approaches

    PubMed Central

    2011-01-01

    In cases where binding ligands of proteins are not easily available, structural analogues are often used. For example, in the analysis of proteins recognizing different methyl-lysine residues in histones, methyl-lysine analogues based on methyl-amino-alkylated cysteine residues have been introduced. Whether these are close enough to justify quantitative interpretation of binding experiments is however questionable. To systematically address this issue, we developed, applied, and assessed a hybrid computational/experimental approach that extracts the binding free energy difference between the native ligand (methyl-lysine) and the analogue (methyl-amino-alkylated cysteine) from a thermodynamic cycle. Our results indicate that measured and calculated binding differences are in very good agreement and therefore allow the correction of measured affinities of the analogues. We suggest that quantitative binding parameters for defined ligands in general can be derived by this method with remarkable accuracy. PMID:21991995

  4. Valuation of ecotoxicological impacts from tributyltin based on a quantitative environmental assessment framework.

    PubMed

    Noring, Maria; Håkansson, Cecilia; Dahlgren, Elin

    2016-02-01

    In the scientific literature, few valuations of biodiversity and ecosystem services following the impacts of toxicity are available, hampered by the lack of ecotoxicological documentation. Here, tributyltin is used to conduct a contingent valuation study as well as cost-benefit analysis (CBA) of measures for improving the environmental status in Swedish coastal waters of the Baltic Sea. Benefits considering different dimensions when assessing environmental status are highlighted and a quantitative environmental assessment framework based on available technology, ecological conditions, and economic valuation methodology is developed. Two scenarios are used in the valuation study: (a) achieving good environmental status by 2020 in accordance with EU legislation (USD 119 household(-1) year(-1)) and (b) achieving visible improvements by 2100 due to natural degradation (USD 108 household(-1) year(-1)) during 8 years. The latter scenario was used to illustrate an application of the assessment framework. The CBA results indicate that both scenarios might generate a welfare improvement. PMID:26178630

  5. The edaphic quantitative protargol stain: a sampling protocol for assessing soil ciliate abundance and diversity.

    PubMed

    Acosta-Mercado, Dimaris; Lynn, Denis H

    2003-06-01

    It has been suggested that species loss from microbial groups low in diversity that occupy trophic positions close to the base of the detrital food web could be critical for terrestrial ecosystem functioning. Among the protozoans within the soil microbial loop, ciliates are presumably the least abundant and of low diversity. However, the lack of a standardized method to quantitatively enumerate and identify them has hampered our knowledge about the magnitude of their active and potential diversity, and about the interactions in which they are involved. Thus, the Edaphic Quantitative Protargol Staining (EQPS) method is provided to simultaneously account for ciliate species richness and abundance in a quantitative and qualitative way. This direct method allows this rapid and simultaneous assessment by merging the Non-flooded Petri Dish (NFPD) method [Prog. Protistol. 2 (1987) 69] and the Quantitative Protargol Stain (QPS) method [Montagnes, D.J.S., Lynn, D.H., 1993. A quantitative protargol stain (QPS) for ciliates and other protists. In: Kemp, P.F., Sherr, B.F., Sherr, E.B., Cole, J.J. (Eds.), Handbook of Methods in Aquatic Microbial Ecology. Lewis Publishers, Boca Raton, FL, pp. 229-240]. The abovementioned protocols were refined by experiments examining the spatial distribution of ciliates under natural field conditions, sampling intensity, the effect of storage, and the use of cytological preparations versus live observations. The EQPS could be useful in ecological studies since it provides both a "snapshot" of the active and effective diversity and a robust estimate of the potential diversity. PMID:12689714

  6. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations which occurred through the initiation stage of carcinogenesis. For the establishment of points of departure (PoD) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed with low doses. Moreover, treatment with DEN at low doses had no effect on development of GST-P positive foci in the liver. These data on PoDs for the markers contribute to understand whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to use in low dose-response assessment must be chosen on the basis of scientific judgment. PMID:26152227

  7. Validity of Quantitative Lymphoscintigraphy as a Lymphedema Assessment Tool for Patients With Breast Cancer

    PubMed Central

    Yoo, Ji-Na; Cheong, Youn-Soo; Min, Yu-Sun; Lee, Sang-Woo; Park, Ho Yong

    2015-01-01

    Objective To evaluate the validity of quantitative lymphoscintigraphy as a useful lymphedema assessment tool for patients with breast cancer surgery including axillary lymph node dissection (ALND). Methods We recruited 72 patients with lymphedema after breast cancer surgery that included ALND. Circumferences in their upper limbs were measured in five areas: 15 cm proximal to the lateral epicondyle (LE), the elbow, 10 cm distal to the LE, the wrist, and the metacarpophalangeal joint. Then, maximal circumference difference (MCD) was calculated by subtracting the unaffected side from the affected side. Quantitative asymmetry indices (QAI) were defined as the radiopharmaceutical uptake ratios of the affected side to the unaffected side. Patients were divided into 3 groups by qualitative lymphoscintigraphic patterns: normal, decreased function, and obstruction. Results The MCD was highest in the qualitative obstruction (2.76±2.48) pattern with significant differences from the normal (0.69±0.78) and decreased function (1.65±1.17) patterns. The QAIs of the axillary LNs showed significant differences among the normal (0.82±0.29), decreased function (0.42±0.41), and obstruction (0.18±0.16) patterns. As the QAI of the axillary LN increased, the MCD decreased. The QAIs of the upper limbs were significantly higher in the obstruction (3.12±3.07) pattern compared with the normal (1.15±0.10) and decreased function (0.79±0.30) patterns. Conclusion Quantitative lymphoscintigraphic analysis is well correlated with both commonly used qualitative lymphoscintigraphic analysis and circumference differences in the upper limbs of patients with breast cancer surgery with ALND. Quantitative lymphoscintigraphy may be a good alternative assessment tool for diagnosing lymphedema after breast cancer surgery with ALND. PMID:26798607
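
    As a minimal illustration of the two quantities defined above, the sketch below computes the maximal circumference difference (MCD) and a quantitative asymmetry index (QAI). The function names and the measurement values are invented for demonstration and are not taken from the study.

```python
def mcd(affected_cm, unaffected_cm):
    """Maximal circumference difference: largest affected-minus-unaffected
    difference across the five measurement sites (cm)."""
    return max(a - u for a, u in zip(affected_cm, unaffected_cm))

def qai(uptake_affected, uptake_unaffected):
    """Quantitative asymmetry index: radiopharmaceutical uptake ratio
    of the affected side to the unaffected side."""
    return uptake_affected / uptake_unaffected

affected = [29.5, 25.0, 24.2, 16.8, 19.0]    # circumferences (cm), five sites
unaffected = [27.0, 24.1, 23.0, 16.5, 18.8]
print(round(mcd(affected, unaffected), 2))   # 2.5
print(round(qai(310.0, 1720.0), 2))          # 0.18
```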

  8. Quantitative muscle strength assessment in duchenne muscular dystrophy: longitudinal study and correlation with functional measures

    PubMed Central

    2012-01-01

    Background The aim of this study was to perform a longitudinal assessment using Quantitative Muscle Testing (QMT) in a cohort of ambulant boys affected by Duchenne muscular dystrophy (DMD) and to correlate the results of QMT with functional measures. This study is to date the most thorough long-term evaluation of QMT in a cohort of DMD patients correlated with other measures, such as the North Star Ambulatory Assessment (NSAA) or the 6-min walk test (6MWT). Methods This is a single-centre, prospective, non-randomised study assessing QMT using the Kin Com® 125 machine in a study cohort of 28 ambulant DMD boys, aged 5 to 12 years. This cohort was assessed longitudinally over a 12-month period with 3-monthly assessments for QMT and with assessment of functional abilities, using the NSAA and the 6MWT at baseline and at 12 months only. QMT was also used in a control group of 13 healthy age-matched boys examined at baseline and at 12 months. Results There was an increase in QMT over 12 months in boys below the age of 7.5 years, while in boys above the age of 7.5 years QMT showed a significant decrease. All the average one-year changes were significantly different from those experienced by healthy controls. We also found a good correlation between quantitative tests and the other measures, which was more obvious in the stronger children. Conclusion Our longitudinal data using QMT in a cohort of DMD patients suggest that this could be used as an additional tool to monitor changes, providing additional information on segmental strength. PMID:22974002

  9. Quantitative autoradiographic assessment of ⁵⁵Fe-RBC distribution in rat brain

    SciTech Connect

    Lin, S.Z.; Nakata, H.; Tajima, A.; Gruber, K.; Acuff, V.; Patlak, C.; Fenstermacher, J.

    1990-11-01

    A simple in vivo technique of labeling erythrocytes (RBCs) with ⁵⁵Fe was developed for quantitative autoradiography (QAR). This procedure involved injecting 5-6 ml of (⁵⁵Fe)ferrous citrate solution (1 mCi/ml) intraperitoneally into donor rats. The number of labeled RBCs reached a maximum at around 7 days and declined very slowly thereafter. Labeled RBCs were harvested from donor rats and used for RBC volume measurement in awake rats. Brain radioactivity was assayed by QAR, which yielded spatial resolution of greater than 50 microns. Tight, nearly irreversible binding of ⁵⁵Fe to RBCs was found in vivo and in vitro. More than 99.5% of the ⁵⁵Fe in the blood of donor rats was bound to RBCs. Because of this, labeled blood can be taken from donors and injected into recipients without further preparation. The tissue absorption of ⁵⁵Fe emissions was the same in gray and white matter. Microvascular RBC volumes measured with ⁵⁵Fe-labeled RBCs agreed with those assayed with ⁵¹Cr-labeled RBCs for many, but not all, brain areas. In conclusion, ⁵⁵Fe-RBCs can be readily prepared by this technique and accurately quantitated in brain tissue by QAR.

  10. Accurate quantitative measurements of brachial artery cross-sectional vascular area and vascular volume elastic modulus using automated oscillometric measurements: comparison with brachial artery ultrasound

    PubMed Central

    Tomiyama, Yuuki; Yoshinaga, Keiichiro; Fujii, Satoshi; Ochi, Noriki; Inoue, Mamiko; Nishida, Mutumi; Aziki, Kumi; Horie, Tatsunori; Katoh, Chietsugu; Tamaki, Nagara

    2015-01-01

    Increasing vascular diameter and attenuated vascular elasticity may be reliable markers for atherosclerotic risk assessment. However, previous measurements have been complex, operator-dependent or invasive. Recently, we developed a new automated oscillometric method to measure a brachial artery's estimated area (eA) and volume elastic modulus (VE). The aim of this study was to investigate the reliability of the new automated oscillometric measurement of eA and VE. Resting eA and VE were measured using the recently developed automated detector with the oscillometric method. eA was estimated using pressure/volume curves, and VE was defined as VE = Δpressure/(100 × Δarea/area) mm Hg/%. Sixteen volunteers (age 35.2±13.1 years) underwent the oscillometric measurements and brachial ultrasound at rest and under nitroglycerin (NTG) administration. Oscillometric measurement was performed twice on different days. The resting eA correlated with ultrasound-measured brachial artery area (r=0.77, P<0.001). Resting eA and VE measurements showed good reproducibility (eA: intraclass correlation coefficient (ICC)=0.88; VE: ICC=0.78). Under NTG stress, eA significantly increased (12.3±3.0 vs. 17.1±4.6 mm², P<0.001), similar to the ultrasound evaluation (4.46±0.72 vs. 4.73±0.75 mm, P<0.001). VE also decreased (0.81±0.16 vs. 0.65±0.11 mm Hg/%, P<0.001) after NTG. Cross-sectional vascular area calculated using this automated oscillometric measurement correlated with the ultrasound measurement and showed good reproducibility. Therefore, this is a reliable approach, and this modality may have practical application in automatically assessing muscular artery diameter and elasticity in clinical or epidemiological settings. PMID:25693851
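
    The VE definition quoted in the abstract can be evaluated directly. The sketch below is illustrative only: the function name and the input numbers are assumptions, not data from the study.

```python
def volume_elastic_modulus(delta_pressure_mmhg, area_mm2, delta_area_mm2):
    """VE = delta-pressure / (100 * delta-area/area), in mmHg per percent
    area change, as defined in the abstract."""
    percent_area_change = 100.0 * delta_area_mm2 / area_mm2
    return delta_pressure_mmhg / percent_area_change

# e.g. a 20 mmHg pulse pressure producing a 3 mm^2 change on a 12.3 mm^2 artery
ve = volume_elastic_modulus(20.0, 12.3, 3.0)
print(round(ve, 2))  # 0.82
```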

  11. Exploring new quantitative CT image features to improve assessment of lung cancer prognosis

    NASA Astrophysics Data System (ADS)

    Emaminejad, Nastaran; Qian, Wei; Kang, Yan; Guan, Yubao; Lure, Fleming; Zheng, Bin

    2015-03-01

    Due to the promotion of lung cancer screening, more Stage I non-small-cell lung cancers (NSCLC) are currently detected, which usually have favorable prognosis. However, a high percentage of these patients have cancer recurrence after surgery, which reduces the overall survival rate. To achieve optimal efficacy in treating and managing Stage I NSCLC patients, it is important to develop more accurate and reliable biomarkers or tools to predict cancer prognosis. The purpose of this study is to investigate a new quantitative image analysis method to predict the risk of lung cancer recurrence of Stage I NSCLC patients after lung cancer surgery using conventional chest computed tomography (CT) images, and to compare the prediction result with a popular genetic biomarker, namely protein expression of the excision repair cross-complementing 1 (ERCC1) gene. In this study, we developed and tested a new computer-aided detection (CAD) scheme to segment lung tumors and initially compute 35 tumor-related morphologic and texture features from CT images. By applying a machine learning based feature selection method, we identified a set of 8 effective and non-redundant image features. Using these features we trained a naïve Bayesian network based classifier to predict the risk of cancer recurrence. When applied to a test dataset with 79 Stage I NSCLC cases, the computed areas under the ROC curves were 0.77±0.06 and 0.63±0.07 when using the quantitative image based classifier and ERCC1, respectively. The study results demonstrated the feasibility of improving the accuracy of predicting cancer prognosis or recurrence risk using a CAD-based quantitative image analysis method.

  12. Application of a Multiplex Quantitative PCR to Assess Prevalence and Intensity Of Intestinal Parasite Infections in a Controlled Clinical Trial

    PubMed Central

    Llewellyn, Stacey; Inpankaew, Tawin; Nery, Susana Vaz; Gray, Darren J.; Verweij, Jaco J.; Clements, Archie C. A.; Gomes, Santina J.; Traub, Rebecca; McCarthy, James S.

    2016-01-01

    Background Accurate quantitative assessment of infection with soil-transmitted helminths and protozoa is key to the interpretation of epidemiologic studies of these parasites, as well as for monitoring large-scale treatment efficacy and effectiveness studies. As morbidity and transmission of helminth infections are directly related to both the prevalence and intensity of infection, there is particular need for improved techniques for assessment of infection intensity for both purposes. The current study aimed to evaluate two multiplex PCR assays to determine prevalence and intensity of intestinal parasite infections, and compare them to standard microscopy. Methodology/Principal Findings Faecal samples were collected from a total of 680 people, originating from rural communities in Timor-Leste (467 samples) and Cambodia (213 samples). DNA was extracted from stool samples and subjected to two multiplex real-time PCR reactions, the first targeting Necator americanus, Ancylostoma spp., Ascaris spp., and Trichuris trichiura; and the second Entamoeba histolytica, Cryptosporidium spp., Giardia duodenalis, and Strongyloides stercoralis. Samples were also subjected to sodium nitrate flotation for identification and quantification of STH eggs, and zinc sulphate centrifugal flotation for detection of protozoan parasites. Higher parasite prevalence was detected by multiplex PCR (hookworms 2.9 times higher, Ascaris 1.2, Giardia 1.6), along with superior polyparasitism detection, with this effect magnified as the number of parasites present increased (one: 40.2% vs. 38.1%, two: 30.9% vs. 12.9%, three: 7.6% vs. 0.4%, four: 0.4% vs. 0%). Although all STH-positive samples were low-intensity infections by microscopy as defined by WHO guidelines, the DNA load detected by multiplex PCR suggested higher-intensity infections. Conclusions/Significance Multiplex PCR, in addition to superior sensitivity, enabled more accurate determination of infection intensity for Ascaris, hookworms and

  13. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study is to quantitatively assess myocardial perfusion by a first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed for assessing haemodynamically significant coronary artery stenosis, they have recognized limitations, and new techniques are still needed. Experiments were performed on five closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated using the acquired CT images. Our study shows that HUDRCT has a good correlation (y = 0.07245 + 0.09963x, r² = 0.898) with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, indicated by an Area Under the Curve (AUC) value of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.
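
    For illustration, the reported linear fit relating HUDRCT to the reference measures can be applied directly. The coefficients below come from the abstract; the sample HUDRCT values and the function name are invented for demonstration.

```python
def predict_reference(hudrct, intercept=0.07245, slope=0.09963):
    """Linear fit reported in the abstract: y = 0.07245 + 0.09963x."""
    return intercept + slope * hudrct

for x in (2.0, 5.0, 8.0):
    print(round(predict_reference(x), 3))  # 0.272, 0.571, 0.869
```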

  14. Impact assessment of abiotic resources in LCA: quantitative comparison of selected characterization models.

    PubMed

    Rørbech, Jakob T; Vadenbo, Carl; Hellweg, Stefanie; Astrup, Thomas F

    2014-10-01

    Resources have received significant attention in recent years, resulting in the development of a wide range of resource depletion indicators within life cycle assessment (LCA). Understanding the differences in assessment principles used to derive these indicators and their effects on the impact assessment results is critical for indicator selection and interpretation of the results. Eleven resource depletion methods were evaluated quantitatively with respect to resource coverage, characterization factors (CF), impact contributions from individual resources, and total impact scores. We included 2247 individual market inventory data sets covering a wide range of societal activities (ecoinvent database v3.0). Log-linear regression analysis was carried out for all pairwise combinations of the 11 methods for identification of correlations in CFs (resources) and total impacts (inventory data sets) between methods. Significant differences in resource coverage were observed (9-73 resources), revealing a trade-off between resource coverage and model complexity. High correlation in CFs between methods did not necessarily manifest in high correlation in total impacts. This indicates that resource coverage may also be critical for impact assessment results. Although no consistent correlations between methods applying similar assessment models could be observed, all methods showed relatively high correlation regarding the assessment of energy resources. Finally, we classify the existing methods into three groups, according to method focus and modeling approach, to aid method selection within LCA. PMID:25208267

  15. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent present in the lungs. Several measures have been introduced for the quantification of the extent of disease directly from CT data in order to add to the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and the fractal dimension to visual grade in order to evaluate how well quantitative scores can predict radiologist visual scoring of emphysema from low-dose CT scans, and to determine which measures can be useful as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole-lung scans. In addition, a visual grade of each section was also given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved when using quantitative score to predict visual grade, rising to 73% if mild and moderate cases are treated as a single class.
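
    Two of the densitometric measures compared above can be sketched in a few lines. The -950 HU threshold for the emphysema index is a common convention in the densitometry literature and is assumed here; it is not specified in the abstract, and the voxel values are invented.

```python
def emphysema_index(hu_values, threshold=-950):
    """Percentage of lung voxels below the attenuation threshold (HU)."""
    return 100.0 * sum(1 for v in hu_values if v < threshold) / len(hu_values)

def mean_lung_density(hu_values):
    """Mean attenuation (HU) over the lung voxels."""
    return sum(hu_values) / len(hu_values)

voxels = [-980, -960, -940, -900, -870, -850]   # illustrative HU values
print(round(emphysema_index(voxels), 1))   # 33.3
print(round(mean_lung_density(voxels), 1)) # -916.7
```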

  16. Analytical and Clinical Validation of a Digital Sequencing Panel for Quantitative, Highly Accurate Evaluation of Cell-Free Circulating Tumor DNA

    PubMed Central

    Zill, Oliver A.; Sebisanovic, Dragan; Lopez, Rene; Blau, Sibel; Collisson, Eric A.; Divers, Stephen G.; Hoon, Dave S. B.; Kopetz, E. Scott; Lee, Jeeyun; Nikolinakos, Petros G.; Baca, Arthur M.; Kermani, Bahram G.; Eltoukhy, Helmy; Talasaz, AmirAli

    2015-01-01

    Next-generation sequencing of cell-free circulating solid tumor DNA addresses two challenges in contemporary cancer care. First, this method of massively parallel and deep sequencing enables assessment of a comprehensive panel of genomic targets from a single sample, and second, it obviates the need for repeat invasive tissue biopsies. Digital Sequencing™ is a novel method for high-quality sequencing of circulating tumor DNA simultaneously across a comprehensive panel of over 50 cancer-related genes with a simple blood test. Here we report the analytic and clinical validation of the gene panel. Analytic sensitivity down to 0.1% mutant allele fraction is demonstrated via serial dilution studies of known samples. Near-perfect analytic specificity (> 99.9999%) enables complete coverage of many genes without the false positives typically seen with traditional sequencing assays at mutant allele frequencies or fractions below 5%. We compared digital sequencing of plasma-derived cell-free DNA to tissue-based sequencing on 165 consecutive matched samples from five outside centers in patients with stage III-IV solid tumor cancers. Clinical sensitivity of plasma-derived NGS was 85.0%, comparable to 80.7% sensitivity for tissue. The assay success rate on 1,000 consecutive samples in clinical practice was 99.8%. Digital sequencing of plasma-derived DNA is indicated in advanced cancer patients to prevent repeated invasive biopsies when the initial biopsy is inadequate, unobtainable for genomic testing, or uninformative, or when the patient’s cancer has progressed despite treatment. Its clinical utility is derived from reduction in the costs, complications and delays associated with invasive tissue biopsies for genomic testing. PMID:26474073

  17. Analytical and Clinical Validation of a Digital Sequencing Panel for Quantitative, Highly Accurate Evaluation of Cell-Free Circulating Tumor DNA.

    PubMed

    Lanman, Richard B; Mortimer, Stefanie A; Zill, Oliver A; Sebisanovic, Dragan; Lopez, Rene; Blau, Sibel; Collisson, Eric A; Divers, Stephen G; Hoon, Dave S B; Kopetz, E Scott; Lee, Jeeyun; Nikolinakos, Petros G; Baca, Arthur M; Kermani, Bahram G; Eltoukhy, Helmy; Talasaz, AmirAli

    2015-01-01

    Next-generation sequencing of cell-free circulating solid tumor DNA addresses two challenges in contemporary cancer care. First, this method of massively parallel and deep sequencing enables assessment of a comprehensive panel of genomic targets from a single sample, and second, it obviates the need for repeat invasive tissue biopsies. Digital Sequencing™ is a novel method for high-quality sequencing of circulating tumor DNA simultaneously across a comprehensive panel of over 50 cancer-related genes with a simple blood test. Here we report the analytic and clinical validation of the gene panel. Analytic sensitivity down to 0.1% mutant allele fraction is demonstrated via serial dilution studies of known samples. Near-perfect analytic specificity (> 99.9999%) enables complete coverage of many genes without the false positives typically seen with traditional sequencing assays at mutant allele frequencies or fractions below 5%. We compared digital sequencing of plasma-derived cell-free DNA to tissue-based sequencing on 165 consecutive matched samples from five outside centers in patients with stage III-IV solid tumor cancers. Clinical sensitivity of plasma-derived NGS was 85.0%, comparable to 80.7% sensitivity for tissue. The assay success rate on 1,000 consecutive samples in clinical practice was 99.8%. Digital sequencing of plasma-derived DNA is indicated in advanced cancer patients to prevent repeated invasive biopsies when the initial biopsy is inadequate, unobtainable for genomic testing, or uninformative, or when the patient's cancer has progressed despite treatment. Its clinical utility is derived from reduction in the costs, complications and delays associated with invasive tissue biopsies for genomic testing. PMID:26474073

  18. Quantitative assessment of autonomic dysreflexia with combined spectroscopic and perfusion probes

    NASA Astrophysics Data System (ADS)

    Ramella-Roman, Jessica C.; Pfefer, Allison; Hidler, Joseph

    2009-02-01

    Autonomic Dysreflexia (AD) is an uncontrolled response of sympathetic output occurring in individuals with an injury at the sixth thoracic (T6) neurologic level. Any noxious stimulus below the injury level can trigger an AD episode. Progression of an AD attack can result in severe vasoconstriction below the injury level. Skin oxygenation can decrease by up to 40% during an AD event. We present a quantitative and non-invasive method of assessing the progression of an AD event by measuring the patient's skin oxygen levels and blood flow using a fiber-optic-based system.

  19. Quantitative Assessment of the Effects of Oxidants on Antigen-Antibody Binding In Vitro

    PubMed Central

    Han, Shuang; Wang, Guanyu; Xu, Naijin; Liu, Hui

    2016-01-01

    Objective. We quantitatively assessed the influence of oxidants on antigen-antibody-binding activity. Methods. We used several immunological detection methods, including precipitation reactions, agglutination reactions, and enzyme immunoassays, to determine antibody activity. The oxidation-reduction potential was measured in order to determine total serum antioxidant capacity. Results. Certain concentrations of oxidants resulted in significant inhibition of antibody activity but had little influence on total serum antioxidant capacity. Conclusions. Oxidants had a significant influence on interactions between antigen and antibody, but minimal effect on the peptide of the antibody molecule. PMID:27313823

  20. Global Quantitative Assessment of Colorectal Polyp Burden in Familial Adenomatous Polyposis Using a Web-based Tool

    PubMed Central

    Lynch, Patrick M.; Morris, Jeffrey S.; Ross, William A.; Rodriguez-Bigas, Miguel A.; Posadas, Juan; Khalaf, Rossa; Weber, Diane M.; Sepeda, Valerie O.; Levin, Bernard; Shureiqi, Imad

    2013-01-01

    Background Accurate measures of total polyp burden in familial adenomatous polyposis (FAP) are lacking. Current assessment tools include polyp quantitation in limited-field photographs and qualitative total colorectal polyp burden by video. Objective To develop global quantitative tools of FAP colorectal adenoma burden. Design and Interventions A single-arm phase II trial in 27 FAP patients treated with celecoxib for 6 months, with pre- and post-treatment videos posted to an intranet with an interactive site for scoring. Main outcome measurements Global adenoma counts and sizes (grouped into categories: <2 mm, 2–4 mm, and >4 mm) were scored from videos using a novel web-based tool. Baseline and end-of-study adenoma burden results were summarized using five models. Correlations between pairs of reviewers were analyzed for each model. Results Interobserver agreement was high for all 5 measures of polyp burden. Measures employing both polyp count and polyp size had better interobserver agreement than measures based only on polyp count. The measure in which polyp counts were weighted according to diameter, calculated as (1) × (no. of polyps <2 mm) + (3) × (no. of polyps 2–4 mm) + (5) × (no. of polyps >4 mm), had the highest interobserver agreement (Pearson r = 0.978 between the two gastroenterologists; 0.786 and 0.846 for the surgeon vs. each gastroenterologist). Treatment reduced polyp burden by these measurements in 70–89% of subjects (p<0.001). Limitations Phase II study. Conclusions This novel web-based polyp scoring method provides a convenient and reproducible way to quantify global colorectal adenoma burden in FAP patients and a framework for developing a clinical staging system for FAP. PMID:23332604
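
    The size-weighted measure with the highest interobserver agreement is simple enough to state in code. The weights (1, 3, 5) come from the abstract; the polyp counts below are invented for demonstration.

```python
def weighted_polyp_burden(n_small, n_medium, n_large):
    """Polyp counts weighted by diameter category:
    1 x (<2 mm) + 3 x (2-4 mm) + 5 x (>4 mm)."""
    return 1 * n_small + 3 * n_medium + 5 * n_large

# e.g. 40 polyps <2 mm, 12 polyps 2-4 mm, 3 polyps >4 mm
print(weighted_polyp_burden(40, 12, 3))  # 40 + 36 + 15 = 91
```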

  1. Hydrostratigraphic Drilling Record Assessment (HyDRA): Assessing the Consistency and Quantitative Utility of Water Well Drillers' Logs

    NASA Astrophysics Data System (ADS)

    Bohling, G.; Helm, C. F.; Butler, J. J., Jr.

    2014-12-01

    The Hydrostratigraphic Drilling Record Assessment (HyDRA) project is a three-year study to develop improved methods for building groundwater flow models from drillers' logs. Lithologic logs recorded by water well drillers represent a voluminous source of information regarding hydrostratigraphy. However, developing quantitative models from drillers' logs is challenging due to the idiosyncratic nature of each driller's approach to describing sediments and lithologies, as well as variability in the amount of care invested in the description process. This presentation uses three approaches to assess the consistency and utility of drillers' logs from 250 wells in the vicinity of a continuously monitored "index" well in the High Plains Aquifer in Thomas County, Kansas. The first assessment procedure will examine logs from wells in the vicinity of the index well to determine whether they show evidence of lateral confinement of a region immediately surrounding the index well, as seems to be indicated by the index well hydrograph. The second will apply a cross-validation procedure to determine the degree of consistency among logs at different wells and identify logs that are most out of keeping with logs at nearby wells. The logs are cast in quantitative terms by first representing the sediment descriptions using 72 standardized lithology terms, further categorizing the standardized lithologies into five hydraulic property categories, and then computing the proportions of the hydraulic property categories over regular ten-foot intervals in each well. The cross-validation procedure involves using a cross-entropy measure to compare the actual category proportions in each well to those interpolated from neighboring wells. Finally, results of a groundwater flow model using property fields developed from the drillers' logs will be briefly discussed. Comparisons between observed and simulated water levels at the index well and other continuously and annually monitored wells in the
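
    The cross-entropy comparison between observed and interpolated category proportions can be sketched as below. The formula is the standard cross-entropy; the smoothing constant, the example proportions, and the interpolation step itself are assumptions for illustration, not taken from the HyDRA project.

```python
import math

def cross_entropy(observed, interpolated, eps=1e-9):
    """H(p, q) = -sum_i p_i * log(q_i); lower values indicate better
    agreement between a well's observed category proportions (p) and
    those interpolated from its neighbours (q)."""
    return -sum(p * math.log(q + eps) for p, q in zip(observed, interpolated))

obs = [0.50, 0.20, 0.15, 0.10, 0.05]     # five hydraulic property categories
interp = [0.45, 0.25, 0.15, 0.10, 0.05]  # proportions from neighbouring wells
print(round(cross_entropy(obs, interp), 3))  # 1.341
```

Wells whose logs are most out of keeping with their neighbours would show the largest cross-entropy values.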

  2. Monitoring and quantitative assessment of tumor burden using in vivo bioluminescence imaging

    NASA Astrophysics Data System (ADS)

    Chen, Chia-Chi; Hwang, Jeng-Jong; Ting, Gann; Tseng, Yun-Long; Wang, Shyh-Jen; Whang-Peng, Jaqueline

    2007-02-01

    In vivo bioluminescence imaging (BLI) is a sensitive imaging modality that is rapid and accessible, and may comprise an ideal tool for evaluating tumor growth. In this study, the kinetics of tumor growth were assessed in a C26 colon carcinoma-bearing BALB/c mouse model. The ability of BLI to noninvasively quantitate the growth of subcutaneous tumors transplanted with C26 cells genetically engineered to stably express firefly luciferase and herpes simplex virus type-1 thymidine kinase (C26/tk-luc) was evaluated. A good correlation (R² = 0.998) of photon emission to cell number was found in vitro. Tumor burden and tumor volume were monitored in vivo over time by quantitation of photon emission using a Xenogen IVIS 50 and standard external caliper measurement, respectively. At various time intervals, tumor-bearing mice were imaged to determine the correlation of in vivo BLI to tumor volume; a correlation was observed when tumor volume was smaller than 1000 mm³ (R² = 0.907). Gamma scintigraphy combined with [¹³¹I]FIAU was another imaging modality used to verify the previous results. In conclusion, this study showed that bioluminescence imaging is a powerful and quantitative tool for directly monitoring tumor growth in vivo. The dual-reporter-gene-transfected tumor-bearing animal model can be applied in evaluating the efficacy of newly developed anti-cancer drugs.

  3. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography

    PubMed Central

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-01-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters including pore size, pore shape, strut size, surface area, porosity, and interconnectivity were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. We conclude that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
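
    One of the morphological parameters listed above, porosity, falls out directly from a binarised volume. A minimal sketch, assuming a nested-list representation where 1 marks a material voxel and 0 marks pore space (the volume below is a toy example, not the authors' data):

```python
def porosity(binary_volume):
    """Fraction of voxels that are pore space (0) rather than material (1)."""
    flat = [v for plane in binary_volume for row in plane for v in row]
    return 1.0 - sum(flat) / len(flat)

# 4x4x4 volume with an 8-voxel material strut in the centre
vol = [[[1 if 1 <= x < 3 and 1 <= y < 3 and 1 <= z < 3 else 0
         for x in range(4)] for y in range(4)] for z in range(4)]
print(porosity(vol))  # 0.875
```

In practice the binarised OCT volume would be an image array and parameters such as pore size or interconnectivity need connected-component analysis, which this sketch deliberately omits.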

  4. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    PubMed Central

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J

    2015-01-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with those obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging, and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle. PMID:18612176
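
    The fitting step described above can be sketched under stated assumptions: the dispersion relation below is the standard Voigt-model form used in shear wave elastography (the abstract does not name the specific model), the density and "true" parameters are invented, and a naive grid search stands in for the authors' nonlinear least-squares fit.

```python
import math

RHO = 1000.0  # assumed tissue density, kg/m^3

def voigt_speed(omega, mu, eta):
    """Shear wave phase velocity (m/s) under the Voigt model for shear
    modulus mu (Pa) and shear viscosity eta (Pa*s)."""
    m = math.sqrt(mu ** 2 + (omega * eta) ** 2)
    return math.sqrt(2.0 * m ** 2 / (RHO * (mu + m)))

# synthetic dispersion data from assumed "true" parameters mu = 4 kPa, eta = 2 Pa*s
freqs = [100.0, 150.0, 200.0, 250.0, 300.0]  # Hz
data = [voigt_speed(2 * math.pi * f, 4000.0, 2.0) for f in freqs]

# recover a frequency-independent (mu, eta) pair by grid search over the
# sum of squared residuals
best = min(
    ((mu, eta) for mu in range(1000, 8001, 250) for eta in range(0, 41)),
    key=lambda p: sum(
        (voigt_speed(2 * math.pi * f, p[0], p[1]) - d) ** 2
        for f, d in zip(freqs, data)
    ),
)
print(best)  # (4000, 2)
```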

  5. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography.

    PubMed

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-03-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters including pore size, pore shape, strut size, surface area, porosity, and interconnectivity were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. We conclude that this method will be a key tool for non-destructive and quantitative characterisation, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
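
    Two of the morphological parameters named above, porosity and interconnectivity, can be computed directly from a binarised volume: porosity is the pore-voxel fraction, and interconnectivity can be taken as the fraction of pore voxels reachable from the volume boundary. The toy lattice and the specific interconnectivity definition below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from collections import deque

# Toy binarised scaffold volume: True = material, False = pore space.
vol = np.zeros((30, 30, 30), dtype=bool)
for k in range(0, 30, 10):          # orthogonal strut walls every 10 voxels
    vol[k:k + 3, :, :] = True
    vol[:, k:k + 3, :] = True

porosity = 1.0 - vol.mean()         # pore-voxel fraction

# Interconnectivity: fraction of pore voxels reachable from the volume
# boundary through 6-connected pore paths (BFS flood fill from the faces).
pore = ~vol
seen = np.zeros_like(pore)
q = deque()
for z, y, x in np.argwhere(pore):
    if 0 in (z, y, x) or 29 in (z, y, x):
        seen[z, y, x] = True
        q.append((z, y, x))
while q:
    z, y, x = q.popleft()
    for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                       (0, -1, 0), (0, 0, 1), (0, 0, -1)):
        n = (z + dz, y + dy, x + dx)
        if all(0 <= c < 30 for c in n) and pore[n] and not seen[n]:
            seen[n] = True
            q.append(n)
interconnectivity = seen.sum() / pore.sum()
```

For this fully open lattice every pore voxel reaches the boundary, so the interconnectivity is 1.0; a scaffold with closed pores would score lower.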

  6. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause for fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, recovery time, and whether the first input was correct or incorrect. Other metrics included are: the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading are reviewed as well.

  7. Disc Degeneration Assessed by Quantitative T2* (T2 star) Correlated with Functional Lumbar Mechanics

    PubMed Central

    Ellingson, Arin M.; Mehta, Hitesh; Polly, David W.; Ellermann, Jutta; Nuckley, David J.

    2013-01-01

    Study Design Experimental correlation study design to quantify features of disc health, including signal intensity and distinction between the annulus fibrosus (AF) and nucleus pulposus (NP), with T2* magnetic resonance imaging (MRI) and correlate with the functional mechanics in corresponding motion segments. Objective Establish the relationship between disc health assessed by quantitative T2* MRI and functional lumbar mechanics. Summary of Background Data Degeneration leads to altered biochemistry in the disc, affecting the mechanical competence. Routine clinical MRI sequences are not adequate for detecting early degenerative changes and fail to correlate with pain or to improve patient stratification. Quantitative T2* relaxation time mapping probes biochemical features and may offer more sensitivity in assessing disc degeneration. Methods Cadaveric lumbar spines were imaged using quantitative T2* mapping, as well as conventional T2-weighted MRI sequences. Discs were graded by the Pfirrmann scale and features of disc health, including signal intensity (T2* Intensity Area) and distinction between the AF and NP (Transition Zone Slope), were quantified by T2*. Each motion segment was subjected to pure moment bending to determine range of motion (ROM), neutral zone (NZ), and bending stiffness. Results T2* Intensity Area and Transition Zone Slope were significantly correlated with flexion ROM (p=0.015; p=0.002), ratio of NZ/ROM (p=0.010; p=0.028), and stiffness (p=0.044; p=0.026), as well as lateral bending NZ/ROM (p=0.005; p=0.010) and stiffness (p=0.022; p=0.029). T2* Intensity Area was also correlated with lateral bending ROM (p=0.023). Pfirrmann grade was only correlated with lateral bending NZ/ROM (p=0.001) and stiffness (p=0.007). Conclusions T2* mapping is a sensitive quantitative method capable of detecting changes associated with disc degeneration. Features of disc health quantified with T2* predicted altered functional mechanics of the lumbar spine better than Pfirrmann grading.
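
    The reported p-values come from correlating an imaging metric with a mechanical one across motion segments. A minimal sketch of that step, using Pearson's r and its t-statistic on hypothetical paired values (not the study's data):

```python
import numpy as np

# Illustrative paired measurements: a T2*-derived metric vs flexion ROM.
t2_star = np.array([12.1, 10.4, 9.8, 8.7, 7.9, 6.5, 5.2, 4.8])  # arbitrary units
rom_deg = np.array([9.5, 9.1, 8.2, 8.4, 7.1, 6.8, 6.1, 5.9])    # degrees

r = np.corrcoef(t2_star, rom_deg)[0, 1]          # Pearson correlation
n = t2_star.size
t_stat = r * np.sqrt((n - 2) / (1 - r ** 2))     # test statistic for H0: rho = 0
```

The p-value follows from the t distribution with n - 2 degrees of freedom (e.g. via `scipy.stats`, omitted here to keep the sketch dependency-free).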

  8. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures, and at the time of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log cfu/g), and a PERT distribution showed that the mean temperature at markets was 6.63°C. An exponential dose-response model [P = 1 - exp(-7.64 × 10^-8 × N), where N = dose] was deemed appropriate for hazard characterization. Mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean values for the probability of illness per person per day were higher in processed cheese (mean: 2.24 × 10^-9; maximum: 7.97 × 10^-6) than in natural cheese (mean: 7.84 × 10^-10; maximum: 2.32 × 10^-6). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the developed stochastic risk assessment model in this study can be useful in establishing microbial criteria for S. aureus in cheese. PMID:26162789
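
    The risk-characterisation step combines a sampled dose with the exponential dose-response model P = 1 - exp(-r × N). The sketch below uses the r value quoted in the abstract but a hypothetical lognormal dose distribution as a stand-in for the exposure-assessment output:

```python
import numpy as np

rng = np.random.default_rng(1)
r = 7.64e-8                       # dose-response parameter from the abstract

# Hypothetical dose distribution (CFU ingested per serving) for illustration.
doses = rng.lognormal(mean=2.0, sigma=1.0, size=100_000)

# Exponential dose-response: probability of illness for each simulated serving.
p_ill = 1.0 - np.exp(-r * doses)
mean_risk = p_ill.mean()          # mean probability of illness per serving
```

Because r is tiny, the model is nearly linear in dose at these exposure levels, which is why the simulated risks are on the order of 10^-7 to 10^-6.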

  9. Quantitative photoacoustic assessment of ex-vivo lymph nodes of colorectal cancer patients

    NASA Astrophysics Data System (ADS)

    Sampathkumar, Ashwin; Mamou, Jonathan; Saegusa-Beercroft, Emi; Chitnis, Parag V.; Machi, Junji; Feleppa, Ernest J.

    2015-03-01

    Staging of cancers and selection of appropriate treatment requires histological examination of multiple dissected lymph nodes (LNs) per patient; the staggering number of nodes requiring histopathological examination creates a severe processing bottleneck for the finite resources of pathology facilities. Histologically examining the entire 3D volume of every dissected node is not feasible, and therefore, only the central region of each node is examined histologically, which results in severe sampling limitations. In this work, we assess the feasibility of using quantitative photoacoustics (QPA) to overcome the limitations imposed by current procedures and eliminate the resulting undersampling in node assessments. QPA is emerging as a new hybrid modality that assesses tissue properties and classifies tissue type based on multiple estimates derived from spectrum analysis of photoacoustic (PA) radiofrequency (RF) data and from statistical analysis of envelope-signal data derived from the RF signals. Our study seeks to use QPA to distinguish cancerous from non-cancerous regions of dissected LNs and hence serve as a reliable means of imaging and detecting small but clinically significant cancerous foci that would be missed by current methods. Dissected lymph nodes were placed in a water bath and PA signals were generated using a wavelength-tunable (680-950 nm) laser. A 26-MHz, f-2 transducer was used to sense the PA signals. We present an overview of our experimental setup; provide a statistical analysis of multi-wavelength classification parameters (mid-band fit, slope, intercept) obtained from the PA signal spectrum generated in the LNs; and compare QPA performance with our established quantitative ultrasound (QUS) techniques in distinguishing metastatic from non-cancerous tissue in dissected LNs. QPA-QUS methods offer a novel general means of tissue typing and evaluation in a broad range of disease-assessment applications, e.g., cardiac, intravascular
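
    The three spectral parameters named above (mid-band fit, slope, intercept) are conventionally obtained by fitting a line to the calibrated power spectrum, in dB, over the analysis band; the mid-band fit is the fitted value at the band centre. The band limits and spectrum below are synthetic placeholders:

```python
import numpy as np

# Hypothetical calibrated power spectrum over a 16-36 MHz analysis band.
freq_mhz = np.linspace(16, 36, 41)
rng = np.random.default_rng(2)
spectrum_db = -0.5 * freq_mhz - 4.0 + rng.normal(0, 0.3, freq_mhz.size)

# Linear regression of spectrum (dB) vs frequency (MHz).
slope, intercept = np.polyfit(freq_mhz, spectrum_db, 1)   # dB/MHz, dB
midband_fit = slope * freq_mhz.mean() + intercept          # dB at band centre
```

Classification then treats (midband_fit, slope, intercept), estimated per region and per wavelength, as features.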

  10. Changes in transmural distribution of myocardial perfusion assessed by quantitative intravenous myocardial contrast echocardiography in humans

    PubMed Central

    Fukuda, S; Muro, T; Hozumi, T; Watanabe, H; Shimada, K; Yoshiyama, M; Takeuchi, K; Yoshikawa, J

    2002-01-01

    Objective: To clarify whether changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous myocardial contrast echocardiography (MCE) in humans. Methods: 31 patients underwent dipyridamole stress MCE and quantitative coronary angiography. Intravenous MCE was performed by continuous infusion of Levovist. Images were obtained from the apical four chamber view with alternating pulsing intervals both at rest and after dipyridamole infusion. Images were analysed offline by placing regions of interest over both endocardial and epicardial sides of the mid-septum. The background subtracted intensity versus pulsing interval plots were fitted to an exponential function, y = A(1 − e^(−βt)), where A is plateau level and β is rate of rise. Results: Of the 31 patients, 16 had significant stenosis (> 70%) in the left anterior descending artery (group A) and 15 did not (group B). At rest, there were no differences in the A endocardial to epicardial ratio (A-EER) and β-EER between the two groups (mean (SD) 1.2 (0.6) v 1.2 (0.8) and 1.2 (0.7) v 1.1 (0.6), respectively, NS). During hyperaemia, β-EER in group A was significantly lower than that in group B (1.0 (0.5) v 1.4 (0.5), p < 0.05) and A-EER did not differ between the two groups (1.0 (0.5) v 1.2 (0.4), NS). Conclusions: Changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous MCE in humans. PMID:12231594
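
    Fitting the replenishment model y = A(1 − e^(−βt)) is a small nonlinear least-squares problem; for a fixed β the optimal A has a closed form, so a one-dimensional scan over β suffices. The data below are synthetic, not from the study:

```python
import numpy as np

# Synthetic intensity-versus-pulsing-interval data, ground truth A=20, beta=1.5.
t = np.array([0.2, 0.5, 1.0, 2.0, 4.0, 8.0])          # pulsing intervals [s]
rng = np.random.default_rng(3)
y = 20.0 * (1 - np.exp(-1.5 * t)) + rng.normal(0, 0.2, t.size)

# Scan beta; for each candidate, the least-squares A is (y.f)/(f.f).
best = (np.inf, 0.0, 0.0)
for beta in np.linspace(0.1, 5.0, 491):
    f = 1 - np.exp(-beta * t)
    a = (y @ f) / (f @ f)
    sse = float(np.sum((y - a * f) ** 2))
    if sse < best[0]:
        best = (sse, a, beta)
_, A_fit, beta_fit = best
```

The endocardial-to-epicardial ratios (A-EER, β-EER) then follow by fitting each region of interest separately and dividing the estimates.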

  11. Quantitative imaging and sigmoidoscopy to assess distribution of rectal microbicide surrogates.

    PubMed

    Hendrix, C W; Fuchs, E J; Macura, K J; Lee, L A; Parsons, T L; Bakshi, R P; Khan, W A; Guidos, A; Leal, J P; Wahl, R

    2008-01-01

    Understanding the distribution of microbicide and human immunodeficiency virus (HIV) within the gastrointestinal tract is critical to development of rectal HIV microbicides. A hydroxyethylcellulose-based microbicide surrogate or viscosity-matched semen surrogate, labeled with gadolinium-DTPA (diethylene triamine pentaacetic acid) and 99mTechnetium-sulfur colloid, was administered to three subjects under varying experimental conditions to evaluate effects of enema, coital simulation, and microbicide or semen simulant over 5 h duration. Quantitative assessment used single photon emission computed tomography (SPECT)/computed tomography (CT), magnetic resonance imaging (MRI), and sigmoidoscopic sampling. Over 4 h, radiolabel migrated cephalad in all studies by a median (interquartile range) of 50% (29-102%; P<0.001), as far as the splenic flexure (approximately 60 cm) in 12% of studies. There was a correlation in concentration profile between endoscopic sampling and SPECT assessments. HIV-sized particles migrate retrograde, 60 cm in some studies, 4 h after simulated ejaculation in our model. SPECT/CT, MRI, and endoscopy can be used quantitatively to facilitate rational development of microbicides for rectal use. PMID:17507921

  12. Quantitative volumetric assessment of pulmonary involvement in patients with systemic sclerosis

    PubMed Central

    Göya, Cemil; Hamidi, Cihad; Tekbaş, Güven; Abakay, Özlem; Batmaz, İbrahim; Hattapoğlu, Salih; Yavuz, Alpaslan; Bilici, Aslan

    2016-01-01

    Background Computed tomography (CT) is the gold standard for assessing interstitial lung disease (ILD) in patients with systemic sclerosis (SSc). In this study, we performed a quantitative calculation of ILD severity by examining the lung volume of SSc patients. Methods The present study was performed retrospectively on 38 patients with SSc who were referred to our clinic. Patients were divided into two groups based on high-resolution computed tomography (HRCT): patients with ILD and patients without ILD. The percentage of lower lobe volume (PLLV) was calculated using HRCT. In addition, we evaluated the PLLV in all patients according to age, diffusing capacity of the lung for carbon monoxide (DLCO) and spirometric findings, and assessed the relationships among these factors. Results PLLV of the right lung in patients with ILD was reduced when compared with patients without ILD (P=0.041). The PLLV of the right lung in patients with ILD was negatively correlated with age and forced vital capacity (FVC; P=0.01 and P=0.012, respectively). Conclusions The PLLV of the right lung may decrease in SSc patients with ILD. In these patients, the PLLV may be a quantitative parameter indicating damage in the lung. PMID:26981455

  13. Consistencies and inconsistencies underlying the quantitative assessment of leukemia risk from benzene exposure

    SciTech Connect

    Lamm, S.H.; Walters, A.S.; Wilson, R.; Byrd, D.M.; Grunwald, H.

    1989-07-01

    This paper examines recent risk assessments for benzene and observes a number of inconsistencies within the study and consistencies between studies that should affect the quantitative determination of the risk from benzene exposure. Comparisons across studies show that only acute myeloid leukemia (AML) is found to be consistently in excess with significant benzene exposure. The data from the Pliofilm study that forms the basis of most quantitative assessments reveal that all the AML cases came from only one of the three studied plants and that all the benzene exposure data came from the other plants. Hematological data from the 1940s from the plant from which almost all of the industrial hygiene exposure data come do not correlate well with the originally published exposure estimates but do correlate well with an alternative set of exposure estimates that are much greater than those estimates originally published. Temporal relationships within the study are not consistent with those of other studies. The dose-response relationship is strongly nonlinear. Other data suggest that the leukemogenic effect of benzene is nonlinear and may derive from a threshold toxicity.

  14. Using Non-Invasive Multi-Spectral Imaging to Quantitatively Assess Tissue Vasculature

    SciTech Connect

    Vogel, A; Chernomordik, V; Riley, J; Hassan, M; Amyot, F; Dasgeb, B; Demos, S G; Pursley, R; Little, R; Yarchoan, R; Tao, Y; Gandjbakhche, A H

    2007-10-04

    This research describes a non-invasive, non-contact method used to quantitatively analyze the functional characteristics of tissue. Multi-spectral images collected at several near-infrared wavelengths are input into a mathematical optical skin model that considers the contributions from different analytes in the epidermis and dermis skin layers. Through a reconstruction algorithm, we can quantify the percent of blood in a given area of tissue and the fraction of that blood that is oxygenated. Imaging normal tissue confirms previously reported values for the percent of blood in tissue and the percent of blood that is oxygenated in tissue and surrounding vasculature, for the normal state and when ischemia is induced. This methodology has been applied to assess vascular Kaposi's sarcoma lesions and the surrounding tissue before and during experimental therapies. The multi-spectral imaging technique has been combined with laser Doppler imaging to gain additional information. Results indicate that these techniques are able to provide quantitative and functional information about tissue changes during experimental drug therapy and to investigate progression of disease before changes are visibly apparent, suggesting a potential for them to be used as complementary imaging techniques to clinical assessment.
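
    At its core, recovering blood fraction and oxygenation from multi-wavelength absorbance is a linear unmixing of oxy- and deoxy-haemoglobin contributions. The two-wavelength sketch below illustrates that step only; the extinction coefficients are illustrative placeholders, not tabulated values, and the full reconstruction in the paper also models layered scattering.

```python
import numpy as np

# Rows: wavelengths; columns: [eps_HbO2, eps_Hb]. Illustrative values chosen
# so Hb dominates at the first wavelength and HbO2 at the second.
E = np.array([[0.30, 0.80],
              [0.65, 0.45]])

c_true = np.array([60.0, 40.0])    # [HbO2, Hb] in arbitrary units
absorbance = E @ c_true            # Beer-Lambert mix (path length folded in)

c_est = np.linalg.solve(E, absorbance)   # invert the 2x2 mixing system
sto2 = c_est[0] / c_est.sum()            # oxygen saturation fraction
```

With more than two wavelengths the system becomes overdetermined and is solved in the least-squares sense (`np.linalg.lstsq`), which is what makes the estimate robust to noise.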

  15. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    NASA Astrophysics Data System (ADS)

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-03-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a parameter of a second derivative ratio (SDR) is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 in high temporal resolution with reduced measurement artifacts induced by different skin conditions in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution.
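
    A second derivative ratio is built from second derivatives of the spectrum evaluated on wavelength triplets. The abstract does not state how the six wavelengths are grouped, so the pairing below (first three vs last three) is an assumption, and the spectrum is synthetic; the unevenly spaced three-point second-derivative formula itself is standard.

```python
import numpy as np

def second_deriv(x, f):
    """Second derivative at the middle of an unevenly spaced triplet
    (exact for any quadratic, from the Lagrange interpolant)."""
    x0, x1, x2 = x
    f0, f1, f2 = f
    return 2 * (f0 / ((x1 - x0) * (x2 - x0))
                - f1 / ((x1 - x0) * (x2 - x1))
                + f2 / ((x2 - x1) * (x2 - x0)))

# Illustrative reflectance spectrum sampled at the six wavelengths [nm].
wl = np.array([544.0, 552.0, 568.0, 576.0, 592.0, 600.0])
refl = 0.002 * (wl - 560.0) ** 2 + 0.1     # synthetic curved spectrum

sdr = second_deriv(wl[:3], refl[:3]) / second_deriv(wl[3:], refl[3:])
```

Because the synthetic spectrum is quadratic, both second derivatives equal 0.004 and the ratio is exactly 1, which doubles as a correctness check on the formula.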

  16. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the study of semi-quantitative assessment of pulmonary perfusion acquired from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The automatic analysis approach proposed is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function for the assessment of the local hemodynamic system parameters, ie. mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution method implements here a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, to be considered at least as a preliminary decision in the diagnostic due to the large availability of the technique and to the non-invasive aspects.
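
    The tSVD deconvolution step can be sketched as follows: build the convolution matrix of the arterial input function (AIF), then invert it with small singular values zeroed. The curves below are synthetic; with noisy clinical data the truncation threshold would be larger, which is precisely the regularisation the tSVD provides.

```python
import numpy as np

n, dt = 20, 1.0
t = np.arange(n) * dt
aif = np.exp(-t / 2.0)                       # synthetic arterial input function
r_true = np.exp(-t / 4.0)                    # synthetic tissue residue function

# Discrete convolution: c_tissue = A @ r, with A lower-triangular Toeplitz.
A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                   for i in range(n)])
c_tissue = A @ r_true

# Truncated-SVD inverse: zero the reciprocal of small singular values.
U, s, Vt = np.linalg.svd(A)
w = np.where(s > 1e-3 * s[0], 1.0 / s, 0.0)  # threshold is small: noise-free data
r_est = (Vt.T * w) @ (U.T @ c_tissue)

mtt = r_est.sum() * dt / r_est[0]            # mean transit time estimate [s]
```

Pulmonary blood flow and blood volume follow from the recovered residue function (flow from its peak, volume as flow × MTT, by the central volume principle).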

  17. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    PubMed Central

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-01-01

    Abstract. We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a parameter of a second derivative ratio (SDR) is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts by blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 in high temporal resolution with reduced measurement artifacts induced by different skin conditions in comparison with other three commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution. PMID:25734405

  18. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and it is recognized as the main cause of human trichinellosis by the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with greater impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10^-6 and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (PPinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allowed assessing the impact of different factors influencing the risk of acquiring trichinellosis. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production. PMID:26227185
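
    The r values quoted above are rank correlations between sampled model inputs and the simulated risk, the standard sensitivity-analysis output of tools like @RISK. A minimal sketch with a toy risk model and made-up input distributions:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5000

# Hypothetical Monte Carlo inputs (distributions are illustrative only).
surviving_larvae = rng.lognormal(1.0, 0.8, n)
storage_days = rng.uniform(1, 30, n)
p_inf = rng.beta(2, 50, n)

# Toy risk model: more surviving larvae and higher infectivity raise risk;
# longer storage slightly lowers it.
risk = surviving_larvae * p_inf * np.exp(-0.01 * storage_days)

def spearman(a, b):
    """Spearman rank correlation (no ties expected for continuous samples)."""
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    return np.corrcoef(ra, rb)[0, 1]

r_larvae = spearman(surviving_larvae, risk)
r_storage = spearman(storage_days, risk)
```

Ranking inputs by |r| identifies the stages whose control most reduces the risk, mirroring the study's finding that larval survival dominates.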

  19. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation/assessment is nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model of quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to Environmental Research Centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. PMID:23892022
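
    The "transformation of qualitative information into quantitative data" typically means mapping ordinal ratings to numbers and rolling them up through weighted indicators. The sketch below shows that mechanic only; all indicator names, ratings, and weights are hypothetical, not taken from the model.

```python
# Map ordinal ratings to a 0-1 scale (hypothetical scale).
RATING = {"absent": 0.0, "partial": 0.5, "systematic": 1.0}

# (weight, rating) per indicator; weights sum to 1. All values hypothetical.
indicators = {
    "stakeholder_engagement":  (0.30, "systematic"),
    "environmental_reporting": (0.25, "partial"),
    "ethics_training":         (0.20, "partial"),
    "community_outreach":      (0.25, "absent"),
}

score = sum(w * RATING[r] for w, r in indicators.values())  # weighted roll-up
self_eval = round(100 * score, 1)                           # as a percentage
```

In the dual procedure, the same roll-up would be computed once from self-reported ratings and once from an external evaluator's ratings, and the two compared.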

  20. A simple and accurate grading system for orthoiodohippurate renal scans in the assessment of post-transplant renal function

    SciTech Connect

    Zaki, S.K.; Bretan, P.N.; Go, R.T.; Rehm, P.K.; Streem, S.B.; Novick, A.C. )

    1990-06-01

    Orthoiodohippurate renal scanning has proved to be a reliable, noninvasive method for the evaluation and follow-up of renal allograft function. However, a standardized system for grading renal function with this test is not available. We propose a simple grading system to distinguish the different functional phases of hippurate scanning in renal transplant recipients. This grading system was studied in 138 patients who were evaluated 1 week after renal transplantation. There was a significant correlation between the isotope renographic functional grade and clinical correlates of allograft function such as the serum creatinine level (p = 0.0001), blood urea nitrogen level (p = 0.0001), urine output (p = 0.005) and need for hemodialysis (p = 0.007). We recommend this grading system as a simple and accurate method to interpret orthoiodohippurate renal scans in the evaluation and follow-up of renal allograft recipients.

  1. Feasibility study for image guided kidney surgery: assessment of required intraoperative surface for accurate image to physical space registrations

    NASA Astrophysics Data System (ADS)

    Benincasa, Anne B.; Clements, Logan W.; Herrell, S. Duke; Chang, Sam S.; Cookson, Michael S.; Galloway, Robert L.

    2006-03-01

    Currently, the removal of kidney tumor masses relies only on direct or laparoscopic visualization, resulting in prolonged procedure and recovery times and reduced clear margins. Applying current image guided surgery (IGS) techniques, as those used in liver cases, to kidney resections (nephrectomies) presents a number of complications. Most notable is the limited field of view of the intraoperative kidney surface, which constrains the ability to obtain a surface delineation that is geometrically descriptive enough to drive a surface-based registration. Two different phantom orientations were used to model the laparoscopic and traditional partial nephrectomy views. For the laparoscopic view, fiducial point sets were compiled from a CT image volume using anatomical features such as the renal artery and vein. For the traditional view, markers attached to the phantom set-up were used for fiducials and targets. The fiducial points were used to perform a point-based registration, which then served as a guide for the surface-based registration. Laser range scanner (LRS) obtained surfaces were registered to each phantom surface using a rigid iterative closest point algorithm. Subsets of each phantom's LRS surface were used in a robustness test to determine the predictability of their registrations to transform the entire surface. Results from both orientations suggest that about half of the kidney's surface needs to be obtained intraoperatively for accurate registrations between the image surface and the LRS surface, suggesting the obtained kidney surfaces were geometrically descriptive enough to perform accurate registrations. This preliminary work paves the way for further development of kidney IGS systems.
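
    The point-based registration that initialises the surface ICP has a closed-form solution: the least-squares rotation and translation between corresponding fiducials via SVD (the Kabsch/Umeyama method). A sketch with synthetic fiducials (the point values are illustrative, not phantom data):

```python
import numpy as np

def rigid_register(src, dst):
    """Closed-form least-squares rigid transform mapping src onto dst."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T  # proper rotation, det(R) = +1
    return R, cd - R @ cs

# Synthetic fiducials: rotate and translate a point set, then recover the pose.
rng = np.random.default_rng(5)
src = rng.normal(size=(6, 3))
theta = np.deg2rad(25)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([5.0, -2.0, 1.0])

R, tvec = rigid_register(src, dst)
fre = np.linalg.norm(src @ R.T + tvec - dst, axis=1).mean()  # fiducial error
```

ICP then alternates between matching each LRS point to its closest image-surface point and re-solving this same rigid transform until convergence.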

  2. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate

    PubMed Central

    Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul

    2015-01-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821

  3. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost-per-dog of $0.47. This represents the price-threshold under which the cost of the incentive used must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere. PMID:26633821
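
    The two non-gold-standard coverage estimators compared in this study reduce to simple proportions; the counts below are made up for illustration.

```python
# Mark-re-sight (transect survey): coverage is the fraction of dogs sighted
# along transects that wear a vaccination collar (counts hypothetical).
collared_seen, total_seen = 138, 205
transect_cov = collared_seen / total_seen

# Household questionnaire: vaccinated dogs over dogs owned, puppies included
# (counts hypothetical).
vaccinated, owned = 612, 951
survey_cov = vaccinated / owned
```

Consistent with the study's finding, a transect estimate will sit slightly above and a questionnaire estimate slightly below the census figure; the questionnaire is preferred as the more conservative of the two and yields demographic data as a by-product.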

  4. A rapid and accurate method for the quantitative estimation of natural polysaccharides and their fractions using high performance size exclusion chromatography coupled with multi-angle laser light scattering and refractive index detector.

    PubMed

    Cheong, Kit-Leong; Wu, Ding-Tao; Zhao, Jing; Li, Shao-Ping

    2015-06-26

    In this study, a rapid and accurate method for quantitative analysis of natural polysaccharides and their different fractions was developed. First, high performance size exclusion chromatography (HPSEC) was utilized to separate natural polysaccharides. The molecular masses of their fractions were then determined by multi-angle laser light scattering (MALLS). Finally, quantification of polysaccharides or their fractions was performed based on their response to a refractive index detector (RID) and their universal refractive index increment (dn/dc). Accuracy of the developed method for the quantification of individual and mixed polysaccharide standards, including konjac glucomannan, CM-arabinan, xyloglucan, larch arabinogalactan, oat β-glucan, dextran (410, 270, and 25 kDa), mixed xyloglucan and CM-arabinan, and mixed dextran 270 K and CM-arabinan, was determined, and their average recoveries were between 90.6% and 98.3%. The limits of detection (LOD) and quantification (LOQ) ranged from 10.68 to 20.25 μg/mL and from 42.70 to 68.85 μg/mL, respectively. Compared to the conventional phenol-sulfuric acid assay and HPSEC coupled with evaporative light scattering detection (HPSEC-ELSD), the developed HPSEC-MALLS-RID method based on the universal dn/dc for the quantification of polysaccharides and their fractions is much simpler, more rapid, and more accurate, requiring neither individual polysaccharide standards nor calibration curves. The developed method was also successfully utilized for quantitative analysis of polysaccharides and their different fractions from three medicinal plants of the Panax genus: Panax ginseng, Panax notoginseng and Panax quinquefolius. The results suggested that the HPSEC-MALLS-RID method based on the universal dn/dc could be used as a routine technique for the quantification of polysaccharides and their fractions in natural resources. PMID:25990349
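
    The calibration-free idea behind RID quantification can be sketched as follows. An RI detector's response is proportional to concentration times dn/dc, so with a universal dn/dc the peak area alone yields injected mass. The instrument constant k and all numbers here are hypothetical, not values from the paper.

```python
# Minimal sketch of dn/dc-based quantification:
# RID peak area = k * mass * (dn/dc), with k an instrument constant
# fixed once per detector, so no per-analyte calibration curve is needed.
def mass_from_rid(peak_area: float, dn_dc: float, k: float) -> float:
    """Injected mass from an RID peak area (units set by k)."""
    return peak_area / (k * dn_dc)

# Hypothetical example: dn/dc = 0.146 mL/g (a typical polysaccharide
# value in aqueous mobile phases), instrument constant k = 2.0.
mass = mass_from_rid(peak_area=100.0, dn_dc=0.146, k=2.0)
```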

  5. Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis, by a cross-functional team of experts, to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model.
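
    The ranking step described above can be sketched with a classic FMEA risk priority number (probability × severity × detection difficulty), here extended with a simple fatality flag. The scoring scales, the doubling rule, and the example failure modes are all illustrative assumptions, not the QFMEA model's actual inputs.

```python
from dataclasses import dataclass

@dataclass
class FailureMode:
    name: str
    probability: int   # 1 (rare) .. 10 (frequent)
    severity: int      # 1 (minor) .. 10 (catastrophic)
    detection: int     # 1 (easily detected) .. 10 (nearly undetectable)
    fatality_potential: bool = False

    @property
    def rpn(self) -> int:
        """Risk priority number; doubled here (an assumed rule) if
        the failure mode could cause fatalities."""
        base = self.probability * self.severity * self.detection
        return base * 2 if self.fatality_potential else base

# Hypothetical CO2-sequestration failure modes for illustration only.
modes = [
    FailureMode("wellbore seal failure", 3, 9, 6, fatality_potential=True),
    FailureMode("pipeline corrosion", 5, 6, 3),
    FailureMode("caprock fracture", 2, 8, 8),
]
ranked = sorted(modes, key=lambda m: m.rpn, reverse=True)
```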

  6. Cardiovascular magnetic resonance of myocardial edema using a short inversion time inversion recovery (STIR) black-blood technique: Diagnostic accuracy of visual and semi-quantitative assessment

    PubMed Central

    2012-01-01

    Background The short inversion time inversion recovery (STIR) black-blood technique has been used to visualize myocardial edema, and thus to differentiate acute from chronic myocardial lesions. However, some cardiovascular magnetic resonance (CMR) groups have reported variable image quality, and hence the diagnostic value of STIR in routine clinical practice has been put into question. The aim of our study was to analyze image quality and diagnostic performance of STIR using a set of pulse sequence parameters dedicated to edema detection, and to discuss possible factors that influence image quality. We hypothesized that STIR imaging is an accurate and robust way of detecting myocardial edema in non-selected patients with acute myocardial infarction. Methods Forty-six consecutive patients with acute myocardial infarction underwent CMR (day 4.5 ± 1.6) including STIR for the assessment of myocardial edema and late gadolinium enhancement (LGE) for quantification of myocardial necrosis. Thirty of these patients underwent a follow-up CMR at approximately six months (195 ± 39 days). Both STIR and LGE images were evaluated separately on a segmental basis for image quality as well as for presence and extent of myocardial hyper-intensity, with both visual and semi-quantitative (threshold-based) analysis. LGE was used as a reference standard for localization and extent of myocardial necrosis (acute) or scar (chronic). Results Image quality of STIR images was rated as diagnostic in 99.5% of cases. At the acute stage, the sensitivity and specificity of STIR to detect infarcted segments were 95% and 78%, respectively, on visual assessment, and 99% and 83%, respectively, on semi-quantitative assessment. STIR differentiated acutely from chronically infarcted segments with a sensitivity of 95% by both methods and with a specificity of 99% by visual assessment and 97% by semi-quantitative assessment. The extent of hyper-intense areas on acute STIR images was 85% larger than
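
    The diagnostic-accuracy figures above come from standard confusion-matrix ratios, which can be sketched as follows. The per-segment counts below are invented for illustration (chosen to reproduce the acute-stage visual-assessment percentages), not the study's data.

```python
# Sensitivity and specificity from per-segment counts, with LGE as the
# reference standard: TP/FN are hyper-intense/missed infarcted segments,
# TN/FP are correctly/incorrectly classified non-infarcted segments.
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical counts matching the quoted 95% / 78% visual figures.
sens, spec = sens_spec(tp=95, fn=5, tn=78, fp=22)
```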

  7. Zebrafish Caudal Fin Angiogenesis Assay—Advanced Quantitative Assessment Including 3-Way Correlative Microscopy

    PubMed Central

    Correa Shokiche, Carlos; Schaad, Laura; Triet, Ramona; Jazwinska, Anna; Tschanz, Stefan A.; Djonov, Valentin

    2016-01-01

    Background Researchers evaluating angiomodulating compounds as a part of scientific projects or pre-clinical studies are often confronted with limitations of applied animal models. The rough and insufficient early-stage compound assessment without reliable quantification of the vascular response contributes, at least in part, to the low transition rate to the clinic. Objective To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data of drug effects in a non-biased and time-efficient way. Approach & Results Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters including “graph energy” and “distance to farthest node”. The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The employment of a reference point (vascular parameters prior to amputation) is unique for the model and crucial for a proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo-imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. Conclusions The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well suited for basic research and preclinical investigations. PMID:26950851

  8. Quantitative assessment of biological impact using transcriptomic data and mechanistic network models

    SciTech Connect

    Thomson, Ty M.; Sewer, Alain; Martin, Florian; Belcastro, Vincenzo; Frushour, Brian P.; Gebel, Stephan; Park, Jennifer; Schlage, Walter K.; Talikka, Marja; Vasilyev, Dmitry M.; Westra, Jurjen W.; Hoeng, Julia; Peitsch, Manuel C.

    2013-11-01

    Exposure to biologically active substances such as therapeutic drugs or environmental toxicants can impact biological systems at various levels, affecting individual molecules, signaling pathways, and overall cellular processes. The ability to derive mechanistic insights from the resulting system responses requires the integration of experimental measures with a priori knowledge about the system and the interacting molecules therein. We developed a novel systems biology-based methodology that leverages mechanistic network models and transcriptomic data to quantitatively assess the biological impact of exposures to active substances. Hierarchically organized network models were first constructed to provide a coherent framework for investigating the impact of exposures at the molecular, pathway and process levels. We then validated our methodology using novel and previously published experiments. For both in vitro systems with simple exposure and in vivo systems with complex exposures, our methodology was able to recapitulate known biological responses matching expected or measured phenotypes. In addition, the quantitative results were in agreement with experimental endpoint data for many of the mechanistic effects that were assessed, providing further objective confirmation of the approach. We conclude that our methodology evaluates the biological impact of exposures in an objective, systematic, and quantifiable manner, enabling the computation of a systems-wide and pan-mechanistic biological impact measure for a given active substance or mixture. Our results suggest that various fields of human disease research, from drug development to consumer product testing and environmental impact analysis, could benefit from using this methodology. - Highlights: • The impact of biologically active substances is quantified at multiple levels. • The systems-level impact integrates the perturbations of individual networks. • The networks capture the relationships between

  9. Quantitative Assessment of Liver Fat with Magnetic Resonance Imaging and Spectroscopy

    PubMed Central

    Reeder, Scott B.; Cruite, Irene; Hamilton, Gavin; Sirlin, Claude B.

    2011-01-01

    Hepatic steatosis is characterized by abnormal and excessive accumulation of lipids within hepatocytes. It is an important feature of diffuse liver disease, and the histological hallmark of non-alcoholic fatty liver disease (NAFLD). Other conditions associated with steatosis include alcoholic liver disease, viral hepatitis, HIV and genetic lipodystrophies, cystic fibrosis liver disease, and hepatotoxicity from various therapeutic agents. Liver biopsy, the current clinical gold standard for assessment of liver fat, is invasive, prone to sampling error, and not optimal for screening, monitoring, or clinical decision making, nor well suited for many types of research studies. Non-invasive methods that accurately and objectively quantify liver fat are needed. Ultrasound (US) and computed tomography (CT) can be used to assess liver fat but have limited accuracy as well as other limitations. Magnetic resonance (MR) techniques can decompose the liver signal into its fat and water signal components and therefore assess liver fat more directly than CT or US. Most MR techniques measure the signal fat-fraction (the fraction of the liver MR signal attributable to liver fat), which may be confounded by numerous technical and biological factors and may not reliably reflect fat content. By addressing the factors that confound the signal fat-fraction, advanced MR techniques measure the proton density fat-fraction (the fraction of the liver proton density attributable to liver fat), which is a fundamental tissue property and a direct measure of liver fat content. These advanced techniques show promise for accurate fat quantification and are likely to be commercially available soon. PMID:22025886
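
    The two fat-fraction definitions contrasted above share the same ratio; they differ only in whether F and W are raw signals (confounded) or corrected proton densities. A minimal sketch, with illustrative values:

```python
# Fat-fraction as the ratio F / (F + W), where F and W are the fat and
# water contributions. Applied to raw signals this is the signal
# fat-fraction (biased by T1, T2* decay, noise, etc.); applied to
# proton densities, after those confounders are corrected, it is the
# proton density fat-fraction (PDFF), a fundamental tissue property.
def fat_fraction(f_component: float, w_component: float) -> float:
    return f_component / (f_component + w_component)

# Illustrative example: fat contributes 20 of 100 arbitrary units.
ff = fat_fraction(20.0, 80.0)
```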

  10. The Development of Multiple-Choice Items Consistent with the AP Chemistry Curriculum Framework to More Accurately Assess Deeper Understanding

    ERIC Educational Resources Information Center

    Domyancich, John M.

    2014-01-01

    Multiple-choice questions are an important part of large-scale summative assessments, such as the advanced placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…

  11. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5 × 10^6 km^2 at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models, as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.
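
    The three parts combine naturally in a Monte Carlo endowment estimate: draw a number of undiscovered deposits from the elicited probabilities, then draw a tonnage and grade for each from the grade-and-tonnage model. The lognormal parameters and deposit-count probabilities below are invented for illustration; real grade-and-tonnage models are deposit-type specific.

```python
import random

def simulate_metal(n_trials: int, deposit_probs: dict[int, float],
                   seed: int = 0) -> list[float]:
    """Monte Carlo totals of contained metal per trial: number of
    deposits from the elicited distribution, tonnage and grade drawn
    (here) from assumed lognormal models."""
    rng = random.Random(seed)
    counts, weights = zip(*deposit_probs.items())
    totals = []
    for _ in range(n_trials):
        n = rng.choices(counts, weights=weights)[0]
        total = 0.0
        for _ in range(n):
            tonnage = rng.lognormvariate(mu=16.0, sigma=1.5)  # ore tonnes
            grade = rng.lognormvariate(mu=-5.0, sigma=0.5)    # metal fraction
            total += tonnage * grade
        totals.append(total)
    return totals

# Hypothetical tract: 90% chance of 0 deposits, 9% of 1, 1% of 2.
totals = simulate_metal(1000, {0: 0.90, 1: 0.09, 2: 0.01})
```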

  12. Quantitative assessment of gestational sac shape: the gestational sac shape score

    PubMed Central

    Deter, R.L.; Li, J.; Lee, W.; Liu, S.; Romero, R.

    2012-01-01

    Objective To develop a quantitative method for characterizing gestational sac shape. Methods Twenty first-trimester gestational sacs in normal pregnancies were studied with three-dimensional (3D) ultrasonography. The 3D coordinates of surface-point sets were obtained for each sac using 30-, 15- and six-slice sampling. Cubic spline interpolation was used with the 15- and six-slice surface-point samples to generate coordinates for those 30-slice surface points not measured. Interpolated and measured values, the latter from the 30-slice sample, were compared and the percent error calculated. Cubic spline interpolation was used to determine the coordinates of a standard surface-point sample (3660) for each sac in each slice sample. These coordinate data were used to give each sac a standard configuration by moving its center of gravity to the origin, aligning its inertial axes along the coordinate axes and converting its volume to 1.0 mL. In this form, a volume shape descriptor could be generated for each sac that was then transformed into a vector containing only shape information. The 20 shape vectors of each slice sample were subjected to principal components analysis, and principal component scores (PCSs) calculated. The first four PCSs were used to define a gestational sac shape score (GSSS-30, GSSS-15 or GSSS-6) for each sac in a given slice sample. The characteristics of each set of GSSSs were determined and those for the GSSS-15 and GSSS-6 were compared with the GSSS-30 characteristics. Results Cubic spline interpolations were very accurate in most cases, with means close to 0%, and approximately 95% of the errors being less than 10%. GSSS-30 accounted for 67.6% of the shape variance, had a mean of zero and an SD of 1.1, was normally distributed and was not related to menstrual age (R = −0.16, P = 0.51). GSSS-15 and GSSS-6 had essentially the same characteristics. No significant differences between individual GSSS-30 values and those for GSSS-15 or GSSS-6

  13. Assessment of Nutritional Status of Nepalese Hemodialysis Patients by Anthropometric Examinations and Modified Quantitative Subjective Global Assessment

    PubMed Central

    Sedhain, Arun; Hada, Rajani; Agrawal, Rajendra Kumar; Bhattarai, Gandhi R; Baral, Anil

    2015-01-01

    OBJECTIVE To assess the nutritional status of patients on maintenance hemodialysis by using modified quantitative subjective global assessment (MQSGA) and anthropometric measurements. METHOD We conducted a cross-sectional descriptive analytical study to assess the nutritional status of fifty four patients with chronic kidney disease undergoing maintenance hemodialysis by using MQSGA and different anthropometric and laboratory measurements like body mass index (BMI), mid-arm circumference (MAC), mid-arm muscle circumference (MAMC), triceps skin fold (TSF) and biceps skin fold (BSF), serum albumin, C-reactive protein (CRP) and lipid profile in a government tertiary hospital at Kathmandu, Nepal. RESULTS Based on MQSGA criteria, 66.7% of the patients suffered from mild to moderate malnutrition and 33.3% were well nourished. None of the patients were severely malnourished. CRP was positive in 56.3% patients. Serum albumin, MAC and BMI were (mean ± SD) 4.0 ± 0.3 mg/dl, 22 ± 2.6 cm and 19.6 ± 3.2 kg/m2, respectively. MQSGA showed negative correlation with MAC (r = −0.563; P < 0.001), BMI (r = −0.448; P < 0.001), MAMC (r = −0.506; P < 0.0001), TSF (r = −0.483; P < 0.0002), and BSF (r = −0.508; P < 0.0001). Negative correlation of MQSGA was also found with total cholesterol, triglyceride, LDL cholesterol and HDL cholesterol without any statistical significance. CONCLUSION Mild to moderate malnutrition was found to be present in two thirds of the patients undergoing hemodialysis. Anthropometric measurements like BMI, MAC, MAMC, BSF and TSF were negatively correlated with MQSGA. Anthropometric and laboratory assessment tools could be used for nutritional assessment as they are relatively easier, cheaper and practical markers of nutritional status. PMID:26327781

  14. Quantitative Relationship Between Coronary Vasodilator Reserve Assessed by Rubidium-82 PET Imaging and Coronary Artery Stenosis Severity

    PubMed Central

    Anagnostopoulos, Constantinos; Almonacid, Alexandra; El Fakhri, Georges; Currilova, Zelmira; Sitek, Arkadiusz; Roughton, Michael; Dorbala, Sharmila; Popma, Jeffrey J.; Di Carli, Marcelo F.

    2011-01-01

    The relationship between myocardial blood flow (MBF) and stenosis severity has been determined previously using cyclotron-produced radiotracers such as 15O-H2O and 13N-ammonia. An attractive alternative to overcome the limitations related to the use of a cyclotron might be to use generator-produced Rubidium-82 as a flow tracer. The current study was undertaken to investigate the relationship between MBF and coronary vasodilator reserve (CVR) as measured by Rubidium-82 positron emission tomography (PET) and the percent diameter stenosis as defined by quantitative coronary arteriography. Methods We prospectively evaluated 22 individuals: 15 patients (60±11 years of age) with angiographically documented coronary artery disease (CAD) and seven age-matched (56±9 years) asymptomatic individuals without risk factors for CAD. Dynamic Rubidium-82 PET was performed at rest and after dipyridamole vasodilation. MBF, CVR and an index of “minimal coronary resistance” (MCR) were assessed in each of the three main coronary territories. Results Rest and stress MBF in regions subtended by vessels with <50% diameter stenosis was similar to that of the individuals with no risk factors for CAD. As a result, CVR was also similar in the two groups (1.9, interquartile [IQ] range from 1.7 to 2.7 vs. 2.2, IQ range from 2 to 3.4, respectively; p=0.09). CVR successfully differentiated coronary lesions with stenosis severity 70% to 89% from those with 50% to 69% stenosis (1, IQ range from 1 to 1.3 vs. 1.7, IQ range from 1.4 to 2, respectively; p=0.001). In addition, hyperaemic MBF (r2 = 0.74, p<0.001), CVR (r2 = 0.69, p<0.001), and MCR (r2 = 0.78, p<0.001) measurements were inversely and non-linearly correlated with the percent diameter stenosis on angiography. Conclusion MBF and CVR are inversely and non-linearly correlated with stenosis severity. Quantitative Rubidium-82 PET can be a clinically useful tool for an accurate functional assessment of CAD. PMID:18425513
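
    The core quantity reported above is a simple ratio, sketched below. The flow values are illustrative, not measurements from the study.

```python
# Coronary vasodilator reserve: the ratio of hyperaemic (stress) to
# resting myocardial blood flow, both typically in mL/min/g.
def cvr(stress_mbf: float, rest_mbf: float) -> float:
    return stress_mbf / rest_mbf

# Illustrative example: stress flow 2.0, rest flow 1.0 mL/min/g.
reserve = cvr(stress_mbf=2.0, rest_mbf=1.0)
# A reserve near 1 (stress flow barely above rest) is what the study
# observed distal to severe (70-89%) stenoses.
```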

  15. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m2) with building area and number of floors
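
    The quantitative risk step described above can be sketched as expected annual loss per building: hazard probability × vulnerability (from an intensity-dependent curve) × exposed value. The step-wise depth-damage curve and all numbers are illustrative assumptions, not the study's calibrated curves or OMI values.

```python
# Expected annual loss = P(event) * vulnerability(intensity) * value.
def vulnerability(flow_depth_m: float) -> float:
    """Fraction of building value lost, as an assumed step-wise
    depth-damage curve (real curves are material/building specific)."""
    if flow_depth_m <= 0.0:
        return 0.0
    if flow_depth_m < 0.5:
        return 0.1
    if flow_depth_m < 1.5:
        return 0.4
    return 0.8

def expected_loss(annual_prob: float, flow_depth_m: float,
                  building_value: float) -> float:
    return annual_prob * vulnerability(flow_depth_m) * building_value

# Illustrative building: 1% annual debris-flow probability, 1 m flow
# depth at the building, €250,000 market value.
loss = expected_loss(annual_prob=0.01, flow_depth_m=1.0,
                     building_value=250_000)
```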

  16. Just How Accurate are Your Probabilistic Forecasts? Improving Forecast Quality Assessment in the Presence of Sampling Uncertainty

    NASA Astrophysics Data System (ADS)

    Kang, T. H.; Sharma, A.; Marshall, L. A.

    2015-12-01

    Use of ensemble forecasts as a means of characterising predictive uncertainty has become increasingly common in hydrological and meteorological forecasting. The need to characterize ensemble forecast quality has encouraged the development of reliable verification tools. Most of the metrics currently used are related to the Brier score, first proposed in 1950. However, the Brier score and its alterations, including the decomposition of the Brier score as well as the Ranked Probability Score, have paid little attention to the difference in the characteristics of the forecasted and sampled probability distributions. This difference, or the error in the probability distribution, can lead to a bias in all existing metrics derived from the Brier score. Similar biases arise where the second moment is different from that observed, or when the observations are scarce and hence difficult to characterise. Therefore, this study suggests simple and reliable measures for the first and second moment bias of the forecasted ensemble and, in addition, approaches to analytically estimate the sampling uncertainty of the proposed measures. The proposed approaches are tested through synthetically generated hydrologic forecasts and observations, as well as seasonal forecasts of the El Nino Southern Oscillation issued by the International Research Institute for Climate and Society (IRI-ENSO). The results show that the estimated uncertainty range of the first and second moment bias can accurately represent the sampling error under most circumstances in a real forecasting system.
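
    The baseline metric discussed above, and the kind of moment bias the authors argue should be checked separately, can both be sketched in a few lines. The data are illustrative; this is not the authors' proposed estimator.

```python
# Brier score for probabilistic forecasts of a binary event: mean
# squared difference between forecast probability and observed outcome
# (0 = perfect, larger = worse).
def brier_score(forecast_probs: list[float], outcomes: list[int]) -> float:
    n = len(forecast_probs)
    return sum((p - o) ** 2 for p, o in zip(forecast_probs, outcomes)) / n

# First-moment (mean) bias of an ensemble relative to the observed
# mean: one of the distributional mismatches that a Brier-type score
# alone does not isolate.
def first_moment_bias(ensemble: list[float], observed_mean: float) -> float:
    return sum(ensemble) / len(ensemble) - observed_mean
```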

  17. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods

    PubMed Central

    Ahmed, Rafay; Oborski, Matthew J; Hwang, Misun; Lieberman, Frank S; Mountz, James M

    2014-01-01

    Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12–15 months for glioblastomas and 2–5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies, and importantly, for facilitating patient management, sparing patients from weeks or months of toxicity and ineffective treatment. This review will present an overview of epidemiology, molecular pathogenesis and current advances in diagnoses, and management of malignant gliomas. PMID:24711712

  18. Quantitative assessment of sample stiffness and sliding friction from force curves in atomic force microscopy

    SciTech Connect

    Pratt, Jon R.; Shaw, Gordon A.; Kumanchik, Lee; Burnham, Nancy A.

    2010-02-15

    It has long been recognized that the angular deflection of an atomic force microscope (AFM) cantilever under "normal" loading conditions can be profoundly influenced by the friction between the tip and the surface. It is shown here that a remarkably quantifiable hysteresis occurs in the slope of loading curves whenever the normal flexural stiffness of the AFM cantilever is greater than that of the sample. This situation arises naturally in cantilever-on-cantilever calibration, but also when trying to measure the stiffness of nanomechanical devices or test structures, or when probing any type of surface or structure that is much more compliant along the surface normal than in transverse directions. Expressions and techniques for evaluating the coefficient of sliding friction between the cantilever tip and sample from normal force curves, as well as relations for determining the stiffness of a mechanically compliant specimen, are presented. The model is experimentally supported by the results of cantilever-on-cantilever spring constant calibrations. The cantilever spring constants determined here agree with the values determined using the NIST electrostatic force balance within the limits of the largest uncertainty component, which had a relative value of less than 2.5%. This points the way toward quantitative testing of micromechanical and nanomechanical components and more accurate calibration of AFM force, and provides nanotribologists access to information about contact friction from normal force curves.
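
    The cantilever-on-cantilever relation implicit above can be sketched as two springs in series: the measured slope gives an effective stiffness from which the sample stiffness is recovered. The friction and tilt corrections that are the paper's actual contribution are omitted here; values are illustrative.

```python
# When a cantilever of stiffness k_c presses on a compliant sample of
# stiffness k_s along the surface normal, the pair acts as springs in
# series: 1/k_eff = 1/k_c + 1/k_s (all stiffnesses in N/m).
def effective_stiffness(k_cantilever: float, k_sample: float) -> float:
    return 1.0 / (1.0 / k_cantilever + 1.0 / k_sample)

def sample_stiffness(k_cantilever: float, k_eff: float) -> float:
    """Invert the series relation to recover the sample stiffness
    from the stiffness measured in the force curve."""
    return 1.0 / (1.0 / k_eff - 1.0 / k_cantilever)
```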

  19. Quantitative assessment of the multivalent protein-carbohydrate interactions on silicon.

    PubMed

    Yang, Jie; Chazalviel, Jean-Noël; Siriwardena, Aloysius; Boukherroub, Rabah; Ozanam, François; Szunerits, Sabine; Gouget-Laemmel, Anne Chantal

    2014-10-21

    A key challenge in the development of glycan arrays is fabricating the sensing interface reliably enough to ensure sensitive, accurate, and reproducible analysis of the protein-carbohydrate interaction of interest. This goal is complicated in the case of glycan arrays because surface sugar density can dramatically influence the strength and mode of interaction of the sugar ligand at any interface with lectin partners. In this article, we describe the preparation of carboxydecyl-terminated crystalline silicon (111) surfaces onto which are grafted either mannosyl moieties or a mixture of mannose and spacer alcohol molecules to provide "diluted" surfaces. The fabrication of the silicon surfaces was achieved efficiently through a strategy involving a "click" coupling step. The interactions of these newly fabricated glycan interfaces with the lectin Lens culinaris have been characterized using quantitative infrared (IR) spectroscopy in the attenuated total reflection (ATR) geometry. The density of mannose probes and lectin targets was precisely determined for the first time with the aid of dedicated IR calibration experiments, thus allowing for the interpretation of the distribution of mannose and its multivalent binding with lectins. These experimental findings were accounted for by numerical simulations of lectin adsorption. PMID:25216376

  20. 3D reconstruction and quantitative assessment method of mitral eccentric regurgitation from color Doppler echocardiography

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Ge, Yi Nan; Wang, Tian Fu; Zheng, Chang Qiong; Zheng, Yi

    2005-10-01

    Based on two-dimensional color Doppler imaging, a multiplane transesophageal rotational scanning method was used to acquire the original Doppler echocardiographic images while the electrocardiogram was recorded synchronously. After filtering and interpolation, surface rendering and volume rendering were performed. Through analysis of the color-bar information and the superposition principle of the color Doppler flow image, the grayscale mitral anatomical structure and the color-coded regurgitation velocity parameter were separated from the color Doppler flow images. Three-dimensional reconstructions of the mitral structure and of the regurgitation velocity distribution were carried out separately, and fusion visualization of the reconstructed regurgitation velocity distribution with its corresponding 3D mitral anatomical structure was realized. This can be used to observe the position, phase, and direction of mitral regurgitation and to measure the jet length, area, volume, spatial distribution, and severity level. In addition, in patients with eccentric mitral regurgitation, this new modality overcomes the inherent limitations of two-dimensional color Doppler flow imaging by depicting the full extent of the jet trajectory: the area of eccentric regurgitation on the three-dimensional image was much larger than that on the two-dimensional image, and the variation of regurgitation area and volume was shown at different angles and different systolic phases. The study shows that three-dimensional color Doppler provides quantitative measurements of eccentric mitral regurgitation that are more accurate and reproducible than conventional color Doppler.

  1. Quantitative assessment of MS plaques and brain atrophy in multiple sclerosis using semiautomatic segmentation method

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Dastidar, Prasun; Ryymin, Pertti; Lahtinen, Antti J.; Eskola, Hannu; Malmivuo, Jaakko

    1997-05-01

    Quantitative magnetic resonance (MR) imaging of the brain is useful in multiple sclerosis (MS) in order to obtain reliable indices of disease progression. The goal of this project was to estimate the total volume of gliotic and non-gliotic plaques in chronic progressive multiple sclerosis with the help of a semiautomatic segmentation method developed at the Ragnar Granit Institute. The developed program, running on a PC-based computer, provides displays of the segmented data in addition to the volumetric analyses. The volumetric accuracy of the program was demonstrated by segmenting MR images of fluid-filled syringes. An anatomical atlas is to be incorporated in the segmentation system to estimate the distribution of MS plaques in various neural pathways of the brain. A total package, including MS plaque volume estimation, estimation of brain atrophy and ventricular enlargement, and the distribution of MS plaques in different neural segments of the brain, has been planned for the near future. Our study confirmed that total lesion volumes in chronic MS disease show a poor correlation to EDSS scores but a positive correlation to neuropsychological scores. Therefore, accurate total volume measurements of MS plaques using the developed semiautomatic segmentation technique helped us to evaluate the degree of neuropsychological impairment.

  2. Using Modified Contour Deformable Model to Quantitatively Estimate Ultrasound Parameters for Osteoporosis Assessment

    NASA Astrophysics Data System (ADS)

    Chen, Yung-Fu; Du, Yi-Chun; Tsai, Yi-Ting; Chen, Tainsong

    Osteoporosis is a systemic skeletal disease characterized by low bone mass and micro-architectural deterioration of bone tissue, leading to bone fragility. Finding an effective method for prevention and early diagnosis of the disease is very important. Several parameters, including broadband ultrasound attenuation (BUA), speed of sound (SOS), and stiffness index (STI), have been used to measure the characteristics of bone tissues. In this paper, we propose a method, the modified contour deformable model (MCDM), based on the active contour model (ACM) and active shape model (ASM), for automatically detecting the calcaneus contour from quantitative ultrasound (QUS) parametric images. The results show that the difference between the contours detected by the MCDM and the true boundary for the phantom is less than one pixel. By comparing the phantom ROIs, a significant relationship was found between contour mean and bone mineral density (BMD), with R=0.99. The influence of selecting different ROI diameters (12, 14, 16 and 18 mm) and different region-selecting methods, including fixed region (ROI_fix), automatic circular region (ROI_cir) and calcaneal contour region (ROI_anat), was evaluated in human subjects. Measurements with large ROI diameters, especially using the fixed region, result in high position errors (10-45%). The precision errors of the measured ultrasonic parameters for ROI_anat are smaller than those for ROI_fix and ROI_cir. In conclusion, ROI_anat provides more accurate measurement of ultrasonic parameters for the evaluation of osteoporosis and is useful for clinical application.

  3. Quantitative assessment of developmental levels in overarm throwing using wearable inertial sensing technology.

    PubMed

    Grimpampi, Eleni; Masci, Ilaria; Pesce, Caterina; Vannozzi, Giuseppe

    2016-09-01

    Motor proficiency in childhood has recently been recognised as a public health determinant, having a potential impact on the physical activity level and possible sedentary behaviour of the child later in life. Among fundamental motor skills, ballistic skills assessment based on in-field quantitative observations is increasingly needed in the motor development community. The aim of this study was to propose an in-field quantitative approach to identify different developmental levels in overarm throwing. Fifty-eight children aged 5-10 years performed an overarm throwing task while wearing three inertial sensors located at the wrist, trunk and pelvis, and were then categorised using a developmental sequence of overarm throwing. A set of biomechanical parameters was defined and analysed using multivariate statistics to evaluate whether they can be used as developmental indicators. Trunk and pelvis angular velocities and time durations before ball release showed increasing/decreasing trends with increasing developmental level. Significant differences between developmental level pairs were observed for selected biomechanical parameters. The results support the suitability and feasibility of objective developmental measures in ecological learning contexts, suggesting their potential to support motor learning experiences in educational and youth sports training settings. PMID:26818205

  4. Rapid quantitative assessment of myocardial perfusion: spectral analysis of myocardial contrast echocardiographic images.

    PubMed

    Bae, Richard Y; Belohlavek, Marek; Greenleaf, James F; Seward, James B

    2002-01-01

    We described a novel rapid spectral analysis technique performed on raw digital in-phase quadrature (IQ) data that quantitatively differentiated perfused from nonperfused myocardium based on the simultaneous comparison of local fundamental and harmonic frequency band intensity levels. In open-chest pigs after ligation of the left anterior descending coronary artery (LAD) and continuous venous contrast infusion, the fundamental-to-harmonic intensity ratio (FHIR) for samples placed within the left ventricular (LV) cavity (10.8 +/- 1.7 dB) and perfused myocardium (13.7 +/- 1.6 dB) were significantly (P <.001) lower than for nonperfused myocardium (27.1 +/- 2.9 dB). In attenuated images, the FHIR for the LV cavity and perfused myocardium were also significantly (P <.05) lower than for the nonperfused myocardium (21.4 +/- 3.0 dB, 34.4 +/- 3.2 dB, and 40.2 +/- 4.4 dB, respectively). Spectral properties of contrast microbubbles, as characterized by the FHIR, allow for rapid quantitative assessment of myocardial perfusion from data contained in a single-image frame, without requiring background image subtraction and image averaging. PMID:11781556
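    As a rough illustration of a fundamental-to-harmonic intensity ratio, the sketch below integrates FFT band powers around the fundamental and second-harmonic frequencies of a synthetic echo; the band edges, the rectangular windowing, and the synthetic signal are assumptions, not the paper's processing of raw IQ data:

```python
import numpy as np

def band_power_db_ratio(signal, fs, f0, bandwidth=0.2e6):
    """Ratio (in dB) of spectral power in a band around f0 to the band
    around the second harmonic 2*f0."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)

    def band(fc):
        mask = (freqs >= fc - bandwidth / 2) & (freqs <= fc + bandwidth / 2)
        return spectrum[mask].sum()

    return 10.0 * np.log10(band(f0) / band(2 * f0))

# Synthetic echo: strong fundamental at 2 MHz, weak harmonic at 4 MHz
fs = 20e6
t = np.arange(2048) / fs
sig = np.sin(2 * np.pi * 2e6 * t) + 0.1 * np.sin(2 * np.pi * 4e6 * t)
fhir = band_power_db_ratio(sig, fs, 2e6)  # ~20 dB for a 10:1 amplitude ratio
```

In the paper's data, a higher ratio (weaker harmonic content) distinguishes nonperfused myocardium from perfused tissue and the LV cavity.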

  5. Multiparametric MRI Assessment of Human Articular Cartilage Degeneration: Correlation with Quantitative Histology and Mechanical Properties

    PubMed Central

    Rautiainen, Jari; Nissi, Mikko J.; Salo, Elli-Noora; Tiitu, Virpi; Finnilä, Mikko A.J.; Aho, Olli-Matti; Saarakkala, Simo; Lehenkari, Petri; Ellermann, Jutta; Nieminen, Miika T.

    2014-01-01

    Purpose To evaluate the sensitivity of quantitative MRI techniques (T1, T1,Gd, T2, continuous wave (CW) T1ρ dispersion, adiabatic T1ρ, adiabatic T2ρ, RAFF and inversion-prepared magnetization transfer (MT)) for assessment of human articular cartilage with varying degrees of natural degeneration. Methods Osteochondral samples (n = 14) were obtained from the tibial plateaus of patients undergoing total knee replacement. MRI of the specimens was performed at 9.4 T and the relaxation time maps were evaluated in the cartilage zones. For reference, quantitative histology, OARSI grading and biomechanical measurements were performed and correlated with MRI findings. Results All MRI parameters, except T1,Gd, showed statistically significant differences in tangential and full-thickness ROIs between early and advanced osteoarthritis (OA) groups, as classified by OARSI grading. CW-T1ρ showed significant dispersion in all ROIs and featured the classical laminar structure of cartilage with spin-lock powers below 1000 Hz. Adiabatic T1ρ, T2ρ, CW-T1ρ, MT and RAFF correlated strongly with OARSI grade and biomechanical parameters. Conclusion MRI parameters were able to differentiate between early and advanced OA. Furthermore, rotating frame methods, namely adiabatic T1ρ, adiabatic T2ρ, CW-T1ρ and RAFF, as well as the MT experiment correlated strongly with biomechanical parameters and OARSI grade, suggesting high sensitivity of the parameters for cartilage degeneration. PMID:25104181
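    Quantitative relaxation time maps of the kind evaluated here are commonly produced by fitting an exponential decay per voxel. A minimal sketch, assuming a mono-exponential model S(TE) = S0·exp(-TE/T2) and negligible noise floor, which is a simplification of real mapping protocols:

```python
import numpy as np

def fit_t2(echo_times_ms, signal):
    """Estimate T2 (ms) by a log-linear least-squares fit of a
    mono-exponential decay: ln S = ln S0 - TE/T2."""
    te = np.asarray(echo_times_ms, dtype=float)
    slope, _intercept = np.polyfit(te, np.log(signal), 1)
    return -1.0 / slope

# Synthetic decay with a known T2 of 35 ms
te = np.array([10.0, 20.0, 40.0, 80.0])
sig = 1000.0 * np.exp(-te / 35.0)
t2 = fit_t2(te, sig)  # recovers ~35 ms
```

With noisy magnitude data, a nonlinear fit with a noise-floor term would be preferred over this log-linear shortcut.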

  6. Quantitative MR assessment of structural changes in white matter of children treated for ALL

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Mulhern, Raymond K.

    2001-07-01

    Our research builds on the hypothesis that white matter damage resulting from therapy spans a continuum of severity that can be reliably probed using non-invasive MR technology. This project focuses on children treated for ALL with a regimen containing seven courses of high-dose methotrexate (HDMTX), which is known to cause leukoencephalopathy. Axial FLAIR, T1-, T2-, and PD-weighted images were acquired, registered and then analyzed with a hybrid neural network segmentation algorithm to identify normal brain parenchyma and leukoencephalopathy. Quantitative T1 and T2 maps were also analyzed at the level of the basal ganglia and the centrum semiovale. The segmented images were used as masks to identify regions of normal appearing white matter (NAWM) and leukoencephalopathy in the quantitative T1 and T2 maps. We assessed the longitudinal changes in volume, T1 and T2 in NAWM and leukoencephalopathy for 42 patients. The segmentation analysis revealed that 69% of patients had leukoencephalopathy after receiving seven courses of HDMTX. The leukoencephalopathy affected approximately 17% of the patients' white matter volume on average (range 2% - 38%). Relaxation rates in the NAWM were not significantly changed between the 1st and 7th courses. Regions of leukoencephalopathy exhibited a 13% elevation in T1 and a 37% elevation in T2 relaxation rates.

  7. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-04-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen.
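    The link between the measured shear wave velocity and clot elasticity follows the standard relation G = ρv² for a (near-)elastic medium. A small sketch; the whole-blood density used below is an assumed literature value (~1060 kg/m³), not a figure from the paper:

```python
def shear_modulus(shear_wave_speed, density=1060.0):
    """Shear modulus G = rho * v**2 (Pa), with the shear wave speed v in
    m/s and the medium density rho in kg/m^3."""
    return density * shear_wave_speed ** 2

# A clot supporting a 1.5 m/s shear wave:
g_clot = shear_modulus(1.5)  # 1060 * 2.25 = 2385 Pa, i.e. ~2.4 kPa
```

Tracking v over time during coagulation thus yields the shear modulus trajectory from which reaction time and clot formation kinetics can be read off.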

  8. Enantiomeric fractionation as a tool for quantitative assessment of biodegradation: The case of metoprolol.

    PubMed

    Souchier, Marine; Benali-Raclot, Dalel; Casellas, Claude; Ingrand, Valérie; Chiron, Serge

    2016-05-15

    An efficient chiral liquid chromatography high resolution mass spectrometry method has been developed for the determination of metoprolol (MTP) and three of its major metabolites, namely O-desmethylmetoprolol (O-DMTP), α-hydroxymetoprolol (α-HMTP) and metoprolol acid (MTPA) in wastewater treatment plant (WWTP) influents and effluents. The optimized analytical method has been validated with good quality parameters including resolution >1.3 and method quantification limits down to the ng/L range except for MTPA. On the basis of this newly developed analytical method, the stereochemistry of MTP and its metabolites was studied over time in effluent/sediment biotic and sterile microcosms under dark and light conditions and in influents and effluents of 5 different WWTPs. MTP stereoselective degradation was exclusively observed under biotic conditions, confirming the specificity of enantiomeric fraction variations to biodegradation processes. MTP was always biotransformed into MTPA with a (S)-enantiomer enrichment. The results of enantiomeric enrichment pointed the way for a quantitative assessment of in situ biodegradation processes due to a good fit (R² > 0.98) of the aerobic MTP biodegradation to the Rayleigh dependency in all the biotic microcosms and in WWTPs because both MTP enantiomers followed the same biodegradation kinetic profiles. These results demonstrate that enantiomeric fractionation constitutes a very interesting quantitative indicator of MTP biodegradation in WWTPs and probably in the environment. PMID:26978718
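    The two quantities at the heart of this approach can be sketched as follows. This is an illustrative simplification: the Rayleigh model form shown (ratio R_t/R_0 = f^ε, with f the remaining substrate fraction) and the test numbers are assumptions, and the enrichment factor ε must be calibrated experimentally:

```python
def enantiomeric_fraction(c_s, c_r):
    """Enantiomeric fraction EF = [S] / ([S] + [R])."""
    return c_s / (c_s + c_r)

def biodegraded_fraction(er_t, er_0, epsilon):
    """Rayleigh-type estimate of the biodegraded fraction B.

    Assumes the classical Rayleigh dependency R_t/R_0 = f**epsilon, where
    R is the enantiomer ratio [S]/[R] and f the remaining substrate
    fraction, so f = (R_t/R_0)**(1/epsilon) and B = 1 - f.
    """
    f_remaining = (er_t / er_0) ** (1.0 / epsilon)
    return 1.0 - f_remaining

ef = enantiomeric_fraction(3.0, 1.0)          # 0.75
b = biodegraded_fraction(0.25, 1.0, 2.0)      # f = 0.5, so B = 0.5
```

The good fit (R² > 0.98) to this dependency is what lets enantiomeric data be converted into an in situ biodegradation estimate.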

  9. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography.

    PubMed

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-01-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen. PMID:27090437

  10. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by taking into account the spectral characteristics (e.g. autofluorescence, optical filter, and spectral sensitivity) of our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
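    The grading rule stated in the abstract maps directly onto a small classifier; the thresholds are taken from the text, while the RGB input values in the examples are illustrative:

```python
def grade_caries(r, g, b):
    """Map the image component ratio R/(G+B) to the four grades:
    <0.66 sound, 0.66-1.06 early decay, 1.06-1.62 established decay,
    >1.62 severe decay (boundary handling is an assumption)."""
    ratio = r / (g + b)
    if ratio < 0.66:
        return "sound"
    if ratio <= 1.06:
        return "early decay"
    if ratio <= 1.62:
        return "established decay"
    return "severe decay"

grade_caries(120, 100, 90)   # ratio ~0.63 -> "sound"
grade_caries(200, 60, 40)    # ratio 2.0  -> "severe decay"
```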

  11. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    PubMed Central

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-01-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen. PMID:27090437

  12. Quantitative Assessment of Thermodynamic Constraints on the Solution Space of Genome-Scale Metabolic Models

    PubMed Central

    Hamilton, Joshua J.; Dwivedi, Vivek; Reed, Jennifer L.

    2013-01-01

    Constraint-based methods provide powerful computational techniques to allow understanding and prediction of cellular behavior. These methods rely on physiochemical constraints to eliminate infeasible behaviors from the space of available behaviors. One such constraint is thermodynamic feasibility, the requirement that intracellular flux distributions obey the laws of thermodynamics. The past decade has seen several constraint-based methods that interpret this constraint in different ways, including those that are limited to small networks, rely on predefined reaction directions, and/or neglect the relationship between reaction free energies and metabolite concentrations. In this work, we utilize one such approach, thermodynamics-based metabolic flux analysis (TMFA), to make genome-scale, quantitative predictions about metabolite concentrations and reaction free energies in the absence of prior knowledge of reaction directions, while accounting for uncertainties in thermodynamic estimates. We applied TMFA to a genome-scale network reconstruction of Escherichia coli and examined the effect of thermodynamic constraints on the flux space. We also assessed the predictive performance of TMFA against gene essentiality and quantitative metabolomics data, under both aerobic and anaerobic, and optimal and suboptimal growth conditions. Based on these results, we propose that TMFA is a useful tool for validating phenotypes and generating hypotheses, and that additional types of data and constraints can improve predictions of metabolite concentrations. PMID:23870272
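    The thermodynamic constraint at the heart of TMFA can be illustrated for a single 1:1 reaction; this is a toy sketch with illustrative numbers, whereas TMFA solves the constraint jointly for an entire network while propagating uncertainty in the ΔG°' estimates:

```python
import math

R_KJ = 8.314e-3  # gas constant, kJ/(mol*K)

def reaction_dg(dg0_kj_mol, temp_k, substrate_conc, product_conc):
    """DeltaG = DeltaG0' + RT * ln(Q) for a 1:1 reaction A -> B,
    with Q = [B]/[A]. Forward flux requires DeltaG < 0."""
    q = product_conc / substrate_conc
    return dg0_kj_mol + R_KJ * temp_k * math.log(q)

# A reaction with DeltaG0' = +5 kJ/mol becomes feasible when the
# substrate/product concentration ratio is large enough:
dg = reaction_dg(5.0, 298.15, substrate_conc=1e-2, product_conc=1e-5)
feasible = dg < 0  # RT*ln(1e-3) ~ -17 kJ/mol outweighs the +5
```

This coupling between reaction free energies and metabolite concentrations is exactly the relationship that earlier methods with predefined reaction directions neglect.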

  13. Specific and quantitative assessment of naphthalene and salicylate bioavailability by using a bioluminescent catabolic reporter bacterium

    SciTech Connect

    Heitzer, A.; Thonnard, J.E.; Sayler, G.S.; Webb, O.F.

    1992-06-01

    A bioassay was developed and standardized for the rapid, specific, and quantitative assessment of naphthalene and salicylate bioavailability by use of bioluminescence monitoring of catabolic gene expression. The bioluminescent reporter strain Pseudomonas fluorescens HK44, which carries a transcriptional nahG-luxCDABE fusion for naphthalene and salicylate catabolism, was used. The physiological state of the reporter cultures as well as the intrinsic regulatory properties of the naphthalene degradation operon must be taken into account to obtain a high specificity at low target substrate concentrations. Experiments have shown that the use of exponentially growing reporter cultures has advantages over the use of carbon-starved, resting cultures. In aqueous solutions for both substrates, naphthalene and salicylate, linear relationships between initial substrate concentration and bioluminescence response were found over concentration ranges of 1 to 2 orders of magnitude. Naphthalene could be detected at a concentration of 45 ppb. Studies conducted under defined conditions with extracts and slurries of experimentally contaminated sterile soils and identical uncontaminated soil controls demonstrated that this method can be used for specific and quantitative estimations of target pollutant presence and bioavailability in soil extracts and for specific and qualitative estimations of naphthalene in soil slurries.

  14. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative-risk-assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no-expected-sensitisation-induction-level (NESIL), (b) incorporation of sensitization-assessment-factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of consumer-exposure-level (CEL). Based on these elements an acceptable-exposure-level (AEL) can be calculated by dividing the NESIL of the product by the individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions, referred to as the measured-exposure-level (MEL). For that reason a direct comparison is possible between the NESIL and the MEL as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement versus a hazard-based classification of hair dye ingredients. PMID:23069142
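    The QRA arithmetic described above reduces to a one-line calculation; the NESIL and SAF values below are placeholders chosen for illustration, not figures from the paper:

```python
def acceptable_exposure_level(nesil, saf_interindividual, saf_matrix, saf_use):
    """AEL = NESIL / (product of the sensitisation assessment factors)."""
    return nesil / (saf_interindividual * saf_matrix * saf_use)

# Hypothetical numbers: NESIL 1000 ug/cm2, SAFs of 10 (inter-individual
# variability), 3 (product matrix), 3 (use pattern)
ael = acceptable_exposure_level(1000.0, 10, 3, 3)  # ~11.1 ug/cm2
mel = 5.0                                          # measured exposure level
acceptable = mel <= ael                            # exposure judged acceptable
```

The paper's simplification is to compare the NESIL directly against a measured exposure level (MEL) rather than an estimated CEL.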

  15. Quantitative assessment of reactive hyperemia using laser speckle contrast imaging at multiple wavelengths

    NASA Astrophysics Data System (ADS)

    Young, Anthony; Vishwanath, Karthik

    2016-03-01

    Reactive hyperemia refers to an increase of blood flow in tissue after release of an occlusion in the local vasculature. Measuring the temporal response of reactive hyperemia post-occlusion in patients has the potential to shed information about microvascular diseases such as systemic sclerosis and diabetes. Laser speckle contrast imaging (LSCI) is an imaging technique capable of sensing superficial blood flow in tissue, which can be used to quantitatively assess reactive hyperemia. Here, we employ LSCI using coherent sources at blue, green and red wavelengths to evaluate reactive hyperemia in healthy human volunteers. Blood flow in the forearms of subjects was measured using LSCI to assess the time-course of reactive hyperemia triggered by a pressure cuff applied to the biceps of the subjects. Raw speckle images were acquired and processed to yield blood-flow parameters from a region of interest before, during and after application of occlusion. Reactive hyperemia was quantified via two measures: (1) the difference between the peak LSCI flow during the hyperemia and the baseline flow, and (2) the amount of time that elapsed between the release of the occlusion and the peak flow. These measurements were acquired in three healthy human participants under the three laser wavelengths employed. The studies shed light on the utility of in vivo LSCI-based flow sensing for non-invasive assessment of reactive hyperemia responses and on how the choice of source wavelength influences the measured parameters.
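    The two hyperemia measures can be extracted from a flow trace as sketched below; the window definitions and the synthetic trace are assumptions, not the study's exact protocol:

```python
import numpy as np

def hyperemia_metrics(flow, t, release_time):
    """Return (peak flow minus baseline, time from release to peak).

    Baseline is taken as the mean flow before cuff release; the peak is
    searched for over all samples at or after the release.
    """
    flow = np.asarray(flow, dtype=float)
    t = np.asarray(t, dtype=float)
    baseline = flow[t < release_time].mean()
    post = t >= release_time
    i_peak = int(np.argmax(flow[post]))
    peak_rise = flow[post][i_peak] - baseline
    time_to_peak = t[post][i_peak] - release_time
    return peak_rise, time_to_peak

# Synthetic trace: baseline flow of 1.0, cuff released at t = 4 s,
# flow peaking at 3.0 at t = 6 s
t = np.arange(0.0, 10.0, 0.5)
flow = np.where(t < 4.0, 1.0, 3.0 * np.exp(-(t - 6.0) ** 2))
peak_rise, time_to_peak = hyperemia_metrics(flow, t, release_time=4.0)
```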

  16. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    SciTech Connect

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-07-15

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of certain specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here results of the application of QSTRs to assess rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  17. Quantitative Gait Measurement With Pulse-Doppler Radar for Passive In-Home Gait Assessment

    PubMed Central

    Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E.

    2014-01-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters including walking speed and step time using Doppler radar. The gait parameters have been validated with a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual radar setup with one radar placed at foot level and the other at torso level is necessary. An excellent absolute agreement with an intraclass correlation coefficient of 0.97 was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, they both have a systematic offset relative to the ground truth, owing to the walking direction with respect to the radar beam. The torso-level radar has better performance (9% offset on average) in the speed estimation than the foot-level radar (13%–18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment. PMID:24771566
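    The angle analysis follows from the Doppler projection v_measured = v_true·cos θ, so a fractional speed underestimate x implies θ = arccos(1 − x). A sketch (mapping each average offset to a single fixed angle is a simplification of the paper's analysis):

```python
import math

def offset_angle_deg(fractional_offset):
    """Beam angle (degrees) implied by a Doppler speed underestimate:
    v_measured = v_true * cos(theta)  =>  theta = arccos(1 - offset)."""
    return math.degrees(math.acos(1.0 - fractional_offset))

# The torso-level radar's ~9% offset and the foot-level radar's upper
# ~18% offset correspond to beam angles of roughly:
torso_angle = offset_angle_deg(0.09)  # ~24.5 degrees
foot_angle = offset_angle_deg(0.18)   # ~34.9 degrees
```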

  18. Quantitative assessment of resilience of a water supply system under rainfall reduction due to climate change

    NASA Astrophysics Data System (ADS)

    Amarasinghe, Pradeep; Liu, An; Egodawatta, Prasanna; Barnes, Paul; McGree, James; Goonetilleke, Ashantha

    2016-09-01

    A water supply system can be impacted by rainfall reduction due to climate change, thereby reducing its supply potential. This highlights the need to understand the system resilience, which refers to the ability to maintain service under various pressures (or disruptions). Currently, the concept of resilience has not yet been widely applied in managing water supply systems. This paper proposed three technical resilience indicators to assess the resilience of a water supply system. A case study analysis was undertaken of the Water Grid system of Queensland State, Australia, to showcase how the proposed indicators can be applied to assess resilience. The research outcomes confirmed that the use of resilience indicators is capable of identifying critical conditions in relation to the water supply system operation, such as the maximum allowable rainfall reduction for the system to maintain its operation without failure. Additionally, resilience indicators also provided useful insight regarding the sensitivity of the water supply system to a changing rainfall pattern in the context of climate change, which represents the system's stability when experiencing pressure. The study outcomes will help in the quantitative assessment of resilience and provide improved guidance to system operators to enhance the efficiency and reliability of a water supply system.

  19. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
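    The paper's exposure model is not reproduced here, but QMRA studies of airborne pathogens commonly combine an inhaled-dose estimate with an exponential dose-response curve; the sketch below uses hypothetical concentrations, exposure times, and a made-up dose-response parameter `K_HYPOTHETICAL`:

    ```python
    import math

    def inhaled_dose(concentration: float, breathing_rate: float, hours: float) -> float:
        """Dose = airborne concentration (organisms/m^3) x inhalation rate (m^3/h) x time (h)."""
        return concentration * breathing_rate * hours

    def p_infection(dose: float, k: float) -> float:
        """Exponential dose-response model: P = 1 - exp(-k * dose)."""
        return 1.0 - math.exp(-k * dose)

    K_HYPOTHETICAL = 1.0e-3  # made-up dose-response parameter, NOT from the study

    # Two hypothetical scenarios: a short visit to a highly contaminated toilet
    # versus a full shift at a lower-concentration landfill site
    p_toilet = p_infection(inhaled_dose(100.0, 1.5, 0.25), K_HYPOTHETICAL)
    p_landfill = p_infection(inhaled_dose(1.0, 1.5, 8.0), K_HYPOTHETICAL)
    ```

    The ranking of scenarios then depends on the product of concentration and exposure time, which is why a short exposure to a heavily contaminated setting can outrank a long shift at a cleaner one.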

  1. Quantitative gait measurement with pulse-Doppler radar for passive in-home gait assessment.

    PubMed

    Wang, Fang; Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E

    2014-09-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual-radar setup with one radar placed at foot level and the other at torso level is necessary. Excellent absolute agreement, with an intraclass correlation coefficient of 0.97, was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, both have a systematic offset relative to the ground truth due to the walking direction with respect to the radar beam. The torso-level radar has better performance (9% offset on average) in the speed estimation than the foot-level radar (13%-18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment. PMID:24771566
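    The systematic speed offset attributed to walking direction follows from Doppler geometry: the radar measures only the velocity component along its beam, so v_measured = v_true · cos θ. A small sketch inverting the reported offsets into implied beam angles (the 9% and 13-18% offsets are quoted from the abstract; the inversion itself is standard geometry, not the authors' code):

    ```python
    import math

    def doppler_measured_speed(true_speed: float, angle_deg: float) -> float:
        """The radar sees only the radial component of motion: v * cos(theta)."""
        return true_speed * math.cos(math.radians(angle_deg))

    def implied_angle_deg(fractional_offset: float) -> float:
        """Invert v_meas = v_true * cos(theta): offset = 1 - cos(theta)."""
        return math.degrees(math.acos(1.0 - fractional_offset))

    # Offsets reported in the abstract: ~9% (torso radar), 13-18% (foot radar)
    angle_torso = implied_angle_deg(0.09)    # roughly 25 degrees off-beam
    angle_foot_lo = implied_angle_deg(0.13)
    angle_foot_hi = implied_angle_deg(0.18)
    ```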

  2. Coherent and consistent decision making for mixed hazardous waste management: The application of quantitative assessment techniques

    SciTech Connect

    Smith, G.M.; Little, R.H.; Torres, C.

    1994-12-31

    This paper focuses on predictive modelling capacity for post-disposal safety assessments of land-based disposal facilities, illustrated by presentation of the development and application of a comprehensive, yet practicable, assessment framework. The issues addressed include: (1) land-based disposal practice, (2) the conceptual and mathematical representation of processes leading to release, migration and accumulation of contaminants, (3) the identification and evaluation of relevant assessment end-points, including human health, health of non-human biota and ecosystems, and property and resource effects, (4) the gap between data requirements and data availability, and (5) the application of results in decision making, given the uncertainties in assessment results and the difficulty of comparing qualitatively different impacts arising on different temporal and spatial scales. The paper illustrates the issues with examples based on disposal of metals and radionuclides to shallow facilities. The types of disposal facility considered include features consistent with facilities for radioactive wastes as well as other types of design more typical of hazardous wastes. The intention is to raise the question of whether radioactive and other hazardous wastes are being consistently managed, and to show that assessment methods are being developed which can provide quantitative information on the levels of environmental impact as well as a consistent approach for different types of waste; such methods can then be applied to mixed hazardous wastes containing radionuclides as well as other contaminants. The remaining question is whether the will exists to employ them. The discussion and worked illustrations are based on a methodology developed and being extended within the current European Atomic Energy Community's cost-sharing research program on radioactive waste management and disposal, with co-funding support from Empresa Nacional de Residuos Radiactivos SA, Spain.

  3. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. Methods A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyze results from quantitative questions on the assessments, pre- and post-tests, and evaluations. Results CARES fellows' knowledge increased at follow-up (75% of questions were answered correctly on average) compared with the baseline assessment (38% of questions were answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions The CARES fellows training program was successful in participant satisfaction and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community–academic research partnerships. PMID:22982849

  4. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears that treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasi-experimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  5. Quantitative mass spectrometric analysis and post-extraction stability assessment of the euglenoid toxin euglenophycin.

    PubMed

    Gutierrez, Danielle B; Rafalski, Alexandra; Beauchesne, Kevin; Moeller, Peter D; Triemer, Richard E; Zimba, Paul V

    2013-09-01

    Euglenophycin is a recently discovered toxin produced by at least one species of euglenoid algae. The toxin has been responsible for several fish mortality events. To facilitate the identification and monitoring of euglenophycin in freshwater ponds, we have developed a specific mass spectrometric method for the identification and quantitation of euglenophycin. The post-extraction stability of the toxin was assessed under various conditions. Euglenophycin was most stable at room temperature. At 8 °C there was a small, but statistically significant, loss in toxin after one day. These methods and knowledge of the toxin's stability will facilitate identification of the toxin as a causative agent in fish kills and determination of the toxin's distribution in the organs of exposed fish. PMID:24051554

  6. Quantitative assessment of pupillary light reflex in normal and anesthetized dogs: a preliminary study.

    PubMed

    Kim, Jury; Heo, Jiseong; Ji, Dongbeom; Kim, Min-Su

    2015-04-01

    The purpose of this study was to quantitatively assess the pupillary light reflex (PLR) in normal and anesthetized dogs using a pupillometer. Eleven dogs (20 eyes) of various breeds were included. PLRs were measured with a handheld pupillometer in dim light before and during anesthesia. Anesthesia was conducted with atropine, xylazine and ketamine. Parameters of pupillometry included neurological pupil index (NPi), pupil size, percent of change (%CH), latency (LAT), constriction velocity (CV), maximum constriction velocity (MCV) and dilation velocity (DV). NPi, %CH, CV and MCV were significantly decreased during anesthesia compared with the pre-anesthesia data. The results suggest that atropine-xylazine-ketamine combination anesthesia depresses the PLR. Additionally, this study demonstrates the feasibility of the use of a pupillometer in dogs. PMID:25648149

  7. A quantitative structure-activity relationship approach for assessing toxicity of mixture of organic compounds.

    PubMed

    Chang, C M; Ou, Y H; Liu, T-C; Lu, S-Y; Wang, M-K

    2016-06-01

    Four types of reactivity indices were employed to construct quantitative structure-activity relationships for the assessment of the toxicity of organic chemical mixtures. The results of the analysis indicated that the maximum positive charge of the hydrogen atom and the inverse of the apolar surface area are the most important descriptors for the toxicity of mixtures of benzene and its derivatives to Vibrio fischeri. The toxicity of mixtures of aromatic compounds to the green alga Scenedesmus obliquus is mainly affected by electron flow and electrostatic interactions. The electron-acceptance chemical potential and the maximum positive charge of the hydrogen atom are found to be the most important descriptors for the joint toxicity of aromatic compounds. PMID:27426856

  8. Quantitative assessment of motion correction for high angular resolution diffusion imaging.

    PubMed

    Sakaie, Ken E; Lowe, Mark J

    2010-02-01

    Several methods have been proposed for motion correction of high angular resolution diffusion imaging (HARDI) data. There have been few comparisons of these methods, partly due to a lack of quantitative metrics of performance. We compare two motion correction strategies using two figures of merit: displacement introduced by the motion correction and the 95% confidence interval of the cone of uncertainty of voxels with prolate tensors. What follows is a general approach for assessing motion correction of HARDI data that may have broad application for quality assurance and optimization of postprocessing protocols. Our analysis demonstrates two important issues related to motion correction of HARDI data: (1) although neither method we tested was dramatically superior in performance, both were dramatically better than performing no motion correction, and (2) iteration of motion correction can improve the final results. Based on the results demonstrated here, iterative motion correction is strongly recommended for HARDI acquisitions. PMID:19695824

  9. Quantitative assessment of steroid hormone binding sites by thaw-mount autoradiography

    SciTech Connect

    Stumpf, W.E.; Sar, M.; Zuber, T.J.; Soini, E.; Tuohimaa, P.

    1981-01-01

    A procedure for the quantitative assessment of nuclear receptors for steroid hormones, and other substances, in individual cells is presented. Thaw-mount autoradiography, a procedure developed earlier in our laboratory, is utilized. The silver grain yield (specific activity) is 16.6 disintegrations per silver grain, as determined for tritium in guinea pig uterine tissues. An integrated formula is presented and applied for ³H-estradiol, ³H-diethylstilbestrol, and ³H-aldosterone in sampled tissue. A comparison with data derived from the literature that are based on the homogenization of whole uteri and biochemical analysis shows values comparable with the autoradiographic data if the latter are pooled. The pooled data indicated 12,000-14,000 molecules of ³H-estradiol per uterine nucleus, while subpopulations of target cells vary between 5,000 and 28,000 per nucleus.

  10. Evaluation of dental enamel caries assessment using Quantitative Light Induced Fluorescence and Optical Coherence Tomography.

    PubMed

    Maia, Ana Marly Araújo; de Freitas, Anderson Zanardi; de L Campello, Sergio; Gomes, Anderson Stevens Leônidas; Karlsson, Lena

    2016-06-01

    An in vitro study of morphological alterations between sound dental structure and artificially induced white spot lesions in human teeth was performed, through the loss of fluorescence measured by Quantitative Light-Induced Fluorescence (QLF) and alterations of the light attenuation coefficient measured by Optical Coherence Tomography (OCT). To analyze the OCT images from a commercially available system, a special algorithm was applied, whereas the QLF images were analyzed using the software available in the commercial system employed. When comparing sound regions against white spot lesion regions, QLF showed a reduction in fluorescence intensity, whilst OCT showed an increase in light attenuation. Comparison of the percentage of alteration in optical properties between sound and artificial enamel caries regions showed that OCT images processed for light attenuation enhanced the tooth's optical alterations more than the fluorescence loss detected by the QLF system. PMID:26351155

  11. Auditory-prefrontal axonal connectivity in the macaque cortex: quantitative assessment of processing streams.

    PubMed

    Bezgin, Gleb; Rybacki, Konrad; van Opstal, A John; Bakker, Rembrandt; Shen, Kelly; Vakorin, Vasily A; McIntosh, Anthony R; Kötter, Rolf

    2014-08-01

    Primate sensory systems subserve complex neurocomputational functions. Consequently, these systems are organised anatomically in a distributed fashion, commonly linking areas to form specialised processing streams. Each stream is related to a specific function, as evidenced from studies of the visual cortex, which features rather prominent segregation into spatial and non-spatial domains. It has been hypothesised that other sensory systems, including auditory, are organised in a similar way on the cortical level. Recent studies offer rich qualitative evidence for the dual stream hypothesis. Here we provide a new paradigm to quantitatively uncover these patterns in the auditory system, based on an analysis of multiple anatomical studies using multivariate techniques. As a test case, we also apply our assessment techniques to more ubiquitously-explored visual system. Importantly, the introduced framework opens the possibility for these techniques to be applied to other neural systems featuring a dichotomised organisation, such as language or music perception. PMID:24980416
  13. Quantitative risk assessment & leak detection criteria for a subsea oil export pipeline

    NASA Astrophysics Data System (ADS)

    Zhang, Fang-Yuan; Bai, Yong; Badaruddin, Mohd Fauzi; Tuty, Suhartodjo

    2009-06-01

    A quantitative risk assessment (QRA) based on leak detection criteria (LDC) for the design of a proposed subsea oil export pipeline is presented in this paper. The objective of this QRA/LDC study was to determine whether current leak detection methodologies were sufficient, based on QRA results, while excluding the use of statistical leak detection; if not, an appropriate LDC for the leak detection system would need to be established. The UK PARLOC database was used for the calculation of pipeline failure rates, and the software POSVCM from MMS was used for oil spill simulations. QRA results revealed that the installation of a statistically based leak detection system (LDS) can significantly reduce the time to leak detection, thereby mitigating the consequences of leakage. A sound LDC has been defined based on the QRA study results and comments from various LDS vendors to assist the emergency response team (ERT) in quickly identifying and locating leakage and employing the most effective measures to contain damage.
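    As context for the failure-rate step of such a QRA, a historical per-kilometre-year failure rate (as tabulated in databases like PARLOC) is typically converted into a probability of at least one leak over the design life under a Poisson assumption; the rate, length, and lifetime below are illustrative, not PARLOC values:

    ```python
    import math

    def leak_probability(failure_rate_per_km_yr: float, length_km: float, years: float) -> float:
        """Probability of at least one leak over the design life, assuming leaks
        arrive as a Poisson process with a constant per-km-year rate."""
        expected_leaks = failure_rate_per_km_yr * length_km * years
        return 1.0 - math.exp(-expected_leaks)

    # Illustrative numbers only: a 25 km line, 20-year life, 1e-4 failures/km-yr
    p_any_leak = leak_probability(failure_rate_per_km_yr=1.0e-4,
                                  length_km=25.0, years=20.0)
    ```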

  14. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis methods, including image processing and quantification of both the whole ductal tree and the terminal end buds. It allows accurate and objective measurement of both growth parameters and fine morphological glandular structures. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be transposed to several software packages and thus widely used by scientists studying rodent mammary gland morphology. PMID:26910307
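    One of the fine-structure parameters, branch end-point density, has a simple operational definition on a skeletonized image: an end-point is a skeleton pixel with exactly one 8-connected neighbour. A stdlib-only sketch of that count (the published pipeline presumably uses dedicated image-analysis software; this only illustrates the definition):

    ```python
    def count_endpoints(skeleton):
        """Count pixels of a binary skeleton (list of lists of 0/1) that have
        exactly one 8-connected neighbour, i.e. branch end-points."""
        rows, cols = len(skeleton), len(skeleton[0])
        endpoints = 0
        for r in range(rows):
            for c in range(cols):
                if not skeleton[r][c]:
                    continue
                neighbours = sum(
                    skeleton[rr][cc]
                    for rr in range(max(0, r - 1), min(rows, r + 2))
                    for cc in range(max(0, c - 1), min(cols, c + 2))
                    if (rr, cc) != (r, c)
                )
                if neighbours == 1:
                    endpoints += 1
        return endpoints

    # A tiny Y-shaped skeleton: three tips, so three end-points
    y_branch = [
        [1, 0, 0, 0, 1],
        [0, 1, 0, 1, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
        [0, 0, 1, 0, 0],
    ]
    ```

    Dividing the end-point count by the gland (or image) area then gives the density figure the abstract refers to.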

  15. MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?

    SciTech Connect

    Giger, M; Petrick, N; Obuchowski, N; Kinahan, P

    2014-06-15

    The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.

  16. Quantitative assessment of the stent/scaffold strut embedment analysis by optical coherence tomography.

    PubMed

    Sotomi, Yohei; Tateishi, Hiroki; Suwannasom, Pannipa; Dijkstra, Jouke; Eggermont, Jeroen; Liu, Shengnan; Tenekecioglu, Erhan; Zheng, Yaping; Abdelghani, Mohammad; Cavalcante, Rafael; de Winter, Robbert J; Wykrzykowska, Joanna J; Onuma, Yoshinobu; Serruys, Patrick W; Kimura, Takeshi

    2016-06-01

    The degree of stent/scaffold embedment could be a surrogate parameter of the vessel wall-stent/scaffold interaction and could have biological implications in the vascular response. We have developed new dedicated software for the quantitative evaluation of the embedment of struts by optical coherence tomography (OCT). In the present study, we describe the algorithm of the embedment analysis and its reproducibility. The degree of embedment was evaluated as the ratio of the embedded part to the whole strut height and subdivided into quartiles. The agreement and the inter- and intra-observer reproducibility were evaluated using kappa and the intraclass correlation coefficient (ICC). A total of 4 pullbacks of OCT images in 4 randomly selected coronary lesions with 3.0 × 18 mm devices [2 lesions with Absorb BVS and 2 lesions with XIENCE (both from Abbott Vascular, Santa Clara, CA, USA)] from the Absorb Japan trial were evaluated by two investigators with QCU-CMS software version 4.69 (Leiden University Medical Center, Leiden, The Netherlands). In total, 1481 polymeric struts in 174 cross-sections and 1415 metallic struts in 161 cross-sections were analyzed. Inter- and intra-observer reproducibility of the quantitative measurements of embedment ratio and the categorical assessment of embedment in Absorb BVS and XIENCE showed excellent agreement, with ICC ranging from 0.958 to 0.999 and kappa ranging from 0.850 to 0.980. The newly developed embedment software showed excellent reproducibility. Computer-assisted embedment analysis could be a feasible tool to assess strut penetration into the vessel wall, which could be a surrogate for acute injury caused by implantation of devices. PMID:26898315
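    The embedment metric lends itself to a compact sketch: the ratio of embedded depth to whole strut height, binned into quartiles as the abstract describes. The actual OCT measurement logic lives in the QCU-CMS software; the strut heights below are hypothetical:

    ```python
    def embedment_ratio(embedded_height_um: float, strut_height_um: float) -> float:
        """Fraction of the strut height buried in the vessel wall, clamped to [0, 1]."""
        if strut_height_um <= 0:
            raise ValueError("strut height must be positive")
        return max(0.0, min(1.0, embedded_height_um / strut_height_um))

    def embedment_quartile(ratio: float) -> int:
        """Map an embedment ratio in [0, 1] to quartile 1-4."""
        for quartile, upper in enumerate((0.25, 0.5, 0.75), start=1):
            if ratio <= upper:
                return quartile
        return 4

    # (embedded, total) strut heights in microns; purely illustrative
    struts = [(30.0, 150.0), (90.0, 150.0), (150.0, 150.0)]
    quartiles = [embedment_quartile(embedment_ratio(e, h)) for e, h in struts]
    ```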

  17. Quantitative assessment of hemodynamic changes during spinal dural arteriovenous fistula surgery.

    PubMed

    Shi, Wei; Qiao, Guangyu; Sun, Zhenghui; Shang, Aijia; Wu, Chen; Xu, Bainan

    2015-07-01

    We aimed to evaluate the efficacy of FLOW 800 (Carl Zeiss Meditec, Jena, Thuringia, Germany) with indocyanine green (ICG) videoangiography for the quantitative assessment of flow dynamics in spinal dural arteriovenous fistula (dAVF) surgeries. We prospectively enrolled nine patients with spinal dAVF diagnosed within the past year and performed FLOW 800 analyses using ICG videoangiography before and after surgical obliteration of the fistula. A color-coded map was semi-automatically generated by FLOW 800 and used for high-resolution visualization of the vasculature and instant interpretation of the dynamic flow changes. The FLOW 800-specific hemodynamic parameters were employed for real-time measurements of parenchymal perfusion alterations. Overall, 18 intraoperative FLOW 800 analyses using ICG videoangiography were performed in nine patients. The color-coded map aided the detection and complete obliteration of the fistulas in all patients, and the results were verified by postoperative spinal digital subtraction angiography. The transit time parameter was significantly shorter in the preobliteration phase than in the postobliteration phase (p < 0.01); the rise time parameter exhibited a similar trend (p = 0.08); and maximum intensity and blood flow index were not significantly different between these phases. FLOW 800 with ICG videoangiography provided an intuitive and objective understanding of blood flow dynamics intraoperatively and enabled easy and confident identification and treatment of this pathology. The FLOW 800-specific hemodynamic analyses provided additional perfusion information that enabled real-time measurements of parenchymal perfusion alterations. FLOW 800 with ICG videoangiography is useful for the intraoperative quantitative assessment of flow dynamics, facilitating safety and confidence in the treatment of spinal dAVF. PMID:25934113

  18. Quantitative Assessment of MRI T2 Response to Kainic Acid Neurotoxicity in Rats in vivo.

    PubMed

    Liachenko, Serguei; Ramu, Jaivijay; Konak, Tetyana; Paule, Merle G; Hanig, Joseph

    2015-07-01

    The aim of this study was to assess quantitative changes in T2 relaxation using magnetic resonance imaging in rats exposed to kainic acid, to assess the utility of such endpoints as biomarkers of neurotoxicity. Quantitative T2 mapping was performed in 21 rats before and 2, 24, and 48 h after a single i.p. injection of 10 mg/kg of kainic acid. Three methods of quantifying T2 changes were explored: (1) Thresholding: all voxels exhibiting T2 ≤ 72 ms were designated normal tissue, whereas voxels exhibiting T2 > 72 ms were designated lesioned tissue; (2) Statistical mapping: T2 maps obtained after treatment were statistically compared with averaged "baseline" maps, voxel-by-voxel; (3) Within-subject difference from baseline: for each individual, the baseline T2 map was subtracted from the T2 map obtained after treatment. Based on the follow-up histopathological response, there were 9 responders and 7 nonresponders; 5 animals were not classified because of early sacrifice at 2 h, too soon after treatment to detect any morphological evidence. The "thresholding" method (1) detected differences between groups only at the later time point of 48 h, the "statistical mapping" approach (2) detected differences 24 and 48 h after treatment, and the "within-subject difference from baseline" method (3) detected statistically significant differences between groups at each time point (2, 24, and 48 h). T2 mapping provides an easily quantifiable biomarker, and the quantification method that uses the same animal as its own control provides the most sensitive metrics. PMID:25904105
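    Methods (1) and (3) are simple enough to state in code. The sketch below operates on flat lists of per-voxel T2 values in milliseconds; the 72 ms threshold is the one quoted in the abstract, while the voxel values are invented:

    ```python
    T2_THRESHOLD_MS = 72.0

    def lesion_fraction_by_threshold(t2_map):
        """Method 1: fraction of voxels whose T2 exceeds the fixed 72 ms cutoff."""
        lesioned = sum(1 for t2 in t2_map if t2 > T2_THRESHOLD_MS)
        return lesioned / len(t2_map)

    def within_subject_difference(baseline_map, followup_map):
        """Method 3: per-voxel T2 change from the same animal's own baseline."""
        return [post - pre for pre, post in zip(baseline_map, followup_map)]

    baseline = [60.0, 65.0, 70.0, 68.0]
    after_ka = [61.0, 80.0, 95.0, 69.0]   # hypothetical post-kainic-acid values
    diff = within_subject_difference(baseline, after_ka)
    ```

    The within-subject map flags modest elevations (e.g. 65 → 80 ms) that the fixed threshold only catches once voxels cross 72 ms, which is consistent with method (3) being the most sensitive.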

  19. An Improved Quantitative Approach for the Assessment of Mitochondrial Fragmentation in Chemoresistant Ovarian Cancer Cells

    PubMed Central

    Farrand, Lee; Kim, Ji Young; Im-Aram, Akechai; Suh, Jeong-Yong; Lee, Hyong Joo; Tsang, Benjamin K.

    2013-01-01

    Mitochondrial fission is a process that involves cleavage of mitochondria into smaller fragments and is regulated by the GTPase Dynamin-related protein 1 (Drp1). Higher levels of mitochondrial fission are associated with the induction of apoptosis in cancer cells. However, current methods to accurately quantify mitochondrial fission in order to compare therapeutics that target this process are often ambiguous or rely on subjective assessment. Mitochondria are also prone to aggregation, making accurate analysis difficult. Here we describe an improved approach for the quantification of mitochondrial fragmentation involving several differences from currently existing methods. Cells are first subjected to cytological centrifugation, which reduces cellular z-axis height and disperses individual mitochondria for easier observation. Three commercially available fluorescence analysis tools are then applied to disambiguate remaining mitochondrial clusters that require further inspection. Finally, cut-off scoring is applied, which can be tailored to individual cell type. The resultant approach allows for the efficient and objective assessment of mitochondrial fragmentation in response to treatment. We applied this technique to an experimental question involving chemosensitive and chemoresistant ovarian cancer (OVCA) cells. Cisplatin and the phytochemical piperlongumine were found to induce both mitochondrial fission and apoptosis in chemosensitive cells, while only piperlongumine was able to elicit these cellular responses in chemoresistant cells. Piperlongumine-induced apoptosis appeared to be mediated by Drp1-dependent mitochondrial fission since the apoptotic response was attenuated by the presence of the Drp1 inhibitor mDivi-1. Our study provides groundwork for a more objective approach to the quantification of mitochondrial fragmentation, and sheds further light on a potential mechanism of action for piperlongumine in the treatment of chemoresistant OVCA. PMID:24040144
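    The final cut-off scoring step can be read as: count mitochondrial fragments per cell, score a cell as fragmented if its count meets a cell-type-specific cut-off, and report the percentage of fragmented cells. A sketch under that reading (the cut-off value and fragment counts are illustrative, not the paper's data):

    ```python
    def percent_fragmented(fragment_counts, cutoff):
        """Score each cell as fragmented if its mitochondrial fragment count
        meets the cell-type-specific cut-off; return the percentage of such cells."""
        if not fragment_counts:
            raise ValueError("no cells scored")
        fragmented = sum(1 for n in fragment_counts if n >= cutoff)
        return 100.0 * fragmented / len(fragment_counts)

    control = [8, 12, 15, 10, 9]      # fragments per cell, illustrative
    treated = [40, 55, 22, 61, 48]    # e.g. after a fission-inducing treatment
    pct_control = percent_fragmented(control, cutoff=30)
    pct_treated = percent_fragmented(treated, cutoff=30)
    ```

    Tailoring `cutoff` per cell type, as the abstract suggests, keeps the score comparable across lines whose baseline mitochondrial morphology differs.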

  20. The importance of tissue handling of surgically removed breast cancer for an accurate assessment of the Ki-67 index

    PubMed Central

    Arima, Nobuyuki; Nishimura, Reiki; Osako, Tomofumi; Nishiyama, Yasuyuki; Fujisue, Mamiko; Okumura, Yasuhiro; Nakano, Masahiro; Tashima, Rumiko; Toyozumi, Yasuo

    2016-01-01

    Aim Insufficient attention has been given to the importance of tissue handling of surgical breast cancer specimens for Ki-67 immunohistochemistry. We sought to investigate the effect of fixation status on the Ki-67 index. Methods We examined the effects of the fixative, and of the time to and duration of fixation, using surgical specimens, and finally compared the paired Ki-67 indices of the tumour between core needle and surgical specimens. Results The Ki-67 index was significantly higher when 10% neutral buffered formalin was used (p=0.0276). Insufficient fixation caused a drastic reduction in the Ki-67 index (p=0.0177), but not in oestrogen receptor (ER) or human epidermal growth factor receptor 2 (HER2). A 16-hour delay in time to fixation also caused a reduction in the Ki-67 index (p=0.0284), but not in ER. Prolonged fixation led to a gradual, time-dependent reduction in the Ki-67 index, but not in ER or HER2. Finally, cutting the tumour before fixation improved fixation status and consequently increased the Ki-67 index (p=0.0181), resulting in a strong correlation of the Ki-67 index between core needle and surgical specimens (r=0.8595). Conclusions Tissue handling of surgical specimens is more critical for assessing the Ki-67 index than for ER and HER2. We should pay more attention to tissue fixation status for the standard assessment of the Ki-67 index. PMID:26420767

  1. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. 
More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  2. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA along with flight test opportunities supporting NASA's project Constellation.

  3. Quantitative Assessment of Regional Wall Motion Abnormalities Using Dual-Energy Digital Subtraction Intravenous Ventriculography

    NASA Astrophysics Data System (ADS)

    McCollough, Cynthia H.

    Healthy portions of the left ventricle (LV) can often compensate for regional dysfunction, thereby masking regional disease when global indices of LV function are employed. Thus, quantitation of regional function provides a more useful method of assessing LV function, especially in diseases that have regional effects such as coronary artery disease. This dissertation studied the ability of a phase-matched dual-energy digital subtraction angiography (DE-DSA) technique to quantitate changes in regional LV systolic volume. The potential benefits and a theoretical description of the DE imaging technique are detailed. A correlated noise reduction algorithm is also presented which raises the signal-to-noise ratio of DE images by a factor of 2-4. Ten open-chest dogs were instrumented with transmural ultrasonic crystals to assess regional LV function in terms of systolic normalized-wall-thickening rate (NWTR) and percent-systolic-thickening (PST). A pneumatic occluder was placed on the left-anterior-descending (LAD) coronary artery to temporarily reduce myocardial blood flow, thereby changing regional LV function in the LAD bed. DE-DSA intravenous left ventriculograms were obtained at control and four levels of graded myocardial ischemia, as determined by reductions in PST. Phase-matched images displaying changes in systolic contractile function were created by subtracting an end-systolic (ES) control image from ES images acquired at each level of myocardial ischemia. The resulting wall-motion difference signal (WMD), which represents a change in regional systolic volume between the control and ischemic states, was quantitated by videodensitometry and compared with changes in NWTR and PST. Regression analysis of 56 data points from 10 animals shows a linear relationship between WMD and both NWTR and PST: WMD = -2.46 NWTR + 13.9, r = 0.64, p < 0.001; WMD = -2.11 PST + 18.4, r = 0.54, p < 0.001.
Thus, changes in regional ES LV volume between rest and ischemic states, as

  4. Quantitative crystalline silica exposure assessment for a historical cohort epidemiologic study in the German porcelain industry.

    PubMed

    Birk, Thomas; Guldner, Karlheinz; Mundt, Kenneth A; Dahmann, Dirk; Adams, Robert C; Parsons, William

    2010-09-01

    A time-dependent quantitative assessment of silica exposure among nearly 18,000 German porcelain workers was conducted. Results will be used to evaluate exposure-response disease risks. Over 8000 historical industrial hygiene (IH) measurements with original sampling and analysis protocols from 1954-2006 were obtained from the German Berufsgenossenschaft der keramischen und Glas-Industrie (BGGK) and used to construct a job exposure matrix (JEM). Early measurements from different devices were converted to modern gravimetric equivalent values. Conversion factors were derived from parallel historical measurements and new side-by-side measurements using historical and modern devices in laboratory dust tunnels and active workplace locations. Exposure values were summarized and smoothed using LOESS regression; estimates for early years were derived using backward extrapolation techniques. Employee work histories were merged with JEM values to determine cumulative crystalline silica exposures for cohort members. Average silica concentrations were derived for six primary similar exposure groups (SEGs) for 1938-2006. Over 40% of the cohort accumulated <0.5 mg/m(3)-years; just over one-third accumulated >1 mg/m(3)-years. Nearly 5000 workers had cumulative crystalline silica estimates >1.5 mg/m(3)-years. Similar numbers of men and women fell into each cumulative exposure category, except for 1113 women and 1567 men in the highest category. Over half of those hired before 1960 accumulated >3 mg/m(3)-years crystalline silica compared with 4.9% of those hired after 1960. Among those ever working in the materials preparation area, half accumulated >3 mg/m(3)-years compared with 12% of those never working in this area. Quantitative respirable silica exposures were estimated for each member of this cohort, including employment periods for which sampling used now obsolete technologies.
Although individual cumulative exposure estimates ranged from background to about 40 mg/m(3)-years
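The cumulative exposure calculation described above (employee work histories merged with JEM values) reduces to a sum of concentration times duration over employment periods. A minimal sketch, with entirely illustrative group names and concentrations rather than the study's actual JEM:

```python
# Hypothetical sketch of a job-exposure-matrix (JEM) lookup; the group
# names, eras, and concentrations are illustrative, not from the study.
JEM = {  # (similar exposure group, era) -> respirable silica, mg/m^3
    ("preparation", "pre-1960"): 0.25,
    ("preparation", "post-1960"): 0.08,
    ("glazing", "pre-1960"): 0.10,
    ("glazing", "post-1960"): 0.03,
}

def cumulative_exposure(history):
    """Sum concentration x duration over all employment periods.

    history: list of (similar_exposure_group, era, years) tuples.
    Returns cumulative exposure in mg/m^3-years.
    """
    return sum(JEM[(seg, era)] * years for seg, era, years in history)

# One worker's (invented) history: 10 years preparation, then 5 years glazing.
worker = [("preparation", "pre-1960", 10), ("glazing", "post-1960", 5)]
total = cumulative_exposure(worker)  # 0.25*10 + 0.03*5, i.e. about 2.65
```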

  5. NPEC Sourcebook on Assessment: Definitions and Assessment Methods for Communication, Leadership, Information Literacy, Quantitative Reasoning, and Quantitative Skills. NPEC 2005-0832

    ERIC Educational Resources Information Center

    Jones, Elizabeth A.; RiCharde, Stephen

    2005-01-01

    Faculty, instructional staff, and assessment professionals are interested in student outcomes assessment processes and tools that can be used to improve learning experiences and academic programs. How can students' skills be assessed effectively? What assessments measure skills in communication? Leadership? Information literacy? Quantitative…

  6. Assessing Quantitative Resistance against Leptosphaeria maculans (Phoma Stem Canker) in Brassica napus (Oilseed Rape) in Young Plants

    PubMed Central

    Huang, Yong-Ju; Qi, Aiming; King, Graham J.; Fitt, Bruce D. L.

    2014-01-01

    Quantitative resistance against Leptosphaeria maculans in Brassica napus is difficult to assess in young plants due to the long period of symptomless growth of the pathogen from the appearance of leaf lesions to the appearance of canker symptoms on the stem. By using doubled haploid (DH) lines A30 (susceptible) and C119 (with quantitative resistance), quantitative resistance against L. maculans was assessed in young plants in controlled environments at two stages: stage 1, growth of the pathogen along leaf veins/petioles towards the stem by leaf lamina inoculation; stage 2, growth in stem tissues to produce stem canker symptoms by leaf petiole inoculation. Two types of inoculum (ascospores; conidia) and three assessment methods (extent of visible necrosis; symptomless pathogen growth visualised using the GFP reporter gene; amount of pathogen DNA quantified by PCR) were used. In stage 1 assessments, significant differences were observed between lines A30 and C119 in area of leaf lesions, distance grown along veins/petioles assessed by visible necrosis or by viewing GFP and amount of L. maculans DNA in leaf petioles. In stage 2 assessments, significant differences were observed between lines A30 and C119 in severity of stem canker and amount of L. maculans DNA in stem tissues. GFP-labelled L. maculans spread more quickly from the stem cortex to the stem pith in A30 than in C119. Stem canker symptoms were produced more rapidly by using ascospore inoculum than by using conidial inoculum. These results suggest that quantitative resistance against L. maculans in B. napus can be assessed in young plants in controlled conditions. Development of methods to phenotype quantitative resistance against plant pathogens in young plants in controlled environments will help identify stable quantitative resistance for control of crop diseases. PMID:24454767

  7. Application of Pain Quantitative Analysis Device for Assessment of Postoperative Pain after Arthroscopic Rotator Cuff Repair

    PubMed Central

    Mifune, Yutaka; Inui, Atsuyuki; Nagura, Issei; Sakata, Ryosuke; Muto, Tomoyuki; Harada, Yoshifumi; Takase, Fumiaki; Kurosaka, Masahiro; Kokubu, Takeshi

    2015-01-01

    Purpose: The PainVision™ system was recently developed for quantitative pain assessment. Here, we used this system to evaluate the effect of plexus brachialis block on postoperative pain after arthroscopic rotator cuff repair. Methods: Fifty-five patients who underwent arthroscopic rotator cuff repair were included in this study. The first 26 cases received no plexus brachialis block (control group), and the next 29 cases received the plexus brachialis block before surgery (block group). Patients completed the visual analog scale (VAS) at 4, 8, 16, and 24 hours after surgery, and the intensity of postoperative pain was assessed with PainVision™ at 16 hours. The postoperative use of non-steroidal anti-inflammatory agents was also recorded. Results: The pain intensity at 16 hours after surgery assessed by PainVision™ was significantly lower in the block group than in the control group (block, 252.0 ± 47.8, control, 489.0 ± 89.1, P < 0.05). However, there were no differences in the VAS values at 16 hours between the 2 groups (block, 4.3 ± 0.6, control, 5.7 ± 0.4, P = N.S.). The pain intensity and VAS at 16 hours after surgery were highly correlated (r = 0.59, P = 0.006 in the block group and r = 0.62, P = 0.003 in the control group). The effect size of the assessment by PainVision™ was larger than that of the VAS (r = 0.31 for VAS and 0.51 for PainVision™). Conclusion: The PainVision™ system could be useful to evaluate postoperative pain because it enables the quantification and comparison of pain intensity independent of individual pain thresholds. PMID:26157522

  8. Stepping inside the niche: microclimate data are critical for accurate assessment of species' vulnerability to climate change.

    PubMed

    Storlie, Collin; Merino-Viteri, Andres; Phillips, Ben; VanDerWal, Jeremy; Welbergen, Justin; Williams, Stephen

    2014-09-01

    To assess a species' vulnerability to climate change, we commonly use mapped environmental data that are coarsely resolved in time and space. Coarsely resolved temperature data are typically inaccurate at predicting temperatures in microhabitats used by an organism and may also exhibit spatial bias in topographically complex areas. One consequence of these inaccuracies is that coarsely resolved layers may predict thermal regimes at a site that exceed species' known thermal limits. In this study, we use statistical downscaling to account for environmental factors and develop high-resolution estimates of daily maximum temperatures for a 36 000 km(2) study area over a 38-year period. We then demonstrate that this statistical downscaling provides temperature estimates that consistently place focal species within their fundamental thermal niche, whereas coarsely resolved layers do not. Our results highlight the need for incorporation of fine-scale weather data into species' vulnerability analyses and demonstrate that a statistical downscaling approach can yield biologically relevant estimates of thermal regimes. PMID:25252835
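The downscaling step described above can be illustrated with a simple linear calibration of coarse-grid temperatures against microhabitat logger temperatures; the study's actual model also uses environmental covariates, so this is only a minimal sketch with made-up numbers:

```python
# Minimal statistical-downscaling sketch: calibrate coarse gridded daily
# maxima against (invented) microhabitat logger readings, then predict.

def fit_linear(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Paired observations: coarse-grid daily max vs. shaded-microhabitat logger.
coarse = [28.0, 30.0, 32.0, 34.0]
logger = [24.5, 25.5, 26.5, 27.5]  # buffered: cooler, with lower variance

a, b = fit_linear(coarse, logger)
downscaled = a + b * 33.0  # estimated microhabitat temperature on a new day
```

With these toy data the microhabitat warms only half as fast as the grid cell (b = 0.5), which is the kind of buffering that keeps downscaled estimates inside a species' thermal limits.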

  9. Assessment of the extended Koopmans' theorem for the chemical reactivity: Accurate computations of chemical potentials, chemical hardnesses, and electrophilicity indices.

    PubMed

    Yildiz, Dilan; Bozkaya, Uğur

    2016-01-30

    The extended Koopmans' theorem (EKT) provides a straightforward way to compute ionization potentials and electron affinities from any level of theory. Although it is widely applied to ionization potentials, the EKT approach has not previously been applied to the evaluation of chemical reactivity. We present the first benchmarking study to investigate the performance of the EKT methods for predictions of chemical potentials (μ) (hence electronegativities), chemical hardnesses (η), and electrophilicity indices (ω). We assess the performance of the EKT approaches for post-Hartree-Fock methods, such as Møller-Plesset perturbation theory, the coupled-electron pair theory, and their orbital-optimized counterparts, for the evaluation of chemical reactivity. In particular, results from the orbital-optimized coupled-electron pair theory method (with the aug-cc-pVQZ basis set) for predictions of chemical reactivity are very promising; the corresponding mean absolute errors are 0.16, 0.28, and 0.09 eV for μ, η, and ω, respectively. PMID:26458329
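The indices benchmarked above have standard finite-difference working formulas in conceptual DFT, built from the ionization potential (IP) and electron affinity (EA). A small illustration; note that some authors define the hardness with an extra factor of 1/2, and the IP/EA values below are arbitrary, not from the paper:

```python
# Conceptual-DFT reactivity indices from IP and EA (finite-difference
# approximations). Convention used here: eta = IP - EA; some texts use
# (IP - EA)/2 instead.

def reactivity_indices(ip, ea):
    mu = -(ip + ea) / 2.0          # chemical potential (= -electronegativity)
    eta = ip - ea                  # chemical hardness
    omega = mu ** 2 / (2.0 * eta)  # electrophilicity index
    return mu, eta, omega

# Arbitrary illustrative values in eV, not results from the study:
mu, eta, omega = reactivity_indices(ip=10.0, ea=1.0)
```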

  10. Stepping inside the niche: microclimate data are critical for accurate assessment of species' vulnerability to climate change

    PubMed Central

    Storlie, Collin; Merino-Viteri, Andres; Phillips, Ben; VanDerWal, Jeremy; Welbergen, Justin; Williams, Stephen

    2014-01-01

    To assess a species' vulnerability to climate change, we commonly use mapped environmental data that are coarsely resolved in time and space. Coarsely resolved temperature data are typically inaccurate at predicting temperatures in microhabitats used by an organism and may also exhibit spatial bias in topographically complex areas. One consequence of these inaccuracies is that coarsely resolved layers may predict thermal regimes at a site that exceed species' known thermal limits. In this study, we use statistical downscaling to account for environmental factors and develop high-resolution estimates of daily maximum temperatures for a 36 000 km2 study area over a 38-year period. We then demonstrate that this statistical downscaling provides temperature estimates that consistently place focal species within their fundamental thermal niche, whereas coarsely resolved layers do not. Our results highlight the need for incorporation of fine-scale weather data into species' vulnerability analyses and demonstrate that a statistical downscaling approach can yield biologically relevant estimates of thermal regimes. PMID:25252835

  11. Quantitative assessment of the differential impacts of arbuscular and ectomycorrhiza on soil carbon cycling.

    PubMed

    Soudzilovskaia, Nadejda A; van der Heijden, Marcel G A; Cornelissen, Johannes H C; Makarov, Mikhail I; Onipchenko, Vladimir G; Maslov, Mikhail N; Akhmetzhanova, Asem A; van Bodegom, Peter M

    2015-10-01

    A significant fraction of carbon stored in the Earth's soil moves through arbuscular mycorrhiza (AM) and ectomycorrhiza (EM). The impacts of AM and EM on the soil carbon budget are poorly understood. We propose a method to quantify the mycorrhizal contribution to carbon cycling, explicitly accounting for the abundance of plant-associated and extraradical mycorrhizal mycelium. We discuss the need to acquire additional data to use our method, and present our new global database holding information on plant species-by-site intensity of root colonization by mycorrhizas. We demonstrate that the degree of mycorrhizal fungal colonization has globally consistent patterns across plant species. This suggests that the level of plant species-specific root colonization can be used as a plant trait. To exemplify our method, we assessed the differential impacts of AM : EM ratio and EM shrub encroachment on carbon stocks in sub-arctic tundra. AM and EM affect tundra carbon stocks at different magnitudes, and via partly distinct dominant pathways: via extraradical mycelium (both EM and AM) and via mycorrhizal impacts on above- and belowground biomass carbon (mostly AM). Our method provides a powerful tool for the quantitative assessment of mycorrhizal impact on local and global carbon cycling processes, paving the way towards an improved understanding of the role of mycorrhizas in the Earth's carbon cycle. PMID:26011828

  12. Quantitative risk assessment for the induction of allergic contact dermatitis: uncertainty factors for mucosal exposures.

    PubMed

    Farage, Miranda A; Bjerke, Donald L; Mahony, Catherine; Blackburn, Karen L; Gerberick, G Frank

    2003-09-01

    The quantitative risk assessment (QRA) paradigm has been extended to evaluating the risk of induction of allergic contact dermatitis from consumer products. Sensitization QRA compares product-related, topical exposures to a safe benchmark, the sensitization reference dose. The latter is based on an experimentally or clinically determined 'no observable adverse effect level' (NOAEL) and further refined by incorporating 'sensitization uncertainty factors' (SUFs) that address variables not adequately reflected in the data from which the threshold NOAEL was derived. A critical area of uncertainty for the risk assessment of oral care or feminine hygiene products is the extrapolation from skin to mucosal exposures. Most sensitization data are derived from skin contact, but the permeability of vulvovaginal and oral mucosae is greater than that of keratinized skin. Consequently, the QRA for some personal products that are exposed to mucosal tissue may require the use of more conservative SUFs. This article reviews the scientific basis for SUFs applied to topical exposure to vulvovaginal and oral mucosae. We propose a 20-fold range in the default uncertainty factor used in the contact sensitization QRA when extrapolating from data derived from the skin to situations involving exposure to non-keratinized mucosal tissue. PMID:14678210
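The QRA arithmetic described above divides the sensitization NOAEL by the product of the SUFs, with exposure to non-keratinized mucosa adding a further factor such as the proposed 20-fold default. A hedged toy example; all numbers are illustrative, not values from the article:

```python
# Illustrative sensitization QRA: acceptable exposure level equals the
# NOAEL divided by the product of sensitization uncertainty factors (SUFs).

def acceptable_exposure(noael, sufs):
    """NOAEL (e.g. in ug/cm^2) divided by the product of uncertainty factors."""
    product = 1.0
    for factor in sufs:
        product *= factor
    return noael / product

# Invented NOAEL and SUFs for a skin exposure vs. a mucosal exposure that
# carries an additional 20-fold factor for the more permeable tissue:
skin = acceptable_exposure(noael=1000.0, sufs=[10, 10])
mucosa = acceptable_exposure(noael=1000.0, sufs=[10, 10, 20])
```

The mucosal benchmark comes out 20 times lower than the skin benchmark, which is the practical effect of the extra uncertainty factor the article proposes.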

  13. Quantitative assessment of the probability of bluetongue virus overwintering by horizontal transmission: application to Germany

    PubMed Central

    2011-01-01

    Even though bluetongue virus (BTV) transmission is apparently interrupted during winter, bluetongue outbreaks often reappear in the next season (overwintering). Several mechanisms for BTV overwintering have been proposed, but to date, their relative importance remains unclear. In order to assess the probability of BTV overwintering by persistence in adult vectors, ruminants (through prolonged viraemia) or a combination of both, a quantitative risk assessment model was developed. Furthermore, the model allowed the role played by the residual number of vectors present during winter to be examined, and the effect of a proportion of Culicoides living inside buildings (endophilic behaviour) to be explored. The model was then applied to a real scenario: overwintering in Germany between 2006 and 2007. The results showed that the limited number of vectors active during winter seemed to allow the transmission of BTV during this period, and that while transmission was favoured by the endophilic behaviour of some Culicoides, its effect was limited. Even though transmission was possible, the likelihood of BTV overwintering by the mechanisms studied seemed too low to explain the observed re-emergence of the disease. Therefore, other overwintering mechanisms not considered in the model are likely to have played a significant role in BTV overwintering in Germany between 2006 and 2007. PMID:21314966

  14. Quantitative Assessment of Local Collagen Matrix Remodeling in 3-D Culture: The Role of Rho Kinase

    PubMed Central

    Kim, Areum; Lakshman, Neema; Petroll, W. Matthew

    2007-01-01

    The purpose of this study was to quantitatively assess the role of Rho kinase in modulating the pattern and amount of local cell-induced collagen matrix remodeling. Human corneal fibroblasts were plated inside 100 μm thick fibrillar collagen matrices and cultured for 24 hours in media with or without the Rho kinase inhibitor Y-27632. Cells were then fixed and stained with phalloidin. Fluorescent (for f-actin) and reflected light (for collagen fibrils) 3-D optical section images were acquired using laser confocal microscopy. Fourier transform analysis was used to assess collagen fibril alignment, and 3-D cell morphology and local collagen density were measured using MetaMorph. Culture in serum-containing media induced significant global matrix contraction, which was inhibited by blocking Rho kinase (p < 0.001). Fibroblasts generally had a bipolar morphology and intracellular stress fibers. Collagen fibrils were compacted and aligned parallel to stress fibers and pseudopodia. When Rho kinase was inhibited, cells had a more cortical f-actin distribution and dendritic morphology. Both local collagen fibril density and alignment were significantly reduced (p<0.01). Overall, the data suggests that Rho kinase dependent contractile force generation leads to co-alignment of cells and collagen fibrils along the plane of greatest resistance, and that this process contributes to global matrix contraction. PMID:16978606

  15. [Multi-component quantitative analysis combined with chromatographic fingerprint for quality assessment of Onosma hookeri].

    PubMed

    Aga, Er-bu; Nie, Li-juan; Dongzhi, Zhuo-ma; Wang, Ju-le

    2015-11-01

    A method for simultaneous determination of shikonin, acetyl shikonin and β,β'-dimethylpropene shikonin in Onosma hookeri and the chromatographic fingerprint was established by HPLC-DAD on an Agilent Zorbax SB-column with a gradient elution of acetonitrile and water at 0.8 mL x min(-1), 30 degrees C. The quality assessment was conducted by comparing the contents of the three naphthoquinone constituents, in combination with chromatographic fingerprint analysis and systems cluster analysis, among 7 batches of radix O. hookeri. The contents of the three naphthoquinone constituents showed wide variation across the 7 batches. The similarity value of the fingerprints of samples 5, 6 and 7 was above 0.99, samples 2 and 3 above 0.97, samples 3 and 4 above 0.90, and all other samples above 0.8, consistent with the contents of the three naphthoquinone constituents. The 7 samples were roughly divided into 4 categories. These results indicate that the quality of this medicinal material is variable and rather inconsistent. The established HPLC fingerprints and the quantitative analysis method can be used efficiently for quality assessment of O. hookeri. PMID:27097421
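Fingerprint similarity values like those reported above are typically computed as a correlation or cosine similarity between chromatograms sampled at matched retention times. A minimal sketch with invented peak vectors, not the study's data:

```python
# Cosine similarity between two chromatographic fingerprints, each
# represented as a vector of peak areas at matched retention times.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

s1 = [0.9, 0.10, 0.50, 0.0]  # invented peak areas for one batch
s2 = [0.8, 0.15, 0.55, 0.0]  # a closely matching batch
sim = cosine_similarity(s1, s2)  # close to 1 for similar batches
```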

  16. Application of quantitative uncertainty analysis for human health risk assessment at Rocky Flats

    SciTech Connect

    Duncan, F.L.W.; Gordon, J.W.; Smith, D.; Singh, S.P.

    1993-01-01

    The characterization of uncertainty is an important component of the risk assessment process. According to the U.S. Environmental Protection Agency's (EPA's) "Guidance on Risk Characterization for Risk Managers and Risk Assessors," point estimates of risk "do not fully convey the range of information considered and used in developing the assessment." Furthermore, the guidance states that Monte Carlo simulation may be used to estimate descriptive risk percentiles. To provide information about the uncertainties associated with the reasonable maximum exposure (RME) estimate and the relation of the RME to other percentiles of the risk distribution for Operable Unit 1 (OU-1) at Rocky Flats, uncertainties were identified and quantitatively evaluated. Monte Carlo simulation is a technique that can be used to provide a probability function of estimated risk using random values of exposure factors and toxicity values in an exposure scenario. The Monte Carlo simulation involves assigning a joint probability distribution to the input variables (i.e., exposure factors) of an exposure scenario. Next, a large number of independent samples from the assigned joint distribution are taken and the corresponding outputs calculated. Methods of statistical inference are used to estimate, from the output sample, some parameters of the output distribution, such as percentiles and the expected value.
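The Monte Carlo procedure described above can be sketched in a few lines: draw exposure factors from assumed distributions, compute a risk value per draw, and read percentiles off the output distribution. The distributions and parameters below are purely illustrative, not the OU-1 inputs:

```python
# Illustrative Monte Carlo risk simulation: sample exposure factors,
# compute risk per draw, and summarize the output distribution.
import random

random.seed(0)  # reproducible draws

def risk_draw():
    intake_rate = random.lognormvariate(0.0, 0.5)  # hypothetical, mg/day
    exposure_duration = random.uniform(1.0, 30.0)  # hypothetical, years
    slope_factor = 1e-6                            # hypothetical toxicity value
    return intake_rate * exposure_duration * slope_factor

risks = sorted(risk_draw() for _ in range(10_000))
median_risk = risks[len(risks) // 2]      # 50th percentile of risk
p95_risk = risks[int(0.95 * len(risks))]  # upper percentile, RME-like summary
```

Comparing a point estimate such as the RME against `p95_risk` and `median_risk` is exactly the kind of context the guidance says point estimates alone cannot convey.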

  17. Disability adjusted life year (DALY): a useful tool for quantitative assessment of environmental pollution.

    PubMed

    Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan

    2015-04-01

    Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating global and/or regional burden of diseases. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator to quantify the health impact of environmental pollution related to disease burden. Based on literature reviews, this article aims to give an overview of the applicable methodologies and research directions for using DALY as a tool for quantitative assessment of environmental pollution. With an introduction of the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. Regarding environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which need careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed, with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. It can be seen from existing studies that DALY is advantageous over conventional environmental impact assessment for quantification and comparison of the risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation regarding varied pollutants under varied circumstances before DALY calculation. PMID:25549348
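The DALY framework the article reviews decomposes the burden into years of life lost to premature mortality (YLL) and years lived with disability (YLD), i.e. DALY = YLL + YLD. A worked toy example with entirely illustrative inputs:

```python
# Basic (undiscounted, unweighted) DALY calculation:
#   YLL = deaths x standard life expectancy lost per death
#   YLD = incident cases x disability weight x average duration
# All input values below are invented for illustration.

def daly(deaths, life_exp_lost, cases, disability_weight, duration):
    yll = deaths * life_exp_lost
    yld = cases * disability_weight * duration
    return yll + yld

burden = daly(deaths=10, life_exp_lost=30.0,
              cases=500, disability_weight=0.2, duration=2.0)
# 10*30 + 500*0.2*2 = 500 DALYs attributable to the (hypothetical) exposure
```

Full burden-of-disease calculations may add age weighting and time discounting, which is part of the standardization the article calls for.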

  18. Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment

    NASA Astrophysics Data System (ADS)

    David, S.; Visvikis, D.; Roux, C.; Hatt, M.

    2011-09-01

    In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis for merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets the adaptive threshold applied independently to both images led to higher errors than the ASEM fusion, and on clinical datasets it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation, leading to more pertinent measurements. Future work will consist in extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on the biological tumor volume definition for radiotherapy applications.

  19. Estimation of undiscovered deposits in quantitative mineral resource assessments-examples from Venezuela and Puerto Rico

    USGS Publications Warehouse

    Cox, D.P.

    1993-01-01

    Quantitative mineral resource assessments used by the United States Geological Survey are based on deposit models. These assessments consist of three parts: (1) selecting appropriate deposit models and delineating on maps areas permissive for each type of deposit; (2) constructing a grade-tonnage model for each deposit model; and (3) estimating the number of undiscovered deposits of each type. In this article, I focus on the estimation of undiscovered deposits using two methods: the deposit density method and the target counting method. In the deposit density method, estimates are made by analogy with well-explored areas that are geologically similar to the study area and that contain a known density of deposits per unit area. The deposit density method is useful for regions where there is little or no data. This method was used to estimate undiscovered low-sulfide gold-quartz vein deposits in Venezuela. Estimates can also be made by counting targets such as mineral occurrences, geophysical or geochemical anomalies, or exploration "plays" and by assigning to each target a probability that it represents an undiscovered deposit that is a member of the grade-tonnage distribution. This method is useful in areas where detailed geological, geophysical, geochemical, and mineral occurrence data exist. Using this method, porphyry copper-gold deposits were estimated in Puerto Rico. © 1993 Oxford University Press.
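The deposit density method reduces to multiplying a deposit density borrowed from a geologically similar, well-explored region by the permissive area of the study region. A one-line sketch with invented numbers:

```python
# Deposit-density estimate: expected undiscovered deposits equal the
# analogue region's deposit density times the study region's permissive
# area. Both inputs below are invented for illustration.

def expected_deposits(density_per_km2, permissive_area_km2):
    return density_per_km2 * permissive_area_km2

n = expected_deposits(density_per_km2=0.002, permissive_area_km2=5000.0)
# 0.002 deposits/km^2 over 5000 km^2 of permissive tract -> about 10 deposits
```

In practice the estimate is usually reported as probabilities for several quantiles rather than a single expected count, but the density-times-area product is the core of the analogy.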

  20. Swept source optical coherence tomography for quantitative and qualitative assessment of dental composite restorations

    NASA Astrophysics Data System (ADS)

    Sadr, Alireza; Shimada, Yasushi; Mayoral, Juan Ricardo; Hariri, Ilnaz; Bakhsh, Turki A.; Sumi, Yasunori; Tagami, Junji

    2011-03-01

    The aim of this work was to explore the utility of swept-source optical coherence tomography (SS-OCT) for quantitative evaluation of dental composite restorations. The system (Santec, Japan) with a center wavelength of around 1300 nm and axial resolution of 12 μm was used to record data during and after placement of light-cured composites. The Fresnel phenomenon at the interfacial defects resulted in brighter areas indicating gaps as small as a few micrometers. The gap extension at the interface was quantified and compared to the observation by confocal laser scanning microscope after trimming the specimen to the same cross-section. Also, video imaging of the composite during polymerization could provide information about real-time kinetics of contraction stress and resulting gaps, distinguishing them from those gaps resulting from poor adaptation of composite to the cavity prior to polymerization. Some samples were also subjected to a high resolution microfocus X-ray computed tomography (μCT) assessment; it was found that differentiation of smaller gaps from the radiolucent bonding layer was difficult with 3D μCT. Finally, a clinical imaging example using a newly developed dental SS-OCT system with an intra-oral scanning probe (Panasonic Healthcare, Japan) is presented. SS-OCT is a unique tool for clinical assessment and laboratory research on resin-based dental restorations. Supported by GCOE at TMDU and NCGG.

  1. Large-Scale Quantitative Assessment of Binding Preferences in Protein-Nucleic Acid Complexes.

    PubMed

    Jakubec, Dávid; Hostas, Jirí; Laskowski, Roman A; Hobza, Pavel; Vondrásek, Jirí

    2015-04-14

    The growing number of high-quality experimental (X-ray, NMR) structures of protein–DNA complexes provides sufficient information to assess whether universal rules governing the DNA sequence recognition process apply. While previous studies have investigated the relative abundance of various modes of amino acid–base contacts (van der Waals contacts, hydrogen bonds), relatively little is known about the energetics of these noncovalent interactions. In the present study, we have performed the first large-scale quantitative assessment of binding preferences in protein–DNA complexes by calculating the interaction energies in all 80 possible amino acid–DNA base combinations. We found that several mutual amino acid–base orientations featuring bidentate hydrogen bonds capable of unambiguous one-to-one recognition correspond to unique minima in the potential energy space of the amino acid–base pairs. A clustering algorithm revealed that these contacts form a spatially well-defined group offering relatively little conformational freedom. Various molecular mechanics force field and DFT-D ab initio calculations were performed, yielding similar results. PMID:26894243

  2. Tracking Epidermal Nerve Fiber Changes in Asian Macaques: Tools and Techniques for Quantitative Assessment.

    PubMed

    Mangus, Lisa M; Dorsey, Jamie L; Weinberg, Rachel L; Ebenezer, Gigi J; Hauer, Peter; Laast, Victoria A; Mankowski, Joseph L

    2016-08-01

    Quantitative assessment of epidermal nerve fibers (ENFs) has become a widely used clinical tool for the diagnosis of small fiber neuropathies such as diabetic neuropathy and human immunodeficiency virus-associated sensory neuropathy (HIV-SN). To model and investigate the pathogenesis of HIV-SN using simian immunodeficiency virus (SIV)-infected Asian macaques, we adapted the skin biopsy and immunostaining techniques currently employed in human patients and then developed two unbiased image analysis techniques for quantifying ENF in macaque footpad skin. This report provides detailed descriptions of these tools and techniques for ENF assessment in macaques and outlines important experimental considerations that we have identified in the course of our long-term studies. Although initially developed for studies of HIV-SN in the SIV-infected macaque model, these methods could be readily translated to a range of studies involving peripheral nerve degeneration and neurotoxicity in nonhuman primates as well as preclinical investigations of agents aimed at neuroprotection and regeneration. PMID:27235324

  3. Quantitative microbial risk assessment of human illness from exposure to marine beach sand.

    PubMed

    Shibata, Tomoyuki; Solo-Gabriele, Helena M

    2012-03-01

    Currently no U.S. federal guideline is available for assessing risk of illness from sand at recreational sites. The objectives of this study were to compute a reference level guideline for pathogens in beach sand and to compare these reference levels with measurements from a beach impacted by nonpoint sources of contamination. Reference levels were computed using quantitative microbial risk assessment (QMRA) coupled with Monte Carlo simulations. In order to reach an equivalent level of risk of illness as set by the U.S. EPA for marine water exposure (1.9 × 10⁻²), levels would need to be at least about 10 oocysts/g (about 1 oocyst/g for a pica child) for Cryptosporidium, about 5 MPN/g (about 1 MPN/g for pica) for enterovirus, and less than 10⁶ CFU/g for S. aureus. Pathogen levels measured in sand at a nonpoint source recreational beach were lower than the reference levels. More research is needed in evaluating risk from yeast and helminth exposures as well as in identifying acceptable levels of risk for skin infections associated with sand exposures. PMID:22296573
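    A reference level of the kind described above can be computed by inverting a dose-response model under Monte Carlo uncertainty in sand intake. The sketch below assumes an exponential dose-response form and uses placeholder parameter values (the dose-response coefficient `r` and the lognormal intake distribution are illustrative assumptions, not the study's fitted inputs).

```python
import numpy as np

rng = np.random.default_rng(0)

def reference_level(target_risk, r, intake_g):
    """Bisect for the sand concentration C (organisms/g) whose mean
    Monte Carlo risk, risk = 1 - exp(-r * C * m) over simulated
    per-event sand intakes m (grams), equals target_risk."""
    lo, hi = 1e-6, 1e6
    for _ in range(100):
        mid = (lo * hi) ** 0.5                               # geometric bisection
        risk = np.mean(1.0 - np.exp(-r * mid * intake_g))
        if risk < target_risk:
            lo = mid
        else:
            hi = mid
    return (lo * hi) ** 0.5

# illustrative inputs: exponential r and a lognormal sand-ingestion distribution
intake = rng.lognormal(mean=np.log(0.05), sigma=0.5, size=50_000)  # g per event
C_ref = reference_level(1.9e-2, r=4.2e-3, intake_g=intake)         # organisms/g
print(C_ref)
```

    Because the risk target and intake distribution drive the answer directly, the pica-child case follows from the same routine with a larger intake.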

  4. Quantitative assessment of human and pet exposure to Salmonella associated with dry pet foods.

    PubMed

    Lambertini, Elisabetta; Buchanan, Robert L; Narrod, Clare; Ford, Randall M; Baker, Robert C; Pradhan, Abani K

    2016-01-01

    Recent Salmonella outbreaks associated with dry pet foods and treats highlight the importance of these foods as previously overlooked exposure vehicles for both pets and humans. In the last decade efforts have been made to raise the safety of this class of products, for instance by upgrading production equipment, cleaning protocols, and finished product testing. However, no comprehensive or quantitative risk profile is available for pet foods, thus limiting the ability to establish safety standards and assess the effectiveness of current and proposed Salmonella control measures. This study sought to develop an ingredients-to-consumer quantitative microbial exposure assessment model to: 1) estimate pet and human exposure to Salmonella via dry pet food, and 2) assess the impact of industry and household-level mitigation strategies on exposure. Data on prevalence and concentration of Salmonella in pet food ingredients, production process parameters, bacterial ecology, and contact transfer in the household were obtained through literature review, industry data, and targeted research. A probabilistic Monte Carlo modeling framework was developed to simulate the production process and basic household exposure routes. Under the range of assumptions adopted in this model, human exposure due to handling pet food is null to minimal if contamination occurs exclusively before extrusion. Exposure increases considerably if recontamination occurs post-extrusion during coating with fat, although mean ingested doses remain modest even at high fat contamination levels, due to the low percent of fat in the finished product. Exposure is highly variable, with the distribution of doses ingested by adult pet owners spanning 3 log CFU per exposure event. Child exposure due to ingestion of 1 g of pet food leads to significantly higher doses than adult doses associated with handling the food. Recontamination after extrusion and coating, e.g., via dust or equipment surfaces, may also lead to
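    The post-extrusion recontamination pathway described above can be sketched as a short Monte Carlo exposure chain. Every distribution and constant below is an illustrative placeholder (fat contamination level, fat fraction, handling mass, transfer fraction), not the model's fitted inputs; the sketch only reproduces the qualitative structure of the exposure routes.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# illustrative placeholder distributions, not the study's parameters
log10_cfu_g_fat = rng.normal(1.0, 0.5, N)    # Salmonella in coating fat (log10 CFU/g)
fat_fraction = 0.06                           # assumed fat share of finished kibble
g_handled = rng.uniform(0.1, 1.0, N)          # grams of food contacting hands
transfer = rng.uniform(1e-4, 1e-2, N)         # hand-to-mouth transfer fraction

conc_food = (10.0 ** log10_cfu_g_fat) * fat_fraction  # CFU/g in finished product
dose_adult = conc_food * g_handled * transfer         # CFU ingested via handling
dose_child = conc_food * 1.0                          # CFU from a child eating 1 g

print(dose_child.mean() > dose_adult.mean())  # → True
```

    The dilution by `fat_fraction` is why mean doses stay modest even for heavily contaminated fat, while the direct-ingestion child route bypasses the small hand-to-mouth transfer fraction and so dominates.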

  5. Quantitative assessment of in vivo breast masses using ultrasound attenuation and backscatter.

    PubMed

    Nam, Kibo; Zagzebski, James A; Hall, Timothy J

    2013-04-01

    Clinical analysis of breast ultrasound imaging is done qualitatively, facilitated with the ultrasound breast imaging-reporting and data system (US BI-RADS) lexicon, which helps to standardize imaging assessments. Two descriptors in that lexicon, "posterior acoustic features" and the "echo pattern" within a mass, are directly related to quantitative ultrasound (QUS) parameters, namely, ultrasound attenuation and the average backscatter coefficient (BSC). The purpose of this study was to quantify ultrasound attenuation and backscatter in breast masses and to investigate these QUS properties as potential differential diagnostic markers. Radio frequency (RF) echo signals were acquired from patients with breast masses during a special ultrasound imaging session prior to core biopsy. Data were also obtained from a well-characterized phantom using identical system settings. Masses included 14 fibroadenomas and 10 carcinomas. Attenuation for the acoustic path lying proximal to the tumor was estimated offline using a least squares method with constraints. BSCs were estimated using a reference phantom method (RPM). The attenuation coefficient within each mass was assessed using both the RPM and a hybrid method, and effective scatterer diameters (ESDs) were estimated using a Gaussian form factor model. Attenuation estimates obtained with the RPM were consistent with estimates done using the hybrid method in all cases except for two masses. The mean slope of the attenuation coefficient versus frequency for carcinomas was 20% greater than the mean slope value for the fibroadenomas. The product of the attenuation coefficient and anteroposterior dimension of the mass was computed to estimate the total attenuation for each mass. That value correlated well with the BI-RADS assessment of "posterior acoustic features" judged qualitatively from gray scale images. Nearly all masses were described as "hypoechoic," so no strong statements could be made about the correlation of echo pattern
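    One common formulation of the reference phantom idea used above estimates the attenuation slope from the depth-dependence of the sample/reference log power-spectral ratio. The sketch below is an assumption-laden illustration, not the paper's constrained least-squares estimator: it assumes the usual round-trip attenuation factor of 4 and fits noise-free synthetic data.

```python
import numpy as np

def attenuation_slope(freq_mhz, depth_cm, logratio_db, alpha_ref):
    """Estimate a sample's attenuation coefficient slope (dB/cm/MHz),
    reference-phantom style: the depth-slope of the sample/reference
    log power-spectral ratio at each frequency f is modeled as
    -4*(alpha_sample - alpha_ref)*f (round-trip factor of 4 assumed).
    logratio_db[i, j] is the ratio in dB at freq_mhz[i], depth_cm[j]."""
    dslope = np.polyfit(depth_cm, logratio_db.T, 1)[0]       # dB/cm at each f
    # least-squares line through the origin: dslope = -4*(a_s - a_ref)*f
    delta = -np.sum(dslope * freq_mhz) / (4.0 * np.sum(freq_mhz ** 2))
    return alpha_ref + delta

# synthetic check: a 0.8 dB/cm/MHz sample against a 0.5 dB/cm/MHz phantom
f = np.array([3.0, 4.0, 5.0, 6.0, 7.0])          # MHz
z = np.linspace(1.0, 4.0, 7)                     # cm
lr = -4.0 * (0.8 - 0.5) * np.outer(f, z)         # ideal noise-free log ratio (dB)
print(round(attenuation_slope(f, z, lr, alpha_ref=0.5), 3))  # → 0.8
```

    Real RF data would add spectral noise and diffraction effects, which is what motivates the constrained fitting the authors actually used.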

  6. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
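    The diagnostic-performance measures reported above all follow from one 2x2 contingency table. A minimal sketch, using toy counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-performance measures from a 2x2 table:
    tp/fp/fn/tn are counts of true/false positives/negatives."""
    return {
        "sensitivity": tp / (tp + fn),            # true-positive rate
        "specificity": tn / (tn + fp),            # true-negative rate
        "ppv": tp / (tp + fp),                    # positive predictive value
        "npv": tn / (tn + fn),                    # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# toy counts, not the study's data
m = diagnostic_metrics(tp=80, fp=30, fn=20, tn=70)
print(m["sensitivity"], m["specificity"], m["accuracy"])  # → 0.8 0.7 0.75
```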

  7. Groundwater availability in the United States: the value of quantitative regional assessments

    USGS Publications Warehouse

    Dennehy, Kevin F.; Reilly, Thomas E.; Cunningham, William L.

    2015-01-01

    The sustainability of water resources is under continued threat from the challenges associated with a growing population, competing demands, and a changing climate. Freshwater scarcity has become a fact in many areas. Much of the United States' surface-water supply is fully apportioned for use; thus, in some areas the only potential alternative freshwater source that can provide needed quantities is groundwater. Although frequently overlooked, groundwater serves as the principal reserve of freshwater in the US and represents much of the potential supply during periods of drought. Some nations have requirements to monitor and characterize the availability of groundwater, such as the European Union's Water Framework Directive (EPCEU 2000). In the US there is no such national requirement. Quantitative regional groundwater availability assessments, however, are essential to document the status and trends of groundwater availability for the US and make informed water-resource decisions possible now and in the future. Barthel (2014) highlighted that the value of regional groundwater assessments goes well beyond just quantifying the resource so that it can be better managed. The tools and techniques required to evaluate these unique regional systems advance the science of hydrogeology and provide enhanced methods that can benefit local-scale groundwater investigations. In addition, a significant yet under-utilized benefit is the digital spatial and temporal data sets routinely generated as part of these studies. Even though there is no legal or regulatory requirement for regional groundwater assessments in the US, there is a logical basis for their implementation. The purpose of this essay is to articulate the rationale for and reaffirm the value of regional groundwater assessments primarily in the US; however, the arguments hold for all nations. The importance of the data sets and the methods and model development that occur as part of these assessments is stressed

  8. Assessing the quality of conformal treatment planning: a new tool for quantitative comparison.

    PubMed

    Menhel, J; Levin, D; Alezra, D; Symon, Z; Pfeffer, R

    2006-10-21

    We develop a novel radiotherapy plan comparison index, critical organ scoring index (COSI), which is a measure of both target coverage and critical organ overdose. COSI is defined as COSI = 1 − (V_OAR>tol / TC), where V_OAR>tol is the fraction of the organ-at-risk volume receiving more than the tolerance dose, and TC is the target coverage, TC = V_T,PI / V_T, where V_T,PI is the target volume receiving at least the prescription dose and V_T is the total target volume. COSI approaches unity when the critical structure is completely spared and the target coverage is unity. We propose a two-dimensional, graphical representation of COSI versus conformity index (CI), where CI is a measure of normal tissue overdose. We show that this 2D representation is a reliable, visual quantitative tool for evaluating competing plans. We generate COSI-CI plots for three sites: head and neck, cavernous sinus, and pancreas, and evaluate competing non-coplanar 3D and IMRT treatment plans. For all three sites this novel 2D representation assisted the physician in choosing the optimal plan, both in terms of target coverage and in terms of critical organ sparing. We verified each choice by analysing individual DVHs and isodose lines. Comparing our results to the widely used conformation number, we found that in all cases where there were discrepancies in the choice of the best treatment plan, the COSI-CI choice was considered the correct one, in several cases indicating that a non-coplanar 3D plan was superior to the IMRT plans. The choice of plan was quick, simple and accurate using the new graphical representation. PMID:17019044
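    The two index definitions above can be computed directly from per-voxel dose arrays. This is a sketch under stated assumptions: equal voxel volumes, toy dose values, and CI taken as V_T,PI / V_PI (target volume at prescription dose over total volume at prescription dose), a common definition that the abstract does not itself spell out.

```python
import numpy as np

def cosi_ci(target_dose, oar_dose, body_dose, d_presc, d_tol):
    """COSI = 1 - V_OAR>tol / TC, with TC = V_T,PI / V_T, from per-voxel
    dose arrays of equal voxel volume. CI is computed as V_T,PI / V_PI
    (an assumed common definition of the conformity index)."""
    tc = np.mean(target_dose >= d_presc)          # target coverage
    v_oar_over = np.mean(oar_dose > d_tol)        # OAR fraction above tolerance
    cosi = 1.0 - v_oar_over / tc
    ci = np.sum(target_dose >= d_presc) / np.sum(body_dose >= d_presc)
    return cosi, ci

target = np.array([60.0, 60.0, 58.0, 60.0])       # Gy, toy target voxels
oar = np.array([30.0] * 9 + [50.0])               # one OAR voxel over tolerance
body = np.concatenate([target, oar])              # all voxels in this toy case
cosi, ci = cosi_ci(target, oar, body, d_presc=59.0, d_tol=45.0)
print(round(cosi, 3), ci)  # → 0.867 1.0
```

    A competing plan is then a second (cosi, ci) point on the same 2D plot, which is what makes the comparison visual.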

  9. Quantitative MRI of the spinal cord and brain in adrenomyeloneuropathy: in vivo assessment of structural changes.

    PubMed

    Castellano, Antonella; Papinutto, Nico; Cadioli, Marcello; Brugnara, Gianluca; Iadanza, Antonella; Scigliuolo, Graziana; Pareyson, Davide; Uziel, Graziella; Köhler, Wolfgang; Aubourg, Patrick; Falini, Andrea; Henry, Roland G; Politi, Letterio S; Salsano, Ettore

    2016-06-01

    Adrenomyeloneuropathy is the late-onset form of X-linked adrenoleukodystrophy, and is considered the most frequent metabolic hereditary spastic paraplegia. In adrenomyeloneuropathy the spinal cord is the main site of pathology. In contrast to quantitative magnetic resonance imaging of the brain, little is known about the feasibility and utility of advanced neuroimaging in quantifying the spinal cord abnormalities in hereditary diseases. Moreover, little is known about the subtle pathological changes that can characterize the brain of adrenomyeloneuropathy subjects in the early stages of the disease. We performed a cross-sectional study on 13 patients with adrenomyeloneuropathy and 12 age-matched healthy control subjects who underwent quantitative magnetic resonance imaging to assess the structural changes of the upper spinal cord and brain. Total cord areas from C2-3 to T2-3 level were measured, and diffusion tensor imaging metrics, i.e. fractional anisotropy, mean, axial and radial diffusivity values, were calculated in both grey and white matter of the spinal cord. In the brain, grey matter regions were parcellated with Freesurfer and average volume and thickness, and mean diffusivity and fractional anisotropy from co-registered diffusion maps were calculated in each region. Brain white matter diffusion tensor imaging metrics were assessed using whole-brain tract-based spatial statistics, and tractography-based analysis on corticospinal tracts. Correlations among clinical, structural and diffusion tensor imaging measures were calculated. In patients total cord area was reduced by 26.3% to 40.2% at all tested levels (P < 0.0001). A mean 16% reduction of spinal cord white matter fractional anisotropy (P ≤ 0.0003) with a concomitant 9.7% axial diffusivity reduction (P < 0.009) and 34.5% radial diffusivity increase (P < 0.009) was observed, suggesting the co-presence of axonal degeneration and demyelination. Brain tract-based spatial statistics showed a marked reduction

  10. Comparison of Methodologies to Detect Low Levels of Hemolysis in Serum for Accurate Assessment of Serum microRNAs

    PubMed Central

    Shah, Jaynish S.; Soon, Patsy S.; Marsh, Deborah J.

    2016-01-01

    microRNAs have emerged as powerful regulators of many biological processes, and their expression in many cancer tissues has been shown to correlate with clinical parameters such as cancer type and prognosis. Present in a variety of biological fluids, microRNAs have been described as a ‘gold mine’ of potential noninvasive biomarkers. Release of microRNA content of blood cells upon hemolysis dramatically alters the microRNA profile in blood, potentially affecting levels of a significant number of proposed biomarker microRNAs and, consequently, accuracy of serum or plasma-based tests. Several methods to detect low levels of hemolysis have been proposed; however, a direct comparison assessing their sensitivities is currently lacking. In this study, we evaluated the sensitivities of four methods to detect hemolysis in serum (listed in the order of sensitivity): measurement of hemoglobin using a Coulter® AcT diff™ Analyzer, visual inspection, the absorbance of hemoglobin measured by spectrophotometry at 414 nm and the ratio of red blood cell-enriched miR-451a to the reference microRNA miR-23a-3p. The miR ratio detected hemolysis down to approximately 0.001%, whereas the Coulter® AcT diff™ Analyzer was unable to detect hemolysis lower than 1%. The spectrophotometric method could detect down to 0.004% hemolysis, and correlated with the miR ratio. Analysis of hemolysis in a cohort of 86 serum samples from cancer patients and healthy controls showed that 31 of 86 (36%) were predicted by the miR ratio to be hemolyzed, whereas only 8 of these samples (9%) showed visible pink discoloration. Using receiver operator characteristic (ROC) analyses, we identified absorbance cutoffs of 0.072 and 0.3 that could identify samples with low and high levels of hemolysis, respectively. Overall, this study will assist researchers in the selection of appropriate methodologies to test for hemolysis in serum samples prior to quantifying expression of microRNAs. PMID:27054342
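    The absorbance cutoffs identified above (0.072 for low and 0.3 for high hemolysis) combine naturally with the miR-ratio screen into a simple grading rule. In the sketch below, the miR ratio is expressed as a delta-Cq (Cq of miR-23a-3p minus Cq of miR-451a) and the >5-cycle flag is a commonly used convention assumed here, not a cutoff taken from this paper.

```python
def hemolysis_grade(abs414, delta_cq=None):
    """Grade serum hemolysis with the A414 absorbance cutoffs reported
    in the study (0.072 = low, 0.3 = high). Optionally flag hemolysis
    via the miR ratio as delta-Cq = Cq(miR-23a-3p) - Cq(miR-451a);
    the >5-cycle threshold is an assumed common convention."""
    if abs414 >= 0.3:
        grade = "high"
    elif abs414 >= 0.072:
        grade = "low"
    else:
        grade = "none"
    mir_flag = delta_cq is not None and delta_cq > 5.0
    return grade, mir_flag

print(hemolysis_grade(0.15, delta_cq=6.2))  # → ('low', True)
```

    Because the miR ratio is roughly a thousand-fold more sensitive than visual inspection, samples can be flagged by the ratio while showing no pink discoloration at all, as the 31-versus-8 sample counts above illustrate.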

  12. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  13. Quantitative assessment of inhalation exposure and deposited dose of aerosol from nanotechnology-based consumer sprays

    PubMed Central

    Nazarenko, Yevgen; Lioy, Paul J.; Mainelis, Gediminas

    2015-01-01

    This study provides a quantitative assessment of inhalation exposure and deposited aerosol dose in the 14 nm to 20 μm particle size range based on the aerosol measurements conducted during realistic usage simulation of five nanotechnology-based and five regular spray products matching the nano-products by purpose of application. The products were also examined using transmission electron microscopy. In seven out of ten sprays, the highest inhalation exposure was observed for the coarse (2.5–10 μm) particles while being minimal or below the detection limit for the remaining three sprays. Nanosized aerosol particles (14–100 nm) were released, which resulted in low but measurable inhalation exposures from all of the investigated consumer sprays. Eight out of ten products produced high total deposited aerosol doses on the order of 10¹–10³ ng kg⁻¹ bw per application, ~85–88% of which were in the head airways, only <10% in the alveolar region and <8% in the tracheobronchial region. One nano and one regular spray produced substantially lower total deposited doses (by 2–4 orders of magnitude less), only ~52–64% of which were in the head while ~29–40% in the alveolar region. The electron microscopy data showed nanosized objects in some products not labeled as nanotechnology-based and conversely did not find nano-objects in some nano-sprays. We found no correlation between nano-object presence and abundance as per the electron microscopy data and the determined inhalation exposures and deposited doses. The findings of this study and the reported quantitative exposure data will be valuable for the manufacturers of nanotechnology-based consumer sprays to minimize inhalation exposure from their products, as well as for the regulators focusing on protecting the public health. PMID:25621175
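    The regional dose bookkeeping above reduces to multiplying inhaled mass per size bin by a regional deposition fraction and summing. Every number in this sketch (the per-bin inhaled masses and the deposition-fraction matrix) is an illustrative assumption, not data from the study; the point is only the arithmetic by which coarse particles end up dominating the head-airway dose.

```python
import numpy as np

# inhaled aerosol mass per application, by size bin (ng) -- illustrative values
inhaled_ng = np.array([2.0, 15.0, 400.0])      # nano (14-100 nm), fine, coarse
# assumed regional deposition fractions per bin: [head, tracheobronchial, alveolar]
df = np.array([[0.30, 0.20, 0.35],
               [0.20, 0.08, 0.25],
               [0.90, 0.05, 0.02]])
bw_kg = 70.0                                    # assumed adult body weight

deposited = inhaled_ng[:, None] * df            # ng deposited, per bin and region
per_region = deposited.sum(axis=0) / bw_kg      # ng/kg bw: head, TB, alveolar
frac = per_region / per_region.sum()            # regional share of the total dose
print([round(float(v), 2) for v in frac])       # → [0.91, 0.05, 0.03]
```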

  14. A quantitative assessment of the eating capability in the elderly individuals.

    PubMed

    Laguna, Laura; Sarkar, Anwesha; Artigas, Gràcia; Chen, Jianshe

    2015-08-01

    The ageing process brings physiologically weakened muscles, loss of natural teeth and reduced movement coordination, causing difficulties in the eating process. The term "eating capability" has been proposed to measure objectively how capable an elderly individual is in overall food management. Our objectives were to establish feasible methodologies of eating capability assessment, examine correlations between hand and oro-facial muscle strengths, and grade elderly subjects into groups based on their eating capabilities. This study was performed with 203 elderly subjects living in the UK (n=103; 7 community centres, 2 sheltered accommodations) and Spain (n=100; 3 nursing homes, 1 community centre). Hand gripping force, finger gripping force, biting force, lip sealing pressure, tongue pressing pressure and touching sensitivity were measured for elderly subjects. Measured parameters were normalised and scored between 1 and 5, with 1 being the weakest. Subjects were then grouped into 4 clusters based on their eating capability scores, with cluster 1 being the weakest group and cluster 4 the strongest. Perception of oral processing difficulty was assessed by showing food images. Hand gripping force showed a strong linear correlation with tongue pressure (UK: 0.35; Spain: 0.326) and biting force (UK: 0.351; Spain: 0.427). Biting force was strongly dependent on the denture status. Elderly subjects in the first three clusters perceived food products with more hardness and/or fibrous structure as difficult to process orally. The objective measurements of various physiological factors enabled quantitative characterisation of the eating capabilities of elderly people. The observed relationship between hand and oro-facial muscle strengths provides the possibility of using non-invasive hand gripping force measurement for eating capability assessment. PMID:25936821
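    The "normalised and scored between 1 and 5" step above can be sketched as a min-max rescaling of each measured parameter. The abstract does not specify the exact normalisation, so the scheme below (linear min-max to the 1-5 range, then a per-subject average) is an assumed, plausible implementation.

```python
import numpy as np

def capability_scores(x):
    """Min-max scale each measured parameter (columns: grip force,
    biting force, ...) to a 1-5 score, 1 = weakest, then average across
    parameters to one eating-capability score per subject (rows).
    Assumes every column has some spread (max > min)."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(axis=0), x.max(axis=0)
    scaled = 1.0 + 4.0 * (x - lo) / (hi - lo)
    return scaled.mean(axis=1)

# three toy subjects measured on two parameters
print(capability_scores([[0, 0], [10, 10], [5, 0]]))  # → [1. 5. 2.]
```

    The resulting per-subject scores are what a clustering step would then partition into the four capability groups.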

  15. Quick, non-invasive and quantitative assessment of small fiber neuropathy in patients receiving chemotherapy.

    PubMed

    Saad, Mehdi; Psimaras, Dimitri; Tafani, Camille; Sallansonnet-Froment, Magali; Calvet, Jean-Henri; Vilier, Alice; Tigaud, Jean-Marie; Bompaire, Flavie; Lebouteux, Marie; de Greslan, Thierry; Ceccaldi, Bernard; Poirier, Jean-Michel; Ferrand, François-Régis; Le Moulec, Sylvestre; Huillard, Olivier; Goldwasser, François; Taillia, Hervé; Maisonobe, Thierry; Ricard, Damien

    2016-04-01

    Chemotherapy-induced peripheral neurotoxicity (CIPN) is a common, potentially severe and dose-limiting adverse effect; however, it is poorly investigated at an early stage due to the lack of a simple assessment tool. As sweat glands are innervated by small autonomic C-fibers, sudomotor function testing has been suggested for early screening of peripheral neuropathy. This study aimed to evaluate Sudoscan, a non-invasive and quantitative method to assess sudomotor function, in the detection and follow-up of CIPN. Eighty-eight patients receiving at least two infusions of Oxaliplatin only (45.4%), Paclitaxel only (14.8%), another drug only (28.4%) or two drugs (11.4%) were enrolled in the study. At each chemotherapy infusion the accumulated dose of chemotherapy was calculated and the Total Neuropathy Score clinical version (TNSc) was carried out. Small fiber neuropathy was assessed using Sudoscan (a 3-min test). The device measures the Electrochemical Skin Conductance (ESC) of the hands and feet expressed in microSiemens (µS). For patients receiving Oxaliplatin mean hands ESC changed from 73 ± 2 to 63 ± 2 and feet ESC from 77 ± 2 to 66 ± 3 µS (p < 0.001) while TNSc changed from 2.9 ± 0.5 to 4.3 ± 0.4. Similar results were observed in patients receiving Paclitaxel or another neurotoxic chemotherapy. During the follow-up, ESC values of both hands and feet with a corresponding TNSc < 2 were 70 ± 2 and 73 ± 2 µS respectively while they were 59 ± 1.4 and 64 ± 1.5 µS with a corresponding TNSc ≥ 6 (p < 0.0001 and p = 0.0003 respectively). This preliminary study suggests that small fiber neuropathy could be screened and followed using Sudoscan in patients receiving chemotherapy. PMID:26749101

  16. A Rapid Murine Coma and Behavior Scale for Quantitative Assessment of Murine Cerebral Malaria

    PubMed Central

    Carroll, Ryan W.; Wainwright, Mark S.; Kim, Kwang-Youn; Kidambi, Trilokesh; Gómez, Noé D.; Taylor, Terrie; Haldar, Kasturi

    2010-01-01

    Background Cerebral malaria (CM) is a neurological syndrome that includes coma and seizures following malaria parasite infection. The pathophysiology is not fully understood and cannot be accounted for by infection alone: patients still succumb to CM, even if the underlying parasite infection has resolved. Indeed, there is no known adjuvant therapy for CM. Current murine CM (MCM) models do not allow for rapid clinical identification of affected animals following infection. An animal model that more closely mimics the clinical features of human CM would be helpful in elucidating potential mechanisms of disease pathogenesis and evaluating new adjuvant therapies. Methodology/Principal Findings A quantitative, rapid murine coma and behavior scale (RMCBS) comprised of 10 parameters was developed to assess MCM manifested in C57BL/6 mice infected with Plasmodium berghei ANKA (PbA). Using this method a single mouse can be completely assessed within 3 minutes. The RMCBS enables the operator to follow the evolution of the clinical syndrome, validated here by correlations with intracerebral hemorrhages. It provides a tool by which subjects can be identified as symptomatic prior to the initiation of trial treatment. Conclusions/Significance Since the RMCBS enables an operator to rapidly follow the course of disease, label a subject as affected or not, and correlate the level of illness with neuropathologic injury, it can ultimately be used to guide the initiation of treatment after the onset of cerebral disease (thus emulating the situation in the field). The RMCBS is a tool by which an adjuvant therapy can be objectively assessed. PMID:20957049

  17. Quantitative 13C NMR of whole and fractionated Iowa Mollisols for assessment of organic matter composition

    NASA Astrophysics Data System (ADS)

    Fang, Xiaowen; Chua, Teresita; Schmidt-Rohr, Klaus; Thompson, Michael L.

    2010-01-01

    Both the concentrations and the stocks of soil organic carbon vary across the landscape. Do the amounts of recalcitrant components of soil organic matter (SOM) vary with landscape position? To address this question, we studied four Mollisols in central Iowa, two developed in till and two developed in loess. Two of the soils were well drained and two were poorly drained. We collected surface-horizon samples and studied organic matter in the particulate organic matter (POM) fraction, the clay fractions, and the whole, unfractionated samples. We treated the soil samples with 5 M HF at ambient temperature or at 60 °C for 30 min to concentrate the SOM. To assess the composition of the SOM, we used solid-state nuclear magnetic resonance (NMR) spectroscopy, in particular, quantitative 13C DP/MAS (direct-polarization/magic-angle spinning), with and without recoupled dipolar dephasing. Spin counting by correlation of the integral NMR intensity with the C concentration by elemental analysis showed that NMR was ≥85% quantitative for the majority of the samples studied. For untreated whole-soil samples with <2.5 wt.% C, which is considerably less than in most previous quantitative NMR analyses of SOM, useful spectra that reflected ≥65% of all C were obtained. The NMR analyses allowed us to conclude (1) that the HF treatment (with or without heat) had low impact on the organic C composition in the samples, except for protonating carboxylate anions to carboxylic acids, (2) that most organic C was observable by NMR even in untreated soil materials, (3) that esters were likely to compose only a minor fraction of SOM in these Mollisols, and (4) that the aromatic components of SOM were enriched to ~53% in the poorly drained soils, compared with ~48% in the well drained soils; in plant tissue and particulate organic matter (POM) the aromaticities were ~18% and ~32%, respectively. Nonpolar, nonprotonated aromatic C, interpreted as a proxy for charcoal C, dominated the

  18. Continuous monitoring as a tool for more accurate assessment of remaining lifetime for rotors and casings of steam turbines in service

    SciTech Connect

    Leyzerovich, A.; Berlyand, V.; Pozhidaev, A.; Yatskevich, S.

    1998-12-31

    The continuous monitoring of steam parameters and metal temperatures allows assessing the individual remaining lifetime for major high-temperature design components of steam turbines in service more accurately. Characteristic metal temperature differences and corresponding maximum thermal stresses and strains are calculated on-line to estimate the metal fatigue damage accumulated during the operation process. This can be one of the diagnostic functions of the power unit's computerized Data Acquisition System (DAS) or special Subsystem of Diagnostic monitoring (SDM) for the turbine. In doing so, the remaining lifetime is assessed in terms of actual operating conditions and operation quality for the individual unit, and the problem of lifetime extension for each object is solved more accurately. Such an approach is considered as applied to a specific case of the supercritical-pressure steam turbine of 300-MW output. The applied mathematical models were developed on the basis of combined experimentation (field) and calculation investigations of the metal temperature and strain-stress fields in the high-temperature (HP and IP) rotors and casings under the most characteristic stationary and transient operating conditions. The monitoring results are used for revealing the operating conditions with the extreme thermal stresses and specific metal damage, as well as for making decisions about scheduling the turbine's overhauls and extending the turbine lifetime beyond the originally set limits.

  19. Rapid and accurate species and genomic species identification and exhaustive population diversity assessment of Agrobacterium spp. using recA-based PCR.

    PubMed

    Shams, M; Vial, L; Chapulliot, D; Nesme, X; Lavire, C

    2013-07-01

    Agrobacteria are common soil bacteria that interact with plants as commensals, plant growth promoting rhizobacteria or alternatively as pathogens. Indigenous agrobacterial populations are composites, generally with several species and/or genomic species and several strains per species. We thus developed a recA-based PCR approach to accurately identify and specifically detect agrobacteria at various taxonomic levels. Specific primers were designed for all species and/or genomic species of Agrobacterium presently known, including 11 genomic species of the Agrobacterium tumefaciens complex (G1-G9, G13 and G14, of which only G2, G4, G8 and G14 have received a Latin epithet: pusense, radiobacter, fabrum and nepotum, respectively), A. larrymoorei, A. rubi, R. skierniewicense, A. sp. 1650, and A. vitis, and for the close relative Allorhizobium undicola. Specific primers were also designed for higher taxa, Agrobacterium spp. and Rhizobiaceae. Primer specificities were assessed with target and non-target pure culture DNAs as well as with DNAs extracted from composite agrobacterial communities. In addition, we showed that the amplicon cloning-sequencing approach used with Agrobacterium-specific or Rhizobiaceae-specific primers is a way to assess the agrobacterial diversity of an indigenous agrobacterial population. Hence, the Agrobacterium-specific primers designed in the present study enabled the first accurate and rapid identification of all species and/or genomic species of Agrobacterium, as well as their direct detection in environmental samples. PMID:23578959

  20. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI.

    PubMed

    Magnoni, Sandra; Mac Donald, Christine L; Esparza, Thomas J; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z; Shimony, Joshua S; Triulzi, Fabio; Stocchetti, Nino; Brody, David L

    2015-08-01

    regions. We interpret this result to mean that both microdialysis and diffusion tensor magnetic resonance imaging accurately reflect the same pathophysiological process: traumatic axonal injury. This cross-validation increases confidence in both methods for the clinical assessment of axonal injury. However, neither microdialysis nor diffusion tensor magnetic resonance imaging has been validated versus post-mortem histology in humans. Furthermore, future work will be required to determine the prognostic significance of these assessments of traumatic axonal injury when combined with other clinical and radiological measures. PMID:26084657

  1. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI

    PubMed Central

    Magnoni, Sandra; Mac Donald, Christine L.; Esparza, Thomas J.; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z.; Shimony, Joshua S.; Triulzi, Fabio; Stocchetti, Nino

    2015-01-01

    white matter regions. We interpret this result to mean that both microdialysis and diffusion tensor magnetic resonance imaging accurately reflect the same pathophysiological process: traumatic axonal injury. This cross-validation increases confidence in both methods for the clinical assessment of axonal injury. However, neither microdialysis nor diffusion tensor magnetic resonance imaging has been validated versus post-mortem histology in humans. Furthermore, future work will be required to determine the prognostic significance of these assessments of traumatic axonal injury when combined with other clinical and radiological measures. PMID:26084657

  2. High-Resolution Micro-CT for Morphologic and Quantitative Assessment of the Sinusoid in Human Cavernous Hemangioma of the Liver

    PubMed Central

    Duan, Jinghao; Hu, Chunhong; Chen, Hua

    2013-01-01

    Hepatic sinusoid plays a vital role in human cavernous hemangioma of the liver (CHL), and its morphologic investigation facilitates the understanding of microcirculation mechanism and pathological change of CHL. However, precise anatomical view of the hepatic sinusoid has been limited by the resolution and contrast available from existing imaging techniques. While liver biopsy has traditionally been the reliable method for the assessment of hepatic sinusoids, the invasiveness and sampling error are its inherent limitations. In this study, imaging of CHL samples was performed using in-line phase-contrast imaging (ILPCI) technique with synchrotron radiation. ILPCI allowed clear visualization of soft tissues and revealed structural details that were invisible to conventional radiography. Combined with computed tomography (CT), ILPCI was used to acquire high-resolution micro-CT images of CHL, and three dimensional (3D) microstructures of hepatic sinusoids were provided for the morphologic depiction and quantitative assessment. Our study demonstrated that ILPCI-CT could substantially improve the radiographic contrast of CHL tissues in vitro with no contrast agent. ILPCI-CT yielded high-resolution micro-CT images of CHL samples at the micron scale, corresponding to structures revealed in histological sections. The 3D visualization provided an excellent view of the hepatic sinusoid. An accurate view of individual hepatic sinusoids was achieved. The valuable morphological parameters of hepatic sinusoids, such as thrombi, diameters, surface areas and volumes, were measured. These parameters were of great importance in the evaluation of CHL, and they provided quantitative descriptors that characterized anatomical properties and pathological features of hepatic sinusoids. The results highlight the high degree of sensitivity of the ILPCI-CT technique and demonstrate the feasibility of accurate visualization of hepatic sinusoids. Moreover

  3. Assessment of phalangeal bone loss in patients with rheumatoid arthritis by quantitative ultrasound

    PubMed Central

    Roben, P; Barkmann, R; Ullrich, S; Gause, A; Heller, M; Gluer, C

    2001-01-01

    OBJECTIVE—Periarticular osteopenia is an early radiological sign of rheumatoid arthritis (RA). Quantitative ultrasound (QUS) devices have recently been shown to be useful for assessing osteoporosis. In this study the capability of a transportable and easy to use QUS device to detect skeletal impairment of the finger phalanges in patients with RA was investigated.
METHODS—In a cross sectional study 83 women (30 controls, 29 with glucocorticosteroid (GC) treated RA, and 24 with GC treated vasculitis) were examined. QUS measurements were obtained at the metaphyses of the proximal phalanges II-V and directly at the proximal interphalangeal joints II-IV with a DBM Sonic 1200 (IGEA, Italy) QUS device. Amplitude dependent speed of sound (AD-SoS) was evaluated. In 23 of the patients with RA, hand radiographs were evaluated.
RESULTS—Significant differences between patients with RA and the other groups were found for AD-SoS at both measurement sites. Compared with age matched controls, the AD-SoS of patients with RA was lowered by two and three standard deviations at the metaphysis and joint, respectively. Fingers of patients with RA without erosions (Larsen score 0-I) already had significantly decreased QUS values, which deteriorated further with the development of erosions (Larsen II-V).
CONCLUSION—This study indicates that QUS is sensitive to phalangeal periarticular bone loss in RA. QUS is a quick, simple, and inexpensive method free of ionising radiation that appears to be suited to detection of early stages of periarticular bone loss. Its clinical use in the assessment of early RA should be further evaluated in prospective studies.

 PMID:11406521
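
The standard-deviation comparisons reported above reduce to plain z-scores against age-matched controls. A minimal sketch; the AD-SoS values, control means and SDs below are purely hypothetical (the record does not give them), chosen only so the deviations match the roughly two and three standard deviations reported:

```python
def z_score(value, control_mean, control_sd):
    """Standardised deviation of a patient measurement from age-matched controls."""
    return (value - control_mean) / control_sd

# Hypothetical AD-SoS values (m/s): deviations of about two SD at the
# metaphysis and three SD at the joint, as reported for patients with RA.
metaphysis_z = z_score(1950.0, 2050.0, 50.0)
joint_z = z_score(1850.0, 2000.0, 50.0)
```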

  4. Predicting pathogen risks to aid beach management: the real value of quantitative microbial risk assessment (QMRA).

    PubMed

    Ashbolt, Nicholas J; Schoen, Mary E; Soller, Jeffrey A; Roser, David J

    2010-09-01

    There has been an ongoing dilemma for agencies that set criteria for safe recreational waters in how to provide for a seasonal assessment of a beach site versus guidance for day-to-day management. Typically an overall 'safe' criterion level is derived from epidemiologic studies of sewage-impacted beaches. The decision criterion is based on a percentile value for a single sample or a moving median of a limited number (e.g. five per month) of routine samples, which are reported at least the day after recreator exposure has occurred. The focus of this paper is how to better undertake day-to-day recreational site monitoring and management. Internationally, good examples exist where predictive empirical regression models (based on rainfall, wind speed/direction, etc.) may provide an estimate of the target faecal indicator density for the day of exposure. However, at recreational swimming sites largely impacted by non-sewage sources of faecal indicators, there is concern that the indicator-illness associations derived from studies at sewage-impacted beaches may be inappropriate. Furthermore, some recent epidemiologic evidence supports the relationship to gastrointestinal (GI) illness with qPCR-derived measures of Bacteroidales/Bacteroides spp. as well as more traditional faecal indicators, but we understand less about the environmental fate of these molecular targets and their relationship to bather risk. Modelling pathogens and indicators within a quantitative microbial risk assessment framework is suggested as a way to explore the large diversity of scenarios for faecal contamination and hydrologic events, such as from waterfowl, agricultural animals, resuspended sediments and from the bathers themselves. Examples are provided that suggest that more site-specific targets derived by QMRA could provide insight, directly translatable to management actions. PMID:20638095

  5. A quantitative integrated assessment of pollution prevention achieved by integrated pollution prevention control licensing.

    PubMed

    Styles, David; O'Brien, Kieran; Jones, Michael B

    2009-11-01

    This paper presents an innovative, quantitative assessment of pollution avoidance attributable to environmental regulation enforced through integrated licensing, using Ireland's pharmaceutical-manufacturing sector as a case study. Emissions data reported by pharmaceutical installations were aggregated into a pollution trend using an Environmental Emissions Index (EEI) based on Lifecycle Assessment methodologies. Complete sectoral emissions data from 2001 to 2007 were extrapolated back to 1995, based on available data. Production volume data were used to derive a sectoral production index, and determine 'no-improvement' emission trends, whilst questionnaire responses from 20 industry representatives were used to quantify the contribution of integrated licensing to emission avoidance relative to these trends. Between 2001 and 2007, there was a 40% absolute reduction in direct pollution from 27 core installations, and 45% pollution avoidance relative to hypothetical 'no-improvement' pollution. It was estimated that environmental regulation avoided 20% of 'no-improvement' pollution, in addition to 25% avoidance under business-as-usual. For specific emissions, avoidance ranged from 14% and 30 kt a^-1 for CO2 to 88% and 598 t a^-1 for SOx. Between 1995 and 2007, there was a 59% absolute reduction in direct pollution, and 76% pollution avoidance. Pollution avoidance was dominated by reductions in emissions of VOCs, SOx and NOx to air, and emissions of heavy metals to water. Pollution avoidance of 35% was attributed to integrated licensing, ranging from 8% and 2.9 t a^-1 for phosphorus emissions to water to 49% and 3143 t a^-1 for SOx emissions to air. Environmental regulation enforced through integrated licensing has been the major driver of substantial pollution avoidance achieved by Ireland's pharmaceutical sector - through emission limit values associated with Best Available Techniques, emissions monitoring and reporting requirements, and
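
The "avoidance relative to a no-improvement trend" used throughout this assessment is a simple fraction of the hypothetical trend. A sketch with illustrative index values, not the study's actual EEI data:

```python
def pollution_avoidance(no_improvement, actual):
    """Fraction of the hypothetical 'no-improvement' pollution that was avoided."""
    return (no_improvement - actual) / no_improvement

# An emissions index falling to 55 against a no-improvement trend of 100
# corresponds to the 45% sectoral avoidance reported for 2001-2007.
avoided = pollution_avoidance(100.0, 55.0)
```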

  6. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. 
Comparable benefits

  7. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    SciTech Connect

    Caschili, Simone; De Montis, Andrea; Ganciu, Amedeo; Ledda, Antonio; Barra, Mario

    2014-07-01

    Academic literature has been continuously growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence the need for quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at macroscopic (network architecture), mesoscopic (subgraph) and microscopic levels (node) in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that the SEA is a multidisciplinary subject; the SEABN belongs to the class of real small world networks with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals.
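
The microscopic (node-level) part of such an analysis amounts to citation counting on a directed graph. A minimal sketch on a toy network; the publications and edges below are invented for illustration, not SEABN data:

```python
# An edge (a, b) means publication a cites publication b.
edges = [
    ("P1", "P2"), ("P1", "P3"),
    ("P2", "P3"), ("P4", "P3"),
    ("P4", "P2"), ("P5", "P1"),
]

def in_degree(edges):
    """Number of times each publication is cited (a node-level measure)."""
    counts = {}
    for _citing, cited in edges:
        counts[cited] = counts.get(cited, 0) + 1
    return counts

citations = in_degree(edges)
most_cited = max(citations, key=citations.get)
```

A full analysis of 7662 publications would use a dedicated graph library for the macroscopic and mesoscopic measures (small-world statistics, subgraph detection), but all of them build on the same edge list.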

  8. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, Erica R.; Navarre-Sitchler, Alexis K.; Maxwell, Reed M.; McCray, John E.

    2012-02-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distribution of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding. Higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and suggests action levels for carcinogenic risk will be exceeded in exposure
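
The two-stage (nested) Monte Carlo idea can be sketched as below: the outer loop samples parameters that are uncertain (fixed but unknown), the inner loop samples parameters that are variable across the exposed population. Every distribution and coefficient here is an illustrative placeholder, not one of the study's calibrated inputs:

```python
import random

def nested_monte_carlo(n_outer=50, n_inner=200, seed=1):
    """Nested Monte Carlo: outer draws represent uncertainty (e.g. a
    dose-response potency factor), inner draws represent variability
    (e.g. per-household intake and contaminant concentration).
    Returns one sorted risk distribution per outer draw."""
    rng = random.Random(seed)
    outer_results = []
    for _ in range(n_outer):
        slope = rng.lognormvariate(-14.0, 0.5)       # uncertain potency (assumed)
        risks = []
        for _ in range(n_inner):
            intake = rng.lognormvariate(0.0, 0.6)    # variable intake, L/day
            conc = rng.lognormvariate(2.0, 1.0)      # variable metal conc., ug/L
            risks.append(min(1.0, slope * conc * intake))
        risks.sort()
        outer_results.append(risks)
    return outer_results

cdfs = nested_monte_carlo()
```

Each outer iteration yields a full risk distribution over the variable parameters; the spread across outer iterations then expresses uncertainty, which is what lets the framework discriminate between the two.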

  9. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union.

    PubMed

    Snary, Emma L; Swart, Arno N; Simons, Robin R L; Domingues, Ana Rita Calado; Vigre, Hakan; Evers, Eric G; Hald, Tine; Hill, Andrew A

    2016-03-01

    A farm-to-consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter-pig prevalence and the impact of control measures on the risk of human Salmonella infection. A key consideration during the QMRA development was the characterization of variability between E.U. Member States (MSs), and therefore a generic MS model was developed that accounts for differences in pig production, slaughterhouse practices, and consumption patterns. To demonstrate the parameterization of the model, four case study MSs were selected that illustrate the variability in production of pork meat and products across MSs. For the case study MSs the average probability of illness was estimated to be between 1 in 100,000 and 1 in 10 million servings given consumption of one of the three product types considered (pork cuts, minced meat, and fermented ready-to-eat sausages). Further analyses of the farm-to-consumption QMRA suggest that the vast majority of human risk derives from infected pigs with a high concentration of Salmonella in their feces (≥10^4 CFU/g). Therefore, it is concluded that interventions should be focused on either decreasing the level of Salmonella in the feces of infected pigs, the introduction of a control step at the abattoir to reduce the transfer of feces to the exterior of the pig, or a control step to reduce the level of Salmonella on the carcass post-evisceration. PMID:27002672

  10. Quantitative Assessment of Orbital Implant Position – A Proof of Concept

    PubMed Central

    Schreurs, Ruud; Dubois, Leander; Becking, Alfred G.; Maal, Thomas J. J.

    2016-01-01

    Introduction In orbital reconstruction, the optimal location of a predefined implant can be planned preoperatively. Surgical results can be assessed intraoperatively or postoperatively. A novel method for quantifying orbital implant position is introduced. The method measures predictability of implant placement: transformation parameters between planned and resulting implant position are quantified. Methods The method was tested on 3 human specimen heads. Computed Tomography scans were acquired at baseline with intact orbits (t0), after creation of the defect (t1) and postoperatively after reconstruction of the defect using a preformed implant (t2). Prior to reconstruction, the optimal implant position was planned on the t0 and t1 scans. Postoperatively, the planned and realized implant position were compared. The t0 and t2 scans were fused using iPlan software and the resulting implant was segmented in the fused t2 scan. An implant reference frame was created (Orbital Implant Positioning Frame); the planned implant was transformed to the reference position using an Iterative Closest Point approach. The segmentation of the resulting implant was also registered on the reference position, yielding rotational (pitch, yaw, roll) as well as translational parameters of implant position. Results Measurement with the Orbital Implant Positioning Frame proved feasible on all three specimens. The positional outcome provided more thorough and accurate insight in resulting implant position than could be gathered from distance measurements alone. Observer-related errors were eliminated from the process, since the method is largely automatic. Conclusion A novel method of quantifying surgical outcome in orbital reconstructive surgery was presented. The presented Orbital Implant Positioning Frame assessed all parameters involved in implant displacement. The method proved to be viable on three human specimen heads. 
Clinically, the method could provide direct feedback intraoperatively
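
The rotational parameters (pitch, yaw, roll) reported by such a reference frame can be recovered from the 3x3 rotation block of the registration transform. A sketch assuming the Z-Y-X (yaw-pitch-roll) Euler convention; the paper's actual convention and ICP implementation are not specified in this record:

```python
import math

def pitch_yaw_roll(R):
    """Decompose a 3x3 rotation matrix into pitch, yaw, roll (degrees),
    assuming the Z-Y-X (yaw-pitch-roll) Euler convention."""
    yaw = math.atan2(R[1][0], R[0][0])
    pitch = math.asin(-R[2][0])
    roll = math.atan2(R[2][1], R[2][2])
    return tuple(math.degrees(a) for a in (pitch, yaw, roll))

# A pure 10-degree rotation about the z-axis should read as yaw only:
c, s = math.cos(math.radians(10.0)), math.sin(math.radians(10.0))
Rz = [[c, -s, 0.0],
      [s,  c, 0.0],
      [0.0, 0.0, 1.0]]
pitch, yaw, roll = pitch_yaw_roll(Rz)
```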

  11. Assessment of a quantitative metric for 4D CT artifact evaluation by observer consensus.

    PubMed

    Castillo, Sarah J; Castillo, Richard; Balter, Peter; Pan, Tinsu; Ibbott, Geoffrey; Hobbs, Brian; Yuan, Ying; Guerrero, Thomas

    2014-01-01

    The benefits of four-dimensional computed tomography (4D CT) are limited by the presence of artifacts that remain difficult to quantify. A correlation-based metric previously proposed for ciné 4D CT artifact identification was further validated as an independent artifact evaluator by using a novel qualitative assessment featuring a group of observers reaching a consensus decision on artifact location and magnitude. The consensus group evaluated ten ciné 4D CT scans for artifacts over each breathing phase of coronal lung views assuming one artifact per couch location. Each artifact was assigned a magnitude score of 1-5, 1 indicating lowest severity and 5 indicating highest severity. Consensus group results served as the ground truth for assessment of the correlation metric. The ten patients were split into two cohorts; cohort 1 generated an artifact identification threshold derived from receiver operating characteristic analysis using the Youden Index, while cohort 2 generated sensitivity and specificity values from application of the artifact threshold. The Pearson correlation coefficient was calculated between the correlation metric values and the consensus group scores for both cohorts. The average sensitivity and specificity values found with application of the artifact threshold were 0.703 and 0.476, respectively. The correlation coefficients of artifact magnitudes for cohort 1 and 2 were 0.80 and 0.61, respectively, (p < 0.001 for both); these correlation coefficients included a few scans with only two of the five possible magnitude scores. Artifact incidence was associated with breathing phase (p < 0.002), with presentation less likely near maximum exhale. Overall, the correlation metric allowed accurate and automated artifact identification. The consensus group evaluation resulted in efficient qualitative scoring, reduced interobserver variation, and provided consistent identification of artifact location and magnitudes. PMID:24892346
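
The thresholding step described above, ROC analysis with the Youden Index, can be sketched directly. The scores and artifact labels below are synthetic, not the study's correlation-metric values:

```python
def youden_threshold(scores, labels):
    """Return the decision threshold maximising Youden's J =
    sensitivity + specificity - 1, scanning each observed score
    as a candidate cut-off (score >= threshold predicts artifact)."""
    positives = sum(labels)
    negatives = len(labels) - positives
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == 1)
        tn = sum(1 for s, y in zip(scores, labels) if s < t and y == 0)
        j = tp / positives + tn / negatives - 1.0
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Synthetic metric scores and consensus labels (1 = artifact present):
scores = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9]
labels = [0,   0,   0,   1,   0,   1,   1,   1]
threshold, j = youden_threshold(scores, labels)
```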

  12. 3D Quantitative Assessment of Lesion Response to MR-guided High-Intensity Focused Ultrasound Treatment of Uterine Fibroids

    PubMed Central

    Savic, Lynn J.; Lin, MingDe; Duran, Rafael; Schernthaner, Rüdiger E.; Hamm, Bernd; Geschwind, Jean-François; Hong, Kelvin; Chapiro, Julius

    2015-01-01

    Rationale and Objectives To investigate the response after MR-guided high-intensity focused ultrasound (MRgHIFU) treatment of uterine fibroids (UF) using a 3D quantification of total and enhancing lesion volume (TLV, ELV) on contrast-enhanced MRI (ceMRI) scans. Methods and Materials In a total of 24 patients, ceMRI scans were obtained at baseline and 24 hrs, 6, 12 and 24 months after MRgHIFU treatment. The dominant lesion was assessed using a semi-automatic quantitative 3D segmentation technique. Agreement between software-assisted and manual measurements was then analyzed using a linear regression model. Patients were classified as responders (R) or non-responders (NR) based on their symptom report after 6 months. Statistical analysis included the paired t-test and Mann-Whitney-test. Results Preprocedurally, the median TLV and ELV were 263.74cm^3 (30.45–689.56cm^3) and 210.13cm^3 (14.43–689.53cm^3), respectively. The 6-month follow-up demonstrated a reduction of TLV in 21 patients (87.5%) with a median TLV of 171.7cm^3 (8.5–791.2cm^3) (p<.0001). TLV remained stable with significant differences compared to baseline (p<.001 and p=.047 after 12 and 24 months). A reduction of ELV was apparent in 16 patients (66.6%) with a median ELV of 158.91cm^3 (8.55–779.61cm^3) after 6 months (p=.065). 3D quantification and manual measurements showed strong intermethod agreement for fibroid volumes (R^2=.889 and R^2=.917) but greater discrepancy for enhancement calculations (R^2=.659 and R^2=.419) at baseline and 6 months. No significant differences in TLV or ELV were observed between clinical R (n=15) and NR (n=3). Conclusion The 3D assessment has proven feasible and accurate in the quantification of fibroid response to MRgHIFU. Contrary to ELV, changes in TLV may be representative of the clinical outcome. PMID:26160057

  13. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10^3-10^4 E. coli per 100 ml resulted in median rotavirus infection risk of approximately 10^-3 and 10^-4 pppy when irrigating, respectively, low- and high-growing crops; the corresponding 95th percentile risk estimates were around 10^-2 in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had great impact upon risk estimates. PMID:23508143
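
The simulation chain, log-normally distributed indicator concentrations fed through a pathogen:indicator ratio, crop intake and a dose-response model, can be sketched as below. The beta-Poisson parameters are commonly published rotavirus values; the log-normal parameters, ratio and intake are placeholders, not the paper's fitted inputs:

```python
import math
import random

def rotavirus_risk_per_event(n_trials=10_000, seed=7):
    """Monte Carlo over a log-normal E. coli concentration, through an
    assumed pathogen:indicator ratio and crop intake, into a beta-Poisson
    dose-response; returns (median, 95th percentile) risk per exposure."""
    rng = random.Random(seed)
    alpha, n50 = 0.253, 6.17                  # beta-Poisson, rotavirus
    risks = []
    for _ in range(n_trials):
        ecoli = rng.lognormvariate(math.log(5e3), 1.0)   # per 100 ml (assumed)
        dose = ecoli * 1e-5 * 10.0            # ratio x intake, both assumed
        p = 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)
        risks.append(p)
    risks.sort()
    return risks[n_trials // 2], risks[int(0.95 * n_trials)]

median_risk, p95_risk = rotavirus_risk_per_event()
```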

  14. Quantitative Assessment of Turbulence and Flow Eccentricity in an Aortic Coarctation: Impact of Virtual Interventions.

    PubMed

    Andersson, Magnus; Lantz, Jonas; Ebbers, Tino; Karlsson, Matts

    2015-09-01

    Turbulence and flow eccentricity can be measured by magnetic resonance imaging (MRI) and may play an important role in the pathogenesis of numerous cardiovascular diseases. In the present study, we propose quantitative techniques to assess turbulent kinetic energy (TKE) and flow eccentricity that could assist in the evaluation and treatment of stenotic severities. These hemodynamic parameters were studied in a pre-treated aortic coarctation (CoA) and after several virtual interventions using computational fluid dynamics (CFD), to demonstrate the effect of different dilatation options on the flow field. Patient-specific geometry and flow conditions were derived from MRI data. The unsteady pulsatile flow was resolved by large eddy simulation including non-Newtonian blood rheology. Results showed an inverse asymptotic relationship between the total amount of TKE and degree of dilatation of the stenosis, where turbulent flow proximal to the constriction limits the possible improvement by treating the CoA alone. Spatiotemporal maps of TKE and flow eccentricity could be linked to the characteristics of the jet, where improved flow conditions were favored by an eccentric dilatation of the CoA. By including these flow markers into a combined MRI-CFD intervention framework, CoA therapy has not only the possibility to produce predictions via simulation, but can also be validated pre- and immediately post-treatment, as well as during follow-up studies. PMID:26577361
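
The core quantity here, turbulent kinetic energy per unit mass, is half the sum of the variances of the fluctuating velocity components, k = 0.5(⟨u'²⟩ + ⟨v'²⟩ + ⟨w'²⟩). A sketch on synthetic velocity samples; a real LES or 4D-flow MRI dataset would supply these fields:

```python
def turbulent_kinetic_energy(u, v, w):
    """k = 0.5 * (var(u) + var(v) + var(w)) per unit mass, where the
    variance of each velocity component measures its fluctuation about
    the mean flow."""
    def var(x):
        m = sum(x) / len(x)
        return sum((xi - m) ** 2 for xi in x) / len(x)
    return 0.5 * (var(u) + var(v) + var(w))

# Synthetic velocity samples (m/s) at one point over several timesteps:
u = [1.0, 1.2, 0.8, 1.1, 0.9]
v = [0.1, -0.1, 0.2, -0.2, 0.0]
w = [0.0, 0.05, -0.05, 0.0, 0.0]
k = turbulent_kinetic_energy(u, v, w)   # m^2/s^2
```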

  15. Quantitative assessments of residual stress fields at the surface of alumina hip joints.

    PubMed

    Pezzotti, Giuseppe; Munisso, Maria Chiara; Lessnau, Kristina; Zhu, Wenliang

    2010-11-01

    In-depth and in-plane response functions of photo- and electro-stimulated probes have been modeled and quantitatively evaluated in order to assess their suitability for detecting the highly graded residual stress fields generated at the surface of alumina hip joints. Optical calibrations revealed large differences in probe size, which strongly affected the detected magnitude of residual stress. A comparison between the responses of Raman and fluorescence probes in polycrystalline alumina showed that those probes spread in depth over tens of microns even when a confocal probe configuration was used. On the other hand, the electro-stimulated luminescence emitted by oxygen-vacancy sites (F⁺ centers) in the alumina lattice represented the most suitable choice for confining the stress probe to a shallow volume; this latter probe enabled us to reduce the measurement depth to the order of tens of nanometers. We show maps of surface residual stress collected on both main-wear and non-wear zones of an alumina femoral head. A comparison among stress maps taken at exactly the same location but employing different probes revealed averaging effects on the stress magnitude detected with photo-stimulated probes, while proving the superior spatial resolution of the electron probe. PMID:20848660

  16. A real-time, quantitative PCR protocol for assessing the relative parasitemia of Leucocytozoon in waterfowl.

    PubMed

    Smith, Matthew M; Schmutz, Joel; Apelgren, Chloe; Ramey, Andrew M

    2015-04-01

    Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n=105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R²=0.694, P=0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species. PMID:25655779
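
    The quoted sensitivity and specificity are simple functions of a 2×2 comparison against the reference nested-PCR assay. The counts below are hypothetical, chosen only so the arithmetic reproduces the reported 91% and 100% on n = 105 samples:

```python
# Hypothetical 2x2 counts comparing qPCR calls to the reference nested-PCR assay
tp, fn = 61, 6     # reference-positive samples: qPCR detected / missed
tn, fp = 38, 0     # reference-negative samples: qPCR negative / false positive

sensitivity = tp / (tp + fn)   # probability of detecting a true infection
specificity = tn / (tn + fp)   # probability of a negative call when uninfected

print(f"n = {tp + fn + tn + fp}: sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```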

  17. Australia’s first national level quantitative environmental justice assessment of industrial air pollution

    NASA Astrophysics Data System (ADS)

    Chakraborty, Jayajit; Green, Donna

    2014-04-01

    This study presents the first national level quantitative environmental justice assessment of industrial air pollution in Australia. Specifically, our analysis links the spatial distribution of sites and emissions associated with industrial pollution sources derived from the National Pollutant Inventory to Indigenous status and social disadvantage characteristics of communities derived from Australian Bureau of Statistics indicators. Our results reveal a clear national pattern of environmental injustice based on the locations of industrial pollution sources, as well as the volume and toxicity of air pollution released at these locations. Communities with the highest number of polluting sites, emission volume, and toxicity-weighted air emissions have significantly greater proportions of Indigenous population and higher levels of socio-economic disadvantage. The quantities and toxicities of industrial air pollution are particularly high in communities with the lowest levels of educational attainment and occupational status. These findings emphasize the need for more detailed analysis in specific regions and communities where socially disadvantaged groups are disproportionately impacted by industrial air pollution. Our empirical findings also underscore the growing necessity to incorporate environmental justice considerations in environmental planning and policy-making in Australia.

  18. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    SciTech Connect

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K.

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparing the inhibitory capacity of alkylphenols with that of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption of alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and the fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data.
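
    The 'milligram aspirin equivalence' approach reduces a mixture to a single dose metric: each alkylphenol's intake is weighted by its cyclooxygenase-inhibition potency relative to aspirin, and the products are summed. A minimal sketch; the compounds, potency factors, and intakes below are entirely hypothetical:

```python
# Hypothetical relative potencies (alkylphenol COX inhibition / aspirin COX inhibition)
relative_potency = {"phenol": 0.2, "o-cresol": 0.5, "2,6-xylenol": 1.3}

# Hypothetical daily exposures to each compound, mg/day
exposure_mg = {"phenol": 10.0, "o-cresol": 4.0, "2,6-xylenol": 1.0}

# Mixture risk metric: sum of intake x relative potency
aspirin_equiv_mg = sum(exposure_mg[c] * relative_potency[c] for c in exposure_mg)
print(f"mixture = {aspirin_equiv_mg:.1f} mg aspirin equivalents/day")
```

    Because potencies are expressed relative to a well-characterized reference compound, the mixture total can be compared directly against known low-effect aspirin doses.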

  19. A real-time, quantitative PCR protocol for assessing the relative parasitemia of Leucocytozoon in waterfowl

    USGS Publications Warehouse

    Smith, Matthew M.; Schmutz, Joel A.; Apelgren, Chloe; Ramey, Andy M.

    2015-01-01

    Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n = 105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R2 = 0.694, P = 0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species.

  20. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large scale, highly complex systems with a varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and the limitations of the MSFC QRA approach and of QRA technology in general.
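
    At the top level, a QRA model of this kind combines element failure probabilities into a system risk estimate. A minimal sketch, assuming the four propulsion elements fail independently and any single failure is mission-critical (a series system); the per-flight probabilities are illustrative placeholders, not MSFC's figures:

```python
# Illustrative per-flight failure probabilities for the propulsion elements
p_fail = {"ET": 1 / 1500, "SRB": 1 / 2000, "RSRM": 1 / 1000, "SSME": 1 / 800}

# Series system: mission survives only if every element survives,
# so P(loss) = 1 - product of element reliabilities.
reliability = 1.0
for p in p_fail.values():
    reliability *= (1.0 - p)
p_loss = 1.0 - reliability

print(f"P(loss of mission from propulsion) = {p_loss:.5f} (about 1 in {1 / p_loss:.0f})")
```

    A full QRA would replace the point estimates with uncertainty distributions and add common-cause and sensitivity analyses, but the series combination above is the structural core.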

  1. Assessment of Quantitative and Allelic MGMT Methylation Patterns as a Prognostic Marker in Glioblastoma.

    PubMed

    Kristensen, Lasse S; Michaelsen, Signe R; Dyrbye, Henrik; Aslan, Derya; Grunnet, Kirsten; Christensen, Ib J; Poulsen, Hans S; Grønbæk, Kirsten; Broholm, Helle

    2016-03-01

    Methylation of the O⁶-methylguanine-DNA methyltransferase (MGMT) gene is a predictive and prognostic marker in newly diagnosed glioblastoma patients treated with temozolomide, but how MGMT methylation should be assessed to ensure optimal detection accuracy is debated. We developed a novel quantitative methylation-specific PCR (qMSP) MGMT assay capable of providing allelic methylation data and analyzed 151 glioblastomas from patients receiving standard of care treatment (Stupp protocol). The samples were also analyzed by immunohistochemistry (IHC), standard bisulfite pyrosequencing, and genotyped for the rs1690252 MGMT promoter single nucleotide polymorphism. Monoallelic methylation was observed more frequently than biallelic methylation, and some cases with monoallelic methylation expressed the MGMT protein whereas others did not. The presence of MGMT methylation was associated with better overall survival (p = 0.006 by qMSP; p = 0.002 by standard pyrosequencing), and the presence of the protein was associated with worse overall survival (p = 0.009). Combined analyses of qMSP and standard pyrosequencing or IHC identified additional patients who benefited from temozolomide treatment. Finally, low methylation levels were also associated with better overall survival (p = 0.061 by qMSP; p = 0.02 by standard pyrosequencing). These data support the use of both MGMT methylation and MGMT IHC but not allelic methylation data as prognostic markers in patients with temozolomide-treated glioblastoma. PMID:26883115

  2. A Quantitative Assay Using Mycelial Fragments to Assess Virulence of Mycosphaerella fijiensis.

    PubMed

    Donzelli, Bruno Giuliano Garisto; Churchill, Alice C L

    2007-08-01

    We describe a method to evaluate the virulence of Mycosphaerella fijiensis, the causal agent of black leaf streak disease (BLSD) of banana and plantain. The method is based on the delivery of weighed slurries of fragmented mycelia by camel's hair brush to 5-by-5-cm areas on the abaxial surface of banana leaf blades. Reliable BLSD development was attained in an environmental growth chamber with stringent lighting and humidity controls. By localizing inoculum onto small areas of large leaves, we achieved a dramatic increase in the number of strains that can be tested on each leaf and plant, which is critical for comparing the virulence of numerous strains concurrently. Image analysis software was used to measure the percentage of each inoculated leaf section showing BLSD symptoms over time. We demonstrated that the level of disease of four isolates was correlated with the weight of the mycelium applied and relatively insensitive to the degree of fragmentation of hyphae. This is the first report demonstrating that weighed mycelial inoculum, combined with image analysis software to measure disease severity, can be used to quantitatively assess the virulence of M. fijiensis under rigorously controlled environmental conditions. PMID:18943631

  3. Bioaerosol Deposition to Food Crops near Manure Application: Quantitative Microbial Risk Assessment.

    PubMed

    Jahne, Michael A; Rogers, Shane W; Holsen, Thomas M; Grimberg, Stefan J; Ramler, Ivan P; Kim, Seungo

    2016-03-01

    Production of both livestock and food crops are central priorities of agriculture; however, food safety concerns arise where these practices intersect. In this study, we investigated the public health risks associated with potential bioaerosol deposition to crops grown in the vicinity of manure application sites. A field sampling campaign at dairy manure application sites supported the emission, transport, and deposition modeling of bioaerosols emitted from these lands following application activities. Results were coupled with a quantitative microbial risk assessment model to estimate the infection risk due to consumption of leafy green vegetable crops grown at various distances downwind from the application area. Inactivation of pathogens (including E. coli O157:H7) on both the manure-amended field and on crops was considered to determine the maximum loading of pathogens to plants with time following application. Overall median one-time infection risks at the time of maximum loading decreased from 1:1300 at 0 m directly downwind from the field to 1:6700 at 100 m and 1:92,000 at 1000 m; peak risks (95th percentiles) were considerably greater (1:18, 1:89, and 1:1200, respectively). Median risk was below 1:10,000 at >160 m downwind. As such, it is recommended that a 160-m setback distance be provided between manure application and nearby leafy green crop production. Additional distance or delay before harvest will provide further protection of public health. PMID:27065414

  4. Community Capacity for Watershed Conservation: A Quantitative Assessment of Indicators and Core Dimensions

    NASA Astrophysics Data System (ADS)

    Brinkman, Elliot; Seekamp, Erin; Davenport, Mae A.; Brehm, Joan M.

    2012-10-01

    Community capacity for watershed management has emerged as an important topic for the conservation of water resources. While much of the literature on community capacity has focused primarily on theory construction, there have been few efforts to quantitatively assess community capacity variables and constructs, particularly for watershed management and conservation. This study seeks to identify predictors of community capacity for watershed conservation in southwestern Illinois. A subwatershed-scale survey of residents from four communities located within the Lower Kaskaskia River watershed of southwestern Illinois was administered to measure three specific capacity variables: community empowerment, shared vision, and collective action. Principal component analysis revealed key dimensions of each variable. Specifically, collective action was characterized by items relating to collaborative governance and social networks; community empowerment was characterized by items relating to community competency and a sense of responsibility; and shared vision was characterized by items relating to perceptions of environmental threats, issues with development, environmental sense of place, and quality of life. From the emerging factors, composite measures were calculated to determine the extent to which each variable contributed to community capacity. A stepwise regression revealed that community empowerment explained most of the variability in the composite measure of community capacity for watershed conservation. This study contributes to the theoretical understanding of community capacity by quantifying the role of collective action, community empowerment, and shared vision in community capacity, highlighting the need for multilevel interaction to address watershed issues.
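
    The composite-measure step can be illustrated with a common scale-construction choice: average the survey items loading on each dimension, then average the dimension scores into an overall capacity score. The item responses and dimension groupings below are hypothetical:

```python
import statistics

# Hypothetical 1-5 Likert responses for items loading on each capacity dimension
responses = {
    "collective_action":     [4, 5, 3, 4],
    "community_empowerment": [3, 4, 4, 5],
    "shared_vision":         [2, 3, 3, 4],
}

# Composite per dimension: mean of its items (one common scale-construction choice)
composites = {dim: statistics.mean(items) for dim, items in responses.items()}

# Overall capacity: mean of the dimension composites
capacity = statistics.mean(composites.values())
print(composites, f"overall capacity = {capacity:.2f}")
```

    In the study itself, the dimensions come from principal component analysis rather than being assumed a priori, and the composites then serve as predictors in the stepwise regression.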

  5. Quantitative CT assessment of bone mineral density in dogs with hyperadrenocorticism.

    PubMed

    Lee, Donghoon; Lee, Youngjae; Choi, Wooshin; Chang, Jinhwa; Kang, Ji-Houn; Na, Ki-Jeong; Chang, Dong-Woo

    2015-01-01

    Canine hyperadrenocorticism (HAC) is one of the most common causes of generalized osteopenia. In this study, quantitative computed tomography (QCT) was used to compare the bone mineral densities (BMD) of 39 normal dogs and 8 dogs with HAC (6 pituitary-dependent hyperadrenocorticism [PDH], 2 adrenal-dependent hyperadrenocorticism [ADH]) diagnosed through hormonal assay. A computed tomography scan of the 12th thoracic to 7th lumbar vertebrae was performed and a region of interest was drawn in each trabecular and cortical bone. Mean Hounsfield unit values were converted to equivalent BMD with a bone-density phantom by linear regression analysis. The converted mean trabecular BMDs of HAC dogs were significantly lower than those of normal dogs, and ADH dogs showed significantly lower cortical-bone BMDs than normal dogs. Specifically, mean trabecular BMDs of dogs with PDH were significantly lower than those of normal dogs, and both mean trabecular and cortical BMDs in dogs with ADH were significantly lower than those of normal dogs. Taken together, these findings indicate that QCT is useful for assessing BMD in dogs with HAC. PMID:26040613
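
    The HU-to-BMD conversion step is an ordinary least-squares fit of the phantom's known densities against their measured Hounsfield values. A minimal sketch with made-up phantom readings; the slope and intercept in practice come from the scanner's own calibration scan:

```python
# Phantom calibration: known densities (mg/cm^3) vs measured mean HU (illustrative values)
phantom_bmd = [0.0, 50.0, 100.0, 200.0]
phantom_hu  = [2.0, 78.0, 152.0, 305.0]

# Ordinary least-squares fit of BMD on HU
n = len(phantom_hu)
mean_x = sum(phantom_hu) / n
mean_y = sum(phantom_bmd) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(phantom_hu, phantom_bmd)) \
        / sum((x - mean_x) ** 2 for x in phantom_hu)
intercept = mean_y - slope * mean_x

def hu_to_bmd(hu):
    """Convert a mean ROI Hounsfield value to equivalent BMD (mg/cm^3)."""
    return slope * hu + intercept

print(f"BMD at 120 HU = {hu_to_bmd(120):.1f} mg/cm^3")
```

    Each vertebral region of interest's mean HU is then passed through `hu_to_bmd` to yield the density values compared between groups.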

  6. Quantitative CT assessment of bone mineral density in dogs with hyperadrenocorticism

    PubMed Central

    Lee, Donghoon; Lee, Youngjae; Choi, Wooshin; Chang, Jinhwa; Kang, Ji-Houn; Na, Ki-Jeong

    2015-01-01

    Canine hyperadrenocorticism (HAC) is one of the most common causes of generalized osteopenia. In this study, quantitative computed tomography (QCT) was used to compare the bone mineral densities (BMD) of 39 normal dogs and 8 dogs with HAC (6 pituitary-dependent hyperadrenocorticism [PDH], 2 adrenal-dependent hyperadrenocorticism [ADH]) diagnosed through hormonal assay. A computed tomography scan of the 12th thoracic to 7th lumbar vertebrae was performed and a region of interest was drawn in each trabecular and cortical bone. Mean Hounsfield unit values were converted to equivalent BMD with a bone-density phantom by linear regression analysis. The converted mean trabecular BMDs of HAC dogs were significantly lower than those of normal dogs, and ADH dogs showed significantly lower cortical-bone BMDs than normal dogs. Specifically, mean trabecular BMDs of dogs with PDH were significantly lower than those of normal dogs, and both mean trabecular and cortical BMDs in dogs with ADH were significantly lower than those of normal dogs. Taken together, these findings indicate that QCT is useful for assessing BMD in dogs with HAC. PMID:26040613

  7. Quantitative risk assessment for lung cancer from exposure to metal ore dust

    SciTech Connect

    Fu, H.; Jing, X.; Yu, S.; Gu, X.; Wu, K.; Yang, J.; Qiu, S.

    1992-09-01

    To quantitatively assess the lung cancer risk of metal miners, a historical cohort study was conducted. The cohort consisted of 1113 miners employed in underground work for at least 12 months between January 1, 1960 and December 12, 1974. From records of dust concentration, a cumulative dust dose was estimated for each miner in the cohort. There were 162 deaths in total, including 45 from lung cancer, with an SMR of 2184. The SMR for lung cancer increased from 1019 for those with a cumulative dust dose of less than 500 mg-year to 2469 for those with a dose greater than 4500 mg-year. Furthermore, the risk in the highest category of combined cumulative dust dose and cigarette smoking was 46-fold greater than that in the lowest category of dust dose and smoking. This study showed an exposure-response relationship between metal ore dust and lung cancer, and an interaction between smoking and metal ore dust exposure in lung cancer risk.
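
    A standardized mortality ratio (SMR) of this kind is the ratio of observed deaths to the deaths expected from reference population rates applied to the cohort's person-years, conventionally scaled by 100. The person-years, age strata, and reference rates below are hypothetical, shown only to illustrate the calculation:

```python
# Hypothetical age strata: (person-years at risk, reference lung-cancer death rate per year)
strata = [
    (8000, 0.0005),
    (6000, 0.0012),
    (4000, 0.0025),
]

observed = 45  # lung cancer deaths observed in the cohort (from the abstract)

# Expected deaths: sum over strata of person-years x reference rate
expected = sum(py * rate for py, rate in strata)

smr = 100 * observed / expected
print(f"expected = {expected:.1f} deaths, SMR = {smr:.0f}")
```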

  8. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs, and the results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. The method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of the spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms. PMID:26643074
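
    The image-analysis step amounts to classifying pixels as stained biofilm or background and reporting the stained fraction. A minimal sketch on a tiny synthetic grayscale array; the threshold value is an assumption, not the paper's calibrated cutoff:

```python
# A synthetic 4x6 grayscale "photo" stands in for a real image (0 = dark, 255 = bright stain)
image = [
    [10,  12, 200, 210,  15,  11],
    [ 9, 180, 220, 205,  14,  10],
    [ 8,  11, 190,  13,  12,   9],
    [10,  12,  11,  10,  13,  11],
]

THRESHOLD = 128  # assumed intensity cutoff separating stained biofilm from background

# Coverage: fraction of pixels at or above the stain threshold
stained = sum(1 for row in image for px in row if px >= THRESHOLD)
total = sum(len(row) for row in image)
coverage = stained / total
print(f"biofilm coverage: {coverage:.1%}")
```

    Repeating this over time-series photographs of the same surface yields the growth-intensity curves described in the abstract; per-region coverage maps expose spatial heterogeneity.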

  9. Quantitative immunohistochemical assessment of blood and lymphatic microcirculation in cutaneous lichen planus lesions.

    PubMed

    Výbohová, Desanka; Mellová, Yvetta; Adamicová, Katarína; Adamkov, Marián; Hešková, Gabriela

    2015-06-01

    Recent advances have given rise to the hypothesis that angiogenesis and lymphangiogenesis are tightly connected to some chronic inflammatory diseases. The present study focuses on immunohistochemical assessment of quantitative changes in the blood and lymphatic microcirculatory beds in a common chronic dermatosis, cutaneous lichen planus. Double immunohistochemistry with CD34 and podoplanin antibodies was used to detect blood and lymphatic endothelium, while anti-human VEGF was used to observe a key inducer of angiogenesis and lymphangiogenesis. Morphometric analysis was performed with QuickPhoto Micro image analysis software. The results confirmed statistically significant enlargement of both the blood and lymphatic microcirculatory beds. Compared to healthy skin, cutaneous lichen planus lesions showed a 1.6-fold enlargement of the blood microcirculatory bed and a 1.8-fold enlargement of the lymphatic microcirculatory bed. Vascular endothelial growth factor (VEGF) expression in lesional skin was increased significantly more in the epidermis (19.1-fold) than in the dermis (10.3-fold). These findings indicate a tight association of angiogenesis and lymphangiogenesis with the pathogenesis of cutaneous lichen planus. PMID:25504638

  10. Quantitative assessment of chemical artefacts produced by propionylation of histones prior to mass spectrometry analysis.

    PubMed

    Soldi, Monica; Cuomo, Alessandro; Bonaldi, Tiziana

    2016-07-01

    Histone PTMs play a crucial role in regulating chromatin structure and function, with impact on gene expression. MS is nowadays widely applied to study histone PTMs systematically. Because histones are rich in arginine and lysine, classical shotgun approaches based on trypsin digestion are typically not employed for histone modification mapping. Instead, different protocols of chemical derivatization of lysines in combination with trypsin have been implemented to obtain "Arg-C like" digestion products that are more suitable for LC-MS/MS analysis. Although widespread, these strategies have recently been described to cause various side reactions that result in chemical modifications prone to be misinterpreted as native histone marks. These artefacts can also interfere with the quantification process, causing errors in histone PTM profiling. The work of Paternoster V. et al. is a quantitative assessment of methyl-esterification and other side reactions occurring on histones after chemical derivatization of lysines with propionic anhydride [Proteomics 2016, 16, 2059-2063]. The authors estimate the effect of different solvents, incubation times, and pH on the extent of these side reactions. The results collected indicate that the replacement of methanol with isopropanol or ACN not only blocks methyl-esterification but also significantly reduces other undesired unspecific reactions. Carefully titrating the pH after propionic anhydride addition is another way to keep methyl-esterification under control. Overall, the authors describe a set of experimental conditions that reduces the generation of various artefacts during histone propionylation. PMID:27373704

  11. Assessment of Quantitative and Allelic MGMT Methylation Patterns as a Prognostic Marker in Glioblastoma

    PubMed Central

    Michaelsen, Signe R.; Dyrbye, Henrik; Aslan, Derya; Grunnet, Kirsten; Christensen, Ib J.; Poulsen, Hans S.; Grønbæk, Kirsten; Broholm, Helle

    2016-01-01

    Methylation of the O⁶-methylguanine-DNA methyltransferase (MGMT) gene is a predictive and prognostic marker in newly diagnosed glioblastoma patients treated with temozolomide, but how MGMT methylation should be assessed to ensure optimal detection accuracy is debated. We developed a novel quantitative methylation-specific PCR (qMSP) MGMT assay capable of providing allelic methylation data and analyzed 151 glioblastomas from patients receiving standard of care treatment (Stupp protocol). The samples were also analyzed by immunohistochemistry (IHC), standard bisulfite pyrosequencing, and genotyped for the rs1690252 MGMT promoter single nucleotide polymorphism. Monoallelic methylation was observed more frequently than biallelic methylation, and some cases with monoallelic methylation expressed the MGMT protein whereas others did not. The presence of MGMT methylation was associated with better overall survival (p = 0.006 by qMSP; p = 0.002 by standard pyrosequencing), and the presence of the protein was associated with worse overall survival (p = 0.009). Combined analyses of qMSP and standard pyrosequencing or IHC identified additional patients who benefited from temozolomide treatment. Finally, low methylation levels were also associated with better overall survival (p = 0.061 by qMSP; p = 0.02 by standard pyrosequencing). These data support the use of both MGMT methylation and MGMT IHC but not allelic methylation data as prognostic markers in patients with temozolomide-treated glioblastoma. PMID:26883115

  12. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  13. Quantitative