Science.gov

Sample records for mimicking quantitative assessment

  1. Static mechanical assessment of elastic Young's modulus of tissue mimicking materials used for medical imaging.

    PubMed

    Duboeuf, François; Liebgott, Hervé; Basarab, Adrian; Brusseau, Elisabeth; Delachartre, Philippe; Vray, Didier

    2007-01-01

    Emerging medical imaging techniques usually provide quantitative diagnostic parameters. Since the description of a method for quantitative imaging of strain and elastic modulus distributions in soft tissues by Ophir et al. in 1991, research in elastography has progressed, and experimental in vitro validation of new displacement estimators is crucial for clinical applications. Materials mimicking biological tissues are very useful for this purpose. Nevertheless, correct validation requires knowledge of the mechanical properties of the investigated material, which are often difficult to obtain. This study describes a simple method for the mechanical characterization of gels used in elastography. We demonstrate that the elastic modulus can be assessed with reasonable reproducibility using simple tools and methods. For validation, the described method was further tested with five samples of polyvinyl alcohol (PVA) cryogel having different elasticities. Young's moduli from 24 to 135 kPa, depending on the number of freeze-thaw cycles (from 1 to 5), were measured with a reproducibility of 2 to 7%, provided strict measurement conditions were respected. The method demonstrates good feasibility and acceptable reproducibility for the mechanical characterization of phantoms.
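
    The static estimate of Young's modulus in such a test is simply the ratio of applied stress to the resulting strain. The sketch below illustrates only that generic calculation, not the authors' exact protocol; the sample dimensions and load are hypothetical.

```python
# Minimal sketch: static estimate of Young's modulus E = stress / strain for a
# cylindrical tissue-mimicking sample. All numbers are illustrative, not from the paper.
import math

def youngs_modulus(force_N, diameter_m, length_m, compression_m):
    """Return E in Pa for a uniaxial compression test in the linear regime."""
    area = math.pi * (diameter_m / 2) ** 2      # cross-sectional area (m^2)
    stress = force_N / area                     # Pa
    strain = compression_m / length_m           # dimensionless
    return stress / strain

# Hypothetical PVA cryogel sample: 25 mm diameter, 20 mm tall,
# compressed by 0.5 mm under a 1.2 N load.
E = youngs_modulus(force_N=1.2, diameter_m=0.025, length_m=0.020, compression_m=0.0005)
print(f"E ≈ {E/1000:.1f} kPa")   # ≈ 97.8 kPa, within the 24-135 kPa range reported
```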

  2. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  3. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare, devastating disease associated with high morbidity and mortality. It is characterized by systemic medial calcification of the arteries, yielding necrotic skin ulcerations. In this paper, we aim at supporting the establishment of multi-center registries for calciphylaxis, which include photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view and segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on that scale. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images from two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with respect to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation are 5-20% for geometry and 2-10% for color. Hence, quantitative assessment of calciphylaxis becomes practicable and will support a better understanding of this rare but fatal disease.
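
    A least-squares affine color correction of the kind described can be written compactly: given the 24 measured RGB values of the card fields and their known reference values, solve for a 3x3 matrix plus offset that maps measured to reference colors. This is a generic sketch under that reading, not the authors' implementation; the array contents are placeholders.

```python
# Minimal sketch: fit an affine color transform (3x3 matrix + offset) mapping the
# 24 measured card colors to their known reference values, by least squares.
import numpy as np

measured = np.random.rand(24, 3)     # placeholder: RGB of the 24 card fields as imaged
reference = np.random.rand(24, 3)    # placeholder: known reference RGB of those fields

# Augment with a column of ones so the offset is estimated together with the matrix.
X = np.hstack([measured, np.ones((24, 1))])          # shape (24, 4)
A, *_ = np.linalg.lstsq(X, reference, rcond=None)    # shape (4, 3): matrix rows + offset row

def correct(rgb_pixels):
    """Apply the fitted affine transform to an (N, 3) array of pixels."""
    return np.hstack([rgb_pixels, np.ones((len(rgb_pixels), 1))]) @ A

corrected_card = correct(measured)   # should now approximate `reference`
```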

  4. Characterisation of a PVCP-based tissue-mimicking phantom for quantitative photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Fonseca, Martina; Zeqiri, Bajram; Beard, Paul; Cox, Ben

    2015-07-01

    Photoacoustic imaging can provide high resolution images of tissue structure, pathology and function. As these images can be obtained at multiple wavelengths, quantitatively accurate, spatially resolved estimates of chromophore concentration, for example, may be obtainable. Such a capability would find a wide range of clinical and pre-clinical applications. However, despite a growing body of theoretical papers on how this might be achieved, there is a noticeable lack of studies providing validated evidence that it can be achieved experimentally, either in vitro or in vivo. Well-defined, versatile and stable phantom materials are essential to assess the accuracy, robustness and applicability of multispectral Quantitative Photoacoustic Imaging (qPAI) algorithms in experimental scenarios. This study assesses the potential of polyvinyl chloride plastisol (PVCP) as a phantom material for qPAI, building on previous work that focused on using PVCP for quality control. Parameters that might be controlled or tuned to assess the performance of qPAI algorithms were studied: broadband acoustic properties, multiwavelength optical properties with added absorbers and scatterers, and photoacoustic efficiency. The optical and acoustic properties of PVCP can be tuned to be broadly representative of soft tissue. The Grüneisen parameter is larger than that expected in tissue, which is an advantage as it increases the signal-to-noise ratio of the photoacoustic measurements. Interestingly, when the absorption was altered by adding absorbers, the absorption spectra measured using high peak power nanosecond-pulsed sources (typical in photoacoustics) were repeatably different from the ones measured using the low power source in the spectrophotometer, indicative of photochemical reactions taking place.
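
    The signal-to-noise benefit of a larger Grüneisen parameter follows from the standard photoacoustic generation relation, a textbook expression rather than anything specific to this paper, in which the initial pressure is the product of the Grüneisen parameter, the optical absorption coefficient and the local fluence:

```latex
% Standard photoacoustic initial-pressure relation (textbook form, not from the paper):
% p_0: initial acoustic pressure, \Gamma: Grueneisen parameter,
% \mu_a: optical absorption coefficient, \Phi: local optical fluence.
p_0(\mathbf{r}) = \Gamma(\mathbf{r})\,\mu_a(\mathbf{r})\,\Phi(\mathbf{r})
```

    Under this relation, a PVCP Grüneisen parameter larger than that of soft tissue scales the generated pressure, and hence the measured signal, upward for a given absorbed optical energy.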

  5. Quantitative MRI Assessment of Leukoencephalopathy

    PubMed Central

    Reddick, Wilburn E.; Glass, John O.; Langston, James W.; Helton, Kathleen J.

    2008-01-01

    Quantitative MRI assessment of leukoencephalopathy is difficult because the MRI properties of leukoencephalopathy significantly overlap those of normal tissue. This report describes the use of an automated procedure for longitudinal measurement of tissue volume and relaxation times to quantify leukoencephalopathy. Images derived by using this procedure in patients undergoing therapy for acute lymphoblastic leukemia (ALL) are presented. Five examinations from each of five volunteers (25 examinations) were used to test the reproducibility of quantitated baseline and subsequent normal-appearing images; the coefficients of variation were less than 2% for gray and white matter. Regions of leukoencephalopathy in patients were assessed by comparison with manual segmentation. Two radiologists manually segmented images from 15 randomly chosen MRI examinations that exhibited leukoencephalopathy. Kappa analyses showed that the two radiologists’ interpretations were concordant (κ = 0.70) and that each radiologist’s interpretations agreed with the results of the automated procedure (κ = 0.57 and 0.55). The clinical application of this method was illustrated by analysis of images from sequential MR examinations of two patients who developed leukoencephalopathy during treatment for ALL. The ultimate goal is to use these quantitative MR imaging measures to better understand therapy-induced neurotoxicity, which can be limited or even reversed with some combination of therapy adjustments and pharmacological and neurobehavioral interventions. PMID:11979570
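
    The kappa statistics quoted are presumably Cohen's kappa for two raters, which corrects observed agreement for the agreement expected by chance. As a generic illustration of that statistic (the 2x2 counts below are invented, not taken from the study):

```python
# Minimal sketch of Cohen's kappa for two raters labelling regions as
# "leukoencephalopathy" vs "normal". Counts are illustrative only.
import numpy as np

# Confusion matrix: rows = rater A, columns = rater B.
counts = np.array([[40, 10],
                   [ 5, 45]], dtype=float)

n = counts.sum()
p_observed = np.trace(counts) / n                              # raw agreement
p_expected = (counts.sum(axis=1) @ counts.sum(axis=0)) / n**2  # chance agreement
kappa = (p_observed - p_expected) / (1 - p_expected)
print(f"kappa = {kappa:.2f}")   # 0.70 here, by construction of this example
```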

  6. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for quantitative risk computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
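
    A baseline of lowest-level scenarios held in event-tree form can be quantified, generically, by multiplying the branch probabilities along each scenario path and summing the paths that lead to a given end state. The sketch below shows only that generic computation; it is not the QRAS implementation, and the scenario data are invented.

```python
# Generic event-tree quantification: each scenario is a path of branch probabilities;
# the probability of an end state is the sum over its paths of the product of the
# branch probabilities. Scenario definitions and numbers are invented for illustration.
from math import prod

scenarios = {
    # end_state: list of branch-probability paths leading to it
    "loss_of_mission": [
        [1e-3, 0.2],        # failure mode occurs AND mitigation fails
        [5e-4, 0.1, 0.5],   # another initiating event with two failed barriers
    ],
    "degraded_mode": [
        [1e-3, 0.8],        # failure mode occurs AND mitigation succeeds
    ],
}

end_state_prob = {state: sum(prod(path) for path in paths)
                  for state, paths in scenarios.items()}

for state, p in sorted(end_state_prob.items(), key=lambda kv: -kv[1]):
    print(f"{state}: {p:.2e}")   # ranking of end states by probability
```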

  7. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  8. Quantitative assessment of scientific quality

    NASA Astrophysics Data System (ADS)

    Heinzl, Harald; Bloching, Philipp

    2012-09-01

    Scientific publications, authors, and journals are commonly evaluated with quantitative bibliometric measures. Frequently-used measures will be reviewed and their strengths and weaknesses will be highlighted. Reflections about conditions for a new, research paper-specific measure will be presented.

  9. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.

  10. Use of chemostat cultures mimicking different phases of wine fermentations as a tool for quantitative physiological analysis

    PubMed Central

    2014-01-01

    Background Saccharomyces cerevisiae is the most relevant yeast species conducting the alcoholic fermentation that takes place during winemaking. Although the physiology of this model organism has been extensively studied, systematic quantitative physiology studies of this yeast under winemaking conditions are still scarce, thus limiting the understanding of the fermentative metabolism of wine yeast strains and the systematic description, modelling and prediction of fermentation processes. In this study, we implemented and validated the use of chemostat cultures as a tool to simulate different stages of a standard wine fermentation, thereby allowing the implementation of metabolic flux analyses describing the sequence of metabolic states of S. cerevisiae along the wine fermentation. Results Chemostat cultures mimicking the different stages of standard wine fermentations of S. cerevisiae EC1118 were performed using a synthetic must and strict anaerobic conditions. The simulated stages corresponded to the onset of the exponential growth phase, the late exponential growth phase and cells just entering stationary phase, at dilution rates of 0.27, 0.04, and 0.007 h−1, respectively. Notably, measured substrate uptake and product formation rates at each steady state condition were generally within the range of corresponding conversion rates estimated during the different batch fermentation stages. Moreover, chemostat data were further used for metabolic flux analysis, where biomass composition data for each condition were considered in the stoichiometric model. Metabolic flux distributions were coherent with previous analyses based on batch cultivation data and the pseudo-steady state assumption. Conclusions Steady state conditions obtained in chemostat cultures reflect the environmental conditions and physiological states of S. cerevisiae corresponding to the different growth stages of a typical batch wine fermentation, thereby showing the potential of this experimental approach to
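
    A basic property of chemostat cultures (standard theory, not specific to this study) is that at steady state the specific growth rate equals the dilution rate D = F/V, so the three dilution rates used here map directly onto the growth rates of the simulated fermentation stages. A short sketch of that relationship:

```python
# Standard chemostat relationships (textbook theory, not from the paper):
# at steady state, the specific growth rate mu equals the dilution rate D = F/V,
# and the corresponding doubling time is ln(2)/D.
import math

dilution_rates_per_h = [0.27, 0.04, 0.007]   # values used for the three simulated stages

for D in dilution_rates_per_h:
    doubling_time_h = math.log(2) / D        # hours per doubling at steady state
    print(f"D = {D:.3f} 1/h  ->  mu = {D:.3f} 1/h, doubling time ≈ {doubling_time_h:.1f} h")
```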

  11. Therapeutic ultrasound in physical medicine and rehabilitation: characterization and assessment of its physical effects on joint-mimicking phantoms.

    PubMed

    Lioce, Elisa Edi Anna Nadia; Novello, Matteo; Durando, Gianni; Bistolfi, Alessandro; Actis, Maria Vittoria; Massazza, Giuseppe; Magnetto, Chiara; Guiot, Caterina

    2014-11-01

    The aim of the study described here was to quantitatively assess the thermal and mechanical effects of therapeutic ultrasound (US) by sonicating a joint-mimicking phantom, made of muscle-equivalent material, using clinical US equipment. The phantom contains two bone disks simulating a deep joint (treated at 1 MHz) and a superficial joint (3 MHz). Thermal probes were inserted in fixed positions. To test the mechanical (cavitational) effects, we used a latex balloon filled with oxygen-loaded nanobubbles, whose dimensions were determined before and after sonication. Significant temperature increases (up to 17°C for a fixed field with continuous waves) were detected both in front of and behind the bones, depending on the US mode (continuous vs. pulsed wave) and the treatment modality (fixed vs. massage). We found no significant differences in mechanical effects. Although limited by the in vitro design (no blood perfusion, no metabolic compensation), the results can be used to guide operators in their choice of the best US treatment modality for a specific joint.

  12. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  13. Quantitative assessment of fluorescent proteins.

    PubMed

    Cranfill, Paula J; Sell, Brittney R; Baird, Michelle A; Allen, John R; Lavagnino, Zeno; de Gruiter, H Martijn; Kremers, Gert-Jan; Davidson, Michael W; Ustione, Alessandro; Piston, David W

    2016-07-01

    The advent of fluorescent proteins (FPs) for genetic labeling of molecules and cells has revolutionized fluorescence microscopy. Genetic manipulations have created a vast array of bright and stable FPs spanning blue to red spectral regions. Common to autofluorescent FPs is their tight β-barrel structure, which provides the rigidity and chemical environment needed for effectual fluorescence. Despite the common structure, each FP has unique properties. Thus, there is no single 'best' FP for every circumstance, and each FP has advantages and disadvantages. To guide decisions about which FP is right for a given application, we have quantitatively characterized the brightness, photostability, pH stability and monomeric properties of more than 40 FPs to enable straightforward and direct comparison between them. We focus on popular and/or top-performing FPs in each spectral region. PMID:27240257
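
    One of the characterized quantities, brightness, is conventionally taken as the product of the extinction coefficient and the fluorescence quantum yield, often reported relative to EGFP. The sketch below illustrates that convention only; the numeric values are illustrative placeholders, not the paper's measurements.

```python
# Conventional fluorescent-protein brightness: extinction coefficient (EC) x quantum
# yield (QY), reported relative to EGFP. All values below are illustrative placeholders.
fps = {
    #  name:       (EC in 1/(M*cm), QY)
    "EGFP":        (56000, 0.60),
    "exampleFP_1": (80000, 0.50),
    "exampleFP_2": (45000, 0.70),
}

egfp_brightness = fps["EGFP"][0] * fps["EGFP"][1]
for name, (ec, qy) in fps.items():
    rel = ec * qy / egfp_brightness
    print(f"{name}: relative brightness = {rel:.2f}")
```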

  14. Quantitative assessment of DNA condensation.

    PubMed

    Trubetskoy, V S; Slattum, P M; Hagstrom, J E; Wolff, J A; Budker, V G

    1999-02-15

    A fluorescent method is proposed for assessing DNA condensation in aqueous solutions with a variety of condensing agents. The technique is based on the effect of concentration-dependent self-quenching of covalently bound fluorophores upon DNA collapse. The method allows a more precise determination of charge equivalency in titration experiments with various polycations. The technique's ability to determine the number of DNA molecules that are condensed together in close proximity is under further investigation.

  15. Assessment of tissue Doppler imaging measurements of arterial wall motion using a tissue mimicking test rig.

    PubMed

    Thrush, Abigail J; Brewin, Mark P; Birch, Malcolm J

    2008-03-01

    The aim of this in vitro study is to assess the accuracy of the tissue Doppler imaging arterial wall motion (TDI AWM) technique in measuring dilation over a range of distances and velocities. A test rig, consisting of two parallel blocks of tissue mimicking material (TMM), has been developed to generate known wall motion. One block remains stationary while the other moves in a cyclical motion. A calibrated laser range finder was used to measure the TMM motion. The TDI AWM measurements were found to underestimate the dilation by 21% ± 4.7% when using the recommended scanner parameters. The size of the error was found to increase with a decrease in ultrasound output power. Results suggested that errors in the TDI AWM dilation measurements relate to underestimates in the velocity measured by the TDI technique. The error demonstrated in this study indicates a limitation in the value of TDI AWM results obtained in vivo. (E-mail: abigail.thrush@bartsandthelondon.nhs.uk). PMID:17964065

  16. Biologically based, quantitative risk assessment of neurotoxicants.

    PubMed

    Slikker, W; Crump, K S; Andersen, M E; Bellinger, D

    1996-01-01

    The need for biologically based, quantitative risk assessment procedures for noncancer endpoints such as neurotoxicity has been discussed in reports by the United States Congress (Office of Technology Assessment, OTA), the National Research Council (NRC), and a federal coordinating council. According to OTA, current attention and resources allocated to health risk assessment research are inadequate and not commensurate with its impact on public health and the economy. Methods to include continuous rather than dichotomous data for neurotoxicity endpoints, biomarkers of exposure and effects, and pharmacokinetic and mechanistic data have been proposed for neurotoxicity risk assessment but require further review and validation before acceptance. The purpose of this symposium was to examine procedures to enhance the risk assessment process for neurotoxicants and to discuss techniques to make the process more quantitative. Accordingly, a review of the currently used safety factor risk assessment approach for neurotoxicants is provided, along with specific examples of how this process may be enhanced with the use of the benchmark dose approach. The importance of including physiologically based pharmacokinetic data in the risk assessment process is discussed, and specific examples of this approach are presented for neurotoxicants. The roles of biomarkers of exposure and effect and of mechanistic information in the risk assessment process are also addressed. Finally, quantitative approaches using continuous neurotoxicity data are demonstrated and the outcomes compared to those generated by currently used risk assessment procedures. PMID:8838636

  17. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at the soil provides understanding of soil functioning. VSA is often regarded as subjective, however, so there is a need to verify it. Also, many VSAs have not been fine-tuned for contrasting soil types, which could lead to incorrect interpretation of soil quality and soil functioning when contrasting sites are compared with each other. We aimed to assess the accuracy of VSA while taking soil type into account. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7
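
    The validation step amounts to correlating each visual score against its standardized field or laboratory counterpart. A generic sketch of that check follows; the data arrays are placeholders standing in for the 26-farm dataset, and the study's actual analysis may differ in detail.

```python
# Generic validation of a visual soil observation against a laboratory measurement:
# Pearson correlation and its p-value. Data are synthetic placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
lab_som = rng.uniform(2, 12, size=26)                  # e.g. soil organic matter (%)
visual_colour_score = lab_som + rng.normal(0, 2, 26)   # visual colour score as a noisy proxy

r, p = pearsonr(visual_colour_score, lab_som)
print(f"r = {r:.2f}, p = {p:.3f}")
if abs(r) > 0.3 and p < 0.05:
    print("visual observation validated against the laboratory measurement")
```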

  18. Integrated regional assessment: qualitative and quantitative issues

    SciTech Connect

    Malone, Elizabeth L.

    2009-11-19

    Qualitative and quantitative issues are particularly significant in integrated regional assessment. This chapter examines the terms “qualitative” and “quantitative” separately and in relation to one another, along with a discussion of the degree of interdependence or overlap between the two. Strategies for integrating the two general approaches often produce uneasy compromises. However, integrated regional assessment provides opportunities for strong collaborations in addressing specific problems in specific places.

  19. Physiologic basis for understanding quantitative dehydration assessment.

    PubMed

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.

  20. Trends in quantitative cancer risk assessment.

    PubMed Central

    Morris, S C

    1991-01-01

    Quantitative cancer risk assessment is a dynamic field, more closely coupled to rapidly advancing biomedical research than ever before. Six areas of change and growth are identified: expansion from models of cancer initiation to a more complete picture of the total carcinogenic process; trend from curve-fitting to biologically based models; movement from upperbound estimates to best estimates, with a more complete treatment of uncertainty; increased consideration of the role of susceptibility; growing development of expert systems and decision support systems; and emerging importance of risk communication. PMID:2050076

  1. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG], and Liquefied Natural Gas [LNG]) and set priorities for “Version 1” of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  2. A toolbox for rockfall Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

    Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed in recent years for both spatially-distributed and local (i.e. single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in the collection of the required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets, in the framework of the Safeland EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely: QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and of the related fragility curves, both as functions of block velocity and size. The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock
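
    The risk components listed (event probability, impact probability, vulnerability, and the value of the element at risk) combine in the usual single-element quantitative risk equation: expected annual loss is their product, summed over scenarios. The sketch below shows only that generic combination, not the internals of QuRAR, VulBlock or RiskNow-Falling Rocks; all numbers are invented.

```python
# Generic single-element rockfall risk equation (illustration only, not the toolbox code):
# expected annual loss = sum over scenarios of
#   P(rockfall event) * P(block impacts the element) * vulnerability * element value.
scenarios = [
    # (annual event probability, probability of impact, vulnerability in [0, 1])
    (0.10, 0.05, 0.2),   # small, frequent blocks
    (0.01, 0.20, 0.7),   # larger, rarer blocks
]
element_value_eur = 500_000   # invented value of the exposed building

expected_annual_loss = sum(p_event * p_impact * vuln * element_value_eur
                           for p_event, p_impact, vuln in scenarios)
print(f"expected annual loss ≈ {expected_annual_loss:,.0f} EUR/yr")   # 1,200 EUR/yr here
```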

  3. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10 km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs is also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
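
    Read literally, the stated risk measure sums, over the exceedance-probability estimates and over time, the consequence (deviation of LCOE from the best estimate) weighted by the associated probability increment ΔP. One plausible reading of that computation on a discretized complementary CDF is sketched below; the numbers are invented and this is an interpretation, not GT-Mod itself.

```python
# One interpretation of "risk = sum over n, t of C(n,t) * deltaP(n,t)": integrate
# |LCOE - best-estimate LCOE| against the probability increments of the complementary
# cumulative distribution. Numbers are invented; a real analysis would also sum over time.
import numpy as np

lcoe_best = 0.10                                   # $/kWh from best-guess inputs
lcoe_levels = np.array([0.10, 0.12, 0.15, 0.20])   # $/kWh, increasing levels
exceed_prob = np.array([0.90, 0.60, 0.25, 0.05])   # P(LCOE >= level) for one time slice

consequence = np.abs(lcoe_levels - lcoe_best)                 # C(n)
delta_p = np.append(-np.diff(exceed_prob), exceed_prob[-1])   # probability mass per level
risk = float(np.sum(consequence * delta_p))                   # sum over n for this time slice
print(f"risk ≈ {risk:.4f} $/kWh of probability-weighted LCOE deviation")
```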

  4. Quantitative ultrasound assessment of cervical microstructure.

    PubMed

    Feltovich, Helen; Nam, Kibo; Hall, Timothy J

    2010-07-01

    The objective of this preliminary study was to determine whether quantitative ultrasound (QUS) can provide insight into, and characterization of, uterine cervical microstructure. Throughout pregnancy, cervical collagen reorganizes (from aligned and anisotropic to disorganized and isotropic) as the cervix changes in preparation for delivery. Premature changes in collagen are associated with premature birth in mammals. Because QUS is able to detect structural anisotropy/isotropy, we hypothesized that it may provide a means of noninvasively assessing cervical microstructure. Thorough study of cervical microstructure has been limited by a lack of technology to detect small changes in collagen organization, which has in turn limited our ability to detect abnormal and/or premature changes in collagen that may lead to preterm birth. To determine whether QUS may be useful for characterizing cervical microstructure, radiofrequency (rf) echo data were acquired from the cervices of human hysterectomy specimens (n = 10). Anisotropic acoustic propagation was assessed by controlling the transmit/receive angle between the acoustic beam and the tissue from -20 degrees to +20 degrees. The power spectrum of the echo signals from within a region of interest was computed in order to investigate the microstructure of the tissue. An identical analysis was performed on a homogeneous phantom with spherical scatterers for system calibration. Power spectra of backscattered rf from the cervix were 6 dB higher for normal (0-degree) than for steered (±20-degree) beams. The spectral power for steered beams decreased monotonically (0.4 dB at +5 degrees to 3.6 dB at +20 degrees). The excess difference (compared to a similar analysis for the phantom) in normally-incident (0-degree) versus steered beams is consistent with scattering from an aligned component of the cervical microstructure. Therefore, QUS appears to reliably identify an aligned component of cervical microstructure
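
    The core computation described, a power spectrum of backscattered RF from a region of interest normalized against the same analysis on a reference phantom, can be sketched generically as follows (synthetic data, not the authors' processing chain; parameter values are illustrative assumptions).

```python
# Generic calibrated backscatter spectrum: Welch power spectrum of RF echoes inside a
# region of interest, expressed in dB relative to the same analysis on a reference phantom.
import numpy as np
from scipy.signal import welch

fs = 100e6                                    # RF sampling rate (Hz), illustrative
rng = np.random.default_rng(1)
rf_tissue = rng.normal(size=(64, 2048))       # 64 synthetic RF lines from the cervix ROI
rf_reference = rng.normal(size=(64, 2048))    # matching ROI in the calibration phantom

def mean_power_spectrum(rf_lines):
    f, pxx = welch(rf_lines, fs=fs, nperseg=256, axis=-1)
    return f, pxx.mean(axis=0)                # average over the RF lines in the ROI

f, p_tissue = mean_power_spectrum(rf_tissue)
_, p_ref = mean_power_spectrum(rf_reference)
spectrum_db = 10 * np.log10(p_tissue / p_ref) # calibrated spectrum; compare across beam angles
```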

  5. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  6. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be “trustable.” Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more “evidence dependent” than “personality dependent.”
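
    As a concrete, minimal example of the evidence-driven updating the author advocates (an illustration, not taken from the paper), a Bayesian update of a per-demand failure probability can be written with a conjugate beta prior:

```python
# Minimal Bayes' theorem example in a QRA setting (illustrative only):
# with a Beta(a, b) prior on a per-demand failure probability and k failures observed
# in n demands, the posterior is Beta(a + k, b + n - k).
a_prior, b_prior = 1.0, 99.0        # prior belief: failure probability around 1%
k_failures, n_demands = 2, 500      # observed evidence

a_post = a_prior + k_failures
b_post = b_prior + (n_demands - k_failures)
posterior_mean = a_post / (a_post + b_post)
print(f"posterior mean failure probability ≈ {posterior_mean:.4f}")  # ≈ 0.0050
```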

  7. Assessing Quantitative Reasoning in Young Children

    ERIC Educational Resources Information Center

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Barros, Rossana

    2015-01-01

    Before starting school, many children reason logically about concepts that are basic to their later mathematical learning. We describe a measure of quantitative reasoning that was administered to children at school entry (mean age 5.8 years) and accounted for more variance in a mathematical attainment test than general cognitive ability 16 months…

  8. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  9. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  10. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  11. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past, relatively high asbestos concentration levels down to the usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), and 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require the participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk.

  12. Quantitative performance assessments for neuromagnetic imaging systems.

    PubMed

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte-Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: A-prime metric and spatial resolution. We compute these performance metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how the system performance is improved, depending on the number of sensors. We also compute these metrics for existing whole-head MEG systems, MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan) that uses axial-gradiometer sensors, and TRIUX™ (Elekta Corporate, Stockholm, Sweden) that uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems. PMID:24110711
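
    For reference, the A-prime metric is commonly defined in signal-detection theory from a hit rate H and false-alarm rate F. The paper may use a variant, so treat the formula below as the standard textbook form rather than the authors' exact definition.

```python
# Standard signal-detection A' (A-prime) from hit rate H and false-alarm rate F
# (Pollack & Norman form; the branch handles below-chance performance symmetrically).
# This is the textbook definition; the paper's exact variant may differ.
def a_prime(hit_rate, false_alarm_rate):
    H, F = hit_rate, false_alarm_rate
    if H < F:                      # below-chance performance: use the symmetric form
        H, F = F, H
        return 1.0 - (0.5 + ((H - F) * (1 + H - F)) / (4 * H * (1 - F)))
    return 0.5 + ((H - F) * (1 + H - F)) / (4 * H * (1 - F))

print(a_prime(0.9, 0.1))   # ≈ 0.94 for a well-performing virtual sensor system
```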

  13. Quantitative assessment of protein function prediction programs.

    PubMed

    Rodrigues, B N; Steffens, M B R; Raittz, R T; Santos-Weiss, I C R; Marchaukoski, J N

    2015-12-21

    Fast prediction of protein function is essential for high-throughput sequencing analysis. Bioinformatic resources provide cheaper and faster techniques for function prediction and have helped to accelerate the process of protein sequence characterization. In this study, we assessed protein function prediction programs that accept amino acid sequences as input. We analyzed the classification, equality, and similarity between programs, and, additionally, compared program performance. The following programs were selected for our assessment: Blast2GO, InterProScan, PANTHER, Pfam, and ScanProsite. This selection was based on the high number of citations (over 500), fully automatic analysis, and the possibility of returning a single best classification per sequence. We tested these programs using 12 gold standard datasets from four different sources. The gold standard classification of the databases was based on expert analysis, the Protein Data Bank, or the Structure-Function Linkage Database. We found that the miss rate among the programs is globally over 50%. Furthermore, we observed little overlap in the correct predictions from each program. Therefore, a combination of multiple types of sources and methods, including experimental data, protein-protein interaction, and data mining, may be the best way to generate more reliable predictions and decrease the miss rate. PMID:26782400

  14. Quantitative estimation in Health Impact Assessment: Opportunities and challenges

    SciTech Connect

    Bhatia, Rajiv; Seto, Edmund

    2011-04-15

    Health Impact Assessment (HIA) considers multiple effects on health of policies, programs, plans and projects and thus requires the use of diverse analytic tools and sources of evidence. Quantitative estimation has desirable properties for the purpose of HIA, but adequate tools for quantification currently exist for only a limited number of health impacts and decision settings; furthermore, quantitative estimation generates thorny questions about the precision of estimates and the validity of methodological assumptions. In the United States, HIA has only recently emerged as an independent practice apart from integrated EIA, and this article aims to synthesize the experience with quantitative health effects estimation within that practice. We use examples identified through a scan of instances of quantitative estimation in the U.S. practice experience to illustrate methods applied in different policy settings, along with their strengths and limitations. We then discuss opportunity areas and practical considerations for the use of quantitative estimation in HIA.

  15. Sensitive Quantitative Assessment of Balance Disorders

    NASA Technical Reports Server (NTRS)

    Paloski, William H.

    2007-01-01

    Computerized dynamic posturography (CDP) has become a standard technique for objectively quantifying balance control performance, diagnosing the nature of functional impairments underlying balance disorders, and monitoring clinical treatment outcomes. We have long used CDP protocols to assess recovery of sensory-motor function in astronauts following space flight. The most reliable indicators of post-flight crew performance are the sensory organization tests (SOTs), particularly SOTs 5 and 6, which are sensitive to changes in the availability and/or utilization of vestibular cues. We have noted, however, that some astronauts exhibiting obvious signs of balance impairment after flight are able to score within clinical norms on these tests, perhaps as a result of adopting competitive strategies or of their natural skill at substituting alternate sensory information sources. This insensitivity of the CDP protocol could lead to underestimation of the degree of impairment and, perhaps, to premature release of those crewmembers to normal duties. To improve the sensitivity of the CDP protocol we have introduced static and dynamic head-tilt SOT trials into our protocol. The pattern of postflight recovery quantified by the enhanced CDP protocol appears to more aptly track the re-integration of sensory-motor function, with recovery time increasing as the complexity of the sensory-motor/biomechanical task increases. The new CDP protocol therefore seems more suitable for monitoring post-flight sensory-motor recovery and for indicating to crewmembers and flight surgeons fitness for return to duty and/or activities of daily living. There may be classes of patients (e.g., athletes, pilots) having motivation and/or performance characteristics similar to astronauts whose sensory-motor treatment outcomes would also be more accurately monitored using the enhanced CDP protocol. Furthermore, the enhanced protocol may be useful in early detection of age-related balance disorders.

  16. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a semi-qualitative regime, to the more traditional quantitative approaches. Constraints such as time, money, manpower, skills, management perceptions, risk result communication to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems for each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.
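
    A purely qualitative risk matrix of the kind surveyed can be captured in a few lines: ordinal frequency and consequence categories index into a ranking table. The sketch below is a generic illustration, not any specific technique from the paper; the categories and rankings are invented.

```python
# Generic qualitative risk matrix: ordinal frequency and consequence categories
# index into a risk ranking. Categories and rankings are illustrative only.
FREQUENCY = ["improbable", "remote", "occasional", "probable", "frequent"]
CONSEQUENCE = ["negligible", "marginal", "critical", "catastrophic"]

# Rows: frequency (low -> high); columns: consequence (low -> high).
MATRIX = [
    ["low",    "low",    "medium",  "medium"],
    ["low",    "medium", "medium",  "high"],
    ["low",    "medium", "high",    "high"],
    ["medium", "medium", "high",    "extreme"],
    ["medium", "high",   "extreme", "extreme"],
]

def risk_rank(frequency, consequence):
    return MATRIX[FREQUENCY.index(frequency)][CONSEQUENCE.index(consequence)]

print(risk_rank("occasional", "critical"))   # -> "high"
```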

  17. Acoustic Assessment of a Konjac–Carrageenan Tissue-Mimicking Material at 5–60 MHz

    PubMed Central

    Kenwright, David A.; Sadhoo, Neelaksh; Rajagopal, Srinath; Anderson, Tom; Moran, Carmel M.; Hadoke, Patrick W.; Gray, Gillian A.; Zeqiri, Bajram; Hoskins, Peter R.

    2014-01-01

    The acoustic properties of a robust tissue-mimicking material based on konjac–carrageenan at ultrasound frequencies in the range 5–60 MHz are described. Acoustic properties were characterized using two methods: a broadband reflection substitution technique using a commercially available preclinical ultrasound scanner (Vevo 770, FUJIFILM VisualSonics, Toronto, ON, Canada), and a dedicated high-frequency ultrasound facility developed at the National Physical Laboratory (NPL, Teddington, UK), which employed a broadband through-transmission substitution technique. The mean speed of sound across the measured frequencies was found to be 1551.7 ± 12.7 and 1547.7 ± 3.3 m s−1, respectively. The attenuation exhibited a non-linear dependence on frequency, f (MHz), in the form of a polynomial function: 0.009787f² + 0.2671f and 0.01024f² + 0.3639f, respectively. The characterization of this tissue-mimicking material will provide reference data for designing phantoms for preclinical systems, which may, in certain applications such as flow phantoms, require a physically more robust tissue-mimicking material than is currently available. PMID:25438864
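
    The two reported attenuation fits can be evaluated directly; the short sketch below simply tabulates them across the measured 5–60 MHz range (the abstract does not restate the attenuation units, so none are asserted here).

```python
# Evaluate the two reported attenuation-vs-frequency fits over 5-60 MHz.
# Units follow the paper's reporting; the abstract does not restate them.
def atten_vevo(f_mhz):      # fit from the Vevo 770 reflection measurements
    return 0.009787 * f_mhz**2 + 0.2671 * f_mhz

def atten_npl(f_mhz):       # fit from the NPL through-transmission measurements
    return 0.01024 * f_mhz**2 + 0.3639 * f_mhz

for f in (5, 20, 40, 60):
    print(f"{f:2d} MHz: {atten_vevo(f):6.2f} (Vevo)  {atten_npl(f):6.2f} (NPL)")
```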

  18. Quantitative wearable sensors for objective assessment of Parkinson's disease.

    PubMed

    Maetzler, Walter; Domingos, Josefa; Srulijes, Karin; Ferreira, Joaquim J; Bloem, Bastiaan R

    2013-10-01

    There is a rapidly growing interest in the quantitative assessment of Parkinson's disease (PD)-associated signs and disability using wearable technology. Both persons with PD and their clinicians see advantages in such developments. Specifically, quantitative assessments using wearable technology may allow for continuous, unobtrusive, objective, and ecologically valid data collection. Also, this approach may improve patient-doctor interaction, influence therapeutic decisions, and ultimately ameliorate patients' global health status. In addition, such measures have the potential to be used as outcome parameters in clinical trials, allowing for frequent assessments; e.g., in the home setting. This review discusses promising wearable technology, addresses which parameters should be prioritized in such assessment strategies, and reports about studies that have already investigated daily life issues in PD using this new technology.

  1. Assessing digital and quantitative EEG in clinical settings.

    PubMed

    Nuwer, M R

    1998-11-01

    Assessment of clinical utility involves a series of steps based primarily on published peer-reviewed medical literature. Relevant publications usually use the scientific method, appropriate control groups, blinded reading, prospective design, and other study elements. Assessments are more credible when conducted by those who do not have a conflict of interest in the technique. A detailed assessment of digital and quantitative EEG was conducted recently by the American Academy of Neurology. The American Clinical Neurophysiology Society was a joint sponsor. This assessment concluded that digital EEG is an excellent substitute for paper EEG. It also found quantitative techniques helpful in epilepsy monitoring, seizure detection, and operating room/intensive care unit trend monitoring. Several other applications were considered promising, whereas some applications were considered not ready for clinical use. Substantial problems still plague the field, predisposing to false-positive results.

  2. Quantitative Assessment of a Senge Learning Organization Intervention

    ERIC Educational Resources Information Center

    Kiedrowski, P. Jay

    2006-01-01

    Purpose: To quantitatively assess a Senge learning organization (LO) intervention to determine if it would result in improved employee satisfaction. Design/methodology/approach: A Senge LO intervention in Division 123 of Company ABC was undertaken in 2000. Three employee surveys using Likert-scale questions over five years and correlation analysis…

  3. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived from regression equations showing the dependence of visual assessment on the magnitude of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment. PMID:27367198

  4. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both the economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling these errors because, given the dominance of tonnage, deposit models are the best known predictors of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can be considered training tracts. Cover has a profound effect on uncertainty and on the methods and procedures of assessments because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed on the surface; these will need to be relearned based on covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral

  5. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    Given that stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived from an analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process (AHP). The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the representative contaminant hazards depended greatly on the chosen research emphasis. There were also differences between the ranking of the three representative contaminants' hazards and the rankings of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculated results. In addition, using rank ordering to normalize the three properties and unify the quantified property results can amplify or attenuate the relative property characteristics of different representative contaminants.
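
    The analytic hierarchy process step mentioned can be illustrated generically: pairwise comparison judgements among the contaminant properties are turned into weights, here via the geometric-mean approximation of the principal eigenvector. The comparison values and property names below are hypothetical placeholders, not the study's.

```python
# Generic AHP weighting of three contaminant properties via the geometric-mean
# approximation of the principal eigenvector. The pairwise judgements and the
# property names are hypothetical placeholders, not taken from the study.
import numpy as np

properties = ["toxicity", "mobility", "degradability"]   # placeholder names
# Pairwise comparison matrix: entry [i, j] = importance of property i relative to j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

geo_means = A.prod(axis=1) ** (1 / A.shape[1])
weights = geo_means / geo_means.sum()
for name, w in zip(properties, weights):
    print(f"{name}: weight = {w:.2f}")
```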

  6. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by improvements in computer hardware and software capability and by novel computational approaches slowly being recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals and have promoted better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches described in the guidance documents from several regulatory agencies as they pertain to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  8. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This assessment will help determine what useful structural quantitative and qualitative data may be provided from raw materials to vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information and are presented along with a description of those structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations are provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end item structure, and during refurbishment operations.

  9. Quantitative computed tomography for spinal mineral assessment: current status

    NASA Technical Reports Server (NTRS)

    Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U.; Arnaud, C. D.

    1985-01-01

    Quantitative CT (QCT) is an established method for the noninvasive assessment of bone mineral content in the vertebral spongiosum and other anatomic locations. The potential strengths of QCT relative to dual photon absorptiometry (DPA) are its capability for precise three-dimensional anatomic localization providing a direct density measurement and its capability for spatial separation of highly responsive cancellous bone from less responsive cortical bone. The extraction of this quantitative information from the CT image, however, requires sophisticated calibration and positioning techniques and careful technical monitoring.

  10. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  11. Sites of superoxide and hydrogen peroxide production by muscle mitochondria assessed ex vivo under conditions mimicking rest and exercise.

    PubMed

    Goncalves, Renata L S; Quinlan, Casey L; Perevoshchikova, Irina V; Hey-Mogensen, Martin; Brand, Martin D

    2015-01-01

    The sites and rates of mitochondrial production of superoxide and H2O2 in vivo are not yet defined. At least 10 different mitochondrial sites can generate these species. Each site has a different maximum capacity (e.g. the outer quinol site in complex III (site IIIQo) has a very high capacity in rat skeletal muscle mitochondria, whereas the flavin site in complex I (site IF) has a very low capacity). The maximum capacities can greatly exceed the actual rates observed in the absence of electron transport chain inhibitors, so maximum capacities are a poor guide to actual rates. Here, we use new approaches to measure the rates at which different mitochondrial sites produce superoxide/H2O2 using isolated muscle mitochondria incubated in media mimicking the cytoplasmic substrate and effector mix of skeletal muscle during rest and exercise. We find that four or five sites dominate during rest in this ex vivo system. Remarkably, the quinol site in complex I (site IQ) and the flavin site in complex II (site IIF) each account for about a quarter of the total measured rate of H2O2 production. Site IF, site IIIQo, and perhaps site EF in the β-oxidation pathway account for most of the remainder. Under conditions mimicking mild and intense aerobic exercise, total production is much less, and the low capacity site IF dominates. These results give novel insights into which mitochondrial sites may produce superoxide/H2O2 in vivo. PMID:25389297

  12. Elastic properties of soft tissue-mimicking phantoms assessed by combined use of laser ultrasonics and low coherence interferometry.

    PubMed

    Li, Chunhui; Huang, Zhihong; Wang, Ruikang K

    2011-05-23

    Advances in the field of laser ultrasonics have opened up new possibilities in medical applications. This paper evaluates this technique as a method that would allow for rapid characterization of the elastic properties of soft biological tissue. In doing so, we propose a novel approach that utilizes a low coherence interferometer to detect the laser-induced surface acoustic waves (SAW) from the tissue-mimicking phantoms. A Nd:YAG focused laser line-source is applied to one- and two-layer tissue-mimicking agar-agar phantoms, and the generated SAW signals are detected by a time domain low coherence interferometry system. SAW phase velocity dispersion curves are calculated, from which the elasticity of the specimens is evaluated. We show that the experimental results agree well with those of the theoretical expectations. This study is the first report that a laser-generated SAW phase velocity dispersion technique is applied to soft materials. This technique may open a way for laser ultrasonics to detect the mechanical properties of soft tissues, such as skin.
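
    Converting a measured SAW phase velocity into an elastic modulus requires a model of the surface wave. As a hedged sketch (not necessarily the dispersion analysis used by the authors), the snippet below applies the approximate Viktorov relation for a homogeneous, isotropic half-space; the Poisson's ratio and density values are typical assumptions for soft, nearly incompressible phantoms.

    ```python
    import numpy as np

    def young_modulus_from_saw(c_r, poisson=0.47, density=1030.0):
        """Estimate Young's modulus (Pa) from a surface wave phase velocity c_r (m/s):
        c_R ~= c_T * (0.87 + 1.12*nu) / (1 + nu),  c_T = sqrt(E / (2*rho*(1 + nu)))."""
        c_t = c_r * (1.0 + poisson) / (0.87 + 1.12 * poisson)   # shear wave speed
        return 2.0 * density * (1.0 + poisson) * c_t ** 2

    # A phase velocity of a few m/s corresponds to a modulus of a few tens of kPa.
    print(f"{young_modulus_from_saw(3.0) / 1e3:.1f} kPa")
    ```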

  13. The quantitative assessment of normal canine small intestinal mucosa.

    PubMed

    Hart, I R; Kidder, D E

    1978-09-01

    Quantitative methods of assessing the architecture of small intestinal mucosa have been applied to biopsy material from normal dogs. Mucosal samples taken from four predetermined sites show that there are significant quantitative differences between the various levels of the small bowel. Animals of one year of age and older show no correlation between age or weight and mucosal dimensions. The significance of these findings, in relation to examination of biopsy material from cases of clinical small intestinal disease, is discussed. PMID:364574

  14. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment. PMID:12505908

  15. Quantitative study designs used in quality improvement and assessment.

    PubMed

    Ormes, W S; Brim, M B; Coggan, P

    2001-01-01

    This article describes common quantitative design techniques that can be used to collect and analyze quality data. An understanding of the differences between these design techniques can help healthcare quality professionals make the most efficient use of their time, energies, and resources. To evaluate the advantages and disadvantages of these various study designs, it is necessary to assess factors that threaten the degree to which quality professionals may infer a cause-and-effect relationship from the data collected. Processes, the conduits of organizational function, often can be assessed by methods that do not take into account confounding and compromising circumstances that affect the outcomes of their analyses. An assumption that the implementation of process improvements may cause real change is incomplete without a consideration of other factors that might also have caused the same result. It is only through the identification, assessment, and exclusion of these alternative factors that administrators and healthcare quality professionals can assess the degree to which true process improvement or compliance has occurred. This article describes the advantages and disadvantages of common quantitative design techniques and reviews the corresponding threats to the interpretability of data obtained from their use. PMID:11378972

  16. Binary Imaging Analysis for Comprehensive Quantitative Assessment of Peripheral Nerve

    PubMed Central

    Hunter, Daniel A.; Moradzadeh, Arash; Whitlock, Elizabeth L.; Brenner, Michael J.; Myckatyn, Terence M.; Wei, Cindy H.; Tung, Thomas H.H.; Mackinnon, Susan E.

    2007-01-01

    Quantitative histomorphometry is the current gold standard for objective measurement of nerve architecture and its components. Many methods still in use rely heavily upon manual techniques that are prohibitively time consuming, predisposing to operator fatigue, sampling error, and overall limited reproducibility. More recently, investigators have attempted to combine the speed of automated morphometry with the accuracy of manual and semi-automated methods. Systematic refinements in binary imaging analysis techniques combined with an algorithmic approach allow for more exhaustive characterization of nerve parameters in the surgically relevant injury paradigms of regeneration following crush, transection, and nerve gap injuries. The binary imaging method introduced here uses multiple bitplanes to achieve reproducible, high throughput quantitative assessment of peripheral nerve. Number of myelinated axons, myelinated fiber diameter, myelin thickness, fiber distributions, myelinated fiber density, and neural debris can be quantitatively evaluated with stratification of raw data by nerve component. Results of this semi-automated method are validated by comparing values against those obtained with manual techniques. The use of this approach results in more rapid, accurate, and complete assessment of myelinated axons than manual techniques. PMID:17675163
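
    As a hedged, minimal sketch of binary (bitplane-style) morphometry, the code below thresholds a grayscale cross-section, labels connected components, and reports axon count, mean equivalent diameter, and fiber density. The threshold, pixel size, and debris cutoff are illustrative parameters, not the authors' protocol, and a real pipeline would separate myelin and axoplasm bitplanes and handle touching fibers.

    ```python
    import numpy as np
    from scipy import ndimage

    def axon_morphometry(image, threshold, pixel_size_um=0.125, min_area_px=20):
        """Count bright blobs as candidate axons and summarize their geometry."""
        binary = image > threshold                      # one "bitplane" mask
        labels, count = ndimage.label(binary)           # connected components
        areas_px = (ndimage.sum(binary, labels, index=np.arange(1, count + 1))
                    if count else np.array([]))
        areas_px = areas_px[areas_px >= min_area_px]    # drop debris below size cutoff
        if areas_px.size == 0:
            return 0, float("nan"), 0.0
        diam_um = 2.0 * np.sqrt(areas_px * pixel_size_um ** 2 / np.pi)
        field_mm2 = image.size * (pixel_size_um * 1e-3) ** 2
        return len(areas_px), float(diam_um.mean()), len(areas_px) / field_mm2

    # Synthetic check: two disks of radius 15 px and 10 px on a 256 x 256 field.
    yy, xx = np.mgrid[0:256, 0:256]
    demo = (((xx - 64) ** 2 + (yy - 64) ** 2 < 15 ** 2) |
            ((xx - 180) ** 2 + (yy - 200) ** 2 < 10 ** 2)).astype(float)
    print(axon_morphometry(demo, threshold=0.5))
    ```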

  17. Quantitative assessment of regional right ventricular function with color kinesis.

    PubMed

    Vignon, P; Weinert, L; Mor-Avi, V; Spencer, K T; Bednarz, J; Lang, R M

    1999-06-01

    We used color kinesis, a recent echocardiographic technique that provides regional information on the magnitude and timing of endocardial wall motion, to quantitatively assess regional right ventricular (RV) systolic and diastolic properties in 76 subjects who were divided into five groups, as follows: normal (n = 20), heart failure (n = 15), pressure/volume overload (n = 14), pressure overload (n = 12), and RV hypertrophy (n = 15). Quantitative segmental analysis of color kinesis images was used to obtain regional fractional area change (RFAC), which was displayed in the form of stacked histograms to determine patterns of endocardial wall motion. Time curves of integrated RFAC were used to objectively identify asynchrony of diastolic endocardial motion. When compared with normal subjects, patients with pressure overload or heart failure exhibited significantly decreased endocardial motion along the RV free wall. In the presence of mixed pressure/volume overload, the markedly increased ventricular septal motion compensated for decreased RV free wall motion. Diastolic endocardial wall motion was delayed in 17 of 72 segments (24%) in patients with RV pressure overload, and in 31 of 90 segments (34%) in patients with RV hypertrophy. Asynchrony of diastolic endocardial wall motion was greater in the latter group than in normal subjects (16% versus 10%: p < 0.01). Segmental analysis of color kinesis images allows quantitative assessment of regional RV systolic and diastolic properties.
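
    The regional quantity displayed in the stacked histograms is the fractional area change of each segment. A minimal sketch, assuming the standard fractional-area-change definition RFAC = (A_ED - A_ES) / A_ED and purely hypothetical six-segment values, is given below; it is not the authors' segmentation scheme.

    ```python
    import numpy as np

    def regional_fractional_area_change(area_ed, area_es):
        """RFAC (%) per segment from end-diastolic and end-systolic segmental areas."""
        area_ed = np.asarray(area_ed, dtype=float)
        area_es = np.asarray(area_es, dtype=float)
        return (area_ed - area_es) / area_ed * 100.0

    # Hypothetical six-segment right-ventricular example (areas in cm^2):
    ed = [4.1, 3.8, 3.5, 3.2, 2.9, 2.6]
    es = [2.6, 2.5, 2.6, 2.5, 2.3, 2.2]
    print(np.round(regional_fractional_area_change(ed, es), 1))
    ```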

  18. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (Flow = 0.5, 1, 2, 3 ml/g/min, cardiac output = 3, 5, 8 L/min). CT acquisitions were simulated with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels for dynamic acquisitions with a range of scenarios including 1, 2, 3 sec sampling for 30 sec with 25, 70, 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated and multiple MBF estimation methods were applied to each of these. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE = ~1.2 and ~1.2 ml/min/g respectively). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had comparable impact on the MBF estimate fidelity. On average, half dose acquisitions increased the RMSE of estimates by only 18% suggesting that substantial dose reductions can be employed in the context of quantitative myocardial
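
    As a hedged sketch of what quantitative kinetic modeling of a time-attenuation curve can look like (one common single-compartment formulation, not necessarily the estimators evaluated in the paper), the code below simulates a noisy dynamic acquisition with 2 s sampling from an assumed gamma-variate arterial input function and refits the flow; the input function, noise level, and parameter values are all illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def tissue_curve(t, flow_ml_per_g_min, vd_ml_per_g, t_aif, aif):
        """C_t(t) = F * int_0^t AIF(tau) * exp(-(F/Vd)*(t - tau)) dtau (discrete convolution)."""
        f_per_s = flow_ml_per_g_min / 60.0
        dt = t_aif[1] - t_aif[0]
        kernel = np.exp(-(f_per_s / vd_ml_per_g) * t_aif)
        conv = f_per_s * np.convolve(aif, kernel)[: len(t_aif)] * dt
        return np.interp(t, t_aif, conv)

    t_fine = np.arange(0, 30, 0.1)                        # s
    aif = 400.0 * (t_fine / 6.0) * np.exp(-t_fine / 6.0)  # illustrative arterial input (HU)
    true_flow, true_vd = 2.0, 0.15                        # ml/g/min, ml/g
    t_samp = np.arange(0, 30, 2.0)                        # 2 s sampling for 30 s
    clean = tissue_curve(t_samp, true_flow, true_vd, t_fine, aif)
    noisy = clean + np.random.default_rng(1).normal(0.0, 2.0, clean.shape)

    popt, _ = curve_fit(lambda t, f, vd: tissue_curve(t, f, vd, t_fine, aif),
                        t_samp, noisy, p0=(1.0, 0.2), bounds=([0.1, 0.01], [5.0, 1.0]))
    print(f"estimated MBF = {popt[0]:.2f} ml/g/min (true {true_flow})")
    ```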

  19. Quantitative objective assessment of peripheral nociceptive C fibre function.

    PubMed Central

    Parkhouse, N; Le Quesne, P M

    1988-01-01

    A technique is described for the quantitative assessment of peripheral nociceptive C fibre function by measurement of the axon reflex flare. Acetylcholine, introduced by electrophoresis, is used to stimulate a ring of nociceptive C fibre endings at the centre of which the increase in blood flow is measured with a laser Doppler flowmeter. This flare (neurogenic vasodilatation) has been compared with mechanically or chemically stimulated non-neurogenic cutaneous vasodilation. The flare is abolished by local anaesthetic and is absent in denervated skin. The flare has been measured on the sole of the foot of 96 healthy subjects; its size decreases with age in males, but not in females. Images PMID:3351528

  20. Short Course Introduction to Quantitative Mineral Resource Assessments

    USGS Publications Warehouse

    Singer, Donald A.

    2007-01-01

    This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems so that consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers evaluate the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or similar decision-maker. Some of the tools and models presented here will be useful for selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments where general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of tonnages and grades of well-explored deposits serve as models of grades and tonnages of undiscovered deposits, and number of undiscovered deposits are estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: The kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral
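
    The arithmetic behind combining the three parts is a Monte Carlo exercise: draw a number of undiscovered deposits, then draw a tonnage and a grade for each from the grade-and-tonnage model. The sketch below illustrates that combination only; the probability mass function and the lognormal parameters are invented for the example and are not from any USGS deposit model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    n_deposits_pmf = {0: 0.2, 1: 0.4, 2: 0.25, 3: 0.1, 5: 0.05}   # elicited estimate (illustrative)
    tonnage_median_t, tonnage_gsd = 5.0e6, 3.0                     # lognormal ore tonnage (t)
    grade_median_pct, grade_gsd = 0.6, 1.8                         # lognormal Cu grade (%)

    totals = []
    for _ in range(20000):
        n = rng.choice(list(n_deposits_pmf), p=list(n_deposits_pmf.values()))
        tons = rng.lognormal(np.log(tonnage_median_t), np.log(tonnage_gsd), n)
        grade = rng.lognormal(np.log(grade_median_pct), np.log(grade_gsd), n)
        totals.append(np.sum(tons * grade / 100.0))                # contained metal (t)

    # P90/P50/P10: amounts exceeded with 90%, 50%, and 10% probability.
    print(np.round(np.percentile(totals, [10, 50, 90])))
    ```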

  1. [Quantitative assessment of facial palsy by Moiré topography].

    PubMed

    Inokuchi, I

    1992-05-01

    It is essential to establish an objective and quantitative method for evaluating facial palsy and to measure the extent of paralysis in order to evaluate therapeutic efficacy, determine prognosis, select appropriate treatment and observe the process of recovery. This study utilized Moiré topography, which displays three-dimensional facial symmetry with high precision and is based on light interference theory, to determine the extent of facial palsy in 38 patients (20 men and 18 women) 5 months to 73 years of age. A stereoscopic lattice type Moiré camera (FM3013) was connected to a CCD camera and to the monitoring device for confirming Moiré stripes. Moiré photographs were taken with a thermal imager (FTI-200). The photos were visually and objectively evaluated on the basis of the Moiré pattern and were then input into a personal computer with a digitizer for data processing and analysis. To view the functions of facial nerve branches, five Moiré photographs were taken: at rest, wrinkling the forehead, closing the eyes lightly, blowing out the cheeks and grinning. Results indicated that the number of stripes and their polarization adequately reflected the function of individual facial nerve branches. Thus, a well-defined Moiré pattern could clarify the characteristics of the site and the degree of facial palsy and of recovery from paralysis. It is an analytical method that can be quickly applied and seems especially useful in infants and young children, in whom point-based assessment is difficult. It is possible to quantitatively evaluate facial palsy in terms of the Asymmetry Index (AI), which is 20-25% for severe paralysis, 12-19% for partial paralysis, and 5-10% for an essentially normal condition. However, the numerical values of the AI overlap across the three paralysis categories, indicating that categorizing the severity of paralysis from the AI alone can be difficult. Moiré topography is an excellent method of determining the extent of facial palsy, compensating for the

  2. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, which is designed to find out what the problems are and comprises three distinct steps, namely risk identification, risk estimation and risk evaluation; risk management is not covered in this paper, and a fourth step should address the need for feedback and for post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are specified as follows: 1. Risk identification: This step involves drought quantification and monitoring based on remotely sensed RDI and extraction of several features such as severity, duration, areal extent, onset and end time. Moreover, it involves a drought early warning system based on the above parameters. 2. Risk estimation: This step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: This step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of these three-step drought assessment processes are considered quite satisfactory in a drought-prone region such as Thessaly in central
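
    The RDI named above is, in its initial form, the ratio of accumulated precipitation to accumulated potential evapotranspiration over a reference period, which is then standardized. A minimal sketch of that calculation on a synthetic 20-year record is given below; the gamma/sinusoidal climatology is invented for illustration, and the satellite-derived inputs of the study are not reproduced.

    ```python
    import numpy as np

    def reconnaissance_drought_index(precip, pet):
        """Standardized RDI per year: alpha_k = sum(P)/sum(PET) over the reference
        period (a full hydrological year here); ln(alpha) is assumed ~normal."""
        alpha = precip.sum(axis=1) / pet.sum(axis=1)
        y = np.log(alpha)
        return (y - y.mean()) / y.std(ddof=1)     # negative values flag drought years

    rng = np.random.default_rng(3)
    precip = rng.gamma(shape=2.0, scale=25.0, size=(20, 12))     # mm/month, 20 years
    pet = 60.0 + 40.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 12)) + rng.normal(0, 5, (20, 12))
    print(np.round(reconnaissance_drought_index(precip, pet), 2))
    ```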

  3. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  4. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances for process equipment on the final risk results was also investigated. The main factors influencing the final risk values were found to be the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than one order of magnitude higher than that associated with internal failure causes. Critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to containment loss. Failure of minor process equipment having a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes.

  5. Quantitative Elastography for Cervical Stiffness Assessment during Pregnancy

    PubMed Central

    Fruscalzo, A.; Londero, A. P.; Fröhlich, C.; Möllmann, U.; Schmitz, R.

    2014-01-01

    Aim. Feasibility and reliability of tissue Doppler imaging (TDI)-based elastography for cervical quantitative stiffness assessment during all three trimesters of pregnancy were evaluated. Materials and Methods. Prospective case-control study including seventy-four patients collected between the 12th and 42nd weeks of gestation. The tissue strain (TS) was measured by two independent operators as natural strain. Intra- and interoperator intraclass correlation coefficient (ICC) agreements were evaluated. Results. TS measurement was always feasible and exhibited a high performance in terms of reliability (intraoperator ICC agreement = 0.93; interoperator ICC agreement = 0.89 and 0.93 for a single measurement and for the average of two measurements, resp.). Cervical TS showed also a significant correlation with gestational age, cervical length, and parity. Conclusions. TS measurement during pregnancy demonstrated high feasibility and reliability. Furthermore, TS significantly correlated with gestational age, cervical length, and parity. PMID:24734246

  6. Assessing the Reliability of Quantitative Imaging of Sm-153

    NASA Astrophysics Data System (ADS)

    Poh, Zijie; Dagan, Maáyan; Veldman, Jeanette; Trees, Brad

    2013-03-01

    Samarium-153 is used for palliation of bone metastases and has recently been investigated for therapy. Patient-specific dosing of Sm-153 is based on quantitative single-photon emission computed tomography (SPECT) and on knowing the accuracy and precision of image-based estimates of the in vivo activity distribution. Physical phantom studies are useful for estimating these in simple objects, but do not model realistic activity distributions. We are using realistic Monte Carlo simulations combined with a realistic digital phantom modeling human anatomy to assess the accuracy and precision of Sm-153 SPECT. Preliminary data indicate that we can simulate projection images and reconstruct them with compensation for various physical image-degrading factors, such as attenuation and scatter in the body as well as non-idealities in the imaging system, to provide realistic SPECT images.

  7. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for the security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and it is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and effectiveness of protection mechanisms, and the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, which is nowadays an important issue to be addressed by risk analysts.

  8. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  9. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  10. Quantitative risk assessment modeling for nonhomogeneous urban road tunnels.

    PubMed

    Meng, Qiang; Qu, Xiaobo; Wang, Xinchang; Yuanita, Vivi; Wong, Siew Chee

    2011-03-01

    Urban road tunnels provide an increasingly cost-effective engineering solution, especially in compact cities like Singapore. For some urban road tunnels, tunnel characteristics such as tunnel configurations, geometries, provisions of tunnel electrical and mechanical systems, traffic volumes, etc. may vary from one section to another. Urban road tunnels with such nonuniform characteristics are referred to as nonhomogeneous urban road tunnels. In this study, a novel quantitative risk assessment (QRA) model is proposed for nonhomogeneous urban road tunnels because the existing QRA models for road tunnels are inapplicable to assess the risks in these road tunnels. This model uses a tunnel segmentation principle whereby a nonhomogeneous urban road tunnel is divided into various homogeneous sections. Individual risk for road tunnel sections as well as integrated risk indices for the entire road tunnel are defined. The article then proceeds to develop a new QRA model for each of the homogeneous sections. Compared to the existing QRA models for road tunnels, this section-based model incorporates one additional top event (toxic gases due to traffic congestion) and employs the Poisson regression method to estimate the vehicle accident frequencies of tunnel sections. This article further illustrates an aggregated QRA model for nonhomogeneous urban tunnels by integrating the section-based QRA models. Finally, a case study in Singapore is carried out.
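
    A hedged sketch of the section-based aggregation idea follows: the tunnel is split into homogeneous sections, each with its own accident frequency and consequence estimates, and the tunnel-level societal risk is the sum of the section contributions. The section data and the simple frequency-times-consequence form are illustrative, not the paper's model or the Singapore case-study values.

    ```python
    import numpy as np

    sections = [
        # (length_km, accidents_per_km_per_yr, P(escalation | accident), expected_fatalities)
        (0.8, 0.30, 0.020, 4.0),   # covered approach with dense merging traffic
        (1.5, 0.12, 0.015, 8.0),   # main bore, congestion-prone (incl. toxic-gas top event)
        (0.6, 0.20, 0.010, 3.0),   # exit ramp section
    ]

    section_risk = [l * f * p * n for l, f, p, n in sections]      # fatalities per year
    print("per-section risk:", np.round(section_risk, 5))
    print("integrated tunnel risk:", round(sum(section_risk), 5), "fatalities/year")
    ```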

  11. Dermal sensitization quantitative risk assessment (QRA) for fragrance ingredients.

    PubMed

    Api, Anne Marie; Basketter, David A; Cadby, Peter A; Cano, Marie-France; Ellis, Graham; Gerberick, G Frank; Griem, Peter; McNamee, Pauline M; Ryan, Cindy A; Safford, Robert

    2008-10-01

    Based on chemical, cellular, and molecular understanding of dermal sensitization, an exposure-based quantitative risk assessment (QRA) can be conducted to determine safe use levels of fragrance ingredients in different consumer product types. The key steps are: (1) determination of benchmarks (no expected sensitization induction level (NESIL)); (2) application of sensitization assessment factors (SAF); and (3) consumer exposure (CEL) calculation through product use. Using these parameters, an acceptable exposure level (AEL) can be calculated and compared with the CEL. The ratio of AEL to CEL must be favorable to support safe use of the potential skin sensitizer. This ratio must be calculated for the fragrance ingredient in each product type. Based on the Research Institute for Fragrance Materials, Inc. (RIFM) Expert Panel's recommendation, RIFM and the International Fragrance Association (IFRA) have adopted the dermal sensitization QRA approach described in this review for fragrance ingredients identified as potential dermal sensitizers. This now forms the fragrance industry's core strategy for primary prevention of dermal sensitization to these materials in consumer products. This methodology is used to determine global fragrance industry product management practices (IFRA Standards) for fragrance ingredients that are potential dermal sensitizers. This paper describes the principles of the recommended approach, provides detailed review of all the information used in the dermal sensitization QRA approach for fragrance ingredients and presents key conclusions for its use now and refinement in the future.
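
    The arithmetic of the QRA described above is compact: the acceptable exposure level (AEL) is the NESIL divided by the aggregate sensitization assessment factor, and safe use requires AEL/CEL to be at least 1. The sketch below shows only that comparison; the NESIL, SAF, and CEL values are hypothetical, not taken from any IFRA Standard.

    ```python
    def dermal_sensitization_qra(nesil_ug_cm2, saf, cel_ug_cm2):
        """Return the acceptable exposure level and the AEL/CEL ratio."""
        ael = nesil_ug_cm2 / saf
        return ael, ael / cel_ug_cm2

    # Hypothetical ingredient: NESIL 250 ug/cm2, combined SAF of 100, CEL 1.5 ug/cm2.
    ael, ratio = dermal_sensitization_qra(250.0, 100.0, 1.5)
    print(f"AEL = {ael:.2f} ug/cm2; AEL/CEL = {ratio:.2f} -> "
          f"{'supports safe use' if ratio >= 1 else 'does not support safe use'}")
    ```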

  12. The potential optical coherence tomography in tooth bleaching quantitative assessment

    NASA Astrophysics Data System (ADS)

    Ni, Y. R.; Guo, Z. Y.; Shu, S. Y.; Zeng, C. C.; Zhong, H. Q.; Chen, B. L.; Liu, Z. M.; Bao, Y.

    2011-12-01

    In this paper, we report the outcomes of a pilot study on using an OCT functional imaging method to evaluate and quantify color alteration in human teeth in vitro. Images of the dental tissues without and with treatment with 35% hydrogen peroxide were obtained by an OCT system with a 1310 nm central wavelength. One parameter for the quantification of optical properties from OCT measurements is introduced in our study: the attenuation coefficient (μ). During the bleaching process, a significant decrease in the attenuation coefficient was observed in dentine (p < 0.001) together with a significant increase in enamel (p < 0.001). From the experimental results, it is found that the attenuation coefficient could be useful for assessing color alteration of human tooth samples. OCT has the potential to become an effective tool for the assessment of tooth bleaching, and our experiment offers a new method to evaluate color change in the visible region by quantitative analysis of infrared-region information from OCT.
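
    A common way to extract an attenuation coefficient from an OCT depth profile, offered here only as a hedged stand-in for the paper's procedure, is a log-linear fit of the averaged A-scan to a single-scattering Beer-Lambert model I(z) = I0*exp(-2*mu*z); the synthetic A-scan and its noise level are invented for the check.

    ```python
    import numpy as np

    def attenuation_coefficient(depth_mm, intensity):
        """Fit ln(I) = ln(I0) - 2*mu*z and return mu in mm^-1."""
        slope, _ = np.polyfit(depth_mm, np.log(intensity), 1)
        return -slope / 2.0

    # Synthetic averaged A-scan: mu = 1.8 mm^-1 with multiplicative speckle-like noise.
    z = np.linspace(0.1, 1.0, 200)                        # mm
    rng = np.random.default_rng(7)
    a_scan = 1000.0 * np.exp(-2.0 * 1.8 * z) * rng.lognormal(0.0, 0.05, z.size)
    print(f"mu ~= {attenuation_coefficient(z, a_scan):.2f} mm^-1")
    ```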

  13. A quantitative model for assessing community dynamics of pleistocene mammals.

    PubMed

    Lyons, S Kathleen

    2005-06-01

    Previous studies have suggested that species responded individualistically to the climate change of the last glaciation, expanding and contracting their ranges independently. Consequently, many researchers have concluded that community composition is plastic over time. Here I quantitatively assess changes in community composition over broad timescales and assess the effect of range shifts on community composition. Data on Pleistocene mammal assemblages from the FAUNMAP database were divided into four time periods (preglacial, full glacial, postglacial, and modern). Simulation analyses were designed to determine whether the degree of change in community composition is consistent with independent range shifts, given the distribution of range shifts observed. Results indicate that many of the communities examined in the United States were more similar through time than expected if individual range shifts were completely independent. However, in each time transition examined, there were areas of nonanalogue communities. I conducted sensitivity analyses to explore how the results were affected by the assumptions of the null model. Conclusions about changes in mammalian distributions and community composition are robust with respect to the assumptions of the model. Thus, whether because of biotic interactions or because of common environmental requirements, community structure through time is more complex than previously thought.

  14. Assessing the mechanical properties of tissue-mimicking phantoms at different depths as an approach to measure biomechanical gradient of crystalline lens

    PubMed Central

    Wang, Shang; Aglyamov, Salavat; Karpiouk, Andrei; Li, Jiasong; Emelianov, Stanislav; Manns, Fabrice; Larin, Kirill V.

    2013-01-01

    We demonstrate the feasibility of using the dominant frequency of the sample surface response to a mechanical stimulation as an effective indicator for sensing the depthwise distribution of elastic properties in transparent layered phantom samples simulating the cortex and nucleus of the crystalline lens. Focused ultrasound waves are used to noninvasively interrogate the sample surface. A phase-sensitive optical coherence tomography system is utilized to capture the surface dynamics over time with nanometer scale sensitivity. Spectral analysis is performed on the sample surface response to ultrasound stimulation and the dominant frequency is calculated under particular loading parameters. Pilot experiments were conducted on homogeneous and layered tissue-mimicking phantoms. Results indicate that the mechanical layers located at different depths introduce different frequencies to the sample surface response, which are correlated with the depth-dependent elasticity of the sample. The duration and the frequency of the ultrasound excitation are also investigated for their influences on this spectrum-based detection. This noninvasive method may be potentially applied for localized and rapid assessment of the depth dependence of the mechanical properties of the crystalline lens. PMID:24409379
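
    The indicator used above is the dominant frequency of the measured surface response. As a minimal, hedged stand-in for the paper's spectral analysis, the snippet below takes the FFT peak of a mean-removed displacement trace; the damped 900 Hz test signal and the 20 kHz sampling rate are invented for the check.

    ```python
    import numpy as np

    def dominant_frequency(displacement, sample_rate_hz):
        """Frequency (Hz) of the largest non-DC spectral peak of the trace."""
        x = displacement - np.mean(displacement)
        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(x.size, d=1.0 / sample_rate_hz)
        return freqs[np.argmax(spectrum[1:]) + 1]          # skip the DC bin

    fs = 20_000.0                                           # Hz
    t = np.arange(0.0, 0.05, 1.0 / fs)
    trace = 50e-9 * np.exp(-t / 0.01) * np.sin(2.0 * np.pi * 900.0 * t)   # m
    print(f"{dominant_frequency(trace, fs):.0f} Hz")
    ```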

  15. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  16. Cerebral Microbleeds: Burden Assessment by Using Quantitative Susceptibility Mapping

    PubMed Central

    Liu, Tian; Surapaneni, Krishna; Lou, Min; Cheng, Liuquan; Spincemaille, Pascal

    2012-01-01

    Purpose: To assess quantitative susceptibility mapping (QSM) for reducing the inconsistency of standard magnetic resonance (MR) imaging sequences in measurements of cerebral microbleed burden. Materials and Methods: This retrospective study was HIPAA compliant and institutional review board approved. Ten patients (5.6%) were selected from among 178 consecutive patients suspected of having experienced a stroke who were imaged with a multiecho gradient-echo sequence at 3.0 T and who had cerebral microbleeds on T2*-weighted images. QSM was performed for various ranges of echo time by using both the magnitude and phase components in the morphology-enabled dipole inversion method. Cerebral microbleed size was measured by two neuroradiologists on QSM images, T2*-weighted images, susceptibility-weighted (SW) images, and R2* maps calculated by using different echo times. The sum of susceptibility over a region containing a cerebral microbleed was also estimated on QSM images as its total susceptibility. Measurement differences were assessed by using the Student t test and the F test; P < .05 was considered to indicate a statistically significant difference. Results: When echo time was increased from approximately 20 to 40 msec, the measured cerebral microbleed volume increased by mean factors of 1.49 ± 0.86 (standard deviation), 1.64 ± 0.84, 2.30 ± 1.20, and 2.30 ± 1.19 for QSM, R2*, T2*-weighted, and SW images, respectively (P < .01). However, the measured total susceptibility with QSM did not show significant change over echo time (P = .31), and the variation was significantly smaller than any of the volume increases (P < .01 for each). Conclusion: The total susceptibility of a cerebral microbleed measured by using QSM is a physical property that is independent of echo time. © RSNA, 2011 PMID:22056688
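
    The echo-time-independent quantity reported above is the total susceptibility, i.e. the QSM values summed over a region large enough to contain the apparent lesion. A minimal sketch of that bookkeeping follows; the toy lesion, ROI, and voxel volume are illustrative, not the study's acquisition parameters.

    ```python
    import numpy as np

    def total_susceptibility(qsm_ppb, roi_mask, voxel_volume_mm3):
        """Sum of QSM values over the ROI, scaled by voxel volume (ppb * mm^3)."""
        return float(np.sum(qsm_ppb[roi_mask]) * voxel_volume_mm3)

    # Toy example: a 3-voxel lesion of 400 ppb in an otherwise zero susceptibility map.
    qsm = np.zeros((8, 8, 8))
    qsm[3:5, 4, 4] = 400.0
    qsm[4, 5, 4] = 400.0
    roi = np.ones_like(qsm, dtype=bool)                    # region containing the lesion
    print(total_susceptibility(qsm, roi, voxel_volume_mm3=0.5))   # 600.0 ppb*mm^3
    ```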

  17. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    Molecular and activity‐based cues acting together are thought to guide retinal axons to their terminal sites in vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity‐based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2‐EphA3 ki/ki, Isl2‐EphA3 ki/+, ephrin‐A2,A3,A5 triple knock‐out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2‐EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin‐A2,A3,A5 TKO phenotype, suggesting either an incomplete knock‐out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  18. Quantitative assessment of neutrophil phagocytosis using flow cytometry.

    PubMed

    Nordenfelt, Pontus

    2014-01-01

    Neutrophils have an incredible ability to find and eradicate intruders such as bacteria and fungi. They do this largely through the process of phagocytosis, where the target is internalized into a phagosome, and eventually destroyed by the hostile phagosomal environment. It is important to study phagocytosis in order to understand how neutrophils interact with various pathogens and how they respond to different stimuli. Here, I describe a method to study neutrophil phagocytosis of bacteria using flow cytometry. The bacteria are fluorescently labeled before being introduced to neutrophils. After phagocytosis, both any remaining extracellular bacteria and neutrophils are labeled using one-step staining before three-color analysis. To assess phagocytosis, first the average time it takes for the neutrophils to internalize all bound bacteria is determined. Experiments are then performed using that time point while varying the bacteria-to-neutrophil ratio for full control of the analysis. Due to the ease with which multiple samples can be analyzed, and the quantitative nature of flow cytometry, this approach is both reproducible and sensitive.

  19. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate.

  20. Quantitative assessment of healthy and reconstructed cleft lip using ultrasonography

    PubMed Central

    Devadiga, Sumana; Desai, Anil Kumar; Joshi, Shamsunder; Gopalakrishnan, K.

    2016-01-01

    Purpose: This study was conducted to investigate the feasibility of echographic imaging of tissue thickness in healthy and reconstructed cleft lips. Design: Prospective study. Materials and Methods: The study was conducted in the SDM Craniofacial Unit, Dharwad, and was approved by the Local Institutional Review Board. A total of 30 patients, aged 4 to 25 years, were included; 15 with postoperative unilateral cleft lip constituted the test group, and the remaining 15, with no cleft deformities and no gross facial asymmetry, constituted the control group. The thicknesses of the mucosa, submucosa, and muscle and the full thickness of the upper lip were measured on transverse ultrasonographic images at the midpoint of the philtrum, the right and left philtral ridges, and the vermillion border, at 1-, 3-, and 6-month intervals. Results: There was an increase in muscle thickness at the vermillion border (mean = 6.9 mm) and philtral ridge (5.9 mm). Equal muscle thickness was found between the normal and test groups at the 6-month follow-up in a relaxed position, which was statistically significant (P = 0.0404). Conclusion: Quantitative assessment of the thickness and echo levels of various lip tissues can be performed with proper echographic calibration. This study demonstrated the diagnostic potential of this method for noninvasive evaluation of cleft lip reconstructions. PMID:27134448

  1. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate. PMID:23994086

  2. Quantitative assessment of the effectiveness of a rockfall warning system

    NASA Astrophysics Data System (ADS)

    Bründl, Michael; Sättele, Martina; Krautblatter, Michael; Straub, Daniel

    2016-04-01

    Rockslides and rockfalls can pose high risk to human settlements and traffic infrastructure. In addition to structural mitigation measures like rockfall nets, warning systems are increasingly installed to reduce rockfall risks. Whereas a structured evaluation method exists for structural mitigation measures that reduce the spatial extent of events, few approaches are known for assessing the effectiveness of warning systems. Especially for higher-magnitude rockfalls, structural mitigation measures are not effective, and reliable early warning systems will be essential in the future. In response, we developed a classification and a framework to assess the reliability and effectiveness of early warning systems (Sättele et al., 2015a; 2016). Here, we demonstrate an application for the rockfall warning system installed in Preonzo prior to a major rockfall in May 2012 (Sättele et al., 2015b). We show that it is necessary to design such a warning system as a fail-safe construction, which has to incorporate components with low failure probabilities, high redundancy, low warning thresholds, and additional control systems. With a hypothetical probabilistic analysis, we investigate the effect of the risk attitude of decision makers and of the number of sensors on the probability of detecting an event and of initiating a timely evacuation, as well as on the related intervention cost. We conclude that it is possible to quantitatively assess the effectiveness of warning systems, which helps to optimize mitigation strategies against rockfall events. References: Sättele, M., Bründl, M., and Straub, D.: Reliability and effectiveness of warning systems for natural hazards: concept and application to debris flow warning, Rel. Eng. Syst. Safety, 142, 192-202, 2015a. Sättele, M., Krautblatter, M., Bründl, M., and Straub, D.: Forecasting rock slope failure: How reliable and effective are warning systems?, Landslides, 605, 1-14, 2015b. Sättele, M., Bründl, M., and
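
    A hedged sketch of the kind of probabilistic trade-off mentioned above follows: with n redundant sensors, each detecting an event with probability p, the system detection probability is 1 - (1 - p)^n, which can be weighed against the expected cost of false alarms. All numbers are invented for illustration and are not the Preonzo figures.

    ```python
    def warning_system_effectiveness(n_sensors, p_detect, false_alarms_per_sensor_yr,
                                     cost_per_evacuation, value_of_risk_reduction):
        """Return system detection probability and a crude net expected benefit per year."""
        p_system = 1.0 - (1.0 - p_detect) ** n_sensors
        expected_cost = n_sensors * false_alarms_per_sensor_yr * cost_per_evacuation
        return p_system, p_system * value_of_risk_reduction - expected_cost

    for n in (1, 2, 4):
        p, net = warning_system_effectiveness(n, 0.8, 0.1, 20_000.0, 500_000.0)
        print(f"{n} sensor(s): P(detect) = {p:.3f}, net benefit = {net:,.0f} per year")
    ```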

  3. Is there a place for quantitative risk assessment?

    PubMed Central

    Hall, Eric J

    2013-01-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk–benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  4. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  6. Quantitative ultrasound (QUS) assessment of tissue properties for Achilles tendons

    NASA Astrophysics Data System (ADS)

    Du, Yi-Chun; Chen, Yung-Fu; Chen, Pei-Jarn; Lin, Yu-Ching; Chen, Tainsong; Lin, Chii-Jeng

    2007-09-01

    Quantitative ultrasound (QUS) techniques have recently been widely applied for the characterization of tissues. For example, they can be used for the quantification of Achilles tendon properties based on the broadband ultrasound attenuation (BUA) and the speed of sound (SOS) when the ultrasound wave passes through the tissues. The aim of this study was to develop an integrated system to investigate the properties of Achilles tendons using QUS images from UBIS 5000 (DMS, Montpellier, France) and B-mode ultrasound images from HDI 5000 (ATL, Ultramark, USA). Subjects in a young group (32 females and 17 males; mean age: 23.7 ± 2.0 years) and a middle-aged group (8 females and 8 males; mean age: 47.3 ± 8.5 years) were recruited and tested for this study. Only subjects who did not exercise regularly and had no record of tendon injury were studied. The results show that the BUA is significantly higher for the young group (45.2 ± 1.6 dB MHz^-1) than for the middle-aged group (40.5 ± 1.9 dB MHz^-1), while the SOS is significantly lower for the young (1601.9 ± 11.2 m s^-1) compared to the middle-aged (1624.1 ± 8.7 m s^-1). On the other hand, the thicknesses of the Achilles tendons for the two groups (young: 4.31 ± 0.23 mm; middle-aged: 4.24 ± 0.23 mm) are very similar. For one patient who had Achilles tendon lengthening (ATL) surgery, the thickness of the Achilles tendon increased from 4 mm to 4.33 mm after the surgery. In addition, the BUA increased by about 7.2% while the SOS decreased by about 0.6%. In conclusion, noninvasive ultrasonic assessment of Achilles tendons is useful for assisting clinical diagnosis and for the evaluation of a therapeutic regimen.
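    As a rough check on the group comparison reported above, the sketch below applies Welch's t-test to the quoted BUA summary statistics. It assumes the ± values are standard deviations (the abstract does not say whether they are SDs or standard errors), so it is an illustration rather than a reproduction of the authors' analysis.

```python
from scipy.stats import ttest_ind_from_stats

# Summary statistics quoted in the abstract; group sizes are 49 (young) and 16 (middle-aged).
t, p = ttest_ind_from_stats(mean1=45.2, std1=1.6, nobs1=49,
                            mean2=40.5, std2=1.9, nobs2=16,
                            equal_var=False)   # Welch's correction for unequal variances
print(f"BUA, young vs middle-aged: t = {t:.2f}, p = {p:.1e}")
```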

  7. Quantitative risk assessment of Cryptosporidium species infection in dairy calves.

    PubMed

    Nydam, D V; Mohammed, H O

    2005-11-01

    Cryptosporidium parvum is a zoonotic protozoan that infects many different mammals including cattle and humans. Cryptosporidiosis has become a concern for dairy producers because of the direct losses due to calves not performing well and the potential for environmental contamination with C. parvum. Identifying modifiable control points in the dynamics of infection in dairy herds will help identify management strategies that mitigate its risk. The quantitative risk assessment approach provides estimates of the risk associated with these factors so that cost-effective strategies can be implemented. Using published data from epidemiologic studies and a stochastic approach, we modeled the risk that C. parvum presents to dairy calves in 2 geographic areas: 1) the New York City Watershed (NYCW) in southeastern New York, and 2) the entire United States. The approach focused on 2 possible areas of exposure--the rearing environment and the maternity environment. In addition, we evaluated the contribution of many risk factors (e.g., age, housing, flies) to the end-state (i.e., total) risk to identify areas of intervention to decrease the risk to dairy calves. Expected risks from C. parvum in US dairy herds in rearing and maternity environments were 41.7 and 33.9%, respectively. In the NYCW, the expected risks from C. parvum in the rearing and maternity environments were 0.36 and 0.33%, respectively. In the US scenarios, the immediate environment contributed most of the risk to calves, whereas in the NYCW scenario, it was new calf infection. Therefore, within the NYCW, risk management activities may be focused on preventing new calf infections, whereas in the general US population, cleaning of calf housing would be a good choice for resource allocation. Despite the many assumptions inherent in modeling techniques, their usefulness for quantifying the likelihood of risk and identifying risk management areas is illustrated.
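    To show the stochastic, pathway-based structure such a risk model takes, here is a minimal Monte Carlo sketch; the pathways, Beta distributions, and parameter values are invented placeholders rather than the inputs used by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000                                  # Monte Carlo iterations

# Per-pathway probabilities that a calf becomes infected (Beta priors are placeholders).
p_env   = rng.beta(2, 8, n)                  # rearing-environment pathway
p_dam   = rng.beta(1, 20, n)                 # maternity-pen pathway
p_flies = rng.beta(1, 50, n)                 # mechanical vectors (flies)

# Total risk: infection through at least one pathway.
risk = 1.0 - (1.0 - p_env) * (1.0 - p_dam) * (1.0 - p_flies)

print(f"expected risk: {risk.mean():.3f}")
print(f"90% interval : {np.percentile(risk, [5, 95]).round(3)}")
```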

  8. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  9. Qualitative and quantitative procedures for health risk assessment.

    PubMed

    Lohman, P H

    1999-07-16

    Numerous reactive mutagenic electrophiles are present in the environment or are formed in the human body through metabolizing processes. Those electrophiles can directly react with DNA and are considered to be ultimate carcinogens. In the past decades more than 200 in vitro and in vivo genotoxic tests have been described to identify, monitor and characterize the exposure of humans to such agents. When the responses of such genotoxic tests are quantified by a weight-of-evidence analysis, it is found that the intrinsic mutagenic potency of these electrophiles does not differ much for the majority of the agents studied. Considering the fact that under normal environmental circumstances humans are exposed to low concentrations of about a million electrophiles, the relation between exposure to such agents and adverse health effects (e.g., cancer) will become a 'Pandora's box'. For quantitative risk assessment it will be necessary not only to detect whether the agent is genotoxic, but also to understand the mechanism of interaction of the agent with the DNA in target cells. Examples are given for a limited group of important environmental and carcinogenic agents for which such an approach is feasible. The groups identified are agents that form cross-links with DNA or are mono-alkylating agents that react with base moieties in the DNA strands. Quantitative hazard ranking of the mutagenic potency of these groups of chemicals can be performed, and there is ample evidence that such a ranking corresponds with the individual carcinogenic potency of those agents in rodents. Still, in practice, with the exception of certain occupational or accidental exposure situations, these approaches have not been successful in preventing cancer death in the human population. However, this is not only due to the described 'Pandora's box' situation. At least three other factors are described. Firstly, in the industrial world the medical treatment of cancer in patients

  10. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton (1) and Carey N. Pope (2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  11. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  12. Quantitative assessment of hemadsorption by myxoviruses: virus hemadsorption assay.

    PubMed

    Hahon, N; Booth, J A; Eckert, H L

    1973-04-01

    The standardization and quantitative evaluation of an assay for myxoviruses, based on the enumeration of individual infected clone 1-5C-4 cells manifesting hemadsorption within 24 h of infection, are described. Hemadsorption was detectable earlier than immunofluorescence in infected cells or hemagglutinins in culture medium. The relationship between virus concentration and cells exhibiting hemadsorption was linear. The assay was highly precise, sensitive, and reproducible. PMID:4349248

  13. Quantitative Assessment of Countermeasure Efficacy for Long-Term Space Missions

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.

    2000-01-01

    This slide presentation reviews the development of quantitative assessments of the effectiveness of countermeasures (CM) for the effects of space travel on humans for long term space missions. An example of bone mineral density (BMD) is examined to show specific quantitative measures for failure and success.

  14. In Vivo Quantitative Ultrasound Imaging and Scatter Assessments.

    NASA Astrophysics Data System (ADS)

    Lu, Zheng Feng

    There is evidence that "instrument independent" measurements of ultrasonic scattering properties would provide useful diagnostic information that is not available with conventional ultrasound imaging. This dissertation is a continuing effort to test the above hypothesis and to incorporate quantitative ultrasound methods into clinical examinations for early detection of diffuse liver disease. A well-established reference phantom method was employed to construct quantitative ultrasound images of tissue in vivo. The method was verified by extensive phantom tests. A new method was developed to measure the effective attenuation coefficient of the body wall. The method relates the slope of the difference between the echo signal power spectrum from a uniform region distal to the body wall and the echo signal power spectrum from a reference phantom to the body wall attenuation. The accuracy obtained from phantom tests suggests further studies with animal experiments. Clinically, thirty-five healthy subjects and sixteen patients with diffuse liver disease were studied by these quantitative ultrasound methods. The average attenuation coefficient in normals agreed with previous investigators' results; in vivo backscatter coefficients agreed with the results from normals measured by O'Donnell. Strong discriminating power (p < 0.001) was found for both attenuation and backscatter coefficients between fatty livers and normals; a significant difference (p < 0.01) was observed in the backscatter coefficient but not in the attenuation coefficient between cirrhotic livers and normals. An in vivo animal model of steroid hepatopathy was used to investigate the system sensitivity in detecting early changes in canine liver resulting from corticosteroid administration. The average attenuation coefficient slope increased from 0.7 dB/cm/MHz in controls to 0.82 dB/cm/MHz (at 6 MHz) in treated animals on day 14 into the treatment, and the backscatter coefficient was 26 × 10^-4 cm^-1 sr^-1
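    A heavily simplified sketch of the spectral-difference idea described above: fit a line to the difference between the tissue and reference-phantom log power spectra and convert its slope into an effective attenuation coefficient. The synthetic spectra, the assumed wall thickness, and the simple round-trip correction are illustrative assumptions, not the dissertation's exact procedure.

```python
import numpy as np

freq_mhz = np.linspace(2.0, 6.0, 50)                      # analysis band (MHz)
true_att = 0.7                                            # dB/cm/MHz (synthetic truth)
wall_cm  = 2.5                                            # assumed body-wall thickness (cm)

# Synthetic log spectra (dB): the tissue path loses 2*att*wall*f dB relative to the phantom.
ref_db    = -0.5 * freq_mhz                               # arbitrary reference spectrum shape
tissue_db = (ref_db - 2.0 * true_att * wall_cm * freq_mhz
             + np.random.default_rng(1).normal(0, 0.3, freq_mhz.size))

slope, _ = np.polyfit(freq_mhz, tissue_db - ref_db, 1)    # dB per MHz
att_est  = -slope / (2.0 * wall_cm)                       # back to dB/cm/MHz (round trip)
print(f"estimated effective attenuation: {att_est:.2f} dB/cm/MHz")
```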

  15. Quantitative phylogenetic assessment of microbial communities in diverse environments

    SciTech Connect

    von Mering, C.; Hugenholtz, P.; Raes, J.; Tringe, S.G.; Doerks,T.; Jensen, L.J.; Ward, N.; Bork, P.

    2007-01-01

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. Here, we use a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative and accurate picture of community composition than traditional rRNA-based approaches using polymerase chain reaction (PCR). By mapping marker genes from four diverse environmental data sets onto a reference species phylogeny, we show that certain communities evolve faster than others, determine preferred habitats for entire microbial clades, and provide evidence that such habitat preferences are often remarkably stable over time.

  16. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  17. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  18. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  19. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  20. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  1. New approach to quantitative angiographic assessment after stent implantation.

    PubMed

    Reimers, B; Di Mario, C; Di Francesco, L; Moussa, I; Blengino, S; Martini, G; Reiber, J H; Colombo, A

    1997-04-01

    New-generation quantitative angiographic systems apply an interpolation technique to calculate the reference diameter at the site of the stenosis by integrating measurements of the segments proximal and distal to the stenosis. After stent implantation these measurements can be misleading, as the treated segment, which is frequently larger than the adjacent non-stented segments, is included in the measurements. The consequence is an overestimation of the reference diameter and the residual diameter stenosis. The present study was performed to compare this conventional technique of measurement with a new method which excludes the stented segment from the calculation of the reference diameter. Fifty-two lesions treated with poorly radiopaque stents (56% Palmaz-Schatz, 28% NIR, 10% Gianturco-Roubin, 6% Wallstent) expanded at high pressure (≥ 16 atm) were analyzed according to the conventional and stent-excluded methods. After stent implantation the reference diameter was 3.39 +/- 0.48 mm with conventional measurements and 3.02 +/- 0.45 mm with the stent-excluded method (P < 0.05). The corresponding % diameter stenosis was 13 +/- 9 for the conventional technique and 1 +/- 13 for the stent-excluded analysis (P < 0.05). The new approach to quantitative coronary analysis after stenting provides higher accuracy in reference diameter calculations and allows a more appropriate matching of stented segments with adjacent normal segments.
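    To make the difference between the two reference-diameter conventions concrete, here is a small sketch with made-up segment diameters; the simple averaging below stands in for the full QCA interpolation, which fits a diameter function along the vessel rather than taking a mean.

```python
def percent_stenosis(mld_mm, ref_mm):
    """Residual diameter stenosis (%) from the minimal lumen diameter and a reference diameter."""
    return 100.0 * (1.0 - mld_mm / ref_mm)

# Hypothetical mean segment diameters (mm): the stented segment is slightly over-expanded.
proximal, stented, distal = 3.0, 3.5, 3.0
mld = 2.95                                               # minimal lumen diameter inside the stent (mm)

ref_conventional = (proximal + stented + distal) / 3.0   # reference includes the stented segment
ref_excluded     = (proximal + distal) / 2.0             # stented segment left out

print(f"conventional  : ref = {ref_conventional:.2f} mm, %DS = {percent_stenosis(mld, ref_conventional):.1f}%")
print(f"stent excluded: ref = {ref_excluded:.2f} mm, %DS = {percent_stenosis(mld, ref_excluded):.1f}%")
```

    Including the larger stented segment inflates the reference diameter and therefore the apparent residual stenosis, which is the effect the abstract quantifies.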

  2. Quantitative Assessment of Neurite Outgrowth in PC12 Cells

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity. In order to identify potential developmental neurotoxicants, assessment of critical neurodevelopmental processes such as neuronal differenti...

  3. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    PubMed Central

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, distributed, and consumed. PMID:9366601

  4. Quantitative criteria for assessment of gamma-ray imager performance

    NASA Astrophysics Data System (ADS)

    Gottesman, Steve; Keller, Kristi; Malik, Hans

    2015-08-01

    In recent years gamma ray imagers such as the GammaCam™ and Polaris have demonstrated good imaging performance in the field. Imager performance is often summarized as "resolution", either angular or spatial at some distance from the imager; however, the definition of resolution is not always related to the ability to image an object. It is difficult to quantitatively compare imagers without a common definition of image quality. This paper examines three categories of definition: point source, line source, and area source. It discusses the details of those definitions and which ones are more relevant for different situations. Metrics such as Full Width at Half Maximum (FWHM), variations on the Rayleigh criterion, and some analogous to the National Imagery Interpretability Rating Scale (NIIRS) are discussed. Performance against these metrics is evaluated for a high-resolution coded aperture imager modeled using Monte Carlo N-Particle (MCNP), and for a medium-resolution imager measured in the lab.
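    As one concrete example of a point-source metric, the snippet below estimates the FWHM of a sampled one-dimensional point-spread profile by interpolating the half-maximum crossings; the Gaussian profile is synthetic and stands in for a measured imager response.

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile y(x), by linear interpolation."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    xl = np.interp(half, [y[i - 1], y[i]], [x[i - 1], x[i]])   # left half-max crossing
    xr = np.interp(half, [y[j + 1], y[j]], [x[j + 1], x[j]])   # right half-max crossing
    return xr - xl

x = np.linspace(-10.0, 10.0, 401)            # angle (degrees)
sigma = 1.5
psf = np.exp(-x**2 / (2 * sigma**2))         # synthetic angular point-spread profile
print(f"FWHM = {fwhm(x, psf):.2f} deg (analytic: {2.3548 * sigma:.2f} deg)")
```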

  5. A quantitative assessment of Arctic shipping in 2010–2014

    NASA Astrophysics Data System (ADS)

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-08-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far.

  6. Quantitative Assessment of Faculty Workloads. ASHE 1984 Annual Meeting Paper.

    ERIC Educational Resources Information Center

    Shull, H. Eugene

    A system of measuring faculty workloads consistently and objectively has been devised and successfully applied at Pennsylvania State University's Behrend College. Its value is greatest in assessing and balancing the diverse faculty assignments within interdisciplinary and heterogeneous administrative units. It permits a legitimate comparison of…

  7. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  8. Quantitative Assessments of Sensitivity to Reinforcement Contingencies in Mental Retardation.

    ERIC Educational Resources Information Center

    Dube, William V.; McIlvane, William J.

    2002-01-01

    Sensitivity to reinforcement contingencies was examined in six individuals with mental retardation using a concurrent operants procedure in the context of a computer game. Results included individual differences in sensitivity and differential sensitivity to rate and magnitude variation. Results suggest that comprehensive assessments of potential…

  9. Developing a Quantitative Tool for Sustainability Assessment of HEIs

    ERIC Educational Resources Information Center

    Waheed, Bushra; Khan, Faisal I.; Veitch, Brian

    2011-01-01

    Purpose: Implementation of a sustainability paradigm demands new choices and innovative ways of thinking. The main objective of this paper is to provide a meaningful sustainability assessment tool for making informed decisions, which is applied to higher education institutions (HEIs). Design/methodology/approach: The objective is achieved by…

  10. A quantitative assessment of Arctic shipping in 2010-2014.

    PubMed

    Eguíluz, Victor M; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011-2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  11. Quantitative assessment of rabbit alveolar macrophage function by chemiluminescence

    SciTech Connect

    Brennan, P.C.; Kirchner, F.R.

    1985-08-01

    Rabbit alveolar macrophages (RAM) were cultured for 24 hr with concentrations ranging from 3 to 12 µg/ml of vanadium oxide (V₂O₅), a known cytotoxic agent, or with high-molecular-weight organic by-products from coal gasification processes. After culture the cells were harvested and tested for functional capacity using three types of indicators: (1) luminol-amplified chemiluminescence (CL), which quantitatively detects photon emission due to respiratory burst activity measured in a newly designed instrument with standardized reagents; (2) the reduction of nitro blue tetrazolium-saturated polyacrylamide beads, a semiquantitative measure of respiratory burst activity; and (3) phagocytic efficiency, defined as the percentage of cells incorporating immunoglobulin-coated polyacrylamide beads. Chemiluminescence declined linearly with increasing concentrations of V₂O₅ over the dose range tested. Dye reduction and phagocytic efficiency similarly decreased with increasing V₂O₅ concentration, but were less sensitive indicators of functional impairment than CL as measured by the amount required to reduce the response to 50% of untreated cells. The effect of coal gasification condensates on RAM function varied, but in general these tests also indicated that the CL response was the most sensitive indicator.

  12. A quantitative assessment of Arctic shipping in 2010–2014

    PubMed Central

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  13. Quantitative Assessment of Parametric Uncertainty in Northern Hemisphere PAH Concentrations.

    PubMed

    Thackray, Colin P; Friedman, Carey L; Zhang, Yanxu; Selin, Noelle E

    2015-08-01

    We quantitatively examine the relative importance of uncertainty in emissions and physicochemical properties (including reaction rate constants) to Northern Hemisphere (NH) and Arctic polycyclic aromatic hydrocarbon (PAH) concentrations, using a computationally efficient numerical uncertainty technique applied to the global-scale chemical transport model GEOS-Chem. Using polynomial chaos (PC) methods, we propagate uncertainties in physicochemical properties and emissions for the PAHs benzo[a]pyrene, pyrene and phenanthrene to simulated spatially resolved concentration uncertainties. We find that the leading contributors to parametric uncertainty in simulated concentrations are the black carbon-air partition coefficient and oxidation rate constant for benzo[a]pyrene, and the oxidation rate constants for phenanthrene and pyrene. NH geometric average concentrations are more sensitive to uncertainty in the atmospheric lifetime than to emissions rate. We use the PC expansions and measurement data to constrain parameter uncertainty distributions to observations. This narrows a priori parameter uncertainty distributions for phenanthrene and pyrene, and leads to higher values for OH oxidation rate constants and lower values for European PHE emission rates.
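    The sketch below illustrates the polynomial chaos idea on a one-dimensional toy problem: an uncertain rate parameter (treated as Gaussian) is propagated through a stand-in exponential-decay "model" rather than GEOS-Chem, and the PC coefficients give the output mean and standard deviation. The model, distributions, and all numbers are invented for illustration.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

def pc_coefficients(model, mu, sigma, order=4, quad_pts=20):
    """Coefficients of model(mu + sigma*Z) in probabilists' Hermite polynomials, Z ~ N(0, 1)."""
    z, w = He.hermegauss(quad_pts)          # Gauss-Hermite_e nodes/weights (weight exp(-z^2/2))
    w = w / np.sqrt(2.0 * np.pi)            # rescale so the weights integrate the N(0,1) density
    fz = model(mu + sigma * z)
    coeffs = []
    for k in range(order + 1):
        basis_k = He.hermeval(z, np.eye(order + 1)[k])   # He_k evaluated at the quadrature nodes
        coeffs.append(np.sum(w * fz * basis_k) / math.factorial(k))
    return np.array(coeffs)

# Stand-in "chemistry": a pseudo concentration that decays exponentially with the uncertain rate.
model = lambda k_ox: 10.0 * np.exp(-2.0 * k_ox)

c = pc_coefficients(model, mu=0.5, sigma=0.15)
mean = c[0]
var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, len(c)))   # orthogonality: E[He_k^2] = k!
print(f"PC mean = {mean:.3f}, PC std = {math.sqrt(var):.3f}")
```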

  14. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
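    A minimal sketch of the HU-windowing idea: classify voxels into tissue classes by Hounsfield-unit ranges and report composition fractions. The windows and synthetic voxel values below are assumptions for illustration, not the thresholds used in the studies reviewed.

```python
import numpy as np

# Hypothetical tissue windows in Hounsfield units (illustrative only).
HU_WINDOWS = {
    "fat":                       (-200, -10),
    "loose_connective_atrophic": (-9,    40),
    "normal_muscle":             (41,   100),
}

def composition(hu_values):
    """Fraction of all voxels whose HU value falls in each tissue window."""
    hu = np.asarray(hu_values)
    return {name: float(np.mean((hu >= lo) & (hu <= hi)))
            for name, (lo, hi) in HU_WINDOWS.items()}

# Synthetic voxel sample standing in for one muscle cross-section.
rng = np.random.default_rng(3)
voxels = np.concatenate([rng.normal(60, 15, 7000),    # healthy muscle
                         rng.normal(20, 10, 2000),    # loose connective / atrophic muscle
                         rng.normal(-80, 30, 1000)])  # intramuscular fat
print({k: round(v, 3) for k, v in composition(voxels).items()})
```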

  15. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    PubMed

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models, conditioned on pose, for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level. PMID:15376934

  16. A quantitative assessment of results with the Angelchik prosthesis.

    PubMed Central

    Wyllie, J. H.; Edwards, D. A.

    1985-01-01

    The Angelchik antireflux prosthesis was assessed in 15 unpromising patients, 12 of whom had peptic strictures of the oesophagus. Radiological techniques were used to show the effect of the device on gastro-oesophageal reflux, and on the bore and length of strictures. Twelve months later (range 6-24) most patients were well satisfied with the operation, and all considered it had been worthwhile; there was radiological evidence of reduction in reflux and remission of strictures. The device never surrounded the oesophageal sphincter; in all but 1 case it encircled a tube of stomach. PMID:4037629

  17. Aliasing as noise - A quantitative and qualitative assessment

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Hazra, Rajeeb

    1993-01-01

    We present a model-based argument that, for the purposes of system design and digital image processing, aliasing should be treated as signal-dependent additive noise. By using a computational simulation based on this model, we process (high resolution images of) natural scenes in a way which enables the 'aliased component' of the reconstructed image to be isolated unambiguously. We demonstrate that our model-based argument leads naturally to system design metrics which quantify the extent of aliasing. And, by illustrating several aliased component images, we provide a qualitative assessment of aliasing as noise.

  18. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  19. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  20. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  1. New Trends in Quantitative Assessment of the Corneal Barrier Function

    PubMed Central

    Guimerà, Anton; Illa, Xavi; Traver, Estefania; Herrero, Carmen; Maldonado, Miguel J.; Villa, Rosa

    2014-01-01

    The cornea is a very particular tissue due to its transparency and its barrier function, as it has to resist the daily insults of the external environment. In addition, maintenance of this barrier function is of crucial importance to ensure correct corneal homeostasis. Here, the corneal epithelial permeability has been assessed in vivo by means of non-invasive tetrapolar impedance measurements, taking advantage of the strong influence of ion fluxes on the passive electrical properties of living tissues. This has been possible by using a flexible sensor based on SU-8 photoresist. In this work, a further analysis focused on the validation of the presented sensor is performed by monitoring the healing process of corneas that were previously wounded. The obtained impedance measurements have been compared with the damaged area observed in corneal fluorescein staining images. The successful results confirm the feasibility of this novel method, as it represents a more sensitive in vivo and non-invasive test to assess small alterations of the epithelial permeability. It could therefore be used as an excellent complement to fluorescein staining image evaluation. PMID:24841249

  2. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along front, back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness that will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).

  3. Quantitative cancer risk assessment for dioxins using an occupational cohort.

    PubMed Central

    Becher, H; Steindorf, K; Flesch-Janys, D

    1998-01-01

    We consider a cohort of 1189 male German factory workers (production period 1952-1984) who produced phenoxy herbicides and were exposed to dioxins. Follow-up until the end of 1992 yielded a significantly increased standardized mortality ratio (SMR) for total cancer (SMR 141; 95% confidence interval 117-168). 2,3,7,8-Tetrachlorodibenzo-p-dioxin (TCDD) concentrations up to 2252 ng/kg body fat were measured in 275 cohort members. Other higher chlorinated dioxins and furans also occurred in high concentrations. For quantitative analysis, the integrated TCDD concentration over time was used as an exposure variable, which was calculated using results from half-life estimation for TCDD and workplace history data. The other congeners were expressed as toxic equivalency (TEQ) and compared to TCDD using international toxic equivalency factors. Poisson and Cox regressions were used to investigate dose-response relationships. Various covariables (e.g., exposure to beta-hexachlorocyclohexane, employment characteristics) were considered. In all analyses, TCDD and TEQ exposures were related to total cancer mortality. The power model yielded a relative risk (RR) function RR(x) = (1 + 0.17x)^0.326 for TCDD (in microgram/kilogram blood fat × years)--only a slightly better fit than a linear RR function--and RR(x) = (1 + 0.023x)^0.795 for TEQ. Investigations on latency did not show strong effects. Different methods were applied to investigate the robustness of the results and yielded almost identical results. The results were used for unit risk estimation. Taking into account different sources of variation, an interval of 10^-3 to 10^-2 for the additional lifetime cancer risk under a daily intake of 1 pg TCDD/kg body weight/day was estimated from the dose-response models considered. Uncertainties regarding the dose-response function remain. These data did not indicate the existence of a threshold value; however, such a value cannot be excluded with any certainty. PMID:9599714
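    To make the shape of the fitted power models concrete, the snippet below simply evaluates them at a few exposure levels; the coefficients are copied from the abstract and no uncertainty in them is represented.

```python
# Fitted relative-risk functions quoted above (exposure x in ug/kg blood fat x years for TCDD).
def rr_tcdd(x): return (1.0 + 0.17 * x) ** 0.326
def rr_teq(x):  return (1.0 + 0.023 * x) ** 0.795

for x in (0.0, 1.0, 10.0, 100.0):
    print(f"exposure {x:6.1f}: RR_TCDD = {rr_tcdd(x):.2f}, RR_TEQ = {rr_teq(x):.2f}")
```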

  4. Quantitative Assessment of Workload and Stressors in Clinical Radiation Oncology

    SciTech Connect

    Mazur, Lukasz M.; Mosaly, Prithima R.; Jackson, Marianne; Chang, Sha X.; Burkhardt, Katharin Deschesne; Adams, Robert D.; Jones, Ellen L.; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B.

    2012-08-01

    Purpose: Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Methods and Materials: Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and the Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). Results: A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40 to 52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and frequency of reported radiotherapy incidents by the WHO was found (r = 0.87, P value=.045
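    For readers unfamiliar with how an overall NASA TLX score is formed, the sketch below combines six subscale ratings with pairwise-comparison weights in the standard weighted fashion; the ratings and weights are invented and are not data from this study.

```python
SUBSCALES = ["mental", "physical", "temporal", "performance", "effort", "frustration"]

def nasa_tlx(ratings, weights):
    """Weighted TLX score: sum(rating * weight) / 15, with weights from 15 pairwise comparisons."""
    assert sum(weights.values()) == 15, "pairwise-comparison weights must total 15"
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

# Hypothetical ratings (0-100) and pairwise weights (0-5) for one task.
ratings = {"mental": 70, "physical": 20, "temporal": 55, "performance": 40, "effort": 60, "frustration": 35}
weights = {"mental": 5, "physical": 0, "temporal": 3, "performance": 2, "effort": 4, "frustration": 1}
print(f"overall workload = {nasa_tlx(ratings, weights):.1f} (0-100 scale)")
```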

  5. [The concept of amnesia and quantitative assessment of amnesic disorders].

    PubMed

    Metzler, P; Rudolph, M; Voshage, J; Nickel, B

    1991-06-01

    This article first presents a short historical overview of the different viewpoints concerning psychiatric approaches to defining the concept of "amnesia" (Ribot, Korsakow, K. Schneider, Bleuler, Bonhoeffer et al.). A generally accepted result is the differentiation between retrograde and anterograde amnesia. Research work of the last two decades has focussed on the experimental investigation of anterograde amnesia, the so-called amnesic syndrome. In this context four main factors responsible for memory performance are distinguished: encoding, retrieval, forgetting and interference. One of the main results of neuropsychological research on amnesia is the discovery of a set of symptoms or features common to most if not all forms of amnesia. These features appear regardless of etiology and locus of lesion. This set of features is described in detail in the paper. On the basis of these amnesic features a clinical test was developed, the Berliner Amnesie Test (BAT). This standardized test can be used for the assessment of mild up to severe memory disorders.

  6. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict the fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates a mean value of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, the diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that the fire fatality risk of CNG buses is about 2.5 times that of diesel buses, with bus passengers being at over two orders of magnitude greater risk. The study estimates a mean fire risk frequency of 2.2 × 10^-5 fatalities per bus per year. The 5% and 95% uncertainty bounds are 9.1 × 10^-6 and 4.0 × 10^-5, respectively. The risk result was found to be affected most by failure rates of pressure relief valves, CNG cylinders, and fuel piping.

  7. Cardiac sarcoidosis mimicking right ventricular dysplasia.

    PubMed

    Shiraishi, Jun; Tatsumi, Tetsuya; Shimoo, Kazutoshi; Katsume, Asako; Mani, Hiroki; Kobara, Miyuki; Shirayama, Takeshi; Azuma, Akihiro; Nakagawa, Masao

    2003-02-01

    A 59-year-old woman with skin sarcoidosis was admitted to hospital for assessment of complete atrioventricular block. Cross-sectional echocardiography showed that the apical free wall of the right ventricle was thin and dyskinetic with dilation of the right ventricle. Thallium-201 myocardial imaging revealed a normal distribution. Both gallium-67 and technetium-99m pyrophosphate scintigraphy revealed no abnormal uptake in the myocardium. Right ventriculography showed chamber dilation and dyskinesis of the apical free wall, whereas left ventriculography showed normokinesis, mimicking right ventricular dysplasia. Cardiac sarcoidosis was diagnosed on examination of an endomyocardial biopsy specimen from the right ventricle. A permanent pacemaker was implanted to manage the complete atrioventricular block. After steroid treatment, electrocardiography showed first-degree atrioventricular block and echocardiography revealed an improvement in the right ventricular chamber dilation. Reports of cardiac sarcoidosis mimicking right ventricular dysplasia are extremely rare and as this case shows, right ventricular involvement may be one of its manifestations.

  8. Quantitative assessment of the retinal microvasculature using optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Chu, Zhongdi; Lin, Jason; Gao, Chen; Xin, Chen; Zhang, Qinqin; Chen, Chieh-Li; Roisman, Luis; Gregori, Giovanni; Rosenfeld, Philip J.; Wang, Ruikang K.

    2016-06-01

    Optical coherence tomography angiography (OCTA) is clinically useful for the qualitative assessment of the macular microvasculature. However, there is a need for comprehensive quantitative tools to help objectively analyze the OCT angiograms. Few studies have reported the use of a single quantitative index to describe vessel density in OCT angiograms. In this study, we introduce a five-index quantitative analysis of OCT angiograms in an attempt to detect and assess vascular abnormalities from multiple perspectives. The indices include vessel area density, vessel skeleton density, vessel diameter index, vessel perimeter index, and vessel complexity index. We show the usefulness of the proposed indices with five illustrative cases. Repeatability is tested on both a healthy case and a stable diseased case, giving interclass coefficients smaller than 0.031. The results demonstrate that our proposed quantitative analysis may be useful as a complement to conventional OCTA for the diagnosis of disease and monitoring of treatment.
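    Two of the five indices can be illustrated directly from a binary vessel mask, as in the sketch below; the toy mask is synthetic, and a real analysis would start from a segmented en-face OCTA angiogram. The remaining indices (diameter, perimeter, and complexity) build on the same mask and its skeleton.

```python
import numpy as np
from skimage.morphology import skeletonize

# Synthetic binary vessel mask standing in for a segmented en-face angiogram.
mask = np.zeros((64, 64), dtype=bool)
mask[10:14, :] = True          # a horizontal "vessel"
mask[:, 30:33] = True          # a vertical "vessel"

vessel_area_density     = mask.mean()               # vessel pixels / all pixels
vessel_skeleton_density = skeletonize(mask).mean()  # skeleton (centerline) pixels / all pixels

print(f"vessel area density    : {vessel_area_density:.4f}")
print(f"vessel skeleton density: {vessel_skeleton_density:.4f}")
```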

  9. Methodological issues in the quantitative assessment of quality of life.

    PubMed

    Panagiotakos, Demosthenes B; Yfantopoulos, John N

    2011-10-01

    The term quality of life can be identified in Aristotle's classical writings of 330 BC. In his Nicomachean Ethics he recognises the multiple relationships between happiness, well-being, "eudemonia" and quality of life. Historically the concept of quality of life has undergone various interpretations. It involves personal experience, perceptions and beliefs, and attitudes concerning philosophical, cultural, spiritual, psychological, political, and financial aspects of everyday living. Quality of life has been extensively used both as an outcome and as an explanatory factor in relation to human health, in various clinical trials, epidemiologic studies and health interview surveys. Because of the variations in the definition of quality of life, both in theory and in practice, there is also a wide range of procedures that are used to assess quality of life. In this paper several methodological issues regarding the tools used to evaluate quality of life are discussed. In summary, the use of components consisting of a large number of classes, the use of specific weights for each scale component, and the low-to-moderate inter-correlation between the components are evident from simulated and empirical studies.

  10. Validation of a quantitative phosphorus loss assessment tool.

    PubMed

    White, Michael J; Storm, Daniel E; Smolen, Michael D; Busteed, Philip R; Zhang, Hailin; Fox, Garey A

    2014-01-01

    Pasture Phosphorus Management Plus (PPM Plus) is a tool that allows nutrient management and conservation planners to evaluate phosphorus (P) loss from agricultural fields. This tool uses a modified version of the widely used Soil and Water Assessment Tool model with a vastly simplified interface. The development of PPM Plus has been fully described in previous publications; in this article we evaluate the accuracy of PPM Plus using 286 field-years of runoff, sediment, and P validation data from runoff studies at various locations in Oklahoma, Texas, Arkansas, and Georgia. Land uses include pasture, small grains, and row crops with rainfall ranging from 630 to 1390 mm yr^-1, with and without animal manure application. PPM Plus explained 68% of the variability in total P loss, 56% of runoff, and 73% of the variability of sediment yield. An empirical model developed from these data using soil test P, total applied P, slope, and precipitation only accounted for 15% of the variability in total P loss, which implies that a process-based model is required to account for the diversity present in these data. PPM Plus is an easy-to-use conservation planning tool for P loss prediction, which, with modification, could be applicable at the regional and national scales. PMID:25602555

  11. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  12. Validation of a quantitative phosphorus loss assessment tool.

    PubMed

    White, Michael J; Storm, Daniel E; Smolen, Michael D; Busteed, Philip R; Zhang, Hailin; Fox, Garey A

    2014-01-01

    Pasture Phosphorus Management Plus (PPM Plus) is a tool that allows nutrient management and conservation planners to evaluate phosphorus (P) loss from agricultural fields. This tool uses a modified version of the widely used Soil and Water Assessment Tool model with a vastly simplified interface. The development of PPM Plus has been fully described in previous publications; in this article we evaluate the accuracy of PPM Plus using 286 field-years of runoff, sediment, and P validation data from runoff studies at various locations in Oklahoma, Texas, Arkansas, and Georgia. Land uses include pasture, small grains, and row crops with rainfall ranging from 630 to 1390 mm yr^-1, with and without animal manure application. PPM Plus explained 68% of the variability in total P loss, 56% of runoff, and 73% of the variability of sediment yield. An empirical model developed from these data using soil test P, total applied P, slope, and precipitation only accounted for 15% of the variability in total P loss, which implies that a process-based model is required to account for the diversity present in these data. PPM Plus is an easy-to-use conservation planning tool for P loss prediction, which, with modification, could be applicable at the regional and national scales.

  13. A Quantitative Measure of Handwriting Dysfluency for Assessing Tardive Dyskinesia

    PubMed Central

    Caligiuri, Michael P.; Teulings, Hans-Leo; Dean, Charles E.; Lohr, James B.

    2015-01-01

    Tardive dyskinesia (TD) is a movement disorder commonly associated with chronic exposure to antidopaminergic medications, which may in some cases be disfiguring and socially disabling. The consensus from a growing body of research on the incidence and prevalence of TD in the modern era of antipsychotics indicates that this disorder has not disappeared and continues to challenge the effective management of psychotic symptoms in patients with schizophrenia. A fundamental component of an effective strategy for managing TD is its reliable and accurate assessment. In the present study, we examined the clinical utility of a brief handwriting dysfluency measure for quantifying TD. Digitized samples of handwritten circles and loops were obtained from 62 psychosis patients with or without TD and from 50 healthy subjects. Two measures of dysfluent pen movements were extracted from each vertical pen stroke, including normalized jerk and the number of acceleration peaks. TD patients exhibited significantly higher dysfluency scores than non-TD patients and controls. Severity of handwriting movement dysfluency was correlated with AIMS severity ratings for some tasks. The procedure yielded high degrees of test-retest reliability. These results suggest that measures of handwriting movement dysfluency may be particularly useful for objectively evaluating the efficacy of pharmacotherapeutic strategies for treating TD. PMID:25679121
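    The sketch below computes a per-stroke dysfluency score in the spirit of normalized jerk: the squared third derivative of the vertical pen position is integrated over the stroke and made dimensionless using the stroke's duration and path length. The exact normalization, the sampling rate, and the synthetic pen traces are assumptions, not the authors' published procedure.

```python
import numpy as np

def normalized_jerk(y, dt):
    """Dimensionless jerk of a 1-D vertical pen-stroke trajectory sampled at interval dt (s)."""
    v = np.gradient(y, dt)                  # velocity
    a = np.gradient(v, dt)                  # acceleration
    j = np.gradient(a, dt)                  # jerk
    duration = dt * (len(y) - 1)
    length = np.sum(np.abs(np.diff(y)))     # path length of the stroke
    jerk_integral = np.sum(j ** 2) * dt     # approximate integral of squared jerk
    return np.sqrt(0.5 * jerk_integral * duration ** 5 / length ** 2)

t = np.linspace(0.0, 0.25, 200)                               # a 250 ms up-stroke
dt = t[1] - t[0]
smooth = 10.0 * (1 - np.cos(np.pi * t / t[-1])) / 2           # smooth, single-peaked velocity profile
tremulous = smooth + 0.15 * np.sin(2 * np.pi * 40 * t)        # the same stroke with 40 Hz irregularity

print(f"smooth stroke   : NJ = {normalized_jerk(smooth, dt):.1f}")
print(f"dysfluent stroke: NJ = {normalized_jerk(tremulous, dt):.1f}")
```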

  14. Quantitative assessments of ecological impact/recovery in freshwater systems

    SciTech Connect

    Birge, W.J.; Keogh, D.P.; Zuiderveen, J.A.; Robison, W.A.

    1994-12-31

    Long-term studies were undertaken to evaluate the fidelity of multi-metric scoring systems and other means of quantifying the effects of chemical stresses on aquatic ecosystems. Integrity of macroinvertebrate communities was assessed using the Rapid Bioassessment Protocol III; Brillouin and Shannon-Weaver diversity indices; judgment based on traditional parameters of species richness, abundance and trophic group assemblages; and cluster analysis. In addition, chemical and toxicological monitoring data and periphyton studies were used in these evaluations. Surveys were performed at 8 or more stations selected for comparable conditions, including upstream reference and downstream recovery areas. In two streams, ecological impact varied from severe to extreme near point-source outfalls and decreased progressively with downstream distance. Station-to-station scoring with Protocol III and diversity indices correlated well with independent chemical and toxicological evaluations. However, in metal-stressed streams affected by slight to moderate impact, or which were in early recovery, Protocol III scoring and other family-level metrics did not consistently reflect losses in species richness and mean abundance of up to 32% and 75%, respectively. Observations on deformities (e.g., eyespots, gills) and selected subfamily- and species-level metrics, including ratios of metal-sensitive to metal-tolerant chironomids, gave greater accuracy in characterizing marginal to moderate perturbations. Observations on fugitive and opportunistic species also were useful.
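    For reference, the two diversity indices named above can be computed from raw abundance counts as in the sketch below; the example communities are invented to mimic an upstream reference site and an impacted site.

```python
import math

def shannon(counts):
    """Shannon-Weaver index H' = -sum(p_i * ln p_i)."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def brillouin(counts):
    """Brillouin index HB = (ln N! - sum(ln n_i!)) / N."""
    n = sum(counts)
    return (math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)) / n

upstream   = [55, 40, 32, 20, 12, 8, 5, 3]   # richer, more even macroinvertebrate community
downstream = [120, 9, 4, 2]                  # impacted: dominated by a tolerant taxon

for name, counts in [("upstream reference", upstream), ("below outfall", downstream)]:
    print(f"{name:18s}: H' = {shannon(counts):.2f}, HB = {brillouin(counts):.2f}")
```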

  15. Qualitative and Quantitative Assessment of Four Marketed Formulations of Brahmi

    PubMed Central

    Saini, Neeti; Mathur, Rajani; Agrawal, S. S.

    2012-01-01

    This study was conducted with the aim to compare two batches each of four popular commercial formulations of Bacopa monnieri (Brahmi) and to report, if any, inter-batch variations. The formulations were procured from the local market and analyzed for label specifications, uniformity of capsule weight, and identity, purity and strength parameters (total ash content, acid-insoluble ash content, water-soluble extractive, alcohol-soluble extractive, loss on drying). Bacoside A, one of the pharmacologically active saponins present in B. monnieri, was quantified in all the formulations using a UV spectrophotometer. In addition, each formulation was assessed and compared for variation in biological activity using an in vitro test for hemolytic activity with human erythrocytes. The results of the study show that there is a wide variation in the quality and content of herbal drugs marketed by different manufacturers. More importantly, this study demonstrates that there exists a bigger challenge of batch-to-batch variation in the quality and content of herbal formulations from the same manufacturer. This challenge of providing standardized formulations is faced not by any one manufacturing house but by all, and may be attributed firstly to a lack of stringent regulations and secondly to high variability in raw material quality. PMID:23204618

  16. Qualitative and quantitative assessment of four marketed formulations of brahmi.

    PubMed

    Saini, Neeti; Mathur, Rajani; Agrawal, S S

    2012-01-01

    This study was conducted with the aim to compare two batches each of four popular commercial formulations of Bacopa monnieri (Brahmi) and to report, if any, inter-batch variations. The formulations were procured from the local market and analyzed for label specifications, uniformity of capsule weight, and identity, purity and strength parameters (total ash content, acid-insoluble ash content, water-soluble extractive, alcohol-soluble extractive, loss on drying). Bacoside A, one of the pharmacologically active saponins present in B. monnieri, was quantified in all the formulations using a UV spectrophotometer. In addition, each formulation was assessed and compared for variation in biological activity using an in vitro test for hemolytic activity with human erythrocytes. The results of the study show that there is a wide variation in the quality and content of herbal drugs marketed by different manufacturers. More importantly, this study demonstrates that there exists a bigger challenge of batch-to-batch variation in the quality and content of herbal formulations from the same manufacturer. This challenge of providing standardized formulations is faced not by any one manufacturing house but by all, and may be attributed firstly to a lack of stringent regulations and secondly to high variability in raw material quality.

  17. Quantitative use of photography in orthognathic outcome assessment.

    PubMed

    Edler, R J; Wertheim, D; Greenhill, D; Jaisinghani, A

    2011-03-01

    This study reports an independent audit of two aspects of orthognathic surgery, namely control of inter-alar width and mandibular outline asymmetry. Measurements were taken from standardized photographs of a consecutive series of 27 patients, using an on-screen digitizing program (IPTool). All patients had undergone bimaxillary osteotomies involving maxillary impaction and/or advancement, by one surgeon, using a cinch suture for nasal width control. Nine to twelve months after surgery, inter-alar width had increased by a mean of just 0.08 cm (SD 0.3). Four patients showed an increase of just over 2 mm, whilst six showed a small reduction. Based on ratios of size (area) and shape (compactness) of the right and left mandibular segments, there was a small overall improvement in mandibular symmetry (0.019 and 0.005, respectively). Whilst in most of the patients the need for surgery was primarily the correction of antero-posterior and vertical discrepancies, five patients with demonstrable asymmetry showed a clear improvement. In three patients whose asymmetry scores were very mild pre-treatment, there was a small measured increase in asymmetry, but not to a degree that would be clinically noticeable. At a time when 3D imaging is still unavailable to many clinicians, the results of this study suggest that appropriate measurements taken from carefully standardized conventional photographs can provide a valid and objective means of assessing treatment outcome.
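
    The record does not spell out its size and shape ratios; the sketch below shows one plausible reading, assuming the common definition of compactness (4*pi*area / perimeter^2, equal to 1 for a circle) and expressing right/left symmetry as a ratio that equals 1 for perfect symmetry. The function names and inputs are illustrative, not the study's published formulas.

      import numpy as np

      def compactness(area, perimeter):
          # one common shape-compactness definition; equals 1.0 for a circle
          return 4.0 * np.pi * area / perimeter ** 2

      def mandibular_symmetry(right, left):
          # right, left: dicts with 'area' and 'perimeter' of the digitized segments
          size_ratio = min(right["area"], left["area"]) / max(right["area"], left["area"])
          c_r, c_l = compactness(**right), compactness(**left)
          shape_ratio = min(c_r, c_l) / max(c_r, c_l)
          return size_ratio, shape_ratio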

  18. A quantitative measure of handwriting dysfluency for assessing tardive dyskinesia.

    PubMed

    Caligiuri, Michael P; Teulings, Hans-Leo; Dean, Charles E; Lohr, James B

    2015-04-01

    Tardive dyskinesia (TD) is a movement disorder commonly associated with chronic exposure to antidopaminergic medications, which may in some cases be disfiguring and socially disabling. The consensus from a growing body of research on the incidence and prevalence of TD in the modern era of antipsychotics indicates that this disorder has not disappeared and continues to challenge the effective management of psychotic symptoms in patients with schizophrenia. A fundamental component in an effective strategy for managing TD is its reliable and accurate assessment. In the present study, we examined the clinical utility of a brief handwriting dysfluency measure for quantifying TD. Digitized samples of handwritten circles and loops were obtained from 62 psychosis patients with or without TD and from 50 healthy subjects. Two measures of dysfluent pen movements were extracted from each vertical pen stroke: normalized jerk and the number of acceleration peaks. Tardive dyskinesia patients exhibited significantly higher dysfluency scores than non-TD patients and controls. Severity of handwriting movement dysfluency was correlated with Abnormal Involuntary Movement Scale severity ratings for some tasks. The procedure yielded high degrees of test-retest reliability. These results suggest that measures of handwriting movement dysfluency may be particularly useful for objectively evaluating the efficacy of pharmacotherapeutic strategies for treating TD.

  19. Quantitative assessments of ecological impact/recovery in freshwater systems

    SciTech Connect

    Birge, W.J.; Keogh, D.P.; Zuiderveen, J.A. |

    1995-12-31

    Long-term studies were undertaken to evaluate the fidelity of multi-metric scoring systems and other means of quantifying the effects of chemical stresses on aquatic biota. Integrity of macroinvertebrate communities was assessed using the Rapid Bioassessment Protocol III; trophic group analysis; diversity indices; and various individual parameters, including species richness, abundance, and indicator species. In addition, chemical and toxicological monitoring data and periphyton studies were used in the evaluations. Surveys were performed at monitoring stations selected for comparable conditions, and included upstream reference and downstream recovery areas. In two streams, ecological impact varied from severe to extreme near point-source outfalls and decreased progressively with distance downstream. Station-to-station scoring with Protocol III and diversity indices correlated well with independent chemical and toxicological evaluations. However, in metal-stressed streams affected by slight to moderate impact, or which were in early recovery, Protocol III scoring and other family-level metrics did not consistently reflect losses in species richness and mean abundance of up to 32% and 75%, respectively. Observations on morphological deformities (e.g., eyespots, gills) and selected subfamily- and species-level metrics, including ratios of metal-sensitive to metal-tolerant chironomids, gave greater accuracy in characterizing low to moderate perturbations. In conclusion, however, it appeared that marginal losses in biodiversity over time may not be detectable with current procedures. Major factors affecting precision included the normal range of seasonal and annual fluctuations in ecological parameters within and among stream systems, inadequate historical data, and drought and high-water events.

  20. Quantitative assessment of corpus callosum morphology in periventricular nodular heterotopia.

    PubMed

    Pardoe, Heath R; Mandelstam, Simone A; Hiess, Rebecca Kucharsky; Kuzniecky, Ruben I; Jackson, Graeme D

    2015-01-01

    We investigated systematic differences in corpus callosum morphology in periventricular nodular heterotopia (PVNH). Differences in corpus callosum mid-sagittal area and subregional areas were measured using an automated software-based method. Heterotopic gray matter deposits were automatically labeled and compared with corpus callosum changes. The spatial pattern of corpus callosum changes was interpreted in the context of the characteristic anterior-posterior development of the corpus callosum in healthy individuals. Individuals with periventricular nodular heterotopia were imaged at the Melbourne Brain Center or as part of the multi-site Epilepsy Phenome Genome project. Whole-brain T1-weighted MRI was acquired in cases (n=48) and controls (n=663). The corpus callosum was segmented on the mid-sagittal plane using the software "yuki". Heterotopic gray matter and intracranial brain volumes were measured using Freesurfer. Differences in corpus callosum area and subregional areas were assessed, as well as the relationship between corpus callosum area and heterotopic GM volume. The anterior-posterior distribution of corpus callosum changes and heterotopic GM nodules was quantified using a novel metric, and the two distributions were compared with each other. Corpus callosum area was reduced by 14% in PVNH (p=1.59×10(-9)). The magnitude of the effect was least in the genu (7% reduction) and greatest in the isthmus and splenium (26% reduction). Individuals with higher heterotopic GM volume had a smaller corpus callosum. Heterotopic GM volume was highest in posterior brain regions; however, there was no linear relationship between the anterior-posterior position of corpus callosum changes and PVNH nodules. Reduced corpus callosum area is strongly associated with PVNH, and is probably associated with abnormal brain development in this neurological disorder. The primarily posterior corpus callosum changes may inform our understanding of the etiology of PVNH. Our results suggest that

  1. Real Time Quantitative Radiological Monitoring Equipment for Environmental Assessment

    SciTech Connect

    John R. Giles; Lyle G. Roybal; Michael V. Carpenter

    2006-03-01

    The Idaho National Laboratory (INL) has developed a suite of systems that rapidly scan, analyze, and characterize radiological contamination in soil. These systems have been successfully deployed at several Department of Energy (DOE) laboratories and Cold War Legacy closure sites. Traditionally, these systems have been used during the characterization and remediation of radiologically contaminated soils and surfaces; however, subsequent to the terrorist attacks of September 11, 2001, the applications of these systems have expanded to include homeland security operations for first response, continuing assessment and verification of cleanup activities in the event of the detonation of a radiological dispersal device. The core system components are a detector, a spectral analyzer, and a global positioning system (GPS). The system is computer controlled by menu-driven, user-friendly custom software designed for a technician-level operator. A wide variety of detectors have been used including several configurations of sodium iodide (NaI) and high-purity germanium (HPGe) detectors, and a large area proportional counter designed for the detection of x-rays from actinides such as Am-241 and Pu-238. Systems have been deployed from several platforms including a small all-terrain vehicle (ATV), hand-pushed carts, a backpack mounted unit, and an excavator mounted unit used where personnel safety considerations are paramount. The INL has advanced this concept, and expanded the system functionality to create an integrated, field-deployed analytical system through the use of tailored analysis and operations software. Customized, site specific software is assembled from a supporting toolbox of algorithms that streamline the data acquisition, analysis and reporting process. These algorithms include region specific spectral stripping, automated energy calibration, background subtraction, activity calculations based on measured detector efficiencies, and on-line data quality checks

  2. Mimicking human texture classification

    NASA Astrophysics Data System (ADS)

    van Rikxoort, Eva M.; van den Broek, Egon L.; Schouten, Theo E.

    2005-03-01

    In an attempt to mimic human (colorful) texture classification with a clustering algorithm, three lines of research were pursued, using a test set of 180 texture images (both their color and gray-scale equivalents) drawn from the OuTex and VisTex databases. First, a k-means algorithm was applied with three feature vectors, based on color/gray values, four texture features, and their combination. Second, 18 participants clustered the images using a newly developed card-sorting program. The mutual agreement between the participants was 57% and 56%, and between the algorithm and the participants it was 47% and 45%, for color and gray-scale texture images respectively. Third, in a benchmark, 30 participants judged the algorithm's clusters of gray-scale textures as more homogeneous than those of colored textures. However, a high interpersonal variability was present for both the color and the gray-scale clusters. So, despite the promising results, it is questionable whether average human texture classification can be mimicked (if it exists at all).
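
    For readers unfamiliar with the clustering step, the sketch below clusters texture feature vectors with k-means and scores agreement against one participant's card sort. The feature values, the number of clusters, and the use of the adjusted Rand index (the record reports percentage agreement instead) are all assumptions for illustration.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics import adjusted_rand_score

      rng = np.random.default_rng(0)
      features = rng.normal(size=(180, 7))                 # placeholder color + texture features

      algo_labels = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(features)

      participant_labels = rng.integers(0, 6, size=180)    # placeholder card-sort labels
      print(adjusted_rand_score(participant_labels, algo_labels))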

  3. Xanthomatous pleuritis mimicking mesothelioma.

    PubMed

    McGuire, Franklin R; Gourdin, Todd; Finley, James L; Downie, Gordon

    2009-01-01

    Recurrent non-malignant exudative effusions remain a diagnostic and, potentially, a management dilemma. Fluid characteristics frequently narrow the differential but fail to offer a definitive diagnosis. Medical thoracoscopy is well tolerated and allows direct visualization and biopsy of pleural processes under conscious sedation. Rarely, macroscopic appearance and even histology may be misleading. We present a case of xanthomatous pleuritis that mimicked early mesothelioma. Our patient was a 69-year-old female with a large left pleural effusion. Her medical history was significant for a recent small pericardial effusion without cardiac dysfunction. Thoracentesis revealed a non-malignant exudative effusion. Thoracoscopy demonstrated two foci of raised soft plaques with petechial hemorrhage and adhesions. Preliminary evaluation suggested chronic inflammation admixed with proliferating spindle cells and necrosis. The immunohistochemical phenotype of the spindle cells favored a spindle and epithelioid cell neoplasm, mesothelioma. Because of discord between pathologists, we repeated the thoracoscopy through the existing chest tube/thoracoscopy site. We acquired more tissue for special stains and outside review. Following extensive immunohistochemistry, the diagnosis of xanthomatous pleuritis was made. Our patient quickly recovered with steroid therapy and is without recurrence 18 months later. This case demonstrates the utility and nuances of medical thoracoscopy in a perplexing case of xanthomatous pleuritis. PMID:18223309

  4. Towards quantitative ecological risk assessment of elevated carbon dioxide levels in the marine environment.

    PubMed

    de Vries, Pepijn; Tamis, Jacqueline E; Foekema, Edwin M; Klok, Chris; Murk, Albertinka J

    2013-08-30

    The environmental impact of elevated carbon dioxide (CO2) levels has become of greater interest in recent years, in relation to globally rising CO2 levels and the related consideration of geological CO2 storage as a mitigating measure. In the present study, effect data from the literature were collected in order to conduct a marine ecological risk assessment of elevated CO2 levels using a Species Sensitivity Distribution (SSD). It became evident that the information currently available in the literature is mostly insufficient for such a quantitative approach. Most studies focus on effects of expected future CO2 levels, testing only one or two elevated concentrations. A full dose-response relationship, a uniform measure of exposure, and standardized test protocols are essential for conducting a proper quantitative risk assessment of elevated CO2 levels. Improvements are proposed to make future tests more valuable and usable for quantitative risk assessment.
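
    A Species Sensitivity Distribution is typically built by fitting a log-normal distribution to per-species effect concentrations and reading off a hazardous concentration such as the HC5. The Python sketch below uses hypothetical effect levels, not the data set discussed in this record.

      import numpy as np
      from scipy import stats

      effect_levels = np.array([1.2e3, 2.5e3, 3.1e3, 4.8e3,
                                7.0e3, 9.5e3, 1.4e4, 2.2e4])   # hypothetical, one per species

      mu, sigma = stats.norm.fit(np.log10(effect_levels))       # log-normal SSD parameters

      hc5 = 10 ** stats.norm.ppf(0.05, loc=mu, scale=sigma)     # level protecting 95% of species

      def potentially_affected_fraction(exposure):
          # fraction of species expected to be affected at a given exposure level
          return stats.norm.cdf(np.log10(exposure), loc=mu, scale=sigma)

      print(hc5, potentially_affected_fraction(5.0e3))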

  5. Reliability of Quantitative Ultrasonic Assessment of Normal-Tissue Toxicity in Breast Cancer Radiotherapy

    SciTech Connect

    Yoshida, Emi J.; Chen Hao; Torres, Mylin; Andic, Fundagul; Liu Haoyang; Chen Zhengjia; Sun, Xiaoyan; Curran, Walter J.; Liu Tian

    2012-02-01

    Purpose: We have recently reported that ultrasound imaging, together with ultrasound tissue characterization (UTC), can provide quantitative assessment of radiation-induced normal-tissue toxicity. This study's purpose is to evaluate the reliability of our quantitative ultrasound technology in assessing acute and late normal-tissue toxicity in breast cancer radiotherapy. Method and Materials: Our ultrasound technique analyzes radiofrequency echo signals and provides quantitative measures of dermal, hypodermal, and glandular tissue toxicities. To facilitate easy clinical implementation, we further refined this technique by developing a semiautomatic ultrasound-based toxicity assessment tool (UBTAT). Seventy-two ultrasound studies of 26 patients (720 images) were analyzed. Images of 8 patients were evaluated for acute toxicity (<6 months postradiotherapy) and those of 18 patients were evaluated for late toxicity ({>=}6 months postradiotherapy). All patients were treated according to a standard radiotherapy protocol. To assess intraobserver reliability, one observer analyzed 720 images in UBTAT and then repeated the analysis 3 months later. To assess interobserver reliability, three observers (two radiation oncologists and one ultrasound expert) each analyzed 720 images in UBTAT. An intraclass correlation coefficient (ICC) was used to evaluate intra- and interobserver reliability. Ultrasound assessment and clinical evaluation were also compared. Results: Intraobserver ICC was 0.89 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.96 for glandular tissue toxicity. Interobserver ICC was 0.78 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.94 for glandular tissue toxicity. Statistical analysis found significant changes in dermal (p < 0.0001), hypodermal (p = 0.0027), and glandular tissue (p < 0.0001) assessments in the acute toxicity group. Ultrasound measurements correlated with clinical Radiation Therapy Oncology Group (RTOG) toxicity scores of patients
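
    The intraclass correlation coefficient reported here can be computed from a subjects-by-raters score matrix; the sketch below implements the two-way random-effects, absolute-agreement, single-rater form ICC(2,1). The record does not state which ICC variant was used, so this choice is an assumption.

      import numpy as np

      def icc_2_1(scores):
          # scores: array of shape (n_subjects, k_raters), e.g. toxicity scores per image
          x = np.asarray(scores, dtype=float)
          n, k = x.shape
          grand = x.mean()
          row_means, col_means = x.mean(axis=1), x.mean(axis=0)
          msr = k * np.sum((row_means - grand) ** 2) / (n - 1)         # between subjects
          msc = n * np.sum((col_means - grand) ** 2) / (k - 1)         # between raters
          sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
          mse = sse / ((n - 1) * (k - 1))                              # residual
          return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)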

  6. High-throughput automated image analysis of neuroinflammation and neurodegeneration enables quantitative assessment of virus neurovirulence

    PubMed Central

    Maximova, Olga A.; Murphy, Brian R.; Pletnev, Alexander G.

    2010-01-01

    Historically, the safety of live attenuated vaccine candidates against neurotropic viruses was assessed by semi-quantitative analysis of virus-induced histopathology in the central nervous system of monkeys. We have developed a high-throughput automated image analysis (AIA) for the quantitative assessment of virus-induced neuroinflammation and neurodegeneration. Evaluation of the results generated by AIA showed that quantitative estimates of lymphocytic infiltration, microglial activation, and neurodegeneration strongly and significantly correlated with results of traditional histopathological scoring. In addition, we show that AIA is a targeted, objective, accurate, and time-efficient approach that provides reliable differentiation of virus neurovirulence. As such, it may become a useful tool in establishing consistent analytical standards across research and development laboratories and regulatory agencies, and may improve the safety evaluation of live virus vaccines. The implementation of this high-throughput AIA will markedly advance many fields of research including virology, neuroinflammation, neuroscience, and vaccinology. PMID:20688036

  7. The quantitative assessment of domino effects caused by overpressure. Part I. Probit models.

    PubMed

    Cozzani, Valerio; Salzano, Ernesto

    2004-03-19

    Accidents caused by the domino effect are among the most severe that have taken place in the chemical and process industries. However, a well-established and widely accepted methodology for the quantitative assessment of the contribution of domino accidents to industrial risk is still missing. Hence, available data on damage to process equipment caused by blast waves were revised in the framework of quantitative risk analysis, aiming at the quantitative assessment of domino effects caused by overpressure. Specific probit models were derived for several categories of process equipment and were compared to other literature approaches for the prediction of the probability of damage to equipment loaded by overpressure. The results show the importance of using equipment-specific models for the probability of damage and equipment-specific damage threshold values, rather than general equipment correlations, which may lead to errors of up to 500%. PMID:15072815
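
    Equipment-damage probit models of the kind derived in this work are commonly of the form Y = k1 + k2 ln(P), with the damage probability obtained as the standard normal CDF of (Y - 5). The sketch below shows that calculation; the coefficients used are placeholders, not values from the paper.

      import numpy as np
      from scipy.stats import norm

      def damage_probability(overpressure_kpa, k1, k2):
          # probit value Y = k1 + k2 * ln(P); probability = Phi(Y - 5)
          y = k1 + k2 * np.log(overpressure_kpa)
          return norm.cdf(y - 5.0)

      # placeholder coefficients for a hypothetical equipment category
      print(damage_probability(30.0, k1=-3.5, k2=2.5))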

  8. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; (2) measurement and estimation of exposures for better extrapolation to humans; and (3) the use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  9. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  10. QUANTITATIVE ASSESSMENT OF CORAL DISEASES IN THE FLORIDA KEYS: STRATEGY AND METHODOLOGY

    EPA Science Inventory

    Most studies of coral disease have focused on the incidence of a single disease within a single location. Our overall objective is to use quantitative assessments to characterize annual patterns in the distribution and frequency of scleractinian and gorgonian coral diseases over ...

  11. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  12. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  13. Effect of Teacher Specialized Training on Limited English Speaking Students' Assessment Outcomes: Quantitative Study

    ERIC Educational Resources Information Center

    Palaroan, Michelle A.

    2009-01-01

    The quantitative study was a comparison of Limited English Proficient (LEP) students' assessment outcomes when taught by a teacher with specialized training and when taught by teachers with no specialized training. The comparison of 2007-2008 Northern Nevada LEP third grade student scores in the content areas of English language arts and…

  14. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical…

  15. A Quantitative Synthesis of Developmental Disability Research: The Impact of Functional Assessment Methodology on Treatment Effectiveness

    ERIC Educational Resources Information Center

    Delfs, Caitlin H.; Campbell, Jonathan M.

    2010-01-01

    Methods and outcomes from functional behavioral assessment have been researched widely over the past twenty-five years. However, several important research questions have yet to be examined sufficiently. This quantitative review of developmental disability research aims to make comparisons of different functional behavioral assessment…

  16. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161

  17. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses.

    PubMed

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning.

  18. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in their designs, analysis methods, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.

  19. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses.

    PubMed

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161

  20. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and societal risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fireball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate the two methods. It is indicated that both methods can be applied in practice, and the choice between them depends on the basic data actually available for the gas pipelines and the precision requirements of the risk assessment.

  1. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
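
    Distribution fitting with bootstrap uncertainty, as described above, can be sketched as follows; the storage-time data, the choice of a log-normal candidate distribution, and the number of resamples are illustrative assumptions rather than the survey's actual analysis.

      import numpy as np
      from scipy import stats

      storage_days = np.array([1, 2, 2, 3, 3, 3, 4, 5, 5, 7, 8, 10, 14, 21])  # hypothetical answers

      shape, loc, scale = stats.lognorm.fit(storage_days, floc=0)   # variation over individuals

      rng = np.random.default_rng(1)
      medians = []
      for _ in range(2000):                                         # bootstrap for uncertainty
          resample = rng.choice(storage_days, size=storage_days.size, replace=True)
          s, _, sc = stats.lognorm.fit(resample, floc=0)
          medians.append(stats.lognorm.median(s, loc=0, scale=sc))
      print(np.percentile(medians, [2.5, 97.5]))                    # 95% uncertainty interval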

  2. Repeatability Assessment by ISO 11843-7 in Quantitative HPLC for Herbal Medicines.

    PubMed

    Chen, Liangmian; Kotani, Akira; Hakamata, Hideki; Tsutsumi, Risa; Hayashi, Yuzuru; Wang, Zhimin; Kusu, Fumiyo

    2015-01-01

    We have proposed an assessment method to estimate the measurement relative standard deviation (RSD) of chromatographic peaks in quantitative HPLC for herbal medicines using the methodology of ISO 11843 Part 7 (ISO 11843-7:2012), which provides detection limits stochastically. In quantitative HPLC with UV detection (HPLC-UV) of Scutellaria Radix for the determination of baicalin, the measurement RSD of baicalin obtained stochastically by ISO 11843-7:2012 was within the 95% confidence interval of the RSD obtained statistically from repetitive measurements (n = 6). Thus, our findings show that the method is applicable for estimating the repeatability of HPLC-UV for determining baicalin without repeated measurements. In addition, the allowable limit of the "System repeatability" in "Liquid Chromatography" regulated in a pharmacopoeia can be obtained by the present assessment method. Moreover, the present assessment method was also successfully applied to estimate the measurement RSDs of quantitative three-channel liquid chromatography with electrochemical detection (LC-3ECD) of Chrysanthemi Flos for determining caffeoylquinic acids and flavonoids. By the present repeatability assessment method, a reliable measurement RSD was obtained stochastically, and the experimental time was remarkably reduced. PMID:26353956

  3. The edaphic quantitative protargol stain: a sampling protocol for assessing soil ciliate abundance and diversity.

    PubMed

    Acosta-Mercado, Dimaris; Lynn, Denis H

    2003-06-01

    It has been suggested that species loss from microbial groups low in diversity that occupy trophic positions close to the base of the detrital food web could be critical for terrestrial ecosystem functioning. Among the protozoans within the soil microbial loop, ciliates are presumably the least abundant and of low diversity. However, the lack of a standardized method to quantitatively enumerate and identify them has hampered our knowledge about the magnitude of their active and potential diversity, and about the interactions in which they are involved. Thus, the Edaphic Quantitative Protargol Staining (EQPS) method is provided to simultaneously account for ciliate species richness and abundance in a quantitative and qualitative way. This direct method allows this rapid and simultaneous assessment by merging the Non-flooded Petri Dish (NFPD) method [Prog. Protistol. 2 (1987) 69] and the Quantitative Protargol Stain (QPS) method [Montagnes, D.J.S., Lynn, D.H., 1993. A quantitative protargol stain (QPS) for ciliates and other protists. In: Kemp, P.F., Sherr, B.F., Sherr, E.B., Cole, J.J. (Eds.), Handbook of Methods in Aquatic Microbial Ecology. Lewis Publishers, Boca Raton, FL, pp. 229-240]. The abovementioned protocols were refined by experiments examining the spatial distribution of ciliates under natural field conditions, sampling intensity, the effect of storage, and the use of cytological preparations versus live observations. The EQPS could be useful in ecological studies since it provides both a "snapshot" of the active and effective diversity and a robust estimate of the potential diversity.

  4. Quantitative assessment of binding affinities for nanoparticles targeted to vulnerable plaque.

    PubMed

    Tang, Tang; Tu, Chuqiao; Chow, Sarah Y; Leung, Kevin H; Du, Siyi; Louie, Angelique Y

    2015-06-17

    Recent successes in targeted immune and cell-based therapies have driven new directions for pharmaceutical research. With the rise of these new therapies there is an unfilled need for companion diagnostics to assess patients' potential for therapeutic response. Targeted nanomaterials have been widely investigated to fill this niche; however, in contrast to small molecule or peptide-based targeted agents, binding affinities are not reported for nanomaterials, and to date there has been no standard, quantitative measure for the interaction of targeted nanoparticle agents with their targets. Without a standard measure, accurate comparisons between systems and optimization of targeting behavior are challenging. Here, we demonstrate a method for quantitative assessment of the binding affinity for targeted nanoparticles to cell surface receptors in living systems and apply it to optimize the development of a novel targeted nanoprobe for imaging vulnerable atherosclerotic plaques. In this work, we developed sulfated dextran-coated iron oxide nanoparticles with specific targeting to macrophages, a cell type whose density strongly correlates with plaque vulnerability. Detailed quantitative, in vitro characterizations of (111)In(3+) radiolabeled probes show high-affinity binding to the macrophage scavenger receptor A (SR-A). Cell uptake studies illustrate that higher surface sulfation levels result in much higher uptake efficiency by macrophages. We use a modified Scatchard analysis to quantitatively describe nanoparticle binding to targeted receptors. This characterization represents a potential new standard metric for targeted nanomaterials. PMID:25970303
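
    For context, a classical Scatchard analysis estimates the dissociation constant Kd and binding capacity Bmax from the linear relation B/F = Bmax/Kd - B/Kd; the sketch below fits that line. The paper uses a modified Scatchard analysis, so this is the textbook form rather than the authors' exact procedure.

      import numpy as np

      def scatchard_fit(bound, free):
          # bound, free: concentrations of receptor-bound and free nanoparticle probe
          bound = np.asarray(bound, dtype=float)
          ratio = bound / np.asarray(free, dtype=float)
          slope, intercept = np.polyfit(bound, ratio, 1)   # linear fit of B/F versus B
          kd = -1.0 / slope                                # slope = -1/Kd
          bmax = intercept * kd                            # intercept = Bmax/Kd
          return kd, bmax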

  5. Dirofilariasis Mimicking an Acute Scrotum.

    PubMed

    Bertozzi, Mirko; Rinaldi, Victoria Elisa; Prestipino, Marco; Giovenali, Paolo; Appignani, Antonino

    2015-10-01

    Human infections caused by Dirofilaria repens have been reported in many areas of the world. We describe a case of a 3-year-old child with an intrascrotal mass caused by D repens mimicking an acute scrotum. This represents the first case of scrotal dirofilariasis described in pediatric age with such an unusual presentation.

  6. Reliability of quantitative ultrasonic assessment of normal-tissue toxicity in breast cancer radiotherapy

    PubMed Central

    Yoshida, Emi J.; Chen, Hao; Torres, Mylin; Andic, Fundagul; Liu, Hao-Yang; Chen, Zhengjia; Sun, Xiaoyan; Curran, Walter J; Liu, Tian

    2011-01-01

    Purpose We have recently reported that ultrasound imaging, together with ultrasound tissue characterization (UTC), can provide quantitative assessment of radiation-induced normal-tissue toxicity. This study’s purpose is to evaluate the reliability of our quantitative ultrasound technology in assessing acute and late normal-tissue toxicity in breast cancer radiotherapy. Method and Materials Our ultrasound technique analyzes radio-frequency echo signals and provides quantitative measures of dermal, hypodermal, and glandular-tissue toxicities. To facilitate easy clinical implementation, we further refined this technique by developing a semi-automatic ultrasound-based toxicity assessment tool (UBTAT). Seventy-two ultrasound studies of 26 patients (720 images) were analyzed. Images of 8 patients were evaluated for acute toxicity (<6 months post radiotherapy) and those of 18 patients were evaluated for late toxicity (≥6 months post radiotherapy). All patients were treated according to a standard radiotherapy protocol. To assess intra-observer reliability, one observer analyzed 720 images in UBTAT and then repeated the analysis 3 months later. To assess inter-observer reliability, three observers (two radiation oncologists and one ultrasound expert) each analyzed 720 images in UBTAT. An intraclass correlation coefficient (ICC) was used to evaluate intra- and inter-observer reliability. Ultrasound assessment and clinical evaluation were also compared. Results Intra-observer ICC was 0.89 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.96 for glandular-tissue toxicity. Inter-observer ICC was 0.78 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.94 for glandular-tissue toxicity. Statistical analysis found significant changes in dermal (p < 0.0001), hypodermal (p=0.0027), and glandular-tissue (p < 0.0001) assessments in the acute toxicity group. Ultrasound measurements correlated with clinical RTOG toxicity scores of patients in the late toxicity group

  7. Quantitative assessment of radiation force effect at the dielectric air-liquid interface

    PubMed Central

    Capeloto, Otávio Augusto; Zanuto, Vitor Santaella; Malacarne, Luis Carlos; Baesso, Mauro Luciano; Lukasievicz, Gustavo Vinicius Bassi; Bialkowski, Stephen Edward; Astrath, Nelson Guilherme Castelli

    2016-01-01

    We induce nanometer-scale surface deformation by exploiting momentum conservation of the interaction between laser light and dielectric liquids. The effect of radiation force at the air-liquid interface is quantitatively assessed for fluids with different density, viscosity and surface tension. The imparted pressure on the liquids by continuous or pulsed laser light excitation is fully described by the Helmholtz electromagnetic force density. PMID:26856622

  8. Postoperative Quantitative Assessment of Reconstructive Tissue Status in Cutaneous Flap Model using Spatial Frequency Domain Imaging

    PubMed Central

    Yafi, Amr; Vetter, Thomas S; Scholz, Thomas; Patel, Sarin; Saager, Rolf B; Cuccia, David J; Evans, Gregory R; Durkin, Anthony J

    2010-01-01

    Background The purpose of this study is to investigate the capabilities of a novel optical wide-field imaging technology known as Spatial Frequency Domain Imaging (SFDI) to quantitatively assess reconstructive tissue status. Methods Twenty-two cutaneous pedicle flaps were created on eleven rats based on the inferior epigastric vessels. After baseline measurement, all flaps underwent vascular ischemia, induced by clamping the supporting vessels for two hours (either arterio-venous or selective venous occlusion); normal saline was then injected into the control flap, and hypertonic hyperoncotic saline solution into the experimental flap. Flaps were monitored for two hours after reperfusion. The SFDI system was used for quantitative assessment of flap status over the duration of the experiment. Results All flaps demonstrated a significant decline in oxy-hemoglobin and tissue oxygen saturation in response to occlusion. Total hemoglobin and deoxy-hemoglobin were markedly increased in the selective venous occlusion group. After reperfusion and administration of the solutions, oxy-hemoglobin and tissue oxygen saturation in those flaps that survived gradually returned to baseline levels. However, flaps in which oxy-hemoglobin and tissue oxygen saturation did not show any signs of recovery appeared to be compromised and eventually became necrotic within 24–48 hours in both occlusion groups. Conclusion SFDI technology provides a quantitative, objective method to assess tissue status. This study demonstrates the potential of this optical technology to assess tissue perfusion in a very precise and quantitative way, enabling wide-field visualization of physiological parameters. The results of this study suggest that SFDI may provide a means for prospectively identifying dysfunctional flaps well in advance of failure. PMID:21200206

  9. A method of quantitative risk assessment for transmission pipeline carrying natural gas.

    PubMed

    Jo, Young-Do; Ahn, Bum Jong

    2005-08-31

    Regulatory authorities in many countries are moving away from prescriptive approaches for keeping natural gas pipelines safe. As an alternative, risk management based on a quantitative assessment is being considered to improve the level of safety. This paper focuses on the development of a simplified method for the quantitative risk assessment for natural gas pipelines and introduces parameters of fatal length and cumulative fatal length. The fatal length is defined as the integrated fatality along the pipeline associated with hypothetical accidents. The cumulative fatal length is defined as the section of pipeline in which an accident leads to N or more fatalities. These parameters can be estimated easily by using the information of pipeline geometry and population density of a Geographic Information Systems (GIS). To demonstrate the proposed method, individual and societal risks for a sample pipeline have been estimated from the historical data of European Gas Pipeline Incident Data Group and BG Transco. With currently acceptable criteria taken into account for individual risk, the minimum proximity of the pipeline to occupied buildings is approximately proportional to the square root of the operating pressure of the pipeline. The proposed method of quantitative risk assessment may be useful for risk management during the planning and building stages of a new pipeline, and modification of a buried pipeline.
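
    The fatal length introduced above is, in essence, the fatality probability associated with a hypothetical accident integrated along the pipeline; multiplying it by the accident frequency per unit length gives an individual-risk estimate. The sketch below illustrates this with a placeholder fatality-probability profile and an assumed accident rate, not the paper's consequence model.

      import numpy as np

      def fatal_length(positions_m, fatality_prob):
          # integrate fatality probability over pipeline position (result in metres)
          return np.trapz(fatality_prob, positions_m)

      x = np.linspace(-500.0, 500.0, 201)           # pipeline positions around a dwelling (m)
      p = np.exp(-(x / 120.0) ** 2)                 # placeholder fatality-probability curve

      accident_rate = 2.0e-7                        # accidents per metre-year (assumed)
      individual_risk = accident_rate * fatal_length(x, p)
      print(individual_risk)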

  10. The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses

    NASA Astrophysics Data System (ADS)

    Kirkman, Thomas W.; Jensen, Ellen

    2016-06-01

    The innumeracy of American students and adults is a much-lamented educational problem. The quantitative reasoning skills of college students may be particularly addressed and improved in "general education" science courses like Astro 101. Demonstrating improvement requires a standardized instrument. Among the non-proprietary instruments, the Quantitative Literacy and Reasoning Assessment[1] (QLRA) and the Quantitative Reasoning for College Science (QuaRCS) Assessment[2] stand out. Follette et al. developed the QuaRCS in the context of Astro 101 at the University of Arizona. We report on QuaRCS results in different contexts: pre-med physics and pre-nursing microbiology at a liberal arts college. We report on the mismatch between students' contemporaneous report of a question's difficulty and the actual probability of success. We report correlations between QuaRCS and other assessments of overall student performance in the class. We report differences in attitude towards mathematics in these two different but health-related student populations. [1] QLRA, Gaze et al., 2014, DOI: http://dx.doi.org/10.5038/1936-4660.7.2.4 [2] QuaRCS, Follette, et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2

  11. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.

  12. Combining qualitative and quantitative imaging evaluation for the assessment of genomic DNA integrity: The SPIDIA experience.

    PubMed

    Ciniselli, Chiara Maura; Pizzamiglio, Sara; Malentacchi, Francesca; Gelmini, Stefania; Pazzagli, Mario; Hartmann, Christina C; Ibrahim-Gawel, Hady; Verderio, Paolo

    2015-06-15

    In this note, we present an ad hoc procedure that combines qualitative (visual evaluation) and quantitative (ImageJ software) evaluations of Pulsed-Field Gel Electrophoresis (PFGE) images to assess the genomic DNA (gDNA) integrity of analyzed samples. This procedure could be suitable for the analysis of a large number of images by taking into consideration both the expertise of researchers and the objectiveness of the software. We applied this procedure on the first SPIDIA DNA External Quality Assessment (EQA) samples. Results show that the classification obtained by this ad hoc procedure allows a more accurate evaluation of gDNA integrity with respect to a single approach.

  13. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katie

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
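
    The framework combines probabilistic inputs via Monte Carlo sampling to produce probabilistic impact outputs; the sketch below illustrates that pattern for a gas-development example. Every distribution, parameter value, and conversion factor is hypothetical.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000                                                        # Monte Carlo iterations

      gas_tcf = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)       # recoverable gas (tcf)
      gas_per_well = rng.triangular(0.5e-3, 1.0e-3, 2.0e-3, size=n)      # tcf produced per well
      pad_area_km2 = rng.uniform(0.02, 0.06, size=n)                     # disturbed area per pad
      wells_per_pad = rng.integers(4, 17, size=n)                        # wells drilled per pad

      wells = gas_tcf / gas_per_well
      disturbed_km2 = wells / wells_per_pad * pad_area_km2               # possible habitat impact

      print(np.percentile(disturbed_km2, [5, 50, 95]))                   # uncertainty carried forward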

  14. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, absent is direction on how to refine the process with quantitative metrics of attractiveness. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; plant and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use sugar nectar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry. PMID:27197566
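
    One way to read the proposal is that nectar sugar concentration scales the exposure term of a tier I risk quotient through a crop attractiveness factor; the sketch below shows that arithmetic. The scaling form, the reference sugar value, and the variable names are illustrative assumptions, not the authors' published calibration.

      def refined_risk_quotient(exposure, toxicity_endpoint,
                                nectar_sugar_mg_per_g, reference_sugar_mg_per_g=500.0):
          # crop attractiveness factor (CAF): sugar content relative to a reference, capped at 1
          caf = min(nectar_sugar_mg_per_g / reference_sugar_mg_per_g, 1.0)
          return (exposure * caf) / toxicity_endpoint

      # a weakly attractive crop reduces the estimated risk quotient proportionally
      print(refined_risk_quotient(exposure=2.0, toxicity_endpoint=10.0,
                                  nectar_sugar_mg_per_g=120.0))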

  15. A methodology for the quantitative risk assessment of major accidents triggered by seismic events.

    PubMed

    Antonioni, Giacomo; Spadoni, Gigliola; Cozzani, Valerio

    2007-08-17

    A procedure for the quantitative risk assessment of accidents triggered by seismic events in industrial facilities was developed. The starting point of the procedure was the use of available historical data to assess the expected frequencies and the severity of seismic events. Available equipment-dependant failure probability models (vulnerability or fragility curves) were used to assess the damage probability of equipment items due to a seismic event. An analytic procedure was subsequently developed to identify, evaluate the credibility and finally assess the expected consequences of all the possible scenarios that may follow the seismic events. The procedure was implemented in a GIS-based software tool in order to manage the high number of event sequences that are likely to be generated in large industrial facilities. The developed methodology requires a limited amount of additional data with respect to those used in a conventional QRA, and yields with a limited effort a preliminary quantitative assessment of the contribution of the scenarios triggered by earthquakes to the individual and societal risk indexes. The application of the methodology to several case-studies evidenced that the scenarios initiated by seismic events may have a relevant influence on industrial risk, both raising the overall expected frequency of single scenarios and causing specific severe scenarios simultaneously involving several plant units.
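
    A minimal sketch of the central computation, combining a seismic hazard estimate with an equipment fragility curve and an escalation probability to obtain the expected frequency of a triggered scenario, is given below. The lognormal fragility parameters, hazard bins and escalation probability are illustrative assumptions, not values from the methodology described above.

        from math import log
        from statistics import NormalDist

        def fragility(pga_g, median_g=0.6, beta=0.5):
            """Lognormal fragility curve: P(damage | peak ground acceleration).
            median_g and beta are illustrative values for a generic vessel class."""
            return NormalDist().cdf(log(pga_g / median_g) / beta)

        # Hypothetical seismic hazard: annual occurrence frequency of events in each PGA bin.
        hazard = {0.2: 1e-2, 0.4: 2e-3, 0.6: 5e-4}   # PGA (g) -> events per year

        # Expected annual frequency of a release scenario from one equipment item:
        # sum over hazard bins of event frequency x damage probability x escalation probability.
        p_escalation = 0.3   # illustrative conditional probability of loss of containment
        f_scenario = sum(freq * fragility(pga) * p_escalation for pga, freq in hazard.items())
        print(f"{f_scenario:.2e} per year")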

  17. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020

  18. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI

    PubMed Central

    Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2015-01-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2–8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253

  19. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI.

    PubMed

    Klohs, Jan; Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2016-09-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2-8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253

  20. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-02-05

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have the following two drawbacks: (1) they are susceptible to subjective factors; (2) they only have several rating levels and are influenced by a ceiling effect, making it impossible to exactly detect any further improvement in the movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, there are many data that need to be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess the upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. The results also indicated that the proposed system not only reduces the amount of data during the sampling and transmission processes, but also that the reconstructed accelerometer signals can be used for quantitative assessment without any loss of useful information.
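
    The compression and reconstruction step can be sketched generically as follows: a window of accelerometer samples that is sparse in a transform basis is projected onto a short random measurement vector (here about one third of the raw length), transmitted, and recovered by solving an L1-regularised inverse problem. The DCT basis, window length, measurement matrix and Lasso penalty below are assumptions for illustration, not the system described in the paper.

        import numpy as np
        from scipy.fft import idct
        from sklearn.linear_model import Lasso

        rng = np.random.default_rng(0)
        n = 256          # raw accelerometer window length (illustrative)
        m = n // 3       # compressed length, < 1/3 of the raw signal

        # Simulated window that is sparse in the DCT domain (a stand-in for a
        # smooth Bobath-handshake acceleration trace).
        coeffs = np.zeros(n)
        coeffs[[2, 5, 9, 14]] = [1.0, 0.6, -0.4, 0.3]
        x = idct(coeffs, norm="ortho")

        # Compressed sensing: random Gaussian measurements y = Phi @ x are all
        # that would need to be transmitted to the computer.
        Phi = rng.standard_normal((m, n)) / np.sqrt(m)
        y = Phi @ x

        # Reconstruction: sparse (L1) recovery of the DCT coefficients.
        A = Phi @ idct(np.eye(n), axis=0, norm="ortho")   # maps coefficients to measurements
        lasso = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000).fit(A, y)
        x_hat = idct(lasso.coef_, norm="ortho")

        print("relative reconstruction error:",
              np.linalg.norm(x - x_hat) / np.linalg.norm(x))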

  1. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed Central

    Hertzberg, Richard C; Teuschler, Linda K

    2002-01-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult, lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions. PMID:12634126
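
    Of the formulas discussed, plain dose addition is the simplest to state; a common screening form is the hazard index, sketched below with hypothetical doses and reference doses. The interaction-weighted (weight-of-evidence) variants extend this sum with pairwise interaction terms and are not shown.

        def hazard_index(doses, reference_doses):
            """Dose-addition screening formula: HI = sum(dose_i / RfD_i).
            An HI above 1 flags potential concern under the dose-addition assumption."""
            return sum(d / rfd for d, rfd in zip(doses, reference_doses))

        # Hypothetical three-chemical mixture (doses and reference doses in mg/kg-day).
        print(hazard_index(doses=[0.02, 0.005, 0.01],
                           reference_doses=[0.1, 0.05, 0.02]))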

  4. Fibrosis assessment: impact on current management of chronic liver disease and application of quantitative invasive tools.

    PubMed

    Wang, Yan; Hou, Jin-Lin

    2016-05-01

    Fibrosis, a common pathogenic pathway of chronic liver disease (CLD), has long been recognized as significantly and, most importantly, associated with poor prognosis. With remarkable advances in the understanding and treatment of major CLDs such as hepatitis C, hepatitis B, and nonalcoholic fatty liver disease, there is an unprecedented requirement for the diagnosis and assessment of liver fibrosis or cirrhosis in various clinical settings. Among the available approaches, liver biopsy remains the one that possibly provides the most direct and reliable information on fibrosis patterns and parenchymal changes at different clinical stages and with different etiologies. Many endeavors have therefore been undertaken to develop quantitation-based methodologies for this invasive assessment. Here, we analyze the impact of fibrosis assessment on CLD patient care based on data from recent clinical studies. We discuss and update the current invasive tools with regard to their technological features and potential for particular clinical applications. Furthermore, we propose potential resolutions, based on the application of quantitative invasive tools, for several major issues in fibrosis assessment that remain obstacles to the rapid progress of CLD medicine.

  5. A remote quantitative Fugl-Meyer assessment framework for stroke patients based on wearable sensor networks.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-05-01

    To extend the use of wearable sensor networks for stroke patient training and assessment in non-clinical settings, this paper proposes a novel remote quantitative Fugl-Meyer assessment (FMA) framework, in which two accelerometers and seven flex sensors were used to monitor the movement function of the upper limb, wrist and fingers. An extreme learning machine-based ensemble regression model was established to map the sensor data to clinical FMA scores, while the RRelief algorithm was applied to find the optimal feature subset. Because the FMA scale is time-consuming and complicated, seven training exercises were designed to replace the 33 upper-limb-related items in the FMA scale. Twenty-four stroke inpatients participated in the experiments in clinical settings, and 5 of them were involved in the experiments in home settings after they left the hospital. The experimental results in both clinical and home settings showed that the proposed quantitative FMA model can precisely predict FMA scores from wearable sensor data, with a coefficient of determination as high as 0.917. The results also indicated that the proposed framework provides a potential approach to remote quantitative rehabilitation training and evaluation.
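
    The regression step can be illustrated with a minimal single-hidden-layer extreme learning machine: random input weights, a nonlinear hidden layer, and a closed-form ridge solution for the output weights. The feature dimensions, synthetic data and hyperparameters below are placeholders; an ensemble, as described above, would average several such models trained with different random seeds.

        import numpy as np

        def elm_fit(X, y, n_hidden=50, ridge=1e-3, seed=0):
            """Minimal extreme learning machine regressor: random input weights,
            closed-form ridge solution for the output weights."""
            rng = np.random.default_rng(seed)
            W = rng.standard_normal((X.shape[1], n_hidden))
            b = rng.standard_normal(n_hidden)
            H = np.tanh(X @ W + b)
            beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
            return W, b, beta

        def elm_predict(X, W, b, beta):
            return np.tanh(X @ W + b) @ beta

        # Hypothetical demo: map 20 sensor-derived features to an FMA-like score (0-66).
        rng = np.random.default_rng(1)
        X = rng.standard_normal((120, 20))
        y = np.clip(30 + 5 * X[:, :3].sum(axis=1) + rng.normal(0, 2, 120), 0, 66)
        W, b, beta = elm_fit(X[:100], y[:100])
        pred = elm_predict(X[100:], W, b, beta)
        ss_res = np.sum((y[100:] - pred) ** 2)
        ss_tot = np.sum((y[100:] - y[100:].mean()) ** 2)
        print("R^2 on held-out samples:", 1 - ss_res / ss_tot)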

  6. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
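
    A multinomial multi-strain step of this kind can be sketched as follows: the bacterial load in a serving is partitioned among strain types with different probabilities of being enterotoxin-producing, and the toxigenic fraction is then carried forward. The strain prevalences, gene-carriage probabilities and concentration distribution are hypothetical placeholders, not the Lombardy data.

        import numpy as np

        rng = np.random.default_rng(7)
        n_iter = 10_000

        # Hypothetical strain profile: prevalence of three genotype groups in raw milk
        # and the probability that each group carries the enterotoxin A (sea) gene.
        strain_prev = np.array([0.55, 0.30, 0.15])
        p_sea = np.array([0.70, 0.10, 0.02])

        total_cfu = rng.lognormal(np.log(1e3), 1.0, n_iter)   # S. aureus CFU per serving

        toxigenic = np.empty(n_iter)
        for i in range(n_iter):
            by_strain = rng.multinomial(int(total_cfu[i]), strain_prev)  # multinomial partition
            toxigenic[i] = rng.binomial(by_strain, p_sea).sum()          # sea-positive cells

        print("median toxigenic CFU per serving:", np.median(toxigenic))
        print("fraction of servings with any toxigenic cells:", np.mean(toxigenic > 0))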

  7. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials1

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204

  8. Quantitative assessment of olfactory receptors activity in immobilized nanosomes: a novel concept for bioelectronic nose.

    PubMed

    Vidic, Jasmina Minic; Grosclaude, Jeanne; Persuy, Marie-Annick; Aioun, Josiane; Salesse, Roland; Pajot-Augy, Edith

    2006-08-01

    We describe how mammalian olfactory receptors (ORs) could be used as sensing elements of highly specific and sensitive bioelectronic noses. An OR and an appropriate Gα protein were co-expressed in Saccharomyces cerevisiae cells from which membrane nanosomes were prepared, and immobilized on a sensor chip. By Surface Plasmon Resonance, we were able to quantitatively evaluate OR stimulation by an odorant, and G protein activation. We demonstrate that ORs in nanosomes discriminate between odorant ligands and unrelated odorants, as in whole cells. This assay also provides the possibility for quantitative assessment of the coupling efficiency of the OR with different Gα subunits, without the interference of the cellular transduction pathway. Our findings will be useful to develop a new generation of electronic noses for detection and discrimination of volatile compounds, particularly amenable to micro- and nano-sensor formats.

  9. Experimental assessment of bone mineral density using quantitative computed tomography in holstein dairy cows

    PubMed Central

    MAETANI, Ayami; ITOH, Megumi; NISHIHARA, Kahori; AOKI, Takahiro; OHTANI, Masayuki; SHIBANO, Kenichi; KAYANO, Mitsunori; YAMADA, Kazutaka

    2016-01-01

    The aim of this study was to assess the measurement of bone mineral density (BMD) by quantitative computed tomography (QCT), comparing the relationships of BMD between QCT and dual-energy X-ray absorptiometry (DXA) and between QCT and radiographic absorptiometry (RA) in the metacarpal bone of Holstein dairy cows (n=27). A significant positive correlation was found between QCT and DXA measurements (r=0.70, P<0.01), and a significant correlation was found between QCT and RA measurements (r=0.50, P<0.01). We conclude that QCT provides quantitative evaluation of BMD in dairy cows, because BMD measured by QCT showed positive correlations with BMD measured by the two conventional methods: DXA and RA. PMID:27075115

  10. Novel method for quantitative assessment of physical workload of healthcare workers by a tetherless ergonomics workstation.

    PubMed

    Smith, Warren D; Alharbi, Kamal A; Dixon, Jeremy B; Reggad, Hind

    2012-01-01

    Healthcare workers are at risk of physical injury. Our laboratory has developed a tetherless ergonomics workstation that is suitable for studying physicians' and nurses' physical workloads in clinical settings. The workstation uses wearable sensors to record multiple channels of body orientation and muscle activity and wirelessly transmits them to a base station laptop computer for display, storage, and analysis. The ergonomics workstation generates long records of multi-channel data, so it is desired that the workstation automatically process these records and provide graphical and quantitative summaries of the physical workloads experienced by the healthcare workers. This paper describes a novel method of automated quantitative assessment of physical workload, termed joint cumulative amplitude-duration (JCAD) analysis, that has advantages over previous methods and illustrates its use in a comparison of the physical workloads of robotically-assisted surgery versus manual video-endoscopic surgery.

  11. Valuation of ecotoxicological impacts from tributyltin based on a quantitative environmental assessment framework.

    PubMed

    Noring, Maria; Håkansson, Cecilia; Dahlgren, Elin

    2016-02-01

    In the scientific literature, few valuations of biodiversity and ecosystem services following the impacts of toxicity are available, hampered by the lack of ecotoxicological documentation. Here, tributyltin is used to conduct a contingent valuation study as well as cost-benefit analysis (CBA) of measures for improving the environmental status in Swedish coastal waters of the Baltic Sea. Benefits considering different dimensions when assessing environmental status are highlighted and a quantitative environmental assessment framework based on available technology, ecological conditions, and economic valuation methodology is developed. Two scenarios are used in the valuation study: (a) achieving good environmental status by 2020 in accordance with EU legislation (USD 119 household⁻¹ year⁻¹) and (b) achieving visible improvements by 2100 due to natural degradation (USD 108 household⁻¹ year⁻¹) during 8 years. The latter scenario was used to illustrate an application of the assessment framework. The CBA results indicate that both scenarios might generate a welfare improvement.

  12. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations which occurred through the initiation stage of carcinogenesis. For the establishment of points of departure (PoD) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed with low doses. Moreover, treatment with DEN at low doses had no effect on development of GST-P positive foci in the liver. These data on PoDs for the markers contribute to understanding whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to use in low dose-response assessment must be selected on the basis of scientific judgment.

  13. Quantitative microbial risk assessment for Staphylococcus aureus and Staphylococcus enterotoxin A in raw milk.

    PubMed

    Heidinger, Joelle C; Winter, Carl K; Cullor, James S

    2009-08-01

    A quantitative microbial risk assessment was constructed to determine consumer risk from Staphylococcus aureus and staphylococcal enterotoxin in raw milk. A Monte Carlo simulation model was developed to assess the risk from raw milk consumption using data on levels of S. aureus in milk collected by the University of California-Davis Dairy Food Safety Laboratory from 2,336 California dairies from 2005 to 2008 and using U.S. milk consumption data from the National Health and Nutrition Examination Survey of 2003 and 2004. Four modules were constructed to simulate pathogen growth and staphylococcal enterotoxin A production scenarios to quantify consumer risk levels under various time and temperature storage conditions. The three growth modules predicted that S. aureus levels could surpass the 10⁵ CFU/ml level of concern at the 99.9th or 99.99th percentile of servings and therefore may represent a potential consumer risk. Results obtained from the staphylococcal enterotoxin A production module predicted that exposure at the 99.99th percentile could represent a dose capable of eliciting staphylococcal enterotoxin intoxication in all consumer age groups. This study illustrates the utility of quantitative microbial risk assessments for identifying potential food safety issues. PMID:19722395
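
    The growth-module logic can be sketched with a simple Monte Carlo simulation: sample an initial concentration, a storage temperature and a storage time per serving, apply a temperature-dependent growth rate, and read off upper percentiles of the resulting concentration. The distributions and the square-root-type growth parameters below are illustrative assumptions, not the survey data used in the study.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Hypothetical input distributions (stand-ins for the dairy survey and
        # consumption data described above).
        log_c0 = rng.normal(1.0, 1.0, n)           # initial log10 CFU/ml in raw milk
        temp_c = rng.triangular(4, 7, 15, n)       # storage temperature, deg C
        hours = rng.uniform(12, 120, n)            # storage time before consumption

        # Simple square-root (Ratkowsky-type) growth model; parameters are illustrative.
        t_min = 6.0                                # assume no growth below ~6 deg C
        mu = np.where(temp_c > t_min, (0.02 * (temp_c - t_min)) ** 2, 0.0)   # log10 CFU/h
        log_c = np.minimum(log_c0 + mu * hours, 8.5)   # cap at a stationary-phase level

        print("99.9th percentile, log10 CFU/ml:", np.percentile(log_c, 99.9))
        print("fraction of servings above 1e5 CFU/ml:", np.mean(log_c > 5))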

  14. Quantitative risk assessment for human salmonellosis through the consumption of pork sausage in Porto Alegre, Brazil.

    PubMed

    Mürmann, Lisandra; Corbellini, Luis Gustavo; Collor, Alexandre Ávila; Cardoso, Marisa

    2011-04-01

    A quantitative microbiology risk assessment was conducted to evaluate the risk of Salmonella infection to consumers of fresh pork sausages prepared at barbecues in Porto Alegre, Brazil. For the analysis, a prevalence of 24.4% positive pork sausages with a level of contamination between 0.03 and 460 CFU g⁻¹ was assumed. Data related to frequency and habits of consumption were obtained by a questionnaire survey given to 424 people. A second-order Monte Carlo simulation separating the uncertain parameter of cooking time from the variable parameters was run. Of the people interviewed, 87.5% consumed pork sausage, and 85.4% ate it at barbecues. The average risk of salmonellosis per barbecue at a minimum cooking time of 15.6 min (worst-case scenario) was 6.24 × 10⁻⁴, and the risk assessed per month was 1.61 × 10⁻³. Cooking for 19 min would fully inactivate Salmonella in 99.9% of the cases. At this cooking time, the sausage reached a mean internal temperature of 75.7°C. The results of the quantitative microbiology risk assessment revealed that the consumption of fresh pork sausage is safe when cooking time is approximately 19 min, whereas undercooked pork sausage may represent a nonnegligible health risk for consumers.
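
    The core of such a model, prevalence and concentration of the pathogen, thermal inactivation during cooking, and a dose-response step, can be sketched as below. The D-value, serving-size distribution and dose-response parameter are hypothetical placeholders chosen for illustration only; the published model separates uncertainty from variability in a second-order simulation, which this sketch does not do.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        # Variability inputs (illustrative stand-ins for the prevalence and survey data).
        contaminated = rng.random(n) < 0.244                         # 24.4% positive sausages
        log_conc = rng.uniform(np.log10(0.03), np.log10(460), n)     # log10 CFU/g in positives
        serving_g = rng.triangular(50, 100, 200, n)                  # grams eaten per person

        # Thermal inactivation: log10 reductions = cooking time / D-value (D is hypothetical).
        cook_min = rng.triangular(15.6, 19.0, 25.0, n)
        log_reduction = cook_min / 3.0

        dose = np.where(contaminated, 10 ** (log_conc - log_reduction) * serving_g, 0.0)

        # Exponential dose-response model; r is chosen for illustration only.
        p_ill = 1.0 - np.exp(-2.5e-3 * dose)
        print("mean risk of salmonellosis per serving:", p_ill.mean())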

  15. Assessment of liver tumor response to therapy: role of quantitative imaging.

    PubMed

    Gonzalez-Guindalini, Fernanda D; Botelho, Marcos P F; Harmath, Carla B; Sandrasegaran, Kumaresan; Miller, Frank H; Salem, Riad; Yaghmai, Vahid

    2013-10-01

    Quantitative imaging is the analysis of retrieved numeric data from images with the goal of reducing subjective assessment. It is an increasingly important radiologic tool to assess treatment response in oncology patients. Quantification of response to therapy depends on the tumor type and method of treatment. Anatomic imaging biomarkers that quantify liver tumor response to cytotoxic therapy are based on temporal change in the size of the tumors. Anatomic biomarkers have been incorporated into the World Health Organization criteria and the Response Evaluation Criteria in Solid Tumors (RECIST) versions 1.0 and 1.1. However, the development of novel therapies with different mechanisms of action, such as antiangiogenesis or radioembolization, has required new methods for measuring response to therapy. This need has led to development of tumor- or therapy-specific guidelines such as the Modified CT Response Evaluation (Choi) Criteria for gastrointestinal stromal tumors, the European Association for Study of the Liver (EASL) criteria, and modified RECIST for hepatocellular carcinoma, among many others. The authors review the current quantification criteria used in the evaluation of treatment response in liver tumors, summarizing their indications, advantages, and disadvantages, and discuss future directions with newer methods that have the potential for assessment of treatment response. Knowledge of these quantitative methods is important to facilitate pivotal communication between oncologists and radiologists about cancer treatment, with benefit ultimately accruing to the patient. PMID:24108562

  16. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study is to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, their limitations have been noted, and new techniques are still needed. Experiments were performed on five (5) closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. HUDRCT showed a good correlation (y = 0.07245 + 0.09963x, r² = 0.898) with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provided excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, with an Area Under the Curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for the quantitative assessment of myocardial perfusion.

  17. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent present in the lungs. Several measures have been introduced for the quantification of the extent of disease directly from CT data in order to add to the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and the fractal dimension to visual grade, in order to evaluate how well radiologist visual scoring of emphysema on low-dose CT scans can be predicted from quantitative scores and to determine which measures can serve as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole lung scans. In addition, a visual grade of each section was also given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved when using quantitative score to predict visual grade, rising to 73% when mild and moderate cases were treated as a single class.
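
    Two of the compared measures are easy to state directly: the emphysema index is the percentage of lung voxels below a density threshold, and the histogram percentile is the HU value below which a fixed fraction of voxels falls. The sketch below uses the conventional -950 HU cut-off and 15th percentile on synthetic data; these choices are common in the literature but are not necessarily the thresholds used in this study.

        import numpy as np

        def emphysema_measures(lung_hu, threshold=-950, percentile=15):
            """CT densitometry scores over a lung-field mask: emphysema index
            (% voxels below `threshold`), mean lung density, and the HU value
            at the requested histogram percentile."""
            ei = 100.0 * np.mean(lung_hu < threshold)
            mld = float(np.mean(lung_hu))
            perc = float(np.percentile(lung_hu, percentile))
            return ei, mld, perc

        # Synthetic example: mostly normal lung around -850 HU plus an emphysematous tail.
        rng = np.random.default_rng(0)
        lung = np.concatenate([rng.normal(-850, 40, 90_000), rng.normal(-975, 15, 10_000)])
        print(emphysema_measures(lung))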

  18. The economics of drug abuse: a quantitative assessment of drug demand.

    PubMed

    Hursh, Steven R; Galuska, Chad M; Winger, Gail; Woods, James H

    2005-02-01

    Behavioral economic concepts have proven useful for an overall understanding of the regulation of behavior by environmental commodities and complements a pharmacological perspective on drug abuse in several ways. First, a quantitative assessment of drug demand, equated in terms of drug potency, allows meaningful comparisons to be made among drug reinforcers within and across pharmacological classes. Second, behavioral economics provides a conceptual framework for understanding key factors, both pharmacological and environmental, that contribute to reductions in consumption of illicit drugs. Finally, behavioral economics provides a basis for generalization from laboratory and clinical studies to the development of novel behavioral and pharmacological therapies.

  19. Quantitative Assessment of the Effects of Oxidants on Antigen-Antibody Binding In Vitro

    PubMed Central

    Han, Shuang; Wang, Guanyu; Xu, Naijin; Liu, Hui

    2016-01-01

    Objective. We quantitatively assessed the influence of oxidants on antigen-antibody-binding activity. Methods. We used several immunological detection methods, including precipitation reactions, agglutination reactions, and enzyme immunoassays, to determine antibody activity. The oxidation-reduction potential was measured in order to determine total serum antioxidant capacity. Results. Certain concentrations of oxidants resulted in significant inhibition of antibody activity but had little influence on total serum antioxidant capacity. Conclusions. Oxidants had a significant influence on interactions between antigen and antibody, but minimal effect on the peptide of the antibody molecule. PMID:27313823

  20. Quantitative assessment of synovitis in Legg-Calvé-Perthes disease using gadolinium-enhanced MRI.

    PubMed

    Neal, David C; O'Brien, Jack C; Burgess, Jamie; Jo, Chanhee; Kim, Harry K W

    2015-03-01

    A quantitative method to assess hip synovitis in Legg-Calvé-Perthes disease (LCPD) is not currently available. To develop this method, the areas of synovial enhancement on gadolinium-enhanced MRI (Gd-MRI) were measured by two independent observers. The volume of synovial enhancement was significantly increased in the initial and the fragmentation stages of LCPD (Waldenström stages I and II), with a persistence of synovitis into the reossification stage (stage III). The Gd-MRI method had high interobserver and intraobserver agreements and may serve as a useful method to monitor the effect of various treatments on hip synovitis in LCPD. PMID:25305048

  1. Semi-quantitative exposure assessment of occupational exposure to wood dust and nasopharyngeal cancer risk.

    PubMed

    Ekpanyaskul, Chatchai; Sangrajrang, Suleeporn; Ekburanawat, Wiwat; Brennan, Paul; Mannetje, Andrea; Thetkathuek, Anamai; Saejiw, Nutjaree; Ruangsuwan, Tassanu; Boffetta, Paolo

    2015-01-01

    Occupational exposure to wood dust is one cause of nasopharyngeal cancer (NPC); however, assessing this exposure remains problematic. Therefore, the objective of this study was to develop a semi-quantitative exposure assessment method and then utilize it to evaluate the association between occupational exposure to wood dust and the development of NPC. In addition, variations in risk by histology were examined. A case-control study was conducted with 327 newly diagnosed cases of NPC at the National Cancer Institute and regional cancer centers in Thailand with 1:1 controls matched for age, gender and geographical residence. Occupational information was obtained through personal interviews. The potential probability, frequency and intensity of exposure to wood dust were assessed on a job-by-job basis by experienced experts. Analysis was performed by conditional logistic regression and presented as odds ratio (OR) estimates and 95% confidence intervals (CI). Overall, a non-significant relationship between occupational wood dust exposure and NPC risk for all subjects was observed (OR=1.61, 95%CI 0.99-2.59); however, the risk became significant when analyses focused on types 2 and 3 of NPC (OR=1.62, 95%CI 1.03-2.74). The significant association was stronger for those exposed to wood dust for >10 years (OR=2.26, 95%CI 1.10-4.63), for those with first-time exposure at age >25 years (OR=2.07, 95%CI 1.08-3.94), and for those who had a high cumulative exposure (OR=2.17, 95%CI 1.03-4.58) when compared with those considered unexposed. In conclusion, wood dust is likely to be associated with an increased risk of type 2 or 3 NPC in the Thai population. The results of this study show that semi-quantitative exposure assessment is suitable for occupational exposure assessment in a case-control study and complements the information from self-reporting.

  2. Quantitative assessment of intrinsic groundwater vulnerability to contamination using numerical simulations.

    PubMed

    Neukum, Christoph; Azzam, Rafig

    2009-12-20

    Assessment of intrinsic vulnerability to groundwater contamination is part of groundwater management in many areas of the world. However, popular assessment methods estimate vulnerability only qualitatively. To enhance vulnerability assessment, an approach for quantitative vulnerability assessment using numerical simulation of water flow and solute transport with transient boundary conditions, together with new vulnerability indicators, is presented in this work. Based on a conceptual model of the unsaturated subsurface with distinct hydrogeological layers and site-specific hydrological characteristics, the numerical simulations of water flow and solute transport are applied to each hydrogeological layer separately under standardized conditions. Analysis of the simulation results reveals functional relationships between layer thickness, groundwater recharge and transit time. Based on the first, second and third quartiles of solute mass breakthrough at the lower boundary of the unsaturated zone, and the solute dilution, four vulnerability indicators are extracted. The indicator transit time t(50) is the time at which 50% of the solute mass breakthrough passes the groundwater table. Dilution is expressed as the maximum solute concentration C(max) in the percolation water entering the groundwater table relative to the injected mass or solute concentration C(0) at the ground surface. Duration of solute breakthrough is defined as the time period between 25% and 75% (t(25%)-t(75%)) of total solute mass breakthrough at the groundwater table. The temporal shape of the breakthrough curve is expressed with the quotient (t(25%)-t(50%))/(t(25%)-t(75%)). Results from an application of this new quantitative vulnerability assessment approach, its advantages and disadvantages, and potential benefits for future groundwater management strategies are discussed.
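
    Given a simulated breakthrough curve at the water table, the four indicators defined above reduce to a few lines of post-processing. The synthetic curve and the way dilution is normalised below are placeholders for the actual simulation output.

        import numpy as np

        def vulnerability_indicators(t, c, injected=1.0):
            """Indicators from a solute breakthrough curve at the groundwater table:
            t50, a dilution ratio (Cmax relative to the injected quantity), the
            duration t25-t75, and the shape quotient (t25-t50)/(t25-t75)."""
            cum = np.cumsum(c * np.gradient(t))           # cumulative mass breakthrough
            frac = cum / cum[-1]
            t25, t50, t75 = np.interp([0.25, 0.50, 0.75], frac, t)
            dilution = c.max() / injected
            return t50, dilution, t75 - t25, (t25 - t50) / (t25 - t75)

        t = np.linspace(0.0, 400.0, 2000)                                    # days
        c = np.exp(-0.5 * ((np.log(t + 1e-9) - np.log(120.0)) / 0.4) ** 2)   # synthetic pulse
        print(vulnerability_indicators(t, c))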

  3. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography

    PubMed Central

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-01-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters including pore size, pore shape, strut size, surface area, porosity, and interconnectivity were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. We conclude that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597

  4. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is used more frequently in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast axial DCE-MRI images (i.e., non-contrast) using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images taken for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ-value of 0.66 (p < 0.0001)]. Within precision medicine, our method may be useful for monitoring high-risk populations.
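
    Steps (b) and (c) of this pipeline are straightforward once a breast mask is available; the sketch below applies Otsu's method to intensities inside an (assumed given) mask and reports the dense-voxel fraction. The synthetic intensities and the assumption that fibroglandular tissue is darker than fat on these images are illustrative, and the fuzzy c-means masking step is not shown.

        import numpy as np
        from skimage.filters import threshold_otsu

        def volumetric_breast_density(breast_voxels):
            """Percent dense volume from pre-contrast MRI intensities within a breast mask.
            Otsu's threshold separates fatty from dense tissue; dense voxels are assumed
            to be the darker class (an assumption for this illustration)."""
            thr = threshold_otsu(breast_voxels)
            dense = breast_voxels < thr
            return 100.0 * dense.sum() / breast_voxels.size

        rng = np.random.default_rng(2)
        voxels = np.concatenate([rng.normal(800, 60, 70_000),    # fat-like intensities
                                 rng.normal(350, 50, 30_000)])   # dense-tissue-like intensities
        print(f"{volumetric_breast_density(voxels):.1f} % dense")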

  5. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    PubMed Central

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J

    2015-01-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with that obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle. PMID:18612176
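
    The model-fitting step can be illustrated with the Voigt (Kelvin-Voigt) dispersion relation commonly used for this purpose, fitted to frequency-dependent shear wave speeds by nonlinear least squares. The measured speeds, starting values and assumed tissue density below are placeholders, not data from the study.

        import numpy as np
        from scipy.optimize import curve_fit

        RHO = 1000.0   # assumed tissue density, kg/m^3

        def voigt_speed(freq_hz, mu, eta):
            """Shear-wave phase velocity of a Voigt material:
            c(w) = sqrt(2*(mu^2 + w^2*eta^2) / (rho*(mu + sqrt(mu^2 + w^2*eta^2))))."""
            w = 2.0 * np.pi * freq_hz
            s = np.sqrt(mu ** 2 + (w * eta) ** 2)
            return np.sqrt(2.0 * s ** 2 / (RHO * (mu + s)))

        # Hypothetical dispersive shear-wave-speed estimates (m/s) at the vibration frequencies.
        freqs = np.array([100.0, 150.0, 200.0, 250.0, 300.0])
        speeds = np.array([2.9, 3.1, 3.3, 3.5, 3.6])

        (mu, eta), _ = curve_fit(voigt_speed, freqs, speeds, p0=(8e3, 2.0))
        print(f"shear modulus ~ {mu / 1e3:.1f} kPa, viscosity ~ {eta:.1f} Pa*s")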

  6. The Quantitative Ideas and Methods in Assessment of Four Properties of Chinese Medicinal Herbs.

    PubMed

    Fu, Jialei; Pang, Jingxiang; Zhao, Xiaolei; Han, Jinxiang

    2015-04-01

    The purpose of this review is to summarize and reflect on the current status and problems of the research on the properties of Chinese medicinal herbs. Hot, warm, cold, and cool are the four properties/natures of Chinese medicinal herbs. They are defined based on the interaction between the herbs and the human body. How to quantitatively assess the therapeutic effect of Chinese medicinal herbs based on the theoretical system of Traditional Chinese medicine (TCM) remains a challenge. Previous studies on the topic from several perspectives have been presented. Results and problems were discussed. New ideas based on the technology of biophoton radiation detection are proposed. With the development of biophoton detection technology, detection and characterization of human biophoton emission has led to its potential applications in TCM. The possibility of using a biophoton analysis system to study the interaction of Chinese medicinal herbs with the human body and to quantitatively determine the effect of Chinese medicinal herbs is entirely consistent with the holistic concept of TCM theory. The statistical entropy of electromagnetic radiation from biological systems can characterize the four properties of Chinese medicinal herbs, and the spectrum can characterize their meridian tropism. Therefore, we hypothesize that by the use of a biophoton analysis system, the four properties and meridian tropism of Chinese medicinal herbs can be quantitatively expressed.

  7. Monitoring and quantitative assessment of tumor burden using in vivo bioluminescence imaging

    NASA Astrophysics Data System (ADS)

    Chen, Chia-Chi; Hwang, Jeng-Jong; Ting, Gann; Tseng, Yun-Long; Wang, Shyh-Jen; Whang-Peng, Jaqueline

    2007-02-01

    In vivo bioluminescence imaging (BLI) is a sensitive imaging modality that is rapid and accessible, and may comprise an ideal tool for evaluating tumor growth. In this study, the kinetics of tumor growth were assessed in a C26 colon carcinoma-bearing BALB/c mouse model, evaluating the ability of BLI to noninvasively quantitate the growth of subcutaneous tumors transplanted with C26 cells genetically engineered to stably express firefly luciferase and herpes simplex virus type-1 thymidine kinase (C26/tk-luc). A good correlation (R² = 0.998) of photon emission to cell number was found in vitro. Tumor burden and tumor volume were monitored in vivo over time by quantitation of photon emission using a Xenogen IVIS 50 system and standard external caliper measurement, respectively. At various time intervals, tumor-bearing mice were imaged to determine the correlation of in vivo BLI to tumor volume. A correlation of BLI to tumor volume was observed when tumor volume was smaller than 1000 mm³ (R² = 0.907). γ-Scintigraphy combined with [131I]FIAU was another imaging modality used to verify the previous results. In conclusion, this study showed that bioluminescence imaging is a powerful and quantitative tool for directly monitoring tumor growth in vivo. The dual-reporter-gene-transfected tumor-bearing animal model can be applied in evaluating the efficacy of newly developed anti-cancer drugs.

  8. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by considering (1) the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to the skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for some diaper ingredient chemicals for which the establishment of acceptable and safe exposure levels was demonstrated.
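
    The margin-of-safety arithmetic at the center of such a QRA is simple to show; the exposure build-up and toxicological benchmark below are entirely hypothetical numbers used only to illustrate how the pieces combine.

        def skin_exposure(amount_in_layer_mg, transfer_fraction, uses_per_day, body_weight_kg):
            """Illustrative exposure estimate for one diaper component: the amount present
            in a layer, times the fraction assumed to migrate to skin per use, times daily
            uses, normalised by body weight (mg/kg bw/day)."""
            return amount_in_layer_mg * transfer_fraction * uses_per_day / body_weight_kg

        def margin_of_safety(noael_mg_kg_day, exposure_mg_kg_day):
            """MOS = toxicological benchmark / estimated exposure."""
            return noael_mg_kg_day / exposure_mg_kg_day

        exposure = skin_exposure(amount_in_layer_mg=0.5, transfer_fraction=0.01,
                                 uses_per_day=6, body_weight_kg=8.0)
        print("MOS:", margin_of_safety(noael_mg_kg_day=50.0, exposure_mg_kg_day=exposure))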

  9. Quantitative, fluorescence-based in-situ assessment of protein expression.

    PubMed

    Moeder, Christopher B; Giltnane, Jennifer M; Moulis, Sharon Pozner; Rimm, David L

    2009-01-01

    As companion diagnostics grow in prevalence and importance, the need for accurate assessment of in situ protein concentrations has increased. Traditional immunohistochemistry (IHC), while valuable for assessment of context of expression, is less valuable for quantification. The lack of rigorous quantitative potential of traditional IHC led to our development of an immunofluorescence-based method now commercialized as the AQUA technology. Immunostaining of tissue samples, image acquisition, and use of AQUA software allow investigators to quickly, efficiently, and accurately measure levels of expression within user-defined subcellular or architectural compartments. IHC analyzed by AQUA shows high reproducibility and demonstrates protein measurement accuracy similar to ELISA assays. The process is largely automated, eliminating potential error, and the resultant scores are exported on a continuous scale. There are now numerous published examples where observations made with this technology are not seen by traditional methods.

  10. Benchmark dose profiles for joint-action continuous data in quantitative risk assessment.

    PubMed

    Deutsch, Roland C; Piegorsch, Walter W

    2013-09-01

    Benchmark analysis is a widely used tool in biomedical and environmental risk assessment. Therein, estimation of minimum exposure levels, called benchmark doses (BMDs), that induce a prespecified benchmark response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This paper demonstrates how the benchmark modeling paradigm can be expanded from the single-agent setting to joint-action, two-agent studies. Focus is on continuous response outcomes. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile, a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR, is defined for use in quantitative risk characterization and assessment.
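
    For a single agent with a continuous endpoint, the benchmark calculation amounts to fitting a dose-response model and solving for the dose at which the mean response departs from control by the chosen BMR. The sketch below uses a linear mean-response model, a BMR of one control standard deviation, and made-up data; the benchmark profile described above generalises this to the curve of two-agent dose pairs that jointly reach the BMR.

        import numpy as np
        from scipy.optimize import curve_fit, brentq

        # Hypothetical single-agent continuous dose-response data (mean response per dose).
        dose = np.array([0.0, 1.0, 3.0, 10.0, 30.0])
        resp = np.array([10.0, 10.4, 11.2, 13.0, 16.5])
        sd_control = 0.8   # illustrative control standard deviation

        def model(d, a, b):        # simple linear mean-response model
            return a + b * d

        (a, b), _ = curve_fit(model, dose, resp)

        # BMD: dose at which the mean shifts from control by BMR = 1 control SD.
        bmr = 1.0 * sd_control
        bmd = brentq(lambda d: model(d, a, b) - (a + bmr), 0.0, dose.max())
        print("BMD estimate:", bmd)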

  11. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    In-flight loss of control is the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify the quantitative standards for assessing upset recovery performance. This review contains current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, as well as whether that input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum airspeed, maximum bank angle and maximum g loading, are reviewed.

  12. Quantitative photoacoustic assessment of ex-vivo lymph nodes of colorectal cancer patients

    NASA Astrophysics Data System (ADS)

    Sampathkumar, Ashwin; Mamou, Jonathan; Saegusa-Beercroft, Emi; Chitnis, Parag V.; Machi, Junji; Feleppa, Ernest J.

    2015-03-01

    Staging of cancers and selection of appropriate treatment requires histological examination of multiple dissected lymph nodes (LNs) per patient, so that a staggering number of nodes require histopathological examination, and the finite resources of pathology facilities create a severe processing bottleneck. Histologically examining the entire 3D volume of every dissected node is not feasible, and therefore, only the central region of each node is examined histologically, which results in severe sampling limitations. In this work, we assess the feasibility of using quantitative photoacoustics (QPA) to overcome the limitations imposed by current procedures and eliminate the resulting undersampling in node assessments. QPA is emerging as a new hybrid modality that assesses tissue properties and classifies tissue type based on multiple estimates derived from spectrum analysis of photoacoustic (PA) radiofrequency (RF) data and from statistical analysis of envelope-signal data derived from the RF signals. Our study seeks to use QPA to distinguish cancerous from non-cancerous regions of dissected LNs and hence serve as a reliable means of imaging and detecting small but clinically significant cancerous foci that would be missed by current methods. Dissected lymph nodes were placed in a water bath and PA signals were generated using a wavelength-tunable (680-950 nm) laser. A 26-MHz, f/2 transducer was used to sense the PA signals. We present an overview of our experimental setup; provide a statistical analysis of multi-wavelength classification parameters (mid-band fit, slope, intercept) obtained from the PA signal spectrum generated in the LNs; and compare QPA performance with our established quantitative ultrasound (QUS) techniques in distinguishing metastatic from non-cancerous tissue in dissected LNs. QPA-QUS methods offer a novel general means of tissue typing and evaluation in a broad range of disease-assessment applications, e.g., cardiac, intravascular

  13. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures, and at the time of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log cfu/g), and a Pert distribution showed that the mean temperature at markets was 6.63°C. An exponential dose-response model [P = 1 − exp(−7.64 × 10⁻⁸ × N), where N = dose] was deemed appropriate for hazard characterization. Mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean values for the probability of illness per person per day were higher in processed cheese (mean: 2.24 × 10⁻⁹; maximum: 7.97 × 10⁻⁶) than in natural cheese (mean: 7.84 × 10⁻¹⁰; maximum: 2.32 × 10⁻⁶). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese.
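
    The risk-characterization step quoted above can be sketched directly: ingested doses drawn from an assumed distribution are pushed through the exponential dose-response model P = 1 − exp(−r × N). The dose distribution below is a hypothetical placeholder; only the parameter r = 7.64 × 10⁻⁸ comes from the abstract.

    ```python
    # Monte Carlo sketch of the risk-characterization step: combine a distribution of
    # ingested doses with the exponential dose-response model P = 1 - exp(-r * N).
    import numpy as np

    rng = np.random.default_rng(0)
    r = 7.64e-8                                                  # dose-response parameter from the abstract
    log10_dose = rng.normal(loc=1.0, scale=1.0, size=100_000)    # hypothetical cfu per serving
    dose = 10.0 ** log10_dose

    p_illness = 1.0 - np.exp(-r * dose)                          # probability of illness per exposure
    print(f"mean risk per serving: {p_illness.mean():.2e}")
    print(f"95th percentile risk:  {np.percentile(p_illness, 95):.2e}")
    ```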

  14. Quantitative assessments of burn degree by high-frequency ultrasonic backscattering and statistical model

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Hsun; Huang, Chih-Chung; Wang, Shyh-Hau

    2011-02-01

    An accurate and quantitative modality to assess the burn degree is crucial for determining further treatments to be properly applied to burn injury patients. Ultrasound at frequencies higher than 20 MHz has been applied to dermatological diagnosis due to its high resolution and noninvasive capability. Yet a reliable means of sensitively and quantitatively correlating burn degree with ultrasonic measurements is still lacking. Thus, a 50 MHz ultrasound system was developed and implemented to measure ultrasonic signals backscattered from burned skin tissues. Various burn degrees were achieved by placing a 100 °C brass plate onto the dorsal skins of anesthetized rats for durations ranging from 5 to 20 s. The burn degrees were correlated with ultrasonic parameters, including integrated backscatter (IB) and the Nakagami parameter (m), calculated from ultrasonic signals acquired from the burned tissues over a 5 × 1.4 mm (width × depth) area. Results demonstrated that both IB and m decreased exponentially with increasing burn degree. Specifically, an IB of -79.0 ± 2.4 (mean ± standard deviation) dB for normal skin tissues tended to decrease to -94.0 ± 1.3 dB for those burned for 20 s, while the corresponding Nakagami parameters tended to decrease from 0.76 ± 0.08 to 0.45 ± 0.04. The variation of both IB and m was partially associated with changes in the properties of collagen fibers in the burned tissues, as verified by histological tissue sections. In particular, the m parameter may be more sensitive for differentiating burned skin because it has a greater rate of change with respect to different burn durations. These ultrasonic parameters, in conjunction with high-frequency B-mode and Nakagami images, could have the potential to assess the burn degree quantitatively.
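
    A minimal sketch of the two ultrasonic parameters named above: integrated backscatter computed from the power spectrum relative to a reference, and the Nakagami shape parameter m estimated from the echo envelope by the standard method of moments. The synthetic RF segment, reference signal and analysis band are placeholders, and the envelope is only a crude stand-in for a Hilbert-transform envelope.

    ```python
    # Sketch of integrated backscatter (IB) over an analysis band and the Nakagami m
    # parameter estimated from the echo envelope by the method of moments.
    import numpy as np

    def integrated_backscatter(rf_segment, rf_reference, fs, band=(30e6, 70e6)):
        """Mean backscattered power (dB) relative to a reference, within a band."""
        freqs = np.fft.rfftfreq(len(rf_segment), d=1.0 / fs)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        p_sig = np.abs(np.fft.rfft(rf_segment)) ** 2
        p_ref = np.abs(np.fft.rfft(rf_reference)) ** 2
        return 10.0 * np.mean(np.log10(p_sig[in_band] / p_ref[in_band]))

    def nakagami_m(envelope):
        """Method-of-moments estimate: m = E[R^2]^2 / Var(R^2)."""
        r2 = envelope ** 2
        return np.mean(r2) ** 2 / np.var(r2)

    rng = np.random.default_rng(1)
    fs = 250e6                                   # sampling rate, placeholder
    rf = rng.normal(size=2048)                   # placeholder backscattered RF segment
    ref = rng.normal(size=2048)                  # placeholder reference (e.g. flat reflector)
    env = np.abs(rf) + 1e-9                      # crude envelope stand-in
    print(f"IB ~ {integrated_backscatter(rf, ref, fs):.1f} dB, m ~ {nakagami_m(env):.2f}")
    ```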

  15. Changes in transmural distribution of myocardial perfusion assessed by quantitative intravenous myocardial contrast echocardiography in humans

    PubMed Central

    Fukuda, S; Muro, T; Hozumi, T; Watanabe, H; Shimada, K; Yoshiyama, M; Takeuchi, K; Yoshikawa, J

    2002-01-01

    Objective: To clarify whether changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous myocardial contrast echocardiography (MCE) in humans. Methods: 31 patients underwent dipyridamole stress MCE and quantitative coronary angiography. Intravenous MCE was performed by continuous infusion of Levovist. Images were obtained from the apical four chamber view with alternating pulsing intervals both at rest and after dipyridamole infusion. Images were analysed offline by placing regions of interest over both endocardial and epicardial sides of the mid-septum. The background subtracted intensity versus pulsing interval plots were fitted to an exponential function, y = A(1 − e^(−βt)), where A is plateau level and β is rate of rise. Results: Of the 31 patients, 16 had significant stenosis (> 70%) in the left anterior descending artery (group A) and 15 did not (group B). At rest, there were no differences in the A endocardial to epicardial ratio (A-EER) and β-EER between the two groups (mean (SD) 1.2 (0.6) v 1.2 (0.8) and 1.2 (0.7) v 1.1 (0.6), respectively, NS). During hyperaemia, β-EER in group A was significantly lower than that in group B (1.0 (0.5) v 1.4 (0.5), p < 0.05) and A-EER did not differ between the two groups (1.0 (0.5) v 1.2 (0.4), NS). Conclusions: Changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous MCE in humans. PMID:12231594
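
    The replenishment-curve fitting step lends itself to a short sketch: fit y = A(1 − e^(−βt)) to background-subtracted intensity versus pulsing interval data for the endocardial and epicardial regions and form the A-EER and β-EER ratios. The data points below are synthetic placeholders.

    ```python
    # Fit the replenishment curve y = A * (1 - exp(-beta * t)) for two regions and
    # form the endocardial/epicardial ratios. Data points are synthetic placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    def replenishment(t, A, beta):
        return A * (1.0 - np.exp(-beta * t))

    def fit_region(t, y):
        (A, beta), _ = curve_fit(replenishment, t, y, p0=(y.max(), 1.0))
        return A, beta

    t = np.array([0.25, 0.5, 1.0, 2.0, 4.0, 8.0])          # pulsing intervals (s), placeholder
    endo = np.array([4.1, 7.3, 11.8, 15.6, 17.2, 17.9])    # background-subtracted intensity
    epi = np.array([3.0, 5.5, 9.0, 12.5, 14.1, 14.6])

    A_endo, b_endo = fit_region(t, endo)
    A_epi, b_epi = fit_region(t, epi)
    print(f"A-EER = {A_endo / A_epi:.2f}, beta-EER = {b_endo / b_epi:.2f}")
    ```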

  16. Quantitative Assessment of Amino Acid Damage upon keV Ion Beam Irradiation Through FTIR Spectroscopy

    NASA Astrophysics Data System (ADS)

    Huang, Qing; Ke, Zhigang; Su, Xi; Yuan, Hang; Zhang, Shuqing; Yu, Zengliang

    2010-06-01

    Ion beam irradiation induces important biological effects, and it is a long-standing task to acquire both qualitative and quantitative assessments of these effects. One effective approach is to utilize Fourier transform infrared (FTIR) spectroscopy, because it offers sensitive and non-invasive measurements. In this paper a novel protocol was employed to prepare biomolecular samples in the form of thin and transversely uniform solid films suitable for both infrared and low-energy ion beam irradiation experiments. Under irradiation with N⁺ and Ar⁺ ion beams at 25 keV, with fluences ranging from 5×10¹⁵ ions/cm² to 2.5×10 ions/cm², the ion radio-sensitivity of four amino acids, namely glycine, tyrosine, methionine and phenylalanine, was evaluated and compared. The ion beam irradiation caused biomolecular decomposition accompanied by molecular desorption of volatile species, and the damage was dependent on ion type, fluence, energy and the type of amino acid. The effectiveness of applying FTIR spectroscopy to the quantitative assessment of the dose-dependent biomolecular damage induced by low-energy ion radiation was thus demonstrated.

  17. Using Non-Invasive Multi-Spectral Imaging to Quantitatively Assess Tissue Vasculature

    SciTech Connect

    Vogel, A; Chernomordik, V; Riley, J; Hassan, M; Amyot, F; Dasgeb, B; Demos, S G; Pursley, R; Little, R; Yarchoan, R; Tao, Y; Gandjbakhche, A H

    2007-10-04

    This research describes a non-invasive, non-contact method used to quantitatively analyze the functional characteristics of tissue. Multi-spectral images collected at several near-infrared wavelengths are input into a mathematical optical skin model that considers the contributions from different analytes in the epidermis and dermis skin layers. Through a reconstruction algorithm, we can quantify the percent of blood in a given area of tissue and the fraction of that blood that is oxygenated. Imaging normal tissue confirms previously reported values for the percent of blood in tissue and the percent of blood that is oxygenated in tissue and surrounding vasculature, for the normal state and when ischemia is induced. This methodology has been applied to assess vascular Kaposi's sarcoma lesions and the surrounding tissue before and during experimental therapies. The multi-spectral imaging technique has been combined with laser Doppler imaging to gain additional information. Results indicate that these techniques are able to provide quantitative and functional information about tissue changes during experimental drug therapy and investigate progression of disease before changes are visibly apparent, suggesting a potential for them to be used as complementary imaging techniques to clinical assessment.

  18. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    NASA Astrophysics Data System (ADS)

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-03-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a parameter of a second derivative ratio (SDR) is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 with high temporal resolution and with reduced measurement artifacts induced by different skin conditions in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution.
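
    The abstract does not give the exact SDR formula, so the sketch below only shows the generic mechanics of a second-derivative spectral ratio built from absorbance values at the six wavelengths listed above; the choice of derivative triplets and the absorbance values are illustrative assumptions.

    ```python
    # Generic mechanics of a second-derivative spectral ratio at the six wavelengths
    # quoted in the abstract. The exact SDR definition and its mapping to StO2 are
    # specified in the paper; the triplets and values below are illustrative.
    import numpy as np

    WAVELENGTHS = [544, 552, 568, 576, 592, 600]   # nm, as quoted in the abstract

    def second_derivative(a_minus, a_center, a_plus):
        """Finite-difference second derivative over an evenly indexed triplet."""
        return a_minus - 2.0 * a_center + a_plus

    def sdr(absorbance):
        """Ratio of second derivatives formed from two wavelength triplets."""
        a = dict(zip(WAVELENGTHS, absorbance))
        d2_a = second_derivative(a[544], a[552], a[568])
        d2_b = second_derivative(a[576], a[592], a[600])
        return d2_a / d2_b

    measured = [0.82, 0.74, 0.69, 0.71, 0.55, 0.50]   # placeholder absorbance values
    print(f"SDR = {sdr(measured):.3f}")
    ```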

  19. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation/assessment is nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence Model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to Environmental Research Centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article. PMID:23892022

  1. Quantitative assessment on soil enzyme activities of heavy metal contaminated soils with various soil properties.

    PubMed

    Xian, Yu; Wang, Meie; Chen, Weiping

    2015-11-01

    Soil enzyme activities are greatly influenced by soil properties and could be significant indicators of heavy metal toxicity in soil for bioavailability assessment. Two groups of experiments were conducted to determine the joint effects of heavy metals and soil properties on soil enzyme activities. Results showed that arylsulfatase was the most sensitive soil enzyme and could be used as an indicator to study the enzymatic toxicity of heavy metals under various soil properties. Soil organic matter (SOM) was the dominant factor affecting the activity of arylsulfatase in soil. A quantitative model was derived to predict the changes of arylsulfatase activity with SOM content. When the soil organic matter content was less than the critical point A (1.05% in our study), the arylsulfatase activity dropped rapidly. When the soil organic matter content was greater than the critical point A, the arylsulfatase activity gradually rose to higher levels, showing that soil microbial activities were enhanced rather than harmed. The SOM content needs to be above the critical point B (2.42% in our study) to protect the soil microbial community from harm under severe Pb pollution (500 mg kg⁻¹ in our study). The quantitative model revealed the pattern of variation of enzymatic toxicity due to heavy metals under various SOM contents. The applicability of the model across a wider range of soil properties needs to be tested. The model, however, may provide a methodological basis for ecological risk assessment of heavy metals in soil.

  2. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    PubMed Central

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-01-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a parameter of a second derivative ratio (SDR) is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 with high temporal resolution and with reduced measurement artifacts induced by different skin conditions in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution. PMID:25734405

  3. Quantitative safety assessment of computer based I and C systems via modular Markov analysis

    SciTech Connect

    Elks, C. R.; Yu, Y.; Johnson, B. W.

    2006-07-01

    This paper gives a brief overview of a methodology based on quantitative metrics for evaluating digital I and C systems that has been under development at the Univ. of Virginia for a number of years. Our quantitative assessment methodology is based on three well understood and extensively practiced disciplines in the dependability assessment field: (1) system-level fault modeling and fault injection, (2) safety- and coverage-based dependability modeling methods, and (3) statistical estimation of model parameters used for safety prediction. This paper makes two contributions. The first relates to incorporating design-flaw information into homogeneous Markov models when such data are available. The second is to introduce a Markov modeling method for managing the modeling complexities of large distributed I and C systems for the prediction of safety and reliability. The method is called Modular Markov Chain analysis. This method allows Markov models of the system to be composed in a modular manner. In doing so, it addresses two important issues: (1) the models are more visually representative of the functional structure of the system, and (2) important failure dependencies that naturally occur in complex systems are modeled accurately with our approach. (authors)
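
    In the spirit of the Markov safety models described above, here is a minimal continuous-time sketch: a single channel with failure rate lam and a coverage factor c that routes detected failures to a safe state, solved with a matrix exponential. The rates, coverage and mission times are illustrative, not values from the paper.

    ```python
    # Minimal continuous-time Markov sketch of a safety model: one channel with
    # failure rate lam, coverage c, and absorbing "safe shutdown" / "unsafe failure" states.
    import numpy as np
    from scipy.linalg import expm

    lam = 1e-4     # failures per hour (placeholder)
    c = 0.99       # coverage: fraction of failures detected and driven to a safe state

    # States: 0 = operational, 1 = safe shutdown (absorbing), 2 = unsafe failure (absorbing)
    Q = np.array([
        [-lam, c * lam, (1 - c) * lam],
        [0.0,  0.0,     0.0],
        [0.0,  0.0,     0.0],
    ])

    p0 = np.array([1.0, 0.0, 0.0])
    for t in (1e3, 1e4, 8.76e4):                      # mission times in hours
        p_t = p0 @ expm(Q * t)
        print(f"t = {t:8.0f} h: P(unsafe failure) = {p_t[2]:.3e}")
    ```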

  4. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the semi-quantitative assessment of pulmonary perfusion from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The proposed automatic analysis approach is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function to assess the local hemodynamic parameters, i.e., mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution is implemented here with a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, to be considered at least as a preliminary step in the diagnosis, given the wide availability of the technique and its non-invasive nature.
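
    The deconvolution step can be sketched with a small truncated-SVD example: build the convolution matrix from the arterial input function, invert it with small singular values suppressed, and read pulmonary blood flow, blood volume and mean transit time off the recovered residue function. The curves and truncation threshold are synthetic placeholders.

    ```python
    # Truncated-SVD deconvolution sketch: recover the flow-scaled residue function from a
    # tissue curve and the arterial input function (AIF), then derive PBF, PBV and MTT.
    import numpy as np

    def tsvd_deconvolve(aif, tissue, dt, rel_threshold=0.2):
        n = len(aif)
        # lower-triangular convolution matrix built from the AIF
        A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                           for i in range(n)])
        U, s, Vt = np.linalg.svd(A)
        s_inv = np.where(s > rel_threshold * s[0], 1.0 / s, 0.0)   # drop small singular values
        return Vt.T @ (s_inv * (U.T @ tissue))                     # flow-scaled residue function

    dt = 1.0                                              # seconds between dynamic frames
    t = np.arange(30) * dt
    aif = np.exp(-0.5 * ((t - 6.0) / 2.0) ** 2)           # placeholder AIF
    residue_true = np.exp(-t / 5.0)                       # placeholder residue, PBF = 0.6
    tissue = 0.6 * dt * np.convolve(aif, residue_true)[:len(t)]

    k = tsvd_deconvolve(aif, tissue, dt)
    pbf = k.max()                                         # flow = peak of residue function
    pbv = np.sum(k) * dt                                  # blood volume = area under k
    print(f"PBF ~ {pbf:.2f}, PBV ~ {pbv:.2f}, MTT ~ {pbv / pbf:.2f} s")
    ```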

  5. Berloque dermatitis mimicking child abuse.

    PubMed

    Gruson, Lisa Moed; Chang, Mary Wu

    2002-11-01

    Berloque dermatitis is a type of photocontact dermatitis. It occurs after perfumed products containing bergamot (or a psoralen) are applied to the skin followed by exposure to sunlight. Striking linear patterns of hyperpigmentation are characteristic, corresponding to local application of the scented product. In the acute phase, erythema and even blistering can be seen. We report a case of berloque dermatitis in a 9-year-old girl that was initially reported as child abuse. To our knowledge, this is the first report of berloque dermatitis mimicking child abuse. Questioning to elicit a history of perfume application coupled with sunlight exposure should help to prevent this misdiagnosis in children.

  6. A qualitative and quantitative needs assessment of pain management for hospitalized orthopedic patients.

    PubMed

    Cordts, Grace A; Grant, Marian S; Brandt, Lynsey E; Mears, Simon C

    2011-08-08

    Despite advances in pain management, little formal teaching is given to practitioners and nurses in its use for postoperative orthopedic patients. The goal of our study was to determine the educational needs for orthopedic pain management of our residents, nurses, and physical therapists using a quantitative and qualitative assessment. The needs analysis was conducted in a 10-bed orthopedic unit at a teaching hospital and included a survey given to 20 orthopedic residents, 9 nurses, and 6 physical therapists, followed by focus groups addressing barriers to pain control and knowledge of pain management. Key challenges for nurses included not always having breakthrough pain medication orders and the gap in pain management between cessation of patient-controlled analgesia and ordering and administering oral medications. Key challenges for orthopedic residents included treating pain in patients with a history of substance abuse, assessing pain, and determining when to use long-acting vs short-acting opioids. Focus group assessments revealed a lack of training in pain management and the need for better coordination of care between nurses and practitioners and improved education about special needs groups (the elderly and those with substance abuse issues). This needs assessment showed that orthopedic residents and nurses receive little formal education on pain management, despite having to address pain on a daily basis. This information will be used to develop an educational program to improve pain management for postoperative orthopedic patients. An integrated educational program with orthopedic residents, nurses, and physical therapists would promote understanding of issues for each discipline.

  7. Quantitative Integration of Ndt with Probabilistic Fracture Mechanics for the Assessment of Fracture Risk in Pipelines

    NASA Astrophysics Data System (ADS)

    Kurz, J. H.; Cioclov, D.; Dobmann, G.; Boller, C.

    2010-02-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented which is based on the integration of Quantitative Non-destructive Inspection and Probabilistic Fracture Mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The fracture risk is evaluated in probabilistic terms. The probabilistic pattern superposed over the deterministic one is implemented via Monte-Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. The ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI) is an important feature of this approach. It is achieved by algorithmically integrating probabilistic FAD analysis and the Probability of Detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in the probability of failure when POD-characterized NDI is applied can be ascertained. Therefore, this procedure can be used as a tool for inspection-based lifetime conceptions. In this paper, results of sensitivity analyses are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. This enables better substantiation of both component reliability management and the cost-effectiveness of NDI timing.
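
    A minimal Monte Carlo sketch of a probabilistic FAD assessment with and without inspection follows. The failure assessment curve is the generic R6 Option-1 form, and the crack-size, toughness and POD parameters are illustrative placeholders rather than the study's inputs; detected cracks are simply assumed to be repaired.

    ```python
    # Monte Carlo sketch of a probabilistic FAD assessment with and without NDI.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    a = rng.lognormal(mean=np.log(3.0), sigma=0.8, size=n)       # crack depth (mm), placeholder
    Kmat = rng.lognormal(mean=np.log(90.0), sigma=0.2, size=n)   # fracture toughness (MPa*sqrt(m))
    stress = 250.0                                               # applied stress (MPa)
    yield_strength = 400.0

    K_applied = 1.12 * stress * np.sqrt(np.pi * a / 1000.0)      # edge-crack approximation
    Kr = K_applied / Kmat
    Lr = stress / yield_strength                                 # same for all samples here

    def fad_limit(lr):
        """R6 Option-1 failure assessment curve."""
        return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * np.exp(-0.65 * lr**6))

    def pod(a_mm, a50=10.0, slope=2.0):
        """Log-logistic probability of detection versus crack depth (illustrative)."""
        return 1.0 / (1.0 + (a50 / a_mm) ** slope)

    fails = Kr > fad_limit(Lr)
    detected = rng.random(n) < pod(a)                 # detected cracks assumed repaired
    print(f"P(failure) without NDI: {fails.mean():.2e}")
    print(f"P(failure) with NDI:    {(fails & ~detected).mean():.2e}")
    ```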

  8. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best-practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments
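
    The ranking arithmetic of a quantitative FMEA reduces to multiplying probability, severity and detection-difficulty scores into a risk priority number and sorting. The failure modes and scores below are illustrative placeholders, not entries from the QFMEA model.

    ```python
    # Sketch of FMEA-style risk ranking: risk priority number = probability * severity
    # * detection difficulty, then sort descending. Scores are illustrative.
    failure_modes = [
        # (name, probability score, severity score, detection-difficulty score)
        ("Wellbore casing leak",       3, 9, 6),
        ("Caprock fracture",           2, 10, 8),
        ("Pipeline corrosion rupture", 4, 7, 4),
        ("Injection pump failure",     6, 3, 2),
    ]

    ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
    for name, p, s, d in ranked:
        print(f"RPN = {p * s * d:4d}  {name}")
    ```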

  9. Quantitative assessment of properties of make-up products by video imaging: application to lipsticks.

    PubMed

    Korichi, Rodolphe; Provost, Robin; Heusèle, Catherine; Schnebert, Sylvianne

    2000-11-01

    BACKGROUND/AIMS: The different properties and visual effects of lipstick have been studied by image analysis directly on volunteers. METHODS: After controlling the volunteer's position mechanically using an ophthalmic table and visually using an acquisition mask, which is an indicator of luminance and guide marks, we acquired video colour images of the make-up area. From these images, we quantified the colour, gloss, covering power, long-lasting effect and streakiness using computer programs. RESULTS/CONCLUSION: Quantitative colorimetric assessment requires the transformation of the RGB components obtained by a video colour camera into the CIELAB colorimetric space. The expression of each coordinate of the L*a*b* space according to R, G, B was carried out by a statistical method of polynomial approximations. A study using 24 colour images extracted from a Pantone® palette showed a very good correlation with a Minolta Colorimeter® CR 300. The colour assessment on volunteers required a segmentation method based on maximizing the entropy. The aim was to separate the colour information sent back by the skin from that of the make-up area. It was very useful to precisely delimit the contour between the skin and the product in the case of almost identical colours and to evaluate the streakiness. From this colour segmentation, an algorithm was developed to search for the shades most represented in the overall colour of the make-up area. The capacity to replicate what the consumer perceives of the make-up product, to carry out studies without having any contact with the skin surface, and the constant improvement of software and video acquisition systems all make video imaging a very useful tool in the quantitative assessment of the properties and visual effects of a make-up product. PMID:11428961
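
    The RGB-to-CIELAB calibration step can be sketched as an ordinary least-squares fit of polynomial terms of R, G, B to reference L*a*b* values of colour-chart patches. The quadratic term set and the random patch data below are illustrative placeholders for real chart measurements.

    ```python
    # Sketch of an RGB -> CIELAB polynomial calibration fitted by least squares on
    # colour-chart patches with known Lab values. Patch data are random placeholders.
    import numpy as np

    def poly_features(rgb):
        """Quadratic polynomial terms of R, G, B (one common choice; others exist)."""
        r, g, b = rgb[:, 0], rgb[:, 1], rgb[:, 2]
        return np.column_stack([np.ones_like(r), r, g, b, r*g, r*b, g*b, r**2, g**2, b**2])

    rng = np.random.default_rng(3)
    rgb_patches = rng.uniform(0, 1, size=(24, 3))          # camera RGB of 24 chart patches
    lab_patches = rng.uniform(0, 100, size=(24, 3))        # reference L*a*b* of the same patches

    X = poly_features(rgb_patches)
    coeffs, *_ = np.linalg.lstsq(X, lab_patches, rcond=None)   # one coefficient column per L*, a*, b*

    def rgb_to_lab(rgb):
        return poly_features(np.atleast_2d(rgb)) @ coeffs

    print(rgb_to_lab(np.array([0.6, 0.3, 0.4])))
    ```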

  11. Zebrafish Caudal Fin Angiogenesis Assay—Advanced Quantitative Assessment Including 3-Way Correlative Microscopy

    PubMed Central

    Correa Shokiche, Carlos; Schaad, Laura; Triet, Ramona; Jazwinska, Anna; Tschanz, Stefan A.; Djonov, Valentin

    2016-01-01

    Background: Researchers evaluating angiomodulating compounds as part of scientific projects or pre-clinical studies are often confronted with limitations of the applied animal models. Rough and insufficient early-stage compound assessment without reliable quantification of the vascular response accounts, at least partially, for the low rate of translation to the clinic. Objective: To establish an advanced, rapid and cost-effective angiogenesis assay for the precise and sensitive assessment of angiomodulating compounds using zebrafish caudal fin regeneration. It should provide information regarding the angiogenic mechanisms involved and should include qualitative and quantitative data on drug effects in a non-biased and time-efficient way. Approach & Results: Basic vascular parameters (total regenerated area, vascular projection area, contour length, vessel area density) were extracted from in vivo fluorescence microscopy images using a stereological approach. Skeletonization of the vasculature by our custom-made software Skelios provided additional parameters, including "graph energy" and "distance to farthest node". The latter gave important insights into the complexity, connectivity and maturation status of the regenerating vascular network. The employment of a reference point (vascular parameters prior to amputation) is unique for the model and crucial for a proper assessment. Additionally, the assay provides exceptional possibilities for correlative microscopy by combining in vivo imaging and morphological investigation of the area of interest. The 3-way correlative microscopy links the dynamic changes in vivo with their structural substrate at the subcellular level. Conclusions: The improved zebrafish fin regeneration model with advanced quantitative analysis and optional 3-way correlative morphology is a promising in vivo angiogenesis assay, well suited for basic research and preclinical investigations. PMID:26950851

  12. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5×10⁶ km² at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models, as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.
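
    The way the three parts combine numerically can be sketched with a small Monte Carlo: draw the number of undiscovered deposits from the elicited probabilities, then draw tonnage and grade for each deposit from a grade-and-tonnage model and sum the contained metal. All distributions below are illustrative placeholders.

    ```python
    # Monte Carlo sketch of combining (part 3) the number-of-deposits probabilities with
    # (part 2) a grade-and-tonnage model to get a distribution of undiscovered metal.
    import numpy as np

    rng = np.random.default_rng(7)

    # Elicited probabilities for the number of undiscovered deposits in a tract (placeholder)
    n_deposits_pmf = {0: 0.2, 1: 0.3, 2: 0.25, 3: 0.15, 5: 0.1}

    def draw_deposit():
        """Lognormal grade-and-tonnage model (tonnage in Mt, grade in % metal), placeholder."""
        tonnage_mt = rng.lognormal(mean=np.log(8.0), sigma=1.2)
        grade_pct = rng.lognormal(mean=np.log(0.6), sigma=0.4)
        return tonnage_mt * 1e6 * grade_pct / 100.0          # contained metal, tonnes

    totals = []
    for _ in range(50_000):
        n = rng.choice(list(n_deposits_pmf), p=list(n_deposits_pmf.values()))
        totals.append(sum(draw_deposit() for _ in range(n)))

    totals = np.array(totals)
    print(f"mean contained metal: {totals.mean():,.0f} t")
    print(f"P10 / P50 / P90: {np.percentile(totals, [90, 50, 10]).round(0)}")
    ```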

  13. Assessment of Nutritional Status of Nepalese Hemodialysis Patients by Anthropometric Examinations and Modified Quantitative Subjective Global Assessment

    PubMed Central

    Sedhain, Arun; Hada, Rajani; Agrawal, Rajendra Kumar; Bhattarai, Gandhi R; Baral, Anil

    2015-01-01

    OBJECTIVE To assess the nutritional status of patients on maintenance hemodialysis by using modified quantitative subjective global assessment (MQSGA) and anthropometric measurements. METHOD We conducted a cross-sectional descriptive analytical study to assess the nutritional status of fifty-four patients with chronic kidney disease undergoing maintenance hemodialysis by using MQSGA and different anthropometric and laboratory measurements such as body mass index (BMI), mid-arm circumference (MAC), mid-arm muscle circumference (MAMC), triceps skin fold (TSF) and biceps skin fold (BSF), serum albumin, C-reactive protein (CRP) and lipid profile in a government tertiary hospital at Kathmandu, Nepal. RESULTS Based on MQSGA criteria, 66.7% of the patients suffered from mild to moderate malnutrition and 33.3% were well nourished. None of the patients were severely malnourished. CRP was positive in 56.3% of patients. Serum albumin, MAC and BMI were (mean ± SD) 4.0 ± 0.3 mg/dl, 22 ± 2.6 cm and 19.6 ± 3.2 kg/m², respectively. MQSGA showed negative correlation with MAC (r = −0.563; P < 0.001), BMI (r = −0.448; P < 0.001), MAMC (r = −0.506; P < 0.0001), TSF (r = −0.483; P < 0.0002), and BSF (r = −0.508; P < 0.0001). Negative correlation of MQSGA was also found with total cholesterol, triglyceride, LDL cholesterol and HDL cholesterol, without statistical significance. CONCLUSION Mild to moderate malnutrition was found to be present in two thirds of the patients undergoing hemodialysis. Anthropometric measurements such as BMI, MAC, MAMC, BSF and TSF were negatively correlated with MQSGA. Anthropometric and laboratory assessment tools could be used for nutritional assessment as they are relatively easy, cheap and practical markers of nutritional status. PMID:26327781

  14. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk from landslide hazard changes continuously in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This made it possible to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature and one curve derived specifically for our case study area were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors
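
    The element-at-risk loss calculation implied above can be sketched as: run-out probability mapped to flow depth through a linear relation, depth converted to a damage fraction by a vulnerability curve, and the fraction applied to the building value. The linear coefficients, vulnerability curve, event frequency and building values below are hypothetical placeholders.

    ```python
    # Sketch of a per-building debris-flow loss calculation: intensity -> vulnerability -> loss.
    import numpy as np

    def flow_depth_from_probability(p_flow_r, a=2.5, b=0.1):
        """Hypothetical linear link between a run-out probability and flow depth (m)."""
        return a * p_flow_r + b

    def vulnerability(depth_m):
        """Illustrative debris-flow vulnerability curve saturating at total damage."""
        return np.clip(1.0 - np.exp(-1.2 * depth_m), 0.0, 1.0)

    buildings = [
        # (run-out probability at the building, building value in EUR)
        (0.05, 450_000.0),
        (0.30, 320_000.0),
        (0.70, 600_000.0),
    ]

    annual_prob_event = 0.02       # hypothetical debris-flow frequency for the scenario
    total_risk = 0.0
    for p_cell, value in buildings:
        depth = flow_depth_from_probability(p_cell)
        loss = vulnerability(depth) * value
        total_risk += annual_prob_event * loss
        print(f"depth = {depth:.2f} m, expected loss if event occurs = {loss:,.0f} EUR")
    print(f"annualized risk: {total_risk:,.0f} EUR/yr")
    ```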

  15. Quantitative Assessment of Upstream Source Influences on TGM Observations at Three CAMNet Sites

    NASA Astrophysics Data System (ADS)

    Wen, D.; Lin, J. C.; Meng, F.; Gbor, P. K.; He, Z.; Sloan, J. J.

    2009-05-01

    Mercury is a persistent and toxic substance in the environment. Exposure to high levels of mercury can cause a range of adverse health effects, including damage to the nervous system, the reproductive system and childhood development. Proper recognition and prediction of atmospheric mercury levels can help avoid these adverse effects; however, they cannot be achieved without accurate and quantitative identification of source influences, which is a great challenge due to the complexity of Hg behaviour in the air. The objective of this study is to present a new method to simulate Hg concentrations at the location of a monitoring site and quantitatively assess its upstream source influences. Hourly total gaseous mercury (TGM) concentrations at three CAMNet monitoring sites (receptors) in Ontario were predicted for four selected periods using the Stochastic Time-Inverted Lagrangian Transport (STILT) model, which is capable of representing near-field influences that are not resolved by typical grid sizes in transport models. The model was modified to deal with Hg deposition and point-source Hg emissions. The model-predicted Hg concentrations were compared with observations, as well as with the results from a CMAQ-Hg simulation in which the same emission and meteorology inputs were used. The comparisons show that STILT-predicted Hg concentrations agree well with observations, and are generally closer to the observations than those predicted by CMAQ-Hg. The better performance of the STILT simulation can be attributed to its ability to account for near-field influences. STILT was also applied to assess quantitatively the relative importance of different upstream source regions for the selected episodes. The assessment was made based on emission fluxes and STILT footprints, i.e., sensitivities of atmospheric concentrations to upstream surface fluxes. The results indicated that the main source regions of observed low Hg concentrations were in Northeastern Ontario, whereas
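
    The footprint-based source attribution reduces to multiplying the receptor's footprint grid, cell by cell, with an emission-flux grid and summing by upstream region. Both grids and the region split below are random placeholders standing in for STILT output and an Hg emission inventory.

    ```python
    # Sketch of footprint-based source attribution: per-cell contribution = footprint * flux,
    # then sum contributions by upstream region. Grids are random placeholders.
    import numpy as np

    rng = np.random.default_rng(11)
    footprint = rng.exponential(scale=0.02, size=(40, 60))     # receptor sensitivity grid
    emissions = rng.exponential(scale=0.5, size=(40, 60))      # Hg emission flux grid
    region_id = np.tile(np.arange(60) // 20, (40, 1))          # three longitude bands as "regions"

    contribution = footprint * emissions                       # per-cell concentration contribution
    for region in np.unique(region_id):
        share = contribution[region_id == region].sum() / contribution.sum()
        print(f"upstream region {region}: {100 * share:.1f}% of the modeled enhancement")
    ```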

  16. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods.

    PubMed

    Ahmed, Rafay; Oborski, Matthew J; Hwang, Misun; Lieberman, Frank S; Mountz, James M

    2014-01-01

    Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12-15 months for glioblastomas and 2-5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in management of nonresponders can have its greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies, and importantly, for facilitating patient management, sparing patients from weeks or months of toxicity and ineffective treatment. This review will present an overview of epidemiology, molecular pathogenesis and current advances in diagnoses, and management of malignant gliomas.

  17. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    SciTech Connect

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the

  18. The application of quantitative risk assessment to microbial food safety risks.

    PubMed

    Jaykus, L A

    1996-01-01

    Regulatory programs and guidelines for the control of foodborne microbial agents have existed in the U.S. for nearly 100 years. However, increased awareness of the scope and magnitude of foodborne disease, as well as the emergence of previously unrecognized human pathogens transmitted via the foodborne route, have prompted regulatory officials to consider new and improved strategies to reduce the health risks associated with pathogenic microorganisms in foods. Implementation of these proposed strategies will involve definitive costs for a finite level of risk reduction. While regulatory decisions regarding the management of foodborne disease risk have traditionally been done with the aid of the scientific community, a formal conceptual framework for the evaluation of health risks from pathogenic microorganisms in foods is warranted. Quantitative risk assessment (QRA), which is formally defined as the technical assessment of the nature and magnitude of a risk caused by a hazard, provides such a framework. Reproducing microorganisms in foods present a particular challenge to QRA because both their introduction and numbers may be affected by numerous factors within the food chain, with all of these factors representing significant stages in food production, handling, and consumption, in a farm-to-table type of approach. The process of QRA entails four designated phases: (1) hazard identification, (2) exposure assessment, (3) dose-response assessment, and (4) risk characterization. Specific analytical tools are available to accomplish the analyses required for each phase of the QRA. The purpose of this paper is to provide a description of the conceptual framework for quantitative microbial risk assessment within the standard description provided by the National Academy of Sciences (NAS) paradigm. Each of the sequential steps in QRA are discussed in detail, providing information on current applications, tools for conducting the analyses, and methodological and/or data

  19. Quantitative assessment of cutaneous sensory function in subjects with neurologic disease.

    PubMed

    Conomy, J P; Barnes, K L

    1976-12-01

    Based upon techniques devised for the behavioral study of cutaneous sensation in monkeys, a method has been developed for the quantitative study of cutaneous sensation in man. The technique is analogous to the von Békésy method of audiometry and employs a subject-operated stimulus and signalling device. In tests utilizing electrical stimulation of the skin surface, the subject serves as his own control for comparison of one cutaneous zone with another and from one trial session to another. A permanent, written record of stimulus and nonverbal perceptual response is produced in this instrumental method, which permits statistical analysis of responses. The analysis includes determination of cutaneous sensory thresholds, limits of stimulus intensity during detection, duration of perception, detection cycle rates, and persistence indices. This instrumental method of cutaneous sensory assessment is quantifiable, free of verbal bias, and repeatable in terms of defined stimulus strengths. In applied clinical studies, patients with peripheral nerve lesions show elevations of perceptual thresholds, reduced numbers of detection-disappearance cycles per unit time, prolonged, contorted decay slopes, and occasionally persistence of perception in the absence of stimulation. Patients with central lesions have variable threshold abnormalities, but little slowing of cycle rate or perceptual persistence. These quantitative sensation parameters can be evaluated longitudinally during the course of an illness and its treatment. The method has potential use in the investigation of basic aspects of sensation and its interactions with behavior.

  20. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment

    PubMed Central

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A.; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

    We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species using a UPLC-UV-MS/MS method. Ten of the 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4′-MDI: 0.52 to 140.1 pg/mg; 2,4′-MDI: 0.01 to 4.48 pg/mg). The 4,4′-MDI species had the highest measured concentration (280 pg/mg). Commonly used medical devices and products contain low, but measurable, concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation. PMID:27773989

  1. Quantitative assessment of mis-registration issues of diffusion tensor imaging (DTI)

    NASA Astrophysics Data System (ADS)

    Li, Yue; Jiang, Hangyi; Mori, Susumu

    2012-02-01

    Image distortions caused by eddy currents and patient motion have been two major sources of mis-registration issues in diffusion tensor imaging (DTI). Numerous registration methods have been proposed to correct them. However, quality control of DTI remains an important issue, because we rarely report how much mis-registration existed and how well it was corrected. In this paper, we propose a method for quantitative reporting of DTI data quality. The registration method minimizes a cost function based on mean square tensor fitting errors. Registration with a twelve-parameter full affine transformation is used. From the registration result, distortion and motion parameters are estimated. Because the translation parameters involve both eddy-current-induced image translation and patient motion, we analyze the transformation model and separate the two by removing the contributions that are linearly correlated with the diffusion gradients. We define metrics measuring the amounts of distortion, rotation, and translation. We tested our method on a database of 64 subjects and derived the statistics of each metric. Finally, we demonstrate in several examples how these statistics can be used to assess data quality quantitatively.
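
    The separation of eddy-current translation from patient motion described above can be sketched as a linear regression of the per-volume translation parameters on the diffusion-gradient components; the fitted part is attributed to eddy currents and the residual to motion. The gradient table and translations below are synthetic, and only one translation axis is shown.

    ```python
    # Sketch: regress per-volume translations onto diffusion-gradient components and
    # split them into a gradient-correlated (eddy-current) part and a residual (motion).
    import numpy as np

    rng = np.random.default_rng(5)
    n_volumes = 32
    gradients = rng.normal(size=(n_volumes, 3))
    gradients /= np.linalg.norm(gradients, axis=1, keepdims=True)   # unit b-vectors

    # synthetic translations (mm): a gradient-linear (eddy-current) part plus motion noise
    eddy_true = gradients @ np.array([[0.4], [-0.2], [0.1]])
    translations = eddy_true + rng.normal(scale=0.05, size=(n_volumes, 1))

    coeffs, *_ = np.linalg.lstsq(gradients, translations, rcond=None)
    eddy_estimate = gradients @ coeffs
    motion_estimate = translations - eddy_estimate

    print(f"eddy-current translation RMS: {np.sqrt(np.mean(eddy_estimate**2)):.3f} mm")
    print(f"residual motion RMS:          {np.sqrt(np.mean(motion_estimate**2)):.3f} mm")
    ```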

  2. Quantitative assessment of developmental levels in overarm throwing using wearable inertial sensing technology.

    PubMed

    Grimpampi, Eleni; Masci, Ilaria; Pesce, Caterina; Vannozzi, Giuseppe

    2016-09-01

    Motor proficiency in childhood has been recently recognised as a public health determinant, having a potential impact on the physical activity level and possible sedentary behaviour of the child later in life. Among fundamental motor skills, ballistic skills assessment based on in-field quantitative observations is progressively needed in the motor development community. The aim of this study was to propose an in-field quantitative approach to identify different developmental levels in overarm throwing. Fifty-eight children aged 5-10 years performed an overarm throwing task while wearing three inertial sensors located at the wrist, trunk and pelvis and were then categorised using a developmental sequence of overarm throwing. A set of biomechanical parameters were defined and analysed using multivariate statistics to evaluate whether they can be used as developmental indicators. Trunk and pelvis angular velocities and time durations before ball release showed increasing/decreasing trends with increasing developmental level. Significant differences between developmental level pairs were observed for selected biomechanical parameters. The results support the suitability and feasibility of objective developmental measures in ecological learning contexts, suggesting their potential to support motor learning experiences in educational and youth sports training settings. PMID:26818205

  3. Quantitative gallium 67 lung scan to assess the inflammatory activity in the pneumoconioses

    SciTech Connect

    Bisson, G.; Lamoureux, G.; Begin, R.

    1987-01-01

    Gallium-67 (⁶⁷Ga) lung scanning has recently become increasingly used to evaluate the biological activity of the alveolitis of interstitial lung diseases and to stage the disease process. In order to have a more precise and objective indicator of inflammatory activity in the lung, we and others have developed computer-based quantitative techniques to process the ⁶⁷Ga scan. In this report, we compare the results of three such computer-based methods of analysis applied to the scans of 38 normal humans and 60 patients suspected to have pneumoconiosis. Results of previous investigations on the mechanisms of ⁶⁷Ga uptake in interstitial lung disease are reviewed. These data strengthen the view that quantitative ⁶⁷Ga lung scanning has become a standard technique for assessing inflammatory activity in the interstitial lung diseases, and that computer-based analysis of the scan provides an index of inflammatory activity that correlates with lung lavage and biopsy indices of inflammation in lung tissue. 51 references.

  4. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectra of 39 teeth samples, classified by International Caries Detection and Assessment System levels, were investigated at 405 nm excitation. The major differences between the caries lesion classes were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of the color components was statistically computed by accounting for the spectral characteristics (e.g. autofluorescence, optical filter, and spectral sensitivity) of our fluorescence color imaging system. Results showed that the spectral parameter and the image component ratio follow a linear relation. The image component ratio was therefore graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. The method provides numerical caries grades for a fluorescence imaging system and can be applied to similar imaging systems.
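
    As a rough illustration of the grading rule reported above (a sketch only, not the authors' implementation; the input image is synthetic), the image component ratio R/(G + B) can be computed per pixel and thresholded into the four classes:

        import numpy as np

        def caries_grade(rgb_image):
            """Return the R/(G+B) ratio and its class label (0=sound ... 3=severe decay)."""
            rgb = rgb_image.astype(float)
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            ratio = r / np.clip(g + b, 1e-6, None)            # guard against division by zero
            labels = np.digitize(ratio, [0.66, 1.06, 1.62])   # thresholds from the abstract
            return ratio, labels

        ratio, labels = caries_grade(np.random.randint(0, 256, size=(4, 4, 3)))
        print(labels)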

  5. Specific and quantitative assessment of naphthalene and salicylate bioavailability by using a bioluminescent catabolic reporter bacterium

    SciTech Connect

    Heitzer, A.; Thonnard, J.E.; Sayler, G.S.; Webb, O.F.

    1992-06-01

    A bioassay was developed and standardized for the rapid, specific, and quantitative assessment of naphthalene and salicylate bioavailability by use of bioluminescence monitoring of catabolic gene expression. The bioluminescent reporter strain Pseudomonas fluorescens HK44, which carries a transcriptional nahG-luxCDABE fusion for naphthalene and salicylate catabolism, was used. The physiological state of the reporter cultures as well as the intrinsic regulatory properties of the naphthalene degradation operon must be taken into account to obtain a high specificity at low target substrate concentrations. Experiments have shown that the use of exponentially growing reporter cultures has advantages over the use of carbon-starved, resting cultures. In aqueous solutions for both substrates, naphthalene and salicylate, linear relationships between initial substrate concentration and bioluminescence response were found over concentration ranges of 1 to 2 orders of magnitude. Naphthalene could be detected at a concentration of 45 ppb. Studies conducted under defined conditions with extracts and slurries of experimentally contaminated sterile soils and identical uncontaminated soil controls demonstrated that this method can be used for specific and quantitative estimations of target pollutant presence and bioavailability in soil extracts and for specific and qualitative estimations of naphthalene in soil slurries.

  6. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-04-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen.
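
    The link between shear wave velocity and clot stiffness exploited above is usually expressed through the standard elastography relation G = ρc²; a brief sketch under that assumption (density and velocity values are hypothetical, not the study's measurements):

        import numpy as np

        rho = 1060.0                                     # assumed whole-blood density, kg/m^3
        t_min = np.array([0, 5, 10, 20, 30])             # minutes after recalcification (hypothetical)
        c_shear = np.array([0.3, 0.6, 1.1, 1.6, 1.7])    # measured shear wave speed, m/s (hypothetical)

        G = rho * c_shear**2                             # shear modulus, Pa
        reaction_time = t_min[np.argmax(G > 2 * G[0])]   # first clear rise in stiffness (toy rule)
        print(G, "max shear modulus:", G.max(), "Pa, reaction time ~", reaction_time, "min")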

  7. Quantitative assessment of apatite formation via a biomimetic method using quartz crystal microbalance.

    PubMed

    Tanahashi, M; Kokubo, T; Matsuda, T

    1996-06-01

    Quantitative assessment of hydroxyapatite formation on a gold surface via the biomimetic method, composed of a nucleation step in a simulated body fluid (SBF) containing glass powders and a subsequent apatite growth step in glass powder-free SBF, was made using a quartz crystal microbalance (QCM) technique. The frequency change of the QCM linearly increased with increasing soaking time, and largely depended on the nucleation period. The growth rates, defined as the daily increase in thickness, increased monotonically with an increasing nucleation period of up to 96 h, thereafter being constant at 2.0 microns/day. The growth rate of the apatite layer increased with increasing temperature of the SBF: 0.9, 2.0, and 3.8 microns/day at 25, 37, and 50 degrees C, respectively. The Arrhenius-type activation energy for the growth of apatite was 47.3 kJ/mol. The QCM method was found to be a very powerful tool for quantitative, in situ measurement of the precipitation and growth of apatite in real time.
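
    As a quick consistency check on the reported activation energy, an Arrhenius fit of the three growth rates quoted above (ln(rate) regressed against 1/T) lands close to the stated 47.3 kJ/mol; a minimal sketch:

        import numpy as np

        T = np.array([25.0, 37.0, 50.0]) + 273.15    # SBF temperature, K
        rate = np.array([0.9, 2.0, 3.8])             # apatite growth rate, microns/day
        R = 8.314                                    # gas constant, J/(mol K)

        slope, intercept = np.polyfit(1.0 / T, np.log(rate), 1)
        Ea = -slope * R                              # Arrhenius: ln(rate) = ln(A) - Ea/(R*T)
        print(f"Estimated activation energy: {Ea / 1000:.1f} kJ/mol")   # ~46 kJ/mol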

  8. Noninvasive Qualitative and Quantitative Assessment of Spoilage Attributes of Chilled Pork Using Hyperspectral Scattering Technique.

    PubMed

    Zhang, Leilei; Peng, Yankun

    2016-08-01

    The objective of this research was to develop a rapid noninvasive method for quantitative and qualitative determination of chilled pork spoilage. Microbiological, physicochemical, and organoleptic characteristics such as the total viable count (TVC), Pseudomonas spp., total volatile basic nitrogen (TVB-N), pH value, and color parameter L* were determined to appraise pork quality. The hyperspectral scattering characteristics of 54 meat samples were accurately fitted by a four-parameter modified Gompertz function. Support vector machines (SVM) were applied to establish a quantitative prediction model between the scattering fitting parameters and the reference values. In addition, partial least squares discriminant analysis (PLS-DA) and Bayesian analysis were utilized as supervised and unsupervised techniques for the qualitative identification of meat spoilage. All stored chilled meat samples were classified into three grades: "fresh," "semi-fresh," and "spoiled." The Bayesian classification model was superior to PLS-DA, with an overall classification accuracy of 92.86%. The results demonstrated that the hyperspectral scattering technique combined with SVM and Bayesian analysis provides a powerful capability for rapid, noninvasive meat spoilage assessment. PMID:27340214
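
    A sketch of the two modelling steps mentioned above, assuming one common four-parameter modified Gompertz form for the scattering profile (the exact parameterization used by the authors may differ) and a generic SVM regressor; all data here are synthetic placeholders:

        import numpy as np
        from scipy.optimize import curve_fit
        from sklearn.svm import SVR

        def gompertz(x, a, b, c, d):
            # An assumed four-parameter modified Gompertz profile
            return a + b * np.exp(-np.exp(c * (x - d)))

        # Step 1: fit the scattering profile of a sample (synthetic data)
        x = np.linspace(0, 10, 50)
        profile = gompertz(x, 0.2, 1.0, 0.8, 4.0) + np.random.normal(0, 0.01, x.size)
        params, _ = curve_fit(gompertz, x, profile, p0=[0.1, 1.0, 1.0, 5.0])

        # Step 2: regress a reference value (e.g. TVB-N) on the fitted parameters of 54 samples
        X = np.tile(params, (54, 1)) + np.random.normal(0, 0.05, (54, 4))   # synthetic feature matrix
        y = np.random.uniform(5, 40, 54)                                    # synthetic reference values
        model = SVR(kernel="rbf").fit(X, y)
        print(model.predict(params.reshape(1, -1)))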

  10. Provisional guidance for quantitative risk assessment of polycyclic aromatic hydrocarbons. Final report

    SciTech Connect

    Schoeny, R.; Poirier, K.

    1993-07-01

    PAHs are products of incomplete combustion of organic materials; sources are thus widespread, including cigarette smoke, municipal waste incineration, wood stove emissions, coal conversion, energy production from fossil fuels, and automobile and diesel exhaust. As PAHs are common environmental contaminants, it is important that EPA have a scientifically justified, consistent approach to the evaluation of human health risk from exposure to these compounds. For the majority of PAHs classified as B2, probable human carcinogen, data are insufficient for calculation of an inhalation or drinking water unit risk. Benzo(a)pyrene (BAP) is the most completely studied of the PAHs, and its data, while problematic, are sufficient for calculation of quantitative estimates of carcinogenic potency. Toxicity Equivalency Factors (TEF) have been used by U.S. EPA on an interim basis for risk assessment of chlorinated dibenzodioxins and dibenzofurans. Data for PAHs do not meet all criteria for use of TEF. The document presents a somewhat different approach to quantitative estimation for PAHs, using weighted potential potencies.
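
    A minimal illustration of the weighted-potency idea described above (the relative potency factors, slope factor and intake factor below are hypothetical placeholders, not the values recommended in the document): each PAH concentration is converted to a benzo(a)pyrene equivalent before the BaP potency estimate is applied.

        # Hypothetical relative potency factors (RPFs) and concentrations, for illustration only
        rpf = {"benzo(a)pyrene": 1.0, "benz(a)anthracene": 0.1, "chrysene": 0.01}
        conc_mg_per_kg = {"benzo(a)pyrene": 0.5, "benz(a)anthracene": 2.0, "chrysene": 4.0}

        bap_equivalent = sum(conc_mg_per_kg[p] * rpf[p] for p in rpf)   # BaP-equivalent concentration
        bap_slope_factor = 7.3        # assumed BaP oral slope factor, (mg/kg-day)^-1
        intake_factor = 1e-6          # hypothetical conversion from mg/kg soil to mg/kg-day intake

        risk = bap_equivalent * intake_factor * bap_slope_factor
        print(f"BaP-equivalent = {bap_equivalent:.2f} mg/kg, screening risk = {risk:.2e}")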

  11. Quantitative assessment of resilience of a water supply system under rainfall reduction due to climate change

    NASA Astrophysics Data System (ADS)

    Amarasinghe, Pradeep; Liu, An; Egodawatta, Prasanna; Barnes, Paul; McGree, James; Goonetilleke, Ashantha

    2016-09-01

    A water supply system can be impacted by rainfall reduction due to climate change, thereby reducing its supply potential. This highlights the need to understand the system's resilience, which refers to its ability to maintain service under various pressures (or disruptions). Currently, the concept of resilience has not yet been widely applied in managing water supply systems. This paper proposes three technical resilience indicators to assess the resilience of a water supply system. A case study analysis was undertaken of the Water Grid system of Queensland State, Australia, to showcase how the proposed indicators can be applied to assess resilience. The research outcomes confirmed that the use of resilience indicators is capable of identifying critical conditions in relation to water supply system operation, such as the maximum allowable rainfall reduction for the system to maintain its operation without failure. Additionally, the resilience indicators also provided useful insight into the sensitivity of the water supply system to a changing rainfall pattern in the context of climate change, which represents the system's stability when experiencing pressure. The study outcomes will help in the quantitative assessment of resilience and provide improved guidance to system operators to enhance the efficiency and reliability of a water supply system.

  12. Quantitative Gait Measurement With Pulse-Doppler Radar for Passive In-Home Gait Assessment

    PubMed Central

    Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E.

    2014-01-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that, for optimal step recognition and walking speed estimation, a dual-radar setup with one radar placed at foot level and the other at torso level is necessary. An excellent absolute agreement with an intraclass correlation coefficient of 0.97 was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, both have a systematic offset compared to the ground truth due to the walking direction with respect to the radar beam. The torso-level radar has a better performance (9% offset on average) in the speed estimation compared to the foot-level radar (13%–18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment. PMID:24771566
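
    The systematic speed offset discussed above stems from the angle between the walking direction and the radar beam; a sketch of the underlying Doppler relation (carrier frequency, Doppler shift and angle are hypothetical values, not the system's specifications):

        import numpy as np

        c = 3e8                  # speed of light, m/s
        f0 = 5.8e9               # assumed radar carrier frequency, Hz
        f_doppler = 42.0         # measured Doppler shift, Hz (hypothetical)

        v_radial = f_doppler * c / (2 * f0)      # speed component along the radar beam
        theta = np.deg2rad(25.0)                 # angle between beam and walking direction
        v_true = v_radial / np.cos(theta)        # corrected walking speed
        print(f"radial {v_radial:.2f} m/s -> corrected {v_true:.2f} m/s "
              f"({(1 - np.cos(theta)) * 100:.0f}% underestimate if uncorrected)")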

  13. Quantitative gait measurement with pulse-Doppler radar for passive in-home gait assessment.

    PubMed

    Wang, Fang; Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E

    2014-09-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that, for optimal step recognition and walking speed estimation, a dual-radar setup with one radar placed at foot level and the other at torso level is necessary. An excellent absolute agreement with an intraclass correlation coefficient of 0.97 was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, both have a systematic offset compared to the ground truth due to the walking direction with respect to the radar beam. The torso-level radar has a better performance (9% offset on average) in the speed estimation compared to the foot-level radar (13%-18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment.

  14. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658
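
    A toy version of the inhalation QMRA step described above, assuming the exponential dose-response form often used for respiratory viruses; the dose-response parameter, air concentrations, exposure durations and breathing rate below are placeholders, not the study's values:

        import numpy as np

        def p_infection(conc_per_m3, breathing_rate_m3_h, hours, r=0.4):
            """Exponential dose-response: P = 1 - exp(-r * dose)."""
            dose = conc_per_m3 * breathing_rate_m3_h * hours    # inhaled viral units
            return 1.0 - np.exp(-r * dose)

        # (air concentration per m^3, exposure hours) -- hypothetical per-setting values
        settings = {"toilet": (10.0, 0.25), "wastewater plant": (2.0, 8.0), "landfill": (0.5, 8.0)}
        for name, (conc, hours) in settings.items():
            print(name, round(p_infection(conc, breathing_rate_m3_h=1.5, hours=hours), 3))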

  15. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative risk assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no expected sensitisation induction level (NESIL), (b) incorporation of sensitisation assessment factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of the consumer exposure level (CEL). Based on these elements an acceptable exposure level (AEL) can be calculated by dividing the NESIL of the product by the individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions, referred to as the measured exposure level (MEL). For that reason a direct comparison is possible between the NESIL and the MEL, as a proof-of-concept quantification of the risk of skin sensitisation. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement over a hazard-based classification of hair dye ingredients. PMID:23069142
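
    A compact sketch of the arithmetic behind the QRA scheme summarised above (all numbers hypothetical): the AEL is the NESIL divided by the product of the SAFs, and the exposure estimate (the CEL, or the measured MEL in the simplified approach) is then compared against it.

        def acceptable_exposure_level(nesil_ug_cm2, saf_interindividual, saf_matrix, saf_use):
            """AEL = NESIL / (product of the sensitisation assessment factors)."""
            return nesil_ug_cm2 / (saf_interindividual * saf_matrix * saf_use)

        nesil = 250.0     # hypothetical NESIL, ug/cm^2
        ael = acceptable_exposure_level(nesil, saf_interindividual=10, saf_matrix=3, saf_use=3)
        mel = 1.5         # hypothetical measured exposure level, ug/cm^2

        print(f"AEL = {ael:.1f} ug/cm^2 -> "
              f"{'acceptable' if mel <= ael else 'risk management needed'}")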

  17. Quantitative Assessment of Eye Phenotypes for Functional Genetic Studies Using Drosophila melanogaster

    PubMed Central

    Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S.; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R.; Isaacs, Adrian M.; Partridge, Linda; Lu, Bingwei; Kumar, Justin P.; Girirajan, Santhosh

    2016-01-01

    About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292

  18. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  19. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    SciTech Connect

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-07-15

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of certain specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here results of the application of QSTRs to assess rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  20. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

    Adams, S.M.; Brown, A.M.; Goede, R.W.

    1993-01-01

    The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina), which is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina), which receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing this index to other types of fish health measures (contaminant, bioindicator, and reproductive analyses) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as did each of the other more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.
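
    A minimal sketch of how an HAI-style score might be tallied (the variables and severity values below are hypothetical; the published index defines its own variables and value assignments): each organ or tissue observation is assigned a numerical severity and the values are summed per fish, so that site means can be compared statistically.

        # Hypothetical severity assignments for a few necropsy variables (0 = normal)
        severity = {"liver": {"normal": 0, "fatty": 10, "nodular": 30},
                    "gills": {"normal": 0, "frayed": 10, "clubbed": 20},
                    "spleen": {"normal": 0, "enlarged": 20}}

        def hai_score(observations):
            """Sum the severity values of one fish's necropsy observations."""
            return sum(severity[organ][state] for organ, state in observations.items())

        site_scores = [hai_score({"liver": "normal", "gills": "frayed", "spleen": "normal"}),
                       hai_score({"liver": "fatty", "gills": "clubbed", "spleen": "enlarged"})]
        print(sum(site_scores) / len(site_scores))   # mean HAI for the site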

  1. Quantitative assessment of reactive hyperemia using laser speckle contrast imaging at multiple wavelengths

    NASA Astrophysics Data System (ADS)

    Young, Anthony; Vishwanath, Karthik

    2016-03-01

    Reactive hyperemia refers to the increase in blood flow in tissue after release of an occlusion in the local vasculature. Measuring the temporal response of post-occlusion reactive hyperemia in patients has the potential to shed light on microvascular diseases such as systemic sclerosis and diabetes. Laser speckle contrast imaging (LSCI) is an imaging technique capable of sensing superficial blood flow in tissue and can be used to quantitatively assess reactive hyperemia. Here, we employ LSCI using coherent sources at blue, green and red wavelengths to evaluate reactive hyperemia in healthy human volunteers. Blood flow in the forearms of subjects was measured using LSCI to assess the time course of reactive hyperemia triggered by a pressure cuff applied to the biceps. Raw speckle images were acquired and processed to yield blood-flow parameters from a region of interest before, during and after application of the occlusion. Reactive hyperemia was quantified via two measures: (1) the difference between the peak LSCI flow during hyperemia and the baseline flow, and (2) the time that elapsed between the release of the occlusion and the peak flow. These measurements were acquired in three healthy human participants under the three laser wavelengths employed. The study sheds light on the utility of in vivo LSCI-based flow sensing for non-invasive assessment of reactive hyperemia responses and on how the choice of source wavelength influences the measured parameters.
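
    A sketch of the two hyperemia measures defined above, applied to a per-frame LSCI flow index (the time series below is synthetic; frame rate and flow values are hypothetical):

        import numpy as np

        fps = 10.0
        t = np.arange(150) / fps                                    # seconds after cuff release
        hyperemia = 1.0 + 1.5 * (t / 3.0) * np.exp(1.0 - t / 3.0)   # rises, peaks near 3 s, decays
        flow = np.concatenate([np.full(100, 1.0),                   # baseline flow index
                               np.full(50, 0.2),                    # during occlusion
                               hyperemia])                          # after release

        baseline = flow[:100].mean()
        post_release = flow[150:]
        peak_minus_baseline = post_release.max() - baseline         # measure (1)
        time_to_peak_s = post_release.argmax() / fps                # measure (2)
        print(f"peak - baseline = {peak_minus_baseline:.2f}, time to peak = {time_to_peak_s:.1f} s")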

  2. Coherent and consistent decision making for mixed hazardous waste management: The application of quantitative assessment techniques

    SciTech Connect

    Smith, G.M.; Little, R.H.; Torres, C.

    1994-12-31

    This paper focuses on predictive modelling capacity for post-disposal safety assessments of land-based disposal facilities, illustrated by the development and application of a comprehensive yet practicable assessment framework. The issues addressed include: (1) land-based disposal practice, (2) the conceptual and mathematical representation of processes leading to release, migration and accumulation of contaminants, (3) the identification and evaluation of relevant assessment end-points, including human health, the health of non-human biota and ecosystems, and property and resource effects, (4) the gap between data requirements and data availability, and (5) the application of results in decision making, given the uncertainties in assessment results and the difficulty of comparing qualitatively different impacts arising on different temporal and spatial scales. The paper illustrates these issues with examples based on disposal of metals and radionuclides to shallow facilities. The types of disposal facility considered include features consistent with facilities for radioactive wastes as well as other types of design more typical of hazardous wastes. The intention is to raise the question of whether radioactive and other hazardous wastes are being consistently managed, and to show that assessment methods are being developed which can provide quantitative information on the levels of environmental impact as well as a consistent approach for different types of waste. Such methods can then be applied to mixed hazardous wastes containing radionuclides as well as other contaminants. The remaining question is whether the will exists to employ them. The discussion and worked illustrations are based on a methodology developed, and being extended, within the current European Atomic Energy Community's cost-sharing research program on radioactive waste management and disposal, with co-funding support from Empresa Nacional de Residuos Radiactivos SA, Spain.

  3. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background: The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members in evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives: We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure that learning objectives were met through mutually beneficial CBPR approaches. Methods: A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from the quantitative questions on the assessments, pre- and post-tests, and evaluations. Results: CARES fellows' knowledge increased at follow-up (75% of questions answered correctly on average) compared with the baseline assessment (38% answered correctly on average); post-test scores were higher than pre-test scores in 9 of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions: The CARES fellows training program was successful in terms of participant satisfaction and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community–academic research partnerships. PMID:22982849

  4. Evaluation of dental enamel caries assessment using Quantitative Light Induced Fluorescence and Optical Coherence Tomography.

    PubMed

    Maia, Ana Marly Araújo; de Freitas, Anderson Zanardi; de L Campello, Sergio; Gomes, Anderson Stevens Leônidas; Karlsson, Lena

    2016-06-01

    An in vitro study of morphological alterations between sound dental structure and artificially induced white spot lesions in human teeth was performed, through the loss of fluorescence measured by Quantitative Light-Induced Fluorescence (QLF) and the change in light attenuation coefficient measured by Optical Coherence Tomography (OCT). To analyze the OCT images from a commercially available system, a special algorithm was applied, whereas the QLF images were analyzed using the software provided with the commercial system employed. When comparing sound regions against white spot lesion regions, a reduction in fluorescence intensity was observed with QLF, whilst an increase in light attenuation was observed with the OCT system. Comparison of the percentage of alteration between the optical properties of sound and artificial enamel caries regions showed that OCT images processed for light attenuation enhanced the tooth's optical alterations more than the fluorescence loss detected by the QLF system.

  5. Quantitative assessment of the surface crack density in thermal barrier coatings

    NASA Astrophysics Data System (ADS)

    Yang, Li; Zhong, Zhi-Chun; Zhou, Yi-Chun; Lu, Chun-Sheng

    2014-04-01

    In this paper, a modified shear-lag model is developed to calculate the surface crack density in thermal barrier coatings (TBCs). The mechanical properties of TBCs are also measured to quantitatively assess their surface crack density. Acoustic emission (AE) and digital image correlation methods are applied to monitor the surface cracking in TBCs under tensile loading. The results show that the calculated surface crack density from the modified model is in agreement with that obtained from experiments. The surface cracking process of TBCs can be discriminated by their AE characteristics and strain evolution. Based on the correlation of energy released from cracking and its corresponding AE signals, a linear relationship is built up between the surface crack density and AE parameters, with the slope being dependent on the mechanical properties of TBCs.

  6. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  7. Estimation methods for monthly humidity from dynamical downscaling data for quantitative assessments of climate change impacts

    NASA Astrophysics Data System (ADS)

    Ueyama, Hideki

    2012-07-01

    Methods are proposed for estimating monthly relative humidity and wet bulb temperature from data dynamically downscaled from a coupled general circulation model with a regional climate model (RCM), for quantitative assessment of climate change impacts. The water vapor pressure estimation model developed was a regression model using the monthly saturated water vapor pressure at the minimum air temperature as a variable. The monthly minimum air temperature correction model for RCM bias was developed by stepwise multiple regression analysis, using the difference in monthly minimum air temperature between observations and RCM output as the dependent variable and geographic factors as independent variables. The wet bulb temperature was estimated using the estimated water vapor pressure together with air temperature and atmospheric pressure at ground level, both corrected for RCM bias. Root mean square errors decreased considerably in August.

  8. Quantitative assessment of motion correction for high angular resolution diffusion imaging.

    PubMed

    Sakaie, Ken E; Lowe, Mark J

    2010-02-01

    Several methods have been proposed for motion correction of high angular resolution diffusion imaging (HARDI) data. There have been few comparisons of these methods, partly due to a lack of quantitative metrics of performance. We compare two motion correction strategies using two figures of merit: displacement introduced by the motion correction and the 95% confidence interval of the cone of uncertainty of voxels with prolate tensors. What follows is a general approach for assessing motion correction of HARDI data that may have broad application for quality assurance and optimization of postprocessing protocols. Our analysis demonstrates two important issues related to motion correction of HARDI data: (1) although neither method we tested was dramatically superior in performance, both were dramatically better than performing no motion correction, and (2) iteration of motion correction can improve the final results. Based on the results demonstrated here, iterative motion correction is strongly recommended for HARDI acquisitions. PMID:19695824

  9. Quantitative risk assessment & leak detection criteria for a subsea oil export pipeline

    NASA Astrophysics Data System (ADS)

    Zhang, Fang-Yuan; Bai, Yong; Badaruddin, Mohd Fauzi; Tuty, Suhartodjo

    2009-06-01

    A quantitative risk assessment (QRA) based on leak detection criteria (LDC) for the design of a proposed subsea oil export pipeline is presented in this paper. The objective of this QRA/LDC study was to determine if current leak detection methodologies were sufficient, based on QRA results, while excluding the use of statistical leak detection; if not, an appropriate LDC for the leak detection system would need to be established. The famous UK PARLOC database was used for the calculation of pipeline failure rates, and the software POSVCM from MMS was used for oil spill simulations. QRA results revealed that the installation of a statistically based leak detection system (LDS) can significantly reduce time to leak detection, thereby mitigating the consequences of leakage. A sound LDC has been defined based on QRA study results and comments from various LDS vendors to assist the emergency response team (ERT) to quickly identify and locate leakage and employ the most effective measures to contain damage.

  10. Attribution of human VTEC O157 infection from meat products: a quantitative risk assessment approach.

    PubMed

    Kosmider, Rowena D; Nally, Pádraig; Simons, Robin R L; Brouwer, Adam; Cheung, Susan; Snary, Emma L; Wooldridge, Marion

    2010-05-01

    To address the risk posed to human health by the consumption of VTEC O157 within contaminated pork, lamb, and beef products within Great Britain, a quantitative risk assessment model has been developed. This model aims to simulate the prevalence and amount of VTEC O157 in different meat products at consumption within a single model framework by adapting previously developed models. The model is stochastic in nature, enabling both variability (natural variation between animals, carcasses, products) and uncertainty (lack of knowledge) about the input parameters to be modeled. Based on the model assumptions and data, it is concluded that the prevalence of VTEC O157 in meat products (joints and mince) at consumption is low (i.e., <0.04%). Beef products, particularly beef burgers, present the highest estimated risk with an estimated eight out of 100,000 servings on average resulting in human infection with VTEC O157.

  11. A quantitative structure-activity relationship approach for assessing toxicity of mixture of organic compounds.

    PubMed

    Chang, C M; Ou, Y H; Liu, T-C; Lu, S-Y; Wang, M-K

    2016-06-01

    Four types of reactivity indices were employed to construct quantitative structure-activity relationships for assessing the toxicity of mixtures of organic chemicals. The analysis indicated that the maximum positive charge of the hydrogen atom and the inverse of the apolar surface area are the most important descriptors for the toxicity of mixtures of benzene and its derivatives to Vibrio fischeri. The toxicity of mixtures of aromatic compounds to the green alga Scenedesmus obliquus is mainly affected by electron flow and electrostatic interactions. The electron-acceptance chemical potential and the maximum positive charge of the hydrogen atom are found to be the most important descriptors for the joint toxicity of aromatic compounds.

  12. A quantitative assessment of using the Kinect for Xbox 360 for respiratory surface motion tracking

    NASA Astrophysics Data System (ADS)

    Alnowami, M.; Alnwaimi, B.; Tahavori, F.; Copland, M.; Wells, K.

    2012-02-01

    This paper describes a quantitative assessment of the Microsoft Kinect for Xbox 360™ for potential application in tracking respiratory and body motion in diagnostic imaging and external beam radiotherapy; the results may also be relevant to many other biomedical applications. We consider the performance of the Kinect under controlled conditions and find millimetre precision at depths of 0.8-1.5 m. We also demonstrate the use of the Kinect for monitoring respiratory motion of the anterior surface. To improve the performance of respiratory monitoring, we fit a spline model of the chest surface to the depth data as a means of marker-less respiratory motion monitoring. In addition, a comparison between the Kinect camera, with and without a zoom lens, and a marker-based system was used to evaluate the accuracy of the Kinect camera as a respiratory tracking system.

  13. Quantitative Framework for Retrospective Assessment of Interim Decisions in Clinical Trials

    PubMed Central

    Stanev, Roger

    2016-01-01

    This article presents a quantitative way of modeling the interim decisions of clinical trials. While statistical approaches tend to focus on the epistemic aspects of statistical monitoring rules, often overlooking ethical considerations, ethical approaches tend to neglect the key epistemic dimension. The proposal is a second-order decision-analytic framework. The framework provides means for retrospective assessment of interim decisions based on a clear and consistent set of criteria that combines both ethical and epistemic considerations. The framework is broadly Bayesian and addresses a fundamental question behind many concerns about clinical trials: What does it take for an interim decision (e.g., whether to stop the trial or continue) to be a good decision? Simulations illustrating the modeling of interim decisions counterfactually are provided. PMID:27353825

  14. Quantitative assessment of the stent/scaffold strut embedment analysis by optical coherence tomography.

    PubMed

    Sotomi, Yohei; Tateishi, Hiroki; Suwannasom, Pannipa; Dijkstra, Jouke; Eggermont, Jeroen; Liu, Shengnan; Tenekecioglu, Erhan; Zheng, Yaping; Abdelghani, Mohammad; Cavalcante, Rafael; de Winter, Robbert J; Wykrzykowska, Joanna J; Onuma, Yoshinobu; Serruys, Patrick W; Kimura, Takeshi

    2016-06-01

    The degree of stent/scaffold embedment could be a surrogate parameter of the vessel wall-stent/scaffold interaction and could have biological implications for the vascular response. We have developed new dedicated software for the quantitative evaluation of the embedment of struts by optical coherence tomography (OCT). In the present study, we describe the algorithm of the embedment analysis and its reproducibility. The degree of embedment was evaluated as the ratio of the embedded part to the whole strut height and subdivided into quartiles. The agreement and the inter- and intra-observer reproducibility were evaluated using the kappa and the intraclass correlation coefficient (ICC). A total of 4 pullbacks of OCT images in 4 randomly selected coronary lesions with 3.0 × 18 mm devices [2 lesions with Absorb BVS and 2 lesions with XIENCE (both from Abbott Vascular, Santa Clara, CA, USA)] from the Absorb Japan trial were evaluated by two investigators with QCU-CMS software version 4.69 (Leiden University Medical Center, Leiden, The Netherlands). In total, 1481 polymeric struts in 174 cross-sections and 1415 metallic struts in 161 cross-sections were analyzed. Inter- and intra-observer reproducibility of the quantitative measurements of embedment ratio and of the categorical assessment of embedment in Absorb BVS and XIENCE showed excellent agreement, with ICC ranging from 0.958 to 0.999 and kappa ranging from 0.850 to 0.980. The newly developed embedment software showed excellent reproducibility. Computer-assisted embedment analysis could be a feasible tool to assess strut penetration into the vessel wall, which could be a surrogate for the acute injury caused by device implantation. PMID:26898315
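
    A small sketch of the embedment metric defined above (the strut heights are hypothetical example values): the ratio of the embedded part to the whole strut height, binned into quartiles.

        def embedment_quartile(embedded_um, strut_height_um):
            """Return the embedment ratio and its quartile category (1-4)."""
            ratio = embedded_um / strut_height_um
            if ratio <= 0.25:
                quartile = 1
            elif ratio <= 0.50:
                quartile = 2
            elif ratio <= 0.75:
                quartile = 3
            else:
                quartile = 4
            return ratio, quartile

        # Hypothetical struts: (embedded part, total strut height) in micrometres
        for emb, h in [(40, 157), (90, 157), (60, 81), (81, 81)]:
            print(embedment_quartile(emb, h))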

  15. MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?

    SciTech Connect

    Giger, M; Petrick, N; Obuchowski, N; Kinahan, P

    2014-06-15

    The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.

  16. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  17. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.

  18. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed with federal support several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory, and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase that integrates genomic and other biological data including

  19. Quantitative Assessment of Regional Wall Motion Abnormalities Using Dual-Energy Digital Subtraction Intravenous Ventriculography

    NASA Astrophysics Data System (ADS)

    McCollough, Cynthia H.

    Healthy portions of the left ventricle (LV) can often compensate for regional dysfunction, thereby masking regional disease when global indices of LV function are employed. Thus, quantitation of regional function provides a more useful method of assessing LV function, especially in diseases that have regional effects such as coronary artery disease. This dissertation studied the ability of a phase-matched dual-energy digital subtraction angiography (DE-DSA) technique to quantitate changes in regional LV systolic volume. The potential benefits and a theoretical description of the DE imaging technique are detailed. A correlated noise reduction algorithm is also presented which raises the signal-to-noise ratio of DE images by a factor of 2-4. Ten open-chest dogs were instrumented with transmural ultrasonic crystals to assess regional LV function in terms of systolic normalized-wall-thickening rate (NWTR) and percent-systolic-thickening (PST). A pneumatic occluder was placed on the left-anterior-descending (LAD) coronary artery to temporarily reduce myocardial blood flow, thereby changing regional LV function in the LAD bed. DE-DSA intravenous left ventriculograms were obtained at control and four levels of graded myocardial ischemia, as determined by reductions in PST. Phase-matched images displaying changes in systolic contractile function were created by subtracting an end-systolic (ES) control image from ES images acquired at each level of myocardial ischemia. The resulting wall-motion difference signal (WMD), which represents a change in regional systolic volume between the control and ischemic states, was quantitated by videodensitometry and compared with changes in NWTR and PST. Regression analysis of 56 data points from 10 animals shows a linear relationship between WMD and both NWTR and PST: WMD = -2.46 NWTR + 13.9, r = 0.64, p < 0.001; WMD = -2.11 PST + 18.4, r = 0.54, p < 0.001. Thus, changes in regional ES LV volume between rest and ischemic states, as

  20. NPEC Sourcebook on Assessment: Definitions and Assessment Methods for Communication, Leadership, Information Literacy, Quantitative Reasoning, and Quantitative Skills. NPEC 2005-0832

    ERIC Educational Resources Information Center

    Jones, Elizabeth A.; RiCharde, Stephen

    2005-01-01

    Faculty, instructional staff, and assessment professionals are interested in student outcomes assessment processes and tools that can be used to improve learning experiences and academic programs. How can students' skills be assessed effectively? What assessments measure skills in communication? Leadership? Information literacy? Quantitative…

  1. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography

    NASA Astrophysics Data System (ADS)

    Mobberley, Sean David

    Accurate, cross-scanner assessment of in-vivo air density used to quantitatively assess amount and distribution of emphysema in COPD subjects has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how the quantitative measures of lung density compare between dual-source and single-source scan modes. This study has sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N=6), swine (N=13: more human-like rib cage shape), a lung phantom and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU, respectively. When using image data obtained in the SS mode, the air CT numbers demonstrated a consistent positive shift of up to 35 HU

  2. Black hole mimickers: Regular versus singular behavior

    SciTech Connect

    Lemos, Jose P. S.; Zaslavskii, Oleg B.

    2008-07-15

    Black hole mimickers are possible alternatives to black holes; they would look observationally almost like black holes but would have no horizon. The properties in the near-horizon region where gravity is strong can be quite different for both types of objects, but at infinity it could be difficult to discern black holes from their mimickers. To disentangle this possible confusion, we examine the near-horizon properties, and their connection with far away asymptotic properties, of some candidates for black hole mimickers. We study spherically symmetric uncharged or charged but nonextremal objects, as well as spherically symmetric charged extremal objects. Within the uncharged or charged but nonextremal black hole mimickers, we study nonextremal ε-wormholes on the threshold of the formation of an event horizon, of which a subclass are called black foils, and gravastars. Within the charged extremal black hole mimickers we study extremal ε-wormholes on the threshold of the formation of an event horizon, quasi-black holes, and wormholes on the basis of quasi-black holes from Bonnor stars. We elucidate whether or not the objects belonging to these two classes remain regular in the near-horizon limit. The requirement of full regularity, i.e., finite curvature and absence of naked behavior, up to an arbitrary neighborhood of the gravitational radius of the object enables one to rule out potential mimickers in most of the cases. A list ranking the black hole mimickers from best to worst, both nonextremal and extremal, is as follows: wormholes on the basis of extremal black holes or on the basis of quasi-black holes, quasi-black holes, wormholes on the basis of nonextremal black holes (black foils), and gravastars. Since in observational astrophysics it is difficult to find extremal configurations (the best mimickers in the ranking), whereas nonextremal configurations are really bad mimickers, the task of distinguishing black holes from their mimickers seems to

  3. Achalasia mimicking prepubertal anorexia nervosa.

    PubMed

    Richterich, Andreas; Brunner, Romuald; Resch, Franz

    2003-04-01

    A 9-year-old girl presents for continuing weight loss of 10 kg over the course of 1 year. Medical history showed three episodes of pneumonia requiring hospital admission in the 6 months before presentation and 4 months of weekly psychotherapy for anorexia nervosa. A thorough history of eating behavior and a review of systems revealed not only typical aspects of prepubertal anorexia nervosa but also vomiting at night while asleep, difficulty drinking liquids, epigastric pain, and a frequent experience of "a lump in the throat"; these symptoms were not suggestive of a diagnosis of anorexia nervosa but rather of esophageal achalasia. The patient was transferred to the Department of Pediatrics, and a diagnosis of esophageal achalasia was made by chest x-ray and barium swallow. After dilatation and botulinum toxin application, the patient regained weight easily and was discharged in stable condition. In this case, esophageal achalasia mimicked prepubertal anorexia nervosa.

  4. Non-destructive assessment of human ribs mechanical properties using quantitative ultrasound.

    PubMed

    Mitton, David; Minonzio, Jean-Gabriel; Talmant, Maryline; Ellouz, Rafaa; Rongieras, Frédéric; Laugier, Pascal; Bruyère-Garnier, Karine

    2014-04-11

    Advanced finite element models of the thorax have been developed to study, for example, the effects of car crashes. While there is a need for material properties to parameterize such models, specific properties are largely missing. Non-destructive techniques applicable in vivo would, therefore, be of interest to support further development of thorax models. The only non-destructive technique available today to derive rib bone properties would be based on quantitative computed tomography that measures bone mineral density. However, this approach is limited by the radiation dose. Bidirectional ultrasound axial transmission was developed on long bones ex vivo and used to assess in vivo health status of the radius. However, it is currently unknown if the ribs are good candidates for such a measurement. Therefore, the goal of this study is to evaluate the relationship between ex vivo ultrasonic measurements (axial transmission) and the mechanical properties of human ribs to determine if the mechanical properties of the ribs can be quantified non-destructively. The results show statistically significant relationships between the ultrasonic measurements and mechanical properties of the ribs. These results are promising with respect to a non-destructive and non-ionizing assessment of rib mechanical properties. This ex vivo study is a first step toward in vivo studies to derive subject-specific rib properties.

  5. Assessment and application of quantitative schlieren methods: Calibrated color schlieren and background oriented schlieren

    NASA Astrophysics Data System (ADS)

    Elsinga, G. E.; van Oudheusden, B. W.; Scarano, F.; Watt, D. W.

    Two quantitative schlieren methods are assessed and compared: calibrated color schlieren (CCS) and background oriented schlieren (BOS). Both methods are capable of measuring the light deflection angle in two spatial directions, and hence the projected density gradient vector field. Spatial integration using the conjugate gradient method returns the projected density field. To assess the performance of CCS and BOS, density measurements of a two-dimensional benchmark flow (a Prandtl-Meyer expansion fan) are compared with the theoretical density field and with the density inferred from PIV velocity measurements. The performance of each method is also evaluated a priori using a ray-tracing simulation of the experiment. The density measurements show good agreement with theory. Moreover, CCS and BOS return comparable results with respect to each other and with respect to the PIV measurements. BOS proves to be very sensitive to displacements of the wind tunnel during the experiment and requires a correction for it, making it necessary to apply extra boundary conditions in the integration procedure. Furthermore, spatial resolution can be a limiting factor for accurate measurements using BOS. CCS suffers from relatively high noise in the density gradient measurement due to camera noise and has a smaller dynamic range when compared to BOS. Finally, the application of the two schlieren methods to a separated wake flow is demonstrated. Flow features such as shear layers and expansion and recompression waves are measured with both methods.

  6. Disability adjusted life year (DALY): a useful tool for quantitative assessment of environmental pollution.

    PubMed

    Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan

    2015-04-01

    Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating global and/or regional burden of diseases. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator to quantify the health impact of environmental pollution related to disease burden. Based on literature reviews, this article aims to give an overview of the applicable methodologies and research directions for using DALY as a tool for quantitative assessment of environmental pollution. With an introduction of the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. Regarding environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which need careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. It can be seen from existing studies that DALY is advantageous over conventional environmental impact assessment for quantification and comparison of the risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation regarding varied pollutants under varied circumstances before DALY calculation.
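
    As a quick illustration of the DALY framework summarized above, the sketch below uses the standard decomposition DALY = YLL + YLD (years of life lost plus years lived with disability); the specific numbers are hypothetical and are not taken from the article.

```python
# Minimal sketch of the standard DALY calculation (DALY = YLL + YLD).
# All input values below are hypothetical and for illustration only.

def years_of_life_lost(deaths: float, life_expectancy_at_death: float) -> float:
    """YLL = number of deaths x standard life expectancy at age of death."""
    return deaths * life_expectancy_at_death

def years_lived_with_disability(cases: float, disability_weight: float,
                                duration_years: float) -> float:
    """YLD = incident cases x disability weight x average duration of illness."""
    return cases * disability_weight * duration_years

def daly(deaths, life_expectancy_at_death, cases, disability_weight, duration_years):
    return (years_of_life_lost(deaths, life_expectancy_at_death)
            + years_lived_with_disability(cases, disability_weight, duration_years))

# Example: a pollutant causing 10 attributable deaths (30 years of life lost each)
# and 500 non-fatal cases (disability weight 0.1, lasting 2 years on average).
print(daly(10, 30.0, 500, 0.1, 2.0))  # -> 400.0 DALYs
```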

  7. Quantitative microbial risk assessment of human illness from exposure to marine beach sand.

    PubMed

    Shibata, Tomoyuki; Solo-Gabriele, Helena M

    2012-03-01

    Currently no U.S. federal guideline is available for assessing risk of illness from sand at recreational sites. The objectives of this study were to compute a reference level guideline for pathogens in beach sand and to compare these reference levels with measurements from a beach impacted by nonpoint sources of contamination. Reference levels were computed using quantitative microbial risk assessment (QMRA) coupled with Monte Carlo simulations. In order to reach an equivalent level of risk of illness as set by the U.S. EPA for marine water exposure (1.9 × 10⁻²), levels would need to be at least about 10 oocysts/g (about 1 oocyst/g for a pica child) for Cryptosporidium, about 5 MPN/g (about 1 MPN/g for pica) for enterovirus, and less than 10⁶ CFU/g for S. aureus. Pathogen levels measured in sand at a nonpoint source recreational beach were lower than the reference levels. More research is needed in evaluating risk from yeast and helminth exposures as well as in identifying acceptable levels of risk for skin infections associated with sand exposures.
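
    The reference-level calculation described above can be sketched in simplified form by inverting a single-hit exponential dose-response model for the target risk. The dose-response parameter and the sand ingestion mass below are placeholders rather than the values used in the study, and the study itself propagated distributions with Monte Carlo simulation rather than using point values.

```python
# Hedged sketch: deriving a reference concentration (organisms per gram of sand)
# from a target risk using a single-hit exponential dose-response model,
# P_illness = 1 - exp(-r * dose).  The r value and the sand ingestion mass are
# hypothetical placeholders, not the parameters used in the study.
import math

TARGET_RISK = 1.9e-2      # U.S. EPA benchmark for marine water exposure (from the abstract)
r = 0.005                 # hypothetical dose-response parameter
sand_ingested_g = 0.05    # hypothetical mass of sand ingested per beach visit (g)

dose_at_target = -math.log(1.0 - TARGET_RISK) / r      # organisms per exposure event
reference_level = dose_at_target / sand_ingested_g     # organisms per gram of sand
print(f"reference level ~ {reference_level:.1f} organisms/g")
```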

  8. Quantitative assessment of the probability of bluetongue virus overwintering by horizontal transmission: application to Germany

    PubMed Central

    2011-01-01

    Even though bluetongue virus (BTV) transmission is apparently interrupted during winter, bluetongue outbreaks often reappear in the next season (overwintering). Several mechanisms for BTV overwintering have been proposed, but to date, their relative importance remains unclear. In order to assess the probability of BTV overwintering by persistence in adult vectors, ruminants (through prolonged viraemia) or a combination of both, a quantitative risk assessment model was developed. Furthermore, the model allowed the role played by the residual number of vectors present during winter to be examined, and the effect of a proportion of Culicoides living inside buildings (endophilic behaviour) to be explored. The model was then applied to a real scenario: overwintering in Germany between 2006 and 2007. The results showed that the limited number of vectors active during winter seemed to allow the transmission of BTV during this period, and that while transmission was favoured by the endophilic behaviour of some Culicoides, its effect was limited. Even though transmission was possible, the likelihood of BTV overwintering by the mechanisms studied seemed too low to explain the observed re-emergence of the disease. Therefore, other overwintering mechanisms not considered in the model are likely to have played a significant role in BTV overwintering in Germany between 2006 and 2007. PMID:21314966

  9. Development of a new quantitative gas permeability method for dental implant-abutment connection tightness assessment

    PubMed Central

    2011-01-01

    Background Most dental implant systems are presently made of two pieces: the implant itself and the abutment. The connection tightness between those two pieces is a key point to prevent bacterial proliferation, tissue inflammation and bone loss. The leak has been previously estimated by microbial, color tracer and endotoxin percolation. Methods A new nitrogen flow technique was developed for implant-abutment connection leakage measurement, adapted from a recent, sensitive, reproducible and quantitative method used to assess endodontic sealing. Results The results show very significant differences between various sealing and screwing conditions. The remaining flow was lower after key screwing compared to hand screwing (p = 0.03) and remained different from the negative test (p = 0.0004). The method reproducibility was very good, with a coefficient of variation of 1.29%. Conclusions Therefore, the presented new gas flow method appears to be a simple and robust method to compare different implant systems. It allows successive measures without disconnecting the abutment from the implant and should in particular be used to assess the behavior of the connection before and after mechanical stress. PMID:21492459

  10. Skeletal status assessed by quantitative ultrasound at the hand phalanges in karate training males.

    PubMed

    Drozdzowska, Bogna; Münzer, Ulrich; Adamczyk, Piotr; Pluskiewicz, Wojciech

    2011-02-01

    The aim of the study was to assess the influence of regularly exercised karate on the skeletal status. The study comprised a group of 226 males (the mean age: 25.64 ± 12.3 years, range 7-61 years), exercising for 61.9 ± 68.4 months, with the mean frequency of 3.12 ± 1.4 times per week, and 502 controls, matched for age and body size. The skeletal status was assessed by quantitative ultrasound, using a DBM Sonic 1200 (IGEA, Italy) sonographic device, which measures amplitude-dependent speed of sound (Ad-SoS [m/s]) at hand phalanges. Ad-SoS, T-score, Z-score were significantly higher in the examined karatekas than in controls. Up to age 18, there had been no difference between the study subjects and controls, while afterwards, up to age 35, the difference increased to stabilize again after age 35. Longer duration, higher frequency and earlier start of physical training positively influenced the skeletal status. In conclusion, karate is a sport with a positive influence on the skeletal status with the most significant benefits occurring in adults. PMID:21208731

  11. Large-Scale Quantitative Assessment of Binding Preferences in Protein-Nucleic Acid Complexes.

    PubMed

    Jakubec, Dávid; Hostas, Jirí; Laskowski, Roman A; Hobza, Pavel; Vondrásek, Jirí

    2015-04-14

    The growing number of high-quality experimental (X-ray, NMR) structures of protein–DNA complexes provides sufficient information to assess whether universal rules governing the DNA sequence recognition process apply. While previous studies have investigated the relative abundance of various modes of amino acid–base contacts (van der Waals contacts, hydrogen bonds), relatively little is known about the energetics of these noncovalent interactions. In the present study, we have performed the first large-scale quantitative assessment of binding preferences in protein–DNA complexes by calculating the interaction energies in all 80 possible amino acid–DNA base combinations. We found that several mutual amino acid–base orientations featuring bidentate hydrogen bonds capable of unambiguous one-to-one recognition correspond to unique minima in the potential energy space of the amino acid–base pairs. A clustering algorithm revealed that these contacts form a spatially well-defined group offering relatively little conformational freedom. Various molecular mechanics force field and DFT-D ab initio calculations were performed, yielding similar results. PMID:26894243

  12. Quantitative assessment of the differential impacts of arbuscular and ectomycorrhiza on soil carbon cycling.

    PubMed

    Soudzilovskaia, Nadejda A; van der Heijden, Marcel G A; Cornelissen, Johannes H C; Makarov, Mikhail I; Onipchenko, Vladimir G; Maslov, Mikhail N; Akhmetzhanova, Asem A; van Bodegom, Peter M

    2015-10-01

    A significant fraction of carbon stored in the Earth's soil moves through arbuscular mycorrhiza (AM) and ectomycorrhiza (EM). The impacts of AM and EM on the soil carbon budget are poorly understood. We propose a method to quantify the mycorrhizal contribution to carbon cycling, explicitly accounting for the abundance of plant-associated and extraradical mycorrhizal mycelium. We discuss the need to acquire additional data to use our method, and present our new global database holding information on plant species-by-site intensity of root colonization by mycorrhizas. We demonstrate that the degree of mycorrhizal fungal colonization has globally consistent patterns across plant species. This suggests that the level of plant species-specific root colonization can be used as a plant trait. To exemplify our method, we assessed the differential impacts of AM : EM ratio and EM shrub encroachment on carbon stocks in sub-arctic tundra. AM and EM affect tundra carbon stocks at different magnitudes, and via partly distinct dominant pathways: via extraradical mycelium (both EM and AM) and via mycorrhizal impacts on above- and belowground biomass carbon (mostly AM). Our method provides a powerful tool for the quantitative assessment of mycorrhizal impact on local and global carbon cycling processes, paving the way towards an improved understanding of the role of mycorrhizas in the Earth's carbon cycle.

  13. Quantitative microbial risk assessment of distributed drinking water using faecal indicator incidence and concentrations.

    PubMed

    van Lieverloo, J Hein M; Blokker, E J Mirjam; Medema, Gertjan

    2007-01-01

    Quantitative Microbial Risk Assessments (QMRA) have focused on drinking water system components upstream of distribution to customers, for nominal and event conditions. Yet some 15-33% of waterborne outbreaks are reported to be caused by contamination events in distribution systems. In the majority of these cases and probably in all non-outbreak contamination events, no pathogen concentration data was available. Faecal contamination events are usually detected or confirmed by the presence of E. coli or other faecal indicators, although the absence of this indicator is no guarantee of the absence of faecal pathogens. In this paper, the incidence and concentrations of various coliforms and sources of faecal contamination were used to estimate the possible concentrations of faecal pathogens and consequently the infection risks to consumers in event-affected areas. The results indicate that the infection risks may be very high, especially from Campylobacter and enteroviruses, but also that the uncertainties are very high. The high variability of pathogen to thermotolerant coliform ratios estimated in environmental samples severely limits the applicability of the approach described. Importantly, the highest ratios of enteroviruses to thermotolerant coliform were suggested from soil and shallow groundwaters, the most likely sources of faecal contamination that are detected in distribution systems. Epidemiological evaluations of non-outbreak faecal contamination of drinking water distribution systems and thorough tracking and characterisation of the contamination sources are necessary to assess the actual risks of these events.

  14. Use of coefficient of variation in assessing variability of quantitative assays.

    PubMed

    Reed, George F; Lynn, Freyja; Meade, Bruce D

    2002-11-01

    We have derived the mathematical relationship between the coefficient of variation associated with repeated measurements from quantitative assays and the expected fraction of pairs of those measurements that differ by at least some given factor, i.e., the expected frequency of disparate results that are due to assay variability rather than true differences. Knowledge of this frequency helps determine what magnitudes of differences can be expected by chance alone when the particular coefficient of variation is in effect. This frequency is an operational index of variability in the sense that it indicates the probability of observing a particular disparity between two measurements under the assumption that they measure the same quantity. Thus the frequency or probability becomes the basis for assessing if an assay is sufficiently precise. This assessment also provides a standard for determining if two assay results for the same subject, separated by an intervention such as vaccination or infection, differ by more than expected from the variation of the assay, thus indicating an intervention effect. Data from an international collaborative study are used to illustrate the application of this proposed interpretation of the coefficient of variation, and they also provide support for the assumptions used in the mathematical derivation.
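
    A minimal numerical illustration of this relationship is given below. It assumes log-normally distributed replicate measurements (an assumption made for this sketch; the paper's exact derivation may differ) and returns the expected fraction of measurement pairs differing by at least a factor k for a given coefficient of variation.

```python
# Illustrative sketch (assumes log-normally distributed replicates; the paper's
# exact derivation may differ): expected fraction of measurement pairs that
# differ by at least a factor k, given an assay coefficient of variation (CV).
import math
from scipy.stats import norm

def fraction_of_pairs_differing_by(k: float, cv: float) -> float:
    sigma_log = math.sqrt(math.log(1.0 + cv**2))      # SD of ln(X) for a lognormal with this CV
    sd_log_ratio = math.sqrt(2.0) * sigma_log         # SD of ln(X1/X2) for independent replicates
    return 2.0 * norm.sf(math.log(k) / sd_log_ratio)  # P(|ln ratio| >= ln k)

# Example: with a 30% CV, roughly 10% of replicate pairs differ by 2-fold or more.
print(f"{fraction_of_pairs_differing_by(2.0, 0.30):.3f}")
```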

  15. Use of Coefficient of Variation in Assessing Variability of Quantitative Assays

    PubMed Central

    Reed, George F.; Lynn, Freyja; Meade, Bruce D.

    2002-01-01

    We have derived the mathematical relationship between the coefficient of variation associated with repeated measurements from quantitative assays and the expected fraction of pairs of those measurements that differ by at least some given factor, i.e., the expected frequency of disparate results that are due to assay variability rather than true differences. Knowledge of this frequency helps determine what magnitudes of differences can be expected by chance alone when the particular coefficient of variation is in effect. This frequency is an operational index of variability in the sense that it indicates the probability of observing a particular disparity between two measurements under the assumption that they measure the same quantity. Thus the frequency or probability becomes the basis for assessing if an assay is sufficiently precise. This assessment also provides a standard for determining if two assay results for the same subject, separated by an intervention such as vaccination or infection, differ by more than expected from the variation of the assay, thus indicating an intervention effect. Data from an international collaborative study are used to illustrate the application of this proposed interpretation of the coefficient of variation, and they also provide support for the assumptions used in the mathematical derivation. PMID:12414755

  16. Quantitative structure-activity relationships and ecological risk assessment: an overview of predictive aquatic toxicology research.

    PubMed

    Bradbury, S P

    1995-09-01

    In the field of aquatic toxicology, quantitative structure-activity relationships (QSARs) have developed as scientifically credible tools for predicting the toxicity of chemicals when little or no empirical data are available. A fundamental understanding of toxicological principles has been considered an important component to the acceptance and application of QSAR approaches as biologically relevant in ecological risk assessments. As a consequence, there has been an evolution of QSAR development and application from that of a chemical-class perspective to one that is more consistent with assumptions regarding modes of toxic action. In this review, techniques to assess modes of toxic action from chemical structure are discussed, with consideration that toxicodynamic knowledge bases must be clearly defined with regard to exposure regimes, biological models/endpoints and compounds that adequately span the diversity of chemicals anticipated for future applications. With such knowledge bases, classification systems, including rule-based expert systems, have been established for use in predictive aquatic toxicology applications. The establishment of QSAR techniques that are based on an understanding of toxic mechanisms is needed to provide a link to physiologically based toxicokinetic and toxicodynamic models, which can provide the means to extrapolate adverse effects across species and exposure regimes. PMID:7570660

  17. Application of quantitative uncertainty analysis for human health risk assessment at Rocky Flats

    SciTech Connect

    Duncan, F.L.W.; Gordon, J.W.; Smith, D.; Singh, S.P.

    1993-01-01

    The characterization of uncertainty is an important component of the risk assessment process. According to the U.S. Environmental Protection Agency's (EPA's) "Guidance on Risk Characterization for Risk Managers and Risk Assessors," point estimates of risk "do not fully convey the range of information considered and used in developing the assessment." Furthermore, the guidance states that the Monte Carlo simulation may be used to estimate descriptive risk percentiles. To provide information about the uncertainties associated with the reasonable maximum exposure (RME) estimate and the relation of the RME to other percentiles of the risk distribution for Operable Unit 1 (OU-1) at Rocky Flats, uncertainties were identified and quantitatively evaluated. Monte Carlo simulation is a technique that can be used to provide a probability function of estimated risk using random values of exposure factors and toxicity values in an exposure scenario. The Monte Carlo simulation involves assigning a joint probability distribution to the input variables (i.e., exposure factors) of an exposure scenario. Next, a large number of independent samples from the assigned joint distribution are taken and the corresponding outputs calculated. Methods of statistical inference are used to estimate, from the output sample, some parameters of the output distribution, such as percentiles and the expected value.
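
    A minimal sketch of the Monte Carlo procedure described above follows. The exposure-factor distributions, slope factor, and intake-equation parameters are hypothetical stand-ins, not the Operable Unit 1 inputs.

```python
# Hedged sketch of the Monte Carlo approach described above: sample exposure
# factors from assumed (hypothetical) distributions, propagate them through a
# generic intake-times-slope-factor risk equation, and report percentiles of
# the resulting risk distribution.  None of the numbers below are site data.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

conc   = rng.lognormal(mean=np.log(1.0), sigma=0.5, size=n)   # soil concentration (mg/kg)
intake = rng.triangular(50, 100, 200, size=n)                 # soil ingestion rate (mg/day)
ef     = rng.triangular(200, 250, 350, size=n)                # exposure frequency (days/yr)
ed     = rng.triangular(9, 25, 30, size=n)                    # exposure duration (yr)
bw     = rng.normal(70, 10, size=n).clip(40, 110)             # body weight (kg)
at     = 70 * 365.0                                           # averaging time (days)
sf     = 1.5e-3                                               # slope factor (per mg/kg-day), hypothetical

cdi  = conc * intake * 1e-6 * ef * ed / (bw * at)             # chronic daily intake (mg/kg-day)
risk = cdi * sf

for p in (50, 90, 95, 99):
    print(f"{p}th percentile risk: {np.percentile(risk, p):.2e}")
```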

  18. Quantitative ultrasound criteria for risk stratification in clinical practice: a comparative assessment.

    PubMed

    Noale, Marianna; Maggi, Stefania; Gonnelli, Stefano; Limongi, Federica; Zanoni, Silvia; Zambon, Sabina; Rozzini, Renzo; Crepaldi, Gaetano

    2012-07-01

    This study aimed to compare two different classifications of the risk of fracture/osteoporosis (OP) based on quantitative ultrasound (QUS). Analyses were based on data from the Epidemiological Study on the Prevalence of Osteoporosis, a cross-sectional study conducted in 2000 to assess the risk of OP in a representative sample of the Italian population. Subjects were classified into 5 groups considering the cross-classification found in previous studies; logistic regression models were defined separately for women and men to study the fracture risk attributable to groups defined by the cross-classification, adjusting for traditional risk factors. Eight thousand six hundred eighty-one subjects were considered in the analyses. Logistic regression models revealed that the two classifications seem able to identify a common core of individuals at low and at high risk of fracture, and underlined the importance of a multidimensional assessment in older patients to evaluate clinical risk factors together with a simple, inexpensive, radiation-free device such as QUS.

  19. Swept source optical coherence tomography for quantitative and qualitative assessment of dental composite restorations

    NASA Astrophysics Data System (ADS)

    Sadr, Alireza; Shimada, Yasushi; Mayoral, Juan Ricardo; Hariri, Ilnaz; Bakhsh, Turki A.; Sumi, Yasunori; Tagami, Junji

    2011-03-01

    The aim of this work was to explore the utility of swept-source optical coherence tomography (SS-OCT) for quantitative evaluation of dental composite restorations. The system (Santec, Japan) with a center wavelength of around 1300 nm and axial resolution of 12 μm was used to record data during and after placement of light-cured composites. The Fresnel phenomenon at the interfacial defects resulted in brighter areas indicating gaps as small as a few micrometers. The gap extension at the interface was quantified and compared to the observation by confocal laser scanning microscope after trimming the specimen to the same cross-section. Also, video imaging of the composite during polymerization could provide information about real-time kinetics of contraction stress and resulting gaps, distinguishing them from those gaps resulting from poor adaptation of composite to the cavity prior to polymerization. Some samples were also subjected to a high resolution microfocus X-ray computed tomography (μCT) assessment; it was found that differentiation of smaller gaps from the radiolucent bonding layer was difficult with 3D μCT. Finally, a clinical imaging example using a newly developed dental SS-OCT system with an intra-oral scanning probe (Panasonic Healthcare, Japan) is presented. SS-OCT is a unique tool for clinical assessment and laboratory research on resin-based dental restorations. Supported by GCOE at TMDU and NCGG.

  20. Electroencephalographic Data Analysis With Visibility Graph Technique for Quantitative Assessment of Brain Dysfunction.

    PubMed

    Bhaduri, Susmita; Ghosh, Dipak

    2015-07-01

    Usual techniques for electroencephalographic (EEG) data analysis lack some of the properties essential for quantitative assessment of the progression of human brain dysfunction. EEG data are essentially nonlinear, and this nonlinear time series has been identified as multi-fractal in nature. We need rigorous techniques for such analysis. In this article, we present the visibility graph as the latest, rigorous technique that can assess the degree of multifractality accurately and reliably. Moreover, it has also been found that this technique can give reliable results with test data of comparatively short length. In this work, the visibility graph algorithm has been used to map a time series (EEG signals) to a graph in order to study the complexity and fractality of the time series. The power of scale-freeness of the visibility graph has been used as an effective method for measuring fractality in the EEG signal. The scale-freeness of the visibility graph has also been observed after averaging the statistically independent samples of the signal. Scale-freeness of the visibility graph has been calculated for 5 sets of EEG data patterns varying from normal eyes-closed to epileptic. The change in the values is analyzed further, and it has been observed that it decreases uniformly from normal eyes-closed to epileptic.
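
    The mapping referred to above can be illustrated with a short sketch of the natural visibility graph construction; the synthetic signal and the crude log-log degree-distribution fit below are illustrative assumptions, not the authors' EEG data or their exact scale-freeness estimator.

```python
# Minimal sketch of the natural visibility graph construction: time points
# become nodes, and nodes i < j are linked if every intermediate sample lies
# strictly below the straight line joining (i, y_i) and (j, y_j).  The slope
# estimate at the end is only a rough log-log fit, to illustrate how a
# "power of scale-freeness" could be quantified.
import numpy as np

def visibility_graph(y):
    y = np.asarray(y, dtype=float)
    n = len(y)
    edges = set()
    for i in range(n - 1):
        for j in range(i + 1, n):
            # visibility criterion: y_k < y_j + (y_i - y_j) * (j - k) / (j - i) for all i < k < j
            visible = all(y[k] < y[j] + (y[i] - y[j]) * (j - k) / (j - i)
                          for k in range(i + 1, j))
            if visible:
                edges.add((i, j))
    return edges

def degree_sequence(edges, n):
    deg = np.zeros(n, dtype=int)
    for i, j in edges:
        deg[i] += 1
        deg[j] += 1
    return deg

# Example with a synthetic signal standing in for an EEG epoch.
rng = np.random.default_rng(1)
signal = rng.normal(size=512)
edges = visibility_graph(signal)
deg = degree_sequence(edges, len(signal))

# Crude power-law exponent from a log-log fit of the degree histogram.
values, counts = np.unique(deg[deg > 0], return_counts=True)
slope, _ = np.polyfit(np.log(values), np.log(counts), 1)
print(f"edges: {len(edges)}, estimated degree-distribution slope: {slope:.2f}")
```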

  1. Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment

    PubMed Central

    David, Simon; Visvikis, Dimitris; Roux, Christian; Hatt, Mathieu

    2011-01-01

    In Positron Emission Tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumour volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis for merging several PET acquisitions to assess tumour metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods, proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets, the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion and on the clinical datasets, it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation of the evaluation leading to more pertinent measurements. Future work will consist in extending the methodology and applying it to clinical multi-tracers datasets in order to evaluate its potential impact on the biological tumour volume definition for radiotherapy applications. PMID:21846937

  2. Estimation of undiscovered deposits in quantitative mineral resource assessments-examples from Venezuela and Puerto Rico

    USGS Publications Warehouse

    Cox, D.P.

    1993-01-01

    Quantitative mineral resource assessments used by the United States Geological Survey are based on deposit models. These assessments consist of three parts: (1) selecting appropriate deposit models and delineating on maps areas permissive for each type of deposit; (2) constructing a grade-tonnage model for each deposit model; and (3) estimating the number of undiscovered deposits of each type. In this article, I focus on the estimation of undiscovered deposits using two methods: the deposit density method and the target counting method. In the deposit density method, estimates are made by analogy with well-explored areas that are geologically similar to the study area and that contain a known density of deposits per unit area. The deposit density method is useful for regions where there is little or no data. This method was used to estimate undiscovered low-sulfide gold-quartz vein deposits in Venezuela. Estimates can also be made by counting targets such as mineral occurrences, geophysical or geochemical anomalies, or exploration "plays" and by assigning to each target a probability that it represents an undiscovered deposit that is a member of the grade-tonnage distribution. This method is useful in areas where detailed geological, geophysical, geochemical, and mineral occurrence data exist. Using this method, porphyry copper-gold deposits were estimated in Puerto Rico. © 1993 Oxford University Press.
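
    The two estimation approaches described above are simple enough to sketch directly; the deposit counts, areas, and target probabilities in the example are invented for illustration.

```python
# Hedged sketch of the two estimation approaches described above; all numbers
# are hypothetical and for illustration only.

def deposit_density_estimate(control_deposits: int, control_area_km2: float,
                             study_area_km2: float) -> float:
    """Deposit-density method: scale the deposit density of a well-explored,
    geologically analogous control region to the permissive study area."""
    density = control_deposits / control_area_km2
    return density * study_area_km2

def target_counting_estimate(target_probabilities) -> float:
    """Target-counting method: sum, over identified targets (occurrences,
    anomalies, exploration plays), the probability that each one represents
    an undiscovered deposit belonging to the grade-tonnage distribution."""
    return sum(target_probabilities)

print(deposit_density_estimate(12, 30_000.0, 7_500.0))   # -> 3.0 expected deposits
print(target_counting_estimate([0.5, 0.3, 0.2, 0.1]))    # -> 1.1 expected deposits
```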

  3. Skeletal status assessed by quantitative ultrasound at the hand phalanges in karate training males.

    PubMed

    Drozdzowska, Bogna; Münzer, Ulrich; Adamczyk, Piotr; Pluskiewicz, Wojciech

    2011-02-01

    The aim of the study was to assess the influence of regularly exercised karate on the skeletal status. The study comprised a group of 226 males (the mean age: 25.64 ± 12.3 years, range 7-61 years), exercising for 61.9 ± 68.4 months, with the mean frequency of 3.12 ± 1.4 times per week, and 502 controls, matched for age and body size. The skeletal status was assessed by quantitative ultrasound, using a DBM Sonic 1200 (IGEA, Italy) sonographic device, which measures amplitude-dependent speed of sound (Ad-SoS [m/s]) at hand phalanges. Ad-SoS, T-score, Z-score were significantly higher in the examined karatekas than in controls. Up to age 18, there had been no difference between the study subjects and controls, while afterwards, up to age 35, the difference increased to stabilize again after age 35. Longer duration, higher frequency and earlier start of physical training positively influenced the skeletal status. In conclusion, karate is a sport with a positive influence on the skeletal status with the most significant benefits occurring in adults.

  4. Quantitative risk assessment for the induction of allergic contact dermatitis: uncertainty factors for mucosal exposures.

    PubMed

    Farage, Miranda A; Bjerke, Donald L; Mahony, Catherine; Blackburn, Karen L; Gerberick, G Frank

    2003-09-01

    The quantitative risk assessment (QRA) paradigm has been extended to evaluating the risk of induction of allergic contact dermatitis from consumer products. Sensitization QRA compares product-related, topical exposures to a safe benchmark, the sensitization reference dose. The latter is based on an experimentally or clinically determined 'no observable adverse effect level' (NOAEL) and further refined by incorporating 'sensitization uncertainty factors' (SUFs) that address variables not adequately reflected in the data from which the threshold NOAEL was derived. A critical area of uncertainty for the risk assessment of oral care or feminine hygiene products is the extrapolation from skin to mucosal exposures. Most sensitization data are derived from skin contact, but the permeability of vulvovaginal and oral mucosae is greater than that of keratinized skin. Consequently, the QRA for some personal products that are exposed to mucosal tissue may require the use of more conservative SUFs. This article reviews the scientific basis for SUFs applied to topical exposure to vulvovaginal and oral mucosae. We propose a 20-fold range in the default uncertainty factor used in the contact sensitization QRA when extrapolating from data derived from the skin to situations involving exposure to non-keratinized mucosal tissue.
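
    The arithmetic of the sensitization QRA described above can be sketched as follows. The NOAEL, the individual uncertainty-factor values, and the consumer exposure level are hypothetical; only the up-to-20-fold mucosal factor range is taken from the article.

```python
# Hedged sketch of the sensitization QRA arithmetic described above: an
# acceptable exposure level is the sensitization NOAEL divided by the product
# of the sensitization uncertainty factors (SUFs), and a product passes if its
# consumer exposure level does not exceed it.  All numeric values, and the
# particular SUF categories shown, are illustrative assumptions.

def acceptable_exposure_level(noael_ug_cm2: float, suf_interindividual: float,
                              suf_matrix: float, suf_use_pattern: float) -> float:
    total_suf = suf_interindividual * suf_matrix * suf_use_pattern
    return noael_ug_cm2 / total_suf

# Keratinized-skin scenario vs. a mucosal scenario in which the use-pattern /
# tissue factor is raised (the article proposes up to a 20-fold default range).
skin_ael    = acceptable_exposure_level(1000.0, 10.0, 3.0, 1.0)   # µg/cm2
mucosal_ael = acceptable_exposure_level(1000.0, 10.0, 3.0, 20.0)  # µg/cm2
consumer_exposure = 5.0                                           # µg/cm2 per use, hypothetical

print(f"skin AEL: {skin_ael:.1f}, mucosal AEL: {mucosal_ael:.2f}")
print("acceptable for mucosal use?", consumer_exposure <= mucosal_ael)
```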

  5. The quantitative assessment of the pre- and postoperative craniosynostosis using the methods of image analysis.

    PubMed

    Fabijańska, Anna; Węgliński, Tomasz

    2015-12-01

    This paper considers the problem of CT-based quantitative assessment of craniosynostosis before and after surgery. First, a fast and efficient brain segmentation approach is proposed. The algorithm is robust to discontinuities of the skull. As a result, it can be applied in both pre- and post-operative cases. Additionally, image processing and analysis algorithms are proposed for describing the disease based on CT scans. The proposed algorithms automate the determination of the standard linear indices used for assessment of craniosynostosis (i.e., the cephalic index CI and head circumference HC) and allow for planar and volumetric analyses, which so far have not been reported. Results of applying the introduced methods to sample craniosynostotic cases before and after surgery are presented and discussed. The results show that the proposed brain segmentation algorithm is characterized by high accuracy when applied in both pre- and postoperative craniosynostosis, while the introduced planar and volumetric indices for describing the disease may help to distinguish between the types of the disease.
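
    For reference, the cephalic index mentioned above is the ratio of maximal head width to maximal head length expressed as a percentage; a minimal sketch with hypothetical measurements follows.

```python
# Minimal sketch of the standard cephalic index computation mentioned above
# (CI = maximal biparietal width / maximal occipitofrontal length x 100);
# in practice the measurements would come from the segmented CT volume.

def cephalic_index(max_width_mm: float, max_length_mm: float) -> float:
    return 100.0 * max_width_mm / max_length_mm

# Example: a markedly elongated (scaphocephalic-type) head shape.
print(f"CI = {cephalic_index(120.0, 190.0):.1f}")   # -> CI = 63.2
```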

  6. [Multi-component quantitative analysis combined with chromatographic fingerprint for quality assessment of Onosma hookeri].

    PubMed

    Aga, Er-bu; Nie, Li-juan; Dongzhi, Zhuo-ma; Wang, Ju-le

    2015-11-01

    A method for the simultaneous determination of shikonin, acetylshikonin and β,β'-dimethylpropene shikonin in Onosma hookeri, together with a chromatographic fingerprint, was established by HPLC-DAD on an Agilent Zorbax SB column with gradient elution of acetonitrile and water at 0.8 mL·min⁻¹ and 30 °C. The quality assessment was conducted by comparing the content differences of the three naphthoquinone constituents, in combination with chromatographic fingerprint analysis and systems cluster analysis, among 7 batches of radix O. hookeri. The contents of the three naphthoquinone constituents varied widely across the 7 batches. The similarity values of the fingerprints were above 0.99 for samples 5, 6 and 7, above 0.97 for samples 2 and 3, above 0.90 for samples 3 and 4, and above 0.8 for the other samples, which was consistent with the contents of the three naphthoquinone constituents. The 7 samples were roughly divided into 4 categories. These results indicate that the sourcing of this medicine is complex and its quality rather uneven. The established HPLC fingerprints and the quantitative analysis method can be used efficiently for quality assessment of O. hookeri.

  7. Three-Dimensional Quantitative Validation of Breast Magnetic Resonance Imaging Background Parenchymal Enhancement Assessments.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-01-01

    Magnetic resonance imaging (MRI) background parenchymal enhancement (BPE) and its clinical significance as a biomarker of breast cancer risk have been proposed based on qualitative studies. Previous BPE quantification studies lack appropriate correlation with BPE qualitative assessments. The purpose of this study is to validate our three-dimensional BPE quantification method with standardized BPE qualitative cases. An Institutional Review Board-approved study reviewed 500 consecutive magnetic resonance imaging cases (from January 2013 to December 2014) using strict inclusion criteria, and 120 cases that best represented each of the BPE qualitative categories (minimal, mild, moderate, or marked) were selected. Blinded to the qualitative data, fibroglandular tissue contours of precontrast and postcontrast images were delineated using an in-house, proprietary segmentation algorithm. Metrics of BPE were calculated, including %BPE ([ratio of BPE volume to fibroglandular tissue volume] × 100), at multiple threshold levels to determine the optimal cutoff point for BPE quantification that best correlated with the reference BPE qualitative cases. The highest positive correlation was present at the ×1.5 precontrast average signal intensity threshold level (r = 0.84, P < 0.001). At this level, the BPE qualitative assessments of minimal, mild, moderate, and marked correlated with mean quantitative %BPE values of 14.1% (95% CI: 10.9-17.2), 26.1% (95% CI: 22.8-29.3), 45.9% (95% CI: 40.2-51.7), and 74.0% (95% CI: 68.6-79.5), respectively. A one-way analysis of variance with post-hoc analysis showed that at the ×1.5 precontrast average signal intensity level, the quantitative %BPE measurements best differentiated the four reference BPE qualitative groups (F [3,117] = 106.8, P < 0.001). Our three-dimensional BPE quantification methodology was validated using the reference BPE qualitative cases and could become an invaluable clinical tool to more accurately assess breast cancer risk and to
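
    One plausible reading of the %BPE metric described above is sketched below: the fraction of fibroglandular-tissue voxels whose post-contrast signal exceeds 1.5 times the mean pre-contrast signal. The authors' exact voxel-level definition may differ, and the synthetic volumes are purely illustrative.

```python
# Hedged sketch of a %BPE-style metric: the fraction of fibroglandular-tissue
# (FGT) voxels whose post-contrast signal exceeds a threshold set at 1.5x the
# average pre-contrast FGT signal.  This is one plausible reading of the
# abstract, not necessarily the authors' exact implementation.
import numpy as np

def percent_bpe(pre: np.ndarray, post: np.ndarray, fgt_mask: np.ndarray,
                factor: float = 1.5) -> float:
    pre_mean = pre[fgt_mask].mean()                  # average pre-contrast FGT signal
    enhancing = (post > factor * pre_mean) & fgt_mask
    return 100.0 * enhancing.sum() / fgt_mask.sum()  # BPE volume / FGT volume x 100

# Toy example with synthetic volumes.
rng = np.random.default_rng(0)
pre  = rng.normal(100, 10, size=(32, 64, 64))
post = pre * rng.uniform(1.0, 2.0, size=pre.shape)   # simulated enhancement
mask = np.ones(pre.shape, dtype=bool)
print(f"%BPE = {percent_bpe(pre, post, mask):.1f}")
```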

  8. Quantitative assessment of human and pet exposure to Salmonella associated with dry pet foods.

    PubMed

    Lambertini, Elisabetta; Buchanan, Robert L; Narrod, Clare; Ford, Randall M; Baker, Robert C; Pradhan, Abani K

    2016-01-01

    Recent Salmonella outbreaks associated with dry pet foods and treats highlight the importance of these foods as previously overlooked exposure vehicles for both pets and humans. In the last decade, efforts have been made to raise the safety of this class of products, for instance by upgrading production equipment, cleaning protocols, and finished product testing. However, no comprehensive or quantitative risk profile is available for pet foods, thus limiting the ability to establish safety standards and assess the effectiveness of current and proposed Salmonella control measures. This study sought to develop an ingredients-to-consumer quantitative microbial exposure assessment model to: 1) estimate pet and human exposure to Salmonella via dry pet food, and 2) assess the impact of industry and household-level mitigation strategies on exposure. Data on prevalence and concentration of Salmonella in pet food ingredients, production process parameters, bacterial ecology, and contact transfer in the household were obtained through literature review, industry data, and targeted research. A probabilistic Monte Carlo modeling framework was developed to simulate the production process and basic household exposure routes. Under the range of assumptions adopted in this model, human exposure due to handling pet food is null to minimal if contamination occurs exclusively before extrusion. Exposure increases considerably if recontamination occurs post-extrusion during coating with fat, although mean ingested doses remain modest even at high fat contamination levels, due to the low percentage of fat in the finished product. Exposure is highly variable, with the distribution of doses ingested by adult pet owners spanning 3 log CFU per exposure event. Child exposure due to ingestion of 1 g of pet food leads to significantly higher doses than adult doses associated with handling the food. Recontamination after extrusion and coating, e.g., via dust or equipment surfaces, may also lead to

  9. Groundwater availability in the United States: the value of quantitative regional assessments

    USGS Publications Warehouse

    Dennehy, Kevin F.; Reilly, Thomas E.; Cunningham, William L.

    2015-01-01

    The sustainability of water resources is under continued threat from the challenges associated with a growing population, competing demands, and a changing climate. Freshwater scarcity has become a fact in many areas. Much of the United States surface-water supplies are fully apportioned for use; thus, in some areas the only potential alternative freshwater source that can provide needed quantities is groundwater. Although frequently overlooked, groundwater serves as the principal reserve of freshwater in the US and represents much of the potential supply during periods of drought. Some nations have requirements to monitor and characterize the availability of groundwater such as the European Union’s Water Framework Directive (EPCEU 2000). In the US there is no such national requirement. Quantitative regional groundwater availability assessments, however, are essential to document the status and trends of groundwater availability for the US and make informed water-resource decisions possible now and in the future. Barthel (2014) highlighted that the value of regional groundwater assessments goes well beyond just quantifying the resource so that it can be better managed. The tools and techniques required to evaluate these unique regional systems advance the science of hydrogeology and provide enhanced methods that can benefit local-scale groundwater investigations. In addition, a significant, yet under-utilized benefit is the digital spatial and temporal data sets routinely generated as part of these studies. Even though there is no legal or regulatory requirement for regional groundwater assessments in the US, there is a logical basis for their implementation. The purpose of this essay is to articulate the rationale for and reaffirm the value of regional groundwater assessments primarily in the US; however, the arguments hold for all nations. The importance of the data sets and the methods and model development that occur as part of these assessments is stressed

  10. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
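
    The diagnostic-performance figures reported above derive from a standard 2 x 2 confusion matrix; a minimal sketch follows, with made-up counts that only respect the 241/460 prevalence quoted in the abstract.

```python
# Minimal sketch of the diagnostic-performance metrics reported above,
# computed from a 2x2 confusion matrix (the counts are hypothetical).

def diagnostic_performance(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
    }

# Example: 460 joints, 241 with painful TMD (prevalence as in the abstract),
# with invented test counts for illustration.
print(diagnostic_performance(tp=151, fp=88, fn=90, tn=131))
```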

  11. Quantitative MRI of the spinal cord and brain in adrenomyeloneuropathy: in vivo assessment of structural changes.

    PubMed

    Castellano, Antonella; Papinutto, Nico; Cadioli, Marcello; Brugnara, Gianluca; Iadanza, Antonella; Scigliuolo, Graziana; Pareyson, Davide; Uziel, Graziella; Köhler, Wolfgang; Aubourg, Patrick; Falini, Andrea; Henry, Roland G; Politi, Letterio S; Salsano, Ettore

    2016-06-01

    Adrenomyeloneuropathy is the late-onset form of X-linked adrenoleukodystrophy, and is considered the most frequent metabolic hereditary spastic paraplegia. In adrenomyeloneuropathy the spinal cord is the main site of pathology. In contrast to quantitative magnetic resonance imaging of the brain, little is known about the feasibility and utility of advanced neuroimaging in quantifying spinal cord abnormalities in hereditary diseases. Moreover, little is known about the subtle pathological changes that can characterize the brain of adrenomyeloneuropathy subjects in the early stages of the disease. We performed a cross-sectional study on 13 patients with adrenomyeloneuropathy and 12 age-matched healthy control subjects who underwent quantitative magnetic resonance imaging to assess the structural changes of the upper spinal cord and brain. Total cord areas from the C2-3 to T2-3 level were measured, and diffusion tensor imaging metrics, i.e. fractional anisotropy and mean, axial and radial diffusivity values, were calculated in both the grey and white matter of the spinal cord. In the brain, grey matter regions were parcellated with Freesurfer, and average volume and thickness, as well as mean diffusivity and fractional anisotropy from co-registered diffusion maps, were calculated in each region. Brain white matter diffusion tensor imaging metrics were assessed using whole-brain tract-based spatial statistics and tractography-based analysis of the corticospinal tracts. Correlations among clinical, structural and diffusion tensor imaging measures were calculated. In patients, total cord area was reduced by 26.3% to 40.2% at all tested levels (P < 0.0001). A mean 16% reduction of spinal cord white matter fractional anisotropy (P ≤ 0.0003), with a concomitant 9.7% axial diffusivity reduction (P < 0.009) and 34.5% radial diffusivity increase (P < 0.009), was observed, suggesting the co-presence of axonal degeneration and demyelination. Brain tract-based spatial statistics showed a marked reduction
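
    The diffusion tensor metrics listed above follow standard definitions from the tensor eigenvalues; a minimal sketch with typical (not study-specific) values is given below.

```python
# Minimal sketch of the standard diffusion tensor metrics referred to above,
# computed from the tensor eigenvalues (sorted so that l1 >= l2 >= l3).
import numpy as np

def dti_metrics(l1: float, l2: float, l3: float) -> dict:
    lam = np.array([l1, l2, l3])
    md = lam.mean()                                                 # mean diffusivity
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))  # fractional anisotropy
    return {
        "MD": md,
        "AD": l1,                                                   # axial diffusivity
        "RD": (l2 + l3) / 2.0,                                      # radial diffusivity
        "FA": fa,
    }

# Typical healthy white-matter values, in units of 10^-3 mm^2/s.
print(dti_metrics(1.6, 0.4, 0.3))
```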

  12. Quantitative assessment of breast lesion viscoelasticity: initial clinical results using supersonic shear imaging.

    PubMed

    Tanter, Mickael; Bercoff, Jeremy; Athanasiou, Alexandra; Deffieux, Thomas; Gennisson, Jean-Luc; Montaldo, Gabriel; Muller, Marie; Tardivon, Anne; Fink, Mathias

    2008-09-01

    This paper presents an initial clinical evaluation of in vivo elastography for breast lesion imaging using the concept of supersonic shear imaging. This technique is based on the combination of a radiation force induced in tissue by an ultrasonic beam and an ultrafast imaging sequence capable of catching in real time the propagation of the resulting shear waves. The local shear wave velocity is recovered using a time-of-flight technique and enables the 2-D mapping of shear elasticity. This imaging modality is implemented on a conventional linear probe driven by a dedicated ultrafast echographic device. Consequently, it can be performed during a standard echographic examination. The clinical investigation was performed on 15 patients, which corresponded to 15 lesions (4 cases BI-RADS 3, 7 cases BI-RADS 4 and 4 cases BI-RADS 5). The ability of the supersonic shear imaging technique to provide a quantitative and local estimation of the shear modulus of abnormalities with a millimetric resolution is illustrated on several malignant (invasive ductal and lobular carcinoma) and benign cases (fibrocystic changes and viscous cysts). In the investigated cases, malignant lesions were found to be significantly different from benign solid lesions with respect to their elasticity values. Cystic lesions showed no shear wave propagation at all within the lesion (because shear waves do not propagate in liquids). These preliminary clinical results directly demonstrate the clinical feasibility of this new elastography technique in providing quantitative assessment of the relative stiffness of breast tissues. This technique of evaluating tissue elasticity gives valuable information that is complementary to the B-mode morphologic information. More extensive studies are necessary to validate the assumption that this new mode potentially helps the physician in both false-positive and false-negative rejection.
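
    The quantitative elasticity values discussed above rest on the standard relation between shear-wave speed and shear modulus; a minimal sketch follows, using the near-incompressibility approximation E ≈ 3μ and example wave speeds that are illustrative rather than taken from the study.

```python
# Minimal sketch of the relation underlying shear-wave elastography: the local
# shear modulus follows from the measured shear-wave speed as mu = rho * c^2,
# and Young's modulus E ~ 3*mu for nearly incompressible soft tissue.

def shear_modulus_kpa(shear_wave_speed_m_s: float, density_kg_m3: float = 1000.0) -> float:
    return density_kg_m3 * shear_wave_speed_m_s ** 2 / 1000.0   # Pa -> kPa

def youngs_modulus_kpa(shear_wave_speed_m_s: float) -> float:
    return 3.0 * shear_modulus_kpa(shear_wave_speed_m_s)

# Example: a shear-wave speed of 2 m/s (soft tissue) vs. 5 m/s (stiffer tissue).
for c in (2.0, 5.0):
    print(f"c = {c} m/s -> E ~ {youngs_modulus_kpa(c):.0f} kPa")
```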

  13. Quantitative assessment of inhalation exposure and deposited dose of aerosol from nanotechnology-based consumer sprays†

    PubMed Central

    Nazarenko, Yevgen; Lioy, Paul J.; Mainelis, Gediminas

    2015-01-01

    This study provides a quantitative assessment of inhalation exposure and deposited aerosol dose in the 14 nm to 20 μm particle size range based on the aerosol measurements conducted during realistic usage simulation of five nanotechnology-based and five regular spray products matching the nano-products by purpose of application. The products were also examined using transmission electron microscopy. In seven out of ten sprays, the highest inhalation exposure was observed for the coarse (2.5–10 μm) particles while being minimal or below the detection limit for the remaining three sprays. Nanosized aerosol particles (14–100 nm) were released, which resulted in low but measurable inhalation exposures from all of the investigated consumer sprays. Eight out of ten products produced high total deposited aerosol doses on the order of 10¹–10³ ng kg⁻¹ bw per application, ~85–88% of which were in the head airways, only <10% in the alveolar region and <8% in the tracheobronchial region. One nano and one regular spray produced substantially lower total deposited doses (by 2–4 orders of magnitude), only ~52–64% of which were in the head while ~29–40% were in the alveolar region. The electron microscopy data showed nanosized objects in some products not labeled as nanotechnology-based and conversely did not find nano-objects in some nano-sprays. We found no correlation between nano-object presence and abundance as per the electron microscopy data and the determined inhalation exposures and deposited doses. The findings of this study and the reported quantitative exposure data will be valuable for the manufacturers of nanotechnology-based consumer sprays to minimize inhalation exposure from their products, as well as for the regulators focusing on protecting the public health. PMID:25621175

  14. Multimodal Quantitative Phase Imaging with Digital Holographic Microscopy Accurately Assesses Intestinal Inflammation and Epithelial Wound Healing.

    PubMed

    Lenz, Philipp; Brückner, Markus; Ketelhut, Steffi; Heidemann, Jan; Kemper, Björn; Bettenworth, Dominik

    2016-01-01

    The incidence of inflammatory bowel disease (IBD), i.e., Crohn's disease and ulcerative colitis, has significantly increased over the last decade. The etiology of IBD remains unknown and current therapeutic strategies are based on unspecific suppression of the immune system. The development of treatments that specifically target intestinal inflammation and epithelial wound healing could significantly improve the management of IBD; however, this requires accurate detection of inflammatory changes. Currently, potential drug candidates are usually evaluated using animal models in vivo or with cell culture based techniques in vitro. Histological examination usually requires the cells or tissues of interest to be stained, which may alter the sample characteristics; furthermore, the interpretation of findings can vary with investigator expertise. Digital holographic microscopy (DHM), based on the detection of optical path length delay, allows stain-free quantitative phase contrast imaging, so that results can be directly correlated with absolute biophysical parameters. We demonstrate how measurement of changes in tissue density with DHM, based on refractive index measurement, can quantify inflammatory alterations, without staining, in different layers of colonic tissue specimens from mice and humans with colitis. Additionally, we demonstrate that continuous multimodal label-free monitoring of epithelial wound healing in vitro is possible with DHM through simple automated determination of the wounded area and simultaneous determination of morphological parameters such as dry mass and layer thickness of migrating cells. In conclusion, DHM represents a valuable, novel and quantitative tool for the assessment of intestinal inflammation with absolute parameter values and simplified quantification of epithelial wound healing in vitro, and therefore has high potential for translational diagnostic use. PMID:27685659
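
    Because DHM dry-mass quantification follows a standard relation, a brief sketch may help: the dry-mass surface density is phi*lambda/(2*pi*alpha), with alpha the specific refractive increment (a commonly cited value is about 0.19 µm³/pg). The wavelength, alpha, pixel size and phase image below are assumptions for illustration, not the authors' settings.

    ```python
    # Hedged sketch: converting an unwrapped DHM phase image into cell dry mass.
    import numpy as np

    WAVELENGTH_UM = 0.532     # illumination wavelength (assumed)
    ALPHA_UM3_PER_PG = 0.19   # specific refractive increment (assumed literature value)
    PIXEL_AREA_UM2 = 0.25     # pixel footprint in the object plane (assumed)

    def dry_mass_pg(phase_rad):
        """Total dry mass (pg) over the segmented area from unwrapped phase (radians)."""
        density = phase_rad * WAVELENGTH_UM / (2.0 * np.pi * ALPHA_UM3_PER_PG)  # pg/um^2
        return float(np.sum(density) * PIXEL_AREA_UM2)

    # Toy phase image: a flat cell with 1 rad phase delay over a 40x40-pixel footprint.
    phase = np.zeros((128, 128))
    phase[40:80, 40:80] = 1.0
    print(f"dry mass ~ {dry_mass_pg(phase):.1f} pg")
    ```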

  15. Delayed villous maturation of the placenta: quantitative assessment in different cohorts.

    PubMed

    Treacy, Ann; Higgins, Mary; Kearney, John M; McAuliffe, Fionnuala; Mooney, Eoghan E

    2013-01-01

    Placental villous maturation is maximal in the 3rd trimester, with an abundance of terminal villi. Delayed villous maturation (DVM) of the placenta is associated with chromosomal abnormalities, gestational diabetes, and an adverse outcome. This study compares quantitative assessment of vasculo-syncytial membranes (VSM) in cases of liveborn infants, perinatal deaths, and controls. Cases were selected as follows: (1) liveborn infants with a qualitative diagnosis of DVM (n = 15); (2) controls matched for gestational age whose placentas did not have DVM (n = 15); (3) stillbirths (SB)/neonatal deaths (NND) showing DVM (n = 13); and (4) SB from autopsies in which DVM was felt to be the cause of death (COD) (n = 12). Vasculo-syncytial membranes were counted in 10 terminal villi in each of 10 consecutive high-power fields on 3 slides. Data analysis was carried out using SPSS. Liveborn cases with DVM showed statistically significantly less VSM than controls (mean 1.01 vs 2.42, P < 0.0001). The SB/NND group also showed significantly less VSM than the control group (mean 0.46 vs 2.42, P < 0.0001) and less than the liveborn DVM group (mean 0.46 vs 1.01, P = 0.001). The COD group was significantly different from the control group (mean 0.42 vs 2.42, P < 0.0001) and the liveborn DVM group (mean 0.42 vs 1.01, P < 0.0001) but not significantly different from the SB/NND group. There is a quantitative reduction in VSM in cases of DVM compared to controls.

  16. Magnetic resonance imaging-based semiquantitative and quantitative assessment in osteoarthritis.

    PubMed

    Roemer, Frank W; Eckstein, Felix; Guermazi, Ali

    2009-08-01

    Whole organ magnetic resonance imaging (MRI)-based semiquantitative (SQ) assessment of knee osteoarthritis (OA), based on reliable scoring methods and expert reading, has become a powerful research tool in OA. SQ morphologic scoring has been applied to large observational cross-sectional and longitudinal epidemiologic studies as well as interventional clinical trials. SQ whole organ scoring analyzes all joint structures that are potentially relevant as surrogate outcome measures of OA and potential disease modification, including cartilage, subchondral bone, osteophytes, intra- and periarticular ligaments, menisci, synovial lining, cysts, and bursae. Resources needed for SQ scoring rely on the MRI protocol, image quality, experience of the expert readers, method of documentation, and the individual scoring system that will be applied. The first part of this article discusses the different available OA whole organ scoring systems, focusing on MRI of the knee, and also reviews alternative approaches. Rheumatologists are made aware of artifacts and differential diagnoses when applying any of the SQ scoring systems. The second part focuses on quantitative approaches in OA, particularly measurement of (subregional) cartilage loss. This approach allows one to determine minute changes that occur relatively homogeneously across cartilage structures and that are not apparent to the naked eye. To this end, the cartilage surfaces need to be segmented by trained users using specialized software. Measurements of knee cartilage loss based on water-excitation spoiled gradient recalled echo acquisition in the steady state, fast low-angle shot, or double-echo steady-state imaging sequences reported a 1% to 2% decrease in cartilage thickness annually, and a high degree of spatial heterogeneity of cartilage thickness changes in femorotibial subregions between subjects. Risk factors identified by quantitative measurement technology included a high body mass index, meniscal extrusion

  17. Qualitative and Quantitative Assessment of Hepatitis A Virus in Wastewaters in Tunisia.

    PubMed

    Béji-Hamza, A; Khélifi-Gharbi, H; Hassine-Zaafrane, M; Della Libera, S; Iaconelli, M; Muscillo, M; Petricca, S; Ciccaglione, A R; Bruni, R; Taffon, S; Equestre, M; Aouni, M; La Rosa, G

    2014-12-01

    Hepatitis A causes substantial morbidity in both industrialized and non-industrialized countries and represents an important health problem in several southern Mediterranean countries. The objectives of the study were as follows: (a) to assess the occurrence of hepatitis A virus (HAV) in Tunisia through the monitoring of urban wastewaters collected at wastewater treatment plants (WTPs); (b) to characterize environmental strains; and (c) to estimate the viral load in raw and treated sewage, in order to evaluate the potential impact on the surface waters receiving the discharges. A total of 150 raw and treated wastewater samples were collected from three WTPs and analyzed by both qualitative (RT-PCR/nested) and quantitative (qRT-PCR) methods. Of these, 100 (66%) were found to be positive for HAV by the qualitative assay: 68.3% of influents and 64.7% of effluents. The vast majority of HAV sequences belonged to sub-genotype IA; 11 different strains were detected that were identical to clinical strains isolated from Tunisian patients with acute hepatitis. Five unique variants were also detected that had not previously been reported in clinical cases. Only two IB strains were found, confirming the rarity of this sub-genotype in this country. The results of the present study indicate a wide circulation of the pathogen in the population, most probably in the form of asymptomatic infections, a finding consistent with the classification of the country as having intermediate/high endemicity. Quantitative data showed high viral loads in influents (3.5E+05 genome copies/liter, mean value) as well as effluents (2.5E+05 genome copies/liter, mean value), suggesting that contaminated water could be a critical element in transmission.
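
    The qRT-PCR quantification step reduces to a standard-curve calculation; the sketch below shows the generic form (Ct regressed on log10 copies, then back-calculation to genome copies per litre). The dilution series, Ct values and processing volumes are invented for illustration, not the assay's actual parameters.

    ```python
    # Hedged sketch: standard-curve quantification of viral genome copies per litre.
    import numpy as np

    # Standard curve: 10-fold dilutions of a quantified standard (copies/reaction vs Ct).
    std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
    std_ct     = np.array([18.1, 21.5, 24.9, 28.4, 31.8])

    slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    print(f"slope {slope:.2f}, PCR efficiency {efficiency:.1%}")

    def copies_per_litre(ct, template_ul=5.0, eluate_ul=100.0, sample_ml=40.0):
        """Back-calculate genome copies/L from a sample Ct (assumed processing volumes)."""
        copies_rxn = 10 ** ((ct - intercept) / slope)
        return copies_rxn * (eluate_ul / template_ul) * (1000.0 / sample_ml)

    print(f"Ct 26.0 -> {copies_per_litre(26.0):.2e} genome copies/L")
    ```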

  18. A Platform for Rapid, Quantitative Assessment of Multiple Drug Combinations Simultaneously in Solid Tumors In Vivo

    PubMed Central

    Grenley, Marc O.; Casalini, Joseph R.; Tretyak, Ilona; Ditzler, Sally H.; Thirstrup, Derek J.; Frazier, Jason P.; Pierce, Daniel W.; Carleton, Michael; Klinghoffer, Richard A.

    2016-01-01

    While advances in high-throughput screening have resulted in an increased ability to identify synergistic anti-cancer drug combinations, validation of drug synergy in the in vivo setting and prioritization of combinations for clinical development remain low-throughput and resource intensive. Furthermore, there is currently no viable method for prospectively assessing drug synergy directly in human patients in order to potentially tailor therapies. To address these issues we have employed the previously described CIVO platform and developed a quantitative approach for investigating multiple combination hypotheses simultaneously in single living tumors. This platform provides a rapid, quantitative and cost-effective approach to compare and prioritize drug combinations based on evidence of synergistic tumor cell killing in the live tumor context. Using a gemcitabine-resistant model of pancreatic cancer, we efficiently investigated nine rationally selected Abraxane-based combinations employing only 19 xenografted mice. Among the drugs tested, the BCL2/BCLxL inhibitor ABT-263 was identified as the one agent that synergized with Abraxane® to enhance acute induction of localized apoptosis in this model of human pancreatic cancer. Importantly, results obtained with CIVO accurately predicted the outcome of systemic dosing studies in the same model, where superior tumor regression induced by the Abraxane/ABT-263 combination was observed compared to that induced by either single agent. This supports expanded use of CIVO as a platform for expedited in vivo drug combination validation and sets the stage for performing toxicity-sparing drug combination studies directly in cancer patients with solid malignancies. PMID:27359113

  19. Quick, non-invasive and quantitative assessment of small fiber neuropathy in patients receiving chemotherapy.

    PubMed

    Saad, Mehdi; Psimaras, Dimitri; Tafani, Camille; Sallansonnet-Froment, Magali; Calvet, Jean-Henri; Vilier, Alice; Tigaud, Jean-Marie; Bompaire, Flavie; Lebouteux, Marie; de Greslan, Thierry; Ceccaldi, Bernard; Poirier, Jean-Michel; Ferrand, François-Régis; Le Moulec, Sylvestre; Huillard, Olivier; Goldwasser, François; Taillia, Hervé; Maisonobe, Thierry; Ricard, Damien

    2016-04-01

    Chemotherapy-induced peripheral neurotoxicity (CIPN) is a common, potentially severe and dose-limiting adverse effect; however, it is poorly investigated at an early stage due to the lack of a simple assessment tool. As sweat glands are innervated by small autonomic C-fibers, sudomotor function testing has been suggested for early screening of peripheral neuropathy. This study aimed to evaluate Sudoscan, a non-invasive and quantitative method to assess sudomotor function, in the detection and follow-up of CIPN. Eighty-eight patients receiving at least two infusions of Oxaliplatin only (45.4%), Paclitaxel only (14.8%), another drug only (28.4%) or two drugs (11.4%) were enrolled in the study. At each chemotherapy infusion the accumulated dose of chemotherapy was calculated and the Total Neuropathy Score clinical version (TNSc) was assessed. Small fiber neuropathy was assessed using Sudoscan (a 3-min test). The device measures the Electrochemical Skin Conductance (ESC) of the hands and feet expressed in microSiemens (µS). For patients receiving Oxaliplatin, mean hand ESC changed from 73 ± 2 to 63 ± 2 µS and feet ESC from 77 ± 2 to 66 ± 3 µS (p < 0.001), while TNSc changed from 2.9 ± 0.5 to 4.3 ± 0.4. Similar results were observed in patients receiving Paclitaxel or another neurotoxic chemotherapy. During the follow-up, ESC values of hands and feet with a corresponding TNSc < 2 were 70 ± 2 and 73 ± 2 µS respectively, while they were 59 ± 1.4 and 64 ± 1.5 µS with a corresponding TNSc ≥ 6 (p < 0.0001 and p = 0.0003 respectively). This preliminary study suggests that small fiber neuropathy could be screened and followed using Sudoscan in patients receiving chemotherapy. PMID:26749101

  20. [High resolution peripheral quantitative computed tomography for the assessment of morphological and mechanical bone parameters].

    PubMed

    Fuller, Henrique; Fuller, Ricardo; Pereira, Rosa Maria R

    2015-01-01

    High resolution peripheral quantitative computed tomography (HR-pQCT) is a new technology, commercially available for less than 10 years, that allows in vivo assessment of bone parameters. HR-pQCT assesses trabecular thickness, trabecular separation, trabecular number and connectivity density and, in addition, cortical bone density and thickness and total bone volume and density in high-definition mode, which additionally allows digital constructs of bone microarchitecture to be obtained. Applying finite element analysis (FEA) to the captured data allows the physical properties of the tissue to be estimated by simulating supported loads in a non-invasive way. Thus, HR-pQCT simultaneously acquires data previously provided separately by dual energy x-ray absorptiometry (DXA), magnetic resonance imaging and histomorphometry, adding biomechanical estimates previously only possible in extracted tissues. The method has satisfactory reproducibility, with coefficients of variation rarely exceeding 3%. Regarding accuracy, the method shows fair to good agreement (r² = 0.37–0.97). The main clinical application of this method is in the quantification and monitoring of metabolic bone disorders, more fully evaluating bone strength and fracture risk. In rheumatoid arthritis patients, it allows gauging the number and size of erosions and cysts, in addition to the joint space. In osteoarthritis, it is possible to characterize the bone marrow edema-like areas that show a correlation with cartilage breakdown. Given its high cost, HR-pQCT is still a research tool, but the high resolution and efficiency of this method reveal advantages over the methods currently used for bone assessment, with the potential to become an important tool in clinical practice.

  1. Quantitative 13C NMR of whole and fractionated Iowa Mollisols for assessment of organic matter composition

    NASA Astrophysics Data System (ADS)

    Fang, Xiaowen; Chua, Teresita; Schmidt-Rohr, Klaus; Thompson, Michael L.

    2010-01-01

    Both the concentrations and the stocks of soil organic carbon vary across the landscape. Do the amounts of recalcitrant components of soil organic matter (SOM) vary with landscape position? To address this question, we studied four Mollisols in central Iowa, two developed in till and two developed in loess. Two of the soils were well drained and two were poorly drained. We collected surface-horizon samples and studied organic matter in the particulate organic matter (POM) fraction, the clay fractions, and the whole, unfractionated samples. We treated the soil samples with 5 M HF at ambient temperature or at 60 °C for 30 min to concentrate the SOM. To assess the composition of the SOM, we used solid-state nuclear magnetic resonance (NMR) spectroscopy, in particular, quantitative 13C DP/MAS (direct-polarization/magic-angle spinning), with and without recoupled dipolar dephasing. Spin counting by correlation of the integral NMR intensity with the C concentration by elemental analysis showed that NMR was ⩾85% quantitative for the majority of the samples studied. For untreated whole-soil samples with <2.5 wt.% C, which is considerably less than in most previous quantitative NMR analyses of SOM, useful spectra that reflected ⩾65% of all C were obtained. The NMR analyses allowed us to conclude (1) that the HF treatment (with or without heat) had low impact on the organic C composition in the samples, except for protonating carboxylate anions to carboxylic acids, (2) that most organic C was observable by NMR even in untreated soil materials, (3) that esters were likely to compose only a minor fraction of SOM in these Mollisols, and (4) that the aromatic components of SOM were enriched to ˜53% in the poorly drained soils, compared with ˜48% in the well drained soils; in plant tissue and particulate organic matter (POM) the aromaticities were ˜18% and ˜32%, respectively. Nonpolar, nonprotonated aromatic C, interpreted as a proxy for charcoal C, dominated the

  2. Establishment and assessment of two methods for quantitative detection of serum duck hepatitis B virus DNA

    PubMed Central

    Chen, Ya-Xi; Huang, Ai-Long; Qi, Zhen-Yuan; Guo, Shu-Hua

    2004-01-01

    AIM: To establish and assess methods for quantitative detection of serum duck hepatitis B virus (DHBV) DNA by quantitative membrane hybridization using a DHBV DNA probe labeled directly with alkaline phosphatase and by fluorescence quantitative PCR (qPCR). METHODS: Probes of DHBV DNA labeled directly with alkaline phosphatase and the chemiluminescent substrate CDP-star were used in this assay. DHBV DNA was detected by autoradiography, and then scanned by DNA dot-blot. In addition, three primers derived from the DHBV DNA S gene were designed. The semi-nested primer was labeled by AmpliSensor. A standard curve of the positive standards of DHBV DNA was established after asymmetric preamplification, semi-nested amplification and on-line detection. Results from 100 samples detected separately with the alkaline phosphatase direct-labeled DHBV DNA probe in dot-blot hybridization and with the digoxigenin-labeled DHBV DNA probe hybridization were compared. Seventy samples of duck serum were tested by fluorescent qPCR and by the digoxigenin-labeled DHBV DNA probe dot-blot hybridization assay, and the correlation of the results was analysed. RESULTS: Sensitivity of the alkaline phosphatase direct-labeled DHBV DNA probe was 10 pg. The coincidence was 100% compared with the digoxigenin-labeled DHBV DNA probe assay. After 30 cycles, amplification products showed two bands of about 180 bp and 70 bp by 20 g/L agarose gel electrophoresis. The concentration of amplification products was in direct proportion to the initial concentration of positive standards. The detection index was in direct proportion to the quantity of amplification products accumulated in the current cycle. The initial concentration of positive standards was in inverse proportion to the number of cycles needed for sufficient quantities of amplification products. The correlation coefficient between fluorescent qPCR and dot-blot hybridization results was 0.97 (P < 0.01). CONCLUSION: Alkaline phosphatase direct-labeled DHBV DNA probe in dot-blot hybridization and fluorescent q

  3. Quantitative Assessment of Fat Infiltration in the Rotator Cuff Muscles using water-fat MRI

    PubMed Central

    Nardo, Lorenzo; Karampinos, Dimitrios C.; Lansdown, Drew A.; Carballido-Gamio, Julio; Lee, Sonia; Maroldi, Roberto; Ma, C. Benjamin; Link, Thomas M.; Krug, Roland

    2013-01-01

    Purpose To evaluate a chemical shift-based fat quantification technique in the rotator cuff muscles in comparison with the semi-quantitative Goutallier fat infiltration classification (GC) and to assess their relationship with clinical parameters. Materials and Methods The shoulders of 57 patients were imaged using a 3T MR scanner. The rotator cuff muscles were assessed for fat infiltration using GC by two radiologists and an orthopedic surgeon. Sequences included oblique-sagittal T1-, T2- and proton density-weighted fast spin echo, and six-echo gradient echo. The iterative decomposition of water and fat with echo asymmetry and least-squares estimation (IDEAL) was used to measure fat fraction. Pain and range of motion of the shoulder were recorded. Results Fat fraction values were significantly correlated with GC grades (p < 0.0001, kappa > 0.9), showing a consistent increase with GC grade (grade 0, 0%–5.59%; grade 1, 1.1%–9.70%; grade 2, 6.44%–14.86%; grade 3, 15.25%–17.77%; grade 4, 19.85%–29.63%). A significant correlation between fat infiltration of the subscapularis muscle quantified with IDEAL versus (a) deficit in internal rotation (Spearman rank correlation coefficient = 0.39, 95% CI 0.13–0.60, p < 0.01) and (b) pain (Spearman rank correlation coefficient = 0.313, 95% CI 0.049–0.536, p = 0.02) was found but was not seen between the clinical parameters and GC grades. Additionally, only quantitative fat infiltration measures of the supraspinatus muscle were significantly correlated with a deficit in abduction (Spearman rank correlation coefficient = 0.45, 95% CI 0.20–0.60, p < 0.01). Conclusion We concluded that an accurate and highly reproducible fat quantification in the rotator cuff muscles using water-fat MRI techniques is possible and significantly correlates with shoulder pain and range of motion. PMID:24115490
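
    As a rough illustration of the quantitative side of such a comparison, the sketch below computes a voxel-wise fat fraction from water/fat separated images (FF = F/(W+F)) and correlates ROI means with Goutallier grades using a Spearman test; all data are synthetic and the analysis is not the authors' exact pipeline.

    ```python
    # Hedged sketch: fat fraction map plus Spearman correlation with Goutallier grades.
    import numpy as np
    from scipy.stats import spearmanr

    def fat_fraction(water, fat, eps=1e-9):
        """Voxel-wise fat fraction (%) from water/fat separated magnitude images."""
        return 100.0 * fat / (water + fat + eps)

    # Synthetic ROI means for 12 hypothetical muscles and their Goutallier grades.
    rng = np.random.default_rng(0)
    goutallier = np.array([0, 0, 1, 1, 1, 2, 2, 3, 3, 3, 4, 4])
    roi_ff = np.array([2, 4, 6, 8, 7, 10, 13, 16, 15, 17, 22, 27]) + rng.normal(0, 1, 12)

    rho, p = spearmanr(goutallier, roi_ff)
    print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
    ```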

  4. Bone tumor mimickers: A pictorial essay

    PubMed Central

    Mhuircheartaigh, Jennifer Ni; Lin, Yu-Ching; Wu, Jim S

    2014-01-01

    Focal lesions in bone are very common and many of these lesions are not bone tumors. These bone tumor mimickers can include numerous normal anatomic variants and non-neoplastic processes. Many of these tumor mimickers can be left alone, while others can be due to a significant disease process. It is important for the radiologist and clinician to be aware of these bone tumor mimickers and understand the characteristic features which allow discrimination between them and true neoplasms in order to avoid unnecessary additional workup. Knowing which lesions to leave alone or which ones require workup can prevent misdiagnosis and reduce patient anxiety. PMID:25114385

  5. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    PubMed

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-01

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk involved in the ingestion of a pathogen. It applies mathematical models combined with an accurate exploitation of data sets, represented by distributions and - in the case of two-dimensional Monte Carlo simulations - their hyperparameters. This research aims to highlight the background information, assumptions and truncations of a two-dimensional QMRA and advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, while it is essential to ensure reliable and realistic simulations and interpretations. As a case study, we consider the occurrence of listeriosis in smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to estimate the most relevant factors of the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is within the actual reported result obtained for the same period and for the same population. Variability in the final risk estimate is determined by the variability in (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability in the initial contamination level of L. monocytogenes tends to appear as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included, the impact of the variability in the initial contamination level of L. monocytogenes disappears. Uncertainty determinants of the final risk indicated the need of gathering more information on the reference growth rate and the minimum
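
    A minimal sketch of the two-dimensional (nested) Monte Carlo structure described above: an outer loop samples uncertain hyperparameters, an inner loop samples variable consumer/product factors, and an exponential dose-response model converts dose to risk. All distributions and parameter values are placeholders, not those of the Belgian smoked-fish assessment.

    ```python
    # Hedged sketch of a nested Monte Carlo QMRA with an exponential dose-response model.
    import numpy as np

    rng = np.random.default_rng(1)
    N_UNCERTAINTY, N_VARIABILITY = 200, 5000

    def inner_loop(r_dr, mu_growth_ref):
        """One variability simulation for fixed (uncertain) hyperparameters."""
        c0 = 10 ** rng.normal(-1.0, 1.0, N_VARIABILITY)             # initial conc., CFU/g (variable)
        temp = np.clip(rng.normal(7.0, 3.0, N_VARIABILITY), 0, 15)  # fridge temperature, deg C
        days = rng.uniform(1, 14, N_VARIABILITY)                    # storage time
        portion = rng.triangular(20, 50, 150, N_VARIABILITY)        # portion size, g
        growth = mu_growth_ref * np.maximum(temp - 1.0, 0.0) * days # log10 increase (toy model)
        dose = portion * c0 * 10 ** np.minimum(growth, 8.0)
        risk = 1.0 - np.exp(-r_dr * dose)                           # exponential dose-response
        return risk.mean()

    mean_risks = []
    for _ in range(N_UNCERTAINTY):
        r_dr = 10 ** rng.normal(-13.0, 0.5)        # uncertain dose-response parameter
        mu_ref = rng.normal(0.03, 0.005)           # uncertain reference growth rate
        mean_risks.append(inner_loop(r_dr, mu_ref))

    lo, med, hi = np.percentile(mean_risks, [2.5, 50, 97.5])
    print(f"mean risk per serving: median {med:.2e} (95% uncertainty interval {lo:.2e}-{hi:.2e})")
    ```

    A Spearman-based sensitivity ranking of the kind mentioned in the abstract could then be obtained by correlating the sampled inner-loop inputs with the per-iteration risks (for example with scipy.stats.spearmanr).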

  7. Assessment of the healing process in distal radius fractures by high resolution peripheral quantitative computed tomography.

    PubMed

    de Jong, Joost J A; Willems, Paul C; Arts, Jacobus J; Bours, Sandrine G P; Brink, Peter R G; van Geel, Tineke A C M; Poeze, Martijn; Geusens, Piet P; van Rietbergen, Bert; van den Bergh, Joop P W

    2014-07-01

    In clinical practice, fracture healing is evaluated by clinical judgment in combination with conventional radiography. Due to limited resolution, radiographs do not provide detailed information regarding the bone micro-architecture and bone strength. Recently, assessment of in vivo bone density and of architectural and mechanical properties at the microscale became possible using high resolution peripheral quantitative computed tomography (HR-pQCT) in combination with micro finite element analysis (μFEA). So far, such techniques have been used mainly to study intact bone. The aim of this study was to explore whether these techniques can also be used to assess changes in bone density, micro-architecture and bone stiffness during fracture healing. Therefore, the fracture region in eighteen women, aged 50 years or older with a stable distal radius fracture, was scanned using HR-pQCT at 1-2 (baseline), 3-4, 6-8 and 12 weeks post-fracture. At 1-2 and 12 weeks post-fracture the distal radius at the contra-lateral side was also scanned as a control. Standard bone density, micro-architectural and geometric parameters were calculated and bone stiffness in compression, torsion and bending was assessed using μFEA. A linear mixed-effects model with time post-fracture as a fixed effect was used to detect significant (p-value ≤ 0.05) changes from baseline. Wrist pain and function were scored using the patient-rated wrist evaluation (PRWE) questionnaire. Correlations between the bone parameters and the PRWE score were calculated by Spearman's correlation coefficient. At the fracture site, total and trabecular bone density increased by 11% and 20%, respectively, at 6-8 weeks, whereas cortical density decreased by 4%. Trabecular thickness increased by 23-31% at 6-8 and 12 weeks and the intertrabecular area became blurred, indicating intertrabecular bone formation. Compared to baseline, calculated bone stiffness in compression, torsion and bending was increased by 31% after 12 weeks. A
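
    The longitudinal analysis described (fixed effect of time post-fracture, repeated measures per patient) maps naturally onto a linear mixed-effects model; the sketch below shows one way to set this up with statsmodels on synthetic data. The column names, the random-intercept structure and the simulated trajectories are assumptions for illustration, not the authors' exact model.

    ```python
    # Hedged sketch: random-intercept mixed model for a longitudinal HR-pQCT parameter.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    weeks = [1, 3, 6, 12]
    rows = []
    for patient in range(18):
        baseline = rng.normal(300, 30)                 # e.g. trabecular BMD, mg HA/cm^3
        for i, w in enumerate(weeks):
            rows.append({"patient": patient, "week": w,
                         "tb_bmd": baseline * (1 + 0.05 * i) + rng.normal(0, 5)})
    df = pd.DataFrame(rows)

    # Fixed effect: visit (categorical, baseline = week 1); random intercept per patient.
    model = smf.mixedlm("tb_bmd ~ C(week)", df, groups=df["patient"]).fit()
    print(model.summary())
    ```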

  8. Quantitative assessment of post-disaster housing recovery: a case study of Punta Gorda, Florida, after Hurricane Charley.

    PubMed

    Rathfon, Dana; Davidson, Rachel; Bevington, John; Vicini, Alessandro; Hill, Arleen

    2013-04-01

    Quantitative assessment of post-disaster housing recovery is critical to enhancing understanding of the process and improving the decisions that shape it. Nevertheless, few comprehensive empirical evaluations of post-disaster housing recovery have been conducted, and no standard measurement methods exist. This paper presents a quantitative assessment of housing recovery in Punta Gorda, Florida, United States, following Hurricane Charley of August 2004, including an overview of the phases of housing recovery, progression of recovery over time, alternative trajectories of recovery, differential recovery, incorporation of mitigation, and effect on property sales. The assessment is grounded in a conceptual framework that considers the recovery of both people and place, and that emphasises recovery as a process, not as an endpoint. Several data sources, including building permits, remotely sensed imagery, and property appraiser data, are integrated into the assessment, and their strengths and limitations are discussed with a view to developing a standardised method for measuring and monitoring housing recovery.

  9. Linking quantitative microbial risk assessment and epidemiological data: informing safe drinking water trials in developing countries.

    PubMed

    Enger, Kyle S; Nelson, Kara L; Clasen, Thomas; Rose, Joan B; Eisenberg, Joseph N S

    2012-05-01

    Intervention trials are used extensively to assess household water treatment (HWT) device efficacy against diarrheal disease in developing countries. Using these data for policy, however, requires addressing issues of generalizability (relevance of one trial in other contexts) and systematic bias associated with design and conduct of a study. To illustrate how quantitative microbial risk assessment (QMRA) can address water safety and health issues, we analyzed a published randomized controlled trial (RCT) of the LifeStraw Family Filter in the Congo. The model accounted for bias due to (1) incomplete compliance with filtration, (2) unexpected antimicrobial activity by the placebo device, and (3) incomplete recall of diarrheal disease. Effectiveness was measured using the longitudinal prevalence ratio (LPR) of reported diarrhea. The Congo RCT observed an LPR of 0.84 (95% CI: 0.61, 1.14). Our model predicted LPRs, assuming a perfect placebo, ranging from 0.50 (2.5-97.5 percentile: 0.33, 0.77) to 0.86 (2.5-97.5 percentile: 0.68, 1.09) for high (but not perfect) and low (but not zero) compliance, respectively. The calibration step provided estimates of the concentrations of three pathogen types (modeled as diarrheagenic E. coli, Giardia, and rotavirus) in drinking water, consistent with the longitudinal prevalence of reported diarrhea measured in the trial, and constrained by epidemiological data from the trial. Use of a QMRA model demonstrated the importance of compliance in HWT efficacy, the need for pathogen data from source waters, the effect of quantifying biases associated with epidemiological data, and the usefulness of generalizing the effectiveness of HWT trials to other contexts.
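
    The effectiveness measure used above, the longitudinal prevalence ratio, is simple to compute from visit-level data; the sketch below shows the calculation with a household-level bootstrap interval on synthetic trial data (sample sizes and prevalences are invented, not the Congo trial's).

    ```python
    # Hedged sketch: longitudinal prevalence ratio (LPR) with a bootstrap interval.
    import numpy as np

    rng = np.random.default_rng(3)

    def longitudinal_prevalence(visits):
        """visits: 2-D array (households x visits) of 0/1 reported-diarrhoea indicators."""
        return visits.mean()

    # Synthetic trial: 200 intervention and 200 control households, 26 visits each.
    intervention = rng.binomial(1, 0.042, size=(200, 26))
    control      = rng.binomial(1, 0.050, size=(200, 26))
    lpr = longitudinal_prevalence(intervention) / longitudinal_prevalence(control)

    # Household-level bootstrap for a rough 95% interval.
    boot = []
    for _ in range(2000):
        i = rng.integers(0, 200, 200)
        c = rng.integers(0, 200, 200)
        boot.append(intervention[i].mean() / control[c].mean())
    print(f"LPR = {lpr:.2f} "
          f"(95% CI {np.percentile(boot, 2.5):.2f}-{np.percentile(boot, 97.5):.2f})")
    ```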

  10. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is of importance for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk combining frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by the individual risk and the societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, and the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of some intermediate events may have large uncertainty, that uncertainty is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on the Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease of individual fatality risk and a 44% reduction of individual injury risk if the mean travel speed is slowed down by 20%. In addition, there will be a 5% reduction of individual fatality risk and a 0.05% reduction of individual injury risk if ERT is reduced by 20%. In other words, slowing down speed is more effective than reducing ERT in casualty risk mitigation.
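
    A reduced sketch of the event-tree logic may clarify the structure: a crash frequency is propagated through conditional branch probabilities to scenario frequencies, each scenario carries a casualty consequence, and the results are aggregated into individual- and societal-risk summaries. Only three of the seven intermediate events are shown, and every frequency, probability and consequence value is illustrative.

    ```python
    # Hedged sketch: event-tree aggregation of scenario frequencies and consequences.
    import itertools
    import numpy as np

    CRASH_FREQ_PER_YEAR = 12.0   # expected crashes per year at the work zone (assumed)

    # A reduced event tree with three of the seven intermediate events.
    branches = {
        "vehicle_type": {"car": 0.75, "truck": 0.25},
        "light":        {"day": 0.6, "night": 0.4},
        "severity":     {"property_only": 0.80, "injury": 0.17, "fatal": 0.03},
    }
    casualties = {"property_only": 0.0, "injury": 1.2, "fatal": 1.0}  # per crash (assumed)

    societal = []   # (frequency, casualties) pairs for an F-N style societal-risk summary
    fatal_scenario_freq = 0.0
    for combo in itertools.product(*(b.items() for b in branches.values())):
        prob = np.prod([p for _, p in combo])      # product of conditional branch probabilities
        severity = combo[-1][0]
        freq = CRASH_FREQ_PER_YEAR * prob
        societal.append((freq, casualties[severity]))
        if severity == "fatal":
            fatal_scenario_freq += freq

    print(f"annual frequency of fatal-crash scenarios: {fatal_scenario_freq:.2f}")
    print(f"expected casualties per year: {sum(f * n for f, n in societal):.2f}")
    ```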

  11. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    SciTech Connect

    Caschili, Simone; De Montis, Andrea; Ganciu, Amedeo; Ledda, Antonio; Barra, Mario

    2014-07-01

    Academic literature has been continuously growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence, the need to dispose of quantitative knowledge support systems to analyze the literature of a subject. In this article we utilize network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN) employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at macroscopic (network architecture), mesoscopic (sub graph) and microscopic levels (node) in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that the SEA is a multidisciplinary subject; the SEABN belongs to the class of real small world networks with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies.
    Highlights:
    • We utilize network analysis to analyze scientific documents in the SEA field.
    • We build the SEA Bibliographic Network (SEABN) of 7662 publications.
    • We apply network analysis at macroscopic, mesoscopic and microscopic network levels.
    • We identify SEABN architecture, relevant publications, authors, subjects and journals.
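
    The network indicators mentioned above (architecture-level statistics and node-level citation counts) can be reproduced on any citation edge list; the sketch below uses networkx on a toy graph standing in for the 7662-publication SEABN. Node labels and edges are invented.

    ```python
    # Hedged sketch: citation network indicators at macroscopic and microscopic levels.
    import networkx as nx

    # Edge u -> v means "publication u cites publication v".
    edges = [("P1", "P2"), ("P1", "P3"), ("P4", "P2"), ("P5", "P2"),
             ("P5", "P3"), ("P6", "P1"), ("P6", "P2"), ("P7", "P6")]
    G = nx.DiGraph(edges)

    print("nodes:", G.number_of_nodes(), "edges:", G.number_of_edges())
    # Microscopic level: most-cited publications (highest in-degree).
    most_cited = sorted(G.in_degree(), key=lambda kv: kv[1], reverse=True)[:3]
    print("most cited:", most_cited)
    # Macroscopic level: small-world-style indicators on the undirected projection.
    U = G.to_undirected()
    print("clustering:", round(nx.average_clustering(U), 3))
    print("avg shortest path:", round(nx.average_shortest_path_length(U), 2))
    ```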

  12. Quantitative Evaluation of MODIS Fire Radiative Power Measurement for Global Smoke Emissions Assessment

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Ellison, Luke

    2011-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and their emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications. In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate

  13. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    PubMed

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

    This article is based on a quantitative risk assessment (QRA) that was performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first of a kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates taking corrective actions and implementing effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.

  14. A quantitative integrated assessment of pollution prevention achieved by integrated pollution prevention control licensing.

    PubMed

    Styles, David; O'Brien, Kieran; Jones, Michael B

    2009-11-01

    This paper presents an innovative, quantitative assessment of pollution avoidance attributable to environmental regulation enforced through integrated licensing, using Ireland's pharmaceutical-manufacturing sector as a case study. Emissions data reported by pharmaceutical installations were aggregated into a pollution trend using an Environmental Emissions Index (EEI) based on Lifecycle Assessment methodologies. Complete sectoral emissions data from 2001 to 2007 were extrapolated back to 1995, based on available data. Production volume data were used to derive a sectoral production index, and determine 'no-improvement' emission trends, whilst questionnaire responses from 20 industry representatives were used to quantify the contribution of integrated licensing to emission avoidance relative to these trends. Between 2001 and 2007, there was a 40% absolute reduction in direct pollution from 27 core installations, and 45% pollution avoidance relative to hypothetical 'no-improvement' pollution. It was estimated that environmental regulation avoided 20% of 'no-improvement' pollution, in addition to 25% avoidance under business-as-usual. For specific emissions, avoidance ranged from 14% and 30 kt a⁻¹ for CO2 to 88% and 598 t a⁻¹ for SOx. Between 1995 and 2007, there was a 59% absolute reduction in direct pollution, and 76% pollution avoidance. Pollution avoidance was dominated by reductions in emissions of VOCs, SOx and NOx to air, and emissions of heavy metals to water. Pollution avoidance of 35% was attributed to integrated licensing, ranging from between 8% and 2.9 t a⁻¹ for phosphorus emissions to water to 49% and 3143 t a⁻¹ for SOx emissions to air. Environmental regulation enforced through integrated licensing has been the major driver of substantial pollution avoidance achieved by Ireland's pharmaceutical sector - through emission limit values associated with Best Available Techniques, emissions monitoring and reporting requirements, and

  15. High Resolution Peripheral Quantitative Computed Tomography for Assessment of Bone Quality

    NASA Astrophysics Data System (ADS)

    Kazakia, Galateia

    2014-03-01

    The study of bone quality is motivated by the high morbidity, mortality, and societal cost of skeletal fractures. Over 10 million people are diagnosed with osteoporosis in the US alone, suffering 1.5 million osteoporotic fractures and costing the health care system over $17 billion annually. Accurate assessment of fracture risk is necessary to ensure that pharmacological and other interventions are appropriately administered. Currently, areal bone mineral density (aBMD) based on 2D dual-energy X-ray absorptiometry (DXA) is used to determine osteoporotic status and predict fracture risk. Though aBMD is a significant predictor of fracture risk, it does not completely explain bone strength or fracture incidence. The major limitation of aBMD is the lack of 3D information, which is necessary to distinguish between cortical and trabecular bone and to quantify bone geometry and microarchitecture. High resolution peripheral quantitative computed tomography (HR-pQCT) enables in vivo assessment of volumetric BMD within specific bone compartments as well as quantification of geometric and microarchitectural measures of bone quality. HR-pQCT studies have documented that trabecular bone microstructure alterations are associated with fracture risk independent of aBMD.... Cortical bone microstructure - specifically porosity - is a major determinant of strength, stiffness, and fracture toughness of cortical tissue and may further explain the aBMD-independent effect of age on bone fragility and fracture risk. The application of finite element analysis (FEA) to HR-pQCT data permits estimation of patient-specific bone strength, shown to be associated with fracture incidence independent of aBMD. This talk will describe the HR-pQCT scanner, established metrics of bone quality derived from HR-pQCT data, and novel analyses of bone quality currently in development. Cross-sectional and longitudinal HR-pQCT studies investigating the impact of aging, disease, injury, gender, race, and

  16. Quantitative assessment of rotator cuff muscle elasticity: Reliability and feasibility of shear wave elastography.

    PubMed

    Hatta, Taku; Giambini, Hugo; Uehara, Kosuke; Okamoto, Seiji; Chen, Shigao; Sperling, John W; Itoi, Eiji; An, Kai-Nan

    2015-11-01

    Ultrasound imaging has been used to evaluate various shoulder pathologies, whereas quantification of rotator cuff muscle stiffness using shear wave elastography (SWE) has not been verified. The purpose of this study was to investigate the reliability and feasibility of SWE measurements for the quantification of supraspinatus (SSP) muscle elasticity. Thirty cadaveric shoulders (18 intact and 12 with torn rotator cuff) were used. Intra- and inter-observer reliability was evaluated on an established SWE technique for measuring the SSP muscle elasticity. To assess the effect of overlying soft tissues above the SSP muscle, SWE values were measured with the transducer placed on the skin, on the subcutaneous fat after removing the skin, on the trapezius muscle after removing the subcutaneous fat, and directly on the SSP muscle. In addition, SWE measurements at 4 shoulder positions (0°, 30°, 60°, and 90° abduction) were compared in shoulders with and without rotator cuff tears. Intra- and inter-observer reliability of SWE measurements was excellent for all regions in the SSP muscle. Also, removing the overlying soft tissue showed no significant difference in SWE values measured in the SSP muscle. The SSP muscle at 0° abduction showed large SWE values, whereas shoulders with large-to-massive tears showed smaller variation throughout the adduction-abduction positions. SWE is a reliable and feasible tool for quantitatively assessing SSP muscle elasticity. This study also presented SWE measurements of the SSP muscle under various shoulder abduction positions, which might help characterize patterns according to the size of rotator cuff tears. PMID:26472309

  17. Assessing caries removal by undergraduate dental students using quantitative light-induced fluorescence.

    PubMed

    Adeyemi, Adejumoke A; Jarad, Fadi D; Komarov, Gleb N; Pender, Neil; Higham, Susan M

    2008-11-01

    The purpose of this study was to compare detection of enamel and dentinal caries by dental students' and faculty members' visual inspection and by quantitative light-induced fluorescence (QLF). The overall aim was to determine whether QLF is an appropriate technique for use in clinical skills laboratories as a teaching aid for dental undergraduates to detect and assess the removal of enamel and dentinal caries. Sixty students who had no clinical experience with dental caries were asked to select suitably decayed teeth and mount them in plaster. After recording baseline QLF images, students removed caries according to instructions given by the clinical tutor. On completion of the exercise, the teeth were visually determined to be caries-free by the student, then confirmed by the clinical tutor. A fluorescein in alcohol solution was injected into the cavity for two minutes, rinsed, and dried before QLF images were captured. The images were visually analyzed by two examiners for the presence or absence of caries. Of the seventy-four images recorded, seventeen were excluded due to exposure of the pulp chamber. The remaining fifty-seven teeth, which by clinical visual examination were judged to be caries-free, were examined using QLF. Fifty-three percent were found to be caries-free, while 47 percent were carious. In this sample of fifty-seven teeth judged to be caries-free by both dental students and faculty members, QLF thus detected caries in almost half of these teeth. These findings suggest that QLF is a useful, noninvasive, nondestructive technique for the detection of caries and can serve as an adjunct to chair-side diagnosis and management of dental caries, which is typically accomplished by visual inspection. QLF may be useful and appropriate as an objective clinical teaching aid for the assessment of dental caries.

  18. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, Erica R.; Navarre-Sitchler, Alexis K.; Maxwell, Reed M.; McCray, John E.

    2012-02-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distributions of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding. Higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and suggests action levels for carcinogenic risk will be exceeded in exposure
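
    The exposure-to-risk endpoint step in a framework of this kind typically follows standard ingestion-pathway equations; the sketch below converts a distribution of well-water arsenic concentrations into a chronic daily intake, a cancer risk (slope factor) and a non-cancer hazard quotient. Parameter values are generic screening-level assumptions, not those of the cited study.

    ```python
    # Hedged sketch: cancer risk and hazard quotient from a simulated arsenic distribution.
    import numpy as np

    rng = np.random.default_rng(4)
    conc_as_mg_l = 10 ** rng.normal(-2.3, 0.4, 10000)   # arsenic at the well, mg/L (synthetic)

    INGESTION_L_DAY = 2.0
    EXPOSURE_FREQ = 350.0 / 365.0
    EXPOSURE_YEARS, AVERAGING_YEARS = 30.0, 70.0
    BODY_WEIGHT_KG = 70.0
    SLOPE_FACTOR = 1.5      # (mg/kg-day)^-1, oral, arsenic (assumed)
    RFD = 3.0e-4            # mg/kg-day, oral reference dose, arsenic (assumed)

    cdi = (conc_as_mg_l * INGESTION_L_DAY * EXPOSURE_FREQ *
           EXPOSURE_YEARS / AVERAGING_YEARS / BODY_WEIGHT_KG)
    cancer_risk = cdi * SLOPE_FACTOR
    hazard_quotient = (cdi * AVERAGING_YEARS / EXPOSURE_YEARS) / RFD  # non-cancer: exposure-duration basis

    print(f"median cancer risk {np.median(cancer_risk):.1e}, "
          f"P(risk > 1e-4) = {np.mean(cancer_risk > 1e-4):.2f}")
    print(f"median hazard quotient {np.median(hazard_quotient):.2f}")
    ```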

  19. Pharmacology-based toxicity assessment: towards quantitative risk prediction in humans.

    PubMed

    Sahota, Tarjinder; Danhof, Meindert; Della Pasqua, Oscar

    2016-05-01

    Despite ongoing efforts to better understand the mechanisms underlying safety and toxicity, ~30% of the attrition in drug discovery and development is still due to safety concerns. Changes in current practice regarding the assessment of safety and toxicity are required to reduce late stage attrition and enable effective development of novel medicines. This review focuses on the implications of empirical evidence generation for the evaluation of safety and toxicity during drug development. A shift in paradigm is needed to (i) ensure that pharmacological concepts are incorporated into the evaluation of safety and toxicity; (ii) facilitate the integration of historical evidence and thereby the translation of findings across species as well as between in vitro and in vivo experiments and (iii) promote the use of experimental protocols tailored to address specific safety and toxicity questions. Based on historical examples, we highlight the challenges for the early characterisation of the safety profile of a new molecule and discuss how model-based methodologies can be applied for the design and analysis of experimental protocols. Issues relative to the scientific rationale are categorised and presented as a hierarchical tree describing the decision-making process. Focus is given to four different areas, namely, optimisation, translation, analytical construct and decision criteria. From a methodological perspective, the relevance of quantitative methods for estimation and extrapolation of risk from toxicology and safety pharmacology experimental protocols, such as points of departure and potency, is discussed in light of advancements in population and Bayesian modelling techniques (e.g. non-linear mixed effects modelling). Their use in the evaluation of pharmacokinetics (PK) and pharmacokinetic-pharmacodynamic relationships (PKPD) has enabled great insight into the dose rationale for medicines in humans, both in terms of efficacy and adverse events. Comparable benefits

  20. Quantitative assessment of the enamel machinability in tooth preparation with dental diamond burs.

    PubMed

    Song, Xiao-Fei; Jin, Chen-Xin; Yin, Ling

    2015-01-01

    Enamel cutting using dental handpieces is a critical process in tooth preparation for dental restorations and treatment, but the machinability of enamel is poorly understood. This paper reports on the first quantitative assessment of the enamel machinability using computer-assisted numerical control, high-speed data acquisition, and force sensing systems. The enamel machinability in terms of cutting forces, force ratio, cutting torque, cutting speed and specific cutting energy was characterized in relation to enamel surface orientation, specific material removal rate and diamond bur grit size. The results show that enamel surface orientation, specific material removal rate and diamond bur grit size critically affected the enamel cutting capability. Cutting buccal/lingual surfaces resulted in significantly higher tangential and normal forces, torques and specific energy (p < 0.05) but lower cutting speeds than occlusal surfaces (p < 0.05). Increasing the material removal rate for high cutting efficiencies using coarse burs yielded remarkable rises in cutting forces and torque (p < 0.05) but significant reductions in cutting speed and specific cutting energy (p < 0.05). In particular, great variations in cutting forces, torques and specific energy were observed at the specific material removal rate of 3 mm³/min/mm using coarse burs, indicating the cutting limit. This work provides fundamental data and the scientific understanding of the enamel machinability for clinical dental practice.
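
    The machinability quantities named above follow from a few kinematic and energy relations; the sketch below derives force ratio, torque, peripheral cutting speed and specific cutting energy (spindle power divided by material removal rate) from an assumed bur geometry, spindle speed and measured forces. All numbers are illustrative and the definitions are generic, not necessarily those used in the study.

    ```python
    # Hedged sketch: machinability quantities from measured forces and assumed bur kinematics.
    import numpy as np

    BUR_RADIUS_M = 0.5e-3          # ~1 mm diameter diamond bur (assumed)
    SPINDLE_RPM = 200_000.0        # high-speed handpiece (assumed)

    def machinability(f_tangential_n, f_normal_n, removal_rate_mm3_min):
        omega = 2.0 * np.pi * SPINDLE_RPM / 60.0             # rad/s
        peripheral_speed = omega * BUR_RADIUS_M               # m/s at the bur periphery
        torque = f_tangential_n * BUR_RADIUS_M                # N*m
        power = torque * omega                                # W
        mrr_m3_s = removal_rate_mm3_min * 1e-9 / 60.0
        specific_energy = power / mrr_m3_s                    # J/m^3
        return {
            "force ratio Ft/Fn": f_tangential_n / f_normal_n,
            "torque (mN*m)": torque * 1e3,
            "peripheral speed (m/s)": peripheral_speed,
            "specific energy (J/mm^3)": specific_energy * 1e-9,
        }

    for k, v in machinability(0.4, 1.0, removal_rate_mm3_min=1.0).items():
        print(f"{k:>26}: {v:.3f}")
    ```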

  1. Assessing vertebral fracture risk on volumetric quantitative computed tomography by geometric characterization of trabecular bone structure

    NASA Astrophysics Data System (ADS)

    Checefsky, Walter A.; Abidin, Anas Z.; Nagarajan, Mahesh B.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2016-03-01

    The current clinical standard for measuring Bone Mineral Density (BMD) is dual X-ray absorptiometry, however more recently BMD derived from volumetric quantitative computed tomography has been shown to demonstrate a high association with spinal fracture susceptibility. In this study, we propose a method of fracture risk assessment using structural properties of trabecular bone in spinal vertebrae. Experimental data were acquired via axial multi-detector CT (MDCT) from 12 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. Common image processing methods were used to annotate the trabecular compartment in the vertebral slices, creating a circular region of interest (ROI) that excluded cortical bone for each slice. The pixels inside the ROI were converted to values indicative of BMD. High dimensional geometrical features were derived using the scaling index method (SIM) at different radii and scaling factors (SF). The mean BMD values within the ROI were then extracted and used in conjunction with a support vector machine to predict the failure load of the specimens. Prediction performance was measured using the root-mean-square error (RMSE) metric and determined that SIM combined with mean BMD features (RMSE = 0.82 ± 0.37) outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.33) (p < 10⁻⁴). These results demonstrate that biomechanical strength prediction in vertebrae can be significantly improved through the use of SIM-derived texture features from trabecular bone.
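
    A sketch of the prediction-and-scoring setup described above: feature vectors (mean BMD alone versus BMD plus texture-style features) are fed to a support vector regressor and compared by leave-one-out RMSE. The data are synthetic and the SVR settings are assumptions, not the authors' configuration.

    ```python
    # Hedged sketch: SVR failure-load prediction compared by leave-one-out RMSE.
    import numpy as np
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(5)
    n_specimens = 12
    bmd = rng.normal(120, 25, n_specimens)                       # mean BMD (synthetic)
    texture = rng.normal(0, 1, (n_specimens, 5))                 # texture-style features (synthetic)
    failure_load = (0.03 * bmd + texture @ rng.normal(0.2, 0.05, 5)
                    + rng.normal(0, 0.3, n_specimens))

    def loo_rmse(X, y):
        model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
        pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
        return float(np.sqrt(np.mean((pred - y) ** 2)))

    print(f"BMD only       RMSE = {loo_rmse(bmd.reshape(-1, 1), failure_load):.2f}")
    print(f"BMD + texture  RMSE = "
          f"{loo_rmse(np.column_stack([bmd.reshape(-1, 1), texture]), failure_load):.2f}")
    ```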

  2. Quantitative Assessment of Turbulence and Flow Eccentricity in an Aortic Coarctation: Impact of Virtual Interventions.

    PubMed

    Andersson, Magnus; Lantz, Jonas; Ebbers, Tino; Karlsson, Matts

    2015-09-01

    Turbulence and flow eccentricity can be measured by magnetic resonance imaging (MRI) and may play an important role in the pathogenesis of numerous cardiovascular diseases. In the present study, we propose quantitative techniques to assess turbulent kinetic energy (TKE) and flow eccentricity that could assist in the evaluation and treatment of stenotic severities. These hemodynamic parameters were studied in an aortic coarctation (CoA) before treatment and after several virtual interventions using computational fluid dynamics (CFD), to demonstrate the effect of different dilatation options on the flow field. Patient-specific geometry and flow conditions were derived from MRI data. The unsteady pulsatile flow was resolved by large eddy simulation including non-Newtonian blood rheology. Results showed an inverse asymptotic relationship between the total amount of TKE and the degree of dilatation of the stenosis, where turbulent flow proximal to the constriction limits the possible improvement by treating the CoA alone. Spatiotemporal maps of TKE and flow eccentricity could be linked to the characteristics of the jet, where improved flow conditions were favored by an eccentric dilatation of the CoA. By including these flow markers into a combined MRI-CFD intervention framework, CoA therapy has not only the possibility to produce predictions via simulation, but can also be validated pre- and immediately post-treatment, as well as during follow-up studies. PMID:26577361
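
    For reference, the turbulent kinetic energy discussed above is conventionally defined from the Reynolds-decomposed velocity fluctuations; a commonly used point-wise and volume-integrated form (the paper's exact formulation is not given in the abstract) is:

        \[
        k = \tfrac{1}{2}\left(\overline{u'^2} + \overline{v'^2} + \overline{w'^2}\right),
        \qquad
        \mathrm{TKE} = \int_V \tfrac{\rho}{2}\left(\overline{u'^2} + \overline{v'^2} + \overline{w'^2}\right)\,\mathrm{d}V,
        \]

    where u', v', w' are the fluctuating velocity components and ρ is the blood density.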

  3. An Overview of Quantitative Risk Assessment of Space Shuttle Propulsion Elements

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.

    1998-01-01

    Since the Space Shuttle Challenger accident in 1986, NASA has been working to incorporate quantitative risk assessment (QRA) in decisions concerning the Space Shuttle and other NASA projects. One current major NASA QRA study is the creation of a risk model for the overall Space Shuttle system. The model is intended to provide a tool to estimate Space Shuttle risk and to perform sensitivity analyses/trade studies, including the evaluation of upgrades. Marshall Space Flight Center (MSFC) is a part of the NASA team conducting the QRA study; MSFC responsibility involves modeling the propulsion elements of the Space Shuttle, namely: the External Tank (ET), the Solid Rocket Booster (SRB), the Reusable Solid Rocket Motor (RSRM), and the Space Shuttle Main Engine (SSME). This paper discusses the approach that MSFC has used to model its Space Shuttle elements, including insights obtained from this experience in modeling large scale, highly complex systems with a varying availability of success/failure data. Insights, which are applicable to any QRA study, pertain to organizing the modeling effort, obtaining customer buy-in, preparing documentation, and using varied modeling methods and data sources. Also provided is an overall evaluation of the study results, including the strengths and the limitations of the MSFC QRA approach and of QRA technology in general.

  4. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
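
    The published algorithm is not reproduced in the abstract; the following is only a minimal sketch of the general idea (stain-enhanced contrast followed by a pixel-wise coverage measurement), with the file name, colour-channel choice, and threshold all hypothetical:

        import numpy as np
        from PIL import Image

        def fouling_coverage(path, threshold=0.15):
            """Estimate the fraction of the imaged surface covered by stained biofilm.

            A crude stain index is computed per pixel as the excess of the blue
            channel over the mean of the red and green channels (assuming a
            blue-staining dye); pixels above `threshold` are counted as fouled.
            """
            rgb = np.asarray(Image.open(path).convert("RGB"), dtype=float) / 255.0
            stain = rgb[..., 2] - 0.5 * (rgb[..., 0] + rgb[..., 1])
            return float((stain > threshold).mean())

        # Hypothetical usage on a photograph of a stained coupon:
        # print(fouling_coverage("stained_coupon.jpg"))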

  5. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  6. Quantitative assessment of a Tanzanian integrated conservation and development project involving butterfly farming.

    PubMed

    Morgan-Brown, Theron; Jacobson, Susan K; Wald, Kenneth; Child, Brian

    2010-04-01

    Scientific understanding of the role of development in conservation has been hindered by the quality of evaluations of integrated conservation and development projects. We used a quasi-experimental design to quantitatively assess a conservation and development project involving commercial butterfly farming in the East Usambara Mountains of Tanzania. Using a survey of conservation attitudes, beliefs, knowledge, and behavior, we compared 150 butterfly farmers with a control group of 170 fellow community members. Due to the nonrandom assignment of individuals to the two groups, we used propensity-score matching and weighting in our analyses to control for observed bias. Eighty percent of the farmers believed butterfly farming would be impossible if local forests were cleared, and butterfly farmers reported significantly more participation in forest conservation behaviors and were more likely to believe that conservation behaviors were effective. The two groups did not differ in terms of their general conservation attitudes, attitudes toward conservation officials, or knowledge of conservation-friendly building techniques. The relationship between butterfly farming and conservation behavior was mediated by dependency on butterfly farming income. Assuming unobserved bias played a limited role, our findings suggest that participation in butterfly farming increased participation in conservation behaviors among project participants because farmers perceive a link between earnings from butterfly farming and forest conservation. PMID:20151990
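
    As a hedged sketch of the general propensity-score weighting approach described (not the authors' analysis), group membership can be modelled with logistic regression on observed covariates and the resulting inverse-probability weights applied to the outcome comparison; all variable names and data below are placeholders:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 320
        covariates = rng.normal(size=(n, 4))           # e.g. age, income, education, land holding
        treated = rng.integers(0, 2, size=n)           # 1 = butterfly farmer, 0 = control
        behavior = rng.normal(size=n) + 0.4 * treated  # hypothetical conservation-behavior score

        # Propensity scores: probability of being a farmer given the covariates.
        ps = LogisticRegression(max_iter=1000).fit(covariates, treated).predict_proba(covariates)[:, 1]

        # Inverse-probability-of-treatment weights.
        w = np.where(treated == 1, 1.0 / ps, 1.0 / (1.0 - ps))

        # Weighted difference in mean conservation behavior between the groups.
        diff = (np.average(behavior[treated == 1], weights=w[treated == 1])
                - np.average(behavior[treated == 0], weights=w[treated == 0]))
        print(f"weighted mean difference: {diff:.2f}")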

  7. Purity Assessment of Aryltetralin Lactone Lignans by Quantitative 1H Nuclear Magnetic Resonance.

    PubMed

    Sun, Yan-Jun; Zhang, Yan-Li; Wang, Yu; Wang, Jun-Min; Zhao, Xuan; Gong, Jian-Hong; Gao, Wei; Guan, Yan-Bin

    2015-01-01

    In the present work, a quantitative 1H nuclear magnetic resonance (qHNMR) method was established for purity assessment of six aryltetralin lactone lignans. The validation of the method was carried out, including specificity, selectivity, linearity, accuracy, precision, and robustness. Several experimental parameters were optimized, including relaxation delay (D1), number of scans (NS), and pulse angle. 1,4-Dinitrobenzene was used as internal standard (IS), and deuterated dimethyl sulfoxide (DMSO-d6) as the NMR solvent. The purities were calculated from the area ratios of H-2,6 of the target analytes vs. the aromatic protons of the IS. Six aryltetralin lactone lignans (deoxypodophyllotoxin, podophyllotoxin, 4-demethylpodophyllotoxin, podophyllotoxin-7'-O-β-d-glucopyranoside, 4-demethylpodophyllotoxin-7'-O-β-d-glucopyranoside, and 6''-acetyl-podophyllotoxin-7'-O-β-d-glucopyranoside) were analyzed. The analytical results of qHNMR were further validated by high performance liquid chromatography (HPLC). Therefore, the qHNMR method proved to be a rapid, accurate, and reliable tool for monitoring the purity of aryltetralin lactone lignans. PMID:26016553
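
    The purity calculation in qHNMR is conventionally based on the integrated signal areas of the analyte and the internal standard; a commonly used form (the exact expression used in the paper is not shown in the abstract) is:

        \[
        P_x = \frac{I_x}{I_{IS}} \cdot \frac{N_{IS}}{N_x} \cdot \frac{M_x}{M_{IS}} \cdot \frac{m_{IS}}{m_x} \cdot P_{IS},
        \]

    where I is the integrated peak area, N the number of protons giving rise to the integrated signal, M the molar mass, m the weighed mass, and P the purity, for the analyte x and the internal standard IS.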

  8. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms. PMID:26643074

  9. Quantitative immunohistochemical assessment of blood and lymphatic microcirculation in cutaneous lichen planus lesions.

    PubMed

    Výbohová, Desanka; Mellová, Yvetta; Adamicová, Katarína; Adamkov, Marián; Hešková, Gabriela

    2015-06-01

    Recent advances have given rise to the hypothesis that angiogenesis and lymphangiogenesis are tightly connected to some chronic inflammatory diseases. The present study focuses on immunohistochemical assessment of the quantitative changes in the blood and lymphatic microcirculatory bed in a common chronic dermatosis, cutaneous lichen planus. Double immunohistochemistry with CD34 and podoplanin antibodies was used to detect blood and lymphatic endothelium, while anti-human VEGF was used for the observation of a key angiogenesis and lymphangiogenesis inducer. Morphometric analysis was performed with QuickPhoto Micro image analysis software. Results confirmed statistically significant enlargement of both the blood and lymphatic microcirculatory beds. Compared to healthy skin, cutaneous lichen planus lesions revealed a 1.6-fold enlargement of the blood microcirculatory bed and a 1.8-fold enlargement of the lymphatic microcirculatory bed. Vascular endothelial growth factor (VEGF) expression in lesional skin was significantly higher in the epidermis (19.1-fold increase) than in the dermis (10.3-fold increase). These findings indicate a tight association of angiogenesis and lymphangiogenesis with the pathogenesis of cutaneous lichen planus. PMID:25504638

  10. Perfluorinated acid isomer profiling in water and quantitative assessment of manufacturing source.

    PubMed

    Benskin, Jonathan P; Yeung, Leo W Y; Yamashita, Nobuyoshi; Taniyasu, Sachi; Lam, Paul K S; Martin, Jonathan W

    2010-12-01

    A method for isomer profiling of perfluorinated compounds (PFCs) in water was developed and applied to quantitatively assess the contributions from electrochemical fluorination (ECF) and telomer manufacturing processes around source regions of North America, Asia, and Europe. With the exception of 3 sites in Japan, over 80% of total perfluorooctanoate (PFOA, C₇F₁₅COO⁻) was from ECF, with the balance attributable to strictly linear (presumably telomer) manufacturing source(s). Comparing PFOA isomer profiles in samples from China with PFOA obtained from a local Chinese manufacturer indicated <3% difference in overall branched isomer content; thus, an exclusive contribution from local ECF production cannot be ruled out. In Tokyo Bay, ECF, linear-telomer, and isopropyl-telomer sources contributed 33%, 53%, and 14% of total PFOA, respectively. Perfluorooctane sulfonate (PFOS, C₈F₁₇SO₃⁻) isomer profiles were enriched in branched content (i.e., >50% branched) in the Mississippi River but in all other locations were similar or only slightly enriched in branched content relative to historical ECF PFOS. Isomer profiles of other PFCs are also reported. Overall, these data suggest that, with the exception of Tokyo Bay, ECF manufacturing has contributed the bulk of contamination around these source regions, but other sources are significant, and remote sites should be monitored.

  11. Quantitative analysis of the fall-risk assessment test with wearable inertia sensors.

    PubMed

    Tmaura, Toshiyo; Zakaria, Nor Aini; Kuwae, Yutaka; Sekine, Masaki; Minato, Kotaro; Yoshida, Masaki

    2013-01-01

    We performed a quantitative analysis of fall-risk assessment tests using a wearable inertia sensor, focusing on two tests: the timed up and go (TUG) test and the four square step test (FSST). These tests consist of various daily activities, such as sitting, standing, walking, stepping, and turning. The TUG test was performed by subjects at low and high fall risk, while the FSST was performed by healthy elderly subjects and by hemiplegic patients at high fall risk. In general, only the total performance time of the activities is evaluated; clinically, however, it is important to evaluate each activity for further training and management. The wearable sensor consisted of an accelerometer and an angular velocity sensor. The angular velocity and angle in the pitch direction were used for the TUG evaluation, and those in the pitch and yaw directions at the thigh were used for the FSST. Using a threshold on the angular velocity signal, we classified the phase corresponding to each activity. We then observed the characteristics of each activity and recommended suitable training and management. The wearable sensor can thus be used for more detailed evaluation in fall-risk management.

  12. A Quantitative Assay Using Mycelial Fragments to Assess Virulence of Mycosphaerella fijiensis.

    PubMed

    Donzelli, Bruno Giuliano Garisto; Churchill, Alice C L

    2007-08-01

    We describe a method to evaluate the virulence of Mycosphaerella fijiensis, the causal agent of black leaf streak disease (BLSD) of banana and plantain. The method is based on the delivery of weighed slurries of fragmented mycelia by camel's hair brush to 5-by-5-cm areas on the abaxial surface of banana leaf blades. Reliable BLSD development was attained in an environmental growth chamber with stringent lighting and humidity controls. By localizing inoculum onto small areas of large leaves, we achieved a dramatic increase in the number of strains that can be tested on each leaf and plant, which is critical for comparing the virulence of numerous strains concurrently. Image analysis software was used to measure the percentage of each inoculated leaf section showing BLSD symptoms over time. We demonstrated that the level of disease of four isolates was correlated with the weight of the mycelium applied and relatively insensitive to the degree of fragmentation of hyphae. This is the first report demonstrating that weighed mycelial inoculum, combined with image analysis software to measure disease severity, can be used to quantitatively assess the virulence of M. fijiensis under rigorously controlled environmental conditions. PMID:18943631

  13. Quantitative assessment of the response to therapy in achalasia of the cardia.

    PubMed

    Robertson, C S; Hardy, J G; Atkinson, M

    1989-06-01

    Radionuclide oesophageal transit studies and manometry have been carried out in 15 patients with achalasia of the cardia, before treatment, after a course of nifedipine and after pneumatic bag dilatation. Transit studies were also done in 10 patients after cardiomyotomy and in 10 normal subjects. Images were recorded with the subjects seated in front of a gamma camera while swallowing a 10 ml bolus of 99Tcm-tin colloid and then after a further drink of 50 ml water. There was marked retention of tracer in the oesophagus in patients with achalasia compared with rapid clearance in control subjects. Bag dilatation significantly reduced lower oesophageal sphincter pressure but there was no significant difference in the 50% clearance time or percentage dose retained at 100 s before and after the treatments. Oesophageal clearance of tracer after the additional drink of water was improved by bag dilatation. Oesophageal transit in the patients after cardiomyotomy was similar to that in patients who had undergone bag dilatation. There was considerable retention of the tracer in the oesophagus overnight, but this did not result in pulmonary aspiration. Radionuclide oesophageal transit studies provided a quantitative assessment of therapy in achalasia and the proportion of tracer retained after the additional drink proved to be a sensitive measure of response to treatment. Nifedipine proved ineffective as a treatment for achalasia. Bag dilatation and cardiomyotomy were of similar value.

  14. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds was constructed. A log-normal probability distribution function was fitted to a large database from a comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predictive models derived from field experiments involving the irrigation of several crops with different effluent qualities. Data on daily intake of salad crops were obtained from a national survey in Brazil. Ten-thousand-trial Monte Carlo simulations were used to estimate human health risks associated with the use of wastewater for irrigating low- and high-growing crops. The use of effluents containing 10³-10⁴ E. coli per 100 ml resulted in median rotavirus infection risks of approximately 10⁻³ and 10⁻⁴ per person per year when irrigating low- and high-growing crops, respectively; the corresponding 95th percentile risk estimates were around 10⁻² in both scenarios. Sensitivity analyses revealed that variations in effluent quality, in the assumed ratios of pathogens to E. coli, and in the reduction of pathogens between harvest and consumption had a great impact on risk estimates.
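
    A minimal Monte Carlo sketch of this kind of QMRA chain is shown below. The distributions, the pathogen-to-E. coli ratio, the water-retention and intake values, and the approximate beta-Poisson dose-response parameters for rotavirus (α ≈ 0.253, β ≈ 0.426, values commonly cited in the QMRA literature) are illustrative assumptions, not the values fitted in this study:

        import numpy as np

        rng = np.random.default_rng(42)
        n_iter = 10_000

        # Effluent quality: E. coli per 100 ml, lognormal (placeholder parameters).
        ecoli = rng.lognormal(mean=np.log(5e3), sigma=1.0, size=n_iter)

        # Pathogen concentration in irrigation water, via an assumed pathogen:E. coli ratio.
        ratio = rng.lognormal(mean=np.log(1e-4), sigma=0.5, size=n_iter)
        pathogens_per_100ml = ecoli * ratio

        # Irrigation water retained on the crop surface (ml per g) and daily salad intake (g),
        # both placeholders.
        retained_ml_per_g = rng.uniform(0.05, 0.15, size=n_iter)
        intake_g = rng.uniform(10, 40, size=n_iter)

        dose = pathogens_per_100ml / 100.0 * retained_ml_per_g * intake_g

        # Approximate beta-Poisson dose-response for rotavirus (literature values).
        alpha, beta = 0.253, 0.426
        p_inf = 1.0 - (1.0 + dose / beta) ** (-alpha)

        print(f"median daily infection risk: {np.median(p_inf):.1e}")
        print(f"95th percentile:             {np.percentile(p_inf, 95):.1e}")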

  15. Benchmark dose profiles for joint-action quantal data in quantitative risk assessment.

    PubMed

    Deutsch, Roland C; Piegorsch, Walter W

    2012-12-01

    Benchmark analysis is a widely used tool in public health risk analysis. Therein, estimation of minimum exposure levels, called Benchmark Doses (BMDs), that induce a prespecified Benchmark Response (BMR) is well understood for the case of an adverse response to a single stimulus. For cases where two agents are studied in tandem, however, the benchmark approach is far less developed. This article demonstrates how the benchmark modeling paradigm can be expanded from the single-dose setting to joint-action, two-agent studies. Focus is on response outcomes expressed as proportions. Extending the single-exposure setting, representations of risk are based on a joint-action dose-response model involving both agents. Based on such a model, the concept of a benchmark profile (BMP) - a two-dimensional analog of the single-dose BMD at which both agents achieve the specified BMR - is defined for use in quantitative risk characterization and assessment. The resulting, joint, low-dose guidelines can improve public health planning and risk regulation when dealing with low-level exposures to combinations of hazardous agents.
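
    In the single-agent case the benchmark dose is defined through the extra-risk function, and the two-agent benchmark profile extends this to a curve of dose pairs. A schematic statement of the definitions discussed above (the notation is mine, not the authors') is:

        \[
        R_E(d) = \frac{R(d) - R(0)}{1 - R(0)}, \qquad \mathrm{BMD}: \; R_E(\mathrm{BMD}) = \mathrm{BMR},
        \]
        \[
        \mathrm{BMP} = \left\{ (d_1, d_2) : \frac{R(d_1, d_2) - R(0, 0)}{1 - R(0, 0)} = \mathrm{BMR} \right\},
        \]

    where R(d) (respectively R(d_1, d_2)) is the modelled probability of adverse response and BMR is the prespecified benchmark response.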

  16. Quantitative risk assessment of rabies entering Great Britain from North America via cats and dogs.

    PubMed

    Jones, Rowena D; Kelly, Louise; Fooks, Anthony R; Wooldridge, Marion

    2005-06-01

    Great Britain has been rabies-free since 1922, which is often considered to be in part due to the strict laws requiring that imported cats and dogs be vaccinated and quarantined for 6 months immediately on entry into the country. Except for two isolated incidents, this quarantine policy has contributed to ensuring that Great Britain has remained free of rabies. In 2000, amendments to the UK quarantine laws were made and the Pet Travel Scheme (PETS) was launched for companion animals traveling from European Union countries and rabies-free islands. Since its introduction, it has been proposed that other countries including North America should be included within the UK scheme. A quantitative risk assessment was developed to assist in the policy decision to amend the long-standing quarantine laws for dogs and cats from North America. It was determined that the risk of rabies entry is very low and is dependent on the level of compliance (i.e., legally conforming to all of the required regulations) with PETS and the number of pets imported. Assuming 100% compliance with PETS and the current level of importation of cats and dogs from North America, the annual probability of importing rabies is lower for animals traveling via PETS (7.22 × 10⁻⁶, 95th percentile) than quarantine (1.01 × 10⁻⁵, 95th percentile). These results, and other scientific evidence, directly informed the decision to expand the PETS scheme to North America as of December 2002.

  17. A Miniaturized Technique for Assessing Protein Thermodynamics and Function Using Fast Determination of Quantitative Cysteine Reactivity

    PubMed Central

    Isom, Daniel G.; Marguet, Philippe R.; Oas, Terrence G.; Hellinga, Homme W.

    2010-01-01

    Protein thermodynamic stability is a fundamental physical characteristic that determines biological function. Furthermore, alteration of thermodynamic stability by macromolecular interactions or biochemical modifications is a powerful tool for assessing the relationship between protein structure, stability, and biological function. High-throughput approaches for quantifying protein stability are beginning to emerge that enable thermodynamic measurements on small amounts of material, in short periods of time, and using readily accessible instrumentation. Here we present such a method, fast quantitative cysteine reactivity (fQCR), which exploits the linkage between protein stability, sidechain protection by protein structure, and structural dynamics to characterize the thermodynamic and kinetic properties of proteins. In this approach, the reaction of a protected cysteine and thiol-reactive fluorogenic indicator is monitored over a gradient of temperatures after a short incubation time. These labeling data can be used to determine the midpoint of thermal unfolding, measure the temperature dependence of protein stability, quantify ligand-binding affinity, and, under certain conditions, estimate folding rate constants. Here, we demonstrate the fQCR method by characterizing these thermodynamic and kinetic properties for variants of Staphylococcal nuclease and E. coli ribose-binding protein engineered to contain single, protected cysteines. These straightforward, information-rich experiments are likely to find applications in protein engineering and functional genomics. PMID:21387407

  18. A real-time, quantitative PCR protocol for assessing the relative parasitemia of Leucocytozoon in waterfowl

    USGS Publications Warehouse

    Smith, Matthew M.; Schmutz, Joel A.; Apelgren, Chloe; Ramey, Andy M.

    2015-01-01

    Microscopic examination of blood smears can be effective at diagnosing and quantifying hematozoa infections. However, this method requires highly trained observers, is time consuming, and may be inaccurate for detection of infections at low levels of parasitemia. To develop a molecular methodology for identifying and quantifying Leucocytozoon parasite infection in wild waterfowl (Anseriformes), we designed a real-time, quantitative PCR protocol to amplify Leucocytozoon mitochondrial DNA using TaqMan fluorogenic probes and validated our methodology using blood samples collected from waterfowl in interior Alaska during late summer and autumn (n = 105). By comparing our qPCR results to those derived from a widely used nested PCR protocol, we determined that our assay showed high levels of sensitivity (91%) and specificity (100%) in detecting Leucocytozoon DNA from host blood samples. Additionally, results of a linear regression revealed significant correlation between the raw measure of parasitemia produced by our qPCR assay (Ct values) and numbers of parasites observed on blood smears (R2 = 0.694, P = 0.003), indicating that our assay can reliably determine the relative parasitemia levels among samples. This methodology provides a powerful new tool for studies assessing effects of haemosporidian infection in wild avian species.
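
    A minimal sketch of the validation regression described above (qPCR cycle-threshold values against microscopy counts) is shown below; the arrays are placeholders, and the log-transform of the counts is an assumption here, not something stated in the abstract:

        import numpy as np
        from scipy.stats import linregress

        # Placeholder paired measurements: qPCR cycle-threshold (Ct) values and
        # parasites counted on the corresponding blood smears.
        ct = np.array([24.1, 26.3, 28.0, 29.5, 31.2, 33.0, 34.8, 36.1])
        smear_counts = np.array([410, 180, 95, 40, 22, 9, 4, 2])

        # Lower Ct means more template, so a negative slope is expected against
        # log10-transformed counts.
        fit = linregress(ct, np.log10(smear_counts))
        print(f"slope = {fit.slope:.3f}, R^2 = {fit.rvalue**2:.3f}, p = {fit.pvalue:.3g}")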

  19. Australia’s first national level quantitative environmental justice assessment of industrial air pollution

    NASA Astrophysics Data System (ADS)

    Chakraborty, Jayajit; Green, Donna

    2014-04-01

    This study presents the first national level quantitative environmental justice assessment of industrial air pollution in Australia. Specifically, our analysis links the spatial distribution of sites and emissions associated with industrial pollution sources derived from the National Pollution Inventory, to Indigenous status and social disadvantage characteristics of communities derived from Australian Bureau of Statistics indicators. Our results reveal a clear national pattern of environmental injustice based on the locations of industrial pollution sources, as well as volume, and toxicity of air pollution released at these locations. Communities with the highest number of polluting sites, emission volume, and toxicity-weighted air emissions indicate significantly greater proportions of Indigenous population and higher levels of socio-economic disadvantage. The quantities and toxicities of industrial air pollution are particularly higher in communities with the lowest levels of educational attainment and occupational status. These findings emphasize the need for more detailed analysis in specific regions and communities where socially disadvantaged groups are disproportionately impacted by industrial air pollution. Our empirical findings also underscore the growing necessity to incorporate environmental justice considerations in environmental planning and policy-making in Australia.

  20. Community Capacity for Watershed Conservation: A Quantitative Assessment of Indicators and Core Dimensions

    NASA Astrophysics Data System (ADS)

    Brinkman, Elliot; Seekamp, Erin; Davenport, Mae A.; Brehm, Joan M.

    2012-10-01

    Community capacity for watershed management has emerged as an important topic for the conservation of water resources. While much of the literature on community capacity has focused primarily on theory construction, there have been few efforts to quantitatively assess community capacity variables and constructs, particularly for watershed management and conservation. This study seeks to identify predictors of community capacity for watershed conservation in southwestern Illinois. A subwatershed-scale survey of residents from four communities located within the Lower Kaskaskia River watershed of southwestern Illinois was administered to measure three specific capacity variables: community empowerment, shared vision and collective action. Principal component analysis revealed key dimensions of each variable. Specifically, collective action was characterized by items relating to collaborative governance and social networks, community empowerment was characterized by items relating to community competency and a sense of responsibility and shared vision was characterized by items relating to perceptions of environmental threats, issues with development, environmental sense of place and quality of life. From the emerging factors, composite measures were calculated to determine the extent to which each variable contributed to community capacity. A stepwise regression revealed that community empowerment explained most of the variability in the composite measure of community capacity for watershed conservation. This study contributes to the theoretical understanding of community capacity by quantifying the role of collective action, community empowerment and shared vision in community capacity, highlighting the need for multilevel interaction to address watershed issues.

  1. Assessment of Quantitative and Allelic MGMT Methylation Patterns as a Prognostic Marker in Glioblastoma.

    PubMed

    Kristensen, Lasse S; Michaelsen, Signe R; Dyrbye, Henrik; Aslan, Derya; Grunnet, Kirsten; Christensen, Ib J; Poulsen, Hans S; Grønbæk, Kirsten; Broholm, Helle

    2016-03-01

    Methylation of the O⁶-methylguanine-DNA methyltransferase (MGMT) gene is a predictive and prognostic marker in newly diagnosed glioblastoma patients treated with temozolomide, but how MGMT methylation should be assessed to ensure optimal detection accuracy is debated. We developed a novel quantitative methylation-specific PCR (qMSP) MGMT assay capable of providing allelic methylation data and analyzed 151 glioblastomas from patients receiving standard-of-care treatment (Stupp protocol). The samples were also analyzed by immunohistochemistry (IHC) and standard bisulfite pyrosequencing, and genotyped for the rs1690252 MGMT promoter single nucleotide polymorphism. Monoallelic methylation was observed more frequently than biallelic methylation, and some cases with monoallelic methylation expressed the MGMT protein whereas others did not. The presence of MGMT methylation was associated with better overall survival (p = 0.006 for qMSP and p = 0.002 for standard pyrosequencing), and the presence of the protein was associated with worse overall survival (p = 0.009). Combined analyses of qMSP and standard pyrosequencing or IHC identified additional patients who benefited from temozolomide treatment. Finally, low methylation levels were also associated with better overall survival (p = 0.061 for qMSP and p = 0.02 for standard pyrosequencing). These data support the use of both MGMT methylation and MGMT IHC, but not allelic methylation data, as prognostic markers in patients with temozolomide-treated glioblastoma. PMID:26883115

  2. Quantitative assessment of chemical artefacts produced by propionylation of histones prior to mass spectrometry analysis.

    PubMed

    Soldi, Monica; Cuomo, Alessandro; Bonaldi, Tiziana

    2016-07-01

    Histone PTMs play a crucial role in regulating chromatin structure and function, with impact on gene expression. MS is nowadays widely applied to study histone PTMs systematically. Because histones are rich in arginine and lysine, classical shot-gun approaches based on trypsin digestion are typically not employed for histone modifications mapping. Instead, different protocols of chemical derivatization of lysines in combination with trypsin have been implemented to obtain "Arg-C like" digestion products that are more suitable for LC-MS/MS analysis. Although widespread, these strategies have been recently described to cause various side reactions that result in chemical modifications prone to be misinterpreted as native histone marks. These artefacts can also interfere with the quantification process, causing errors in histone PTMs profiling. The work of Paternoster V. et al. is a quantitative assessment of methyl-esterification and other side reactions occurring on histones after chemical derivatization of lysines with propionic anhydride [Proteomics 2016, 16, 2059-2063]. The authors estimate the effect of different solvents, incubation times, and pH on the extent of these side reactions. The results collected indicate that the replacement of methanol with isopropanol or ACN not only blocks methyl-esterification, but also significantly reduces other undesired unspecific reactions. Carefully titrating the pH after propionic anhydride addition is another way to keep methyl-esterification under control. Overall, the authors describe a set of experimental conditions that allow reducing the generation of various artefacts during histone propionylation. PMID:27373704

  3. Quantitative CT assessment of bone mineral density in dogs with hyperadrenocorticism

    PubMed Central

    Lee, Donghoon; Lee, Youngjae; Choi, Wooshin; Chang, Jinhwa; Kang, Ji-Houn; Na, Ki-Jeong

    2015-01-01

    Canine hyperadrenocorticism (HAC) is one of the most common causes of general osteopenia. In this study, quantitative computed tomography (QCT) was used to compare the bone mineral densities (BMD) of 39 normal dogs and 8 dogs with HAC (6 with pituitary-dependent hyperadrenocorticism [PDH] and 2 with adrenal-dependent hyperadrenocorticism [ADH]) diagnosed through hormonal assay. A computed tomography scan of the 12th thoracic to 7th lumbar vertebrae was performed, and a region of interest was drawn in the trabecular and cortical bone of each vertebra. Mean Hounsfield unit values were converted to equivalent BMD with a bone-density phantom by linear regression analysis. The converted mean trabecular BMDs in dogs with HAC were significantly lower than those of normal dogs, and ADH dogs also showed significantly lower cortical BMDs than normal dogs. Mean trabecular BMDs of dogs with PDH measured by QCT were significantly lower than those of normal dogs, and both mean trabecular and cortical BMDs in dogs with ADH were significantly lower than those of normal dogs. Taken together, these findings indicate that QCT is useful to assess BMD in dogs with HAC. PMID:26040613
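
    The HU-to-BMD conversion described above is, in general, a linear calibration against a phantom of known equivalent densities; a hedged sketch (the phantom densities and HU values below are invented) follows:

        import numpy as np

        # Calibration phantom: known equivalent densities (mg/cm^3) and the mean
        # Hounsfield units measured in each phantom insert (values are invented).
        phantom_density = np.array([0.0, 75.0, 150.0, 300.0])
        phantom_hu = np.array([2.0, 81.0, 157.0, 309.0])

        # Linear calibration HU -> BMD.
        slope, intercept = np.polyfit(phantom_hu, phantom_density, deg=1)

        def hu_to_bmd(mean_hu):
            """Convert a mean ROI Hounsfield unit value to equivalent BMD (mg/cm^3)."""
            return slope * mean_hu + intercept

        # e.g. a trabecular ROI with a mean of 220 HU:
        print(f"estimated BMD: {hu_to_bmd(220.0):.1f} mg/cm^3")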

  4. Bioaerosol Deposition to Food Crops near Manure Application: Quantitative Microbial Risk Assessment.

    PubMed

    Jahne, Michael A; Rogers, Shane W; Holsen, Thomas M; Grimberg, Stefan J; Ramler, Ivan P; Kim, Seungo

    2016-03-01

    Production of both livestock and food crops are central priorities of agriculture; however, food safety concerns arise where these practices intersect. In this study, we investigated the public health risks associated with potential bioaerosol deposition to crops grown in the vicinity of manure application sites. A field sampling campaign at dairy manure application sites supported the emission, transport, and deposition modeling of bioaerosols emitted from these lands following application activities. Results were coupled with a quantitative microbial risk assessment model to estimate the infection risk due to consumption of leafy green vegetable crops grown at various distances downwind from the application area. Inactivation of pathogens ( spp., spp., and O157:H7) on both the manure-amended field and on crops was considered to determine the maximum loading of pathogens to plants with time following application. Overall median one-time infection risks at the time of maximum loading decreased from 1:1300 at 0 m directly downwind from the field to 1:6700 at 100 m and 1:92,000 at 1000 m; peak risks (95th percentiles) were considerably greater (1:18, 1:89, and 1:1200, respectively). Median risk was below 1:10,000 at >160 m downwind. As such, it is recommended that a 160-m setback distance is provided between manure application and nearby leafy green crop production. Additional distance or delay before harvest will provide further protection of public health.

  5. A quantitative assessment of precipitation associated with the ITCZ in the CMIP5 GCM simulations

    NASA Astrophysics Data System (ADS)

    Stanfield, Ryan E.; Jiang, Jonathan H.; Dong, Xiquan; Xi, Baike; Su, Hui; Donner, Leo; Rotstayn, Leon; Wu, Tongwen; Cole, Jason; Shindo, Eiki

    2016-09-01

    According to the Intergovernmental Panel on Climate Change 5th Assessment Report, the broad-scale features of precipitation as simulated by Phase 5 of the Coupled Model Intercomparison Project (CMIP5) are in modest agreement with observations; however, large systematic errors are found in the Tropics. In this study, a new algorithm has been developed to define the North Pacific Intertropical Convergence Zone (ITCZ) through several metrics, including the centerline position of the ITCZ, the width of the ITCZ, and the magnitude of precipitation along the defined ITCZ. These metrics provide a quantitative analysis of precipitation associated with the ITCZ over the equatorial northern Pacific. Results from 29 CMIP5 Atmospheric Model Intercomparison Project (AMIP) Global Circulation Model (GCM) runs are compared with Global Precipitation Climatology Project (GPCP) and Tropical Rainfall Measuring Mission (TRMM) observations. Similarities and differences between the GCM simulations and observations are analyzed with the intent of quantifying magnitude-, location-, and width-based biases within the GCMs. Comparisons show that most of the GCMs tend to simulate a stronger, wider ITCZ shifted slightly northward compared to the ITCZ in GPCP and TRMM observations. Comparisons of CMIP and AMIP simulated precipitation using like models were found to be nearly equally distributed, with roughly half of the GCMs showing an increase (decrease) in precipitation when coupled to (decoupled from) their respective ocean model. Further study is warranted to understand these differences.
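
    The abstract does not spell out the algorithm, so the following is only an illustrative sketch of how centerline latitude, width, and magnitude might be extracted from a gridded precipitation field over the equatorial North Pacific; the half-maximum width criterion and the domain bounds are assumptions, not the paper's definitions:

        import numpy as np

        def itcz_metrics(precip, lats, width_frac=0.5):
            """Per-longitude ITCZ centerline latitude, width (deg), and magnitude.

            precip : 2-D array (lat, lon) of time-mean precipitation
            lats   : 1-D array of latitudes matching precip's first axis
            """
            center_lat, width, magnitude = [], [], []
            for j in range(precip.shape[1]):
                col = precip[:, j]
                i_max = int(np.argmax(col))
                center_lat.append(lats[i_max])
                magnitude.append(col[i_max])
                # Width: latitude span where precipitation exceeds a fraction of
                # the local maximum (an assumed criterion).
                above = lats[col >= width_frac * col[i_max]]
                width.append(above.max() - above.min())
            return np.array(center_lat), np.array(width), np.array(magnitude)

        # Hypothetical usage: restrict a (lat, lon) precipitation climatology to the
        # equatorial North Pacific domain before calling itcz_metrics(precip_np, lats_np).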

  6. Quantitative assessment of elemental carbon in the lungs of never smokers, cigarette smokers, and coal miners.

    PubMed

    Saxena, Rajiv K; McClure, Michael E; Hays, Michael D; Green, Francis H Y; McPhee, Laura J; Vallyathan, V; Gilmour, M Ian

    2011-01-01

    Inhalation exposure to particulates such as cigarette smoke and coal dust is known to contribute to the development of chronic lung disease. The purpose of this study was to estimate the amount of elemental carbon (EC) deposits from autopsied lung samples from cigarette smokers, miners, and control subjects and explore the relationship between EC level, exposure history, and the extent of chronic lung disease. The samples comprised three subgroups representing never smokers (8), chronic cigarette smokers (26), and coal miners (6). Following the dissolution of lung tissue, the extracted EC residue was quantified using a thermal-optical transmission (TOT) carbon analyzer. Mean EC levels in the lungs of the control group were 56.68 ± 24.86 (SD) μg/g dry lung weight. Respective mean EC values in lung samples from the smokers and coal miners were 449.56 ± 320.3 μg/g and 6678.2 ± 6162 μg/g. These values were significantly higher than those obtained from the never-smoker group. EC levels in the lung and pack-years of cigarette smoking correlated significantly, as did EC levels and the severity of small airway disease. This study provides one of the first quantitative assessments of EC in human lungs from populations at high relative risk for the development of chronic lung disease.

  7. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    SciTech Connect

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K.

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols, which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparison of the inhibitory capacity of alkylphenols with the inhibitory capacity of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption for alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data.

  8. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs and results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. This method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell counting method. Stained and processed images facilitate assessment of spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.

  9. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  10. Quantitative Muscle MRI as an Assessment Tool for Monitoring Disease Progression in LGMD2I: A Multicentre Longitudinal Study

    PubMed Central

    Coombs, Anna; Sveen, Marie-Louise; Andersen, Søren; Stojkovic, Tanya; Eagle, Michelle; Mayhew, Anna; de Sousa, Paulo L.; Dewar, Liz; Morrow, Jasper M.; Sinclair, Christopher D. J.; Thornton, John S.; Bushby, Kate; Lochmüller, Hanns; Hanna, Michael G.; Hogrel, Jean-Yves; Carlier, Pierre G.; Vissing, John; Straub, Volker

    2013-01-01

    Background: Outcome measures for clinical trials in neuromuscular diseases are typically based on physical assessments which are dependent on patient effort, combine the effort of different muscle groups, and may not be sensitive to progression over short trial periods in slow-progressing diseases. We hypothesised that quantitative fat imaging by MRI (Dixon technique) could provide more discriminating quantitative, patient-independent measurements of the progress of muscle fat replacement within individual muscle groups. Objective: To determine whether quantitative fat imaging could measure disease progression in a cohort of limb-girdle muscular dystrophy 2I (LGMD2I) patients over a 12 month period. Methods: 32 adult patients (17 male; 15 female) from 4 European tertiary referral centres with the homozygous c.826C>A mutation in the fukutin-related protein gene (FKRP) completed baseline and follow-up measurements 12 months later. Quantitative fat imaging was performed and muscle fat fraction change was compared with (i) muscle strength and function assessed using standardized physical tests and (ii) standard T1-weighted MRI graded on a 6-point scale. Results: There was a significant increase in muscle fat fraction in 9 of the 14 muscles analyzed using the quantitative MRI technique from baseline to 12-month follow-up. Changes were not seen in the conventional longitudinal physical assessments or in qualitative scoring of the T1w images. Conclusions: Quantitative muscle MRI, using the Dixon technique, could be used as an important longitudinal outcome measure to assess muscle pathology and monitor therapeutic efficacy in patients with LGMD2I. PMID:23967145
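
    For context, the muscle fat fraction obtained from the Dixon technique is computed voxel-wise from the separated fat and water signals; in its simplest (signal) form:

        \[
        \mathrm{FF} = \frac{S_F}{S_F + S_W},
        \]

    where S_F and S_W are the fat and water signal intensities in a voxel (or their means over a muscle region of interest).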

  11. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning and Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  12. The Moment of Learning: Quantitative Analysis of Exemplar Gameplay Supports CyGaMEs Approach to Embedded Assessment

    ERIC Educational Resources Information Center

    Reese, Debbie Denise; Tabachnick, Barbara G.

    2010-01-01

    In this paper, the authors summarize a quantitative analysis demonstrating that the CyGaMEs toolset for embedded assessment of learning within instructional games measures growth in conceptual knowledge by quantifying player behavior. CyGaMEs stands for Cyberlearning through GaME-based, Metaphor Enhanced Learning Objects. Some scientists of…

  13. 78 FR 9701 - Draft Joint Food and Drug Administration/Health Canada Quantitative Assessment of the Risk of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-11

    ... purpose of the draft QRA is to evaluate the effect of factors such as the microbiological status of milk... milk. II. Quantitative Risk Assessment The draft QRA (Refs. 3 to 6) provides a science-based analytical... made from raw milk, in its reevaluation of the existing 60-day aging requirements for cheeses made...

  14. OMICS DATA IN THE QUALITATIVE AND QUANTITATIVE CHARACTERIZATION OF THE MODE OF ACTION IN SUPPORT OF IRIS ASSESSMENTS

    EPA Science Inventory

    Knowledge and information generated using new tools/methods collectively called "Omics" technologies could have a profound effect on qualitative and quantitative characterizations of human health risk assessments.

    The suffix "Omics" is a descriptor used for a series of e...

  15. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.

  16. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.

  17. Elementary Writing Assessment Platforms: A Quantitative Examination of Online versus Offline Writing Performance of Fifth-Grade Students

    ERIC Educational Resources Information Center

    Heath, Vickie L.

    2013-01-01

    This quantitative study explored if significant differences exist between how fifth-grade students produce a written response to a narrative prompt using online versus offline writing platforms. The cultural and social trend of instructional and assessment writing paradigms in education is shifting to online writing platforms (National Assessment…

  18. ADVANCING EPA WETLAND SCIENCE: DEVELOPING TOOLS FOR QUANTITATIVE ASSESSMENT OF WETLAND FUNCTION AND CONDITION AT THE REGIONAL LEVEL

    EPA Science Inventory

    The EPA Office of Water has recognized a critical need for tribes, states and federal agencies to be able to quantitatively assess the condition of the nation's wetland resources. Currently, greater than 85% of states, tribes, and territories are lacking even rudimentary biologic...

  19. Humanlike Robots - Synthetically Mimicking Humans

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    2012-01-01

    Nature has inspired many inventions, and the field of technology based on mimicking or drawing inspiration from nature, widely known as biomimetics, is increasingly leading to many new capabilities. There are numerous examples of biomimetic successes, including the copying of fins for swimming and the inspiration drawn from insect and bird flight. More and more commercial implementations of biomimetics are appearing and behaving lifelike, and applications are emerging that are important to our daily life. Making humanlike robots is the ultimate challenge for biomimetics; for many years it was considered science fiction, but such robots are becoming an engineering reality. Advances in producing such robots are allowing them to perform impressive functions and tasks. Their development involves addressing many challenges and is raising concerns related to the implications of their application and to potential ethical issues. In this paper, the state of the art of humanlike robots, potential applications, and challenges are reviewed.

  1. Tracheobronchial Amyloidosis Mimicking Tracheal Tumor

    PubMed Central

    Özgül, Mehmet Akif; Uzun, Oğuz; Yaşar, Zehra; Acat, Murat; Arda, Naciye; Çetinkaya, Erdoğan

    2016-01-01

    Tracheobronchial amyloidosis is a rare presentation and accounts for about 1% of benign tumors in this area. The diagnosis of the disease is delayed due to nonspecific pulmonary symptoms. Therapeutic approaches are required to control progressive pulmonary symptoms in most patients. Herein, we report the case of a 68-year-old man admitted with progressive dyspnea to our institution for further evaluation and management. He was initially diagnosed with and managed for bronchial asthma for two years but had persistent symptoms despite optimal medical therapy. A pulmonary computed tomography scan revealed severe endotracheal stenosis. Bronchoscopy was performed and showed an endotracheal mass obstructing 70% of the distal trachea and mimicking a neoplastic lesion. The mass was successfully resected by mechanical resection, argon plasma coagulation (APC), and Nd-YAG laser during rigid bronchoscopy. Biopsy material showed deposits of amorphous material on hematoxylin and eosin staining, and these deposits were selectively stained with Congo red. Although this is a rare clinical condition, this case indicates that carrying out bronchoscopy in any patient with atypical bronchial symptoms or uncontrolled asthma is very important. PMID:27594885

  2. Tracheobronchial Amyloidosis Mimicking Tracheal Tumor.

    PubMed

    Tanrıverdi, Elif; Özgül, Mehmet Akif; Uzun, Oğuz; Gül, Şule; Çörtük, Mustafa; Yaşar, Zehra; Acat, Murat; Arda, Naciye; Çetinkaya, Erdoğan

    2016-01-01

    Tracheobronchial amyloidosis is a rare presentation and accounts for about 1% of benign tumors in this area. The diagnosis of the disease is delayed due to nonspecific pulmonary symptoms. Therapeutic approaches are required to control progressive pulmonary symptoms in most patients. Herein, we report the case of a 68-year-old man admitted with progressive dyspnea to our institution for further evaluation and management. He was initially diagnosed with and managed for bronchial asthma for two years but had persistent symptoms despite optimal medical therapy. A pulmonary computed tomography scan revealed severe endotracheal stenosis. Bronchoscopy was performed and showed an endotracheal mass obstructing 70% of the distal trachea and mimicking a neoplastic lesion. The mass was successfully resected by mechanical resection, argon plasma coagulation (APC), and Nd-YAG laser during rigid bronchoscopy. Biopsy material showed deposits of amorphous material on hematoxylin and eosin staining, and these deposits were selectively stained with Congo red. Although this is a rare clinical condition, this case indicates that carrying out bronchoscopy in any patient with atypical bronchial symptoms or uncontrolled asthma is very important. PMID:27594885

  3. Fibrosing mediastinitis mimicking bronchogenic carcinoma

    PubMed Central

    Bayiz, Hulya; Mutluay, Neslihan; Koyuncu, Adem; Demirag, Funda; Dagli, Gulfidan; Berktas, Bahadir; Berkoglu, Mine

    2013-01-01

    Fibrosing mediastinitis is a rare but benign disorder characterized by an excessive fibrotic reaction in the mediastinum, which can compromise the airways, great vessels, and other mediastinal structures. In this paper we present a patient with fibrosing mediastinitis mimicking bronchogenic carcinoma. The patient was a 32-year-old diabetic male admitted with cough and hemoptysis. Chest computed tomography showed a right hilar mass and multiple conglomerated mediastinal lymph nodes. Positron emission tomography with computed tomography (PET/CT) demonstrated increased fluorodeoxyglucose (FDG) uptake in the right hilar mass and the mediastinal lymph nodes. Fiberoptic bronchoscopy showed mucosal distortion of the right upper lobe. Pathologic examination of the mucosal biopsy revealed inflammation. Endobronchial ultrasound-guided transbronchial needle and cervical mediastinoscopic lymph node biopsies were nondiagnostic. Diagnostic thoracotomy confirmed the diagnosis of fibrosing mediastinitis. Six months of systemic corticosteroid and antituberculous therapy were not beneficial. In conclusion, despite being a rare clinical entity, fibrosing mediastinitis should be kept in mind in the differential diagnosis of mediastinal mass lesions of unknown etiology. The diagnosis is exceptionally difficult in the presence of atypical radiological findings, and the treatment is particularly challenging without any proven effective therapy. PMID:23372962

  4. Quantitative assessment of direct and indirect landslide risk along transportation lines in southern India

    NASA Astrophysics Data System (ADS)

    Jaiswal, P.; van Westen, C. J.; Jetten, V.

    2010-06-01

    A quantitative approach for landslide risk assessment along transportation lines is presented and applied to a road and a railway alignment in the Nilgiri hills in southern India. The method allows estimation of the direct risk affecting the alignments, vehicles and people, and of the indirect risk resulting from the disruption of economic activities. The data required for the risk estimation were obtained from historical records. A total of 901 landslides initiating from cut slopes along the railway and road alignments were catalogued. The landslides were grouped into three magnitude classes based on landslide type, volume, scar depth, run-out distance, etc., and their probability of occurrence was obtained using a frequency-volume distribution. Hazard for a given return period, expressed as the number of landslides of a given magnitude class per kilometre of cut slope, was obtained using a Gumbel distribution and the probability of landslide magnitude. In total, 18 specific hazard scenarios were generated using the three magnitude classes and six return periods (1, 3, 5, 15, 25, and 50 years). The vulnerability of the road and railway line was assessed from damage records, whereas the vulnerability of different types of vehicles and of people was assessed subjectively from a limited number of historical incidents. Direct specific loss for the alignments (railway line and road) and vehicles (train, bus, lorry, car and motorbike) was expressed in monetary value (US$), and direct specific loss of life of commuters was expressed as an annual probability of death. Indirect specific loss (US$) derived from traffic interruption was evaluated considering alternative driving routes, and includes losses resulting from additional fuel consumption, additional travel cost, loss of income to local businesses, and loss of revenue to the railway department. The results indicate that the total loss, including both direct and indirect loss, for return periods from 1 to 50 years varies from US$ 90 840 to US$
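
    The hazard-to-loss chain described in this abstract (a Gumbel model for the number of cut-slope landslides per return period, weighted by magnitude-class probabilities and vulnerabilities) can be illustrated with a short calculation. The sketch below is not the authors' code; the annual counts, class probabilities, vulnerabilities and road value are invented placeholders.

```python
# Illustrative sketch of the hazard/risk chain: a Gumbel model for the annual number
# of cut-slope landslides, combined with magnitude-class probabilities and
# vulnerabilities to give a direct specific loss per kilometre of alignment.
import numpy as np
from scipy import stats

# Hypothetical annual landslide counts per km of cut slope (from a historical catalogue)
annual_counts = np.array([3, 5, 2, 8, 4, 6, 1, 7, 5, 3, 9, 4])
loc, scale = stats.gumbel_r.fit(annual_counts)           # fit a Gumbel distribution

return_periods = [3, 5, 15, 25, 50]                      # years (the 1-yr case needs separate handling)
p_magnitude = {"M1": 0.70, "M2": 0.25, "M3": 0.05}        # assumed frequency-volume probabilities
vulnerability = {"M1": 0.05, "M2": 0.30, "M3": 0.80}      # assumed damage fraction per event
element_value = 50_000                                    # assumed value of 1 km of road (US$)

for T in return_periods:
    n_T = stats.gumbel_r.ppf(1 - 1.0 / T, loc, scale)     # landslides/km not exceeded once in T years
    for m, pm in p_magnitude.items():
        hazard = n_T * pm                                 # class-m landslides per km for return period T
        direct_loss = hazard * vulnerability[m] * element_value
        print(f"T={T:2d} yr, class {m}: hazard={hazard:5.2f}/km, loss=US$ {direct_loss:10.2f}")
```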

  5. Hydraulic fracturing in unconventional reservoirs - Identification of hazards and strategies for a quantitative risk assessment

    NASA Astrophysics Data System (ADS)

    Helmig, R.; Kissinger, A.; Class, H.; Ebigbo, A.

    2012-12-01

    fractured reservoir, fracture propagation, fault zones and their role with regard to fluid migration into shallow aquifers). A quantitative risk assessment, which should be the main aim of future work in this field, has much higher demands, especially on site-specific data, as the estimation of statistical parameter uncertainty requires site-specific parameter distributions. There is already ongoing research on risk assessment in related fields such as CO2 sequestration. We therefore propose that these methodologies be transferred to risk estimation relating to the use of hydraulic fracturing, be it for unconventional gas or enhanced geothermal energy production. The overall aim should be to set common and transparent standards for the different uses of the subsurface and their associated risks, and to communicate these to policy makers and stakeholders.

  6. Evaluation of a quantitative clinical method for assessment of sensory skin irritation.

    PubMed

    Robinson, M K; Perkins, M A

    2001-10-01

    Sensory skin irritation refers to the myriad of symptomatic complaints (e.g., sting and burn) frequently associated with inflammatory skin conditions or skin intolerance to various chemicals or finished products. Sensory irritation is an important factor in consumer acceptance of the products that they buy and use; however, from a safety testing and risk assessment standpoint, it has been difficult to evaluate. Recently, methods have been developed to more quantitatively assess sensory irritation using a semantically-labeled scale of sensation intensity, the labeled magnitude (LM) scale. Using this device, studies were conducted to determine if test subjects' perceptions of recalled or imagined sensory responses (from a series of survey questions) were related to their actual sensory reactivity to chemical challenge. Subjects were presented with 15 skin sensation scenarios of varying intensities and asked to record their self-perceived recalled or imagined responses using the LM scale. Individual and mean responses to each of the 15 survey questions were compared within and across studies. Considerable variation was seen between subjects' responses to the questions, particularly for questions pertaining to stronger stimuli (e.g., scalding water or skin lacerations). There was also little consistency seen in the pattern of individual responses across the questions. However, among 4 different study populations, the group mean scores for each of the 15 survey questions showed a high degree of consistency. Also, in spite of the variability in perceived responses to the recalled/imagined skin sensations, statistically significant dose-response and time-response patterns were observed in chemical (lactic acid and capsaicin) challenge studies. In one capsaicin study, a direct relationship was observed, among 83% of the study subjects, between the mean recall intensity scores and actual responses to subsequent capsaicin challenge. This pattern was not seen in a lactic acid

  7. Mammary analogue secretory carcinoma mimicking salivary adenoma.

    PubMed

    Williams, Lindsay; Chiosea, Simion I

    2013-12-01

    Mammary analogue secretory carcinoma (MASC) is a recently described salivary gland tumor characterized by ETV6 translocation. It appears that prior studies have identified MASC by reviewing salivary gland carcinomas, such as acinic cell carcinoma and adenocarcinoma, not otherwise specified. To address the possibility of MASC mimicking benign salivary neoplasms we reviewed 12 salivary gland (cyst)adenomas diagnosed prior to the discovery of MASC. One encapsulated (cyst)adenoma of the parotid gland demonstrated features of MASC. The diagnosis was confirmed by fluorescence in situ hybridization with an ETV6 break-apart probe. An unusual complex pattern of ETV6 rearrangement with duplication of the telomeric/distal ETV6 probe was identified. This case illustrates that MASC may mimic salivary (cyst)adenomas. To more accurately assess true clinical and morphologic spectrum of MASC, future studies may have to include review of salivary (cyst)adenomas. The differential diagnosis of MASC may have to be expanded to include cases resembling salivary (cyst)adenomas.

  8. Assessing the risk of impact of farming intensification on calcareous grasslands in Europe: a quantitative implementation of the MIRABEL framework.

    PubMed

    Petit, Sandrine; Elbersen, Berien

    2006-09-01

    Intensification of farming practices is still a major driver of biodiversity loss in Europe, despite the implementation of policies that aim to reverse this trend. A conceptual framework called MIRABEL was previously developed that enabled a qualitative and expert-based assessment of the impact of agricultural intensification on ecologically valuable habitats. We present a quantitative update of the previous assessment that uses newly available pan-European spatially explicit data on pressures and habitats at risk. This quantitative assessment shows that the number of calcareous grasslands potentially at risk of eutrophication and overgrazing is rapidly increasing in Europe. Decreases in nitrogen surpluses and stocking densities that occurred between 1990 and 2000 have rarely led to values that were below the ecological thresholds. At the same time, a substantial proportion of calcareous grassland that has so far experienced low values for indicators of farming intensification has faced increases between 1990 and 2000 and could well become at high risk from farming intensification in the near future. As such, this assessment is an early warning signal, especially for habitats located in areas that have traditionally been farmed extensively. When comparing the outcome of this assessment with the previous qualitative MIRABEL assessment, it appears that if pan-European data are useful to assess the intensity of the pressures, more work is needed to identify regional variations in the response of biodiversity to such pressures. This is where a qualitative approach based on regional expertise should be used to complement data-driven assessments. PMID:17240762

  9. Quantitative assessment of ischemia and reactive hyperemia of the dermal layers using multi - spectral imaging on the human arm

    NASA Astrophysics Data System (ADS)

    Kainerstorfer, Jana M.; Amyot, Franck; Demos, Stavros G.; Hassan, Moinuddin; Chernomordik, Victor; Hitzenberger, Christoph K.; Gandjbakhche, Amir H.; Riley, Jason D.

    2009-07-01

    Non-invasive quantitative assessment of skin chromophores is often desirable. In particular, pixel-wise assessment of blood volume and blood oxygenation is beneficial for improved diagnostics. We utilized a multi-spectral imaging system to acquire diffuse reflectance images of healthy volunteers' lower forearms. Ischemia and reactive hyperemia were induced by occluding the upper arm with a pressure cuff at 180 mmHg for 5 min. Multi-spectral images were taken every 30 s before, during and after occlusion. Image reconstruction of blood volume and blood oxygenation was performed using a two-layered skin model. As the images were taken in a non-contact way, strong artifacts related to the shape (curvature) of the arm were observed, making reconstruction of optical/physiological parameters highly inaccurate. We developed a curvature correction method that extracts the curvature directly from the acquired intensity images and does not require any additional measurements on the imaged object. The effectiveness of the algorithm was demonstrated on reconstructions of blood volume and blood oxygenation from in vivo data during occlusion of the arm. Pixel-wise assessment of blood volume and blood oxygenation was made possible over the entire image area, and occlusion effects in veins and the surrounding skin were compared. The induced ischemia during occlusion and the reactive hyperemia afterwards were observed and quantitatively assessed. Furthermore, the influence of epidermal thickness on the reconstruction results was evaluated, and the need for exact knowledge of this parameter for fully quantitative assessment was pointed out.

  10. Quantitative assessment of faecal shedding of β-lactam-resistant Escherichia coli and enterococci in dogs.

    PubMed

    Espinosa-Gongora, Carmen; Shah, Syed Qaswar Ali; Jessen, Lisbeth Rem; Bortolaia, Valeria; Langebæk, Rikke; Bjørnvad, Charlotte Reinhard; Guardabassi, Luca

    2015-12-31

    Quantitative data on faecal shedding of antimicrobial resistant bacteria are crucial to assess the risk of transmission from dogs to other animals as well as humans. In this study we investigated prevalence and concentrations of β-lactam-resistant Escherichia coli and enterococci in the faeces of 108 dogs presenting at a veterinary hospital in Denmark. The dogs had not been treated with antimicrobials for 4 weeks prior to the study. Total E. coli and enterococci were quantified by counts on MacConkey and Slanetz-Bartley, respectively. Resistant E. coli and enterococci were counted on the same media containing relevant antibiotic concentrations, followed by species identification using MALDI-TOF. Ampicillin- and cefotaxime-resistant E. coli were detected in 40% and 8% of the dogs, respectively, whereas approximately 15% carried ampicillin-resistant enterococci, mainly Enterococcus faecium. In the faeces of the carriers, the proportion of resistant strains in the total bacterial species population was on average 15% for both ampicillin-resistant E. coli (median faecal load 3.2×10(4)cfu/g) and E. faecium (5.8×10(2) cfu/g), and 4.6% for cefotaxime-resistant E. coli (8.6×10(3) cfu/g). Cefotaxime resistance was associated with the presence of blaCTX-M-1 (n=4), blaCMY-2 (n=4) or multiple mutations in the promoter and coding region of chromosomal ampC (n=1). Altogether the results indicate that the risks of zoonotic transmission of β-lactam-resistant bacteria via human exposure to canine faeces greatly vary amongst individual dogs and are influenced by unidentified factors other than recent antimicrobial use.

  11. Characterizing trabecular bone structure for assessing vertebral fracture risk on volumetric quantitative computed tomography

    NASA Astrophysics Data System (ADS)

    Nagarajan, Mahesh B.; Checefsky, Walter A.; Abidin, Anas Z.; Tsai, Halley; Wang, Xixi; Hobbs, Susan K.; Bauer, Jan S.; Baum, Thomas; Wismüller, Axel

    2015-03-01

    While the proximal femur is preferred for measuring bone mineral density (BMD) in fracture risk estimation, the introduction of volumetric quantitative computed tomography has revealed stronger associations between BMD and spinal fracture status. In this study, we propose to capture properties of trabecular bone structure in spinal vertebrae with advanced second-order statistical features for purposes of fracture risk assessment. For this purpose, axial multi-detector CT (MDCT) images were acquired from 28 spinal vertebrae specimens using a whole-body 256-row CT scanner with a dedicated calibration phantom. A semi-automated method was used to annotate the trabecular compartment in the central vertebral slice with a circular region of interest (ROI) to exclude cortical bone; pixels within were converted to values indicative of BMD. Six second-order statistical features derived from gray-level co-occurrence matrices (GLCM) and the mean BMD within the ROI were then extracted and used in conjunction with a generalized radial basis functions (GRBF) neural network to predict the failure load of the specimens; true failure load was measured through biomechanical testing. Prediction performance was evaluated with a root-mean-square error (RMSE) metric. The best prediction performance was observed with the GLCM feature 'correlation' (RMSE = 1.02 ± 0.18), which significantly outperformed all other GLCM features (p < 0.01). GLCM feature correlation also significantly outperformed MDCT-measured mean BMD (RMSE = 1.11 ± 0.17) (p < 10(-4)). These results suggest that biomechanical strength prediction in spinal vertebrae can be significantly improved through characterization of trabecular bone structure with GLCM-derived texture features.
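
    As a rough illustration of the texture-feature step described above, the following sketch computes the GLCM 'correlation' feature from a BMD-calibrated region of interest. It assumes scikit-image 0.19 or later (where the functions are named graycomatrix/graycoprops); the quantization level, pixel offsets and the synthetic ROI are arbitrary choices, not the study's settings.

```python
# Minimal sketch: quantize a BMD map and extract the GLCM 'correlation' texture feature.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_correlation(bmd_roi, levels=32):
    """bmd_roi: 2-D array of BMD values inside the trabecular ROI (e.g. mg/cm^3)."""
    # GLCMs require integer images, so quantize the BMD map to 'levels' gray levels.
    lo, hi = float(np.min(bmd_roi)), float(np.max(bmd_roi))
    quantized = np.round((bmd_roi - lo) / (hi - lo) * (levels - 1)).astype(np.uint8)
    # Co-occurrence matrix at a 1-pixel offset in 4 directions, averaged afterwards.
    glcm = graycomatrix(quantized, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    return graycoprops(glcm, "correlation").mean()

# Hypothetical usage with a synthetic ROI
rng = np.random.default_rng(0)
roi = rng.normal(150.0, 30.0, size=(64, 64))   # fake BMD values
print("GLCM correlation:", glcm_correlation(roi))
```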

  12. Assessment of Suitable Reference Genes for Quantitative Gene Expression Studies in Melon Fruits.

    PubMed

    Kong, Qiusheng; Gao, Lingyun; Cao, Lei; Liu, Yue; Saba, Hameed; Huang, Yuan; Bie, Zhilong

    2016-01-01

    Melon (Cucumis melo L.) is an attractive model plant for investigating fruit development because of its morphological, physiological, and biochemical diversity. Quantification of gene expression by quantitative reverse transcription polymerase chain reaction (qRT-PCR) with stably expressed reference genes for normalization can effectively elucidate the biological functions of genes that regulate fruit development. However, the reference genes for data normalization in melon fruits have not yet been systematically validated. This study aims to assess the suitability of 20 genes for their potential use as reference genes in melon fruits. Expression variations of these genes were measured in 24 samples that represented different developmental stages of fertilized and parthenocarpic melon fruits by qRT-PCR analysis. GeNorm identified ribosomal protein L (CmRPL) and cytosolic ribosomal protein S15 (CmRPS15) as the best pair of reference genes, and as many as five genes including CmRPL, CmRPS15, TIP41-like family protein (CmTIP41), cyclophilin ROC7 (CmCYP7), and ADP ribosylation factor 1 (CmADP) were required for more reliable normalization. NormFinder ranked CmRPS15 as the best single reference gene, and RAN GTPase gene family (CmRAN) and TATA-box binding protein (CmTBP2) as the best combination of reference genes in melon fruits. Their effectiveness was further validated by parallel analyses on the activities of soluble acid invertase and sucrose phosphate synthase, and expression profiles of their respective encoding genes CmAIN2 and CmSPS1, as well as sucrose contents during melon fruit ripening. The validated reference genes will help to improve the accuracy of gene expression studies in melon fruits.
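
    The geNorm ranking referred to above is based on an expression-stability measure M that can be reproduced in a few lines: for each gene, M is the mean standard deviation of the log2 expression ratios against every other candidate across all samples, and lower M means a more stable reference. The sketch below is an illustrative reimplementation, not the geNorm or NormFinder software; the sample data and the subset of candidate genes are invented.

```python
# Illustrative geNorm-style stability measure M for candidate reference genes.
import numpy as np

def genorm_m(expression, gene_names):
    """expression: samples x genes array of relative quantities (e.g. 2^-dCq)."""
    log_expr = np.log2(expression)
    n_genes = expression.shape[1]
    m_values = {}
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m_values[gene_names[j]] = float(np.mean(sds))
    return dict(sorted(m_values.items(), key=lambda kv: kv[1]))  # most stable first

# Hypothetical data: 24 fruit samples x 4 candidate genes
rng = np.random.default_rng(1)
genes = ["CmRPL", "CmRPS15", "CmTIP41", "CmADP"]
data = 2.0 ** rng.normal(0.0, [0.2, 0.25, 0.4, 0.6], size=(24, 4))
print(genorm_m(data, genes))   # lower M = more stably expressed
```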

  13. Quantitative hazard assessment at Vulcano (Aeolian islands): integration of geology, event statistics and physical modelling

    NASA Astrophysics Data System (ADS)

    Dellino, Pierfrancesco; de Astis, Gianfilippo; La Volpe, Luigi; Mele, Daniela; Sulpizio, Roberto

    2010-05-01

    The analysis of stratigraphy and of the particle features of pyroclastic deposits allowed the reconstruction of the volcanic history of La Fossa di Vulcano. An eruptive scenario driven by superficial phreatomagmatic explosions emerged. A statistical analysis of the pyroclastic successions led to the definition of a repetitive sequence of dilute pyroclastic density currents as the most probable short-term events, followed by fallout of dense ballistic blocks. The scale of such events is related to the amount of magma involved in each explosion. Events involving a million cubic meters of magma are probable in view of what happened in the most recent eruptions. They led to the formation of dilute pyroclastic density currents hundreds of meters thick, moving down the volcano slope at velocities exceeding 50 m/sec. The dispersion of the density currents affected the whole Vulcano Porto area and the Vulcanello area, and also overrode the Fossa Caldera's rim, spreading over the Piano area. Similarly, older pyroclastic deposits were erupted at different times (Piano Grotte dei Rossi formation, ~20-7.7 ka) from vents within La Fossa Caldera, before the formation of La Fossa Cone. They also were phreatomagmatic in origin and fed dilute pyroclastic density currents (PDCs). They represent the eruptions with the highest magnitude on the island. Therefore, for the aim of hazard assessment, these deposits from La Fossa Cone and La Fossa Caldera were used to depict short-term and long-term eruptive scenarios. On the basis of physical models that make use of the particle features of the pyroclastic deposits, the impact parameters for each scenario have been calculated: the dynamic pressure and particle volumetric concentration of the density currents, and the impact energy of the ballistic blocks. On this basis, a quantitative hazard map is presented, which could be of direct use for territorial planning and for the calculation of the expected damage.
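
    The impact parameters named at the end of the abstract lend themselves to a compact numerical illustration. The sketch below uses textbook relations for the dynamic pressure of a dilute gas-particle mixture and the kinetic energy of a ballistic block; all densities, concentrations, velocities and masses are assumed values, not those derived for Vulcano.

```python
# Back-of-the-envelope impact parameters for a dilute pyroclastic density current (PDC)
# and a ballistic block. All numerical values are assumptions for illustration only.

def pdc_dynamic_pressure(velocity, particle_conc, rho_particle=2500.0, rho_gas=0.6):
    """Dynamic pressure (Pa) of a gas-particle mixture: 0.5 * rho_mix * v^2."""
    rho_mix = particle_conc * rho_particle + (1.0 - particle_conc) * rho_gas
    return 0.5 * rho_mix * velocity ** 2

def ballistic_impact_energy(mass, velocity):
    """Kinetic energy (J) of a ballistic block at impact."""
    return 0.5 * mass * velocity ** 2

# Values in the range suggested by the abstract: v > 50 m/s, dilute current (C ~ 0.1-1 vol%)
print(pdc_dynamic_pressure(velocity=50.0, particle_conc=0.005), "Pa")
print(ballistic_impact_energy(mass=200.0, velocity=80.0), "J")
```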

  14. Quantitative microbial risk assessment of antibacterial hand hygiene products on risk of shigellosis.

    PubMed

    Schaffner, Donald W; Bowman, James P; English, Donald J; Fischler, George E; Fuls, Janice L; Krowka, John F; Kruszewski, Francis H

    2014-04-01

    There are conflicting reports on whether antibacterial hand hygiene products are more effective than nonantibacterial products in reducing bacteria on hands and preventing disease. This research used new laboratory data, together with simulation techniques, to compare the ability of nonantibacterial and antibacterial products to reduce shigellosis risk. One hundred sixty-three subjects were used to compare five different hand treatments: two nonantibacterial products and three antibacterial products, i.e., 0.46% triclosan, 4% chlorhexidine gluconate, or 62% ethyl alcohol. Hands were inoculated with 5.5 to 6 log CFU Shigella; the simulated food handlers then washed their hands with one of the five products before handling melon balls. Each simulation scenario represented an event in which 100 people would be exposed to Shigella from melon balls that had been handled by food workers with Shigella on their hands. Analysis of the experimental data showed that the two nonantibacterial treatments produced about a 2-log reduction on hands. The three antibacterial treatments showed log reductions greater than 3 but less than 4 on hands. All three antibacterial treatments resulted in statistically significantly lower concentrations on the melon balls relative to the nonantibacterial treatments. A simulation that assumed 1 million Shigella bacteria on the hands and the use of a nonantibacterial treatment predicted that 50 to 60 cases of shigellosis would result (of 100 exposed). Each of the antibacterial treatments was predicted to result in an appreciable number of simulations for which the number of illness cases would be 0, with the most common number of illness cases being 5 (of 100 exposed). These effects maintained statistical significance from 10(6) Shigella per hand down to as low as 100 Shigella per hand, with some evidence to support lower levels. This quantitative microbial risk assessment shows that antibacterial hand treatments can significantly reduce Shigella risk
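
    A simplified version of the exposure simulation described above can be written as a short Monte Carlo loop: apply a treatment-specific log reduction to the hand load, transfer part of the survivors to the food, and run the resulting doses through a beta-Poisson dose-response model. The sketch below is illustrative only; the transfer fraction, dose-response parameters and variability assumptions are placeholders, not the study's fitted values.

```python
# Illustrative Monte Carlo comparison of two hand treatments for Shigella exposure.
import numpy as np

rng = np.random.default_rng(42)

def simulate_cases(n_sim=10_000, hand_load=1e6, log_reduction_mean=2.0,
                   transfer_fraction=0.1, alpha=0.21, beta=42.9, n_exposed=100):
    cases = np.empty(n_sim)
    for i in range(n_sim):
        lr = rng.normal(log_reduction_mean, 0.3)            # variable treatment efficacy
        on_hands = hand_load * 10.0 ** (-max(lr, 0.0))       # survivors after hand washing
        dose = on_hands * transfer_fraction / n_exposed      # CFU ingested per exposed person
        p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)        # beta-Poisson dose-response
        cases[i] = rng.binomial(n_exposed, p_ill)            # illness cases among 100 exposed
    return cases

nonantibacterial = simulate_cases(log_reduction_mean=2.0)
antibacterial = simulate_cases(log_reduction_mean=3.5)
print("median cases, non-antibacterial:", np.median(nonantibacterial))
print("median cases, antibacterial:    ", np.median(antibacterial))
```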

  15. Assessment of Suitable Reference Genes for Quantitative Gene Expression Studies in Melon Fruits

    PubMed Central

    Kong, Qiusheng; Gao, Lingyun; Cao, Lei; Liu, Yue; Saba, Hameed; Huang, Yuan; Bie, Zhilong

    2016-01-01

    Melon (Cucumis melo L.) is an attractive model plant for investigating fruit development because of its morphological, physiological, and biochemical diversity. Quantification of gene expression by quantitative reverse transcription polymerase chain reaction (qRT-PCR) with stably expressed reference genes for normalization can effectively elucidate the biological functions of genes that regulate fruit development. However, the reference genes for data normalization in melon fruits have not yet been systematically validated. This study aims to assess the suitability of 20 genes for their potential use as reference genes in melon fruits. Expression variations of these genes were measured in 24 samples that represented different developmental stages of fertilized and parthenocarpic melon fruits by qRT-PCR analysis. GeNorm identified ribosomal protein L (CmRPL) and cytosolic ribosomal protein S15 (CmRPS15) as the best pair of reference genes, and as many as five genes including CmRPL, CmRPS15, TIP41-like family protein (CmTIP41), cyclophilin ROC7 (CmCYP7), and ADP ribosylation factor 1 (CmADP) were required for more reliable normalization. NormFinder ranked CmRPS15 as the best single reference gene, and RAN GTPase gene family (CmRAN) and TATA-box binding protein (CmTBP2) as the best combination of reference genes in melon fruits. Their effectiveness was further validated by parallel analyses on the activities of soluble acid invertase and sucrose phosphate synthase, and expression profiles of their respective encoding genes CmAIN2 and CmSPS1, as well as sucrose contents during melon fruit ripening. The validated reference genes will help to improve the accuracy of gene expression studies in melon fruits. PMID:27536316

  16. Evaluation of quantitative PCR combined with PMA treatment for molecular assessment of microbial water quality.

    PubMed

    Gensberger, Eva Theres; Polt, Marlies; Konrad-Köszler, Marianne; Kinner, Paul; Sessitsch, Angela; Kostić, Tanja

    2014-12-15

    Microbial water quality assessment currently relies on cultivation-based methods. Nucleic acid-based techniques such as quantitative PCR (qPCR) enable more rapid and specific detection of target organisms, and propidium monoazide (PMA) treatment facilitates the exclusion of false-positive results caused by DNA from dead cells. Established molecular assays (qPCR and PMA-qPCR) for the legally defined microbial quality parameters (Escherichia coli, Enterococcus spp. and Pseudomonas aeruginosa) and for the indicator organism group of coliforms (implemented as molecular detection of Enterobacteriaceae) were evaluated against conventional microbiological methods. The evaluation of an extended set of drinking and process water samples showed that PMA-qPCR for E. coli, Enterococcus spp. and P. aeruginosa resulted in higher specificity, because a substantial or complete reduction of false-positive signals in comparison to qPCR was obtained. Complete compliance with the reference method was achieved for E. coli PMA-qPCR, and 100% specificity for Enterococcus spp. and P. aeruginosa in the evaluation of process water samples. A major challenge remained the sensitivity of the assays, reflected in false-negative results (7-23%), which is presumably due to insufficient sample preparation (i.e., concentration of bacteria and DNA extraction) rather than to the qPCR limit of detection. For the detection of the indicator group of coliforms, the evaluation study revealed that the use of alternative molecular assays based on the taxonomic group of Enterobacteriaceae was not adequate. Given careful optimization of the sensitivity, the highly specific PMA-qPCR could be a valuable tool for rapid detection of hygienic parameters such as E. coli, Enterococcus spp. and P. aeruginosa.
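
    The specificity and compliance figures quoted above boil down to a comparison of presence/absence calls against the cultivation-based reference method. A minimal sketch of that comparison is given below; the sample results are invented for illustration.

```python
# Sketch: sensitivity and specificity of a molecular assay against a culture reference.

def diagnostic_agreement(reference, molecular):
    """Both inputs are lists of booleans: True = target organism detected."""
    tp = sum(r and m for r, m in zip(reference, molecular))
    tn = sum((not r) and (not m) for r, m in zip(reference, molecular))
    fp = sum((not r) and m for r, m in zip(reference, molecular))
    fn = sum(r and (not m) for r, m in zip(reference, molecular))
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity, (fp, fn)

# Invented presence/absence calls for ten water samples
culture  = [True, True, False, False, False, True, False, False, True, False]
pma_qpcr = [True, False, False, False, False, True, False, False, True, False]
sens, spec, errors = diagnostic_agreement(culture, pma_qpcr)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, (FP, FN)={errors}")
```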

  17. Contouring Variability of the Penile Bulb on CT Images: Quantitative Assessment Using a Generalized Concordance Index

    SciTech Connect

    Carillo, Viviana; Cozzarini, Cesare; Perna, Lucia; Calandra, Mauro; Gianolini, Stefano; Rancati, Tiziana; Spinelli, Antonello Enrico; Vavassori, Vittorio; Villa, Sergio; Valdagni, Riccardo; Fiorino, Claudio

    2012-11-01

    Purpose: Within a multicenter study (DUE-01) focused on the search for predictors of erectile dysfunction and urinary toxicity after radiotherapy for prostate cancer, a dummy-run exercise on penile bulb (PB) contouring on computed tomography (CT) images was carried out. The aim of this study was to quantitatively assess interobserver contouring variability by applying the generalized DICE index. Methods and Materials: Fifteen physicians from different institutes drew the PB on CT images of 10 patients. The spread of DICE values was used to objectively select those observers who significantly disagreed with the others. The analyses were performed with a dedicated module in the VODCA software package. Results: DICE values were found to change significantly among observers and patients. The mean DICE value was 0.67, ranging between 0.43 and 0.80. The statistics of the DICE coefficients identified 4 of 15 observers who systematically showed a value below the average (p-value range, 0.013-0.059): mean DICE values were 0.62 for the 4 'bad' observers compared to 0.69 for the 11 'good' observers. For all bad observers, the main cause of the disagreement was identified. Average DICE values were significantly worse than the average in 2 of 10 patients (0.60 vs. 0.70, p < 0.05) because of the limited visibility of the PB. Excluding the 'bad' observers and the 'bad' patients, the mean DICE value increased from 0.67 to 0.70; interobserver variability, expressed in terms of the standard deviation of the DICE spread, was also reduced. Conclusions: The obtained DICE values of around 0.7 show acceptable agreement, considering the small dimensions of the PB. Additional strategies to improve this agreement are under consideration and include an additional tutorial for the so-called bad observers with a recontouring procedure, or the recontouring of the PB for all patients included in the DUE-01 study by a single observer.
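
    The DICE overlap underlying these results is straightforward to compute from binary contour masks, and a mean over all observer pairs gives a simple stand-in for the generalized index used in the study. The sketch below uses toy masks; it is not the VODCA implementation.

```python
# Sketch of pairwise Sørensen-Dice overlap between observers' binary contour masks.
import numpy as np
from itertools import combinations

def dice(mask_a, mask_b):
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def mean_pairwise_dice(masks):
    """masks: list of binary arrays (one per observer), all the same shape."""
    scores = [dice(m1, m2) for m1, m2 in combinations(masks, 2)]
    return float(np.mean(scores))

# Toy penile-bulb contours from three "observers" on a 20x20 CT slice
base = np.zeros((20, 20), dtype=bool)
base[8:14, 8:14] = True
obs2 = np.roll(base, 1, axis=0)                      # slightly shifted contour
obs3 = np.zeros_like(base)
obs3[7:15, 7:15] = True                              # slightly larger contour
print("mean pairwise DICE:", mean_pairwise_dice([base, obs2, obs3]))
```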

  18. Quantitative Neuroimaging Software for Clinical Assessment of Hippocampal Volumes on MR Imaging

    PubMed Central

    Ahdidan, Jamila; Raji, Cyrus A.; DeYoe, Edgar A.; Mathis, Jedidiah; Noe, Karsten Ø.; Rimestad, Jens; Kjeldsen, Thomas K.; Mosegaard, Jesper; Becker, James T.; Lopez, Oscar

    2015-01-01

    Background: Multiple neurological disorders including Alzheimer’s disease (AD), mesial temporal sclerosis, and mild traumatic brain injury manifest with volume loss on brain MRI. Subtle volume loss is particularly seen early in AD. While prior research has demonstrated the value of this additional information from quantitative neuroimaging, very few applications have been approved for clinical use. Here we describe a US FDA cleared software program, Neuroreader™, for assessment of clinical hippocampal volume on brain MRI. Objective: To present the validation of hippocampal volumetrics on a clinical software program. Method: Subjects were drawn (n = 99) from the Alzheimer Disease Neuroimaging Initiative study. Volumetric brain MR imaging was acquired in both 1.5 T (n = 59) and 3.0 T (n = 40) scanners in participants with manual hippocampal segmentation. Fully automated hippocampal segmentation and measurement was done using a multiple atlas approach. The Dice Similarity Coefficient (DSC) measured the level of spatial overlap between Neuroreader™ and gold standard manual segmentation from 0 to 1, with 0 denoting no overlap and 1 representing complete agreement. DSC comparisons between 1.5 T and 3.0 T scanners were done using standard independent samples T-tests. Results: In the bilateral hippocampus, mean DSC was 0.87 with a range of 0.78–0.91 (right hippocampus) and 0.76–0.91 (left hippocampus). Automated segmentation agreement with manual segmentation was essentially equivalent at 1.5 T (DSC = 0.879) versus 3.0 T (DSC = 0.872). Conclusion: This work provides a description and validation of a software program that can be applied in measuring hippocampal volume, a biomarker that is frequently abnormal in AD and other neurological disorders. PMID:26484924

  19. Quantitative assessment of the efficacy of spiral-wound membrane cleaning procedures to remove biofilms.

    PubMed

    Hijnen, W A M; Castillo, C; Brouwer-Hanzens, A H; Harmsen, D J H; Cornelissen, E R; van der Kooij, D

    2012-12-01

    Cleaning of high-pressure RO/NF membranes is an important operational tool to control biofouling. Quantitative information on the efficacy of cleaning agents and protocols to remove biomass is scarce. Therefore, a laboratory cleaning test to assess the efficiency of cleaning procedures in removing attached biomass was developed. The major components of the test are (i) the production of uniform biofilm samples, (ii) the quantification of the biomass concentrations with robust parameters and (iii) a simple test procedure with optimal exposure of the biofilm samples to the chemicals. The results showed that PVC-P is a suitable substratum for the production of uniform biofilm samples. ATP and carbohydrates (CH), representing living bacterial cells (nucleotides) and the extracellular polymeric substances (EPS) of the biofilm matrix, respectively, were selected as robust biomass parameters. The removal of ATP and CH with the NaOH/sodium dodecyl sulfate (SDS) mixture, selected as a standard treatment at pH 12.0, was reproducible. The resistance of the EPS matrix against chemical cleaning was demonstrated by a low CH removal (32.8 ± 6.0%) compared to the ATP removal (70.5 ± 15.1%). The inverse relationship of biomass removal with the CH to ATP ratio (μg/ng) of the biofilms demonstrated the influence of the biomass characteristics on cleaning. None of the 27 chemicals tested (analytical-grade and commercial brands), in single-step or double-step treatments, was significantly more effective than NaOH/SDS. The oxidizing agents NaOCl and H(2)O(2), the latter in combination with SDS, both tested as common agents in biofilm control, showed a significantly higher efficiency (70%) in removing biofilms. The test can simultaneously assess the efficiency of agents in removing precipitated minerals such as Fe. Validation tests with Cleaning in Place (CIP) in 8 and 2.5-inch RO membrane pilot plant experiments showed similar ranking of the cleaning efficiency of cleaning protocols

  20. A statistical model for assessing performance standards for quantitative and semiquantitative disinfectant test methods.

    PubMed

    Parker, Albert E; Hamilton, Martin A; Tomasino, Stephen F

    2014-01-01

    using the computer code provided, offers a quantitative decision-making tool for assessing a performance standard for any disinfectant test method for which log reductions can be calculated.

  1. Quantitative assessment of the efficacy of spiral-wound membrane cleaning procedures to remove biofilms.

    PubMed

    Hijnen, W A M; Castillo, C; Brouwer-Hanzens, A H; Harmsen, D J H; Cornelissen, E R; van der Kooij, D

    2012-12-01

    Cleaning of high-pressure RO/NF membranes is an important operational tool to control biofouling. Quantitative information on the efficacy of cleaning agents and protocols to remove biomass is scarce. Therefore, a laboratory cleaning test to assess the efficiency of cleaning procedures in removing attached biomass was developed. The major components of the test are (i) the production of uniform biofilm samples, (ii) the quantification of the biomass concentrations with robust parameters and (iii) a simple test procedure with optimal exposure of the biofilm samples to the chemicals. The results showed that PVC-P is a suitable substratum for the production of uniform biofilm samples. ATP and carbohydrates (CH), representing living bacterial cells (nucleotides) and the extracellular polymeric substances (EPS) of the biofilm matrix, respectively, were selected as robust biomass parameters. The removal of ATP and CH with the NaOH/sodium dodecyl sulfate (SDS) mixture, selected as a standard treatment at pH 12.0, was reproducible. The resistance of the EPS matrix against chemical cleaning was demonstrated by a low CH removal (32.8 ± 6.0%) compared to the ATP removal (70.5 ± 15.1%). The inverse relationship of biomass removal with the CH to ATP ratio (μg/ng) of the biofilms demonstrated the influence of the biomass characteristics on cleaning. None of the 27 chemicals tested (analytical-grade and commercial brands), in single-step or double-step treatments, was significantly more effective than NaOH/SDS. The oxidizing agents NaOCl and H(2)O(2), the latter in combination with SDS, both tested as common agents in biofilm control, showed a significantly higher efficiency (70%) in removing biofilms. The test can simultaneously assess the efficiency of agents in removing precipitated minerals such as Fe. Validation tests with Cleaning in Place (CIP) in 8 and 2.5-inch RO membrane pilot plant experiments showed similar ranking of the cleaning efficiency of cleaning protocols

  2. Quantitative risk assessment of entry of contagious bovine pleuropneumonia through live cattle imported from northwestern Ethiopia.

    PubMed

    Woube, Yilkal Asfaw; Dibaba, Asseged Bogale; Tameru, Berhanu; Fite, Richard; Nganwa, David; Robnett, Vinaida; Demisse, Amsalu; Habtemariam, Tsegaye

    2015-11-01

    Contagious bovine pleuropneumonia (CBPP) is a highly contagious bacterial disease of cattle caused by Mycoplasma mycoides subspecies mycoides small colony (SC) bovine biotype (MmmSC). It has been eradicated from many countries; however, the disease persists in many parts of Africa and Asia. CBPP is one of the major trade-restricting diseases of cattle in Ethiopia. In this quantitative risk assessment the OIE concept of zoning was adopted to assess the entry of CBPP into an importing country when up to 280,000 live cattle are exported every year from the northwestern proposed disease free zone (DFZ) of Ethiopia. To estimate the level of risk, a six-tiered risk pathway (scenario tree) was developed, evidences collected and equations generated. The probability of occurrence of the hazard at each node was modelled as a probability distribution using Monte Carlo simulation (@RISK software) at 10,000 iterations to account for uncertainty and variability. The uncertainty and variability of data points surrounding the risk estimate were further quantified by sensitivity analysis. In this study a single animal destined for export from the northwestern DFZ of Ethiopia has a CBPP infection probability of 4.76×10(-6) (95% CI=7.25×10(-8)-1.92×10(-5)). The probability that at least one infected animal enters an importing country in one year is 0.53 (90% CI=0.042-0.97). The expected number of CBPP infected animals exported any given year is 1.28 (95% CI=0.021-5.42). According to the risk estimate, an average of 2.73×10(6) animals (90% CI=10,674-5.9×10(6)) must be exported to get the first infected case. By this account it would, on average, take 10.15 years (90% CI=0.24-23.18) for the first infected animal to be included in the consignment. Sensitivity analysis revealed that prevalence and vaccination had the highest impact on the uncertainty and variability of the overall risk.
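
    The final step of a release assessment of this kind, turning an uncertain per-animal infection probability into an annual entry risk for a given export volume, can be sketched in a few lines. The lognormal uncertainty distribution below is a placeholder, not the study's fitted input.

```python
# Sketch: from per-animal infection probability to annual entry risk for a consignment.
import numpy as np

rng = np.random.default_rng(7)
n_exported = 280_000                                             # annual export volume from the DFZ
p_samples = rng.lognormal(mean=np.log(4.76e-6), sigma=1.0, size=50_000)  # uncertain per-animal probability

p_at_least_one = 1.0 - (1.0 - p_samples) ** n_exported           # P(>=1 infected animal enters per year)
expected_infected = n_exported * p_samples                       # expected infected animals per year

print("P(>=1 infected animal/yr): median={:.2f}, 90% interval=({:.3f}, {:.3f})".format(
    np.median(p_at_least_one), *np.percentile(p_at_least_one, [5, 95])))
print("Expected infected animals/yr: median={:.2f}".format(np.median(expected_infected)))
```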

  3. A quantitative methodology to assess the risks to human health from CO2 leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, E.; Sitchler, A.; Maxwell, R. M.; McCray, J. E.

    2010-12-01

    Leakage of CO2 and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO2 leakage into drinking water aquifers. This framework incorporates the potential release of CO2 into the drinking water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distribution of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters, and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding given the greater toxicity of lead at lower doses than arsenic. Higher background groundwater gradients were also found to yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and
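
    The two-stage (nested) Monte Carlo separation of uncertainty and variability mentioned above can be illustrated schematically: uncertain parameters are drawn in an outer loop and inter-individual variability in an inner loop, so that each outer draw produces a full distribution of individual risk. The dose and toxicity values in the sketch are placeholders, not the study's inputs.

```python
# Conceptual sketch of a nested (two-stage) Monte Carlo risk calculation.
import numpy as np

rng = np.random.default_rng(3)
n_uncertain, n_variable = 200, 1_000

risk_percentiles = []
for _ in range(n_uncertain):
    # Outer loop: epistemically uncertain quantities (e.g. mean arsenic concentration, slope factor)
    mean_conc = rng.lognormal(np.log(5e-3), 0.5)           # mg/L, uncertain
    slope_factor = rng.triangular(0.5, 1.5, 3.0)           # (mg/kg-day)^-1, uncertain
    # Inner loop: variability across individuals (intake rate, body weight)
    intake = rng.lognormal(np.log(2.0), 0.3, n_variable)   # L/day
    body_weight = rng.normal(70.0, 12.0, n_variable)       # kg
    dose = mean_conc * intake / body_weight                 # mg/kg-day
    cancer_risk = slope_factor * dose                       # linear low-dose model
    risk_percentiles.append(np.percentile(cancer_risk, 95)) # 95th-percentile individual risk

# Spread across the outer loop expresses uncertainty in the 95th-percentile risk
print("95th-percentile risk: median={:.2e}, 90% interval=({:.2e}, {:.2e})".format(
    np.median(risk_percentiles), *np.percentile(risk_percentiles, [5, 95])))
```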

  4. Analysis of Experts’ Quantitative Assessment of Adolescent Basketball Players and the Role of Anthropometric and Physiological Attributes

    PubMed Central

    Štrumbelj, Erik; Erčulj, Frane

    2014-01-01

    In this paper, we investigated two questions: (1) can measurements of anthropometric and physiological attributes substitute for expert assessment of adolescent basketball players, and (2) how much does the quantitative assessment of a player vary among experts? The first question is relevant to the potential simplification of the player selection process. The second question pertains directly to the validity of expert quantitative assessment. Our research was based on data from 148 U14 female and male basketball players. For each player, an array of anthropometric and physiological attributes was recorded, including body height, body mass, BMI, and several motor skill tests. Furthermore, each player’s current ability and potential ability were quantitatively evaluated by two different experts from a group of seven experts. Analysis of the recorded data showed that the anthropometric and physiological attributes explained between 15% and 40% of the variance in experts’ scores. The primary predictive attributes were speed and agility (for predicting current ability) and body height and growth potential (for predicting potential ability). We concluded that these attributes were not sufficiently informative to act as a substitute for expert assessment of the players’ current or potential ability. There is substantial variability in different experts’ scores of the same player’s ability. However, the differences between experts are mostly in scale, and the relationships between experts’ scores are monotonic. That is, different experts rank players on ability very similarly, but their scores are not well calibrated. PMID:25414759

  5. EnviroLand: A Simple Computer Program for Quantitative Stream Assessment.

    ERIC Educational Resources Information Center

    Dunnivant, Frank; Danowski, Dan; Timmens-Haroldson, Alice; Newman, Meredith

    2002-01-01

    Introduces the EnviroLand computer program, which features lab simulations of theoretical calculations for quantitative analysis and environmental chemistry, and fate and transport models. Uses the program to demonstrate the nature of linear and nonlinear equations. (Author/YDS)

  6. Fully automated, quantitative, noninvasive assessment of collagen fiber content and organization in thick collagen gels

    NASA Astrophysics Data System (ADS)

    Bayan, Christopher; Levitt, Jonathan M.; Miller, Eric; Kaplan, David; Georgakoudi, Irene

    2009-05-01

    Collagen is the most prominent protein of human tissues. Its content and organization define to a large extent the mechanical properties of tissue as well as its function. Methods that have been used traditionally to visualize and analyze collagen are invasive, provide only qualitative or indirect information, and have limited use in studies that aim to understand the dynamic nature of collagen remodeling and its interactions with the surrounding cells and other matrix components. Second harmonic generation (SHG) imaging has emerged as a promising noninvasive modality for providing high-resolution images of collagen fibers within thick specimens, such as tissues. In this article, we present a fully automated procedure to acquire quantitative information on the content, orientation, and organization of collagen fibers. We use this procedure to monitor the dynamic remodeling of collagen gels in the absence or presence of fibroblasts over periods of 12 or 14 days. We find that an adaptive thresholding and stretching approach provides great insight into the content of collagen fibers within SHG images without the need for user input. An additional feature-erosion and feature-dilation step is useful for preserving structure and removing noise in images with low signal. To quantitatively assess the orientation of collagen fibers, we extract the orientation index (OI), a parameter based on the power distribution of the spatial-frequency-averaged, two-dimensional Fourier transform of the SHG images. To measure the local organization of the collagen fibers, we compute the Hough transform of small tiles of the image and the entropy distribution, which represents the probability of finding the direction of fibers along a dominant direction. Using these methods we observed that the presence and number of fibroblasts within the collagen gel significantly affects the remodeling of the collagen matrix. In the absence of fibroblasts, gels contract, especially during the first few
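
    The orientation analysis described above rests on the angular distribution of power in the two-dimensional Fourier transform of the SHG image. The sketch below bins the power spectrum by angle and reports a simple alignment measure; it is an illustrative variant, not the exact orientation-index definition used in the paper.

```python
# Sketch: angular power distribution of a 2-D FFT as a measure of fiber alignment.
import numpy as np

def angular_power_profile(image, n_bins=36):
    f = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    power = np.abs(f) ** 2
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    angles = np.arctan2(y - ny // 2, x - nx // 2) % np.pi    # orientation is pi-periodic
    bins = np.linspace(0, np.pi, n_bins + 1)
    profile, _ = np.histogram(angles, bins=bins, weights=power)
    return bins[:-1], profile / profile.sum()

def orientation_index(profile):
    """~1 = power concentrated in one angular bin (aligned fibers), ~0 = isotropic."""
    n = len(profile)
    return (profile.max() - 1.0 / n) / (1.0 - 1.0 / n)

# Toy image with horizontal stripes (fibers along x); the dominant FFT angle is
# perpendicular to the fiber direction.
img = np.tile(np.sin(np.linspace(0, 20 * np.pi, 128))[:, None], (1, 128))
bins, prof = angular_power_profile(img)
print("dominant frequency angle (deg):", np.degrees(bins[np.argmax(prof)]))
print("orientation index:", round(orientation_index(prof), 3))
```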

  7. Quantitative assessment of solid waste treatment systems in the industrial ecology perspective by exergy analysis.

    PubMed

    Dewulf, Jo P; Van Langenhove, Herman R

    2002-03-01

    shown that the exergy concept allows a quantitative comparison of different industrial metabolic options, contributing to a better assessment of sustainability of technology with respect to resource management.

  8. Quantitative Assessment of Theses at Mazandaran University of Medical Sciences Years–(1995-2014)

    PubMed Central

    Balaghafari, Azita; Siamian, Hasan; Kharamin, Farideh; Rashida, Seyyedeh Shahrbanoo; Ghahrani, Nassim

    2016-01-01

    Background: Review and evaluation of research is essential for taking correct steps toward real progress and is a feature of a healthy and dynamic system. Considering the importance of scientific theses in knowledge production and development, and the lack of structured information and of qualitative and quantitative assessment at Mazandaran University of Medical Sciences, we decided to assess the theses prepared there from 1995 to 2014. Methods: This study was a descriptive survey of a sample of 325 graduate and PhD theses and dissertations in the clinical and basic sciences, drawn from a population of 2,060 theses prepared at the university up to the end of 2014. A stratified sampling method was used. The descriptive study examined the match between students' degrees, thesis subjects, and the specialties of the supervisors and advisers. The data-gathering tool was a checklist of information (gender, discipline, degree and department of the students, school, year, title of the thesis or dissertation, specialty and department of the supervisors and advisers, type of research, and grade obtained by the students). Statistical analysis of the data was performed using SPSS software, version 21. Results: We studied 325 theses; 303 had one researcher, 21 had two researchers, and 1 had three researchers, giving a total of 348 student researchers (174 females and 174 males). Of these, 82 (23.5%) were in the basic sciences group and 266 (76.5%) in the clinical group; 29 (8.33%) were at the master's level, 260 (74.71%) at the general practitioner level, 58 (16.67%) at the specialty level, and 1 at the PhD level. There was no relationship between type of research and level of education (p = 0.081). However, it was found that the majority of theses at the general practitioner level (59.8%) were type 1

  9. Quantitative assessment of hydrocarbon contamination in soil using reflectance spectroscopy: a "multipath" approach.

    PubMed

    Schwartz, Guy; Ben-Dor, Eyal; Eshel, Gil

    2013-11-01

    Petroleum hydrocarbons are contaminants of great significance. The commonly used analytic method for assessing total petroleum hydrocarbons (TPH) in soil samples is based on extraction with 1,1,2-Trichlorotrifluoroethane (Freon 113), a substance whose use is prohibited by the Environmental Protection Agency. During the past 20 years, a new quantitative methodology that uses the reflected radiation of solids has been widely adopted. In this approach, the reflectance across the visible to near infrared-shortwave infrared region (400-2500 nm) is modeled against constituents determined using traditional analytic chemistry methods and then used to predict unknown samples. This technology is environmentally friendly and permits rapid and cost-effective measurements of large numbers of samples. Thus, this method dramatically reduces chemical analytical costs and secondary pollution, enabling a new dimension of environmental monitoring. In this study we adapted this approach and developed effective steps by which hydrocarbon contamination in soils can be determined rapidly, accurately, and cost-effectively solely from reflectance spectroscopy. Artificially contaminated samples were analyzed chemically and spectrally to form a database of five soils contaminated with three types of petroleum hydrocarbons (PHCs), creating 15 datasets of 48 samples each at contamination levels of 50-5000 ppm (parts per million) by weight. A brute-force preprocessing approach was used by combining eight different preprocessing techniques with all possible datasets, resulting in 120 different mutations for each dataset. The brute-force analysis was based on an innovative computing system developed for this study. A new parameter for evaluating model performance, termed model performance scoring (MPS), is proposed based on a combination of several common statistical parameters. The effect of dividing the data into training, validation, and test sets on modeling accuracy is also discussed. The results of this study clearly show
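
    One slice of the modelling pipeline described above, a single preprocessing choice followed by a multivariate regression of TPH on the spectra, can be sketched as follows. The synthetic spectra, the Savitzky-Golay first-derivative preprocessing and the PLS regression below stand in for the paper's brute-force search over preprocessing/dataset combinations and its MPS score, which are not reproduced here.

```python
# Sketch: derivative preprocessing + PLS regression of TPH concentration on reflectance spectra.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
n_samples, n_bands = 48, 210                               # ~10 nm steps over 400-2500 nm
tph = rng.uniform(50, 5000, n_samples)                     # ppm, the study's contamination range
spectra = 0.3 + 1e-4 * np.outer(tph, np.linspace(0, 1, n_bands)) \
          + rng.normal(0, 0.002, (n_samples, n_bands))     # toy spectra with a TPH-related slope

X = savgol_filter(spectra, window_length=11, polyorder=2, deriv=1, axis=1)   # preprocessing step
X_train, X_test, y_train, y_test = train_test_split(X, tph, test_size=0.25, random_state=1)

pls = PLSRegression(n_components=5).fit(X_train, y_train)
y_pred = pls.predict(X_test).ravel()
rmse = np.sqrt(mean_squared_error(y_test, y_pred))
print(f"R^2={r2_score(y_test, y_pred):.2f}, RMSE={rmse:.0f} ppm")
```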

  10. A quantitative microbial risk assessment for meatborne Toxoplasma gondii infection in The Netherlands.

    PubMed

    Opsteegh, Marieke; Prickaerts, Saskia; Frankena, Klaas; Evers, Eric G

    2011-11-01

    Toxoplasma gondii is an important foodborne pathogen, and the cause of a high disease burden due to congenital toxoplasmosis in The Netherlands. The aim of this study was to quantify the relative contribution of sheep, beef and pork products to human T. gondii infections by Quantitative Microbial Risk Assessment (QMRA). Bradyzoite concentration and portion size data were used to estimate the bradyzoite number in infected unprocessed portions for human consumption. The reduction factors for salting, freezing and heating as estimated based on published experiments in mice, were subsequently used to estimate the bradyzoite number in processed portions. A dose-response relation for T. gondii infection in mice was used to estimate the human probability of infection due to consumption of these originally infected processed portions. By multiplying these probabilities with the prevalence of T. gondii per livestock species and the number of portions consumed per year, the number of infections per year was calculated for the susceptible Dutch population and the subpopulation of susceptible pregnant women. QMRA results predict high numbers of infections per year with beef as the most important source. Although many uncertainties were present in the data and the number of congenital infections predicted by the model was almost twenty times higher than the number estimated based on the incidence in newborns, the usefulness of the advice to thoroughly heat meat is confirmed by our results. Forty percent of all predicted infections is due to the consumption of unheated meat products, and sensitivity analysis indicates that heating temperature has the strongest influence on the predicted number of infections. The results also demonstrate that, even with a low prevalence of infection in cattle, consumption of beef remains an important source of infection. Developing this QMRA model has helped identify important gaps of knowledge and resulted in the following recommendations for

  11. Quantitative Assessment of Effect of Preanalytic Cold Ischemic Time on Protein Expression in Breast Cancer Tissues

    PubMed Central

    2012-01-01

    Background Companion diagnostic tests can depend on accurate measurement of protein expression in tissues. Preanalytic variables, especially cold ischemic time (time from tissue removal to fixation in formalin) can affect the measurement and may cause false-negative results. We examined 23 proteins, including four commonly used breast cancer biomarker proteins, to quantify their sensitivity to cold ischemia in breast cancer tissues. Methods A series of 93 breast cancer specimens with known time-to-fixation represented in a tissue microarray and a second series of 25 matched pairs of core needle biopsies and breast cancer resections were used to evaluate changes in antigenicity as a function of cold ischemic time. Estrogen receptor (ER), progesterone receptor (PgR), HER2 or Ki67, and 19 other antigens were tested. Each antigen was measured using the AQUA method of quantitative immunofluorescence on at least one series. All statistical tests were two-sided. Results We found no evidence for loss of antigenicity with time-to-fixation for ER, PgR, HER2, or Ki67 in a 4-hour time window. However, with a bootstrapping analysis, we observed a trend toward loss for ER and PgR, a statistically significant loss of antigenicity for phosphorylated tyrosine (P = .0048), and trends toward loss for other proteins. There was evidence of increased antigenicity in acetylated lysine, AKAP13 (P = .009), and HIF1A (P = .046), which are proteins known to be expressed in conditions of hypoxia. The loss of antigenicity for phosphorylated tyrosine and increase in expression of AKAP13, and HIF1A were confirmed in the biopsy/resection series. Conclusions Key breast cancer biomarkers show no evidence of loss of antigenicity, although this dataset assesses the relatively short time beyond the 1-hour limit in recent guidelines. Other proteins show changes in antigenicity in both directions. Future studies that extend the time range and normalize for heterogeneity will provide more comprehensive

  12. The Bias Associated with Amplicon Sequencing Does Not Affect the Quantitative Assessment of Bacterial Community Dynamics

    PubMed Central

    Figuerola, Eva L. M.; Erijman, Leonardo

    2014-01-01

    The performance of two sets of primers targeting variable regions of the 16S rRNA gene V1–V3 and V4 was compared in their ability to describe changes of bacterial diversity and temporal turnover in full-scale activated sludge. Duplicate sets of high-throughput amplicon sequencing data of the two 16S rRNA regions shared a collection of core taxa that were observed across a series of twelve monthly samples, although the relative abundance of each taxon was substantially different between regions. A case in point was the changes in the relative abundance of filamentous bacteria Thiothrix, which caused a large effect on diversity indices, but only in the V1–V3 data set. Yet the relative abundance of Thiothrix in the amplicon sequencing data from both regions correlated with the estimation of its abundance determined using fluorescence in situ hybridization. In nonmetric multidimensional analysis samples were distributed along the first ordination axis according to the sequenced region rather than according to sample identities. The dynamics of microbial communities indicated that V1–V3 and the V4 regions of the 16S rRNA gene yielded comparable patterns of: 1) the changes occurring within the communities along fixed time intervals, 2) the slow turnover of activated sludge communities and 3) the rate of species replacement calculated from the taxa–time relationships. The temperature was the only operational variable that showed significant correlation with the composition of bacterial communities over time for the sets of data obtained with both pairs of primers. In conclusion, we show that despite the bias introduced by amplicon sequencing, the variable regions V1–V3 and V4 can be confidently used for the quantitative assessment of bacterial community dynamics, and provide a proper qualitative account of general taxa in the community, especially when the data are obtained over a convenient time window rather than at a single time point. PMID:24923665

  13. Quantitative Risk Assessment of CO2 Sequestration in a Commercial-Scale EOR Site

    NASA Astrophysics Data System (ADS)

    Pan, F.; McPherson, B. J. O. L.; Dai, Z.; Jia, W.; Lee, S. Y.; Ampomah, W.; Viswanathan, H. S.

    2015-12-01

    Enhanced Oil Recovery with CO2 (CO2-EOR) is perhaps the most feasible option for geologic CO2 sequestration (GCS), if only due to existing infrastructure and economic opportunities of associated oil production. Probably the most significant source of uncertainty of CO2 storage forecasts is heterogeneity of reservoir properties. Quantification of storage forecast uncertainty is critical for accurate assessment of risks associated with GCS in EOR fields. This study employs a response surface methodology (RSM) to quantify uncertainties of CO2 storage associated with oil production in an active CO2-EOR field. Specifically, the Morrow formation, a clastic reservoir within the Farnsworth EOR Unit (FWU) in Texas, was selected as a case study. Four uncertain parameters (i.e., independent variables) are reservoir permeability, anisotropy ratio of permeability, water-alternating-gas (WAG) time ratio, and initial oil saturation. Cumulative oil production and net CO2 injection are the output dependent variables. A 3-D FWU reservoir model, including a representative 5-spot well pattern, was constructed for CO2-oil-water multiphase flow analysis. A total of 25 permutations of 3-D reservoir simulations were executed using Eclipse simulator. After performing stepwise regression analysis, a series of response surface models of the output variables at each step were constructed and verified using appropriate goodness-of-fit measures. The R2 values are larger than 0.9 and NRMSE values are less than 5% between the simulated and predicted oil production and net CO2 injection, suggesting that the response surface (or proxy) models are sufficient for predicting CO2-EOR system behavior for FWU case. Given the range of uncertainties in the independent variables, the cumulative distribution functions (CDFs) of dependent variables were estimated using the proxy models. The predicted cumulative oil production and net CO2 injection at 95th percentile after 5 years are about 3.65 times, and 1

  14. A quantitative microbiological exposure assessment model for Bacillus cereus in REPFEDs.

    PubMed

    Daelman, Jeff; Membré, Jeanne-Marie; Jacxsens, Liesbeth; Vermeulen, An; Devlieghere, Frank; Uyttendaele, Mieke

    2013-09-16

    One of the pathogens of concern in refrigerated and processed foods of extended durability (REPFED) is psychrotrophic Bacillus cereus, because of its ability to survive pasteurisation and grow at low temperatures. In this study, a quantitative microbiological exposure assessment (QMEA) of psychrotrophic B. cereus in REPFEDs is presented. The goal is to quantify (i) the prevalence and concentration of B. cereus during production and shelf life, (ii) the number of packages with potential emetic toxin formation and (iii) the impact of different processing steps and consumer behaviour on the exposure to B. cereus from REPFEDs. The QMEA comprises the entire production and distribution process, from raw materials, through pasteurisation, up to the moment the product is consumed or discarded. To model this process, the modular process risk model (MPRM) was used (Nauta, 2002). The product life was divided into nine modules, each module corresponding to a basic process: (1) raw material contamination, (2) cross contamination during handling, (3) inactivation during preparation, (4) growth during intermediate storage, (5) partitioning of batches in portions, (6) mixing portions to create the product, (7) recontamination during assembly and packaging, (8) inactivation during pasteurisation and (9) growth during shelf life. Each of the modules was modelled and built using a combination of newly gathered and literature data, predictive models and expert opinions. Units (batch/portion/package) with a B. cereus concentration of 10^5 CFU/g or more were considered 'risky' units. Results show that the main drivers of variability and uncertainty are consumer behaviour, strain variability and modelling error. The prevalence of B. cereus in the final products is estimated at 48.6% (±0.01%) and the number of packs with excessive B. cereus counts at the moment of consumption is estimated at 4750 packs per million (0.48%). Cold storage at retail and consumer level is vital in limiting the exposure
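
    The 'risky unit' criterion used above (at least 10^5 CFU/g at consumption) lends itself to a simple Monte Carlo illustration. The sketch below is not the study's MPRM chain; it assumes purely illustrative normal distributions for initial contamination and growth during shelf life and simply counts the fraction of simulated packs crossing the threshold.

    import numpy as np

    rng = np.random.default_rng(1)
    n_packs = 1_000_000

    # Assumed inputs (illustrative only): log10 CFU/g after pasteurisation and
    # log10 growth during refrigerated shelf life, both treated as normal.
    log_initial = rng.normal(loc=-1.0, scale=1.0, size=n_packs)
    log_growth = rng.normal(loc=2.0, scale=1.5, size=n_packs)

    log_at_consumption = log_initial + log_growth
    risky = log_at_consumption >= 5.0   # at or above 1e5 CFU/g is treated as 'risky'

    print(f"risky packs per million: {risky.mean() * 1e6:.0f}")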

  15. Assessment of Riboflavin as a Tracer Substance: Comparison of a Qualitative to a Quantitative Method of Riboflavin Measurement

    PubMed Central

    Herron, Abigail J.; Mariani, John J.; Pavlicova, Martina; Parinello, Christina M.; Bold, Krysten W.; Levin, Frances R.; Nunes, Edward V.; Sullivan, Maria A.; Raby, Wilfred N.; Bisaga, Adam

    2013-01-01

    Background Noncompliance with medications may have major impacts on outcomes measured in research, potentially distorting the validity of controlled clinical trials. Riboflavin is frequently used in trials as a marker of adherence. It can be combined with study medication and is excreted in urine where it fluoresces under UV light. This study compares qualitative visual inspection of fluorescence to quantitative fluorometric analysis of riboflavin concentration in its ability to detect the presence of riboflavin in urine. Methods Twenty-four volunteers received 0 mg, 25 mg, and 50 mg doses of riboflavin under single-blind conditions, with 20 also receiving a 100 mg dose. Five serial urine samples were collected over the following 36 hours. Quantitative measurement of riboflavin by fluorometric analysis and qualitative assessment of each sample using visual inspection were performed. Results The overall false positive rate for qualitative assessment was 53%. For quantitative assessment, a riboflavin concentration of 900 ng/mL was established to classify positive samples. More than 80% of samples were positive 2 to 24 hours following ingestion of 25 mg and 50 mg, and less than 80% were positive at 36 hours. At least 95% of observations for the 100 mg dose were above 900 ng/mL at all timepoints. Conclusions Quantitative fluorometric assessment is superior to qualitative visual inspection alone in determining medication adherence. The combination of 25–50 mg of daily riboflavin and a cut-off level of 900 ng/mL allows for the acceptable sensitivity of missing detection of non-compliant participants while preserving a high level of power to detect all cases of medication compliance. PMID:22921475
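
    The quantitative decision rule described above (a sample is positive if urinary riboflavin is at or above 900 ng/mL) is easy to encode. A minimal sketch with hypothetical fluorometric readings, not data from the trial:

    CUTOFF_NG_ML = 900.0  # concentration classifying a sample as riboflavin-positive

    def classify(concentrations_ng_ml):
        """Return True for samples at or above the adherence cutoff."""
        return [c >= CUTOFF_NG_ML for c in concentrations_ng_ml]

    # Hypothetical samples from participants who did not ingest riboflavin (0 mg dose):
    baseline = [120.0, 450.0, 980.0, 300.0, 1500.0]
    false_positives = sum(classify(baseline))
    print(f"false-positive rate at 0 mg: {false_positives / len(baseline):.0%}")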

  16. Integrating expert opinion with modelling for quantitative multi-hazard risk assessment in the Eastern Italian Alps

    NASA Astrophysics Data System (ADS)

    Chen, Lixia; van Westen, Cees J.; Hussin, Haydar; Ciurean, Roxana L.; Turkington, Thea; Chavarro-Rincon, Diana; Shrestha, Dhruba P.

    2016-11-01

    Extreme rainfall events are the main triggering causes for hydro-meteorological hazards in mountainous areas, where development is often constrained by the limited space suitable for construction. In these areas, hazard and risk assessments are fundamental for risk mitigation, especially for preventive planning, risk communication and emergency preparedness. Multi-hazard risk assessment in mountainous areas at local and regional scales remain a major challenge because of lack of data related to past events and causal factors, and the interactions between different types of hazards. The lack of data leads to a high level of uncertainty in the application of quantitative methods for hazard and risk assessment. Therefore, a systematic approach is required to combine these quantitative methods with expert-based assumptions and decisions. In this study, a quantitative multi-hazard risk assessment was carried out in the Fella River valley, prone to debris flows and flood in the north-eastern Italian Alps. The main steps include data collection and development of inventory maps, definition of hazard scenarios, hazard assessment in terms of temporal and spatial probability calculation and intensity modelling, elements-at-risk mapping, estimation of asset values and the number of people, physical vulnerability assessment, the generation of risk curves and annual risk calculation. To compare the risk for each type of hazard, risk curves were generated for debris flows, river floods and flash floods. Uncertainties were expressed as minimum, average and maximum values of temporal and spatial probability, replacement costs of assets, population numbers, and physical vulnerability. These result in minimum, average and maximum risk curves. To validate this approach, a back analysis was conducted using the extreme hydro-meteorological event that occurred in August 2003 in the Fella River valley. The results show a good performance when compared to the historical damage reports.
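
    The final step described above, turning a risk curve into an annual risk figure, amounts to integrating loss over annual exceedance probability. A minimal sketch with an invented risk curve (the study itself builds separate minimum, average and maximum curves from its uncertainty bounds):

    import numpy as np

    # Invented risk curve for one hazard type: annual exceedance probability
    # versus direct loss (EUR), ordered from frequent/small to rare/large events.
    probability = np.array([0.1, 0.02, 0.01, 0.002])
    loss_eur = np.array([2e5, 1.5e6, 4e6, 2e7])

    # Average annual risk = area under the loss-exceedance curve,
    # i.e. the integral of loss over exceedance probability (trapezoidal rule).
    annual_risk = np.trapz(loss_eur[::-1], probability[::-1])
    print(f"average annual risk: {annual_risk:,.0f} EUR/year")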

  17. Quantitative assessment of historical coastal landfill contamination using in-situ field portable XRF (FPXRF)

    NASA Astrophysics Data System (ADS)

    O'Shea, Francis; Spencer, Kate; Brasington, James

    2014-05-01

    in the field to determine the presence, location and extent of the sub-surface contaminant plume. Although XRF analysis has gained acceptance in the study of in-situ metal contamination (Kalnicky and Singhvi 2001; Martin Peinado et al. 2010) field moisture content and sample heterogeneity can suppress X-ray signals. Therefore, sediment samples were also collected and returned to the laboratory and analysed by ICP OES for comparison. Both wet and dry certified reference materials were also analysed in the laboratory using XRF and ICP OES to observe the impact of moisture content and to produce a correction factor allowing quantitative data to be collected in the field. In-situ raw XRF data identified the location of contamination plumes in the field in agreement with ICP data, although the data were systematically suppressed compared to ICP data, under-estimating the levels of contamination. Applying a correction factor for moisture content provided accurate measurements of concentration. The use of field portable XRF with the application of a moisture content correction factor enables the rapid screening of sediment fronting coastal landfill sites, goes some way towards providing a national baseline dataset and can contribute to the development of risk assessments.
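
    The correction-factor approach sketched above amounts to regressing in-situ FPXRF readings against laboratory ICP-OES values for the same samples and applying the fitted factor to new field data. A minimal illustration under the assumption of a simple proportional (through-origin) correction; the concentrations are placeholders, not the study's data.

    import numpy as np

    # Paired measurements on the same sediment samples (mg/kg); placeholder values.
    fpxrf_field = np.array([110.0, 240.0, 410.0, 620.0, 880.0])   # suppressed by moisture
    icp_lab = np.array([150.0, 330.0, 560.0, 840.0, 1200.0])      # reference method

    # Through-origin least squares: factor = sum(x*y) / sum(x*x).
    factor = np.sum(fpxrf_field * icp_lab) / np.sum(fpxrf_field ** 2)
    print(f"moisture correction factor: {factor:.2f}")

    # Apply the factor to a new in-situ reading to estimate the corrected concentration.
    new_field_reading = 500.0
    print(f"corrected concentration: {factor * new_field_reading:.0f} mg/kg")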

  18. Primary Renal Lymphoma Mimicking a Subcapsular Hematoma: A Case Report

    PubMed Central

    Dedekam, Erik; Graham, Jess; Strenge, Karen; Mosier, Andrew D.

    2013-01-01

    Primary renal lymphoma (PRL) is a rare entity with a history of controversy regarding its existence. Lymphomatous involvement of the kidney is more commonly seen secondary to spread from an adjacent lymphomatous mass, rather than arising primarily from the kidney. PRL can mimic other renal lesions such as renal cell carcinoma, renal abscess, and metastasis; therefore, an early diagnosis is crucial to guide treatment and properly assess prognosis. We present a rare case of a 77-year-old male who presented with hematuria and PRL mimicking a subcapsular hematoma. PMID:24421949

  19. Primary renal lymphoma mimicking a subcapsular hematoma: a case report.

    PubMed

    Dedekam, Erik; Graham, Jess; Strenge, Karen; Mosier, Andrew D

    2013-08-01

    Primary renal lymphoma (PRL) is a rare entity with a history of controversy regarding its existence. Lymphomatous involvement of the kidney is more commonly seen secondary to spread from an adjacent lymphomatous mass, rather than arising primarily from the kidney. PRL can mimic other renal lesions such as renal cell carcinoma, renal abscess, and metastasis; therefore, an early diagnosis is crucial to guide treatment and properly assess prognosis. We present a rare case of a 77-year-old male who presented with hematuria and PRL mimicking a subcapsular hematoma. PMID:24421949

  20. Assessing Student Status and Progress in Science Reasoning and Quantitative Literacy at a Very Large Undergraduate Institution

    NASA Astrophysics Data System (ADS)

    Donahue, Megan; Kaplan, J.; Ebert-May, D.; Ording, G.; Melfi, V.; Gilliland, D.; Sikorski, A.; Johnson, N.

    2009-01-01

    The typical large liberal-arts, tier-one research university requires all of its graduates to achieve some minimal standards of quantitative literacy and scientific reasoning skills. But how do we know that what we are doing, as instructors and as a university, is working the way we think it should? At Michigan State University, a cross-disciplinary team of scientists, statisticians, and teacher education experts has begun a large-scale investigation of student mastery of quantitative and scientific skills, beginning with an assessment of 3,000 freshmen before they start their university careers. We will describe the process we used for developing and testing an instrument and for expanding faculty involvement and input on high-level goals. In this presentation, we limit the discussion mainly to the scientific reasoning perspective, but we will briefly mention some intriguing observations regarding quantitative literacy as well. This project represents the beginning of long-term, longitudinal tracking of the progress of students at our institution. We will discuss preliminary results of our 2008 assessment of incoming freshmen at Michigan State, and where we plan to go from here. We acknowledge local support from the Quality Fund of the Office of the Provost at MSU. We also acknowledge the Center for Assessment at James Madison University and the NSF for their support at the very beginning of our work.

  1. Quantitative risk assessment from farm to fork and beyond: a global Bayesian approach concerning food-borne diseases.

    PubMed

    Albert, Isabelle; Grenier, Emmanuel; Denis, Jean-Baptiste; Rousseau, Judith

    2008-04-01

    A novel approach to the quantitative assessment of food-borne risks is proposed. The basic idea is to use Bayesian techniques in two distinct steps: first, by constructing a stochastic core model via a Bayesian network based on expert knowledge, and second, by using the available data to improve this knowledge. Unlike the Monte Carlo simulation approach commonly used in quantitative assessment of food-borne risks, where data sets are used independently in each module, our consistent procedure incorporates information conveyed by data throughout the chain. It allows "back-calculation" in the food chain model, together with the use of data obtained "downstream" in the food chain. Moreover, the expert knowledge is introduced more simply and consistently than with classical statistical methods. Other advantages of this approach include the clear framework of an iterative learning process, considerable flexibility enabling the use of heterogeneous data, and a justified method to explore the effects of variability and uncertainty. As an illustration, we present an estimation of the probability of contracting campylobacteriosis as a result of broiler contamination, from the standpoint of quantitative risk assessment. Although the model thus constructed is oversimplified, it clarifies the principles and properties of the method proposed, which demonstrates its ability to deal with quite complex situations and provides a useful basis for further discussions with different experts in the food chain.
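
    The core idea, updating knowledge of an upstream parameter with data observed downstream in the chain, can be illustrated with a one-parameter grid approximation. The paper uses a full Bayesian network; the snippet below is only a toy analogue with invented prior, likelihood and counts.

    import numpy as np
    from scipy.stats import binom

    # Parameter of interest: prevalence of contaminated broiler carcasses (upstream).
    prevalence = np.linspace(0.001, 0.999, 999)
    prior = np.ones_like(prevalence)              # flat prior standing in for expert indifference

    # Downstream observation: 12 contaminated portions out of 200 sampled at retail.
    # Assume (for illustration only) each contaminated carcass yields a contaminated portion.
    likelihood = binom.pmf(12, 200, prevalence)

    posterior = prior * likelihood
    posterior /= np.trapz(posterior, prevalence)  # normalise to a density

    mean_prev = np.trapz(prevalence * posterior, prevalence)
    print(f"posterior mean prevalence: {mean_prev:.3f}")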

  2. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  3. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both, while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse graining technique to speed the registration of 2D histology sections to high resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). This technique also highlighted the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants.

  4. Comparing models for quantitative risk assessment: an application to the European Registry of foreign body injuries in children.

    PubMed

    Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario

    2016-08-01

    Risk Assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has focused on modeling techniques such as Bayesian Networks because of their capability of (1) combining, in a probabilistic framework, different types of evidence including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we propose a comparison of Bayesian Networks with other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction is made between a purely data-driven approach and the combination of expert knowledge with objective data. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be very useful in informing product safety design regulation. Data from the European Registry of Foreign Body Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offered both ease of interpretability and accuracy in making predictions, even though simpler models such as logistic regression still performed well.
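
    The comparison described above, Bayesian networks against logistic regression, classification trees, random forests and neural networks, is typically set up as a cross-validated benchmark. A minimal sketch of such a benchmark for two of the classical models, using scikit-learn and synthetic data rather than the registry data:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in for injury-severity outcomes and product/child covariates.
    X, y = make_classification(n_samples=2000, n_features=10, n_informative=5,
                               random_state=0)

    models = {
        "logistic regression": LogisticRegression(max_iter=1000),
        "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    }

    for name, model in models.items():
        auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
        print(f"{name}: mean AUC = {auc.mean():.3f} (+/- {auc.std():.3f})")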

  5. A novel technique for the quantitative assessment of apraxic deficits: application to individuals with mild cognitive impairment.

    PubMed

    Crutch, Sebastian J; Rossor, Martin N; Warrington, Elizabeth K

    2007-09-01

    The purpose of this study was to apply two novel quantitative assessments of apraxia to issues surrounding the cognitive profile of individuals with mild cognitive impairment (MCI) who are at increased risk of Alzheimer's disease (AD). In particular, it was wished to determine whether such quantitative assessment techniques can detect minor degrees of impairment at a stage in the putative disease process before apraxia has become clinically obvious. A total of 23 individuals with MCI and 75 healthy controls were assessed on two 3-item sequential movement tasks involving either meaningful or meaningless actions. A traditional rating scale assessment of gesture-to-command was also administered. MCI patients took significantly longer than control subjects to complete the sequential movement tasks despite unimpaired performance on the traditional gesture production tasks. Furthermore, retrospective analyses revealed that, at the group level, only MCI patients who subsequently proceeded to a clinical diagnosis of AD were significantly slower than controls at the initial assessment. These findings provide the first evidence that the neuropsychological deficits associated with MCI may extend to the domain of praxic functions. Consequently, this work contributes to the growing literature questioning the clinical usefulness of the concept of MCI and the appropriateness of current diagnostic criteria for distinguishing this condition from mild AD.

  6. Comparative assessment of fluorescent transgene methods for quantitative imaging in human cells

    PubMed Central

    Mahen, Robert; Koch, Birgit; Wachsmuth, Malte; Politi, Antonio Z.; Perez-Gonzalez, Alexis; Mergenthaler, Julia; Cai, Yin; Ellenberg, Jan

    2014-01-01

    Fluorescence tagging of proteins is a widely used tool to study protein function and dynamics in live cells. However, the extent to which different mammalian transgene methods faithfully report on the properties of endogenous proteins has not been studied comparatively. Here we use quantitative live-cell imaging and single-molecule spectroscopy to analyze how different transgene systems affect imaging of the functional properties of the mitotic kinase Aurora B. We show that the transgene method fundamentally influences level and variability of expression and can severely compromise the ability to report on endogenous binding and localization parameters, providing a guide for quantitative imaging studies in mammalian cells. PMID:25232003

  7. Assessing the Validity of Diagnostic Quantitative PCR Assays for Phakopsora pachyrhizi and P. meibomiae

    Technology Transfer Automated Retrieval System (TEKTRAN)

    There are 123 confirmed species in the genus Phakopsora worldwide, with 19 species reported in the continental United States. In 2002, a quantitative PCR (qPCR) diagnostic assay was developed by Frederick et al. that has been used for detecting Phakopsora pachyrhizi in spore trapping studies. Based ...

  8. A Critical Assessment of Null Hypothesis Significance Testing in Quantitative Communication Research

    ERIC Educational Resources Information Center

    Levine, Timothy R.; Weber, Rene; Hullett, Craig; Park, Hee Sun; Lindsey, Lisa L. Massi

    2008-01-01

    Null hypothesis significance testing (NHST) is the most widely accepted and frequently used approach to statistical inference in quantitative communication research. NHST, however, is highly controversial, and several serious problems with the approach have been identified. This paper reviews NHST and the controversy surrounding it. Commonly…

  9. Diagnosing Conceptions about the Epistemology of Science: Contributions of a Quantitative Assessment Methodology

    ERIC Educational Resources Information Center

    Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa

    2016-01-01

    This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…

  10. Assessment of patient selection criteria for quantitative imaging with respiratory-gated positron emission tomography.

    PubMed

    Bowen, Stephen R; Pierce, Larry A; Alessio, Adam M; Liu, Chi; Wollenweber, Scott D; Stearns, Charles W; Kinahan, Paul E

    2014-07-01

    The objective of this investigation was to propose techniques for determining which patients are likely to benefit from quantitative respiratory-gated imaging by correlating respiratory patterns to changes in positron emission tomography (PET) metrics. Twenty-six lung and liver cancer patients underwent PET/computed tomography exams with recorded chest/abdominal displacements. Static and adaptive amplitude-gated [Formula: see text]

  11. Toward a Quantitative Basis for Assessment and Diagnosis of Apraxia of Speech

    ERIC Educational Resources Information Center

    Haley, Katarina L.; Jacks, Adam; de Riesthal, Michael; Abou-Khalil, Rima; Roth, Heidi L.

    2012-01-01

    Purpose: We explored the reliability and validity of 2 quantitative approaches to document presence and severity of speech properties associated with apraxia of speech (AOS). Method: A motor speech evaluation was administered to 39 individuals with aphasia. Audio-recordings of the evaluation were presented to 3 experienced clinicians to determine…

  12. Quantitative proteomics: assessing the spectrum of in-gel protein detection methods

    PubMed Central

    Gauci, Victoria J.; Wright, Elise P.

    2010-01-01

    Proteomics research relies heavily on visualization methods for detection of proteins separated by polyacrylamide gel electrophoresis. Commonly used staining approaches involve colorimetric dyes such as Coomassie Brilliant Blue, fluorescent dyes including Sypro Ruby, newly developed reactive fluorophores, as well as a plethora of others. The most desired characteristic in selecting one stain over another is sensitivity, but this is far from the only important parameter. This review evaluates protein detection methods in terms of their quantitative attributes, including limit of detection (i.e., sensitivity), linear dynamic range, inter-protein variability, capacity for spot detection after 2D gel electrophoresis, and compatibility with subsequent mass spectrometric analyses. Unfortunately, many of these quantitative criteria are not routinely or consistently addressed by most of the studies published to date. We would urge more rigorous routine characterization of stains and detection methodologies as a critical approach to systematically improving these critically important tools for quantitative proteomics. In addition, substantial improvements in detection technology, particularly over the last decade or so, emphasize the need to consider renewed characterization of existing stains; the quantitative stains we need, or at least the chemistries required for their future development, may well already exist. PMID:21686332

  13. Quantitative Assessment of Detection Frequency for the INL Ambient Air Monitoring Network

    SciTech Connect

    A. Jeffrey Sondrup; Arthur S. Rood

    2014-11-01

    A quantitative assessment of the Idaho National Laboratory (INL) air monitoring network was performed using frequency of detection as the performance metric. The INL air monitoring network consists of 37 low-volume air samplers in 31 different locations. Twenty of the samplers are located on INL (onsite) and 17 are located off INL (offsite). Detection frequencies were calculated using both BEA and ESER laboratory minimum detectable activity (MDA) levels. The CALPUFF Lagrangian puff dispersion model, coupled with 1 year of meteorological data, was used to calculate time-integrated concentrations at sampler locations for a 1-hour release of unit activity (1 Ci) for every hour of the year. The unit-activity time-integrated concentration (TICu) values were calculated at all samplers for releases from eight INL facilities. The TICu values were then scaled and integrated for a given release quantity and release duration. For each facility, a ground-level release was modeled, emanating either from the center of the facility or from a point where significant emissions are possible. In addition to ground-level releases, three existing stacks at the Advanced Test Reactor Complex, Idaho Nuclear Technology and Engineering Center, and Materials and Fuels Complex were also modeled. Meteorological data from the 35 stations comprising the INL Mesonet network, data from the Idaho Falls Regional Airport, upper air data from the Boise airport, and three-dimensional gridded data from the Weather Research and Forecasting (WRF) model were used for modeling. Three representative radionuclides identified as key radionuclides in INL’s annual National Emission Standards for Hazardous Air Pollutants evaluations were considered for the frequency of detection analysis: Cs-137 (beta-gamma emitter), Pu-239 (alpha emitter), and Sr-90 (beta emitter). Source-specific release quantities were calculated for each radionuclide, such that the maximum inhalation dose at any publicly accessible sampler or the National
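
    The scaling step described above, converting unit-activity time-integrated concentrations into sampler activities for an arbitrary release and comparing them with a laboratory MDA, reduces to a few array operations. A minimal sketch with invented numbers; the actual TICu fields come from the CALPUFF runs and the MDA levels from the laboratories.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical unit-activity time-integrated concentrations (Bq*s/m^3 per Ci
    # released) at one sampler, one value per hourly 1-Ci release over a year.
    tic_unit = rng.lognormal(mean=4.0, sigma=2.0, size=8760)

    release_ci = 0.5        # assumed release quantity, Ci
    sampler_flow = 5.7e-4   # assumed sampler flow rate, m^3/s
    mda_bq = 0.02           # assumed laboratory minimum detectable activity, Bq

    # Activity collected on the filter = flow rate x time-integrated concentration.
    activity_on_filter = tic_unit * release_ci * sampler_flow
    detected = activity_on_filter >= mda_bq
    print(f"detection frequency: {detected.mean():.1%}")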

  14. QUANTITATION OF MOLECULAR ENDPOINTS FOR THE DOSE-RESPONSE COMPONENT OF CANCER RISK ASSESSMENT

    EPA Science Inventory

    Cancer risk assessment involves the steps of hazard identification, dose-response assessment, exposure assessment and risk characterization. The rapid advances in the use of molecular biology approaches have had an impact on all four components, but the greatest overall current...

  15. Prosodic Aspects of Hearing-Impaired Children: A Qualitative and Quantitative Assessment.

    ERIC Educational Resources Information Center

    Ching, Teresa Y. C.

    1989-01-01

    This study discusses the development of a qualitative assessment to profile prosodic skills of Cantonese-speaking children with speech defects, and correlates it with a quantitative assessment of productive skills. The study entails the use of the Visi-pitch to provide objective data for assessment. The aim is to devise a comprehensive description…

  16. Assessing Student Teachers' Reflective Writing through Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Poldner, Eric; Van der Schaaf, Marieke; Simons, P. Robert-Jan; Van Tartwijk, Jan; Wijngaards, Guus

    2014-01-01

    Students' reflective essay writing can be stimulated by the formative assessments provided to them by their teachers. Such assessments contain information about the quality of students' reflective writings and offer suggestions for improvement. Despite the importance of formatively assessing students' reflective writings in teacher…

  17. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
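
    The figure of merit named here, the noise-to-slope ratio, ranks methods by precision once the slope and noise standard deviation of each method's linear relation to the unknown true values have been estimated. A minimal sketch that only computes and ranks NSR from already-estimated, hypothetical parameters; the no-gold-standard estimation itself is outside the scope of the snippet.

    # Hypothetical (slope, noise standard deviation) estimates for three
    # quantitative SPECT reconstruction methods, e.g. from an NGS-type fit.
    methods = {
        "method A": (0.95, 0.12),
        "method B": (1.10, 0.30),
        "method C": (0.80, 0.10),
    }

    # Noise-to-slope ratio: lower values indicate better precision.
    nsr = {name: sigma / slope for name, (slope, sigma) in methods.items()}

    for name in sorted(nsr, key=nsr.get):
        print(f"{name}: NSR = {nsr[name]:.3f}")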

  18. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  19. A quantitative method to analyze the quality of EIA information in wind energy development and avian/bat assessments

    SciTech Connect

    Chang, Tony; Nielsen, Erik; Auberle, William; Solop, Frederic I.

    2013-01-15

    The environmental impact assessment (EIA) has been a tool for decision makers since the enactment of the National Environmental Policy Act (NEPA). Since that time, few analyses have been performed to verify the quality of information and content within EIAs. High-quality information within assessments is vital in order for decision makers, stakeholders, and the public to understand the potential impact of proposed actions on the ecosystem and wildlife species. Low-quality information has been a major cause of litigation and economic loss. Since 1999, wind energy development has seen exponential growth, with unknown levels of impact on wildlife species, in particular bird and bat species. The purpose of this article is to: (1) develop, validate, and apply a quantitative index to review avian/bat assessment quality for wind energy EIAs; and (2) assess the trends and status of avian/bat assessment quality in a sample of wind energy EIAs. This research presents the development and testing of the Avian and Bat Assessment Quality Index (ABAQI), a new approach to quantify information quality of ecological assessments within wind energy development EIAs in relation to avian and bat species based on review areas and factors derived from 23 state wind/wildlife siting guidance documents. The ABAQI was tested through a review of 49 publicly available EIA documents and validated by identifying high variation in avian and bat assessment quality for wind energy developments. Of all the reviewed EIAs, 66% failed to provide high levels of preconstruction avian and bat survey information, compared to recommended factors from state guidelines. This suggests the need for greater consistency across state guidelines, and mandatory compliance by EIA preparers to avoid possible habitat and species loss, wind energy development shutdowns, and future lawsuits. Highlights: • We developed, validated, and applied a quantitative index to review
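
    A review index of this kind is usually an additive score over predefined review factors, normalised to a maximum. The sketch below uses invented factor names and weights, not the published ABAQI factors, purely to illustrate the scoring logic.

    # Invented review factors and maximum scores; the real ABAQI factors are
    # derived from 23 state wind/wildlife siting guidance documents.
    MAX_SCORES = {
        "preconstruction avian surveys": 3,
        "preconstruction bat surveys": 3,
        "habitat mapping": 2,
        "cumulative impact discussion": 2,
    }

    def quality_score(ratings):
        """Sum rated factors and express them as a percentage of the maximum."""
        total = sum(ratings.get(k, 0) for k in MAX_SCORES)
        return 100.0 * total / sum(MAX_SCORES.values())

    example_eia = {"preconstruction avian surveys": 1,
                   "preconstruction bat surveys": 0,
                   "habitat mapping": 2,
                   "cumulative impact discussion": 1}
    print(f"quality score: {quality_score(example_eia):.0f}%")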

  20. Rare Mimickers of Exostosis: A Case Series

    PubMed Central

    Perubhotla, Lakshmi Manasa

    2016-01-01

    Exophytic growths from bones are a common entity. Osteochondroma is the most common benign exophytic lesion, and we tend to diagnose every benign-looking exophytic lesion as osteochondroma. Here we report two such entities, a Nora’s lesion and a supracondylar process of the humerus, both mimickers of osteochondroma, and describe the salient features that differentiate them from osteochondromas.

  1. Pulmonary hyalinizing granuloma mimicking lung carcinoma.

    PubMed

    Basoglu, A; Findik, S; Celik, B; Yildiz, L

    2006-06-01

    Pulmonary hyalinizing granuloma has rarely been reported and is a benign entity of unknown origin. The chest radiograph reveals multiple and frequently bilateral pulmonary nodules. We describe a patient with pulmonary hyalinizing granuloma who presented with a central mass in the left lung mimicking lung carcinoma. PMID:16755455

  2. [Two cystic retroperitoneal lesions mimicking adrenal cysts].

    PubMed

    Grabellus, F; Dereskewitz, C; Schmitz, K J; Kaiser, G M; Kühl, H; Kersting, C; Frilling, A; Metz, K A; Baba, H A

    2005-05-01

    Adrenal cysts are uncommon lesions and most of them are found incidentally during abdominal imaging. We report on two benign extraadrenal lesions mimicking adrenal tumors in abdominal imaging. The histopathological investigation of the lesions revealed a foregut duplication cyst of the lesser gastric curvature and an epithelial inclusion cyst (epidermoid cyst) in an intrapancreatic accessory spleen, respectively.

  3. Quantitative assessment of hemadsorption by myxoviruses: anti-immunoglobulin G hemadsorption-inhibition test.

    PubMed

    Hahon, N; Stewart, J D; Eckert, H L

    1973-04-01

    A quantitative hemadsorption-inhibition test was developed to estimate myxovirus serum antibodies within 24 h by determining the serum dilution inhibiting hemadsorption in 50% of the infected cells. The test depends on the interactions of virus-infected cell monolayers with antiviral serum and of the resultant complexes with antiimmunoglobulin G serum. The incorporation of species-specific anti-immunoglobulin G serum into the test significantly increased sensitivity. PMID:4349249
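
    The reported endpoint, the serum dilution inhibiting hemadsorption in 50% of infected cells, can be estimated by interpolating between the dilutions that bracket 50% inhibition. A minimal sketch using log-linear interpolation and invented titration data:

    import numpy as np

    # Invented titration: reciprocal serum dilutions and percent of infected cells
    # showing inhibition of hemadsorption at each dilution.
    dilutions = np.array([10, 20, 40, 80, 160, 320], dtype=float)
    percent_inhibited = np.array([98, 91, 72, 44, 18, 5], dtype=float)

    # Interpolate the 50% crossing on a log2(dilution) scale.
    log_d = np.log2(dilutions)
    endpoint_log = np.interp(50.0, percent_inhibited[::-1], log_d[::-1])
    print(f"50% hemadsorption-inhibition titre: 1:{2 ** endpoint_log:.0f}")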

  4. Quantitative assessment of cancer risk from exposure to diesel engine emissions

    SciTech Connect

    Pepelko, W.E.; Chen, C.

    1993-01-01

    Quantitative estimates of lung cancer risk from exposure to diesel engine emissions were developed using data from three chronic bioassays with Fischer 344 rats. Human target organ dose was estimated with the aid of a comprehensive dosimetry model. The epithelial tissue lining the alveoli and lower airways is the primary target site for induction of lung tumors. Dose was therefore based upon the concentration of carbon particulate matter per unit lung surface area.

  5. Quantitative assessment of in situ microbial communities affecting nuclear waste disposal

    SciTech Connect

    White, D.C. |

    1996-05-01

    Microbes in the environments surrounding nuclear waste repositories pose several questions regarding the protection of nearby communities. Microbes can facilitate microbially influenced corrosion (MIC), mobilize and facilitate the transport of nuclides, and produce gaseous emissions which can compromise containment. We have developed an analysis of the extant microbiota that is independent of quantitative recovery and subsequent growth, based on signature biomarker analysis (SBA).

  6. Assessing quantitative chimerism longitudinally: technical considerations, clinical applications and routine feasibility.

    PubMed

    Kristt, D; Stein, J; Yaniv, I; Klein, T

    2007-03-01

    In this review, we describe the current laboratory approach to quantitative chimerism testing based on short tandem repeats (STRs), focusing on a longitudinal analysis. The latter is based on relative changes appearing in the course of sequential samples, and as such exploits the ultimate potential of this intrinsically semiquantitative platform. Such an analysis is more informative than single static values, less likely to be confused with platform artifacts, and is individualized to the particular patient. It is particularly useful with non-myeloablative conditioning, where mixed chimerism is common. Importantly, longitudinal monitoring is a routinely feasible laboratory option because multiplex STR-polymerase chain reaction kits are available commercially, and modern software can be used to perform computation, reliability testing and longitudinal tracking in a rapid, easy to use format. The ChimerTrack application, a shareware, user friendly program developed for this purpose, produces a report that automatically summarizes and illustrates the quantitative temporal course of the patient's chimeric status. Such a longitudinal perspective enhances the value of quantitative chimerism monitoring for decisions regarding immunomodulatory post transplant therapy. This information also provides unique insights into the biological dynamics of engraftment underlying the fluctuations in the temporal course of a patient's chimeric status.
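
    With informative STR markers, the quantitative step reduces to the ratio of donor-derived to total allele signal, tracked across sequential samples. A minimal sketch with hypothetical peak areas; it is not the ChimerTrack implementation.

    def donor_chimerism(donor_peak_area, recipient_peak_area):
        """Percent donor chimerism from informative STR peak areas."""
        return 100.0 * donor_peak_area / (donor_peak_area + recipient_peak_area)

    # Sequential post-transplant samples: (donor peak area, recipient peak area).
    timepoints = {"day +30": (8200, 1800), "day +60": (7400, 2600), "day +90": (6100, 3900)}

    previous = None
    for day, (d, r) in timepoints.items():
        value = donor_chimerism(d, r)
        change = "" if previous is None else f" (change {value - previous:+.1f} points)"
        print(f"{day}: {value:.1f}% donor{change}")
        previous = value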

  7. Quantitative approaches for assessment of white matter hyperintensities in elderly populations

    PubMed Central

    Brickman, Adam M.; Sneed, Joel R.; Provenzano, Frank A.; Garcon, Ernst; Johnert, Lauren; Muraskin, Jordan; Yeung, Lok-Kin; Zimmerman, Molly E.; Roose, Steven P.

    2011-01-01

    White matter hyperintensities (WMH) are areas of increased signal on T2-weighted magnetic resonance imaging (MRI), including fluid-attenuated inversion recovery (FLAIR) sequences. Total and regional WMH burden (i.e., volume or severity) has been associated with myriad cognitive, neurological, and psychiatric conditions among older adults. In the current report, we illustrate two approaches to quantify periventricular, deep, and total WMH and examine their reliability and criterion validity among 28 elderly patients enrolled in a depression treatment trial. The first approach, an operator-driven quantitative approach, involves visual inspection of individual MRI scans and manual labeling using a three-step series of procedures. The second approach, a fully automated quantitative approach, uses a processing stream that involves image segmentation, voxel intensity thresholding, and seed growing to label WMH and calculate their volume automatically. There was good agreement in WMH quantification between the two approaches (Cronbach’s alpha values from 0.835 to 0.968). Further, severity of WMH was significantly associated with worse depression and increased age, and these associations did not differ significantly between the two quantification approaches. We provide evidence for good reliability and criterion validity for two approaches for WMH volume determination. The operator-driven approach may be better suited for smaller studies with highly trained raters, whereas the fully automated quantitative approach may be more appropriate for larger, high-throughput studies. PMID:21680159
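
    Agreement between the operator-driven and automated WMH volumes is summarised above with Cronbach's alpha, which is straightforward to compute from the paired measurements. A minimal sketch with invented volumes for a handful of subjects:

    import numpy as np

    def cronbach_alpha(measurements):
        """Cronbach's alpha for an (n_subjects, n_methods) array of measurements."""
        m = np.asarray(measurements, dtype=float)
        k = m.shape[1]
        item_var = m.var(axis=0, ddof=1).sum()
        total_var = m.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_var / total_var)

    # Invented WMH volumes (mL): column 0 operator-driven, column 1 fully automated.
    volumes = [[4.1, 4.4], [12.8, 13.5], [2.0, 1.8], [25.3, 23.9], [7.7, 8.2]]
    print(f"Cronbach's alpha: {cronbach_alpha(volumes):.3f}")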

  8. Quantitative assessment of the multiple processes responsible for bilirubin homeostasis in health and disease.

    PubMed

    Levitt, David G; Levitt, Michael D

    2014-01-01

    Serum bilirubin measurements are commonly obtained for the evaluation of ill patients and to screen for liver disease in routine physical exams. An enormous research effort has identified the multiple mechanisms involved in the production and metabolism of conjugated (CB) and unconjugated bilirubin (UB). While the qualitative effects of these mechanisms are well understood, their expected quantitative influence on serum bilirubin homeostasis has received less attention. In this review, each of the steps involved in bilirubin production, metabolism, hepatic cell uptake, and excretion is quantitatively examined. We then attempt to predict the expected effect of normal and defective function on serum UB and CB levels in health and disease states including hemolysis, extra- and intrahepatic cholestasis, hepatocellular diseases (eg, cirrhosis, hepatitis), and various congenital defects in bilirubin conjugation and secretion (eg, Gilbert's, Dubin-Johnson, Crigler-Najjar, Rotor syndromes). Novel aspects of this review include: 1) quantitative estimates of the free and total UB and CB in the plasma, hepatocyte, and bile; 2) detailed discussion of the important implications of the recently recognized role of the hepatic OATP transporters in the maintenance of CB homeostasis; 3) discussion of the differences between the standard diazo assay versus chromatographic measurement of CB and UB; 4) pharmacokinetic implications of the extremely high-affinity albumin binding of UB; 5) role of the enterohepatic circulation in physiologic jaundice of newborn and fasting hyperbilirubinemia; and 6) insights concerning the clinical interpretation of bilirubin measurements. PMID:25214800

  9. Proposal for the assessment of quantitative dermal exposure limits in occupational environments: Part 1. Development of a concept to derive a quantitative dermal occupational exposure limit

    PubMed Central

    Bos, P. M.; Brouwer, D. H.; Stevenson, H.; Boogaard, P. J.; de Kort, W. L.; van Hemmen, J. J.

    1998-01-01

    Dermal uptake of chemicals at the workplace may contribute considerably to the total internal exposure and so needs to be regulated. At present only qualitative warning signs--the "skin notations"--are available as instruments. An attempt was made to develop a quantitative dermal occupational exposure limit (DOEL) complementary to respiratory occupational exposure limits (OELs). The DOEL refers to the total dose deposited on the skin during a working shift. Based on available data and experience a theoretical procedure for the assessment of a DOEL was developed. A DOEL was derived for cyclophosphamide and 4,4-methylene dianiline (MDA) according to this procedure. The DOEL for MDA was tested for applicability in an actual occupational exposure scenario. An integrated approach is recommended for situations in which both dermal and respiratory exposures contribute considerably to the internal exposure of the worker. The starting point should be an internal health based occupational exposure limit--that is, the maximum dose to be absorbed without leading to adverse systemic effects. The proposed assessment of an external DOEL is then either based on absorption rate or absorption percentage. The estimation of skin penetration seems to be of crucial importance in this concept. If for a specific substance a maximal absorption rate can be estimated a maximal skin surface area to be exposed can be assessed which may then serve the purpose of a DOEL. As long as the actual skin surface exposed is smaller than this maximal skin surface area the internal OEL will not be exceeded, and therefore, no systemic health problems would be expected, independent of the dermal dose/unit area. If not, the DOEL may be interpreted as the product of dermal dose/unit area (mg/cm2) and exposed skin surface area (cm2). The proposed concept for a DOEL is relevant and can be made applicable for health surveillance in the occupational situation where dermal exposure contributes notably to the
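
    The arithmetic proposed here is simple once the inputs are fixed: a DOEL follows from the internal OEL and an absorption fraction, and can be converted into a maximal exposed skin surface area for a given acceptable dermal dose per unit area. A minimal sketch with placeholder numbers (the paper derives actual values for cyclophosphamide and MDA):

    # Placeholder inputs for an illustrative substance (not from the paper).
    internal_oel_mg = 1.0        # maximum dose absorbed per shift without adverse effects
    absorption_fraction = 0.10   # fraction of the deposited dermal dose that is absorbed
    dose_per_area = 0.002        # acceptable dermal dose per unit area, mg/cm^2

    # External DOEL: total dose that may be deposited on the skin during a shift.
    doel_mg = internal_oel_mg / absorption_fraction
    print(f"DOEL: {doel_mg:.1f} mg per shift")

    # Equivalent maximal skin surface area at the stated dose per unit area.
    max_area_cm2 = doel_mg / dose_per_area
    print(f"maximal exposed skin area: {max_area_cm2:.0f} cm^2")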

  10. Quantitative assessment of pressure sore generation and healing through numerical analysis of high-frequency ultrasound images.

    PubMed

    Moghimi, Sahar; Miran Baygi, Mohammad Hossein; Torkaman, Giti; Mahloojifar, Ali

    2010-01-01

    This article focuses on the development of a method to quantitatively assess the healing process of artificially induced pressure sores using high-frequency (20 MHz) ultrasound images. We induced sores in guinea pigs and monitored predefined regions on days 3, 7, 14, and 21 after sore generation. We extracted relevant parameters regarding the tissue echographic structure and attenuation properties. We examined tissue healing by defining a healing function that used the extracted parameters. We verified the significance of the extracted features by using analysis of variance and multiple comparison tests. The features displayed ascending/descending behavior during wound generation and reverse behavior during healing. We optimized the parameters of our healing function by using a pattern search method. We tested the efficiency of the optimized values by calculating the healing function value on assessment days and then comparing these results with the expected pattern of changes in the tissue conditions after removing the applied pressure. The results of this study suggest that the methodology developed may be a viable tool for quantitative assessment of pressure sores during their early generation as well as during healing stages.

  11. Quantitative assessment of trabecular bone micro-architecture of the wrist via 7 Tesla MRI: preliminary results

    PubMed Central

    Wang, Ligong; Liang, Guoyuan; Babb, James S.; Wiggins, Graham C.; Saha, Punam K.; Regatte, Ravinder R.

    2013-01-01

    Object The goal of this study was to determine the feasibility of performing quantitative 7T magnetic resonance imaging (MRI) assessment of trabecular bone micro-architecture of the wrist, a common fracture site. Materials and methods The wrists of 4 healthy subjects (1 woman, 3 men, 28 ± 8.9 years) were scanned on a 7T whole body MR scanner using a 3D fast low-angle shot (FLASH) sequence (TR/TE = 20/4.5 ms, 0.169 × 0.169 × 0.5 mm). Trabecular bone was segmented and divided into 4 or 8 angular subregions. Total bone volume (TBV), bone volume fraction (BVF), surface-curve ratio (SC), and erosion index (EI) were computed. Subjects were scanned twice to assess measurement reproducibility. Results Group mean subregional values for TBV, BVF, SC, and EI (8 subregion analysis) were as follows: 8489 ± 3686, 0.27 ± 0.045, 9.61 ± 6.52, and 1.43 ± 1.25, respectively. Within each individual, there was subregional variation in TBV, SC, and EI (>5%), but not BVF (<5%). Intersubject variation (≥12%) existed for all parameters. Within-subject coefficients of variation were ≤10%. Conclusion This is the first study to perform quantitative 7T MRI assessment of trabecular bone micro-architecture of the wrist. This method could be utilized to study perturbations in bone structure in subjects with osteoporosis or other bone disorders. PMID:21544680
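
    The basic descriptors reported above, total bone volume and bone volume fraction, are simple functions of a binary segmentation, and scan-rescan reproducibility is commonly expressed as a within-subject coefficient of variation. A minimal sketch with a synthetic segmentation and hypothetical repeat-scan values:

    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical binary segmentation of a trabecular subregion (1 = bone voxel).
    subregion = (rng.random((64, 64, 32)) < 0.27).astype(np.uint8)

    total_bone_volume = int(subregion.sum())    # TBV in voxels
    bone_volume_fraction = subregion.mean()     # BVF = bone voxels / all voxels
    print(f"TBV: {total_bone_volume} voxels, BVF: {bone_volume_fraction:.3f}")

    # Scan-rescan reproducibility as a within-subject coefficient of variation.
    scan, rescan = 0.272, 0.265   # hypothetical BVF values from repeat scans
    cv = np.std([scan, rescan], ddof=1) / np.mean([scan, rescan]) * 100
    print(f"within-subject CV: {cv:.1f}%")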

  12. A Quantitative Toxicogenomics Assay for High-throughput and Mechanistic Genotoxicity Assessment and Screening of Environmental Pollutants.

    PubMed

    Lan, Jiaqi; Gou, Na; Rahman, Sheikh Mokhles; Gao, Ce; He, Miao; Gu, April Z

    2016-03-15

    The ecological and health concerns of mutagenicity and carcinogenicity potentially associated with an overwhelmingly large and ever-increasing number of chemicals demand a cost-effective and feasible method for genotoxicity screening and risk assessment. This study proposed a genotoxicity assay using GFP-tagged yeast reporter strains, covering 38 selected protein biomarkers indicative of all seven known DNA damage repair pathways. The assay was applied to assess four model genotoxic chemicals, eight environmental pollutants and four negative controls across six concentrations. Quantitative molecular genotoxicity end points were derived based on dose-response modeling of a newly developed integrated molecular effect quantifier, the Protein Effect Level Index (PELI). The molecular genotoxicity end points were consistent with multiple conventional in vitro genotoxicity assays, as well as with in vivo carcinogenicity assay results. Furthermore, the proposed PELI genotoxicity end points correlated quantitatively with both a comet assay in human cells and a carcinogenic potency assay in mice, providing promising evidence for linking molecular disturbance measurements to adverse outcomes at a biologically relevant level. In addition, the high-resolution profiles of altered protein expression across DNA damage repair pathways allowed for chemical clustering and classification. This toxicogenomics-based assay presents a promising alternative for fast, efficient and mechanistic genotoxicity screening and assessment of drugs, foods, and environmental contaminants.

  13. Geneflow from GM plants--towards a more quantitative risk assessment.

    PubMed

    Poppy, Guy M

    2004-09-01

    Assessing the risks associated with gene flow from GM crops to wild relatives is a significant scientific challenge. Most researchers have focused on assessing the frequency of gene flow, too often on a localized scale, while ignoring the hazards caused by gene flow. To quantify risk, multi-disciplinary research teams need to unite and scale up their studies.

  14. Quantitative and Qualitative Assessment of Solfege in a Brazilian Higher Educational Context

    ERIC Educational Resources Information Center

    Teixeira Dos Santos, Regina Antunes; Del-Ben, Luciana

    2010-01-01

    This article reports on the feasibility of using the assessment criteria for solfege proposed by Davidson, Scripp, and Meyaard as a way to assess a group of Brazilian undergraduate students. The experiment was carried out in 2003, with 16 first-year students in a variety of majors, each with different levels of previous music experience. The…

  15. Global and local health burden trade-off through the hybridisation of quantitative microbial risk assessment and life cycle assessment to aid water management.

    PubMed

    Kobayashi, Yumi; Peters, Greg M; Ashbolt, Nicholas J; Heimersson, Sara; Svanström, Magdalena; Khan, Stuart J

    2015-08-01

    Life cycle assessment (LCA) and quantitative risk assessment (QRA) are commonly used to evaluate potential human health impacts associated with proposed or existing infrastructure and products. Each approach has a distinct objective and, consequently, their conclusions may be inconsistent or contradictory. It is proposed that the integration of elements of QRA and LCA may provide a more holistic approach to health impact assessment. Here we examine the possibility of merging LCA assessed human health impacts with quantitative microbial risk assessment (QMRA) for waterborne pathogen impacts, expressed with the common health metric, disability adjusted life years (DALYs). The example of a recent large-scale water recycling project in Sydney, Australia was used to identify and demonstrate the potential advantages and current limitations of this approach. A comparative analysis of two scenarios - with and without the development of this project - was undertaken for this purpose. LCA and QMRA were carried out independently for the two scenarios to compare human health impacts, as measured by DALYs lost per year. LCA results suggested that construction of the project would lead to an increased number of DALYs lost per year, while estimated disease burden resulting from microbial exposures indicated that it would result in the loss of fewer DALYs per year than the alternative scenario. By merging the results of the LCA and QMRA, we demonstrate the advantages in providing a more comprehensive assessment of human disease burden for the two scenarios, in particular, the importance of considering the results of both LCA and QRA in a comparative assessment of decision alternatives to avoid problem shifting. The application of DALYs as a common measure between the two approaches was found to be useful for this purpose.
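
    Because both assessments are expressed in DALYs, the merging step itself is arithmetically trivial; the substance lies in generating the two inputs. A minimal sketch of the comparative step with invented per-scenario burdens, not the Sydney case-study results:

    # Invented annual burdens (DALYs lost per year) for two water-management scenarios.
    scenarios = {
        "with recycling project":    {"LCA": 3.2, "QMRA": 0.4},
        "without recycling project": {"LCA": 2.1, "QMRA": 1.9},
    }

    for name, burden in scenarios.items():
        total = burden["LCA"] + burden["QMRA"]
        print(f"{name}: LCA {burden['LCA']:.1f} + QMRA {burden['QMRA']:.1f} "
              f"= {total:.1f} DALYs/year")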

  17. Quantitative Assessment of Motor and Sensory/Motor Acquisition in Handicapped and Nonhandicapped Infants and Young Children. Volume IV: Application of the Procedures.

    ERIC Educational Resources Information Center

    Guess, Doug; And Others

    Three studies that applied quantitative procedures to measure motor and sensory/motor acquisition among handicapped and nonhandicapped infants and children are presented. In addition, a study concerning the replication of the quantitative procedures for assessing rolling behavior is described in a fourth article. The first study, by C. Janssen,…

  18. Assessment and introduction of quantitative resistance to Fusarium head blight in elite spring barley.

    PubMed

    Linkmeyer, A; Götz, M; Hu, L; Asam, S; Rychlik, M; Hausladen, H; Hess, M; Hückelhoven, R

    2013-12-01

    Breeding for resistance is a key task in controlling Fusarium head blight (FHB), a devastating disease of small-grain cereals leading to economic losses and grain contamination with mycotoxins harmful to humans and animals. In the present work, the resistance of the six-rowed spring barley 'Chevron' to FHB in Germany was compared with that of adapted German spring barley cultivars. Both under natural infection conditions and after spray inoculation with conidia of Fusarium culmorum, F. sporotrichioides, and F. avenaceum under field conditions, Chevron showed a high level of quantitative resistance to infection and to contamination of grain with diverse mycotoxins. This indicates that Chevron has low susceptibility not only to deoxynivalenol-producing Fusarium spp. but also to Fusarium spp. producing type A trichothecenes and enniatins. Monitoring the initial infection course of F. culmorum on barley lemma tissue by confocal laser-scanning microscopy provided evidence that FHB resistance of Chevron is partially mediated by a preformed penetration resistance, because direct penetration of floral tissue by F. culmorum was rarely observed on Chevron but was common on susceptible genotypes. Instead, F. culmorum penetrated Chevron lemma tissue via stomata, which was unusual for susceptible genotypes. We generated doubled-haploid barley populations segregating for the major FHB resistance quantitative trait locus (QTL) Qrgz-2H-8 of Chevron and subsequently characterized these populations by spray inoculation with conidia of F. culmorum and F. sporotrichioides. This showed that Qrgz-2H-8 was functional in the genetic background of European elite barley cultivars. However, the degree of resistance achieved was much lower than the quantitative resistance of the QTL donor Chevron, and introgression of Qrgz-2H-8 was not sufficient to confer the cellular resistance phenotype of Chevron in the European backgrounds.

  19. Fluorescence microangiography for quantitative assessment of peritubular capillary changes after AKI in mice.

    PubMed

    Kramann, Rafael; Tanaka, Mari; Humphreys, Benjamin D

    2014-09-01

    AKI predicts the future development of CKD, and one proposed mechanism for this epidemiologic link is loss of peritubular capillaries triggering chronic hypoxia. A precise definition of changes in peritubular perfusion would help test this hypothesis by more accurately correlating these changes with future loss of kidney function. Here, we have adapted and validated a fluorescence microangiography approach for use with mice to visualize, analyze, and quantitate peritubular capillary dynamics after AKI. A novel software-based approach enabled rapid and automated quantitation of capillary number, individual area, and perimeter. After validating perfusion in mice with genetically labeled endothelia, we compared peritubular capillary number and size after moderate AKI, characterized by complete renal recovery, and after severe AKI, characterized by development of interstitial fibrosis and CKD. Eight weeks after severe AKI, we measured a 40%±7.4% reduction in peritubular capillary number (P<0.05) and a 36%±4% decrease in individual capillary cross-sectional area (P<0.001) for a 62%±2.2% reduction in total peritubular perfusion (P<0.01). Whereas total peritubular perfusion and number of capillaries did not change after moderate AKI, we detected a significant change in individual capillary size. The loss of peritubular capillary density and caliber at week 8 closely correlated with severity of kidney injury at day 1, suggesting irreparable microvascular damage. These findings emphasize a direct link between severity of acute injury and future loss of peritubular perfusion, demonstrate that reduced capillary caliber is an unappreciated long-term consequence of AKI, and offer a new quantitative imaging tool for understanding how AKI leads to future CKD in mouse models.
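
    The automated quantitation described above can be approximated with standard image-analysis tools; the sketch below segments a perfused-capillary image and reports count, area and perimeter per capillary. The file name and Otsu thresholding step are assumptions; the published software may differ.

        # Minimal sketch of capillary quantitation from a fluorescence
        # microangiography section; thresholding and file name are assumptions.
        import numpy as np
        from skimage import io, filters, measure

        img = io.imread("fma_section.tif", as_gray=True)   # hypothetical image file
        mask = img > filters.threshold_otsu(img)           # segment perfused lumens
        labels = measure.label(mask)
        props = measure.regionprops(labels)

        areas = np.array([p.area for p in props])
        perimeters = np.array([p.perimeter for p in props])
        print(f"capillary number: {len(props)}")
        print(f"mean area (px^2): {areas.mean():.1f}, mean perimeter (px): {perimeters.mean():.1f}")
        print(f"total perfused area (px^2): {areas.sum()}")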

  20. Assessment of mold concentrations in Singapore shopping centers using mold-specific quantitative PCR (MSQPCR) analysis.

    PubMed

    Yap, Jennifer; Toh, Zhen Ann; Goh, Vivien; Ng, Lee Chen; Vesper, Stephen

    2009-09-01

    Molds can pose a human health threat and may amplify in buildings in humid climates. The objective of this study was to evaluate the mold growth in Singapore shopping centers based on the collection of 40 dust samples from 15 shopping centers, including one with a history of water damage. The dust was analyzed by a DNA-based technology called mold-specific quantitative PCR (MSQPCR). In a water-damaged shopping center, most of the 26 water-damage indicator species were detected at some concentration and many were much more abundant than the average in the shopping centers. MSQPCR is a useful method for quantifying indoor molds in tropical climates.

  1. Quantitative assessment of hyaline cartilage elasticity during optical clearing using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Liu, Chih-Hao; Singh, Manmohan; Li, Jiasong; Han, Zhaolong; Wu, Chen; Wang, Shang; Idugboe, Rita; Raghunathan, Raksha; Zakharov, Valery P.; Sobol, Emil N.; Tuchin, Valery V.; Twa, Michael; Larin, Kirill V.

    2015-03-01

    We report the first study using optical coherence elastography (OCE) to quantitatively monitor the change in elasticity of hyaline cartilage during optical clearing administered with a glucose solution. The elasticity measurement was verified using a uniaxial compression test, demonstrating the feasibility of using OCE to quantify the Young's modulus of cartilage tissue. As a result, we found that the stiffness of the hyaline cartilage increases during optical clearing of the tissue. This approach might be useful for the early detection of osteoarthritis.

  2. Quantitative Carré differential interference contrast microscopy to assess phase and amplitude.

    PubMed

    Duncan, Donald D; Fischer, David G; Dayton, Amanda; Prahl, Scott A

    2011-06-01

    We present a method of using an unmodified differential interference contrast microscope to acquire quantitative information on scatter and absorption of thin tissue samples. A simple calibration process is discussed that uses a standard optical wedge. Subsequently, we present a phase-stepping procedure for acquiring phase gradient information exclusive of absorption effects. The procedure results in two-dimensional maps of the local angular (polar and azimuthal) ray deviation. We demonstrate the calibration process, discuss details of the phase-stepping algorithm, and present representative results for a porcine skin sample.
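
    The phase-stepping procedure mentioned above is based on the classic Carré four-frame calculation, which recovers phase without knowing the exact phase step; the sketch below applies it to synthetic frames and does not reproduce the paper's wedge calibration or the conversion of phase gradients to ray-deviation angles.

        # Minimal sketch of the Carré four-frame phase-shifting calculation on
        # synthetic data; the paper's calibration steps are not reproduced.
        import numpy as np

        def carre_phase(i1, i2, i3, i4):
            """Wrapped phase from four equally stepped frames with unknown step size."""
            num = np.sqrt(np.abs((3.0 * (i2 - i3) - (i1 - i4)) * ((i1 - i4) + (i2 - i3))))
            den = (i2 + i3) - (i1 + i4)
            return np.arctan2(num * np.sign(i2 - i3), den)   # sign of (i2 - i3) restores the quadrant

        # Synthetic test: known phase ramp sampled at steps of 90 degrees.
        phi = np.linspace(-1.2, 1.2, 256).reshape(1, -1) * np.ones((256, 1))
        step = np.pi / 2
        i1, i2, i3, i4 = (1.0 + 0.8 * np.cos(phi + k * step) for k in (-1.5, -0.5, 0.5, 1.5))
        recovered = carre_phase(i1, i2, i3, i4)
        print("max phase error (rad):", np.max(np.abs(recovered - phi)))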

  3. Quantitative assessment of the p53-Mdm2 feedback loop using protein lysate microarrays.

    PubMed

    Ramalingam, Sundhar; Honkanen, Peter; Young, Lynn; Shimura, Tsutomu; Austin, John; Steeg, Patricia S; Nishizuka, Satoshi

    2007-07-01

    Mathematical simulations of the p53-Mdm2 feedback loop suggest that both proteins will exhibit impulsive expression characteristics in response to high cellular stress levels. However, little quantitative experimental evaluation has been done, particularly of the phosphorylated forms. To evaluate the mathematical models experimentally, we used lysate microarrays from an isogenic pair of gamma-ray-irradiated cell lysates from HCT116 (p53(+/+) and p53(-/-)). Both p53 and Mdm2 proteins showed expected pulses in the wild type, whereas no pulses were seen in the knockout. Based on experimental observations, we determined model parameters and generated an in silico "knockout," reflecting the experimental data, including phosphorylated proteins.

  4. A Quantitative Climate-Match Score for Risk-Assessment Screening of Reptile and Amphibian Introductions

    NASA Astrophysics Data System (ADS)

    van Wilgen, Nicola J.; Roura-Pascual, Núria; Richardson, David M.

    2009-09-01

    Assessing climatic suitability provides a good preliminary estimate of the invasive potential of a species to inform risk assessment. We examined two approaches for bioclimatic modeling for 67 reptile and amphibian species introduced to California and Florida. First, we modeled the worldwide distribution of the biomes found in the introduced range to highlight similar areas worldwide from which invaders might arise. Second, we modeled potentially suitable environments for species based on climatic factors in their native ranges, using three sources of distribution data. Performance of the three datasets and both approaches were compared for each species. Climate match was positively correlated with species establishment success (maximum predicted suitability in the introduced range was more strongly correlated with establishment success than mean suitability). Data assembled from the Global Amphibian Assessment through NatureServe provided the most accurate models for amphibians, while ecoregion data compiled by the World Wide Fund for Nature yielded models which described reptile climatic suitability better than available point-locality data. We present three methods of assigning a climate-match score for use in risk assessment using both the mean and maximum climatic suitabilities. Managers may choose to use different methods depending on the stringency of the assessment and the available data, facilitating higher resolution and accuracy for herpetofaunal risk assessment. Climate-matching has inherent limitations and other factors pertaining to ecological interactions and life-history traits must also be considered for thorough risk assessment.

  5. Objective patellar instability: MR-based quantitative assessment of potentially associated anatomical features.

    PubMed

    Escala, Joan S; Mellado, José M; Olona, Montserrat; Giné, Josep; Saurí, Amadeu; Neyret, Phillipe

    2006-03-01

    To evaluate and compare the diagnostic utility of multiple quantitative parameters as measured on knee magnetic resonance (MR) examinations of patients suffering from objective patellar instability (OPI). We performed a retrospective evaluation of knee MR examinations in a group of 46 patients (59 knees) with clinically proven OPI, and in a control group of 69 patients (71 knees). Multiple quantitative parameters in both groups were statistically evaluated and compared for their association with OPI. OPI patients tend to present a shallower trochlear groove (<5 mm), a larger Insall-Salvati index (>1.2), a shorter patellar nose (<9 mm), a smaller morphology ratio (<1.2), and greater patellar tilt (>11 degrees) than control patients. The best sensitivities were those of the lateral patellar tilt (92.7%), the trochlear groove depth at the roman arch level (85.7%) and the Insall-Salvati index (78%). The best specificities were those of the morphology ratio (86.9%), the patellar nose (84.5%) and the patellar tendon length (84.5%). A shallow trochlear groove may be confidently identified at the roman arch view in OPI patients. Patella alta may be more reliably detected by the Insall-Salvati index in OPI patients. Patellar nose and morphology ratio are very specific indicators of OPI. A short patellar nose (that is to say, a patellar nose ratio of <0.25) has a high association with OPI. Lateral patellar tilt remains the single feature with the highest sensitivity and specificity for identifying OPI patients.

  6. A quantitative approach to assessing the efficacy of occupant protection programs: A case study from Montana.

    PubMed

    Manlove, Kezia; Stanley, Laura; Peck, Alyssa

    2015-10-01

    Quantitative evaluation of vehicle occupant protection programs is critical for ensuring efficient government resource allocation, but few methods exist for conducting evaluation across multiple programs simultaneously. Here we present an analysis of occupant protection efficacy in the state of Montana. This approach relies on seat belt compliance rates as measured by the National Occupant Protection Usage Survey (NOPUS). A hierarchical logistic regression model is used to estimate the impacts of four Montana Department of Transportation (MDT)-funded occupant protection programs, following adjustment for a suite of potential confounders. Activity from two programs, Buckle Up coalitions and media campaigns, is associated with increased seat belt use in Montana, whereas the impact of another program, Selective Traffic Enforcement, is potentially masked by other program activity. A final program, Driver's Education, is not associated with any shift in seat belt use. This method allows for a preliminary quantitative estimation of program impacts without requiring states to obtain any new seat belt use data, and it provides states with a means for carefully planning future program allocation and investigation.
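
    As a simplified, pooled stand-in for the hierarchical model described above (not the authors' exact specification), the sketch below regresses observed belt use on program-activity indicators with confounder adjustment; the data file and column names are hypothetical.

        # Simplified sketch: pooled logistic regression of belt use on program
        # indicators; file and column names are hypothetical, and the study's
        # model additionally included hierarchical (grouped) structure.
        import pandas as pd
        import statsmodels.formula.api as smf

        obs = pd.read_csv("nopus_observations.csv")   # one row per observed vehicle occupant
        model = smf.logit(
            "belted ~ buckle_up_coalition + media_campaign + enforcement + drivers_ed"
            " + urban + C(vehicle_type) + C(year)",
            data=obs,
        ).fit()
        print(model.summary())   # positive coefficients indicate higher odds of belt use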

  7. Quantitative assessment of canalicular bile formation in isolated hepatocyte couplets using microscopic optical planimetry.

    PubMed Central

    Gautam, A; Ng, O C; Strazzabosco, M; Boyer, J L

    1989-01-01

    Isolated rat hepatocyte couplets (IRHC) are primary units of bile secretion that accumulate fluid in an enclosed canalicular space with time in culture. We have quantitated the rate of canalicular secretion in IRHC cultured for 4-8 h by measuring the change in canalicular space volume by video-microscopic optical planimetry using high resolution Nomarski optics. Electron microscopic morphometric studies revealed significant increases in canalicular membrane area after 4-6 h in culture. Canalicular secretion in basal L-15 medium (3.8 +/- 1.3 fl/min) increased significantly with the choleretic bile salts (10 microM), taurocholate, and ursodeoxycholate (14 +/- 7 fl/min each). Secretion rates after exposure to bile acids correlated directly with the canalicular surface area before stimulation. In contrast, expansion times after stimulation varied inversely with initial canalicular volumes. Ursodeoxycholic acid failed to produce a hypercholeresis at 10-, 100-, or 200-microM concentrations compared with taurocholate, either in normal or taurine-depleted IRHC. The present findings establish that rates of canalicular bile secretion can be quantitated in IRHC by serial optical planimetry, both in the basal state and after stimulation with bile acids. Furthermore, ursodeoxycholate does not acutely induce hypercholeresis at the canalicular level in this model. Rather, both taurocholic and ursodeoxycholic acids induced secretion in proportion to the surface area of the canalicular membrane. The IRHC are a useful model to identify canalicular choleretics and for studies of canalicular bile formation.

  8. Quantitative assessment of vaginal microflora during use of tampons of various compositions.

    PubMed

    Onderdonk, A B; Zamarchi, G R; Rodriguez, M L; Hirsch, M L; Muñoz, A; Kass, E H

    1987-12-01

    Although the effect of vaginal tampons on the microbial flora during menstruation has recently been studied by several investigators, quantitative effects attributable to particular tampon fibers have received less attention. The purposes of the present study were (i) to determine and then to compare the effects of polyacrylate rayon tampons and viscose rayon tampons on the normal vaginal flora, (ii) to compare quantitative bacterial counts obtained from these tampons with those obtained from concomitant vaginal swabs, and (iii) to determine whether either of these tampon types alters the vaginal microflora when compared with the microflora in the same women using all-cotton tampons or external catamenial pads. Tampon and swab samples were obtained at predetermined times from 18 women for an average of seven menstrual cycles. Samples consisting of swabs from women wearing menstrual pads were compared with swab and tampon samples taken at predetermined times during the menstrual cycle from women using cotton, polyacrylate rayon, or viscose rayon tampons. Samples were analyzed for total aerobic, facultative, and anaerobic bacterial counts. Statistical evaluation of the results indicated that, on the whole, total bacterial counts decreased during menstruation and that the numbers of bacteria in tampons tended to be lower than those in swab samples taken at the same time. The tampon type had little effect on the vaginal microflora.

  9. Cold adaptation in the marine bacterium, Sphingopyxis alaskensis, assessed using quantitative proteomics.

    PubMed

    Ting, Lily; Williams, Timothy J; Cowley, Mark J; Lauro, Federico M; Guilhaus, Michael; Raftery, Mark J; Cavicchioli, Ricardo

    2010-10-01

    The cold marine environment constitutes a large proportion of the Earth's biosphere. Sphingopyxis alaskensis was isolated as a numerically abundant bacterium from several cold marine locations, and has been extensively studied as a model marine bacterium. Recently, a metabolic labelling platform was developed to comprehensively identify and quantify proteins from S. alaskensis. The approach incorporated data normalization and statistical validation for the purpose of generating highly confident quantitative proteomics data. Using this approach, we determined quantitative differences between cells grown at 10°C (low temperature) and 30°C (high temperature). Cold adaptation was linked to specific aspects of gene expression: a dedicated protein-folding system using GroESL, DnaK, DnaJ, GrpE, SecB, ClpB and PPIase; polyhydroxyalkanoate-associated storage materials; a link between enzymes in fatty acid metabolism and energy generation; de novo synthesis of polyunsaturated fatty acids in the membrane and cell wall; inorganic phosphate ion transport by a phosphate import PstB homologue; TonB-dependent receptor and bacterioferritin in iron homeostasis; histidine, tryptophan and proline amino acid metabolism; and a large number of proteins without annotated functions. This study provides a new level of understanding on how important marine bacteria can adapt to compete effectively in cold marine environments. This study is also a benchmark for comparative proteomic analyses with other important marine bacteria and other cold-adapted organisms.

  10. Quantitative assessment of antibody internalization with novel monoclonal antibodies against Alexa fluorophores.

    PubMed

    Liao-Chan, Sindy; Daine-Matsuoka, Barbara; Heald, Nathan; Wong, Tiffany; Lin, Tracey; Cai, Allen G; Lai, Michelle; D'Alessio, Joseph A; Theunissen, Jan-Willem

    2015-01-01

    Antibodies against cell surface antigens may be internalized through their specific interactions with these proteins and in some cases may induce or perturb antigen internalization. The anti-cancer efficacy of antibody-drug conjugates is thought to rely on their uptake by cancer cells expressing the surface antigen. Numerous techniques, including microscopy and flow cytometry, have been used to identify antibodies with desired cellular uptake rates. To enable quantitative measurements of internalization of labeled antibodies, an assay based on internalized and quenched fluorescence was developed. For this approach, we generated novel anti-Alexa Fluor monoclonal antibodies (mAbs) that effectively and specifically quench cell surface-bound Alexa Fluor 488 or Alexa Fluor 594 fluorescence. Utilizing Alexa Fluor-labeled mAbs against the EphA2 receptor tyrosine kinase, we showed that the anti-Alexa Fluor reagents could be used to monitor internalization quantitatively over time. The anti-Alexa Fluor mAbs were also validated in a proof of concept dual-label internalization assay with simultaneous exposure of cells to two different mAbs. Importantly, the unique anti-Alexa Fluor mAbs described here may also enable other single- and dual-label experiments, including label detection and signal enhancement in macromolecules, trafficking of proteins and microorganisms, and cell migration and morphology.

  11. A polarized multispectral imaging system for quantitative assessment of hypertrophic scars

    PubMed Central

    Ghassemi, Pejhman; Travis, Taryn E.; Moffatt, Lauren T.; Shupp, Jeffrey W.; Ramella-Roman, Jessica C.

    2014-01-01

    Hypertrophic scars (HTS) are a pathologic reaction of the skin and soft tissue to burn or other traumatic injury. Scar tissue can cause patients serious functional and cosmetic issues. Scar management strategies, specifically scar assessment techniques, are vital to improve clinical outcome. To date, no entirely objective method for scar assessment has been embraced by the medical community. In this study, we introduce, for the first time, a novel polarized multispectral imaging system combining out-of-plane Stokes polarimetry and Spatial Frequency Domain Imaging (SFDI). This imaging system enables us to assess the pathophysiology (hemoglobin, blood oxygenation, water, and melanin) and structural features (cellularity and roughness) of HTS. To apply the proposed technique in an in vivo experiment, dermal wounds were created in a porcine model and allowed to form into scars. The developed scars were then measured at various time points using the imaging system. Results showed good agreement with clinical Vancouver Scar Scale assessment and histological examinations.

  12. Personal Exposure Monitoring Wearing Protocol Compliance: An Initial Assessment of Quantitative Measurements

    EPA Science Inventory

    Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...

  13. Performance Assessment of Human and Cattle Associated Quantitative Real-time PCR Assays - slides

    EPA Science Inventory

    The presentation overview is (1) Single laboratory performance assessment of human- and cattle associated PCR assays and (2) A Field Study: Evaluation of two human fecal waste management practices in Ohio watershed.

  14. A quantitative ¹H nuclear magnetic resonance (qHNMR) method for assessing the purity of iridoids and secoiridoids.

    PubMed

    Li, Zeyun; Welbeck, Edward; Yang, Li; He, Chunyong; Hu, Haijun; Song, Ming; Bi, Kaishun; Wang, Zhengtao

    2015-01-01

    This paper utilized a quantitative ¹H nuclear magnetic resonance (qHNMR) method for assessing the purity of iridoids and secoiridoids. The method was fully validated, including specificity, linearity, accuracy, precision, reproducibility, and robustness. For optimization of experimental conditions, several experimental parameters were investigated, including relaxation delay (D1), scan numbers (NS) and power length (PL1). The quantification was based on the area ratios of H-3 from analytes relative to aromatic protons from 1,4-dinitrobenzene (internal standard) with methanol-d4 as solvent. Five iridoids and secoiridoids (sweroside, swertiamarin, gentiopicroside, geniposide, genipin) were analyzed. Furthermore, the results were validated by the high performance liquid chromatography coupled with ultraviolet detection (HPLC-UV) method. It can be concluded that the qHNMR method was simple, rapid, and accurate, providing a reliable and superior method for assessing the purity of iridoids and secoiridoids.
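
    The internal-standard calculation underlying such a qHNMR purity assay follows the standard relation purity_analyte = (I_a/I_std)*(N_std/N_a)*(M_a/M_std)*(m_std/m_a)*purity_std; the sketch below applies it with illustrative integrals and weighed masses (not measurements from the paper), using sweroside H-3 against the four aromatic protons of 1,4-dinitrobenzene.

        # Minimal sketch of the standard internal-standard qHNMR purity relation;
        # integrals and weighed masses below are illustrative assumptions.
        def qhnmr_purity(I_a, N_a, M_a, m_a, I_std, N_std, M_std, m_std, purity_std):
            """Mass-fraction purity of the analyte from integral ratios against an internal standard."""
            return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * (m_std / m_a) * purity_std

        purity = qhnmr_purity(
            I_a=0.46, N_a=1, M_a=358.34, m_a=10.2,          # sweroside: H-3 integral, 1 proton, g/mol, mg weighed
            I_std=1.00, N_std=4, M_std=168.11, m_std=2.5,   # 1,4-dinitrobenzene internal standard
            purity_std=0.999,
        )
        print(f"estimated purity: {purity * 100:.1f}%")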

  15. Quantitative assessment of the mucosal architecture of jejunal biopsy specimens: a comparison between linear measurement, stereology, and computer aided microscopy.

    PubMed Central

    Corazza, G R; Frazzoni, M; Dixon, M F; Gasbarrini, G

    1985-01-01

    Fifty jejunal biopsy specimens obtained from normal subjects and from untreated and treated patients with coeliac disease were assessed blindly by three independent observers, each of them using a different morphometric technique, namely linear measurement, stereology, and computer aided microscopy. In two of 26 control biopsy specimens linear measurement was not possible because of distortion of villi. Highly significant (p < 0.001) correlation coefficients were found between the different techniques. With all methods, significant differences were found between controls and patients with coeliac disease and between treated and untreated coeliac patients. Only by stereology, however, was there no overlap between results for patients and those for controls. In view of the limitations of linear measurement and the high cost and complexity of computer aided microscopy, we propose that a simple stereological technique using an eyepiece graticule is the method of choice for the quantitative assessment of mucosal architecture in jejunal biopsy specimens.

  16. Quantitative assessment of evidential weight for a fingerprint comparison. Part II: a generalisation to take account of the general pattern.

    PubMed

    Neumann, Cedric; Evett, Ian W; Skerrett, James E; Mateos-Garcia, Ismael

    2012-01-10

    The authors have proposed a quantitative method for assessing weight of evidence in the case where a fingermark from a crime scene is compared with a set of control prints from the ten fingers of a suspect. The approach is based on the notion of calculating a Likelihood Ratio (LR) that addresses a pair of propositions relating to the individual who left the crime mark. The current method considers only information extracted from minutiae, such as location, direction and type. It does not consider other information usually taken into account by fingerprint examiners, such as the general pattern of the ridge flow on the mark and the control prints. In this paper, we propose an improvement to our model that allows a fingerprint examiner to take advantage of pattern information when assessing the evidential weight to be assigned to a fingerprint comparison. We present an extension of the formal analysis proposed earlier and we illustrate our approach with an example.
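
    A minimal score-based sketch of the likelihood-ratio idea is given below (it is not the authors' feature-based model): a comparison score for the mark/print pair is evaluated under densities fitted to same-source and different-source score populations; all scores and distributions are illustrative.

        # Minimal score-based LR sketch; calibration scores, distributions and the
        # observed score are illustrative, not the authors' minutiae-based model.
        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(0)
        same_source_scores = rng.normal(loc=8.0, scale=1.5, size=500)
        diff_source_scores = rng.normal(loc=2.0, scale=1.2, size=5000)

        mu_p, sd_p = same_source_scores.mean(), same_source_scores.std(ddof=1)
        mu_d, sd_d = diff_source_scores.mean(), diff_source_scores.std(ddof=1)

        observed_score = 6.3   # hypothetical score for the questioned mark/print pair
        lr = norm.pdf(observed_score, mu_p, sd_p) / norm.pdf(observed_score, mu_d, sd_d)
        print(f"likelihood ratio = {lr:.1f} (values > 1 support the same-source proposition)")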

  17. Assessing the activity of sarcoidosis: quantitative ⁶⁷Ga-citrate imaging

    SciTech Connect

    Fajman, W.A.; Greenwald, L.V.; Staton, G.; Check, I.J.; Pine, J.; Gilman, M.; Scheidt, K.A.; McClees, E.C.

    1984-04-01

    Three different methods of quantitating ⁶⁷Ga-citrate lung images - a visual index, a computer-assisted index, and the total-lung-to-background ratio - were compared in 71 studies of patients with biopsy-proven sarcoidosis. Fifty consecutive cases were analyzed independently by two different observers using all three methods. In these studies, each index was correlated with the cell differential in the bronchoalveolar lavage fluid. The total-lung-to-background ratio proved to be the simplest to perform and correlated best with the original visual index and the percentage of lymphocytes obtained in bronchoalveolar lavage fluid. Its sensitivity for detecting active disease was 84%, compared with 64% and 58% for the visual and computer-assisted indices, respectively, with no sacrifice in specificity.
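
    The total-lung-to-background ratio is simply the mean counts within the lung regions of interest divided by the mean counts in a background region; the sketch below computes it on a synthetic image, and the ROI placement and simulated uptake are assumptions.

        # Minimal sketch of the total-lung-to-background index on synthetic data;
        # ROI placement and the simulated uptake factor are assumptions.
        import numpy as np

        rng = np.random.default_rng(1)
        image = rng.poisson(lam=50, size=(128, 128)).astype(float)   # stand-in planar 67Ga image (counts)

        lung_mask = np.zeros_like(image, dtype=bool)
        lung_mask[20:90, 15:60] = True                               # hypothetical combined lung ROI
        background_mask = np.zeros_like(image, dtype=bool)
        background_mask[100:120, 90:120] = True                      # hypothetical background ROI

        image[lung_mask] *= 1.8                                      # simulate increased pulmonary uptake
        ratio = image[lung_mask].mean() / image[background_mask].mean()
        print(f"total-lung-to-background ratio: {ratio:.2f}")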

  18. Quantitative assessment of Naegleria fowleri and Escherichia coli concentrations within a Texas reservoir.

    PubMed

    Painter, Stephanie M; Pfau, Russell S; Brady, Jeff A; McFarland, Anne M S

    2013-06-01

    Previous presence/absence studies have indicated a correlation between the presence of the pathogenic amoeba Naegleria fowleri and the presence of bacteria, such as the fecal indicator Escherichia coli, in environmental surface waters. The objective of this study was to use quantitative real-time polymerase chain reaction (qPCR) methodologies to measure N. fowleri and E. coli concentrations within a Texas reservoir in late summer, and to determine if concentrations of N. fowleri and E. coli were statistically correlated. N. fowleri was detected in water samples from 67% of the reservoir sites tested, with concentrations ranging up to an estimated 26 CE (cell equivalents)/100 mL. E. coli was detected in water samples from 60% of the reservoir sites tested, with concentrations ranging up to 427 CE/100 mL. In this study, E. coli concentrations were not indicative of N. fowleri concentrations.
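
    Converting qPCR quantification cycles to cell equivalents per 100 mL relies on a standard curve; the sketch below shows the usual back-calculation, with slope, intercept, Cq values and filtered volume chosen for illustration rather than taken from the study.

        # Minimal sketch of a qPCR standard-curve back-calculation; slope,
        # intercept, Cq values and filtered volume are illustrative assumptions.
        import numpy as np

        slope, intercept = -3.35, 38.5            # hypothetical curve: Cq = slope*log10(CE) + intercept
        cq_values = np.array([33.1, 35.6, 31.8])  # hypothetical sample quantification cycles
        volume_represented_ml = 100.0             # assume each extract represents 100 mL of water

        cell_equivalents = 10 ** ((cq_values - intercept) / slope)
        ce_per_100ml = cell_equivalents * (100.0 / volume_represented_ml)
        for cq, ce in zip(cq_values, ce_per_100ml):
            print(f"Cq {cq:.1f} -> {ce:.1f} CE/100 mL")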

  19. A statistical assessment of the quantitative uptake of vinyl chloride monomer from aqueous solution.

    PubMed

    Withey, J R; Collins, B T

    1976-11-01

    The presence of vinyl chloride monomer (VCM) in foodstuffs and its demonstrated carcinogenic potential when administered by the oral route have raised questions concerning the quantitative estimation of the safety of food packaging fabricated from rigid polyvinyl chloride. A statistical model, which was tested by curve-fitting data obtained from an oral uptake study, has been demonstrated to be of predictive value. Ninety-five percent confidence limits were also calculated, and the data from this study were compared with those from a previous gas-phase exposure study. It was concluded that if the total daily liquid intake contained 20 ppm of VCM, then the area generated under the blood level-time curve for rats would be equivalent to an inhalation exposure of about 2 ppm for 24 hr.
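
    The blood level-time comparison above rests on the area under the concentration-time curve; a minimal sketch of that calculation by trapezoidal integration is given below, with time points and concentrations invented for illustration.

        # Minimal sketch: AUC of a concentration-time curve by trapezoidal
        # integration; the sampling times and concentrations are invented.
        import numpy as np

        t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0, 24.0])   # hours after an oral dose
        c = np.array([0.0, 1.8, 2.4, 1.9, 1.1, 0.4, 0.05])   # VCM blood concentration (arbitrary units)

        auc = np.trapz(c, t)
        print(f"AUC(0-24 h) = {auc:.2f} concentration*hours")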

  20. Quantitative MR assessment of longitudinal parenchymal changes in children treated for medulloblastoma

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Wu, Shingjie; Palmer, Shawna L.; Mulhern, Raymond K.; Gajjar, Amar

    2002-05-01

    Our research builds on the hypothesis that white matter damage, in children treated for cancer with cranial spinal irradiation, spans a continuum of severity that can be reliably probed using non-invasive MR technology and results in potentially debilitating neurological and neuropsychological problems. This longitudinal project focuses on 341 quantitative volumetric MR examinations from 58 children treated for medulloblastoma (MB) with cranial irradiation (CRT) of 35-40 Gy. Quadratic mixed effects models were used to fit changes in tissue volumes (white matter, gray matter, CSF, and cerebral) with time since CRT and age at CRT as covariates. We successfully defined algorithms that are useful in the prediction of brain development among children treated for MB.
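
    A minimal sketch of the quadratic mixed-effects modeling step is given below: white matter volume is modeled as a quadratic function of time since CRT with age at CRT as a covariate and a random intercept per child; the data file and column names are hypothetical, and the published models may include additional terms.

        # Minimal sketch of a quadratic mixed-effects fit; file and column names
        # are hypothetical, and the published models may differ in detail.
        import pandas as pd
        import statsmodels.formula.api as smf

        exams = pd.read_csv("longitudinal_volumes.csv")   # one row per MR examination
        model = smf.mixedlm(
            "white_matter_volume ~ time_since_crt + I(time_since_crt ** 2) + age_at_crt",
            data=exams,
            groups=exams["subject_id"],                   # random intercept per child
        ).fit()
        print(model.summary())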

  1. Changes in bone structure of Corriedale sheep with inherited rickets: a peripheral quantitative computed tomography assessment.

    PubMed

    Dittmer, Keren E; Firth, Elwyn C; Thompson, Keith G; Marshall, Jonathan C; Blair, Hugh T

    2011-03-01

    An inherited skeletal disease with gross and microscopic features of rickets has been diagnosed in Corriedale sheep in New Zealand. The aim of this study was to quantify the changes present in tibia from sheep with inherited rickets using peripheral quantitative computed tomography. In affected sheep, scans in the proximal tibia, where metaphysis becomes diaphysis, showed significantly greater trabecular bone mineral content (BMC) and bone mineral density (BMD). The sheep with inherited rickets had significantly greater BMC and bone area in the mid-diaphysis of the proximal tibia compared to control sheep. However, BMD in the mid-diaphysis was significantly less in affected sheep than in controls, due to the greater cortical area and lower voxel density values in affected sheep. From this it was concluded that the increased strain on under-mineralised bone in sheep with inherited rickets led to increased bone mass in an attempt to improve bone strength.

  2. Quantitative assessment of electrostatic embedding in Density Functional Theory calculations of biomolecular systems

    SciTech Connect

    Fattebert, J; Law, R J; Bennion, B; Lau, E Y; Schwegler, E; Lightstone, F C

    2009-04-24

    We evaluate the accuracy of density functional theory quantum calculations of biomolecular subsystems using a simple electrostatic embedding scheme. Our scheme is based on dividing the system of interest into a primary and secondary subsystem. A finite difference discretization of the Kohn-Sham equations is used for the primary subsystem, while its electrostatic environment is modeled with a simple one-electron potential. Force-field atomic partial charges are used to generate smeared Gaussian charge densities and to model the secondary subsystem. We illustrate the utility of this approach with calculations of truncated dipeptide chains. We analyze quantitatively the accuracy of this approach by calculating atomic forces and comparing results with full QM calculations. The impact of the choice made in terminating dangling bonds at the frontier of the QM region is also investigated.
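
    The smeared-charge idea above has a simple closed form: a Gaussian charge distribution of total charge q and width sigma produces the potential V(r) = q*erf(r/(sqrt(2)*sigma))/r (atomic units), which approaches the bare point-charge potential at large r. The sketch below evaluates it; the charge value, width and distances are illustrative, not the paper's settings.

        # Minimal sketch of the potential of a Gaussian-smeared partial charge
        # (atomic units); charge, width and grid are illustrative assumptions.
        import numpy as np
        from scipy.special import erf

        def smeared_coulomb(q, sigma, r):
            """Potential of a Gaussian charge of total charge q and width sigma."""
            r = np.asarray(r, dtype=float)
            near_zero = q * np.sqrt(2.0 / np.pi) / sigma          # analytic r -> 0 limit
            return np.where(r > 1e-8,
                            q * erf(r / (np.sqrt(2.0) * sigma)) / np.maximum(r, 1e-8),
                            near_zero)

        r = np.linspace(0.0, 10.0, 6)                             # distances in bohr
        print(smeared_coulomb(q=-0.51, sigma=0.8, r=r))           # e.g. a force-field partial charge
        # At large r this approaches the bare point-charge value q/r.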

  3. Quantitative assessment of molecular dynamics-grown amorphous silicon and germanium films on silicon (111)

    NASA Astrophysics Data System (ADS)

    Käshammer, Peter; Borgardt, Nikolai I.; Seibt, Michael; Sinno, Talid

    2016-09-01

    Molecular dynamics based on the empirical Tersoff potential was used to simulate the deposition of amorphous silicon and germanium on silicon(111) at various deposition rates and temperatures. The resulting films were analyzed quantitatively by comparing one-dimensional atomic density profiles to experimental measurements. It is found that the simulations are able to capture well the structural features of the deposited films, which exhibit a gradual loss of crystalline order over several monolayers. A simple mechanistic model is used to demonstrate that the simulation temperature may be used to effectively accelerate the surface relaxation processes during deposition, leading to films that are consistent with experimental samples grown at deposition rates many orders-of-magnitude slower than possible in a molecular dynamics simulation.

  4. Reverse Phase Protein Arrays—Quantitative Assessment of Multiple Biomarkers in Biopsies for Clinical Use

    PubMed Central

    Boellner, Stefanie; Becker, Karl-Friedrich

    2015-01-01

    Reverse Phase Protein Arrays (RPPA) represent a very promising sensitive and precise high-throughput technology for the quantitative measurement of hundreds of signaling proteins in biological and clinical samples. This array format allows quantification of one protein or phosphoprotein in multiple samples under the same experimental conditions at the same time. Moreover, it is suited for signal transduction profiling of small numbers of cultured cells or cells isolated from human biopsies, including formalin fixed and paraffin embedded (FFPE) tissues. Owing to the much easier sample preparation, as compared to mass spectrometry based technologies, and the extraordinary sensitivity for the detection of low-abundance signaling proteins over a large linear range, RPPA have the potential for characterization of deregulated interconnecting protein pathways and networks in limited amounts of sample material in clinical routine settings. Current aspects of RPPA technology, including dilution curves, spotting, controls, signal detection, antibody validation, and calculation of protein levels are addressed.

  7. Paleoceanography of the Atlantic-Mediterranean exchange: Overview and first quantitative assessment of climatic forcing

    NASA Astrophysics Data System (ADS)

    Rogerson, M.; Rohling, E. J.; Bigg, G. R.; Ramirez, J.

    2012-06-01

    The Mediterranean Sea provides a major route for heat and freshwater loss from the North Atlantic and thus is an important cause of the high density of Atlantic waters. In addition to the traditional view that loss of fresh water via the Mediterranean enhances the general salinity of the North Atlantic, and the interior of the eastern North Atlantic in particular, it should be noted that Mediterranean water outflowing at Gibraltar is in fact cooler than compensating inflowing water. The consequence is that the Mediterranean is also a region of heat loss from the Atlantic and contributes to its large-scale cooling. Uniquely, this system can be understood physically via the constraints placed on it by a single hydraulic structure: the Gibraltar exchange. Here we review the existing knowledge about the physical structure of the Gibraltar exchange today and the evidential basis for arguments that it has been different in the past. Using a series of quantitative experiments, we then test prevailing concepts regarding the potential causes of these past changes. We find that (1) changes in the vertical position of the plume of Mediterranean water in the Atlantic are controlled by the vertical density structure of the Atlantic; (2) a prominent Early Holocene "contourite gap" within the Gulf of Cadiz is a response to reduced buoyancy loss in the eastern Mediterranean during the time of "sapropel 1" deposition; (3) changes in buoyancy loss from the Mediterranean during MIS3 caused changes in the bottom velocity field in the Gulf of Cadiz, but we note that the likely cause is reduced freshwater loss and not enhanced heat loss; and (4) strong exchange at Gibraltar during Atlantic freshening phases implies that the Gibraltar exchange provides a strong negative feedback to reduced Atlantic meridional overturning. Given the very counterintuitive way in which the Strait of Gibraltar system behaves, we recommend that without quantitative supporting work, qualitative interpretations

  8. Quantitative assessment of chromosome instability induced through chemical disruption of mitotic progression

    PubMed Central

    Markossian, Sarine; Arnaoutov, Alexei; Saba, Nakhle S.; Larionov, Vladimir; Dasso, Mary

    2016-01-01

    Most solid tumors are aneuploid, carrying an abnormal number of chromosomes, and they frequently missegregate whole chromosomes in a phenomenon termed chromosome instability (CIN). While CIN can be provoked through disruption of numerous mitotic pathways, it is not clear which of these mechanisms are most critical, or whether alternative mechanisms could also contribute significantly in vivo. One difficulty in determining the relative importance of candidate CIN regulators has been the lack of a straightforward, quantitative assay for CIN in live human cells: while gross mitotic abnormalities can be detected visually, moderate levels of CIN may not be obvious and are thus problematic to measure. To address this issue, we have developed the first Human Artificial Chromosome (HAC)-based quantitative live-cell assay for mitotic chromosome segregation in human cells. We have produced U2OS-Phoenix cells carrying the alphoid(tetO)-HAC encoding copies of eGFP fused to the destruction box (DB) of the anaphase promoting complex/cyclosome (APC/C) substrate hSecurin and sequences encoding the tetracycline repressor fused to mCherry (TetR-mCherry). Upon HAC missegregation, daughter cells that do not obtain a copy of the HAC are GFP negative in the subsequent interphase. The HAC can also be monitored live by following the TetR-mCherry signal. U2OS-Phoenix cells show low inherent levels of CIN, which can be enhanced by agents that target mitotic progression through distinct mechanisms. This assay allows direct detection of CIN induced by clinically important agents without conspicuous mitotic defects, allowing us to score increased levels of CIN that fall below the threshold required for discernible morphological disruption.

  9. Skill Assessment of a Hybrid Technique to Estimate Quantitative Precipitation Forecasts for Galicia (NW Spain)

    NASA Astrophysics Data System (ADS)

    Lage, A.; Taboada, J. J.

    Precipitation is the most obvious of the weather elements in its effects on everyday life. Numerical weather prediction (NWP) is generally used to produce quantitative precipitation forecasts (QPF) beyond the 1-3 h time frame. These models often fail to predict small-scale variations of rain because of spin-up problems and their coarse spatial and temporal resolution (Antolik, 2000). Moreover, there are some uncertainties about the behaviour of the NWP models in extreme situations (de Bruijn and Brandsma, 2000). Hybrid techniques, combining the benefits of NWP and statistical approaches in a flexible way, are very useful for achieving a good QPF. In this work, a new QPF technique for Galicia (NW Spain) is presented. More than 50% of days per year in this region are rainy, with rainfall amounts that may cause floods and associated human and economic damage. The technique is composed of a NWP model (ARPS) and a statistical downscaling process based on an automated classification scheme of atmospheric circulation patterns for the Iberian Peninsula (J. Ribalaygua and R. Boren, 1995). Results show that QPF for Galicia is improved using this hybrid technique. [1] Antolik, M.S., 2000. "An Overview of the National Weather Service's centralized statistical quantitative precipitation forecasts". Journal of Hydrology, 239, pp. 306-337. [2] de Bruijn, E.I.F. and Brandsma, T., 2000. "Rainfall prediction for a flooding event in Ireland caused by the remnants of Hurricane Charley". Journal of Hydrology, 239, pp. 148-161. [3] Ribalaygua, J. and Boren, R., 1995. "Clasificación de patrones espaciales de precipitación diaria sobre la España Peninsular". Informes N 3 y 4 del Servicio de Análisis e Investigación del Clima. Instituto Nacional de Meteorología, Madrid. 53 pp.

  10. Quantitative assessment of contact and non-contact lateral force calibration methods for atomic force microscopy.

    PubMed

    Tran Khac, Bien Cuong; Chung, Koo-Hyun

    2016-02-01

    Atomic Force Microscopy (AFM) has been widely used for measuring friction force at the nano-scale. However, one of the key challenges faced by AFM researchers is to calibrate an AFM system to interpret a lateral force signal as a quantifiable force. In this study, five rectangular cantilevers were used to quantitatively compare three different lateral force calibration methods to demonstrate the legitimacy and to establish confidence in the quantitative integrity of the proposed methods. The Flat-Wedge method is based on a variation of the lateral output on a surface with flat and changing slopes, the Multi-Load Pivot method is based on taking pivot measurements at several locations along the cantilever length, and the Lateral AFM Thermal-Sader method is based on determining the optical lever sensitivity from the thermal noise spectrum of the first torsional mode with a known torsional spring constant from the Sader method. The results of the calibration using the Flat-Wedge and Multi-Load Pivot methods were found to be consistent within experimental uncertainties, and the experimental uncertainties of the two methods were found to be less than 15%. However, the lateral force sensitivity determined by the Lateral AFM Thermal-Sader method was found to be 8-29% smaller than those obtained from the other two methods. This discrepancy decreased to 3-19% when the torsional mode correction factor for an ideal cantilever was used, which suggests that the torsional mode correction should be taken into account to establish confidence in Lateral AFM Thermal-Sader method.

  11. Quantitative Assessment of RNA-Protein Interactions with High Throughput Sequencing - RNA Affinity Profiling (HiTS-RAP)

    PubMed Central

    Ozer, Abdullah; Tome, Jacob M.; Friedman, Robin C.; Gheba, Dan; Schroth, Gary P.; Lis, John T.

    2016-01-01

    Because RNA-protein interactions play a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the High Throughput Sequencing-RNA Affinity Profiling (HiTS-RAP) assay, which couples sequencing on an Illumina GAIIx with the quantitative assessment of one or several proteins’ interactions with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of EGFP and NELF-E proteins with their corresponding canonical and mutant RNA aptamers. Here, we provide a detailed protocol for HiTS-RAP, which can be completed in about a month (8 days hands-on time), including the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, high-throughput sequencing and protein binding with the GAIIx, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, RNA-MaP and RBNS. A successful HiTS-RAP experiment provides the sequence and binding curves for approximately 200 million RNAs in a single experiment.
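
    A minimal sketch of turning a HiTS-RAP-style binding curve into a dissociation constant is given below: fraction bound versus protein concentration is fitted to a single-site (Langmuir) model; the concentrations and signals are illustrative, not data from the protocol.

        # Minimal sketch: fit a single-site binding model to an illustrative
        # binding curve; concentrations and fractions bound are made up.
        import numpy as np
        from scipy.optimize import curve_fit

        def single_site(conc, bmax, kd):
            """Fraction of RNA bound at protein concentration conc (single-site model)."""
            return bmax * conc / (kd + conc)

        protein_nm = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)    # nM
        fraction_bound = np.array([0.04, 0.11, 0.28, 0.52, 0.74, 0.85, 0.90])

        (bmax, kd), _ = curve_fit(single_site, protein_nm, fraction_bound, p0=[1.0, 50.0])
        print(f"fitted Bmax = {bmax:.2f}, Kd = {kd:.0f} nM")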

  12. Using wide-field quantitative diffuse reflectance spectroscopy in combination with high-resolution imaging for margin assessment

    NASA Astrophysics Data System (ADS)

    Kennedy, Stephanie; Mueller, Jenna; Bydlon, Torre; Brown, J. Quincy; Ramanujam, Nimmi

    2011-03-01

    Due to the large number of women diagnosed with breast cancer and the lack of intra-operative tools, breast cancer margin assessment presents a significant unmet clinical need. Diffuse reflectance spectral imaging provides a method for