Science.gov

Sample records for mimicking quantitative assessment

  1. Quantitatively mimicking wet colloidal suspensions with dry granular media.

    PubMed

    Messina, René; Aljawhari, Sarah; Bécu, Lydiane; Schockmel, Julien; Lumay, Geoffroy; Vandewalle, Nicolas

    2015-06-01

    Athermal two-dimensional granular systems are exposed to external mechanical noise leading to Brownian-like motion. Using tunable repulsive interparticle interaction, it is shown that the same microstructure as that observed in colloidal suspensions can be quantitatively recovered at a macroscopic scale. To that end, experiments on granular and colloidal systems made up of magnetized particles as well as computer simulations are performed and compared. Excellent agreement throughout the range of the magnetic coupling parameter is found for the pair distribution as well as the bond-orientational correlation functions. This finding opens new ways to efficiently and very conveniently explore phase transitions, crystallization, nucleation, etc., in confined geometries.

  2. Quantitatively mimicking wet colloidal suspensions with dry granular media

    PubMed Central

    Messina, René; Aljawhari, Sarah; Bécu, Lydiane; Schockmel, Julien; Lumay, Geoffroy; Vandewalle, Nicolas

    2015-01-01

    Athermal two-dimensional granular systems are exposed to external mechanical noise leading to Brownian-like motion. Using tunable repulsive interparticle interaction, it is shown that the same microstructure as that observed in colloidal suspensions can be quantitatively recovered at a macroscopic scale. To that end, experiments on granular and colloidal systems made up of magnetized particles as well as computer simulations are performed and compared. Excellent agreement throughout the range of the magnetic coupling parameter is found for the pair distribution as well as the bond-orientational correlation functions. This finding opens new ways to efficiently and very conveniently explore phase transitions, crystallization, nucleation, etc., in confined geometries. PMID:26030718

  3. Quantitative microbiological risk assessment.

    PubMed

    Hoornstra, E; Notermans, S

    2001-05-21

    The production of safe food is being increasingly based on the use of risk analysis, and this process is now in use to establish national and international food safety objectives. It is also being used more frequently to guarantee that safety objectives are met and that such guarantees are achieved in a cost-effective manner. One part of the overall risk analysis procedure, risk assessment, is the scientific process in which the hazards and risk factors are identified, and the risk estimate or risk profile is determined. Risk assessment is an especially important tool for governments when food safety objectives have to be developed in the case of 'new' contaminants in known products or known contaminants causing trouble in 'new' products. Risk assessment is also an important approach for food companies (i) during product development, (ii) during (hygienic) process optimization, and (iii) as an extension (validation) of the more qualitative HACCP plan. This paper discusses these two different types of risk assessment, and uses probability distribution functions to assess the risks posed by Escherichia coli O157:H7 in each case. Such approaches are essential elements of risk management, as they draw on all available information to derive accurate and realistic estimations of the risk posed. The paper also discusses the potential of scenario analysis in simulating the impact of different or modified risk factors during the consideration of new or improved control measures.
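The probability-distribution approach described above can be illustrated with a small Monte Carlo sketch. The exponential dose-response model is a standard QMRA choice, but the distribution parameters, serving size, and dose-response parameter here are illustrative assumptions, not values from the paper.

```python
import math
import random

def qmra_illness_probability(n_sims=20_000, seed=1):
    """Monte Carlo sketch of a per-serving QMRA risk estimate.

    Pathogen concentration (CFU/g) is drawn from a lognormal
    distribution and combined with a serving size; the probability of
    illness uses the exponential dose-response model
    P = 1 - exp(-r * dose).  All parameter values are illustrative.
    """
    rng = random.Random(seed)
    serving_g = 100.0   # hypothetical serving size, g
    r = 1e-4            # hypothetical dose-response parameter
    total = 0.0
    for _ in range(n_sims):
        conc = rng.lognormvariate(-2.0, 1.5)  # CFU/g
        dose = conc * serving_g               # CFU per serving
        total += 1.0 - math.exp(-r * dose)
    return total / n_sims  # mean per-serving probability of illness

mean_risk = qmra_illness_probability()
```

Scenario analysis of the kind the paper mentions amounts to re-running such a simulation with modified inputs (e.g. a lower lognormal mean after an added control step) and comparing the resulting risk estimates.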

  4. Feasibility of quantitative determination of local optical absorbances in tissue-mimicking phantoms using acousto-optic sensing

    NASA Astrophysics Data System (ADS)

    Bratchenia, A.; Molenaar, R.; Kooyman, R. P. H.

    2008-03-01

    We have investigated the application of ultrasound modulation of coherent light for quantitative determination of local absorbances in tissue-mimicking phantoms. An Intralipid-based phantom model, which mimics a blood vessel in human tissue, was used. The detection technique was based on homodyne parallel speckle detection in transmission mode. Based on a comparison of experimental data and Monte Carlo simulations, a quantitative correlation between local absorbances of the phantom and the measured signal has been shown. The use of microsecond pulses of ultrasound and laser light resulted in a spatial resolution of the system of a few millimeters.

  5. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  6. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease with devastating complications and high morbidity and mortality. It is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which include photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed in the field of view, segmented from the image, and analyzed field by field. In total, 24 colors are printed on the pad. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images from two sets of captures of the same necrosis. The variability of quantitative measurements based on freehand photography is assessed regarding geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced, with coefficients of variation of 5-20% for geometry and 2-10% for color. Hence, quantitative assessment of calciphylaxis becomes practicable and will support a better understanding of this rare but fatal disease.
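The least-squares affine color transform described above can be sketched as follows; the function names and the use of NumPy's `lstsq` are illustrative, not the authors' code.

```python
import numpy as np

def fit_affine_color_transform(measured, reference):
    """Least-squares fit of an affine transform mapping the measured
    colors of the 24 pad fields to their reference values.

    measured, reference: (n, 3) RGB arrays with n >= 4.
    Returns a (4, 3) matrix: a 3x3 linear part stacked on a
    translation row.
    """
    m = np.hstack([measured, np.ones((measured.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(m, reference, rcond=None)
    return coeffs

def apply_color_transform(coeffs, colors):
    """Normalize colors with a previously fitted transform."""
    c = np.hstack([colors, np.ones((colors.shape[0], 1))])
    return c @ coeffs
```

With 24 color fields and only 12 unknowns, the system is overdetermined, which is what makes the least-squares fit robust to segmentation noise in individual fields.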

  7. Characterisation of a PVCP-based tissue-mimicking phantom for quantitative photoacoustic imaging

    NASA Astrophysics Data System (ADS)

    Fonseca, Martina; Zeqiri, Bajram; Beard, Paul; Cox, Ben

    2015-07-01

    Photoacoustic imaging can provide high resolution images of tissue structure, pathology and function. As these images can be obtained at multiple wavelengths, quantitatively accurate, spatially resolved, estimates for chromophore concentration, for example, may be obtainable. Such a capability would find a wide range of clinical and pre-clinical applications. However, despite a growing body of theoretical papers on how this might be achieved, there is a noticeable lack of studies providing validated evidence that it can be achieved experimentally, either in vitro or in vivo. Well-defined, versatile and stable phantom materials are essential to assess the accuracy, robustness and applicability of multispectral Quantitative Photoacoustic Imaging (qPAI) algorithms in experimental scenarios. This study assesses the potential of polyvinyl chloride plastisol (PVCP) as a phantom material for qPAI, building on previous work that focused on using PVCP for quality control. Parameters that might be controlled or tuned to assess the performance of qPAI algorithms were studied: broadband acoustic properties, multiwavelength optical properties with added absorbers and scatterers, and photoacoustic efficiency. The optical and acoustic properties of PVCP can be tuned to be broadly representative of soft tissue. The Grüneisen parameter is larger than expected in tissue, which is an advantage as it increases the signal-to-noise ratio of the photoacoustic measurements. Interestingly, when the absorption was altered by adding absorbers, the absorption spectra measured using high peak power nanosecond-pulsed sources (typical in photoacoustics) were repeatably different from the ones measured using the low power source in the spectrophotometer, indicative of photochemical reactions taking place.

  8. Environmental probabilistic quantitative assessment methodologies

    NASA Astrophysics Data System (ADS)

    Crovelli, Robert A.

    1995-10-01

    Probabilistic methodologies developed originally for one area of application may be applicable in another area. Therefore, it is extremely important to communicate across disciplines. Of course, a physical reinterpretation is necessary and perhaps some modification of the methodology. This seems to be the situation in applying resource assessment methodologies as environmental assessment methodologies. In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. It is ironic that oil as a precious resource in the ground can become a serious pollutant as a spill in the ocean. There are similarities in both situations, where the quantity of undiscovered crude oil and natural gas resources and the quantity of a pollutant or contaminant are to be estimated. Obviously, we are interested in making a quantitative assessment in order to answer the question, "How much material is there?" For situations in which there is a lack of statistical data, risk analysis is used rather than classical statistical analysis. That is, a relatively subjective evaluation is made rather than an evaluation based on random sampling, which may be impossible. Hence, probabilistic quantitative assessment methodologies are needed for the risk analysis. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: (1) direct assessment, (2) accumulation size, (3) volumetric yield, and (4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz., TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems.
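The "probability model plus probabilistic method" pairing described above can be illustrated with a small Monte Carlo sketch of a direct-assessment model: each unit's unknown quantity is given as an expert-elicited triangular distribution, and the totals are aggregated by simulation. This illustrates the model type only; it is not the TRIAGG implementation, and the values are invented.

```python
import random

def direct_assessment(units, n_sims=20_000, seed=7):
    """Monte Carlo sketch of a 'direct assessment' probability model:
    each unit's unknown quantity is an expert-elicited triangular
    distribution (low, mode, high); unit totals are aggregated by
    simulation.  Illustrative only, not the TRIAGG system."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in units)
        for _ in range(n_sims)
    )
    return {
        "mean": sum(totals) / n_sims,
        "p05": totals[int(0.05 * n_sims)],   # 5th percentile
        "p95": totals[int(0.95 * n_sims)],   # 95th percentile
    }

# Two hypothetical assessment units (low, mode, high)
estimate = direct_assessment([(0.0, 5.0, 10.0), (10.0, 20.0, 40.0)])
```

An analytic method, as the abstract notes, would replace the simulation loop with closed-form moment arithmetic, trading generality for speed.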

  9. Quantitative assessment of increasing complexity

    NASA Astrophysics Data System (ADS)

    Csernai, L. P.; Spinnangr, S. F.; Velle, S.

    2017-05-01

    We study the build-up of complexity using the example of 1 kg of matter in different forms. We start with the simplest example of ideal gases, and then continue with more complex chemical and biological structures, living systems, and social and technical structures. We assess the complexity of these systems quantitatively, based on their entropy. We present a method to attribute entropy consistently to known physical systems and to complex organic molecules, up to a DNA molecule. The important steps in this program and the basic obstacles are discussed.
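For the simplest case the abstract mentions, an ideal monatomic gas, the entropy of 1 kg of matter can be computed from the Sackur-Tetrode formula. The sketch below evaluates it for helium at room conditions; the choice of gas and conditions is illustrative, not necessarily the authors' example.

```python
import math

# CODATA constants
k_B = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s
N_A = 6.02214076e23   # Avogadro constant, 1/mol

def sackur_tetrode_entropy(mass_kg, molar_mass_kg, T, P):
    """Sackur-Tetrode entropy (J/K) of an ideal monatomic gas:
    S = N k [ln(V / (N lambda^3)) + 5/2], with lambda the thermal
    de Broglie wavelength."""
    n_atoms = mass_kg / molar_mass_kg * N_A
    m = molar_mass_kg / N_A                 # mass of one atom, kg
    V = n_atoms * k_B * T / P               # ideal-gas volume, m^3
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)
    return n_atoms * k_B * (math.log(V / (n_atoms * lam**3)) + 2.5)

# Entropy of 1 kg of helium at 300 K and 1 atm (illustrative)
S_helium = sackur_tetrode_entropy(1.0, 4.0026e-3, 300.0, 101325.0)
```

This reproduces the well-known molar entropy of helium (about 126 J/(mol·K)) scaled to 1 kg, on the order of 3 × 10⁴ J/K.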

  10. Experimental assessment of four ultrasound scattering models for characterizing concentrated tissue-mimicking phantoms.

    PubMed

    Franceschini, Emilie; Guillermin, Régine

    2012-12-01

    Tissue-mimicking phantoms with high scatterer concentrations were examined using quantitative ultrasound techniques based on four scattering models: The Gaussian model (GM), the Faran model (FM), the structure factor model (SFM), and the particle model (PM). Experiments were conducted using 10- and 17.5-MHz focused transducers on tissue-mimicking phantoms with scatterer concentrations ranging from 1% to 25%. Theoretical backscatter coefficients (BSCs) were first compared with the experimentally measured BSCs in the forward problem framework. The measured BSC versus scatterer concentration relationship was predicted satisfactorily by the SFM and the PM. The FM and the PM overestimated the BSC magnitude at actual concentrations greater than 2.5% and 10%, respectively. The SFM was the model that better matched the BSC magnitude at all the scatterer concentrations tested. Second, the four scattering models were compared in the inverse problem framework to estimate the scatterer size and concentration from the experimentally measured BSCs. The FM did not predict the concentration accurately at actual concentrations greater than 12.5%. The SFM and PM need to be associated with another quantitative parameter to differentiate between low and high concentrations. In that case, the SFM predicted the concentration satisfactorily with relative errors below 38% at actual concentrations ranging from 10% to 25%.

  11. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest level scenarios, preserved in event tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for risk quantitative computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.

  12. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  13. Quantitative Microbial Risk Assessment Tutorial - Primer

    EPA Science Inventory

    This document provides a Quantitative Microbial Risk Assessment (QMRA) primer that organizes QMRA tutorials. The tutorials describe functionality of a QMRA infrastructure, guide the user through software use and assessment options, provide step-by-step instructions for implementi...

  15. Quantitative assessment of scientific quality

    NASA Astrophysics Data System (ADS)

    Heinzl, Harald; Bloching, Philipp

    2012-09-01

    Scientific publications, authors, and journals are commonly evaluated with quantitative bibliometric measures. Frequently used measures will be reviewed and their strengths and weaknesses highlighted. Reflections on the conditions for a new, research-paper-specific measure will be presented.

  16. Use of chemostat cultures mimicking different phases of wine fermentations as a tool for quantitative physiological analysis

    PubMed Central

    2014-01-01

    Background Saccharomyces cerevisiae is the most relevant yeast species conducting the alcoholic fermentation that takes place during winemaking. Although the physiology of this model organism has been extensively studied, systematic quantitative physiology studies of this yeast under winemaking conditions are still scarce, thus limiting the understanding of the fermentative metabolism of wine yeast strains and the systematic description, modelling and prediction of fermentation processes. In this study, we implemented and validated the use of chemostat cultures as a tool to simulate different stages of a standard wine fermentation, thereby making it possible to implement metabolic flux analyses describing the sequence of metabolic states of S. cerevisiae along the wine fermentation. Results Chemostat cultures mimicking the different stages of standard wine fermentations of S. cerevisiae EC1118 were performed using a synthetic must and strict anaerobic conditions. The simulated stages corresponded to the onset of the exponential growth phase, the late exponential growth phase and cells just entering stationary phase, at dilution rates of 0.27, 0.04 and 0.007 h−1, respectively. Notably, measured substrate uptake and product formation rates at each steady state condition were generally within the range of corresponding conversion rates estimated during the different batch fermentation stages. Moreover, chemostat data were further used for metabolic flux analysis, where biomass composition data for each condition were considered in the stoichiometric model. Metabolic flux distributions were coherent with previous analyses based on batch cultivation data and the pseudo-steady state assumption. Conclusions Steady state conditions obtained in chemostat cultures reflect the environmental conditions and physiological states of S. cerevisiae corresponding to the different growth stages of a typical batch wine fermentation, thereby showing the potential of this experimental approach to

  17. Therapeutic ultrasound in physical medicine and rehabilitation: characterization and assessment of its physical effects on joint-mimicking phantoms.

    PubMed

    Lioce, Elisa Edi Anna Nadia; Novello, Matteo; Durando, Gianni; Bistolfi, Alessandro; Actis, Maria Vittoria; Massazza, Giuseppe; Magnetto, Chiara; Guiot, Caterina

    2014-11-01

    The aim of the study described here was to quantitatively assess thermal and mechanical effects of therapeutic ultrasound (US) by sonicating a joint-mimicking phantom, made of muscle-equivalent material, using clinical US equipment. The phantom contains two bone disks simulating a deep joint (treated at 1 MHz) and a superficial joint (3 MHz). Thermal probes were inserted in fixed positions. To test the mechanical (cavitational) effects, we used a latex balloon filled with oxygen-loaded nanobubbles; the dimensions of the oxygen-loaded nanobubbles were determined before and after sonication. Significant increases in temperature (up to 17°C) with fixed field using continuous waves were detected both in front of and behind the bones, depending on the US mode (continuous wave vs. pulsed wave) and on the treatment modality (fixed vs. massage). We found no significant differences in mechanical effects. Although limited by the in vitro design (no blood perfusion, no metabolic compensation), the results can be used to guide operators in their choice of the best US treatment modality for a specific joint. Copyright © 2014 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.

  18. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: (1) direct assessment, (2) accumulation size, (3) volumetric yield, and (4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz., TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.

  19. Risk Assessment: A Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Baert, K.; Francois, K.; de Meulenaer, B.; Devlieghere, F.

    A risk can be defined as a function of the probability of an adverse health effect and the severity of that effect, consequential to a hazard in food (Codex Alimentarius, 1999). During a risk assessment, an estimate of the risk is obtained. The goal is to estimate the likelihood and the extent of adverse effects occurring to humans due to possible exposure(s) to hazards. Risk assessment is a scientifically based process consisting of the following steps: (1) hazard identification, (2) hazard characterization, (3) exposure assessment and (4) risk characterization (Codex Alimentarius, 1999).

  20. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on a biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  2. Quantitative Assessment of Fluorescent Proteins

    PubMed Central

    Cranfill, Paula J.; Sell, Brittney R.; Baird, Michelle A.; Allen, John R.; Lavagnino, Zeno; de Gruiter, H. Martijn; Kremers, Gert-Jan; Davidson, Michael W.; Ustione, Alessandro; Piston, David W.

    2016-01-01

    The advent of fluorescent proteins (FPs) for genetic labeling of molecules and cells has revolutionized fluorescence microscopy. Genetic manipulations have created a vast array of bright and stable FPs spanning the blue to red spectral regions. Common to autofluorescent FPs is their tight β-barrel structure, which provides the rigidity and chemical environment needed for effectual fluorescence. Despite the common structure, each FP has its own unique photophysical properties. Thus, there is no single “best” fluorescent protein for every circumstance, and each FP has advantages and disadvantages. To guide decisions about which FP is right for any given application, we have quantitatively characterized over 40 different FPs for their brightness, photostability, pH stability, and monomeric properties, which permits easy apples-to-apples comparisons between these FPs. We report the values for all of the FPs measured, but focus the discussion on the more popular and/or best performing FPs in each spectral region. PMID:27240257

  3. Quantitative assessment of fluorescent proteins.

    PubMed

    Cranfill, Paula J; Sell, Brittney R; Baird, Michelle A; Allen, John R; Lavagnino, Zeno; de Gruiter, H Martijn; Kremers, Gert-Jan; Davidson, Michael W; Ustione, Alessandro; Piston, David W

    2016-07-01

    The advent of fluorescent proteins (FPs) for genetic labeling of molecules and cells has revolutionized fluorescence microscopy. Genetic manipulations have created a vast array of bright and stable FPs spanning blue to red spectral regions. Common to autofluorescent FPs is their tight β-barrel structure, which provides the rigidity and chemical environment needed for effectual fluorescence. Despite the common structure, each FP has unique properties. Thus, there is no single 'best' FP for every circumstance, and each FP has advantages and disadvantages. To guide decisions about which FP is right for a given application, we have quantitatively characterized the brightness, photostability, pH stability and monomeric properties of more than 40 FPs to enable straightforward and direct comparison between them. We focus on popular and/or top-performing FPs in each spectral region.
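Brightness comparisons of the kind these records describe are conventionally computed as the product of the molar extinction coefficient and the fluorescence quantum yield, normalized to EGFP. A minimal sketch; the EGFP reference values are commonly cited figures, used here for illustration rather than taken from the paper.

```python
def relative_brightness(extinction_per_M_cm, quantum_yield,
                        ref_extinction=55900.0, ref_qy=0.60):
    """Intrinsic FP brightness as extinction coefficient times quantum
    yield, normalized to EGFP.  Reference values are commonly cited
    EGFP figures (illustrative, not from this paper)."""
    return (extinction_per_M_cm * quantum_yield) / (ref_extinction * ref_qy)
```

With commonly cited mCherry values (extinction around 72,000 M⁻¹cm⁻¹, quantum yield around 0.22), this yields roughly half the brightness of EGFP.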

  4. Assessment of tissue Doppler imaging measurements of arterial wall motion using a tissue mimicking test rig.

    PubMed

    Thrush, Abigail J; Brewin, Mark P; Birch, Malcolm J

    2008-03-01

    The aim of this in vitro study is to assess the accuracy of the tissue Doppler imaging arterial wall motion (TDI AWM) technique in measuring dilation over a range of distances and velocities. A test rig, consisting of two parallel blocks of tissue mimicking material (TMM), has been developed to generate known wall motion. One block remains stationary while the other moves in a cyclical motion. A calibrated laser range finder was used to measure the TMM motion. The TDI AWM measurements were found to underestimate the dilation by 21% +/- 4.7% when using the recommended scanner parameters. The size of the error was found to increase with a decrease in ultrasound output power. Results suggested that errors in the TDI AWM dilation measurements relate to underestimates in the velocity measured by the TDI technique. The error demonstrated in this study indicates a limitation in the value of TDI AWM results obtained in vivo. (E-mail: abigail.thrush@bartsandthelondon.nhs.uk).

  5. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, of which half were significant.
For the reproducibility study, a group of 9 soil scientists and 7
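The validation statistic used above (correlation coefficients greater than 0.3 between visual scores and laboratory measurements) is a plain Pearson correlation, which can be computed without any dependencies:

```python
def pearson_r(x, y):
    """Pearson correlation between paired observations, e.g. visual
    scores versus laboratory measurements."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sx = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sy = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)
```

A value of 1.0 indicates a perfect positive linear relationship; the 0.3 threshold used in the study corresponds to a modest positive association.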

  6. Integrated regional assessment: qualitative and quantitative issues

    SciTech Connect

    Malone, Elizabeth L.

    2009-11-19

    Qualitative and quantitative issues are particularly significant in integrated regional assessment. This chapter examines the terms “qualitative” and “quantitative” separately and in relation to one another, along with a discussion of the degree of interdependence or overlap between the two. Strategies for integrating the two general approaches often produce uneasy compromises. However, integrated regional assessment provides opportunities for strong collaborations in addressing specific problems in specific places.

  7. Radiological interpretation 2020: Toward quantitative image assessment

    SciTech Connect

    Boone, John M.

    2007-11-15

    The interpretation of medical images by radiologists is primarily and fundamentally a subjective activity, but there are a number of clinical applications such as tumor imaging where quantitative imaging (QI) metrics (such as tumor growth rate) would be valuable to the patient’s care. It is predicted that the subjective interpretive environment of the past will, over the next decade, evolve toward the increased use of quantitative metrics for evaluating patient health from images. The increasing sophistication and resolution of modern tomographic scanners promote the development of meaningful quantitative end points, determined from images which are in turn produced using well-controlled imaging protocols. For the QI environment to expand, medical physicists, physicians, other researchers and equipment vendors need to work collaboratively to develop the quantitative protocols for imaging, scanner calibrations, and robust analytical software that will lead to the routine inclusion of quantitative parameters in the diagnosis and therapeutic assessment of human health. Most importantly, quantitative metrics need to be developed which have genuine impact on patient diagnosis and welfare, and only then will QI techniques become integrated into the clinical environment.

  8. Physiologic basis for understanding quantitative dehydration assessment.

    PubMed

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.

  9. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  10. Sensitivity analysis in quantitative microbial risk assessment.

    PubMed

Zwietering, M H; van Gerwen, S J

    2000-07-15

The occurrence of foodborne disease remains a widespread problem in both the developing and the developed world. A systematic and quantitative evaluation of food safety is important to control the risk of foodborne diseases. World-wide, many initiatives are being taken to develop quantitative risk analysis. However, the quantitative evaluation of food safety in all its aspects is very complex, especially since in many cases specific parameter values are not available. Often many variables have large statistical variability while the quantitative effect of various phenomena is unknown. Therefore, sensitivity analysis can be a useful tool to determine the main risk-determining phenomena, as well as the aspects that mainly determine the inaccuracy in the risk estimate. This paper presents three stages of sensitivity analysis. First, deterministic analysis selects the most relevant determinants for risk. A second, worst-case analysis prevents exceptional but relevant cases from being overlooked; it finds the relevant process steps in worst-case situations and shows how variations of individual factors affect risk. The third, stochastic analysis, studies how variations of factors affect the variability of risk estimates. Care must be taken that the assumptions made as well as the results are clearly communicated. Stochastic risk estimates are, like deterministic ones, just as good (or bad) as the available data, and the stochastic analysis must not be used to mask lack of information. Sensitivity analysis is a valuable tool in quantitative risk assessment by determining critical aspects and effects of variations.
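The deterministic and stochastic stages can be illustrated with a minimal sketch: a one-at-a-time perturbation ranks the risk-determining factors by their output swing, and a Monte Carlo run propagates their variability. The linear log10 risk model and every parameter value below are invented for illustration:

```python
import random

def risk(contamination, growth_rate, time_h):
    # hypothetical linear model: final log10 count = initial contamination
    # plus growth accumulated over the storage time
    return contamination + growth_rate * time_h

nominal = dict(contamination=2.0, growth_rate=0.05, time_h=48.0)

# stage 1: deterministic one-at-a-time analysis (+/-10% around nominal)
swings = {}
for name in nominal:
    hi, lo = dict(nominal), dict(nominal)
    hi[name] *= 1.1
    lo[name] *= 0.9
    swings[name] = risk(**hi) - risk(**lo)
print(swings)

# stage 3: stochastic analysis propagates input variability by Monte Carlo
random.seed(1)
samples = [risk(random.gauss(2.0, 0.5), random.gauss(0.05, 0.01),
                random.gauss(48.0, 6.0)) for _ in range(10000)]
print(f"mean risk = {sum(samples) / len(samples):.2f} log10 units")
```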

  11. Novel Quantitative Assessment of Metamorphopsia in Maculopathy

    PubMed Central

    Wiecek, Emily; Lashkari, Kameran; Dakin, Steven C.; Bex, Peter

    2015-01-01

    Purpose. Patients with macular disease often report experiencing metamorphopsia (visual distortion). Although typically measured with Amsler charts, more quantitative assessments of perceived distortion are desirable to effectively monitor the presence, progression, and remediation of visual impairment. Methods. Participants with binocular (n = 33) and monocular (n = 50) maculopathy across seven disease groups, and control participants (n = 10) with no identifiable retinal disease completed a modified Amsler grid assessment (presented on a computer screen with eye tracking to ensure fixation compliance) and two novel assessments to measure metamorphopsia in the central 5° of visual field. A total of 81% (67/83) of participants completed a hyperacuity task where they aligned eight dots in the shape of a square, and 64% (32/50) of participants with monocular distortion completed a spatial alignment task using dichoptic stimuli. Ten controls completed all tasks. Results. Horizontal and vertical distortion magnitudes were calculated for each of the three assessments. Distortion magnitudes were significantly higher in patients than controls in all assessments. There was no significant difference in magnitude of distortion across different macular diseases. There were no significant correlations between overall magnitude of distortion among any of the three measures and no significant correlations in localized measures of distortion. Conclusions. Three alternative quantifications of monocular spatial distortion in the central visual field generated uncorrelated estimates of visual distortion. It is therefore unlikely that metamorphopsia is caused solely by retinal displacement, but instead involves additional top-down information, knowledge about the scene, and perhaps, cortical reorganization. PMID:25406293
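One simple way to quantify distortion from a dot-alignment task like the one above is the RMS deviation of the positions a participant sets from the ideal square vertices, computed separately for the horizontal and vertical directions. This is a hypothetical stand-in for the study's actual analysis, with invented coordinates:

```python
import math

def distortion_magnitudes(placed, ideal):
    """Horizontal and vertical distortion magnitudes as RMS deviations of
    the dot positions a participant sets from the ideal square vertices.
    A simplified, hypothetical stand-in for the task's actual analysis."""
    dx = [p[0] - q[0] for p, q in zip(placed, ideal)]
    dy = [p[1] - q[1] for p, q in zip(placed, ideal)]
    rms = lambda v: math.sqrt(sum(x * x for x in v) / len(v))
    return rms(dx), rms(dy)

# eight dots of an ideal square (degrees), shifted 0.1 deg horizontally
ideal = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
placed = [(x + 0.1, y) for x, y in ideal]
h, v = distortion_magnitudes(placed, ideal)
print(round(h, 2), round(v, 2))  # -> 0.1 0.0
```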

  12. Quantitative assessment of growth plate activity

    SciTech Connect

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

In the immature skeleton the physis or growth plate is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique is able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through the quantitative analysis of serial studies.
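The ratio profile described above can be sketched as follows; the count values and two-segment simplification are hypothetical, not taken from the paper:

```python
def physeal_profile(abnormal_counts, normal_counts):
    """Ratios comparing segmental physeal activity between an abnormal
    joint and the contralateral normal joint, per region of interest.
    A hypothetical illustration of the growth plate profile idea."""
    total_abn, total_nrm = sum(abnormal_counts), sum(normal_counts)
    return [{
        "segment/total (abnormal)": a / total_abn,
        "segment/total (normal)": n / total_nrm,
        "abnormal/contralateral": a / n,
    } for a, n in zip(abnormal_counts, normal_counts)]

# invented counts for medial and lateral segments of a knee physis
prof = physeal_profile([120, 300], [210, 290])
print(round(prof[0]["abnormal/contralateral"], 2))  # -> 0.57 (reduced medial activity)
```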

  13. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG], and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  14. A toolbox for rockfall Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed during the last years both for spatially-distributed and local (i.e. single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in the collection of required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets, in the framework of the SafeLand EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely: QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and related fragility curves, both as functions of block velocity and size. The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock
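The expected-annual-cost calculation rests on the generic risk equation R = P(event) × P(impact | event) × vulnerability × value of the element at risk. A minimal sketch, with every parameter value invented for illustration:

```python
def expected_annual_cost(p_event, p_impact, vulnerability, element_value):
    """Generic rockfall risk equation for one element at risk:
    R = P(event) * P(impact | event) * vulnerability * element value.
    All parameter values used below are invented."""
    return p_event * p_impact * vulnerability * element_value

# a building exposed to a 1-in-50-yr event, 20% reach probability,
# vulnerability 0.3, replacement value 500 (kEUR)
print(round(expected_annual_cost(1 / 50, 0.2, 0.3, 500.0), 4))  # -> 0.6 kEUR/yr
```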

  15. [Quantitative assessment of quality of vision].

    PubMed

    Oshika, Tetsuro

    2004-12-01

    The importance of quality of vision (QOV) along with quality of life (QOL) in medicine has been recently widely recognized. We have conducted studies to quantitatively analyze factors related to QOV. Irregular astigmatism can be a significant obstacle for achieving satisfactory QOV. Videokeratography data were broken down using Fourier harmonic series analysis into spherical power, regular astigmatism (second harmonic component, n = 2), asymmetry (n = 1), and higher order irregularity (n > or = 3). The irregular astigmatism component calculated by the Fourier analysis significantly correlated with best spectacle-corrected visual acuity. Software was developed to display color-coded maps for the four Fourier indices. The normal range was defined for each Fourier index, and eyes with pathologic and postsurgical conditions were evaluated using the normal range. Progression of keratoconus over time was quantitatively described by Fourier analysis of the videokeratography data. Using the Fourier method, changes in corneal topography following suture removal after penetrating keratoplasty were evaluated. Fourier analysis of videokeratography data significantly facilitated determination of refraction and measurement of best spectacle-corrected visual acuity in eyes with corneal irregular astigmatism such as post-penetrating keratoplasty eyes. Higher-order wavefront aberrations of the cornea were calculated by expanding videokeratography elevation data into Zernike polynomials, and coma and spherical aberrations were computed. For ocular aberrations, the data obtained with the Hartmann-Shack sensor were decomposed into Zernike polynomials. Coma aberrations of the cornea significantly correlated with age, while corneal spherical aberrations showed no age-related changes. The time-course of changes in corneal higher-order aberrations was reported for photorefractive keratectomy and laser in situ keratomileusis (LASIK). For ocular aberrations, the degree of tilting of the
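The harmonic decomposition above can be sketched for a single videokeratography ring: dioptric powers sampled at equal angles are expanded into Fourier coefficients, from which the spherical (mean), asymmetry (n = 1), regular astigmatism (n = 2), and higher-order (n ≥ 3) indices follow. This is a simplified single-ring illustration, not the paper's full method:

```python
import cmath
import math

def fourier_indices(powers):
    """Fourier indices for dioptric powers sampled at equal angles around
    one videokeratography ring: spherical power (mean, n = 0), asymmetry
    (n = 1), regular astigmatism (n = 2), and higher-order irregularity
    (n >= 3). A simplified single-ring sketch of the harmonic analysis."""
    N = len(powers)
    coeffs = [sum(p * cmath.exp(-2j * math.pi * k * n / N)
                  for n, p in enumerate(powers)) / N
              for k in range(N // 2 + 1)]
    amp = lambda k: 2 * abs(coeffs[k])      # amplitude of harmonic k
    return {
        "spherical": abs(coeffs[0]),
        "asymmetry": amp(1),
        "regular_astigmatism": amp(2),
        "higher_order": sum(amp(k) for k in range(3, len(coeffs))),
    }

# 43 D sphere plus a 0.5 D-amplitude cos(2*theta) ripple (regular astigmatism)
ring = [43.0 + 0.5 * math.cos(2 * (2 * math.pi * i / 36)) for i in range(36)]
idx = fourier_indices(ring)
print(round(idx["spherical"], 2), round(idx["regular_astigmatism"], 2))  # -> 43.0 0.5
```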

  16. The quantitative assessment of body iron.

    PubMed

    Cook, James D; Flowers, Carol H; Skikne, Barry S

    2003-05-01

    Current initiatives to reduce the high prevalence of nutritional iron deficiency have highlighted the need for reliable epidemiologic methods to assess iron status. The present report describes a method for estimating body iron based on the ratio of the serum transferrin receptor to serum ferritin. Analysis showed a single normal distribution of body iron stores in US men aged 20 to 65 years (mean +/- 1 SD, 9.82 +/- 2.82 mg/kg). A single normal distribution was also observed in pregnant Jamaican women (mean +/- 1 SD, 0.09 +/- 4.48 mg/kg). Distribution analysis in US women aged 20 to 45 years indicated 2 populations; 93% of women had body iron stores averaging 5.5 +/- 3.35 mg/kg (mean +/- 1 SD), whereas the remaining 7% of women had a mean tissue iron deficit of 3.87 +/- 3.23 mg/kg. Calculations of body iron in trials of iron supplementation in Jamaica and iron fortification in Vietnam demonstrated that the method can be used to calculate absorption of the added iron. Quantitative estimates of body iron greatly enhance the evaluation of iron status and the sensitivity of iron intervention trials in populations in which inflammation is uncommon or has been excluded by laboratory screening. The method is useful clinically for monitoring iron status in those who are highly susceptible to iron deficiency.
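The abstract does not restate the estimating equation; the original report derives body iron from the common logarithm of the sTfR-to-ferritin ratio. Treating the regression constants below as an assumption to be checked against the source, the calculation looks like:

```python
import math

def body_iron_mg_per_kg(stfr_ug_l, ferritin_ug_l):
    """Body iron (mg/kg) from the serum transferrin receptor (sTfR) to
    serum ferritin ratio; positive values indicate iron stores, negative
    values a tissue iron deficit. Regression constants as attributed to
    Cook et al. (2003); verify against the original report before use."""
    ratio = stfr_ug_l / ferritin_ug_l
    return -(math.log10(ratio) - 2.8229) / 0.1207

# e.g. sTfR 5000 ug/L and ferritin 50 ug/L give a ratio of 100
print(round(body_iron_mg_per_kg(5000, 50), 2))  # -> 6.82
```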

  17. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainty in the exploration, construction, O&M, and drilling costs are also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
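The structure of the approach (uncertain inputs, a complementary CDF of LCOE, and consequence measured as deviation from a best-estimate LCOE) can be sketched with a toy cost model. Everything below, including the cost function and parameter distributions, is invented and far simpler than GT-Mod:

```python
import random

def lcoe(thermal_gradient, drill_cost_per_km, base_cost):
    # toy levelized-cost model (all made up): lower gradients force deeper,
    # costlier wells to reach the 225 C target resource temperature
    depth_km = 225.0 / thermal_gradient
    capital = base_cost + drill_cost_per_km * depth_km
    return capital / 10.0          # arbitrary scaling to cents/kWh

best_estimate = lcoe(50.0, 2.0, 30.0)   # 'best-guess' input values

# propagate input uncertainty by Monte Carlo
random.seed(7)
samples = sorted(lcoe(random.gauss(50.0, 8.0), random.gauss(2.0, 0.3), 30.0)
                 for _ in range(5000))

def ccdf(x):
    # complementary cumulative distribution: P(LCOE > x)
    return sum(s > x for s in samples) / len(samples)

# risk as consequence (deviation from the best estimate) weighted by the
# probability mass of each sample, summed over the distribution
risk = sum((s - best_estimate) / len(samples) for s in samples)
print(f"best estimate {best_estimate:.2f} c/kWh, expected deviation {risk:.2f}")
```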

  18. Quantitative Assessment of Autistic Symptomatology in Preschoolers

    ERIC Educational Resources Information Center

    Pine, Elyse; Luby, Joan; Abbacchi, Anna; Constantino, John N.

    2006-01-01

    Given a growing emphasis on early intervention for children with autism, valid quantitative tools for measuring treatment response are needed. The Social Responsiveness Scale (SRS) is a brief (15-20 minute) quantitative measure of autistic traits in 4-to 18-year-olds, for which a version for 3-year-olds was recently developed. We obtained serial…

  19. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  20. Quantitative risk assessment of Listeria monocytogenes in French cold-smoked salmon: I. Quantitative exposure assessment.

    PubMed

    Pouillot, Régis; Miconnet, Nicolas; Afchain, Anne-Laure; Delignette-Muller, Marie Laure; Beaufort, Annie; Rosso, Laurent; Denis, Jean-Baptiste; Cornu, Marie

    2007-06-01

    A quantitative assessment of the exposure to Listeria monocytogenes from cold-smoked salmon (CSS) consumption in France is developed. The general framework is a second-order (or two-dimensional) Monte Carlo simulation, which characterizes the uncertainty and variability of the exposure estimate. The model takes into account the competitive bacterial growth between L. monocytogenes and the background competitive flora from the end of the production line to the consumer phase. An original algorithm is proposed to integrate this growth in conditions of varying temperature. As part of a more general project led by the French Food Safety Agency (Afssa), specific data were acquired and modeled for this quantitative exposure assessment model, particularly time-temperature profiles, prevalence data, and contamination-level data. The sensitivity analysis points out the main influence of the mean temperature in household refrigerators and the prevalence of contaminated CSS on the exposure level. The outputs of this model can be used as inputs for further risk assessment.
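The core mechanics (temperature-dependent growth along a time-temperature profile, capped by competition with the background flora) can be sketched crudely. The square-root growth model, its parameters, and the Jameson-effect cutoff below are stand-ins chosen for illustration, not the study's actual algorithm:

```python
import random

def growth_rate(temp_c):
    # square-root (Ratkowsky-type) growth model with invented parameters:
    # mu = (b * (T - Tmin))^2 in log10/h, Tmin = -2.9 C for L. monocytogenes
    b, t_min = 0.03, -2.9
    return (b * max(0.0, temp_c - t_min)) ** 2

def final_count(n0_log, flora0_log, temps, n_max=9.0):
    """Grow L. monocytogenes along an hourly time-temperature profile;
    growth stops once the background flora reaches its maximum density,
    a crude Jameson-effect stand-in for interspecies competition."""
    n, flora = n0_log, flora0_log
    for t in temps:
        if flora < n_max:
            mu = growth_rate(t)
            n += mu
            flora += mu * 1.2      # flora assumed to grow slightly faster
    return n

# variability dimension: a random home-refrigerator profile, 7 days hourly
random.seed(3)
profile = [random.gauss(6.5, 2.0) for _ in range(24 * 7)]
print(round(final_count(1.0, 3.0, profile), 2))
```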

  1. A Quantitative Model for Assessing Faculty Promotion.

    ERIC Educational Resources Information Center

    Tekian, Ara; And Others

    This paper describes a quantitative model that can be used to evaluate faculty performance for promotion decisions. Through the use of an evaluation form, the system (1) informs faculty members how they will be evaluated at the end of each academic year; (2) allows faculty growth to be documented in teaching, research, and other activities which…

  2. Assessing Quantitative Reasoning in Young Children

    ERIC Educational Resources Information Center

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Barros, Rossana

    2015-01-01

    Before starting school, many children reason logically about concepts that are basic to their later mathematical learning. We describe a measure of quantitative reasoning that was administered to children at school entry (mean age 5.8 years) and accounted for more variance in a mathematical attainment test than general cognitive ability 16 months…

  3. Quantitative Assessment of Robot-Generated Maps

    NASA Astrophysics Data System (ADS)

    Scrapper, C.; Madhavan, R.; Lakaemper, R.; Censi, A.; Godil, A.; Wagan, A.; Jacoff, A.

Mobile robotic mapping is now considered to be a sufficiently mature field with demonstrated successes in various domains. While much progress has been made in the development of computationally efficient and consistent mapping schemes, it is still unclear, at best, how these maps can be evaluated. We are motivated by the absence of an accepted standard for quantitatively measuring the performance of robotic mapping systems against user-defined requirements. It is our belief that the development of standardized methods for quantitatively evaluating existing robotic technologies will improve the utility of mobile robots in already established application areas, such as vacuum cleaning, robot surveillance, and bomb disposal. This approach will also enable the proliferation and acceptance of such technologies in emerging markets. This chapter summarizes our preliminary efforts by bringing together the research community towards addressing this important problem which has ramifications not only from researchers' perspective but also from consumers', robot manufacturers', and developers' viewpoints.

  4. Doppler backscatter properties of a blood-mimicking fluid for Doppler performance assessment.

    PubMed

    Ramnarine, K V; Hoskins, P R; Routh, H F; Davidson, F

    1999-01-01

The Doppler backscatter properties of a blood-mimicking fluid (BMF) were studied to evaluate its suitability for use in a Doppler flow test object. Measurements were performed using a flow rig with C-flex tubing and BMF flow produced by a roller pump or a gear pump. A SciMed Doppler system was used to measure the backscattered Doppler power with a root-mean-square power meter connected to the audio output. Studies investigated the dependence of the backscattered Doppler power of the BMF with: circulation time; batch and operator preparations; storage; sieve size; flow speed; and pump type. A comparison was made with human red blood cells resuspended in saline. The backscatter properties are stable and within International Electrotechnical Commission requirements. The BMF is suitable for use in a test object for Doppler performance assessment.

  5. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability. It also discusses the application of the risk model to the software development life cycle.

  6. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  8. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  10. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past relatively high asbestos concentration levels down to usually much lower concentration levels of interest today--in some cases, orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.
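Extrapolations of this kind commonly treat lifetime excess risk as linear in cumulative exposure (fiber-ml-years). The sketch below uses a hypothetical unit-risk slope chosen only to reproduce the worker figure quoted above; real models differ by fiber type, cohort, and exposure timing (which is why the student figures do not follow from the same slope):

```python
def lifetime_excess_risk(concentration_f_ml, years, unit_risk):
    """Linear extrapolation: lifetime excess cancer risk taken as
    proportional to cumulative exposure in (f/ml)-years. The slope is
    model- and cohort-dependent; the value used below is hypothetical,
    chosen only to reproduce the worker figure quoted above."""
    return concentration_f_ml * years * unit_risk

UNIT_RISK = 4.1e-4    # hypothetical slope, per (f/ml)-year

# workers: 0.5 f/ml for 40 yr -> about 82 excess cancers per 10,000 exposed
print(round(lifetime_excess_risk(0.5, 40, UNIT_RISK) * 10_000, 6))
```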

  11. Quantitative performance assessments for neuromagnetic imaging systems.

    PubMed

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte-Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: A-prime metric and spatial resolution. We compute these performance metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how the system performance is improved, depending on the number of sensors. We also compute these metrics for existing whole-head MEG systems, MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan) that uses axial-gradiometer sensors, and TRIUX™ (Elekta Corporate, Stockholm, Sweden) that uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems.

  12. Quantitative Assessment of Abdominal Aortic Aneurysm Geometry

    PubMed Central

    Shum, Judy; Martufi, Giampaolo; Di Martino, Elena; Washington, Christopher B.; Grisafi, Joseph; Muluk, Satish C.; Finol, Ender A.

    2011-01-01

    Recent studies have shown that the maximum transverse diameter of an abdominal aortic aneurysm (AAA) and expansion rate are not entirely reliable indicators of rupture potential. We hypothesize that aneurysm morphology and wall thickness are more predictive of rupture risk and can be the deciding factors in the clinical management of the disease. A non-invasive, image-based evaluation of AAA shape was implemented on a retrospective study of 10 ruptured and 66 unruptured aneurysms. Three-dimensional models were generated from segmented, contrast-enhanced computed tomography images. Geometric indices and regional variations in wall thickness were estimated based on novel segmentation algorithms. A model was created using a J48 decision tree algorithm and its performance was assessed using ten-fold cross validation. Feature selection was performed using the χ2-test. The model correctly classified 65 datasets and had an average prediction accuracy of 86.6% (κ = 0.37). The highest ranked features were sac length, sac height, volume, surface area, maximum diameter, bulge height, and intra-luminal thrombus volume. Given that individual AAAs have complex shapes with local changes in surface curvature and wall thickness, the assessment of AAA rupture risk should be based on the accurate quantification of aneurysmal sac shape and size. PMID:20890661
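The J48 classifier builds its tree by repeatedly choosing the split that best separates the classes. A single-split sketch of that idea, scoring thresholds on one geometric index by information gain (the criterion at the core of C4.5/J48), with invented sac-length data:

```python
import math

def entropy(labels):
    counts = {l: labels.count(l) for l in set(labels)}
    return -sum(c / len(labels) * math.log2(c / len(labels))
                for c in counts.values())

def best_split(values, labels):
    """Exhaustively pick the threshold on one geometric index that
    maximizes information gain, the criterion at the core of C4.5/J48
    splits. The sac-length data below are invented for illustration."""
    base, best = entropy(labels), (None, -1.0)
    for t in sorted(set(values)):
        left = [l for v, l in zip(values, labels) if v <= t]
        right = [l for v, l in zip(values, labels) if v > t]
        if not left or not right:
            continue
        gain = (base - len(left) / len(labels) * entropy(left)
                     - len(right) / len(labels) * entropy(right))
        if gain > best[1]:
            best = (t, gain)
    return best

# hypothetical sac lengths (cm): ruptured ("r") AAAs tend to be longer
sac_len = [8.1, 9.4, 10.2, 11.5, 5.0, 5.8, 6.3, 7.0]
status = ["r", "r", "r", "r", "u", "u", "u", "u"]
print(best_split(sac_len, status))  # -> (7.0, 1.0)
```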

  13. Quantitative estimation in Health Impact Assessment: Opportunities and challenges

    SciTech Connect

    Bhatia, Rajiv; Seto, Edmund

    2011-04-15

    Health Impact Assessment (HIA) considers multiple effects on health of policies, programs, plans and projects and thus requires the use of diverse analytic tools and sources of evidence. Quantitative estimation has desirable properties for the purpose of HIA but adequate tools for quantification exist currently for a limited number of health impacts and decision settings; furthermore, quantitative estimation generates thorny questions about the precision of estimates and the validity of methodological assumptions. In the United States, HIA has only recently emerged as an independent practice apart from integrated EIA, and this article aims to synthesize the experience with quantitative health effects estimation within that practice. We use examples identified through a scan of available identified instances of quantitative estimation in the U.S. practice experience to illustrate methods applied in different policy settings along with their strengths and limitations. We then discuss opportunity areas and practical considerations for the use of quantitative estimation in HIA.

  14. Quantitative Assessments of the Martian Hydrosphere

    NASA Astrophysics Data System (ADS)

    Lasue, Jeremie; Mangold, Nicolas; Hauber, Ernst; Clifford, Steve; Feldman, William; Gasnault, Olivier; Grima, Cyril; Maurice, Sylvestre; Mousis, Olivier

    2013-01-01

In this paper, we review current estimates of the global water inventory of Mars, potential loss mechanisms, the thermophysical characteristics of the different reservoirs that water may be currently stored in, and assess how the planet's hydrosphere and cryosphere evolved with time. First, we summarize the water inventory quantified from geological analyses of surface features related to both liquid water erosion, and ice-related landscapes. They indicate that, throughout most of Martian geologic history (and possibly continuing through to the present day), water was present to substantial depths, with a total inventory ranging from several hundred to as much as 1000 m Global Equivalent Layer (GEL). We then review the most recent estimates of water content based on subsurface detection by orbital and landed instruments, including deep penetrating radars such as SHARAD and MARSIS. We show that the total amount of water measured so far is about 30 m GEL, although a far larger amount of water may be stored below the sounding depths of currently operational instruments. Finally, a global picture of the current state of the subsurface water reservoirs and their evolution is discussed.
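The GEL unit used throughout is just a volume spread over the whole planet. A minimal conversion, assuming the standard Mars volumetric mean radius:

```python
import math

MARS_RADIUS_KM = 3389.5   # volumetric mean radius of Mars

def gel_meters(water_volume_km3):
    """Global Equivalent Layer: the depth in meters if a given water
    volume were spread evenly over the entire Martian surface."""
    surface_km2 = 4.0 * math.pi * MARS_RADIUS_KM ** 2
    return water_volume_km3 / surface_km2 * 1000.0   # km -> m

# roughly 4.3 million km^3 of water corresponds to the ~30 m GEL above
print(round(gel_meters(4.33e6), 1))  # -> 30.0
```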

  15. Sensitive Quantitative Assessment of Balance Disorders

    NASA Technical Reports Server (NTRS)

Paloski, William H.

    2007-01-01

Computerized dynamic posturography (CDP) has become a standard technique for objectively quantifying balance control performance, diagnosing the nature of functional impairments underlying balance disorders, and monitoring clinical treatment outcomes. We have long used CDP protocols to assess recovery of sensory-motor function in astronauts following space flight. The most reliable indicators of post-flight crew performance are the sensory organization tests (SOTs), particularly SOTs 5 and 6, which are sensitive to changes in availability and/or utilization of vestibular cues. We have noted, however, that some astronauts exhibiting obvious signs of balance impairment after flight are able to score within clinical norms on these tests, perhaps as a result of adopting competitive strategies or by their natural skills at substituting alternate sensory information sources. This insensitivity of the CDP protocol could underestimate the degree of impairment and, perhaps, lead to premature release of those crewmembers to normal duties. To improve the sensitivity of the CDP protocol we have introduced static and dynamic head tilt SOT trials into our protocol. The pattern of postflight recovery quantified by the enhanced CDP protocol appears to more aptly track the re-integration of sensory-motor function, with recovery time increasing as the complexity of sensory-motor/biomechanical task increases. The new CDP protocol therefore seems more suitable for monitoring post-flight sensory-motor recovery and for indicating to crewmembers and flight surgeons fitness for return to duty and/or activities of daily living. There may be classes of patients (e.g., athletes, pilots) having motivation and/or performance characteristics similar to astronauts whose sensory-motor treatment outcomes would also be more accurately monitored using the enhanced CDP protocol. Furthermore, the enhanced protocol may be useful in early detection of age-related balance disorders.

  16. Sensitive Quantitative Assessment of Balance Disorders

    NASA Technical Reports Server (NTRS)

    Paloski, William H.

    2007-01-01

    Computerized dynamic posturography (CDP) has become a standard technique for objectively quantifying balance control performance, diagnosing the nature of functional impairments underlying balance disorders, and monitoring clinical treatment outcomes. We have long used CDP protocols to assess recovery of sensory-motor function in astronauts following space flight. The most reliable indicators of post-flight crew performance are the sensory organization tests (SOTs), particularly SOTs 5 and 6, which are sensitive to changes in availability and/or utilization of vestibular cues. We have noted, however, that some astronauts exhibiting obvious signs of balance impairment after flight are able to score within clinical norms on these tests, perhaps as a result of adopting competitive strategies or by their natural skills at substituting alternate sensory information sources. This insensitivity of the CDP protocol could underestimate the degree of impairment and, perhaps, lead to premature release of those crewmembers to normal duties. To improve the sensitivity of the CDP protocol we have introduced static and dynamic head tilt SOT trials into our protocol. The pattern of postflight recovery quantified by the enhanced CDP protocol appears to more aptly track the re-integration of sensory-motor function, with recovery time increasing as the complexity of the sensory-motor/biomechanical task increases. The new CDP protocol therefore seems more suitable for monitoring post-flight sensory-motor recovery and for indicating to crewmembers and flight surgeons fitness for return to duty and/or activities of daily living. There may be classes of patients (e.g., athletes, pilots) having motivation and/or performance characteristics similar to astronauts whose sensory-motor treatment outcomes would also be more accurately monitored using the enhanced CDP protocol. Furthermore, the enhanced protocol may be useful in early detection of age-related balance disorders.

  17. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative to the more traditional quantitative. Constraints such as time, money, manpower, skills, management perceptions, risk result communication to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability for each. Limitations and problems for each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency vs consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.
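The qualitative end of the risk-matrix spectrum described above amounts to a category lookup of accident frequency against consequence. A minimal sketch follows; the category labels and the risk level assigned to each cell are illustrative assumptions, not values from the paper:

```python
# Minimal qualitative risk matrix: a frequency category and a consequence
# category map to a risk level. Labels and cell assignments are illustrative.
FREQ = ["improbable", "remote", "occasional", "frequent"]
CONS = ["negligible", "marginal", "critical", "catastrophic"]

# Rows indexed by frequency, columns by consequence.
MATRIX = [
    ["low",    "low",    "medium", "medium"],
    ["low",    "medium", "medium", "high"],
    ["medium", "medium", "high",   "high"],
    ["medium", "high",   "high",   "high"],
]

def risk_level(frequency: str, consequence: str) -> str:
    """Look up the qualitative risk level for a scenario."""
    return MATRIX[FREQ.index(frequency)][CONS.index(consequence)]

print(risk_level("frequent", "negligible"))   # medium
print(risk_level("remote", "catastrophic"))   # high
```

A fully quantitative matrix would replace the axis categories with numeric frequency and consequence scales, as the paper's multi-attribute utility approach does.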

  18. Quantitative health impact assessment: current practice and future directions

    PubMed Central

    Veerman, J; Barendregt, J; Mackenbach, J

    2005-01-01

    Study objective: To assess what methods are used in quantitative health impact assessment (HIA), and to identify areas for future research and development. Design: HIA reports were assessed for (1) methods used to quantify effects of policy on determinants of health (exposure impact assessment) and (2) methods used to quantify health outcomes resulting from changes in exposure to determinants (outcome assessment). Main results: Of 98 prospective HIA studies, 17 reported quantitative estimates of change in exposure to determinants, and 16 gave quantified health outcomes. Eleven (categories of) determinants were quantified up to the level of health outcomes. Methods for exposure impact assessment were: estimation on the basis of routine data and measurements, and various kinds of modelling of traffic related and environmental factors, supplemented with experts' estimates and author's assumptions. Some studies used estimates from other documents pertaining to the policy. For the calculation of health outcomes, variants of epidemiological and toxicological risk assessment were used, in some cases in mathematical models. Conclusions: Quantification is comparatively rare in HIA. Methods are available in the areas of environmental health and, to a lesser extent, traffic accidents, infectious diseases, and behavioural factors. The methods are diverse and their reliability and validity are uncertain. Research and development in the following areas could benefit quantitative HIA: methods to quantify the effect of socioeconomic and behavioural determinants; user friendly simulation models; the use of summary measures of public health, expert opinion and scenario building; and empirical research into validity and reliability. PMID:15831683

  19. Quantitative wearable sensors for objective assessment of Parkinson's disease.

    PubMed

    Maetzler, Walter; Domingos, Josefa; Srulijes, Karin; Ferreira, Joaquim J; Bloem, Bastiaan R

    2013-10-01

    There is a rapidly growing interest in the quantitative assessment of Parkinson's disease (PD)-associated signs and disability using wearable technology. Both persons with PD and their clinicians see advantages in such developments. Specifically, quantitative assessments using wearable technology may allow for continuous, unobtrusive, objective, and ecologically valid data collection. Also, this approach may improve patient-doctor interaction, influence therapeutic decisions, and ultimately ameliorate patients' global health status. In addition, such measures have the potential to be used as outcome parameters in clinical trials, allowing for frequent assessments; eg, in the home setting. This review discusses promising wearable technology, addresses which parameters should be prioritized in such assessment strategies, and reports about studies that have already investigated daily life issues in PD using this new technology.

  20. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes the formula for calculating the aesthetic index of treatment outcome. The formula was derived on the basis of the obtained regression equations showing the dependence of visual assessment of the value of aesthetic violations. The formula can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling before and after dental treatment.

  1. Qualitative and Quantitative Hippocampal MRI Assessments in Intractable Epilepsy

    PubMed Central

    Singh, Paramdeep; Kaur, Rupinderjeet; Saggar, Kavita; Singh, Gagandeep; Kaur, Amarpreet

    2013-01-01

    Aims. To acquire normative data of hippocampal volumes and T2 relaxation times, to evaluate and compare qualitative and quantitative assessments in evaluating hippocampi in patients with different durations of intractable epilepsy, and to propose an imaging protocol based on the performance of these techniques. Methods. MRI analysis was done in 50 nonepileptic controls and 30 patients with intractable epilepsy on a 1.5T scanner. Visual assessment and hippocampal volumetry were done on oblique coronal IR/T2W and T1W MP-RAGE images, respectively. T2 relaxation times were measured using a 16-echo Carr-Purcell-Meiboom-Gill sequence. Volumetric data were normalized for variation in head size between individuals. Patients were divided into temporal (n = 20) and extratemporal (n = 10) groups based on clinical and EEG localization. Results. In controls, right hippocampal volume was slightly greater than the left, with no effect of age or gender. In TLE patients, hippocampal volumetry provided maximum concordance with EEG. Visual assessment of unilateral pathology concurred well with measured quantitative values but poorly in cases with bilateral pathologies. There were no significant differences in mean values between the extratemporal group and the control group. Quantitative techniques detected mild abnormalities undetected on visual assessment. Conclusions. Quantitative techniques are more sensitive in diagnosing bilateral and mild unilateral hippocampal abnormalities. PMID:23984369

  2. Acoustic assessment of a konjac–carrageenan tissue-mimicking material at 5–60 MHz.

    PubMed

    Kenwright, David A; Sadhoo, Neelaksh; Rajagopal, Srinath; Anderson, Tom; Moran, Carmel M; Hadoke, Patrick W; Gray, Gillian A; Zeqiri, Bajram; Hoskins, Peter R

    2014-12-01

    The acoustic properties of a robust tissue-mimicking material based on konjac–carrageenan at ultrasound frequencies in the range 5–60 MHz are described. Acoustic properties were characterized using two methods: a broadband reflection substitution technique using a commercially available preclinical ultrasound scanner (Vevo 770, FUJIFILM VisualSonics, Toronto, ON, Canada), and a dedicated high-frequency ultrasound facility developed at the National Physical Laboratory (NPL, Teddington, UK), which employed a broadband through-transmission substitution technique. The mean speed of sound across the measured frequencies was found to be 1551.7 ± 12.7 and 1547.7 ± 3.3 m s(-1), respectively. The attenuation exhibited a non-linear dependence on frequency, f (MHz), in the form of a polynomial function: 0.009787f(2) + 0.2671f and 0.01024f(2) + 0.3639f, respectively. The characterization of this tissue-mimicking material will provide reference data for designing phantoms for preclinical systems, which may, in certain applications such as flow phantoms, require a physically more robust tissue-mimicking material than is currently available.
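The two attenuation fits above are plain second-order polynomials in frequency, so comparing them at any frequency is straightforward. A small sketch evaluating both (the dB/cm unit is the conventional choice for such fits but is an assumption here, as the abstract does not state it):

```python
def atten_vevo(f_mhz: float) -> float:
    """Attenuation fit from the Vevo 770 measurements (units assumed dB/cm)."""
    return 0.009787 * f_mhz**2 + 0.2671 * f_mhz

def atten_npl(f_mhz: float) -> float:
    """Attenuation fit from the NPL through-transmission measurements."""
    return 0.01024 * f_mhz**2 + 0.3639 * f_mhz

# Compare the two fits across the characterized 5-60 MHz range.
for f in (5, 30, 60):
    print(f"{f} MHz: Vevo fit {atten_vevo(f):.2f}, NPL fit {atten_npl(f):.2f}")
```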

  3. Sites of Superoxide and Hydrogen Peroxide Production by Muscle Mitochondria Assessed ex Vivo under Conditions Mimicking Rest and Exercise*

    PubMed Central

    Goncalves, Renata L. S.; Quinlan, Casey L.; Perevoshchikova, Irina V.; Hey-Mogensen, Martin; Brand, Martin D.

    2015-01-01

    The sites and rates of mitochondrial production of superoxide and H2O2 in vivo are not yet defined. At least 10 different mitochondrial sites can generate these species. Each site has a different maximum capacity (e.g. the outer quinol site in complex III (site IIIQo) has a very high capacity in rat skeletal muscle mitochondria, whereas the flavin site in complex I (site IF) has a very low capacity). The maximum capacities can greatly exceed the actual rates observed in the absence of electron transport chain inhibitors, so maximum capacities are a poor guide to actual rates. Here, we use new approaches to measure the rates at which different mitochondrial sites produce superoxide/H2O2 using isolated muscle mitochondria incubated in media mimicking the cytoplasmic substrate and effector mix of skeletal muscle during rest and exercise. We find that four or five sites dominate during rest in this ex vivo system. Remarkably, the quinol site in complex I (site IQ) and the flavin site in complex II (site IIF) each account for about a quarter of the total measured rate of H2O2 production. Site IF, site IIIQo, and perhaps site EF in the β-oxidation pathway account for most of the remainder. Under conditions mimicking mild and intense aerobic exercise, total production is much less, and the low capacity site IF dominates. These results give novel insights into which mitochondrial sites may produce superoxide/H2O2 in vivo. PMID:25389297

  4. Elastic properties of soft tissue-mimicking phantoms assessed by combined use of laser ultrasonics and low coherence interferometry.

    PubMed

    Li, Chunhui; Huang, Zhihong; Wang, Ruikang K

    2011-05-23

    Advances in the field of laser ultrasonics have opened up new possibilities in medical applications. This paper evaluates this technique as a method that would allow for rapid characterization of the elastic properties of soft biological tissue. In doing so, we propose a novel approach that utilizes a low coherence interferometer to detect the laser-induced surface acoustic waves (SAW) from the tissue-mimicking phantoms. A Nd:YAG focused laser line-source is applied to one- and two-layer tissue-mimicking agar-agar phantoms, and the generated SAW signals are detected by a time domain low coherence interferometry system. SAW phase velocity dispersion curves are calculated, from which the elasticity of the specimens is evaluated. We show that the experimental results agree well with those of the theoretical expectations. This study is the first report that a laser-generated SAW phase velocity dispersion technique is applied to soft materials. This technique may open a way for laser ultrasonics to detect the mechanical properties of soft tissues, such as skin.
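For nearly incompressible soft media, surface acoustic (Rayleigh) wave speed relates to shear modulus approximately as c_R ≈ 0.955·sqrt(G/ρ), which is the kind of relation that lets a measured SAW phase velocity be inverted to elasticity. A sketch of that inversion; the 0.955 factor (Poisson ratio near 0.5) and the water-like density are standard soft-tissue assumptions, not values from the paper:

```python
import math

def shear_modulus_from_saw(c_r: float, rho: float = 1000.0) -> float:
    """Estimate shear modulus G (Pa) from Rayleigh/SAW phase velocity c_r (m/s).

    Uses c_R ~ 0.955 * sqrt(G / rho), an approximation valid for nearly
    incompressible media (Poisson ratio ~ 0.5), common for soft tissue.
    """
    return rho * (c_r / 0.955) ** 2

# Example: a 3 m/s surface wave in a water-density medium (assumed values).
print(f"G ~ {shear_modulus_from_saw(3.0):.0f} Pa")
```

In the layered-phantom case the phase velocity is frequency dependent, so the inversion is applied to the dispersion curve rather than a single velocity.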

  5. Measurement of guided mode wavenumbers in soft tissue-bone mimicking phantoms using ultrasonic axial transmission

    NASA Astrophysics Data System (ADS)

    Chen, Jiangang; Foiret, Josquin; Minonzio, Jean-Gabriel; Talmant, Maryline; Su, Zhongqing; Cheng, Li; Laugier, Pascal

    2012-05-01

    Human soft tissue is an important factor that influences the assessment of human long bones using quantitative ultrasound techniques. To investigate such influence, a series of soft tissue-bone phantoms (a bone-mimicking plate coated with a layer of water, glycerol or silicon rubber) were ultrasonically investigated using a probe with multi-emitter and multi-receiver arrays in an axial transmission configuration. A singular value decomposition signal processing technique was applied to extract the frequency-dependent wavenumbers of several guided modes. The results indicate that the presence of a soft tissue-mimicking layer introduces additional guided modes predicted by a fluid waveguide model. The modes propagating in the bone-mimicking plate covered by the soft-tissue phantom are only slightly modified compared to their counterparts in the free bone-mimicking plate, and they are still predicted by an elastic transverse isotropic two-dimensional waveguide. Altogether these observations suggest that the soft tissue-bone phantoms can be modeled as two independent waveguides. Even in the presence of the overlying soft tissue-mimicking layer, the modes propagating in the bone-mimicking plate can still be extracted and identified. These results suggest that our approach can be applied for the purpose of the characterization of the material and structural properties of cortical bone.

  6. Measurement of guided mode wavenumbers in soft tissue-bone mimicking phantoms using ultrasonic axial transmission.

    PubMed

    Chen, Jiangang; Foiret, Josquin; Minonzio, Jean-Gabriel; Talmant, Maryline; Su, Zhongqing; Cheng, Li; Laugier, Pascal

    2012-05-21

    Human soft tissue is an important factor that influences the assessment of human long bones using quantitative ultrasound techniques. To investigate such influence, a series of soft tissue-bone phantoms (a bone-mimicking plate coated with a layer of water, glycerol or silicon rubber) were ultrasonically investigated using a probe with multi-emitter and multi-receiver arrays in an axial transmission configuration. A singular value decomposition signal processing technique was applied to extract the frequency-dependent wavenumbers of several guided modes. The results indicate that the presence of a soft tissue-mimicking layer introduces additional guided modes predicted by a fluid waveguide model. The modes propagating in the bone-mimicking plate covered by the soft-tissue phantom are only slightly modified compared to their counterparts in the free bone-mimicking plate, and they are still predicted by an elastic transverse isotropic two-dimensional waveguide. Altogether these observations suggest that the soft tissue-bone phantoms can be modeled as two independent waveguides. Even in the presence of the overlying soft tissue-mimicking layer, the modes propagating in the bone-mimicking plate can still be extracted and identified. These results suggest that our approach can be applied for the purpose of the characterization of the material and structural properties of cortical bone.

  7. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This will help determine what useful structural quantitative and qualitative data may be provided, from raw materials to vehicle refurbishment. The assessment considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and presented along with a description of the structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations have been provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components and the end item structure, and during refurbishment operations.

  8. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    In light of the problem that stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived through analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing the three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended greatly on the research emphasis chosen, and that the ranking of the three representative contaminants' hazards differed from that of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculated results. In addition, normalizing the three properties by rank and unifying the quantified property results would scale the relative property characteristics of the different representative contaminants up or down.
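The analytic hierarchy process used above to weight contaminant properties derives weights from a pairwise-comparison matrix, conventionally via its principal eigenvector. A minimal sketch with power iteration; the 3×3 comparison judgments are invented for illustration, not taken from the study:

```python
# Analytic hierarchy process (AHP): derive weights from a pairwise
# comparison matrix via power iteration on its principal eigenvector.
def ahp_weights(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iters):
        # Multiply matrix by current weight vector, then renormalize.
        w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w_new)
        w = [x / s for x in w_new]
    return w

# Hypothetical pairwise judgments for three contaminant properties:
# entry [i][j] says property i is that many times as important as j.
M = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
]
print([round(w, 3) for w in ahp_weights(M)])
```

A full AHP application would also check the consistency ratio of the judgment matrix before accepting the weights.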

  9. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from the more qualitative approaches to more quantitative approaches in the past decade. This has been facilitated by the improvement in computer hardware and software capability and novel computational approaches being slowly recognized by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches as described in the guidance documents from several of the regulatory agencies as it pertains to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  10. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors because, given the dominance of tonnage, deposit models are the best known predictors of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can be considered training tracts. Cover has a profound effect on uncertainty and on methods and procedures of assessments because geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed on the surface; these will need to be relearned based on covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral

  11. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  12. Status and future of Quantitative Microbiological Risk Assessment in China.

    PubMed

    Dong, Q L; Barker, G C; Gorris, L G M; Tian, M S; Song, X Y; Malakar, P K

    2015-03-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009 use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria which cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, effectiveness of microbial risk assessment utility for risk management decision making, and application of QMRA to establish appropriate Food Safety Objectives.

  13. Quantitative risk assessments as evidence in civil litigation

    SciTech Connect

    Walker, V.R.

    1988-12-01

    Those who prepare quantitative risk assessments do not always appreciate that those assessments might be used as evidence in civil litigation. This paper suggests that litigation attorneys, judges, and juries be regarded as audiences to whom the information in the risk assessment must be communicated. The way that a risk assessment is prepared can affect significantly whether litigation is brought at all, the resolution of evidentiary motions involving the risk assessment, as well as the ultimate outcome of the litigation. This paper discusses certain procedural and evidentiary aspects of the civil litigation process in the hope that a better understanding of that process might lead to the preparation of risk assessments that are more adequately understood by juries, judges, and litigants.

  14. Quantitative transverse flow assessment using OCT speckle decorrelation analysis

    NASA Astrophysics Data System (ADS)

    Liu, Xuan; Huang, Yong; Ramella-Roman, Jessica C.; Kang, Jin U.

    2013-03-01

    In this study, we demonstrate the use of inter-A-scan speckle decorrelation analysis of optical coherence tomography (OCT) to assess fluid flow. This method allows quantitative measurement of fluid flow in a plane normal to the scanning beam. To validate this method, OCT images were obtained from a microfluidic channel with bovine milk flowing at different speeds. We also imaged a blood vessel in an in vivo animal model and performed speckle analysis to assess blood flow.
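The core of inter-A-scan speckle decorrelation is a correlation coefficient between successive A-scans: the faster the scatterers move between acquisitions, the lower the correlation. A toy sketch on synthetic signals (not the authors' processing chain), where "motion" is mimicked by shifting the speckle pattern:

```python
import math
import random

def decorrelation(a, b):
    """1 - Pearson correlation between two A-scan intensity profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return 1.0 - cov / (va * vb)

random.seed(0)
scan = [random.random() for _ in range(512)]     # synthetic speckle profile
static = scan[:]                                 # no motion: identical speckle
moving = scan[5:] + scan[:5]                     # motion: shifted speckle
print(decorrelation(scan, static))               # ~0 (fully correlated)
print(decorrelation(scan, moving) > decorrelation(scan, static))  # True
```

In practice the decorrelation is calibrated against known flow speeds (as with the bovine-milk channel) before being applied to vessels.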

  15. Multiparametric Quantitative Ultrasound Imaging in Assessment of Chronic Kidney Disease.

    PubMed

    Gao, Jing; Perlman, Alan; Kalache, Safa; Berman, Nathaniel; Seshan, Surya; Salvatore, Steven; Smith, Lindsey; Wehrli, Natasha; Waldron, Levi; Kodali, Hanish; Chevalier, James

    2017-04-13

    To evaluate the value of multiparametric quantitative ultrasound imaging in assessing chronic kidney disease (CKD) using kidney biopsy pathologic findings as reference standards. We prospectively measured multiparametric quantitative ultrasound markers with grayscale, spectral Doppler, and acoustic radiation force impulse imaging in 25 patients with CKD before kidney biopsy and 10 healthy volunteers. Based on all pathologic (glomerulosclerosis, interstitial fibrosis/tubular atrophy, arteriosclerosis, and edema) scores, the patients with CKD were classified into mild (no grade 3 and <2 of grade 2) and moderate to severe (at least 2 of grade 2 or 1 of grade 3) CKD groups. Multiparametric quantitative ultrasound parameters included kidney length, cortical thickness, pixel intensity, parenchymal shear wave velocity, intrarenal artery peak systolic velocity (PSV), end-diastolic velocity (EDV), and resistive index. We tested the difference in quantitative ultrasound parameters among mild CKD, moderate to severe CKD, and healthy controls using analysis of variance, analyzed correlations of quantitative ultrasound parameters with pathologic scores and the estimated glomerular filtration rate (GFR) using Pearson correlation coefficients, and examined the diagnostic performance of quantitative ultrasound parameters in determining moderate CKD and an estimated GFR of less than 60 mL/min/1.73 m(2) using receiver operating characteristic curve analysis. There were significant differences in cortical thickness, pixel intensity, PSV, and EDV among the 3 groups (all P < .01). Among quantitative ultrasound parameters, the top areas under the receiver operating characteristic curves for PSV and EDV were 0.88 and 0.97, respectively, for determining pathologic moderate to severe CKD, and 0.76 and 0.86 for estimated GFR of less than 60 mL/min/1.73 m(2) . Moderate to good correlations were found for PSV, EDV, and pixel intensity with pathologic scores and estimated GFR. The
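The areas under the ROC curves reported for PSV and EDV can be computed nonparametrically as the Mann-Whitney statistic, i.e. the probability that a randomly chosen case and control are ranked correctly. A small sketch with made-up velocity values (not the study's data):

```python
def auc(cases, controls):
    """Nonparametric ROC area: P(case value > control value), ties as 1/2.

    Assumes higher values indicate the positive class; pass the groups
    in whichever order matches the abnormal direction of the marker.
    """
    wins = sum(
        1.0 if x > y else 0.5 if x == y else 0.0
        for x in cases
        for y in controls
    )
    return wins / (len(cases) * len(controls))

# Hypothetical intrarenal EDV values (cm/s): lower in moderate-to-severe
# CKD, so the healthy group is passed first to score the healthy-high
# direction of the marker.
ckd = [8, 10, 11, 13, 14]
healthy = [12, 15, 16, 18, 20]
print(auc(healthy, ckd))  # high AUC: EDV separates the two groups well
```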

  16. Quantitative-PCR assessment of Cryptosporidium parvum cell culture infection.

    PubMed

    Di Giovanni, George D; LeChevallier, Mark W

    2005-03-01

    A quantitative TaqMan PCR method was developed for assessing the Cryptosporidium parvum infection of in vitro cultivated human ileocecal adenocarcinoma (HCT-8) cell cultures. This method, termed cell culture quantitative sequence detection (CC-QSD), has numerous applications, several of which are presented. CC-QSD was used to investigate parasite infection in cell culture over time, the effects of oocyst treatment on infectivity and infectivity assessment of different C. parvum isolates. CC-QSD revealed that cell culture infection at 24 and 48 h postinoculation was approximately 20 and 60%, respectively, of the endpoint 72-h postinoculation infection. Evaluation of three different lots of C. parvum Iowa isolate oocysts revealed that the mean infection of 0.1 N HCl-treated oocysts was only 36% of the infection obtained with oocysts treated with acidified Hanks' balanced salt solution containing 1% trypsin. CC-QSD comparison of the C. parvum Iowa and TAMU isolates revealed significantly higher levels of infection for the TAMU isolate, which agrees with and supports previous human, animal, and cell culture studies. CC-QSD has the potential to aid in the optimization of Cryptosporidium cell culture methods and facilitate quantitative evaluation of cell culture infectivity experiments.

  17. Quantitative assessment of regional right ventricular function with color kinesis.

    PubMed

    Vignon, P; Weinert, L; Mor-Avi, V; Spencer, K T; Bednarz, J; Lang, R M

    1999-06-01

    We used color kinesis, a recent echocardiographic technique that provides regional information on the magnitude and timing of endocardial wall motion, to quantitatively assess regional right ventricular (RV) systolic and diastolic properties in 76 subjects who were divided into five groups, as follows: normal (n = 20), heart failure (n = 15), pressure/volume overload (n = 14), pressure overload (n = 12), and RV hypertrophy (n = 15). Quantitative segmental analysis of color kinesis images was used to obtain regional fractional area change (RFAC), which was displayed in the form of stacked histograms to determine patterns of endocardial wall motion. Time curves of integrated RFAC were used to objectively identify asynchrony of diastolic endocardial motion. When compared with normal subjects, patients with pressure overload or heart failure exhibited significantly decreased endocardial motion along the RV free wall. In the presence of mixed pressure/volume overload, the markedly increased ventricular septal motion compensated for decreased RV free wall motion. Diastolic endocardial wall motion was delayed in 17 of 72 segments (24%) in patients with RV pressure overload, and in 31 of 90 segments (34%) in patients with RV hypertrophy. Asynchrony of diastolic endocardial wall motion was greater in the latter group than in normal subjects (16% versus 10%: p < 0.01). Segmental analysis of color kinesis images allows quantitative assessment of regional RV systolic and diastolic properties.

  18. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment.

  19. Quantitative CT: technique dependency of volume assessment for pulmonary nodules

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Richard, Samuel; Barnhart, Huiman; Colsher, James; Amurao, Maxwell; Samei, Ehsan

    2010-04-01

    Current lung nodule size assessment methods typically rely on one-dimensional estimation of lesions. While new 3D volume assessment techniques using MSCT scan data have enabled improved estimation of lesion size, the effect of acquisition and reconstruction parameters on the accuracy and precision of such estimation has not been adequately investigated. To characterize these dependencies, we scanned an anthropomorphic thoracic phantom containing synthetic nodules using protocols with various acquisition and reconstruction parameters. We also scanned the phantom repeatedly with the same protocol to investigate repeatability. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Accuracy (bias) and precision (variance) of the volume assessment were calculated across the nodules and compared between protocols via Generalized Estimating Equation analysis. Results suggest a strong dependence of accuracy and precision on dose level but little dependence on reconstruction thickness, thus providing possible guidelines for protocol optimization for quantitative tasks.
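
The accuracy/precision decomposition described above can be sketched simply. The study used Generalized Estimating Equation analysis across nodules and protocols; this minimal example, with hypothetical repeated-scan estimates of a known synthetic-nodule volume, only illustrates the underlying bias and variability quantities.

```python
import statistics

# Hedged sketch: bias (accuracy) and repeatability (precision) of volume
# estimates across repeated scans. All numbers are hypothetical.
true_volume = 500.0                              # mm^3, known synthetic-nodule volume
estimates = [520.0, 510.0, 495.0, 530.0, 505.0]  # repeated-scan estimates (hypothetical)

bias = statistics.mean(estimates) - true_volume  # accuracy: systematic offset
percent_bias = 100.0 * bias / true_volume
precision = statistics.stdev(estimates)          # precision: scan-to-scan variability (SD)

print(round(percent_bias, 1), round(precision, 1))
```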

  20. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion, and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (flow = 0.5, 1, 2, 3 ml/g/min; cardiac output = 3, 5, 8 L/min). CT acquisitions were simulated with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels for dynamic acquisitions with a range of scenarios, including 1, 2, and 3 sec sampling for 30 sec with 25, 70, and 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated, and multiple MBF estimation methods were applied to each of these. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE = ~1.2 ml/g/min for both). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had comparable impact on MBF estimate fidelity. On average, half-dose acquisitions increased the RMSE of estimates by only 18%, suggesting that substantial dose reductions can be employed in the context of quantitative myocardial perfusion assessment.
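
The error metric quoted above (RMSE of flow estimates against the simulated ground truth) is straightforward to compute; the flow values below are hypothetical stand-ins for the study's realizations.

```python
import math

# Minimal sketch: root mean square error of myocardial blood flow (MBF)
# estimates against known simulated flows. Values are hypothetical.

def rmse(estimates, truths):
    return math.sqrt(sum((e - t) ** 2 for e, t in zip(estimates, truths)) / len(estimates))

truth = [0.5, 1.0, 2.0, 3.0]   # simulated ground-truth flows (ml/g/min)
est   = [0.9, 1.4, 2.5, 2.4]   # hypothetical kinetic-model estimates

print(round(rmse(est, truth), 2))
```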

  1. Quantitative risk assessment in aerospace: Evolution from the nuclear industry

    SciTech Connect

    Frank, M.V.

    1996-12-31

    In 1987, the National Aeronautics and Space Administration (NASA) and the aerospace industry relied on failure mode and effects analysis (FMEA) and hazards analysis as the primary tools for safety and reliability of their systems. The FMEAs were reviewed to identify critical items using a set of qualitative criteria. Hazards and critical items judged worst by these qualitative methods were to be either eliminated by a design change or controlled by the addition of a safeguard. However, limitations of space, weight, technical feasibility, and cost frequently meant that critical items and hazards could be neither eliminated nor controlled. In these situations, program management accepted the risk. How much risk was being accepted was unknown, because quantitative risk assessment methods were not used. Perhaps the greatest contribution of the nuclear industry to NASA and the aerospace industry was the introduction of modern (i.e., post-WASH-1400) quantitative risk assessment concepts and techniques. The concepts of risk assessment that have been most useful in the aerospace industry are the following: 1. combination of accident sequence diagrams, event trees, and fault trees to model scenarios and their causative factors; 2. use of Bayesian analysis of system and component failure data; 3. evaluation and presentation of uncertainties in the risk estimates.
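
Item 2 in the list (Bayesian analysis of component failure data) is commonly implemented with a conjugate Gamma prior on a Poisson failure rate; a minimal sketch with hypothetical prior and test data:

```python
# Hedged illustration (not from the paper): conjugate Gamma-Poisson update of a
# component failure rate. Prior evidence and observed data are hypothetical.

prior_alpha, prior_beta = 2.0, 1000.0  # Gamma prior: ~2 failures per 1000 h of prior evidence
failures, exposure_hours = 3, 5000.0   # observed test data (hypothetical)

# Conjugate update: alpha accumulates failures, beta accumulates operating time.
post_alpha = prior_alpha + failures
post_beta = prior_beta + exposure_hours
posterior_mean_rate = post_alpha / post_beta  # failures per hour

print(round(posterior_mean_rate, 6))
```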

  2. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    PubMed

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess evolution for all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters objectified adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases and doubtful in 16%, while pseudarthrosis seemed to occur in 4% (two cases). In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  3. Quantitative Assessment of the Canine Pupillary Light Reflex

    PubMed Central

    Whiting, Rebecca E. H.; Yao, Gang; Narfström, Kristina; Pearce, Jacqueline W.; Coates, Joan R.; Dodam, John R.; Castaner, Leilani J.; Katz, Martin L.

    2013-01-01

    Purpose. To develop instrumentation and methods for thorough quantitative assessment of the pupillary light reflex (PLR) in dogs under varying stimulus conditions. Methods. The PLR was recorded in normal Dachshunds using a custom system allowing full user control over stimulus intensity, color, and duration. Chemical restraint protocols were compared to determine which protocol provided for optimal baseline stability of pupil size and appropriate eye positioning. A series of white light stimuli of increasing intensity was used to elicit pupil constriction. Pupil images were concurrently recorded using continuous infrared illumination and an infrared-sensitive camera. The PLR was also recorded in response to blue and red stimuli. Results. With injectable chemical restraint alone, spontaneous fluctuations in pupil size occurred independent of light stimulation, and spontaneous eye movements made it difficult to fully visualize the pupil. Combined injectable chemical and inhalation restraint provided a steady baseline pupil size throughout PLR assessment and allowed for stable positioning of the eye using a conjunctival stay suture. Robust PLRs were elicited with all light colors. PLR constriction amplitude increased with increasing flash intensity and ranged from 5% to 70%. Conclusions. A recording system and protocol have been developed to reliably quantify the canine PLR. The techniques and instrumentation will be useful for objective quantitative assessment of the PLR in dogs and other species in research applications and may be useful in clinical veterinary ophthalmology and neurology if PLR abnormalities detected with these procedures can be associated with specific diseases. PMID:23847311

  4. Quantitative Cardiac Assessment in Fetal Tetralogy of Fallot.

    PubMed

    Jatavan, Phudit; Tongprasert, Fuanglada; Srisupundit, Kasemsri; Luewan, Suchaya; Traisrisilp, Kuntharee; Tongsong, Theera

    2016-07-01

    The purpose of this study was to quantitatively assess cardiac function and biometric parameters in fetuses with a diagnosis of tetralogy of Fallot and compare them to those in healthy fetuses. Two hundred healthy fetuses and 20 fetuses with a diagnosis of classic tetralogy of Fallot were quantitatively assessed for 16 cardiac parameters, including morphologic characteristics and functions. All recruited fetuses were in the second trimester with correct gestational ages. The measured values that were out of normal reference ranges were considered abnormal. Rates of abnormalities of these parameters were compared between the groups. The significant parameters were further analyzed for their sensitivity, specificity, and likelihood ratio. Of the 16 parameters, rates of abnormalities in 7 parameters, including right ventricular wall thickness, peak systolic velocities (PSVs) in the pulmonary artery and aorta, time to peak velocity, or acceleration time, in the pulmonary artery, aortic valve diameter, pulmonary valve diameter, and aortic-to-pulmonary valve diameter ratio, were significantly higher in fetuses with tetralogy of Fallot (P < .001). The pulmonary artery PSV, pulmonary artery time to peak velocity, aortic valve diameter, pulmonary valve diameter, and aortic-to-pulmonary valve diameter ratio had high sensitivities (80.0%, 75.0%, 90.0%, 90.0%, and 100.0%, respectively) and specificities (95.5%, 97.0%, 94.5%, 96.0%, and 84.5%). In addition to a routine anatomic examination, quantitative assessment of fetal hemodynamics, especially an abnormally high PSV in the pulmonary artery, as well as a shortened acceleration time and abnormal valve size, might be very helpful for confirmation of the diagnosis in cases of suspected tetralogy of Fallot.
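
The screening statistics reported above follow from a standard 2x2 table; the counts below are hypothetical (the study analyzed 20 tetralogy and 200 healthy fetuses), and the positive likelihood ratio is sensitivity / (1 - specificity).

```python
# Illustrative 2x2-table arithmetic (hypothetical counts, not study data).
tp, fn = 18, 2    # affected fetuses flagged / missed
tn, fp = 190, 10  # healthy fetuses correctly cleared / falsely flagged

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
positive_lr = sensitivity / (1 - specificity)  # likelihood ratio of a positive test

print(sensitivity, specificity, round(positive_lr, 1))
```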

  5. Clubfoot: the treatment outcome using quantitative assessment of deformity.

    PubMed

    Rasit, Ah; Rasit, Ah; Azani, H; Zabidah, Pa; Merikan, A; Nur Alyana, Ba

    2012-06-01

    The recent trend in management of congenital idiopathic clubfoot is towards conservative treatment. This study reviews the outcomes of treatment in our practice using the quantitative clubfoot assessment of the deformity (QCAD). Thirty patients (38 cases of clubfoot) with congenital idiopathic clubfoot treated at Sarawak General Hospital were followed up for a mean of 3.6 years. The quantitative assessment consists of limb anthropometric measurement and the Pirani deformity severity score. There were 15 boys and 15 girls, with a mean age of 4.4 years (range, 13 months - 8 years). Most patients were of the Malay race (67%), followed by Chinese (23%) and others (10%). Eight patients had bilateral congenital idiopathic clubfoot (33%), 12 had left unilateral (40%) and 10 had right unilateral (27%) involvement. Of the 30 patients, 12 were treated conservatively with serial casting and 18 were treated surgically after resistance to serial casting at the age of nine months. At follow-up, in patients with unilateral clubfoot there were significant differences (p < 0.05) between the surgical and conservative groups in the mean difference in mid-leg circumference (2.57 ± 1.45 versus 0.7 ± 0.81) and in foot length discrepancy (0.86 ± 0.36 versus 0.34 ± 0.35). There was no significant difference between groups in the Pirani score, leg length discrepancy, or mean difference in mid-foot circumference. In summary, there were significant differences in calf atrophy and foot length discrepancy between surgically and conservatively treated clubfoot patients. Conservative treatment of clubfoot is the preferred method, while surgical treatment may be necessary in more resistant cases. Keywords: clubfoot, outcome, treatment, quantitative assessment, deformity.

  6. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for the identification of skin-sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard x exposure". Accordingly, the use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used. Copyright © 2016 Elsevier Inc. All rights reserved.
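
The QRA arithmetic implied by "risk = hazard x exposure" can be sketched as follows: an acceptable exposure level (AEL) is derived by dividing a no-expected-sensitization induction level (NESIL) by sensitization assessment factors (SAFs), then compared with the consumer exposure level (CEL). All values here are hypothetical placeholders, not from any actual safety assessment.

```python
# Hedged sketch of skin sensitization QRA arithmetic (hypothetical values).
nesil = 2500.0            # ug/cm^2, hazard characterization (hypothetical)
saf_interindividual = 10.0
saf_matrix = 3.0
saf_use = 3.0

# AEL: acceptable exposure level after applying the assessment factors.
ael = nesil / (saf_interindividual * saf_matrix * saf_use)
cel = 20.0                # ug/cm^2/day, assessed consumer exposure (hypothetical)

acceptable = cel <= ael   # risk characterization: exposure vs. acceptable level
print(round(ael, 1), acceptable)
```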

  7. Short Course Introduction to Quantitative Mineral Resource Assessments

    USGS Publications Warehouse

    Singer, Donald A.

    2007-01-01

    This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems, so that the consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers in evaluating the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or a similar decision-maker. Some of the tools and models presented here will be useful for the selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments, in which general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of tonnages and grades of well-explored deposits serve as models of the grades and tonnages of undiscovered deposits, and the number of undiscovered deposits is estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: the kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral resources.
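
The combination step of a three-part assessment is typically done by Monte Carlo simulation: draw a number of undiscovered deposits from the probabilistic estimate, then a tonnage for each draw from the grade-tonnage model. A hedged sketch with a hypothetical deposit-number pmf and lognormal tonnage parameters:

```python
import random

# Illustrative Monte Carlo for combining estimates (not the author's code):
# a hypothetical pmf for the number of undiscovered deposits and a hypothetical
# lognormal tonnage model yield a distribution of total undiscovered tonnage.
random.seed(1)

number_pmf = {0: 0.2, 1: 0.5, 2: 0.3}  # P(N undiscovered deposits), hypothetical
mu, sigma = 2.0, 1.0                   # ln-tonnage mean and sd, hypothetical

def draw_total_tonnage():
    r, cum = random.random(), 0.0
    n = max(number_pmf)                # fallback for floating-point edge cases
    for k, p in sorted(number_pmf.items()):
        cum += p
        if r <= cum:
            n = k
            break
    return sum(random.lognormvariate(mu, sigma) for _ in range(n))

totals = [draw_total_tonnage() for _ in range(10000)]
mean_total = sum(totals) / len(totals)  # expected total undiscovered tonnage
print(round(mean_total, 1))
```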

  8. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis

    PubMed Central

    Mazel, Christian; Mitulescu, Anca

    2007-01-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to evaluate the interest of quantified radiographic analysis of lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon–Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantifying vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess evolution for all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated to normal sagittal alignment; vertebral parameters objectified adjacent level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion that was confirmed in 80% of cases, doubtful in 16% and pseudarthrosis seemed to occur in 4% (2) of cases. In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent level’s degeneration and a significant correlation between clinical outcome and fusion. Furthermore, criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with surgeon’s qualitative grading in 87% of cases. PMID:17216227

  9. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, which is designed to find out what the problems are and comprises three distinct steps, namely risk identification, risk estimation and risk evaluation; risk management is not covered in this paper, and ideally a fourth step would address the need for feedback and post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and extraction of several features, such as severity, duration, areal extent, and onset and end time; it also involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central Greece.
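
As a hedged sketch of the index (assuming the common formulation in which the initial RDI value is the ratio of accumulated precipitation to potential evapotranspiration over a hydrological year, and the standardized RDI normalizes ln(alpha) over the record; annual totals are hypothetical):

```python
import math
import statistics

# Illustrative RDI computation over a short hypothetical record.
precip = [420.0, 510.0, 300.0, 610.0, 280.0]  # annual precipitation (mm), hypothetical
pet    = [900.0, 880.0, 940.0, 860.0, 950.0]  # annual potential evapotranspiration (mm)

alpha = [p / e for p, e in zip(precip, pet)]  # initial RDI value per year
y = [math.log(a) for a in alpha]
# Standardized RDI: normalize ln(alpha) by the record's mean and sd.
rdi_st = [(v - statistics.mean(y)) / statistics.stdev(y) for v in y]

# Negative standardized values indicate drier-than-normal years.
print([round(v, 2) for v in rdi_st])
```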

  10. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  11. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  12. Role of computed tomography in quantitative assessment of emphysema

    PubMed Central

    Choromańska, Agnieszka; Macura, Katarzyna J.

    2012-01-01

    Summary Pulmonary emphysema, together with chronic bronchitis, is part of chronic obstructive pulmonary disease (COPD), which is one of the leading causes of death in the United States and worldwide. There are many methods to diagnose emphysema. Unfortunately, many of them, for example pulmonary function tests (PFTs), clinical signs and conventional radiology, usually detect emphysema only in its late stages, when a great portion of lung parenchyma has already been destroyed by the disease. Computed tomography (CT) allows for early detection of emphysema. CT also makes it possible to quantify the total amount of emphysema in the lungs, which is important in order to precisely estimate the severity of the disease. These abilities of CT are important in monitoring the course of the disease and in attempts to prevent its further progression. In this review we discuss currently available methods for imaging emphysema, with emphasis on quantitative assessment. To date, quantitative methods have not been widely used clinically; however, the initial results of several research studies on this subject are very encouraging. PMID:22802863
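
One widely used CT quantification approach in this literature is density masking: the emphysema index is the percentage of lung voxels below a fixed attenuation threshold such as -950 HU. A minimal sketch with hypothetical voxel values:

```python
# Illustrative density-mask computation (hypothetical voxel values, not patient data).
hu_values = [-980, -960, -940, -890, -970, -820, -955, -900]  # lung voxels (HU)
threshold = -950

# Emphysema index: percentage of voxels below the low-attenuation threshold.
emphysema_index = 100.0 * sum(1 for v in hu_values if v < threshold) / len(hu_values)
print(emphysema_index)  # 50.0
```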

  13. Extending the quantitative assessment of industrial risks to earthquake effects.

    PubMed

    Campedel, Michela; Cozzani, Valerio; Garcia-Agreda, Anita; Salzano, Ernesto

    2008-10-01

    In the general framework of quantitative methods for natural-technological (NaTech) risk analysis, a specific methodology was developed for assessing risks caused by hazardous substances released due to earthquakes. The contribution of accidental scenarios initiated by seismic events to the overall industrial risk was assessed in three case studies derived from the actual plant layout of existing oil refineries. Several specific vulnerability models for different equipment classes were compared and assessed. The effect of differing structural resistances of process equipment on the final risk results was also investigated. The main factors influencing the final risk values were the models for equipment vulnerability and the assumptions for the reference damage states of the process equipment. The analysis of the case studies showed that in seismic zones the additional risk deriving from damage caused by earthquakes may be more than an order of magnitude higher than that associated with internal failure causes. The critical equipment was determined to be mainly pressurized tanks, even though atmospheric tanks were more vulnerable to loss of containment. Failure of minor process equipment with a limited hold-up of hazardous substances (such as pumps) was shown to have limited influence on the final values of the risk increase caused by earthquakes.
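
Equipment vulnerability models of the kind compared above are often expressed as probit relations linking damage probability to peak ground acceleration; the probit form and the coefficients below are assumptions for illustration, not the paper's fitted models.

```python
import math

# Hedged sketch of a probit vulnerability model: P(damage) = Phi(Y - 5),
# with Y = k1 + k2 * ln(PGA). Coefficients k1, k2 are hypothetical.

def damage_probability(pga_g, k1=4.0, k2=1.5):
    y = k1 + k2 * math.log(pga_g)                             # probit value
    return 0.5 * (1.0 + math.erf((y - 5.0) / math.sqrt(2)))   # standard normal CDF

# Damage probability rises steeply with shaking intensity.
print(round(damage_probability(0.1), 3), round(damage_probability(0.5), 3))
```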

  14. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with a time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time to untenable conditions. Fire risk to life safety can thus be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
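
The consequence calculation described above compares two distributions; a minimal Monte Carlo sketch (with hypothetical normal distributions for total evacuation time and onset of untenable conditions) estimates the probability that occupants are still evacuating when conditions become untenable:

```python
import random

# Hedged sketch: P(evacuation time > onset of untenable conditions), with both
# times treated as probability distributions. Parameters are hypothetical.
random.seed(0)

trials = 100_000
unsafe = 0
for _ in range(trials):
    evac = random.gauss(300.0, 60.0)   # total evacuation time (s), incl. pre-movement
    onset = random.gauss(420.0, 90.0)  # onset of untenable conditions (s)
    if evac > onset:
        unsafe += 1

p_unsafe = unsafe / trials
print(round(p_unsafe, 3))
```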

  15. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and the effectiveness of protection mechanisms, and the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, an increasingly important issue for risk analysts.
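
A hedged sketch of the kind of risk and cost/benefit indices described (using a common decomposition, risk = attack probability x success probability x consequence; all scenario values are hypothetical, not from the authors' reference model):

```python
# Illustrative security-risk and ROI arithmetic (hypothetical scenario values).
scenarios = [
    # (annual attack probability, success probability given attack, consequence in EUR)
    (0.10, 0.6, 2_000_000.0),   # e.g. vandalism on wayside equipment
    (0.01, 0.3, 50_000_000.0),  # e.g. a more severe threat scenario
]

baseline_risk = sum(p * v * c for p, v, c in scenarios)  # expected annual loss

mitigation_cost = 400_000.0      # annualized cost of protection measures (hypothetical)
risk_reduction = 0.5             # assumed fractional reduction in vulnerability
benefit = baseline_risk * risk_reduction
roi = benefit / mitigation_cost  # cost/benefit index: > 1 favors the investment

print(round(baseline_risk), round(roi, 2))
```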

  16. Assessing the Reliability of Quantitative Imaging of Sm-153

    NASA Astrophysics Data System (ADS)

    Poh, Zijie; Dagan, Maáyan; Veldman, Jeanette; Trees, Brad

    2013-03-01

    Samarium-153 is used for palliation of, and has recently been investigated for therapy of, bone metastases. Patient-specific dosing of Sm-153 is based on quantitative single-photon emission computed tomography (SPECT) and requires knowledge of the accuracy and precision of image-based estimates of the in vivo activity distribution. Physical phantom studies are useful for estimating these in simple objects, but do not model realistic activity distributions. We are using realistic Monte Carlo simulations combined with a realistic digital phantom modeling human anatomy to assess the accuracy and precision of Sm-153 SPECT. Preliminary data indicate that we can simulate projection images and reconstruct them with compensation for various physical image-degrading factors, such as attenuation and scatter in the body as well as non-idealities in the imaging system, to provide realistic SPECT images.

  17. Quantitative assessment of scleroderma by surface wave technique.

    PubMed

    Zhang, Xiaoming; Osborn, Thomas G; Pittelkow, Mark R; Qiang, Bo; Kinnick, Randall R; Greenleaf, James F

    2011-01-01

    Scleroderma is a multisystem disease characterized by cutaneous and visceral fibrosis. Skin disease is both a disabling feature of scleroderma and a predictor of visceral involvement. The established method of skin assessment is the modified Rodnan skin score (MRSS), which relies on semi-quantitative manual skin scoring and is therefore subjective. We have developed a technique and system for assessing skin health by producing and analyzing surface waves in the skin to determine its viscoelastic properties. Viscoelasticity of human skin was measured in 30 healthy volunteers and 10 scleroderma patients at six anatomic sites. A small force, monitored by a force transducer, is applied to the skin using a ball-tipped device attached to a mechanical shaker. The skin motion is measured by a scanning laser vibrometer. The surface wave speed is measured by the phase gradient method, and the viscoelasticity is estimated inversely from the wave speed dispersion. A typical measurement of the surface wave speed is 3.25±0.19 m/s on the forearm of a volunteer at 200 Hz. Using the wave speed dispersion from 100 Hz to 400 Hz, the shear elasticity μ(1) and shear viscosity μ(2) on the forearm are estimated to be 7.86±1.86 kPa and 5.03±0.60 Pa·s, respectively. Statistical analyses suggest that there are significant differences in viscoelasticity between scleroderma patients and healthy subjects. Scleroderma can be effectively and quantitatively evaluated based on human skin viscoelasticity. Copyright © 2010 IPEM. Published by Elsevier Ltd. All rights reserved.
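
    The phase gradient method mentioned above recovers wave speed from the phase delay accumulated over a known propagation distance; a minimal sketch with illustrative values (not the study's data):

```python
import math

# Phase gradient sketch: a surface wave of frequency f whose phase lags by
# dphi over a distance dr travels at c = omega * dr / dphi.
# The numbers below are illustrative, chosen to match the order of
# magnitude reported for the forearm.

def wave_speed(freq_hz, dr_m, dphi_rad):
    """Surface wave speed from the phase gradient."""
    return 2 * math.pi * freq_hz * dr_m / dphi_rad

# A 200 Hz wave lagging ~2.32 rad over 6 mm travels ~3.25 m/s.
print(wave_speed(200.0, 6e-3, 2.32))
```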

  18. Assessing the mechanical properties of tissue-mimicking phantoms at different depths as an approach to measure biomechanical gradient of crystalline lens

    PubMed Central

    Wang, Shang; Aglyamov, Salavat; Karpiouk, Andrei; Li, Jiasong; Emelianov, Stanislav; Manns, Fabrice; Larin, Kirill V.

    2013-01-01

    We demonstrate the feasibility of using the dominant frequency of the sample surface response to a mechanical stimulation as an effective indicator for sensing the depthwise distribution of elastic properties in transparent layered phantom samples simulating the cortex and nucleus of the crystalline lens. Focused ultrasound waves are used to noninvasively interrogate the sample surface. A phase-sensitive optical coherence tomography system is utilized to capture the surface dynamics over time with nanometer scale sensitivity. Spectral analysis is performed on the sample surface response to ultrasound stimulation and the dominant frequency is calculated under particular loading parameters. Pilot experiments were conducted on homogeneous and layered tissue-mimicking phantoms. Results indicate that the mechanical layers located at different depths introduce different frequencies to the sample surface response, which are correlated with the depth-dependent elasticity of the sample. The duration and the frequency of the ultrasound excitation are also investigated for their influences on this spectrum-based detection. This noninvasive method may be potentially applied for localized and rapid assessment of the depth dependence of the mechanical properties of the crystalline lens. PMID:24409379
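
    Extracting a dominant frequency from a measured surface response can be sketched with a plain FFT; the signal and sampling rate below are synthetic stand-ins, not the experiment's parameters:

```python
import numpy as np

# Dominant-frequency extraction sketch: synthesize a surface response with
# a strong 800 Hz component and a weaker 150 Hz component, then take the
# peak of the magnitude spectrum.
fs = 10_000.0                        # sampling rate (Hz), assumed
t = np.arange(0, 0.1, 1 / fs)
signal = np.sin(2 * np.pi * 800 * t) + 0.3 * np.sin(2 * np.pi * 150 * t)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
dominant = freqs[np.argmax(spectrum)]
print(dominant)  # 800.0
```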

  19. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control side of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the temperature of the operated side was not significantly different from the pre-operative temperature (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326
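
    The paired comparison used above can be reproduced with a hand-computed paired t statistic; the temperatures below are made-up illustrative values, not the study's measurements:

```python
import math

# Paired t-test sketch for operated vs. control side temperatures (deg C).
# Data are invented for illustration only.
operated = [32.6, 32.1, 32.9, 31.8, 32.4, 32.7]
control  = [32.2, 31.9, 32.5, 31.5, 32.0, 32.3]

diffs = [a - b for a, b in zip(operated, control)]
n = len(diffs)
mean = sum(diffs) / n                                   # mean side difference
sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
t_stat = mean / (sd / math.sqrt(n))                     # paired t statistic
print(round(mean, 3), round(t_stat, 2))  # 0.35 10.25
```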

  20. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  1. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  3. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems available today can implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
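
    A toy version of the tokenization and multiset-similarity idea described above (this is our own sketch, not the authors' SCCS implementation) might look like:

```python
import re
from collections import Counter

# Each equation is split into tokens (identifiers, numbers, operators) and
# two responses are compared as bags (multisets) of tokens, Dice-style.

def tokenize(equation):
    return re.findall(r"[A-Za-z]+|\d+|[+\-*/=()]", equation)

def similarity(eq_a, eq_b):
    """Overlap between two token multisets: 1.0 means structurally identical."""
    a, b = Counter(tokenize(eq_a)), Counter(tokenize(eq_b))
    overlap = sum((a & b).values())
    return 2 * overlap / (sum(a.values()) + sum(b.values()))

print(similarity("2*x+3=7", "2*x=7-3"))  # high, but below 1.0
print(similarity("2*x+3=7", "2*x+3=7"))  # 1.0
```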

  4. Dermal sensitization quantitative risk assessment (QRA) for fragrance ingredients.

    PubMed

    Api, Anne Marie; Basketter, David A; Cadby, Peter A; Cano, Marie-France; Ellis, Graham; Gerberick, G Frank; Griem, Peter; McNamee, Pauline M; Ryan, Cindy A; Safford, Robert

    2008-10-01

    Based on chemical, cellular, and molecular understanding of dermal sensitization, an exposure-based quantitative risk assessment (QRA) can be conducted to determine safe use levels of fragrance ingredients in different consumer product types. The key steps are: (1) determination of benchmarks (no expected sensitization induction level (NESIL)); (2) application of sensitization assessment factors (SAF); and (3) calculation of the consumer exposure level (CEL) through product use. Using these parameters, an acceptable exposure level (AEL) can be calculated and compared with the CEL. The ratio of AEL to CEL must be favorable to support safe use of the potential skin sensitizer, and it must be calculated for the fragrance ingredient in each product type. Based on the Research Institute for Fragrance Materials, Inc. (RIFM) Expert Panel's recommendation, RIFM and the International Fragrance Association (IFRA) have adopted the dermal sensitization QRA approach described in this review for fragrance ingredients identified as potential dermal sensitizers. This now forms the fragrance industry's core strategy for primary prevention of dermal sensitization to these materials in consumer products. The methodology is used to determine global fragrance industry product management practices (IFRA Standards) for fragrance ingredients that are potential dermal sensitizers. This paper describes the principles of the recommended approach, provides a detailed review of all the information used in the dermal sensitization QRA approach for fragrance ingredients, and presents key conclusions for its use now and refinement in the future.
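
    The arithmetic of the QRA steps above is straightforward: the AEL is the NESIL divided by the combined sensitization assessment factors, and it is then compared with the CEL. A sketch with illustrative numbers (the function name and all values are ours, not RIFM's):

```python
# QRA arithmetic sketch: AEL = NESIL / (product of SAFs), then compare
# with the CEL. All numeric values below are hypothetical.

def acceptable_exposure_level(nesil_ug_cm2, sa_factors):
    combined_saf = 1
    for f in sa_factors:
        combined_saf *= f
    return nesil_ug_cm2 / combined_saf

nesil = 900.0                  # ug/cm2, hypothetical benchmark (NESIL)
safs = (10, 3, 3)              # e.g. inter-individual, matrix, use pattern
ael = acceptable_exposure_level(nesil, safs)
cel = 5.0                      # ug/cm2/day from the product-use calculation
print(ael, ael / cel >= 1)     # safe use is supported when AEL/CEL >= 1
```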

  5. A quantitative method for assessing the quality of meibomian glands.

    PubMed

    Koprowski, Robert; Wilczyński, Sławomir; Olczyk, Paweł; Nowińska, Anna; Węglarz, Beata; Wylęgała, Edward

    2016-08-01

    Meibomian gland dysfunction is a common cause of dry eye syndrome and can also lead to eyelid inflammation. Today, diagnosis of meibomian glands is not yet fully automatic and is based on a qualitative assessment made by an ophthalmologist. Therefore, this article proposes a new automatic analysis method which provides a quantitative assessment of meibomian gland dysfunction. The new algorithm involves a sequence of operations: image acquisition (acquisition of data from the OCULUS Keratograph® 5M); image pre-processing (conversion to gray levels, median filtering, removal of uneven lighting, normalization); and main image processing (binarization, morphological opening, labeling, Gaussian filtering, skeletonization, distance transform, watersheds). The algorithm was implemented in Matlab with the Image Processing Toolbox (Matlab version 7.11.0.584, R2010b) on a PC running Windows 7 Professional, 64-bit, with an Intel Core i7-4960X CPU @ 3.60 GHz. The algorithm described in this article has the following features: it is fully automatic, provides fully reproducible results (sensitivity of 99.3% and specificity of 97.5% in the diagnosis of meibomian glands), and is insensitive to parameter changes. The time of image analysis for a single subject does not exceed 0.5 s. Currently, the presented algorithm is being tested in the Railway Hospital in Katowice, Poland. Copyright © 2016 Elsevier Ltd. All rights reserved.
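
    A loose sketch of the first part of this processing sequence, using scipy.ndimage on a synthetic image in place of the authors' Matlab toolchain (all parameters and the test image are illustrative):

```python
import numpy as np
from scipy import ndimage as ndi

# Synthetic grayscale image with two bright "gland" stripes on a noisy,
# unevenly lit background. Not the authors' data or code.
rng = np.random.default_rng(0)
img = rng.normal(0.1, 0.02, (64, 64))
img[10:20, 8:56] += 0.5
img[30:40, 8:56] += 0.5

smooth = ndi.median_filter(img, size=3)              # median filtering
background = ndi.gaussian_filter(smooth, sigma=15)   # uneven-lighting estimate
flat = smooth - background                           # lighting removal
norm = (flat - flat.min()) / (flat.max() - flat.min())  # normalization

binary = norm > 0.5                                  # binarization
opened = ndi.binary_opening(binary, iterations=2)    # morphological opening
labels, n_glands = ndi.label(opened)                 # labeling
print(n_glands)
```

The remaining steps (Gaussian filtering, skeletonization, distance transform, watersheds) have direct scipy/scikit-image counterparts and are omitted for brevity.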

  6. The potential optical coherence tomography in tooth bleaching quantitative assessment

    NASA Astrophysics Data System (ADS)

    Ni, Y. R.; Guo, Z. Y.; Shu, S. Y.; Zeng, C. C.; Zhong, H. Q.; Chen, B. L.; Liu, Z. M.; Bao, Y.

    2011-12-01

    In this paper, we report the outcomes of a pilot study using an OCT functional imaging method to evaluate and quantify color alteration in human teeth in vitro. Images of the dental tissues, untreated and treated with 35% hydrogen peroxide, were obtained by an OCT system at a 1310 nm central wavelength. One parameter for the quantification of optical properties from OCT measurements is introduced in our study: the attenuation coefficient (μ). During the tooth bleaching process, the attenuation coefficient decreased significantly (p < 0.001) in dentine and increased significantly (p < 0.001) in enamel. From the experimental results, it is found that the attenuation coefficient could be useful for assessing color alteration of human tooth samples. OCT has the potential to become an effective tool for the assessment of tooth bleaching, and our experiment offers a new method to evaluate color change in the visible region by quantitative analysis of infrared-region information from OCT.
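
    Attenuation coefficients are commonly extracted from OCT depth profiles by fitting an exponential decay; the abstract does not give the paper's exact procedure, so the following single-scattering sketch, I(z) = I0 * exp(-2*mu*z), is a generic assumption:

```python
import numpy as np

# Generic attenuation-coefficient fit on a synthetic, noise-free OCT
# depth profile. mu_true and all other values are illustrative.
mu_true = 1.5                      # mm^-1
z = np.linspace(0.1, 1.0, 50)      # depth (mm)
intensity = 0.8 * np.exp(-2 * mu_true * z)

# Log-linear least squares: ln I = ln I0 - 2*mu*z
slope, intercept = np.polyfit(z, np.log(intensity), 1)
mu_est = -slope / 2
print(mu_est)  # ~1.5 mm^-1
```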

  7. An Assessment of the Quantitative Literacy of Undergraduate Students

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2016-01-01

    Quantitative literacy (QLT) represents an underlying higher-order construct that accounts for a person's willingness to engage in quantitative situations in everyday life. The purpose of this study is to retest the construct validity of a model of quantitative literacy (Wilkins, 2010). In this model, QLT represents a second-order factor that…

  8. Melorheostosis mimicking synovial osteochondromatosis.

    PubMed

    Wadhwa, Vibhor; Chhabra, Avneesh; Samet, Jonathan D

    2014-01-01

    Melorheostosis is an uncommon, sporadic, sclerosing bone lesion that may affect the adjacent soft tissues. It has been associated with many entities such as osteopoikilosis, soft tissue vascular malformations, bone and soft tissue tumors, nephrotic syndrome, segmental limb contractures, osteosarcoma, desmoid tumor, and mesenteric fibromatosis. Synovial osteochondromatosis is a benign neoplasia of the hyaline cartilage presenting as nodules in the subsynovial tissue of a joint or tendon sheath. The intra-articular extension of melorheostosis mimicking synovial osteochondromatosis has not been reported before. In this article, the authors describe an unusual case mimicking synovial chondromatosis arising as a result of melorheostosis and their characteristic imaging findings.

  9. Readability of Wikipedia Pages on Autoimmune Disorders: Systematic Quantitative Assessment

    PubMed Central

    Bragazzi, Nicola Luigi; Brigo, Francesco; Sharif, Kassem; Amital, Howard; McGonagle, Dennis; Shoenfeld, Yehuda; Adawi, Mohammad

    2017-01-01

    Background In the era of new information and communication technologies, the Internet is being increasingly accessed for health-related information. Indeed, recently published patient surveys of people with autoimmune disorders confirmed that the Internet was reported as one of the most important health information sources. Wikipedia, a free online encyclopedia launched in 2001, is generally one of the most visited websites worldwide and is often consulted for health-related information. Objective The main objective of this investigation was to quantitatively assess whether the Wikipedia pages related to autoimmune disorders can be easily accessed by patients and their families, in terms of readability. Methods We obtained and downloaded a list of autoimmune disorders from the American Autoimmune Related Diseases Association (AARDA) website. We analyzed Wikipedia articles for their overall level of readability with 6 different quantitative readability scales: (1) the Flesch Reading Ease, (2) the Gunning Fog Index, (3) the Coleman-Liau Index, (4) the Flesch-Kincaid Grade Level, (5) the Automated Readability Index (ARI), and (6) the Simple Measure of Gobbledygook (SMOG). Further, we investigated the correlation between readability and clinical, pathological, and epidemiological parameters. Moreover, each Wikipedia analysis was assessed according to its content, breaking down the readability indices by main topic of each part (namely, pathogenesis, treatment, diagnosis, and prognosis plus a section containing paragraphs not falling into any of the previous categories). Results We retrieved 134 diseases from the AARDA website. The Flesch Reading Ease yielded a mean score of 24.34 (SD 10.73), indicating that the sites were very difficult to read and best understood by university graduates, while mean Gunning Fog Index and ARI scores were 16.87 (SD 2.03) and 14.06 (SD 2.12), respectively. The Coleman-Liau Index and the Flesch-Kincaid Grade Level yielded mean scores of 14

  10. Readability of Wikipedia Pages on Autoimmune Disorders: Systematic Quantitative Assessment.

    PubMed

    Watad, Abdulla; Bragazzi, Nicola Luigi; Brigo, Francesco; Sharif, Kassem; Amital, Howard; McGonagle, Dennis; Shoenfeld, Yehuda; Adawi, Mohammad

    2017-07-18

    In the era of new information and communication technologies, the Internet is being increasingly accessed for health-related information. Indeed, recently published patient surveys of people with autoimmune disorders confirmed that the Internet was reported as one of the most important health information sources. Wikipedia, a free online encyclopedia launched in 2001, is generally one of the most visited websites worldwide and is often consulted for health-related information. The main objective of this investigation was to quantitatively assess whether the Wikipedia pages related to autoimmune disorders can be easily accessed by patients and their families, in terms of readability. We obtained and downloaded a list of autoimmune disorders from the American Autoimmune Related Diseases Association (AARDA) website. We analyzed Wikipedia articles for their overall level of readability with 6 different quantitative readability scales: (1) the Flesch Reading Ease, (2) the Gunning Fog Index, (3) the Coleman-Liau Index, (4) the Flesch-Kincaid Grade Level, (5) the Automated Readability Index (ARI), and (6) the Simple Measure of Gobbledygook (SMOG). Further, we investigated the correlation between readability and clinical, pathological, and epidemiological parameters. Moreover, each Wikipedia analysis was assessed according to its content, breaking down the readability indices by main topic of each part (namely, pathogenesis, treatment, diagnosis, and prognosis plus a section containing paragraphs not falling into any of the previous categories). We retrieved 134 diseases from the AARDA website. The Flesch Reading Ease yielded a mean score of 24.34 (SD 10.73), indicating that the sites were very difficult to read and best understood by university graduates, while mean Gunning Fog Index and ARI scores were 16.87 (SD 2.03) and 14.06 (SD 2.12), respectively. The Coleman-Liau Index and the Flesch-Kincaid Grade Level yielded mean scores of 14.48 (SD 1.57) and 14.86 (1
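
    The Flesch Reading Ease and Flesch-Kincaid Grade Level used above have standard closed-form definitions; a direct implementation (word, sentence, and syllable counts would come from a text tokenizer in practice, and the example counts here are invented):

```python
# Standard published formulas for two of the readability indices.

def flesch_reading_ease(words, sentences, syllables):
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

# A dense 100-word passage with 4 sentences and 190 syllables scores in
# the "very difficult" band, comparable to the means reported above.
print(round(flesch_reading_ease(100, 4, 190), 2))   # 20.72
print(round(flesch_kincaid_grade(100, 4, 190), 2))  # 16.58
```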

  11. Comparison of biorelevant simulated media mimicking the intestinal environment to assess the solubility profiles of poorly soluble drugs.

    PubMed

    Prasad, Dev; Gu, Chong-Hui; Kuldipkumar, Anuj

    2016-01-01

    During the discovery stage in lead identification/optimization, compounds are characterized for their solubilities in biorelevant media and these data are often used to model the in vivo behavior of the compounds and predict the fraction absorbed. These media are selected to closely approximate the composition of human intestinal fluid. Owing to the complexity and variability in human intestinal fluid composition, it is essential that the chosen simulated media mimic the in vivo condition as closely as possible. Several recipes have been developed and are routinely used in assessing the solubilities of compounds. It is necessary to revisit these recipes and modify them as the understanding of the human GI tract increases. In the present work, we have evaluated the solubilities of six model compounds in several media and have proposed slight modifications to the currently used recipes based on our own data and that reported in the literature.

  12. Supersonic transient magnetic resonance elastography for quantitative assessment of tissue elasticity

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Liu, Jingfei; Fite, Brett Z.; Foiret, Josquin; Ilovitsh, Asaf; Leach, J. Kent; Dumont, Erik; Caskey, Charles F.; Ferrara, Katherine W.

    2017-05-01

    Non-invasive, quantitative methods to assess the properties of biological tissues are needed for many therapeutic and tissue engineering applications. Magnetic resonance elastography (MRE) has historically relied on external vibration to generate periodic shear waves. In order to focally assess a biomaterial or to monitor the response to ablative therapy, the interrogation of a specific region of interest by a focused beam is desirable and transient MRE (t-MRE) techniques have previously been developed to accomplish this goal. Also, strategies employing a series of discrete ultrasound pulses directed to increasing depths along a single line-of-sight have been designed to generate a quasi-planar shear wave. Such ‘supersonic’ excitations have been applied for ultrasound elasticity measurements. The resulting shear wave is higher in amplitude than that generated from a single excitation and the properties of the media are simply visualized and quantified due to the quasi-planar wave geometry and the opportunity to generate the wave at the site of interest. Here for the first time, we extend the application of supersonic methods by developing a protocol for supersonic transient magnetic resonance elastography (sst-MRE) using an MR-guided focused ultrasound system capable of therapeutic ablation. We apply the new protocol to quantify tissue elasticity in vitro using biologically-relevant inclusions and tissue-mimicking phantoms, compare the results with elasticity maps acquired with ultrasound shear wave elasticity imaging (US-SWEI), and validate both methods with mechanical testing. We found that a modified time-of-flight (TOF) method efficiently quantified shear modulus from sst-MRE data, and both the TOF and local inversion methods result in similar maps based on US-SWEI. With a three-pulse excitation, the proposed sst-MRE protocol was capable of visualizing quasi-planar shear waves propagating away from the excitation location and detecting differences in shear
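
    A time-of-flight estimate of the kind mentioned above can be sketched as a linear fit of lateral position against shear-wave arrival time, with shear modulus mu = rho * c^2; all values below are synthetic:

```python
import numpy as np

# Time-of-flight (TOF) shear-wave speed sketch with noise-free synthetic
# arrival times; rho and c_true are illustrative soft-tissue values.
rho = 1000.0                        # kg/m^3
c_true = 2.0                        # m/s
x = np.linspace(0.0, 0.02, 11)      # lateral positions (m)
t_arrival = x / c_true              # arrival times (s)

c_est = np.polyfit(t_arrival, x, 1)[0]  # slope dx/dt = wave speed
mu = rho * c_est ** 2                   # shear modulus (Pa)
print(c_est, mu)  # ~2 m/s, ~4 kPa
```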

  13. A quantitative risk assessment model for Salmonella and whole chickens.

    PubMed

    Oscar, Thomas P

    2004-06-01

    Existing data and predictive models were used to define the input settings of a previously developed but modified quantitative risk assessment model (QRAM) for Salmonella and whole chickens. The QRAM was constructed in an Excel spreadsheet and was simulated using @Risk. The retail-to-table pathway was modeled as a series of unit operations and associated pathogen events that included initial contamination at retail, growth during consumer transport, thermal inactivation during cooking, cross-contamination during serving, and dose response after consumption. Published data as well as predictive models for growth and thermal inactivation of Salmonella were used to establish input settings. Noncontaminated chickens were simulated so that the QRAM could predict changes in the incidence of Salmonella contamination. The incidence of Salmonella contamination changed from 30% at retail to 0.16% after cooking to 4% at consumption. Salmonella growth on chickens during consumer transport was the only pathogen event that did not impact the risk of salmonellosis. For the scenario simulated, the QRAM predicted 0.44 cases of salmonellosis per 100,000 consumers, which was consistent with recent epidemiological data that indicate a rate of 0.66-0.88 cases of salmonellosis per 100,000 consumers of chicken. Although the QRAM was in agreement with the epidemiological data, surrogate data and models were used, assumptions were made, and potentially important unit operations and pathogen events were not included because of data gaps and thus, further refinement of the QRAM is needed.
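
    The structure of such a farm-to-fork simulation can be sketched as follows; every parameter value and the beta-Poisson dose-response coefficients are made up for illustration and are not the paper's QRAM inputs:

```python
import random

# Toy Monte Carlo over the retail-to-table unit operations described above.
random.seed(1)
N = 100_000
total_risk = 0.0
for _ in range(N):
    if random.random() >= 0.30:                    # 30% contaminated at retail
        continue
    log_cfu = random.uniform(0.0, 2.0)             # initial load (log10 CFU)
    log_cfu += random.uniform(0.0, 0.3)            # growth during transport
    log_cfu -= random.uniform(5.0, 8.0)            # thermal inactivation
    dose = 10 ** log_cfu
    total_risk += 1 - (1 + dose / 51.45) ** -0.1324  # P(illness | dose)

cases = 100_000 * total_risk / N
print(cases)  # expected cases per 100,000 servings
```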

  14. Quantitative assessment of brain volumes in fish: comparison of methodologies.

    PubMed

    Ullmann, Jeremy F P; Cowin, Gary; Collin, Shaun P

    2010-01-01

    When correlating brain areas with behavioral and environmental characteristics, a variety of techniques are employed. In fishes (elasmobranchs and teleosts), 2 methods, histology and the idealized ellipsoid and/or half-ellipsoid technique, are primarily used to calculate the volume of a brain area and therefore its relationship to social or ecological complexity. In this study on a perciform teleost, we have quantitatively compared brain volumes obtained using the conventional techniques of histology and approximating brain volume to an idealized ellipsoid (or half ellipsoid) and magnetic resonance imaging, an established clinical tool typically used for assessing brain volume in other vertebrates. Our results indicate that, when compared to brain volumes measured using magnetic resonance imaging of brain regions in situ, variations in brain shape and histological artifacts can lead to significant differences in brain volume, especially in the telencephalon and optic tecta. Consequently, in comparative studies of brain volumes, we advise caution when using the histological and/or ellipsoid methods to make correlations between brain area size and environmental, behavioral and social characteristics and, when possible, we propose the use of magnetic resonance imaging. Copyright © 2010 S. Karger AG, Basel.
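
    The idealized ellipsoid/half-ellipsoid convention referred to above approximates a region's volume from its measured length, width, and height; the example region and dimensions are invented:

```python
import math

# Ellipsoid approximation of a brain region's volume from its three
# linear dimensions (semi-axes are half of each measurement).

def ellipsoid_volume(L, W, H, half=False):
    v = (4.0 / 3.0) * math.pi * (L / 2) * (W / 2) * (H / 2)
    return v / 2 if half else v

# e.g. a region measured at 6 x 4 x 3 mm, modeled as a half-ellipsoid:
print(round(ellipsoid_volume(6, 4, 3, half=True), 2))  # 18.85 mm^3
```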

  15. A quantitative assessment of chemical perturbations in thermotropic cyanobiphenyls.

    PubMed

    Guerra, Sebastiano; Dutronc, Thibault; Terazzi, Emmanuel; Guénée, Laure; Piguet, Claude

    2016-05-25

    Chemical programming of the temperature domains of existence of liquid crystals is greatly desired by both academic workers and industrial partners. This contribution proposes to combine empirical approaches, which rely on systematic chemical substitutions of mesogenic molecules followed by thermal characterizations, with a rational thermodynamic assessment of the effects induced by chemical perturbations. Taking into account the similarities which exist between temperature-dependent cohesive Gibbs free energy densities (CFEDs) and pressure-temperature phase diagrams modeled with the Clapeyron equation, chemical perturbations are considered as pressure increments along phase boundaries, which control the thermotropic liquid crystalline properties. Taking the familiar calamitic amphiphilic cyanobiphenyl-type mesogens as models, the consequences of (i) methyl substitution of the aromatic polar heads and (ii) connections of bulky silyl groups at the termini of the apolar flexible alkyl chain on the melting and clearing temperatures are quantitatively analyzed. Particular efforts were focused on the translation of the thermodynamic rationalization into a predictive tool accessible to synthetic chemists mainly interested in designing liquid crystals with specific technological applications.
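
    For reference, the Clapeyron equation invoked in this analogy relates the slope of a phase boundary to the transition enthalpy and volume change:

```latex
\frac{\mathrm{d}P}{\mathrm{d}T} = \frac{\Delta H}{T\,\Delta V}
```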

  16. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times-mutually dependent in successive steps in the chain-cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
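
    A minimal discrete-event flavor of this point: with a first-in-first-out shelf, storage times emerge from the replenishment and ordering mechanism rather than being sampled independently (the rates below are invented, not from the lettuce case study):

```python
import random
from collections import deque

# FIFO retail shelf: units arrive in daily batches and the oldest stock is
# sold first, so each unit's storage time depends on queue state, not on
# an independent draw. Illustrative rates only.
random.seed(0)
shelf = deque()
storage_times = []
for day in range(1, 366):
    for _ in range(10):                      # daily replenishment: 10 units
        shelf.append(day)
    for _ in range(random.randint(8, 12)):   # stochastic daily demand
        if shelf:
            arrived = shelf.popleft()        # oldest stock sold first
            storage_times.append(day - arrived)

print(sum(storage_times) / len(storage_times))  # mean shelf time (days)
```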

  17. Quantitative risk assessment of foods containing peanut advisory labeling.

    PubMed

    Remington, Benjamin C; Baumert, Joseph L; Marx, David B; Taylor, Steve L

    2013-12-01

    Foods with advisory labeling (i.e. "may contain") continue to be prevalent and the warning may be increasingly ignored by allergic consumers. We sought to determine the residual levels of peanut in various packaged foods bearing advisory labeling, compare similar data from 2005 and 2009, and determine any potential risk for peanut-allergic consumers. Of food products bearing advisory statements regarding peanut or products that had peanut listed as a minor ingredient, 8.6% and 37.5% contained detectable levels of peanut (>2.5 ppm whole peanut), respectively. Peanut-allergic individuals should be advised to avoid such products regardless of the wording of the advisory statement. Peanut was detected at similar rates and levels in products tested in both 2005 and 2009. Advisory labeled nutrition bars contained the highest levels of peanut and an additional market survey of 399 products was conducted. Probabilistic risk assessment showed the risk of a reaction to peanut-allergic consumers from advisory labeled nutrition bars was significant but brand-dependent. Peanut advisory labeling may be overused on some nutrition bars but prudently used on others. The probabilistic approach could provide the food industry with a quantitative method to assist with determining when advisory labeling is most appropriate. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Assessment of hair surface roughness using quantitative image analysis.

    PubMed

    Park, K H; Kim, H J; Oh, B; Lee, E; Ha, J

    2017-07-19

    Attention to the hair and hair cuticle is increasing. The hair cuticle is the first layer to be exposed to damage and the area of primary protection. For this reason, hair product manufacturers consider cuticle protection important. However, previous studies used only visual assessment to examine the cuticle. This study aimed to observe changes in the cuticle and measure hair roughness using a HIROX microscope. A total of 23 female subjects used the same products daily for 4 weeks. Three hair samples per subject were collected from three different areas of the head. Measurements were taken before and after 4 weeks of daily product use. The hair surface changes were clearly observed on the captured images. Moreover, hair surface roughness was quantified using various parameters in the HIROX software. After 4 weeks of daily product use, the roughness parameter value of the hair surface was significantly decreased. Our results suggest that this analytical method for hair roughness using HIROX can be a new paradigm for high-quality quantitative analysis of the hair cuticle. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
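
    Typical surface roughness parameters are the arithmetic-mean (Ra) and root-mean-square (Rq) deviations of the profile; the HIROX software's exact definitions are not given above, so treat this as the generic form, with invented profile heights:

```python
import math

# Generic roughness parameters over a measured height profile.

def ra(profile):
    """Arithmetic-mean deviation from the mean line."""
    mean = sum(profile) / len(profile)
    return sum(abs(z - mean) for z in profile) / len(profile)

def rq(profile):
    """Root-mean-square deviation from the mean line."""
    mean = sum(profile) / len(profile)
    return math.sqrt(sum((z - mean) ** 2 for z in profile) / len(profile))

before = [0.0, 0.4, -0.3, 0.5, -0.6, 0.2]   # cuticle heights (um), invented
after  = [0.0, 0.2, -0.1, 0.2, -0.3, 0.1]   # smoother after product use
print(ra(before) > ra(after))  # True
```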

  19. [Quantitative methods of cancer risk assessment in exposure to chemicals].

    PubMed

    Szymczak, Wiesław

    2009-01-01

    This is a methodology paper: it reviews and compares different quantitative risk assessment methods. Two aspects of cancer risk modeling are discussed here: 1. When there is only one effective dose. Two models were compared in this evaluation: one proposed by the Dutch Expert Committee on Occupational Standards and the other a classical two-stage model. Both models take into account that the animals were exposed for less than two years; the Dutch methodology additionally considers the exposure period and the study period of the animals. If we use as the exposure measure an average lifespan dose estimated with different coefficients of exposure time in an experiment, we get two different dose-response models, and each of them will create a different human risk model. There is no criterion that would let us assess which of them is better. 2. Many models are used in the benchmark dose (BMD) method, but there is no criterion that allows us to choose the best model objectively. In this paper, a classical two-stage model and three BMD models (two-stage, Weibull and linear) were fitted to particular data. Very small differences between all the models were noticed; the differences were insignificant because of uncertainties in the risk modeling. The possibility of choosing one model from a larger set of models is the greatest benefit of this comparison. If the examined chemical is a genotoxic carcinogen, nothing more is needed than to estimate the threshold value.
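
    As a worked illustration of the BMD idea, the background-corrected Weibull model admits a closed-form benchmark dose: the dose at which extra risk over background equals the chosen benchmark response (BMR). The parameter values in the usage note are illustrative, not fitted to the paper's data.

    ```python
    import math

    def weibull_extra_risk(d, b, k):
        """Extra risk over background for the background-corrected Weibull
        dose-response model: ER(d) = 1 - exp(-b * d**k)."""
        return 1.0 - math.exp(-b * d ** k)

    def weibull_bmd(bmr, b, k):
        """Benchmark dose: the dose at which extra risk equals bmr,
        obtained by inverting ER(d)."""
        return (-math.log(1.0 - bmr) / b) ** (1.0 / k)
    ```

    The linear model is the special case k = 1; for example, weibull_bmd(0.1, b=0.05, k=1.2) returns the dose at which extra risk reaches 10%.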

  20. Quantitative Assessment of Islets of Langerhans Encapsulated in Alginate

    PubMed Central

    Johnson, Amy S.; O'Sullivan, Esther; D'Aoust, Laura N.; Omer, Abdulkadir; Bonner-Weir, Susan; Fisher, Robert J.; Weir, Gordon C.

    2011-01-01

    Improved methods have recently been developed for assessing islet viability and quantity in human islet preparations for transplantation, and these measurements have proven useful for predicting transplantation outcome. The objectives of this study were to adapt these methods for use with microencapsulated islets, to verify that they provide meaningful quantitative measurements, and to test them with two model systems: (1) barium alginate and (2) barium alginate containing a 70% (w/v) perfluorocarbon (PFC) emulsion, which presents challenges to use of these assays and is of interest in its own right as a means for reducing oxygen supply limitations to encapsulated tissue. Mitochondrial function was assessed by oxygen consumption rate measurements, and the analysis of data was modified to account for the increased solubility of oxygen in the PFC-alginate capsules. Capsules were dissolved and tissue recovered for nuclei counting to measure the number of cells. Capsule volume was determined from alginate or PFC content and used to normalize measurements. After low oxygen culture for 2 days, islets in normal alginate lost substantial viable tissue and displayed necrotic cores, whereas most of the original oxygen consumption rate was recovered with PFC alginate, and little necrosis was observed. All nuclei were recovered with normal alginate, but some nuclei from nonrespiring cells were lost with PFC alginate. Biocompatibility tests revealed toxicity at the islet periphery associated with the lipid emulsion used to provide surfactants during the emulsification process. We conclude that these new assay methods can be applied to islets encapsulated in materials as complex as PFC-alginate. Measurements made with these materials revealed that enhancement of oxygen permeability of the encapsulating material with a concentrated PFC emulsion improves survival of encapsulated islets under hypoxic conditions, but reformulation of the PFC emulsion is needed to reduce toxicity.

  1. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    ABSTRACT Molecular and activity‐based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity‐based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2‐EphA3 ki/ki, Isl2‐EphA3 ki/+, ephrin‐A2,A3,A5 triple knock‐out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2‐EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin‐A2,A3,A5 TKO phenotype, suggesting either an incomplete knock‐out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  2. Quantitative assessment of hip osteoarthritis based on image texture analysis.

    PubMed

    Boniatis, I S; Costaridou, L I; Cavouras, D A; Panagiotopoulos, E C; Panayiotakis, G S

    2006-03-01

    A non-invasive method was developed to investigate the potential capacity of digital image texture analysis in evaluating the severity of hip osteoarthritis (OA) and in monitoring its progression. Nineteen textural features evaluating patterns of pixel intensity fluctuations were extracted from 64 images of radiographic hip joint spaces (HJS), corresponding to 32 patients with verified unilateral or bilateral OA. Images were enhanced employing custom developed software for the delineation of the articular margins on digitized pelvic radiographs. The severity of OA for each patient was assessed by expert orthopaedists employing the Kellgren and Lawrence (KL) scale. Additionally, an index expressing HJS-narrowing was computed considering patients from the unilateral OA-group. A textural feature that quantified pixel distribution non-uniformity (grey level non-uniformity, GLNU) demonstrated the strongest correlation with the HJS-narrowing index among all extracted features and was utilized in further analysis. Classification rules employing the GLNU feature were introduced to characterize a hip as normal or osteoarthritic and to assign it to one of three severity categories, formed in accordance with the KL scale. Application of the proposed rules resulted in relatively high classification accuracies in characterizing a hip as normal or osteoarthritic (90.6%) and in assigning it to the correct KL scale category (88.9%). Furthermore, the strong correlation between the HJS-narrowing index and the pathological GLNU (r = -0.9, p<0.001) was utilized to provide percentages quantifying hip OA-severity. Texture analysis may contribute to the quantitative assessment of OA-severity, to the monitoring of OA-progression and to the evaluation of a chondroprotective therapy.
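
    The intuition behind GLNU can be sketched briefly: concentrating pixel counts in a few grey levels raises the statistic, while spreading them uniformly lowers it. The histogram form below is a simplification; run-length-based definitions, as typically used in texture analysis, aggregate runs rather than single pixels.

    ```python
    from collections import Counter

    def glnu(pixels):
        """Grey-level non-uniformity of a pixel sample: the sum over grey
        levels of the squared count of that level, divided by the total
        number of pixels. Uniformly distributed grey levels give the
        minimum value."""
        counts = Counter(pixels)
        return sum(c * c for c in counts.values()) / len(pixels)
    ```

    Four pixels spread over four grey levels give 1.0, while four pixels in a single grey level give 4.0.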

  3. Quantitative Assessment of Molecular Dynamics Sampling for Flexible Systems.

    PubMed

    Nemec, Mike; Hoffmann, Daniel

    2017-02-14

    Molecular dynamics (MD) simulation is a natural method for the study of flexible molecules but at the same time is limited by the large size of the conformational space of these molecules. We ask by how much the MD sampling quality for flexible molecules can be improved by two means: the use of diverse sets of trajectories starting from different initial conformations to detect deviations between samples and sampling with enhanced methods such as accelerated MD (aMD) or scaled MD (sMD) that distort the energy landscape in controlled ways. To this end, we test the effects of these approaches on MD simulations of two flexible biomolecules in aqueous solution, Met-Enkephalin (5 amino acids) and HIV-1 gp120 V3 (a cycle of 35 amino acids). We assess the convergence of the sampling quantitatively with known, extensive measures of cluster number Nc and cluster distribution entropy Sc and with two new quantities, conformational overlap Oconf and density overlap Odens, both conveniently ranging from 0 to 1. These new overlap measures quantify self-consistency of sampling in multitrajectory MD experiments, a necessary condition for converged sampling. A comprehensive assessment of sampling quality of MD experiments identifies the combination of diverse trajectory sets and aMD as the most efficient approach among those tested. However, analysis of Odens between conventional and aMD trajectories also reveals that we have not completely corrected aMD sampling for the distorted energy landscape. Moreover, for V3, the courses of Nc and Odens indicate that much higher resources than those generally invested today will probably be needed to achieve convergence. The comparative analysis also shows that conventional MD simulations with insufficient sampling can be easily misinterpreted as being converged.
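
    The exact definitions of Oconf and Odens are given in the paper; one simple density-overlap construction with the same 0-to-1 range is the histogram overlap, the sum over bins of min(p, q), between two trajectories' distributions of a collective variable. The scalar collective variable and the binning below are assumptions for illustration.

    ```python
    from collections import Counter

    def density_overlap(sample_a, sample_b, bin_width=0.5):
        """Histogram overlap between two samples of a scalar collective
        variable: sum over bins of min(p_bin, q_bin). Returns 1.0 for
        identical distributions and 0.0 for disjoint ones."""
        def hist(sample):
            counts = Counter(int(x // bin_width) for x in sample)
            n = len(sample)
            return {b: c / n for b, c in counts.items()}
        p, q = hist(sample_a), hist(sample_b)
        return sum(min(p.get(b, 0.0), q.get(b, 0.0)) for b in set(p) | set(q))
    ```

    Applied to independent trajectories of the same system, a value well below 1 flags the self-inconsistency that the paper treats as a sign of unconverged sampling.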

  4. Quantitative assessment of the effectiveness of a rockfall warning system

    NASA Astrophysics Data System (ADS)

    Bründl, Michael; Sättele, Martina; Krautblatter, Michael; Straub, Daniel

    2016-04-01

    Rockslides and rockfalls can pose high risk to human settlements and traffic infrastructure. In addition to structural mitigation measures like rockfall nets, warning systems are increasingly installed to reduce rockfall risks. Whereas structured evaluation methods exist for structural mitigation measures, which reduce the spatial extent of events, few approaches are known for assessing the effectiveness of warning systems. Especially for higher magnitude rockfalls, structural mitigation measures are not effective, and reliable early warning systems will be essential in the future. In response to that, we developed a classification and a framework to assess the reliability and effectiveness of early warning systems (Sättele et al., 2015a, 2016). Here, we demonstrate an application for the rockfall warning system installed in Preonzo prior to a major rockfall in May 2012 (Sättele et al., 2015b). We show that it is necessary to design such a warning system as a fail-safe construction, which has to incorporate components with low failure probabilities, high redundancy, low warning thresholds, and additional control systems. With a hypothetical probabilistic analysis, we investigate the effect of the risk attitude of decision makers and of the number of sensors on the probability of detecting an event and initiating a timely evacuation, as well as on the related intervention cost. We conclude that it is possible to quantitatively assess the effectiveness of warning systems, which helps to optimize mitigation strategies against rockfall events. References Sättele, M., Bründl, M., and Straub, D.: Reliability and effectiveness of warning systems for natural hazards: concept and application to debris flow warning, Rel. Eng. Syst. Safety, 142, 192-202, 2015a. Sättele, M., Krautblatter, M., Bründl, M., and Straub, D.: Forecasting rock slope failure: How reliable and effective are warning systems?, Landslides, 605, 1-14, 2015b. Sättele, M., Bründl, M., and

  5. Quantitative risk assessment for a glass fiber insulation product.

    PubMed

    Fayerweather, W E; Bender, J R; Hadley, J G; Eastes, W

    1997-04-01

    California Proposition 65 (Prop65) provides a mechanism by which the manufacturer may perform a quantitative risk assessment to be used in determining the need for cancer warning labels. This paper presents a risk assessment under this regulation for professional and do-it-yourself insulation installers. It determines the level of insulation glass fiber exposure (specifically Owens Corning's R-25 PinkPlus with Miraflex) that, assuming a working lifetime exposure, poses no significant cancer risk under Prop65's regulations. "No significant risk" is defined under Prop65 as a lifetime risk of no more than one additional cancer case per 100,000 exposed persons, and nonsignificant exposure is defined as a working lifetime exposure associated with "no significant risk." This determination can be carried out despite the fact that the relevant underlying studies (i.e., chronic inhalation bioassays) of comparable glass wool fibers do not show tumorigenic activity. Nonsignificant exposures are estimated from (1) the most recent RCC chronic inhalation bioassay of nondurable fiberglass in rats; (2) intraperitoneal fiberglass injection studies in rats; (3) a distributional, decision analysis approach applied to four chronic inhalation rat bioassays of conventional fiberglass; (4) an extrapolation from the RCC chronic rat inhalation bioassay of durable refractory ceramic fibers; and (5) an extrapolation from the IOM chronic rat inhalation bioassay of durable E glass microfibers. When the EPA linear nonthreshold model is used, central estimates of nonsignificant exposure range from 0.36 fibers/cc (for the RCC chronic inhalation bioassay of fiberglass) through 21 fibers/cc (for the i.p. fiberglass injection studies). Lower 95% confidence bounds on these estimates vary from 0.17 fibers/cc through 13 fibers/cc. 
Estimates derived from the distributional approach or from applying the EPA linear nonthreshold model to chronic bioassays of durable fibers such as refractory ceramic fiber

  6. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12-fold in the UK and more than 20-fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question: what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. 
It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  7. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 x 10(-4) per year (as set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used.
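
    The chain from source-water concentration through treatment to annual risk can be sketched with an exponential dose-response model. The log reductions and the dose-response slope r below are illustrative placeholders, not the study's fitted values.

    ```python
    import math

    def annual_infection_risk(raw_oocysts_per_litre,
                              log_reductions,      # one entry per treatment stage
                              daily_intake_l=1.0,
                              r=0.004):            # exponential dose-response slope; illustrative
        """Simplified QMRA chain: treated concentration -> daily dose ->
        daily infection probability (exponential model) -> annual risk."""
        treated = raw_oocysts_per_litre * 10.0 ** (-sum(log_reductions))
        daily_dose = treated * daily_intake_l
        p_daily = 1.0 - math.exp(-r * daily_dose)
        return 1.0 - (1.0 - p_daily) ** 365
    ```

    With coagulation/flocculation, sedimentation and filtration each contributing a log reduction, the sketch stays below a 1 x 10(-4) annual benchmark; dropping the treatment stages drives the risk up sharply, mirroring the sensitivity of consumer risk to treatment failure.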

  8. A poultry-processing model for quantitative microbiological risk assessment.

    PubMed

    Nauta, Maarten; van der Fels-Klerx, Ine; Havelaar, Arie

    2005-02-01

    A poultry-processing model for a quantitative microbiological risk assessment (QMRA) of campylobacter is presented, which can also be applied to other QMRAs involving poultry processing. The same basic model is applied in each consecutive stage of industrial processing. It describes the effects of inactivation and removal of the bacteria, and the dynamics of cross-contamination in terms of the transfer of campylobacter from the intestines to the carcass surface and the environment, from the carcasses to the environment, and from the environment to the carcasses. From the model it can be derived that, in general, the effect of inactivation and removal is dominant for those carcasses with high initial bacterial loads, and cross-contamination is dominant for those with low initial levels. In other QMRA poultry-processing models, the input-output relationship between the numbers of bacteria on the carcasses is usually assumed to be linear on a logarithmic scale. By including some basic mechanistics, it is shown that this may not be realistic. As nonlinear behavior may affect the predicted effects of risk mitigations, this finding is relevant for risk management. Good knowledge of the variability of bacterial loads on poultry entering the process is important. The common practice in microbiology of presenting only the geometric mean of bacterial counts is insufficient: arithmetic means are more suitable, in particular for describing the effect of cross-contamination. The effects of logistic slaughter (scheduled processing) as a risk mitigation strategy are predicted to be small. Some additional complications in applying microbiological data obtained in processing plants are discussed.
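
    The stage model described above can be sketched as a mass balance per processing stage: part of the carcass load is inactivated or removed, part is transferred to the environment, and part of the environmental load moves back onto the carcass. The parameter values in the test are illustrative, not the paper's fitted values.

    ```python
    def process_stage(carcass, environment, p_inact, p_to_env, p_from_env):
        """One processing stage: returns the new (carcass, environment)
        bacterial loads after inactivation/removal and two-way transfer."""
        surviving = carcass * (1.0 - p_inact)          # inactivation and removal
        to_env = surviving * p_to_env                  # carcass -> environment
        from_env = environment * p_from_env            # environment -> carcass
        return surviving - to_env + from_env, environment - from_env + to_env
    ```

    Running this with a high and a low initial carcass load reproduces the qualitative result above: inactivation dominates the heavily contaminated carcass, while cross-contamination from the environment dominates the lightly contaminated one.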

  9. Quantitative Assessment of Visceral Obesity and Postoperative Colon Cancer Outcomes

    PubMed Central

    Ozoya, Oluwatobi. O.; Siegel, Erin M.; Srikumar, Thejal; Bloomer, Amanda M.; DeRenzis, Amanda; Shibata, David

    2017-01-01

    Background Quantitative computed tomography (CT) assessment of visceral adiposity may be superior to body mass index (BMI) as a predictor of surgical morbidity. We sought to examine the association of CT measures of obesity and BMI with short-term post-operative outcomes in colon cancer patients. Methods In this retrospective study, 110 patients treated with colectomy for Stage I–III colon cancer were classified as obese or non-obese by pre-operative CT-based measures of adiposity or BMI [obese: BMI ≥ 30 kg/m2, visceral fat area (VFA) to subcutaneous fat area ratio (V/S) ≥ 0.4, and VFA > 100 cm2]. Post-operative morbidity and mortality rates were compared. Results Obese patients, by V/S and VFA but not BMI, were more likely to be male and have pre-existing hypertension and diabetes. The overall complication rate was 25.5% and there were no mortalities. Obese patients by VFA (with a trend for V/S but not BMI) were more likely to develop postoperative complications as compared to patients classified as non-obese; VFA (30.5% vs. 10.7%, p = 0.03), V/S (29.2% vs. 9.5%, p = 0.05) and BMI (32.4% vs. 21.9%, p = 0.23). Conclusions Elevated visceral obesity quantified by CT is associated with the presence of key metabolic comorbidities and increased post-operative morbidity and may be superior to BMI for risk stratification. PMID:28101721

  10. Quantitative Assessment of Visceral Obesity and Postoperative Colon Cancer Outcomes.

    PubMed

    Ozoya, Oluwatobi O; Siegel, Erin M; Srikumar, Thejal; Bloomer, Amanda M; DeRenzis, Amanda; Shibata, David

    2017-03-01

    Quantitative computed tomography (CT) assessment of visceral adiposity may be superior to body mass index (BMI) as a predictor of surgical morbidity. We sought to examine the association of CT measures of obesity and BMI with short-term postoperative outcomes in colon cancer patients. In this retrospective study, 110 patients treated with colectomy for stage I-III colon cancer were classified as obese or non-obese by preoperative CT-based measures of adiposity or BMI [obese: BMI ≥ 30 kg/m(2), visceral fat area (VFA) to subcutaneous fat area ratio (V/S) ≥0.4, and VFA > 100 cm(2)]. Postoperative morbidity and mortality rates were compared. Obese patients, by V/S and VFA but not BMI, were more likely to be male and have preexisting hypertension and diabetes. The overall complication rate was 25.5%, and there were no mortalities. Obese patients by VFA (with a trend for V/S but not BMI) were more likely to develop postoperative complications as compared to patients classified as non-obese: VFA (30.5 vs. 10.7%, p = 0.03), V/S (29.2 vs. 9.5%, p = 0.05), and BMI (32.4 vs. 21.9%, p = 0.23). Elevated visceral obesity quantified by CT is associated with the presence of key metabolic comorbidities and increased postoperative morbidity and may be superior to BMI for risk stratification.

  11. Quantitative assessment of dictionary-based protein named entity tagging.

    PubMed

    Liu, Hongfang; Hu, Zhang-Zhi; Torii, Manabu; Wu, Cathy; Friedman, Carol

    2006-01-01

    Natural language processing (NLP) approaches have been explored to manage and mine information recorded in biological literature. A critical step for biological literature mining is biological named entity tagging (BNET) that identifies names mentioned in text and normalizes them with entries in biological databases. The aim of this study was to provide quantitative assessment of the complexity of BNET on protein entities through BioThesaurus, a thesaurus of gene/protein names for UniProt knowledgebase (UniProtKB) entries that was acquired using online resources. We evaluated the complexity through several perspectives: ambiguity (i.e., the number of genes/proteins represented by one name), synonymy (i.e., the number of names associated with the same gene/protein), and coverage (i.e., the percentage of gene/protein names in text included in the thesaurus). We also normalized names in BioThesaurus and measures were obtained twice, once before normalization and once after. The current version of BioThesaurus has over 2.6 million names or 2.1 million normalized names covering more than 1.8 million UniProtKB entries. The average synonymy is 3.53 (2.86 after normalization), ambiguity is 2.31 before normalization and 2.32 after, while the coverage is 94.0% based on the BioCreAtive data set comprising MEDLINE abstracts containing genes/proteins. The study indicated that names for genes/proteins are highly ambiguous and there are usually multiple names for the same gene or protein. It also demonstrated that most gene/protein names appearing in text can be found in BioThesaurus.
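
    The ambiguity and synonymy measures can be computed directly from a name-to-entries mapping; the toy thesaurus in the usage note is hypothetical, not BioThesaurus data. Coverage would additionally require a tagged corpus: the fraction of gene/protein mentions in text found in the thesaurus.

    ```python
    from collections import defaultdict

    def thesaurus_stats(name_to_entries):
        """Average ambiguity (entries per name) and average synonymy
        (names per entry) for a mapping from names to sets of entry IDs."""
        entry_to_names = defaultdict(set)
        for name, entries in name_to_entries.items():
            for entry in entries:
                entry_to_names[entry].add(name)
        ambiguity = sum(len(v) for v in name_to_entries.values()) / len(name_to_entries)
        synonymy = sum(len(v) for v in entry_to_names.values()) / len(entry_to_names)
        return ambiguity, synonymy
    ```

    For a toy mapping {"p53": {"E1"}, "TP53": {"E1"}, "tumour protein p53": {"E1", "E2"}}, the average ambiguity is 4/3 and the average synonymy is 2.0.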

  12. A Quantitative Assessment Method for Ascaris Eggs on Hands

    PubMed Central

    Jeandron, Aurelie; Ensink, Jeroen H. J.; Thamsborg, Stig M.; Dalsgaard, Anders; Sengupta, Mita E.

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different ages, lower levels of contamination and various levels of hand cleanliness. PMID:24802859

  13. A quantitative assessment method for Ascaris eggs on hands.

    PubMed

    Jeandron, Aurelie; Ensink, Jeroen H J; Thamsborg, Stig M; Dalsgaard, Anders; Sengupta, Mita E

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride CPC 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g. including people of different ages, lower levels of contamination and various levels of hand cleanliness.

  14. Quantitative Assessment of Water Resources Adaptation Policies in Mediterranean Europe

    NASA Astrophysics Data System (ADS)

    Garrote, L. M.; Mediero, L.; Martin-Carrasco, F.

    2011-12-01

    Many factors challenge water management in Southern Europe: scarce water resources, climate change, population growth, environmental concerns and economic development, among others. Water policy in the region is designed to ensure future sustainability of water resources under strong socioeconomic forcing while maintaining the strategic ecological and social services of water. Climate change is projected to intensify these conflicts, since most models agree that Southern Europe will show a significant drying trend, especially during the second half of the century. For this reason, there is a strong need to integrate climate change adaptation into implementation of the EU Water Framework Directive. From the policy perspective, there are many studies on how climate change might lead to changes in hydrologic regime, water demands, water quality or ecosystems, but there is little knowledge of how much of the water demand might be met under the future hydrologic regime. In water scarce regions, water demands are supplied by means of hydraulic infrastructure, which performs functions of storage, transportation and distribution, to overcome the spatio-temporal irregularities of the hydrologic regime. Knowledge of the relationship between natural water resources, reservoir storage and water demands is essential to assess the effectiveness of alternative policy options to ensure adequate public water supply. In this paper we provide a simple way to account for the influence of socioeconomic factors (hydraulic infrastructure and water policy) on climate change impacts on water resources in the Mediterranean region. We present a methodology to identify and evaluate climate change adaptation policies in this context. The methodology is based on the application of the WAAPA (Water Availability and Adaptation Policy Assessment) model, which computes net water availability for consumptive use for a river basin taking into account the regulation capacity of its water supply system and a set of

  15. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  17. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCMs) is developed to quantify the framing assumptions in the assessment stage of an HIA, and is then applied to a housing intervention (tightening insulation) as a case study. Framing assumptions of the case study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in an HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity were identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage.
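
    The FCM computation described above can be sketched as iterating concept activations through a signed weight matrix with a squashing function until a fixed point is reached. The concepts and edge weights below are illustrative assumptions for a housing-insulation map, not the study's actual model:

```python
import numpy as np

def fcm_converge(weights, state, steps=100, tol=1e-6):
    """Iterate a fuzzy cognitive map: state' = sigmoid(weights @ state)."""
    for _ in range(steps):
        new_state = 1.0 / (1.0 + np.exp(-weights @ state))  # logistic squashing
        if np.max(np.abs(new_state - state)) < tol:
            return new_state
        state = new_state
    return state

# Hypothetical concepts: [air-tightness, ventilation, indoor air quality, health]
W = np.array([
    [ 0.0, 0.0, 0.0, 0.0],
    [-0.7, 0.0, 0.0, 0.0],  # tighter envelope reduces ventilation
    [ 0.0, 0.6, 0.0, 0.0],  # ventilation improves indoor air quality
    [ 0.0, 0.0, 0.8, 0.0],  # indoor air quality drives the health outcome
])
state = fcm_converge(W, np.array([1.0, 0.5, 0.5, 0.5]))
```

    Ranking concepts by how strongly perturbing their activation shifts the health node yields the kind of influence ordering the abstract reports.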

  18. A potential quantitative method for assessing individual tree performance

    Treesearch

    Lance A. Vickers; David R. Larsen; Daniel C. Dey; John M. Kabrick; Benjamin O. Knapp

    2014-01-01

    By what standard should a tree be judged? This question, perhaps unknowingly, is posed almost daily by practicing foresters. Unfortunately, there are few cases in which clearly defined quantitative (i.e., directly measurable) references have been established in forestry. A lack of common references may be an unnecessary source of error in silvicultural application and...

  19. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton1 and Carey N. Pope2
    1US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    2Department of...

  20. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  1. Assessing the Impact of a Quantitative Skills Course for Undergraduates

    ERIC Educational Resources Information Center

    Andersen, Kristi; Harsell, Dana Michael

    2005-01-01

    This paper evaluates the long-term benefits of a Syracuse University course offering, "Maxwell 201: Quantitative Methods for the Social Sciences" (MAX 201). The authors analyze data collected from class-administered pre- and post-tests and from a questionnaire sent to a random sample of MAX 201 alumni to evaluate the extent to which…

  4. Assessing Student Teachers' Reflective Writing through Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Poldner, Eric; Van der Schaaf, Marieke; Simons, P. Robert-Jan; Van Tartwijk, Jan; Wijngaards, Guus

    2014-01-01

    Students' reflective essay writing can be stimulated by the formative assessments provided to them by their teachers. Such assessments contain information about the quality of students' reflective writings and offer suggestions for improvement. Despite the importance of formatively assessing students' reflective writings in teacher education…

  6. Quantitative assessment of soft tissue deformation using digital speckle pattern interferometry: studies on phantom breast models.

    PubMed

    Karuppanan, Udayakumar; Unni, Sujatha Narayanan; Angarai, Ganesan R

    2017-01-01

    Assessment of the mechanical properties of soft matter is a challenging task in a purely noninvasive and noncontact environment. As tissue mechanical properties play a vital role in determining tissue health status, such noninvasive methods offer great potential in framing large-scale medical screening strategies. The digital speckle pattern interferometry (DSPI)-based image capture and analysis system described here is capable of extracting the deformation information from a single acquired fringe pattern. Such a method of analysis is required because of the highly dynamic nature of speckle patterns derived from soft tissues under mechanical compression. Soft phantoms mimicking breast tissue optical and mechanical properties were fabricated and tested in the DSPI out-of-plane configuration setup. A Hilbert transform (HT)-based image analysis algorithm was developed to extract the phase and corresponding deformation of the sample from a single acquired fringe pattern. The experimental fringe contours were found to correlate with numerically simulated deformation patterns of the sample using Abaqus finite element analysis software. The deformation extracted from the experimental fringe pattern using the HT-based algorithm was compared with the deformation value obtained by numerical simulation under similar loading conditions, and the results were found to correlate with an average error of 10%. The proposed method was applied to breast phantoms fabricated with an included subsurface anomaly mimicking cancerous tissue, and the results are analyzed.
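
    The single-fringe HT analysis can be sketched in one dimension: form the analytic signal of the background-subtracted fringe intensity and take its angle as the wrapped phase. The synthetic fringe below is an illustrative assumption; the study's algorithm operates on two-dimensional interferograms.

```python
import numpy as np

def analytic_signal(x):
    """FFT-based analytic signal (the construction behind scipy.signal.hilbert)."""
    n = len(x)
    spectrum = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spectrum * h)

def fringe_phase(intensity):
    """Wrapped phase of a background-subtracted fringe signal I ~ A*cos(phi)."""
    return np.angle(analytic_signal(intensity - np.mean(intensity)))

# Synthetic carrier fringes with a smooth "deformation" phase term
x = np.linspace(0.0, 1.0, 2048)
true_phase = 2 * np.pi * 40 * x + 3.0 * np.sin(2 * np.pi * x)
recovered = np.unwrap(fringe_phase(np.cos(true_phase)))
```

    After unwrapping, the phase maps to out-of-plane deformation through the interferometer's sensitivity (a scale factor of wavelength over 4π in the usual out-of-plane geometry).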

  7. Urticaria mimickers in children.

    PubMed

    Mathur, Anubhav N; Mathes, Erin F

    2013-01-01

    Acute urticaria is a self-limited cutaneous condition marked by transient, erythematous, and pruritic wheals. It is a hypersensitivity response that is often secondary to infection, medications, or food allergies in children. In contrast, the urticarial "mimickers" described in this review article are often seen in the context of fever and extracutaneous manifestations in pediatric patients. The differential diagnosis ranges from benign and self-limited hypersensitivity responses to multisystem inflammatory diseases. Establishing the correct diagnosis of an urticarial rash in a pediatric patient is necessary to both prevent an unnecessary work up for self-limited conditions and to appropriately recognize and evaluate multisystem inflammatory disorders. Herein, we describe two cases to illustrate the clinical manifestations, laboratory findings, histopathology and differential diagnoses for several mimickers of acute urticaria including: urticaria multiforme, serum sickness like reaction, Henoch-Schönlein purpura, acute hemorrhagic edema of infancy, systemic onset juvenile idiopathic arthritis, cryopyrin associated periodic syndromes, and urticarial vasculitis. © 2013 Wiley Periodicals, Inc.

  8. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed, with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement, especially at facilities with high levels of background nanoparticles.

  9. Quantitative phylogenetic assessment of microbial communities in diverse environments

    SciTech Connect

    von Mering, C.; Hugenholtz, P.; Raes, J.; Tringe, S.G.; Doerks,T.; Jensen, L.J.; Ward, N.; Bork, P.

    2007-01-01

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. Here, we use a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative and accurate picture of community composition than traditional rRNA-based approaches using polymerase chain reaction (PCR). By mapping marker genes from four diverse environmental data sets onto a reference species phylogeny, we show that certain communities evolve faster than others, determine preferred habitats for entire microbial clades, and provide evidence that such habitat preferences are often remarkably stable over time.

  10. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  11. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    ERIC Educational Resources Information Center

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory…

  13. Quantitative Assessment of Countermeasure Efficacy for Long-Term Space Missions

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.

    2000-01-01

    This slide presentation reviews the development of quantitative assessments of the effectiveness of countermeasures (CM) for the effects of space travel on humans for long term space missions. An example of bone mineral density (BMD) is examined to show specific quantitative measures for failure and success.

  14. Diagnostic accuracy of stress perfusion CMR in comparison with quantitative coronary angiography: fully quantitative, semiquantitative, and qualitative assessment.

    PubMed

    Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E

    2014-01-01

    This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). Fully quantitative stress perfusion CMR has high diagnostic accuracy for
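
    The sensitivity and specificity figures above come from dichotomizing the continuous perfusion measure at the ROC-derived threshold and tabulating against the angiographic reference. A generic sketch with toy values (assumed data, not the study's):

```python
def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity when score < threshold flags disease.

    labels: 1 = significant stenosis (positive), 0 = no stenosis.
    Here a *low* endocardial/epicardial flow ratio is the abnormal finding.
    """
    tp = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

# Toy flow ratios (endocardial/epicardial) and stenosis labels
ratios = [0.30, 0.45, 0.55, 0.80, 0.55, 0.95, 0.60, 0.35]
stenosis = [1, 1, 0, 0, 1, 0, 0, 1]
sens, spec = sens_spec(ratios, stenosis, threshold=0.50)
```

    Sweeping the threshold over its range and plotting sensitivity against (1 − specificity) traces the ROC curve from which the optimum operating point is read.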

  15. Disordered Speech Assessment Using Automatic Methods Based on Quantitative Measures

    NASA Astrophysics Data System (ADS)

    Gu, Lingyun; Harris, John G.; Shrivastav, Rahul; Sapienza, Christine

    2005-12-01

    Speech quality assessment methods are necessary for evaluating and documenting treatment outcomes of patients suffering from degraded speech due to Parkinson's disease, stroke, or other disease processes. Subjective methods of speech quality assessment are more accurate and more robust than objective methods but are time-consuming and costly. We propose a novel objective measure of speech quality assessment that builds on traditional speech processing techniques such as dynamic time warping (DTW) and the Itakura-Saito (IS) distortion measure. Initial results show that our objective measure correlates well with the more expensive subjective methods.
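
    Of the two classic techniques named, DTW is the simpler to sketch: a dynamic program that finds the minimum-cost monotone alignment between two sequences. The toy contours below are illustrative scalars; the study additionally applies the Itakura-Saito distortion to spectral features, which this sketch does not reproduce.

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Dynamic-time-warping distance between two sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    # D[i][j]: cost of the best alignment of a[:i] with b[:j]
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# A time-stretched copy of a contour aligns perfectly under DTW
ref = [0, 1, 2, 3, 2, 1, 0]
stretched = [0, 0, 1, 2, 3, 2, 1, 0]
print(dtw_distance(ref, stretched))  # → 0.0
```

    Timing differences between disordered and reference speech thus do not inflate the distance; only genuine spectral deviations do.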

  16. Quantitative Assessment of Neurite Outgrowth in PC12 Cells

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity. In order to identify potential developmental neurotoxicants, assessment of critical neurodevelopmental processes such as neuronal differenti...

  18. Quantitative risk assessment: an emerging tool for emerging foodborne pathogens.

    PubMed Central

    Lammerding, A. M.; Paoli, G. M.

    1997-01-01

    New challenges to the safety of the food supply require new strategies for evaluating and managing food safety risks. Changes in pathogens, food preparation, distribution, and consumption, and population immunity have the potential to adversely affect human health. Risk assessment offers a framework for predicting the impact of changes and trends on the provision of safe food. Risk assessment models facilitate the evaluation of active or passive changes in how foods are produced, processed, distributed, and consumed. PMID:9366601

  19. Direct, quantitative clinical assessment of hand function: usefulness and reproducibility.

    PubMed

    Goodson, Alexander; McGregor, Alison H; Douglas, Jane; Taylor, Peter

    2007-05-01

    Methods of assessing functional impairment in arthritic hands include pain assessments and disability scoring scales, which are subjective, variable over time, and fail to take account of patients' need to adapt to deformities. The aim of this study was to evaluate measures of functional strength and joint motion in the assessment of the rheumatoid (RA) and osteoarthritic (OA) hand. Ten control subjects, ten RA patients, and ten OA patients were recruited for the study. All underwent pain and disability scoring and functional assessment of the hand using measures of pinch/grip strength and range of joint motion (ROM). Functional assessments, including ROM analyses at the interphalangeal (IP), metacarpophalangeal (MCP) and wrist joints along with pinch/grip strength, clearly discriminated between patient groups (RA vs. OA MCP ROM, P<0.0001), whereas pain and disability scales did not. In the RA patients there were demonstrable relationships between ROM measurements and disability (R2=0.31) as well as disease duration (R2=0.37). Intra-patient measures of strength were robust, whereas inter-patient comparisons showed variability. In conclusion, pinch/grip strength and ROM are clinically reproducible assessments that may more accurately reflect the functional impairment associated with arthritis.

  20. Quantitative Assessment of Neuromotor Function in Adolescents with High Functioning Autism and Asperger Syndrome

    ERIC Educational Resources Information Center

    Freitag, Christine M.; Kleser, Christina; Schneider, Marc; von Gontard, Alexander

    2007-01-01

    Background: Motor impairment in children with Asperger Syndrome (AS) or High functioning autism (HFA) has been reported previously. This study presents results of a quantitative assessment of neuromotor skills in 14-22 year old HFA/AS. Methods: 16 HFA/AS and 16 IQ-matched controls were assessed by the Zurich Neuromotor Assessment (ZNA). Results:…

  1. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    USDA-ARS?s Scientific Manuscript database

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  2. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling:• SDMProjectBuilder (which includes the Microbial Source Module as part...

  3. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  4. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  8. A quantitative assessment of Arctic shipping in 2010-2014.

    PubMed

    Eguíluz, Victor M; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M

    2016-08-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011-2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far.

  10. Assessing the Phagosome Proteome by Quantitative Mass Spectrometry.

    PubMed

    Peltier, Julien; Härtlova, Anetta; Trost, Matthias

    2017-01-01

    Phagocytosis is the process that engulfs particles in vesicles called phagosomes that are trafficked through a series of maturation steps, culminating in the destruction of the internalized cargo. Because phagosomes are in direct contact with the particle and undergo constant fusion and fission events with other organelles, characterization of the phagosomal proteome is a powerful tool to understand mechanisms controlling innate immunity as well as vesicle trafficking. The ability to isolate highly pure phagosomes through the use of latex beads led to an extensive use of proteomics to study phagosomes under different stimuli. Thousands of different proteins have been identified and quantified, revealing new properties and shedding new light on the dynamics and composition of maturing phagosomes and innate immunity mechanisms. In this chapter, we describe how quantitative-based proteomic methods such as label-free, dimethyl labeling or Tandem Mass Tag (TMT) labeling can be applied for the characterization of protein composition and translocation during maturation of phagosomes in macrophages.

  11. Uncertainty in environmental health impact assessment: quantitative methods and perspectives.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Vanni, Tazio; Foss, Anna M

    2013-01-01

    Environmental health impact assessment models are subjected to great uncertainty due to the complex associations between environmental exposures and health. Quantifying the impact of uncertainty is important if the models are used to support health policy decisions. We conducted a systematic review to identify and appraise current methods used to quantify the uncertainty in environmental health impact assessment. In the 19 studies meeting the inclusion criteria, several methods were identified. These were grouped into random sampling methods, second-order probability methods, Bayesian methods, fuzzy sets, and deterministic sensitivity analysis methods. All 19 studies addressed the uncertainty in the parameter values but only 5 of the studies also addressed the uncertainty in the structure of the models. None of the articles reviewed considered conceptual sources of uncertainty associated with the framing assumptions or the conceptualisation of the model. Future research should attempt to broaden the way uncertainty is taken into account in environmental health impact assessments.
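
    The random-sampling family of methods listed first is the simplest to illustrate: draw each uncertain parameter from its distribution, propagate every draw through the model, and summarize the spread of outputs. The two-parameter exposure-response model below is hypothetical:

```python
import random
import statistics

def health_impact(exposure, slope):
    # Hypothetical linear exposure-response model (illustration only)
    return slope * exposure

random.seed(42)  # reproducible draws
samples = []
for _ in range(10_000):
    exposure = random.gauss(10.0, 2.0)  # uncertain exposure, mean 10 units
    slope = random.gauss(0.5, 0.1)      # uncertain dose-response slope
    samples.append(health_impact(exposure, slope))

ordered = sorted(samples)
mean = statistics.fmean(samples)
lo, hi = ordered[250], ordered[9750]  # approximate 95% interval
```

    Note that this addresses only parameter uncertainty; structural and framing uncertainty, which the review finds largely neglected, require varying the model form itself.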

  12. Role of Quantitative Bone Scanning in the Assessment of Bone Turnover in Patients With Charcot Foot

    PubMed Central

    Bem, Robert; Jirkovská, Alexandra; Dubský, Michal; Fejfarová, Vladimira; Buncová, Marie; Skibová, Jelena; Jude, Edward B.

    2010-01-01

    OBJECTIVE To assess the new quantitative bone scan parameters as markers of Charcot neuroosteoarthropathy (CNO) activity. RESEARCH DESIGN AND METHODS Forty-two patients with acute (n = 21) and nonacute (n = 21) CNO underwent quantitative bone scanning. Patients with acute CNO were followed for 3–12 months and bone scans were repeated after treatment. New quantitative parameters were assessed and compared with markers of bone turnover and with skin temperature difference (STD). RESULTS Significant correlations between quantitative bone scan parameters and bone turnover markers were observed (all P < 0.05). These parameters decreased after treatment of CNO, and its reduction to the baseline value correlated with differences of bone turnover markers and STD (all P < 0.05). CONCLUSIONS Our study suggests that bone scanning can be used not only for diagnosis of CNO but also for monitoring disease activity by quantitative bone scan parameters. PMID:19933988

  14. Challenges in Quantitative Risk Assessments of Space Systems

    NASA Astrophysics Data System (ADS)

    Hardy, Terry L.

    2010-09-01

    Government and commercial organizations throughout the world are developing and operating space launch vehicles and systems for the purposes of furthering exploration, delivering services, and facilitating commercial human spaceflight. The operation of launch vehicles and space systems involves safety risks to the crew, to flight participants, and to the uninvolved public. Therefore, it is imperative that comprehensive risk assessments be performed to characterize, evaluate, and reduce the risks of these endeavors. A System Safety process is often used to help identify and reduce risks, and that process typically includes qualitative risk assessments. While qualitative risk assessments are accepted practice, they can include significant uncertainties that lead to an underestimation of risk and overconfidence in the results. Factors not often considered in risk assessment include human biases, process failures, and organizational considerations. The failure to understand and address these factors can lead to poor risk decision making that may result in accidents and mishaps. The paper offers recommendations to address these factors and improve risk decision making.

  15. Developing a Quantitative Tool for Sustainability Assessment of HEIs

    ERIC Educational Resources Information Center

    Waheed, Bushra; Khan, Faisal I.; Veitch, Brian

    2011-01-01

    Purpose: Implementation of a sustainability paradigm demands new choices and innovative ways of thinking. The main objective of this paper is to provide a meaningful sustainability assessment tool for making informed decisions, which is applied to higher education institutions (HEIs). Design/methodology/approach: The objective is achieved by…

  16. Critical temperature: A quantitative method of assessing cold tolerance

    Treesearch

    D.H. DeHayes; M.W. Williams, Jr.

    1989-01-01

    Critical temperature (Tc), defined as the highest temperature at which freezing injury to plant tissues can be detected, provides a biologically meaningful and statistically defined assessment of the relative cold tolerance of plant tissues. A method is described for calculating critical temperatures in laboratory freezing studies that use...
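
    The Tc concept above lends itself to a small worked sketch. The function below is a simplified stand-in for the paper's statistically defined procedure: given a hypothetical injury index per test temperature and an unfrozen control, Tc is taken as the highest temperature whose injury exceeds the control by a margin (all values, names, and the `margin` rule are illustrative, not from the study).

```python
def critical_temperature(injury_by_temp, control, margin):
    """Highest temperature (degrees C) whose injury index exceeds the
    unfrozen-control index by more than `margin`; None if none does."""
    injured = [t for t, inj in injury_by_temp.items() if inj > control + margin]
    return max(injured) if injured else None

# Hypothetical electrolyte-leakage injury indices after laboratory freezing
injury = {-5: 0.12, -10: 0.15, -15: 0.38, -20: 0.71, -25: 0.90}
tc = critical_temperature(injury, control=0.10, margin=0.10)  # -15 degrees C
```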

  17. Quantitative Assessments of Sensitivity to Reinforcement Contingencies in Mental Retardation.

    ERIC Educational Resources Information Center

    Dube, William V.; McIlvane, William J.

    2002-01-01

    Sensitivity to reinforcement contingencies was examined in six individuals with mental retardation using a concurrent operants procedure in the context of a computer game. Results included individual differences in sensitivity and differential sensitivity to rate and magnitude variation. Results suggest that comprehensive assessments of potential…

  18. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  20. Quantitative Assessment of Spray Deposition with Water-Sensitive Paper

    USDA-ARS?s Scientific Manuscript database

    Spray droplets, discharged from the lower six nozzles of an airblast sprayer, were sampled on pairs of absorbent filter and water-sensitive papers at nine distances from the sprayer. Spray deposition on filter targets was measured by fluorometry, and spray distribution on WSP targets was assessed by t...

  2. Quantitative assessment of neural outgrowth using spatial light interference microscopy

    NASA Astrophysics Data System (ADS)

    Lee, Young Jae; Cintora, Pati; Arikkath, Jyothi; Akinsola, Olaoluwa; Kandel, Mikhail; Popescu, Gabriel; Best-Popescu, Catherine

    2017-06-01

    Optimal growth as well as branching of axons and dendrites is critical for nervous system function. Neuritic length, arborization, and growth rate determine the innervation properties of neurons and define each cell's computational capability. Thus, to investigate nervous system function, we need to develop methods and instrumentation capable of quantifying various aspects of neural network formation: neuron process extension, retraction, stability, and branching. During the last three decades, fluorescence microscopy has yielded enormous advances in our understanding of neurobiology. While fluorescent markers provide valuable specificity to imaging, photobleaching and phototoxicity often limit the duration of the investigation. Here, we used spatial light interference microscopy (SLIM) to quantitatively measure neurite outgrowth as a function of cell confluence. Because it is label-free and nondestructive, SLIM allows for long-term investigation over many hours. We found that neurons exhibit a higher growth rate of neurite length in low-confluence versus medium- and high-confluence conditions. We believe this methodology will aid investigators in performing unbiased, nondestructive analysis of morphometric neuronal parameters.

  3. Quantitative assessment of Mycoplasma hemadsorption activity by flow cytometry.

    PubMed

    García-Morales, Luis; González-González, Luis; Costa, Manuela; Querol, Enrique; Piñol, Jaume

    2014-01-01

    A number of adherent mycoplasmas have developed highly complex polar structures that are involved in diverse aspects of the biology of these microorganisms and play a key role as virulence factors by promoting adhesion to host cells in the first stages of infection. Attachment activity of mycoplasma cells has traditionally been investigated by determining their hemadsorption ability to red blood cells, a distinctive trait widely examined when characterizing the different mycoplasma species. Although protocols to qualitatively determine the hemadsorption or hemagglutination of mycoplasmas are straightforward, current methods for investigating hemadsorption at the quantitative level are expensive and poorly reproducible. Using flow cytometry, we have developed a procedure to rapidly and accurately quantify the hemadsorption activity of mycoplasmas in the presence of SYBR Green I, a vital fluorochrome that stains nucleic acids, making it possible to resolve erythrocytes and mycoplasma cells by their different size and fluorescence. This method is very reproducible and permits kinetic analysis of the obtained data and a precise hemadsorption quantification based on standard binding parameters such as the dissociation constant Kd. The procedure could easily be implemented in a standardized assay to test the hemadsorption activity of the growing number of clinical isolates and mutant strains of different mycoplasma species, providing valuable data about the virulence of these microorganisms.
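
    The Kd-based quantification mentioned above can be illustrated with a minimal sketch. The data, function names, and grid-search fit below are all hypothetical stand-ins (the study fits kinetic binding data; here a noiseless one-site binding curve B = Bmax·C/(Kd + C) is recovered by dependency-free least squares):

```python
def one_site_binding(conc, bmax, kd):
    """Bound fraction at free-ligand concentration `conc` (one-site model)."""
    return bmax * conc / (kd + conc)

def fit_kd(concs, bound, kd_grid, bmax_grid):
    """Least-squares grid search for (bmax, kd); crude but dependency-free."""
    best = None
    for kd in kd_grid:
        for bmax in bmax_grid:
            sse = sum((one_site_binding(c, bmax, kd) - b) ** 2
                      for c, b in zip(concs, bound))
            if best is None or sse < best[0]:
                best = (sse, bmax, kd)
    return best[1], best[2]  # (bmax, kd)

# Hypothetical mycoplasma-erythrocyte binding data (arbitrary units)
concs = [0.5, 1, 2, 4, 8, 16]
bound = [one_site_binding(c, 1.0, 2.0) for c in concs]  # true bmax=1, kd=2

grid = [x / 10 for x in range(1, 51)]  # 0.1 .. 5.0
bmax_hat, kd_hat = fit_kd(concs, bound, grid, grid)
```

    With noiseless data and the true parameters on the grid, the search recovers them exactly; real cytometry data would need noise-tolerant fitting.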

  4. Quantitative Assessment of Parametric Uncertainty in Northern Hemisphere PAH Concentrations.

    PubMed

    Thackray, Colin P; Friedman, Carey L; Zhang, Yanxu; Selin, Noelle E

    2015-08-04

    We quantitatively examine the relative importance of uncertainty in emissions and physicochemical properties (including reaction rate constants) to Northern Hemisphere (NH) and Arctic polycyclic aromatic hydrocarbon (PAH) concentrations, using a computationally efficient numerical uncertainty technique applied to the global-scale chemical transport model GEOS-Chem. Using polynomial chaos (PC) methods, we propagate uncertainties in physicochemical properties and emissions for the PAHs benzo[a]pyrene, pyrene and phenanthrene to simulated spatially resolved concentration uncertainties. We find that the leading contributors to parametric uncertainty in simulated concentrations are the black carbon-air partition coefficient and oxidation rate constant for benzo[a]pyrene, and the oxidation rate constants for phenanthrene and pyrene. NH geometric average concentrations are more sensitive to uncertainty in the atmospheric lifetime than to emissions rate. We use the PC expansions and measurement data to constrain parameter uncertainty distributions to observations. This narrows a priori parameter uncertainty distributions for phenanthrene and pyrene, and leads to higher values for OH oxidation rate constants and lower values for European PHE emission rates.
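
    The polynomial chaos (PC) idea above can be shown in a one-dimensional toy, far simpler than the multi-parameter GEOS-Chem expansion: an uncertain rate constant, uniform on an interval, is mapped to a Legendre PC basis, and the mean and variance of a model output follow from the projection coefficients. The model and numbers are illustrative only.

```python
import math

# 3-point Gauss-Legendre nodes/weights on [-1, 1]
NODES = [-math.sqrt(3 / 5), 0.0, math.sqrt(3 / 5)]
WEIGHTS = [5 / 9, 8 / 9, 5 / 9]

def legendre(n, x):
    """Legendre polynomial P_n(x) via the three-term recurrence."""
    if n == 0:
        return 1.0
    if n == 1:
        return x
    return ((2 * n - 1) * x * legendre(n - 1, x)
            - (n - 1) * legendre(n - 2, x)) / n

def pc_coefficients(model, degree):
    """Non-intrusive projection: c_i = (2i+1)/2 * integral of model(x)P_i(x)."""
    coeffs = []
    for i in range(degree + 1):
        integral = sum(w * model(x) * legendre(i, x)
                       for x, w in zip(NODES, WEIGHTS))
        coeffs.append((2 * i + 1) / 2 * integral)
    return coeffs

# Toy output exp(-k) with uncertain rate k = 1.0 +/- 0.1 (uniform), k = 1 + 0.1*xi
model = lambda xi: math.exp(-(1.0 + 0.1 * xi))
c = pc_coefficients(model, 2)
pc_mean = c[0]                                        # PC mean estimate
pc_var = sum(c[i] ** 2 / (2 * i + 1) for i in range(1, 3))  # PC variance
```

    For this smooth model the degree-2 expansion already matches the analytic mean, exp(-1)·sinh(0.1)/0.1, to high accuracy.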

  5. Quantitative assessment of rabbit alveolar macrophage function by chemiluminescence

    SciTech Connect

    Brennan, P.C.; Kirchner, F.R.

    1985-08-01

    Rabbit alveolar macrophages (RAM) were cultured for 24 hr with concentrations ranging from 3 to 12 μg/ml of vanadium oxide (V2O5), a known cytotoxic agent, or with high-molecular-weight organic by-products from coal gasification processes. After culture the cells were harvested and tested for functional capacity using three types of indicators: (1) luminol-amplified chemiluminescence (CL), which quantitatively detects photon emission due to respiratory burst activity, measured in a newly designed instrument with standardized reagents; (2) the reduction of nitro blue tetrazolium-saturated polyacrylamide beads, a semiquantitative measure of respiratory burst activity; and (3) phagocytic efficiency, defined as the percentage of cells incorporating immunoglobulin-coated polyacrylamide beads. Chemiluminescence declined linearly with increasing concentrations of V2O5 over the dose range tested. Dye reduction and phagocytic efficiency similarly decreased with increasing V2O5 concentration, but were less sensitive indicators of functional impairment than CL, as measured by the amount required to reduce the response to 50% of untreated cells. The effect of coal gasification condensates on RAM function varied, but in general these tests also indicated that the CL response was the most sensitive indicator.

  6. A quantitative assessment of Arctic shipping in 2010–2014

    PubMed Central

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  7. Quantitative assessment of neural outgrowth using spatial light interference microscopy.

    PubMed

    Lee, Young Jae; Cintora, Pati; Arikkath, Jyothi; Akinsola, Olaoluwa; Kandel, Mikhail; Popescu, Gabriel; Best-Popescu, Catherine

    2017-06-01

    Optimal growth as well as branching of axons and dendrites is critical for nervous system function. Neuritic length, arborization, and growth rate determine the innervation properties of neurons and define each cell's computational capability. Thus, to investigate nervous system function, we need to develop methods and instrumentation capable of quantifying various aspects of neural network formation: neuron process extension, retraction, stability, and branching. During the last three decades, fluorescence microscopy has yielded enormous advances in our understanding of neurobiology. While fluorescent markers provide valuable specificity to imaging, photobleaching and phototoxicity often limit the duration of the investigation. Here, we used spatial light interference microscopy (SLIM) to quantitatively measure neurite outgrowth as a function of cell confluence. Because it is label-free and nondestructive, SLIM allows for long-term investigation over many hours. We found that neurons exhibit a higher growth rate of neurite length in low-confluence versus medium- and high-confluence conditions. We believe this methodology will aid investigators in performing unbiased, nondestructive analysis of morphometric neuronal parameters.

  8. Quantitative Assessment of Environmental Impacts in the Aquatic Environment.

    DTIC Science & Technology

    1982-01-01

    Technical Report N-114, U.S. Army Corps of Engineers, December 1981. References cited include the Hydrologic Engineering Center (1975) and White, J. D., and J. A. Dracup, "Water Quality Modeling of a High Mountain Stream." Keywords: water quality model, environmental impact analysis, aquatic biology, mathematical models.

  9. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    PubMed

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models, conditioned on pose, for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level.
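
    The "many small samples" idea above has a simple generic illustration (not the paper's SAR models): run a test on thousands of tiny samples and report the fraction failing at a given significance level. When the model is correct, that fraction should sit near the level itself. Sample sizes, seed, and the z-test are all illustrative choices.

```python
import math
import random

def reject_mean_zero(sample, sigma=1.0, z_crit=1.96):
    """Two-sided z-test of H0: mean = 0 with known sigma."""
    n = len(sample)
    z = (sum(sample) / n) * math.sqrt(n) / sigma
    return abs(z) > z_crit

random.seed(0)
n_samples, n_obs = 2000, 5  # many small samples, as in the paper's setting
rejections = sum(
    reject_mean_zero([random.gauss(0.0, 1.0) for _ in range(n_obs)])
    for _ in range(n_samples)
)
failure_rate = rejections / n_samples  # should sit near the 5% level
```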

  10. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
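
    The HU-based composition analysis described above reduces to binning voxel values into tissue classes and reporting fractions of the muscle volume. The windows below are illustrative placeholders loosely following the tissue classes named in the abstract; the paper's exact thresholds are not reproduced here.

```python
# Hypothetical HU windows per tissue class (illustrative, not the paper's)
HU_CLASSES = [
    ("fat", -200, -10),
    ("loose connective / atrophic muscle", -9, 30),
    ("normal muscle", 31, 100),
]

def classify_hu(hu):
    """Map one Hounsfield Unit value to a tissue class label."""
    for name, lo, hi in HU_CLASSES:
        if lo <= hu <= hi:
            return name
    return "other"

def composition(hu_values):
    """Fraction of voxels per tissue class within a muscle volume."""
    counts = {}
    for hu in hu_values:
        label = classify_hu(hu)
        counts[label] = counts.get(label, 0) + 1
    total = len(hu_values)
    return {k: v / total for k, v in counts.items()}

frac = composition([-150, -50, 0, 20, 45, 60, 80, 300])
```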

  11. Quantitative phase imaging technologies to assess neuronal activity (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Thouvenin, Olivier; Fink, Mathias; Boccara, Claude

    2016-03-01

    Active neurons tend to exhibit different dynamical behavior from resting ones. For example, vesicular transport toward the synapses increases, while axonal growth slows. Previous studies also reported small phase variations occurring simultaneously with the action potential. Such changes exhibit time scales ranging from milliseconds to several seconds on spatial scales smaller than the optical diffraction limit. Therefore, QPI systems are of particular interest for measuring neuronal activity without labels. Here, we report the development of two new QPI systems that should enable the detection of such activity. Both systems can acquire full field phase images with sub-nanometer sensitivity at a few hundred frames per second. The first setup is a synchronous combination of Full Field Optical Coherence Tomography (FF-OCT) and wide field fluorescence imaging. The latter modality enables the measurement of neurons' electrical activity using calcium indicators. In cultures, FF-OCT exhibits similar features to Digital Holographic Microscopy (DHM), apart from the complex computational reconstruction. However, FF-OCT is of particular interest for measuring phase variations in tissues. The second setup is based on a Quantitative Differential Interference Contrast setup mounted in an epi-illumination configuration with spectrally incoherent illumination. Such a common-path interferometer exhibits very good mechanical stability, and thus enables the measurement of phase images over hours. Additionally, such a setup can measure not only a height change but also an optical index change for both polarizations. Hence, one can measure a phase change and a birefringence change simultaneously.

  12. Floods characterization: from impact data to quantitative assessment

    NASA Astrophysics Data System (ADS)

    Llasat, Maria-Carmen; Gilabert, Joan; Llasat-Botija, Montserrat; Marcos, Raül; Quintana-Seguí, Pere; Turco, Marco

    2015-04-01

    This study is based on the following flood databases from Catalonia: INUNGAMA (1900-2010), which covers 372 floods (Llasat et al., 2014); PRESSGAMA (1981-2010); and HISTOGAMA (from the 14th century on), built as part of the SPHERE project and recently updated. These databases store information about flood impacts (among others) and classify floods by severity (catastrophic, extraordinary, and ordinary) by means of an indicator matrix based on other studies (e.g., Petrucci et al., 2013; Llasat et al., 2013). In this research we present a comparison between flood impacts, flow data, and rainfall data on a Catalan scale, and particularly for the basins of the Segre, Muga, Ter, and Llobregat (Western Mediterranean). From a bottom-up approach, a statistical methodology has been built (trend analysis, measures of position, cumulative distribution functions, and geostatistics) to identify quantitative thresholds that make it possible to classify the floods. The purpose of this study is to establish generic thresholds for the whole Catalan region; to this end, rainfall maxima of the flooding episodes stored in INUNGAMA were selected and related to flood categories by boxplot diagrams. Regarding stream flow, a relation was established between impacts and return periods on the day of maximum flow. The aim is to homogenize and compare the different drainage basins and to obtain general thresholds. Detailed analyses of the relations between flooding episodes, flood classification, and weather typing schemes, based on the Jenkinson and Collison classification (applied to the Iberian Peninsula by Spellmann, 2000), are also presented. In this way it can be analyzed whether patterns exist for the different types of floods. Finally, this work has pointed out the need to define a new category for the most severe episodes.
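
    The boxplot-based thresholding described above amounts to computing quantiles of rainfall maxima within each flood category and using them as candidate class boundaries. A minimal sketch with made-up rainfall values (the `quantiles` helper and the data are illustrative, not the study's):

```python
def quantiles(xs, qs):
    """Linear-interpolation quantiles of a sorted copy of xs (0 <= q <= 1)."""
    s = sorted(xs)
    out = []
    for q in qs:
        pos = q * (len(s) - 1)
        lo = int(pos)
        hi = min(lo + 1, len(s) - 1)
        out.append(s[lo] + (s[hi] - s[lo]) * (pos - lo))
    return out

# Hypothetical daily rainfall maxima (mm) for episodes labelled "ordinary";
# the upper quartile is a candidate threshold toward "extraordinary".
ordinary_rain = [35, 42, 50, 55, 61, 64, 70, 78, 85, 92]
q25, q50, q75 = quantiles(ordinary_rain, [0.25, 0.5, 0.75])
```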

  13. Quantitative population-health relationship (QPHR) for assessing metabolic syndrome

    PubMed Central

    Worachartcheewan, Apilak; Nantasenamat, Chanin; Isarankura-Na-Ayudhya, Chartchalerm; Prachayasittikul, Virapong

    2013-01-01

    Metabolic syndrome (MS) is a condition that predisposes individuals to the development of cardiovascular diseases and type 2 diabetes mellitus. A cross-sectional investigation of 15,365 participants residing in metropolitan Bangkok who had received an annual health checkup in 2007 was used in this study. Individuals were classified as MS or non-MS according to the International Diabetes Federation criteria using a BMI cutoff of ≥ 25 kg/m2 plus two or more MS components. This study explores the utility of quantitative population-health relationship (QPHR) for predicting MS status as well as discovering variables that frequently occur together. The former was achieved by decision tree (DT) analysis, artificial neural network (ANN), support vector machine (SVM) and principal component analysis (PCA), while the latter was obtained by association analysis (AA). DT outperformed both ANN and SVM in MS classification, as deduced from its accuracy of 99 % compared to accuracies of 98 % and 91 % for ANN and SVM, respectively. Furthermore, PCA was able to effectively classify individuals as MS and non-MS, as observed from the scores plot. Moreover, AA was employed to analyze individuals with MS in order to elucidate pertinent rules from MS components that frequently occur together, which included TG+BP, BP+FPG and TG+FPG, where TG, BP and FPG correspond to triglycerides, blood pressure and fasting plasma glucose, respectively. QPHR was demonstrated to be useful in predicting the MS status of individuals from an urban Thai population. Rules obtained from AA analysis provided general guidelines (i.e. co-occurrences of TG, BP and FPG) that may be used in the prevention of MS in at-risk individuals. PMID:26622213
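
    The classification rule used to label individuals (BMI ≥ 25 kg/m2 plus two or more MS components) can be sketched directly. The component cutoffs below are illustrative values in the spirit of the IDF criteria, not necessarily those applied in the study.

```python
# Illustrative component cutoffs (mg/dL for TG, FPG, HDL; mmHg for BP)
def ms_components(tg, sbp, dbp, fpg, hdl, male):
    """Count how many MS components are present."""
    flags = [
        tg >= 150,                   # elevated triglycerides
        sbp >= 130 or dbp >= 85,     # elevated blood pressure
        fpg >= 100,                  # elevated fasting plasma glucose
        hdl < (40 if male else 50),  # reduced HDL cholesterol
    ]
    return sum(flags)

def has_metabolic_syndrome(bmi, **kw):
    """IDF-style rule: BMI >= 25 kg/m2 plus two or more components."""
    return bmi >= 25 and ms_components(**kw) >= 2

case = has_metabolic_syndrome(bmi=27, tg=180, sbp=135, dbp=80, fpg=95,
                              hdl=45, male=True)  # TG and BP flags -> MS
```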

  14. Assessment of Quantitative Precipitation Forecasts from Operational NWP Models (Invited)

    NASA Astrophysics Data System (ADS)

    Sapiano, M. R.

    2010-12-01

    Previous work has shown that satellite and numerical model estimates of precipitation have complementary strengths, with satellites having greater skill at detecting convective precipitation events and model estimates having greater skill at detecting stratiform precipitation. This is due in part to the challenges associated with retrieving stratiform precipitation from satellites and the difficulty in resolving sub-grid scale processes in models. These complementary strengths can be exploited to obtain new merged satellite/model datasets, and several such datasets have been constructed using reanalysis data. Whilst reanalysis data are stable in a climate sense, they also have relatively coarse resolution compared to the satellite estimates (many of which are now commonly available at quarter degree resolution) and they necessarily use fixed forecast systems that are not state-of-the-art. An alternative to reanalysis data is to use Operational Numerical Weather Prediction (NWP) model estimates, which routinely produce precipitation with higher resolution and using the most modern techniques. Such estimates have not been combined with satellite precipitation and their relative skill has not been sufficiently assessed beyond model validation. The aim of this work is to assess the information content of the models relative to satellite estimates with the goal of improving techniques for merging these data types. To that end, several operational NWP precipitation forecasts have been compared to satellite and in situ data and their relative skill in forecasting precipitation has been assessed. In particular, the relationship between precipitation forecast skill and other model variables will be explored to see if these other model variables can be used to estimate the skill of the model at a particular time. Such relationships would provide a basis for determining weights and errors of any merged products.
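
    One common way to score yes/no precipitation forecasts against observations, the equitable threat score (ETS), illustrates the kind of skill assessment described above; the abstract does not name this specific metric, and the contingency counts below are made up.

```python
def equitable_threat_score(hits, misses, false_alarms, correct_negatives):
    """ETS (Gilbert skill score): hits corrected for chance, 1 = perfect,
    <= 0 = no better than random."""
    total = hits + misses + false_alarms + correct_negatives
    hits_random = (hits + misses) * (hits + false_alarms) / total
    denom = hits + misses + false_alarms - hits_random
    return (hits - hits_random) / denom if denom else 0.0

# Hypothetical rain/no-rain contingency table for one forecast system
ets = equitable_threat_score(hits=50, misses=20, false_alarms=30,
                             correct_negatives=900)
```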

  15. Aliasing as noise - A quantitative and qualitative assessment

    NASA Technical Reports Server (NTRS)

    Park, Stephen K.; Hazra, Rajeeb

    1993-01-01

    We present a model-based argument that, for the purposes of system design and digital image processing, aliasing should be treated as signal-dependent additive noise. By using a computational simulation based on this model, we process (high resolution images of) natural scenes in a way which enables the 'aliased component' of the reconstructed image to be isolated unambiguously. We demonstrate that our model-based argument leads naturally to system design metrics which quantify the extent of aliasing. And, by illustrating several aliased component images, we provide a qualitative assessment of aliasing as noise.
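
    The aliasing-as-noise argument can be made concrete: a tone above the Nyquist frequency yields exactly the same samples as its folded alias, so once sampled, the aliased energy is indistinguishable from signal-dependent additive error. A minimal sketch with illustrative frequencies:

```python
import math

def alias_frequency(f, fs):
    """Apparent (folded) frequency of a tone at f Hz sampled at fs Hz."""
    k = round(f / fs)
    return abs(f - k * fs)

def sample_cos(f, fs, n):
    """n samples of cos(2*pi*f*t) taken at rate fs."""
    return [math.cos(2 * math.pi * f * i / fs) for i in range(n)]

fs = 100.0
hi = sample_cos(70.0, fs, 8)   # 70 Hz tone, above the 50 Hz Nyquist limit
lo = sample_cos(30.0, fs, 8)   # its 30 Hz alias: identical samples
```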

  16. A quantitative assessment of results with the Angelchik prosthesis.

    PubMed Central

    Wyllie, J. H.; Edwards, D. A.

    1985-01-01

    The Angelchik antireflux prosthesis was assessed in 15 unpromising patients, 12 of whom had peptic strictures of the oesophagus. Radiological techniques were used to show the effect of the device on gastro-oesophageal reflux, and on the bore and length of strictures. Twelve months later (range 6-24) most patients were well satisfied with the operation, and all considered it had been worthwhile; there was radiological evidence of reduction in reflux and remission of strictures. The device never surrounded the oesophageal sphincter; in all but 1 case it encircled a tube of stomach. PMID:4037629

  17. A comprehensive reliability assessment of quantitative diffusion tensor tractography.

    PubMed

    Wang, Jun Yi; Abdi, Hervé; Bakhadirov, Khamid; Diaz-Arrastia, Ramon; Devous, Michael D

    2012-04-02

    Diffusion tensor tractography is increasingly used to examine structural connectivity in the brain in various conditions, but its test-retest reliability is understudied. The main purposes of this study were to evaluate 1) the reliability of quantitative measurements of diffusion tensor tractography and 2) the effect on reliability of the number of gradient sampling directions and scan repetition. Images were acquired from ten healthy participants. Ten fiber regions of nine major fiber tracts were reconstructed and quantified using six fiber variables. Intra- and inter-session reliabilities were estimated using intraclass correlation coefficient (ICC) and coefficient of variation (CV), and were compared to pinpoint major error sources. Additional pairwise comparisons were made between the reliability of images with 30 directions and NEX 2 (DTI30-2), 30 directions and NEX 1 (DTI30-1), and 15 directions and NEX 2 (DTI15-2) to determine whether increasing gradient directions and scan repetition improved reliability. Of the 60 tractography measurements, 43 showed intersession CV ≤ 10%, ICC ≥ .70, or both for DTI30-2, 40 measurements for DTI30-1, and 37 for DTI15-2. Most of the reliable measurements were associated with the tracts corpus callosum, cingulum, cerebral peduncular fibers, uncinate fasciculus, and arcuate fasciculus. These reliable measurements included fractional anisotropy (FA) and mean diffusivity of all 10 fiber regions. Intersession reliability was significantly worse than intra-session reliability for FA, mean length, and tract volume measurements from DTI15-2, indicating that the combination of MRI signal variation and physiological noise/change over time was the major error source for this sequence. Increasing the number of gradient directions from 15 to 30 while controlling the scan time significantly affected values for all six variables and reduced intersession variability for mean length and tract volume measurements. Additionally, while
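
    The CV and ICC reliability measures used above can be sketched for test-retest pairs. The one-way ICC(1,1) form below is one common variant (the paper may use a different one), and the FA values are hypothetical.

```python
def mean(xs):
    return sum(xs) / len(xs)

def within_subject_cv(sessions):
    """sessions: list of (scan1, scan2) per subject; mean within-subject CV, %."""
    cvs = []
    for a, b in sessions:
        m = (a + b) / 2
        sd = (((a - m) ** 2 + (b - m) ** 2) / 1) ** 0.5  # sample sd, n-1 = 1
        cvs.append(100 * sd / m)
    return mean(cvs)

def icc_oneway(sessions):
    """One-way random-effects ICC(1,1) for test-retest pairs."""
    k, n = 2, len(sessions)
    grand = mean([x for pair in sessions for x in pair])
    subj_means = [mean(pair) for pair in sessions]
    bms = k * sum((m - grand) ** 2 for m in subj_means) / (n - 1)   # between
    wms = sum((x - mean(pair)) ** 2
              for pair in sessions for x in pair) / (n * (k - 1))   # within
    return (bms - wms) / (bms + (k - 1) * wms)

# Hypothetical FA test-retest values for 5 subjects
fa = [(0.50, 0.52), (0.61, 0.60), (0.45, 0.47), (0.55, 0.54), (0.58, 0.59)]
icc = icc_oneway(fa)
cv_pct = within_subject_cv(fa)
```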

  18. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed as detailed and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along front, back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness that will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
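
    The idea of summarizing an outline by Fourier coefficients can be shown with simplified 2D complex Fourier descriptors rather than the paper's 3D elliptic Fourier analysis: a closed curve sampled as complex points z = x + iy is projected onto a few harmonics and reconstructed from them. The circle example is illustrative.

```python
import cmath
import math

def fourier_descriptors(points, n_harmonics):
    """DFT coefficients c_k, k = -n..n, of a closed outline of complex points."""
    n = len(points)
    return [sum(z * cmath.exp(-2j * math.pi * k * m / n)
                for m, z in enumerate(points)) / n
            for k in range(-n_harmonics, n_harmonics + 1)]

def reconstruct(coeffs, n_harmonics, n_points):
    """Rebuild the outline from the retained harmonics."""
    ks = range(-n_harmonics, n_harmonics + 1)
    return [sum(c * cmath.exp(2j * math.pi * k * m / n_points)
                for c, k in zip(coeffs, ks))
            for m in range(n_points)]

# A unit circle is captured exactly by the k = 1 harmonic alone
n = 32
circle = [cmath.exp(2j * math.pi * m / n) for m in range(n)]
coeffs = fourier_descriptors(circle, 2)   # k = -2..2; only c_1 is nonzero
recon = reconstruct(coeffs, 2, n)
```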

  19. Disability and occupational assessment: objective diagnosis and quantitative impairment rating.

    PubMed

    Williams, C Donald

    2010-01-01

    Industrial insurance originated in Europe in the nineteenth century and replaced the old system of negligence liability in the United States between 1910 and 1940. Today psychiatric disability assessments are performed by psychiatrists in the context of Social Security Disability Insurance applications, workers' compensation claims, private disability insurance claims, and fitness for duty evaluations. Expertise in the performance of psychiatric disability evaluations is required, but general psychiatric residency programs provide experience only with treatment evaluations, which differ fundamentally from independent medical evaluations as to role boundaries and the focus of assessment. Psychiatrists offer opinions regarding psychiatric impairments, but administrative or judicial tribunals make the actual determinations of disability. Social Security Disability Insurance evaluations and workers' compensation evaluations are discussed, as is the distinction between diagnoses, which are categorical, and impairment ratings, which are dimensional. Inconsistency in impairment ratings has been problematic in the United States and elsewhere in the workers' compensation arena. A protocol for achieving more consistent impairment ratings is proposed, one that correlates three commonly used global rating scales in a 3 × 5 grid, supplemented by objective psychological test data.

  20. Quantitative assessment of workload and stressors in clinical radiation oncology.

    PubMed

    Mazur, Lukasz M; Mosaly, Prithima R; Jackson, Marianne; Chang, Sha X; Burkhardt, Katharin Deschesne; Adams, Robert D; Jones, Ellen L; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B

    2012-08-01

    Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and Duncan's test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P value=.045). Workload level and sources of stressors vary
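
    The NASA TLX scores cited above combine six subscale ratings (0-100) weighted by 15 pairwise comparisons. A minimal sketch of that weighted score; the ratings and weights below are hypothetical, not the study's data.

```python
# NASA TLX: six subscales rated 0-100; weights from 15 pairwise comparisons
TLX_SCALES = ["mental", "physical", "temporal",
              "performance", "effort", "frustration"]

def tlx_score(ratings, weights):
    """Weighted workload: sum(rating * weight) / 15; weights must sum to 15."""
    assert sum(weights.values()) == 15
    return sum(ratings[s] * weights[s] for s in TLX_SCALES) / 15

# Hypothetical ratings/weights for one task observation
ratings = {"mental": 70, "physical": 30, "temporal": 60,
           "performance": 40, "effort": 65, "frustration": 50}
weights = {"mental": 5, "physical": 1, "temporal": 3,
           "performance": 2, "effort": 3, "frustration": 1}
score = tlx_score(ratings, weights)  # 885 / 15 = 59.0
```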

  1. Quantitative risk assessment of FMD virus transmission via water.

    PubMed

    Schijven, Jack; Rijs, Gerard B J; de Roda Husman, Ana Maria

    2005-02-01

    Foot-and-mouth disease (FMD) is a viral disease of domesticated and wild cloven-hoofed animals. FMD virus is known to spread by direct contact between infected and susceptible animals, by animal products such as meat and milk, by the airborne route, and by mechanical transfer on people, wild animals, birds, and vehicles. During the outbreak of 2001 in the Netherlands, milk from dairy cattle was illegally discharged into the sewerage system as a consequence of the transport prohibition. Such discharges may contaminate the biologically treated and raw sewage released into surface water from which cattle drink. The objective of the present study was to assess the probability of infecting dairy cows that drink FMD virus-contaminated surface water following illegal discharges of contaminated milk. To that end, the following data were collected from the literature: FMD virus inactivation in aqueous environments, FMD virus concentrations in milk, dilution in sewage water, virus removal by sewage treatment, dilution in surface water, water consumption of cows, size of a herd in a meadow, and dose-response data for ingested FMD virus in cattle. In the case of 1.6 x 10(2) FMD virus per milliliter of milk and discharge of treated sewage into surface water, the probability of infecting a herd of cows was estimated at 3.3 x 10(-7) to 8.5 x 10(-5), depending on dilution in the receiving surface water. In the case of discharge of raw sewage, all probabilities of infection were 100 times higher; with little dilution, as in small rivers, the probability reaches 8.5 x 10(-3). For 10(4) times higher FMD virus concentrations in milk, the probabilities of infecting a herd of cows are high in the case of discharge of treated sewage (3.3 x 10(-3) to 5.7 x 10(-1)) and very high in the case of discharge of raw sewage (0.28-1.0). 
It can be concluded that illegal and uncontrolled discharges of contaminated milk into the sewerage system may lead to high risks to other cattle farms at 6-50 km
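
    The dilution-chain arithmetic behind these probabilities can be sketched as below. The parameter values and the exponential dose-response form are hypothetical placeholders, not the study's calibrated inputs.

```python
import math

def herd_infection_probability(c_milk, milk_dilution, log_removal,
                               river_dilution, intake_ml, r, herd_size):
    """Chain the dilution steps from milk to drinking water, then apply
    an exponential dose-response model for each cow in the herd."""
    # FMD virus per mL of surface water after all dilution/removal steps
    c_water = c_milk * milk_dilution * 10 ** (-log_removal) * river_dilution
    dose = c_water * intake_ml                 # virus ingested per cow
    p_cow = 1 - math.exp(-r * dose)            # exponential dose-response
    return 1 - (1 - p_cow) ** herd_size        # at least one cow infected

# Hypothetical values for illustration; log_removal=2 mimics treated sewage
p_treated = herd_infection_probability(1.6e2, 1e-3, 2, 1e-4, 5e4, 1e-5, 50)
p_raw     = herd_infection_probability(1.6e2, 1e-3, 0, 1e-4, 5e4, 1e-5, 50)
```

    Because the doses involved are tiny, removing the two log10 of treatment amplifies the herd probability by almost exactly a factor of 100, matching the pattern reported above.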

  2. Quantitative Assessment of Workload and Stressors in Clinical Radiation Oncology

    SciTech Connect

    Mazur, Lukasz M.; Mosaly, Prithima R.; Jackson, Marianne; Chang, Sha X.; Burkhardt, Katharin Deschesne; Adams, Robert D.; Jones, Ellen L.; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B.

    2012-08-01

    Purpose: Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Methods and Materials: Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task-Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and the Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). Results: A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40 to 52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P=.045).
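
    The reported association is a plain Pearson correlation over subgroup summaries; a minimal sketch follows, using invented example numbers rather than the study's raw TLX scores and incident counts.

```python
def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical subgroup means: NASA TLX workload vs. incident frequency
tlx       = [33, 44, 46, 50, 57]
incidents = [2, 5, 4, 7, 9]
r = pearson_r(tlx, incidents)
```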

  3. Quantitative ultrasound techniques for the assessment of osteoporosis: expert agreement on current status. The International Quantitative Ultrasound Consensus Group.

    PubMed

    Glüer, C C

    1997-08-01

    Quantitative ultrasound (QUS) methods have been introduced in recent years for the assessment of skeletal status in osteoporosis. The performance of QUS techniques has been evaluated in a large number of studies. Reviewing existing knowledge, an international expert panel formulated the following consensus regarding the current status of this technology. To date, evidence supports the use of QUS techniques for the assessment of fracture risk in elderly women. This has been best established for water-based calcaneal QUS systems. Future studies should include the predictive validity of other QUS systems. Additional clinical applications of QUS, specifically the assessment of rates of change for monitoring disease progression or response to treatment, require further investigation. Its low cost and portability make QUS an attractive technology for assessing risk of fractures in larger populations than may be suitable or feasible for bone densitometry. Additional investigations that assess innovative QUS techniques in well defined research settings are important to determine and utilize the full potential of this technology for the benefit of early detection and monitoring of osteoporosis.

  4. The role of ultraviolet colour in the assessment of mimetic accuracy between Batesian mimics and their models: a case study using ant-mimicking spiders

    NASA Astrophysics Data System (ADS)

    Corcobado, Guadalupe; Herberstein, Marie E.; Pekár, Stano

    2016-12-01

    The use of ultraviolet (UV) cues for intra- and inter-specific communication is common in many animal species. Still, the role of UV signals in some predator-prey contexts, such as Batesian mimicry, is not clear. Batesian mimicry is a defensive strategy by which a palatable species (the mimic) resembles an unpalatable or noxious species (the model) to avoid predation. This strategy has evolved independently in many different taxa that are predated by species capable of UV perception. Moreover, there is considerable variation in how accurately Batesian mimics resemble their models across species. Our aim was to investigate how UV colour contributes to mimetic accuracy, using several ant-mimicking spider species as a case study. We measured the reflectance spectrum (300-700 nm) for several species of mimics and models, and we tested whether they differ in visible and UV colour. We modelled whether two different predators could discriminate between mimics and models using colour information. We found that ant-mimicking spiders generally differed significantly from their ant models in UV colour and that information from the visible range of light cannot be extrapolated into the UV. Our modelling suggested that wasps should be able to discriminate between mimics and models by combining information from the visible and UV light, whereas birds may not discriminate between them. Thus, we show that UV colour can influence mimetic accuracy, and we discuss its potential role in Batesian mimicry. We conclude that colour, especially in the UV range, should be taken into account when measuring mimetic accuracy.
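
    The finding that visible-range similarity cannot be extrapolated into the UV amounts to comparing reflectance band averages; a toy sketch with invented spectra, not the measured ones:

```python
def band_mean(spectrum, lo, hi):
    """Mean reflectance over a wavelength band (spectrum maps nm -> reflectance)."""
    vals = [refl for wl, refl in spectrum.items() if lo <= wl < hi]
    return sum(vals) / len(vals)

# Hypothetical coarse spectra: the ant model and its spider mimic match in the
# visible range but diverge in the UV, as the study reports for real species
ant    = {320: 0.02, 360: 0.03, 450: 0.05, 550: 0.08, 650: 0.10}
spider = {320: 0.10, 360: 0.12, 450: 0.06, 550: 0.08, 650: 0.11}

uv_gap  = abs(band_mean(spider, 300, 400) - band_mean(ant, 300, 400))
vis_gap = abs(band_mean(spider, 400, 700) - band_mean(ant, 400, 700))
```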

  5. The role of ultraviolet colour in the assessment of mimetic accuracy between Batesian mimics and their models: a case study using ant-mimicking spiders.

    PubMed

    Corcobado, Guadalupe; Herberstein, Marie E; Pekár, Stano

    2016-12-01

    The use of ultraviolet (UV) cues for intra- and inter-specific communication is common in many animal species. Still, the role of UV signals in some predator-prey contexts, such as Batesian mimicry, is not clear. Batesian mimicry is a defensive strategy by which a palatable species (the mimic) resembles an unpalatable or noxious species (the model) to avoid predation. This strategy has evolved independently in many different taxa that are predated by species capable of UV perception. Moreover, there is considerable variation in how accurately Batesian mimics resemble their models across species. Our aim was to investigate how UV colour contributes to mimetic accuracy, using several ant-mimicking spider species as a case study. We measured the reflectance spectrum (300-700 nm) for several species of mimics and models, and we tested whether they differ in visible and UV colour. We modelled whether two different predators could discriminate between mimics and models using colour information. We found that ant-mimicking spiders generally differed significantly from their ant models in UV colour and that information from the visible range of light cannot be extrapolated into the UV. Our modelling suggested that wasps should be able to discriminate between mimics and models by combining information from the visible and UV light, whereas birds may not discriminate between them. Thus, we show that UV colour can influence mimetic accuracy, and we discuss its potential role in Batesian mimicry. We conclude that colour, especially in the UV range, should be taken into account when measuring mimetic accuracy.

  6. Calibration of multiple-choice questionnaires to assess quantitative indicators.

    PubMed

    Annoni, Paola; Ferrari, Pieralda

    2008-01-01

    The joint use of two latent factor methods is proposed to assess a measurement instrument for an underlying phenomenon. For this purpose, Rasch analysis is initially used to properly calibrate questionnaires, discarding non-informative variables and redundant categories. As a second step, an optimal scaling technique, Nonlinear PCA, is applied to quantify variable categories and to compute a continuous indicator. Specifically, the paper deals with the state of decay of Italian buildings of great architectural and historical interest, which serve as a case study. The decay level of the buildings is quantified on the basis of a broad set of observed ordinal variables, and the final indicator may be used independently for buildings inventoried in the future. Overall, the similarities and distinct potential of the techniques are analyzed and discussed, with the purpose of exploring the synergistic effect of their combined use.
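
    The Rasch step models each response through a logistic function of the latent trait; a one-line sketch of the dichotomous case, for illustration only:

```python
import math

def rasch_probability(theta, b):
    """Rasch model: probability of a positive response for a building with
    latent decay level theta on an item with difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))
```

    Items whose response probabilities barely change across the range of theta are the non-informative variables that the calibration step discards.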

  7. Quantitation of carboxyhaemoglobin in blood: external quality assessment of techniques.

    PubMed

    Barnett, K; Wilson, J F

    1998-06-01

    The performance of four dedicated carbon monoxide (CO)-oximeters (AVL, Chiron, IL, Radiometer), spectrophotometry with and without dithionite, spectrophotometry by second derivative, and the Whitehead and Worthington precipitation technique for the measurement of carboxyhaemoglobin in blood was compared by a mean of 136 participants in the United Kingdom National External Quality Assessment Scheme, using 21 samples formulated to contain from 4% to 48% carboxyhaemoglobin. The dedicated instruments and spectrophotometry by second derivative were of significantly higher precision than the other techniques, producing fewer measurements rejected as being > 3 standard deviations from the sample mean and having a lower standard deviation for non-rejected measurements. The AVL instrument and spectrophotometry by second derivative had a significant positive bias compared with the other techniques. The Whitehead and Worthington method was of unacceptably low precision.
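
    The "> 3 standard deviations" rejection rule is a standard EQA trimming procedure; a sketch with invented carboxyhaemoglobin readings containing one gross outlier:

```python
def trimmed_stats(values, k=3.0):
    """Iteratively reject measurements more than k standard deviations from
    the sample mean, then return the consensus mean, SD, and rejection count."""
    vals = list(values)
    while True:
        n = len(vals)
        mean = sum(vals) / n
        sd = (sum((v - mean) ** 2 for v in vals) / (n - 1)) ** 0.5
        kept = [v for v in vals if abs(v - mean) <= k * sd]
        if len(kept) == len(vals):
            return mean, sd, len(values) - len(vals)
        vals = kept

# Hypothetical % carboxyhaemoglobin readings with one gross outlier
sample = [23.5, 23.7, 23.8, 23.9, 24.0, 24.0, 24.0,
          24.1, 24.1, 24.2, 24.3, 24.4, 60.0]
mean, sd, n_rejected = trimmed_stats(sample)
```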

  8. Quantitative assessment of impedance tomography for temperature measurements in hyperthermia.

    PubMed

    Blad, B; Persson, B; Lindström, K

    1992-01-01

    The objective of this study is the non-invasive assessment of thermal dose in hyperthermia. Electrical impedance tomography (EIT) has previously been given a first trial as a temperature monitoring method alongside microwave-induced hyperthermia treatment, but it has not been thoroughly investigated. In the present work we examined this method in order to investigate the in vitro correlation between the true spatial temperature distribution and the corresponding measured relative resistivity changes. Different hyperthermia techniques, such as interstitial water tubing, microwave-induced heating, laser-induced heating, and ferromagnetic seeds, have been used. The results show that it is possible to find a correlation between the measured temperature values and the tomographically measured relative resistivity changes in tissue-equivalent phantoms. However, the observed uncertainty in the temperature coefficients shows that the method has to be improved before it can be applied clinically in vivo.
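
    The calibration described reduces to estimating a temperature coefficient, i.e. the slope of relative resistivity change against temperature; a least-squares sketch with invented phantom data:

```python
def temperature_coefficient(temps, rel_changes):
    """Least-squares slope of relative resistivity change (%) versus
    temperature (deg C)."""
    n = len(temps)
    mt = sum(temps) / n
    mc = sum(rel_changes) / n
    num = sum((t - mt) * (c - mc) for t, c in zip(temps, rel_changes))
    den = sum((t - mt) ** 2 for t in temps)
    return num / den

# Hypothetical phantom data: resistivity falls roughly 2% per deg C
temps   = [37.0, 39.0, 41.0, 43.0, 45.0]
changes = [0.0, -4.1, -7.9, -12.2, -15.8]
coeff = temperature_coefficient(temps, changes)
```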

  9. Assessing the reporting of categorised quantitative variables in observational epidemiological studies.

    PubMed

    Mabikwa, Onkabetse V; Greenwood, Darren C; Baxter, Paul D; Fleming, Sarah J

    2017-03-14

    One aspect to consider when reporting results of observational studies in epidemiology is how quantitative risk factors are analysed. The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines recommend that researchers describe how they handle quantitative variables when analysing data. For categorised quantitative variables, authors are required to provide the reasons and justifications informing their practice. We investigated and assessed the practices and reporting of categorised quantitative variables in epidemiology. The assessment was based on five medical journals that publish epidemiological research. Observational studies published between April and June 2015 and investigating the relationships between quantitative exposures (or risk factors) and outcomes were considered for assessment. A standard form was used to collect the data, and the reporting patterns amongst eligible studies were quantified and described. Out of 61 articles assessed for eligibility, 23 observational studies were included in the assessment. Categorisation of quantitative exposures occurred in 61% of these studies, and reasons informing the practice were rarely provided. Only one article explained the choice of categorisation in the analysis. Transformation of quantitative exposures into four or five groups was common and dominant amongst studies using equally spaced categories. Dichotomisation was not popular; the practice featured in only one article. Overall, the majority (86%) of the studies preferred ordered or arbitrary group categories. Other criteria used to decide category boundaries were based on established guidelines such as consensus statements and WHO standards. Categorisation of continuous variables remains a dominant practice in epidemiological studies. The reasons informing the practice of categorisation within published work are limited and remain unknown in most articles. 
The existing STROBE guidelines could provide stronger
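
    The dominant practice the survey documents, splitting a continuous exposure into equally sized ordered groups, can be sketched as follows (quartiles by default; a simplified illustration, not code from the study):

```python
def categorise(values, n_groups=4):
    """Assign each value to one of n_groups equally sized ordered groups."""
    ranked = sorted(values)
    cuts = [ranked[len(ranked) * i // n_groups] for i in range(1, n_groups)]
    def group(v):
        return sum(v >= c for c in cuts)   # group index 0 .. n_groups-1
    return [group(v) for v in values], cuts

groups, cuts = categorise(list(range(1, 21)))   # 20 exposure values
```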

  10. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates mean values of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that the fire fatality risk of CNG buses is about 2.5 times that of diesel buses, with bus passengers at more than two orders of magnitude greater risk. The study estimates a mean fire risk frequency of 2.2 x 10(-5) fatalities/bus per year. The 5% and 95% uncertainty bounds are 9.1 x 10(-6) and 4.0 x 10(-5), respectively. The risk result was found to be affected most by failure rates of pressure relief valves, CNG cylinders, and fuel piping.
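
    The headline comparison is simple arithmetic on the published point estimates:

```python
# Point estimates from the study (fatalities per 100-million miles)
cng_all, diesel_all = 0.23, 0.091
cng_pax, diesel_pax = 0.16, 0.0007

risk_ratio_all = cng_all / diesel_all   # ~2.5x for all people involved
risk_ratio_pax = cng_pax / diesel_pax   # >200x for bus passengers only
```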

  11. Mimicking the Moon

    NASA Image and Video Library

    2014-11-03

    When Galileo first observed Venus displaying a crescent phase, he excitedly wrote to Kepler (in anagram) of Venus mimicking the moon-goddess. He would have been delirious with joy to see Saturn and Titan, seen in this image, doing the same thing. More than just pretty pictures, high-phase observations -- taken looking generally toward the Sun, as in this image -- are very powerful scientifically since the way atmospheres and rings transmit sunlight is often diagnostic of compositions and physical states. In this example, Titan's crescent nearly encircles its disk due to the small haze particles high in its atmosphere refracting the incoming light of the distant Sun. This view looks toward the sunlit side of the rings from about 3 degrees above the ringplane. The image was taken in violet light with the Cassini spacecraft wide-angle camera on Aug. 11, 2013. The view was obtained at a distance of approximately 1.1 million miles (1.7 million kilometers) from Saturn and at a Sun-Saturn-spacecraft, or phase, angle of 154 degrees. Image scale is 64 miles (103 kilometers) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA18291

  12. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  13. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    USDA-ARS?s Scientific Manuscript database

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  15. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  17. Towards quantitative condition assessment of biodiversity outcomes: Insights from Australian marine protected areas.

    PubMed

    Addison, Prue F E; Flander, Louisa B; Cook, Carly N

    2017-08-01

    Protected area management effectiveness (PAME) evaluation is increasingly undertaken to evaluate governance, assess conservation outcomes and inform evidence-based management of protected areas (PAs). Within PAME, quantitative approaches to assess biodiversity outcomes are now emerging, where biological monitoring data are directly assessed against quantitative (numerically defined) condition categories (termed quantitative condition assessments). However, more commonly qualitative condition assessments are employed in PAME, which use descriptive condition categories and are evaluated largely with expert judgement that can be subject to a range of biases, such as linguistic uncertainty and overconfidence. Despite the benefits of increased transparency and repeatability of evaluations, quantitative condition assessments are rarely used in PAME. To understand why, we interviewed practitioners from all Australian marine protected area (MPA) networks, which have access to long-term biological monitoring data and are developing or conducting PAME evaluations. Our research revealed that there is a desire within management agencies to implement quantitative condition assessment of biodiversity outcomes in Australian MPAs. However, practitioners report many challenges in transitioning from qualitative to quantitative condition assessments of biodiversity outcomes, which are hampering progress. Challenges include a lack of agency capacity (staff numbers and money), knowledge gaps, and diminishing public and political support for PAs. We point to opportunities to target strategies that will assist agencies in overcoming these challenges, including new decision support tools, approaches to better finance conservation efforts, and to promote more management-relevant science. While a single solution is unlikely to achieve full evidence-based conservation, we suggest ways for agencies to target strategies and advance PAME evaluations toward best practice.
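
    A quantitative condition assessment replaces descriptive categories with explicit numeric cut-offs; a minimal sketch (the thresholds are hypothetical, not those of any Australian MPA network):

```python
# Hypothetical cut-offs for an indicator scaled 0-1 against a reference state
THRESHOLDS = [("good", 0.8), ("fair", 0.5), ("poor", 0.0)]

def condition_category(value, thresholds=THRESHOLDS):
    """Map a monitored biodiversity indicator onto a numerically defined
    condition category, making the evaluation transparent and repeatable."""
    for label, lower_bound in thresholds:
        if value >= lower_bound:
            return label
    return thresholds[-1][0]
```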

  18. Stepwise quantitative risk assessment as a tool for characterization of microbiological food safety.

    PubMed

    van Gerwen, S J; te Giffel, M C; van't Riet, K; Beumer, R R; Zwietering, M H

    2000-06-01

    This paper describes a system for the microbiological quantitative risk assessment for food products and their production processes. The system applies a stepwise risk assessment, allowing the main problems to be addressed before focusing on less important problems. First, risks are assessed broadly, using order of magnitude estimates. Characteristic numbers are used to quantitatively characterize microbial behaviour during the production process. These numbers help to highlight the major risk-determining phenomena, and to find negligible aspects. Second, the risk-determining phenomena are studied in more detail. Both general and/or specific models can be used for this, and varying situations can be simulated to quantitatively describe the risk-determining phenomena. Third, even more detailed studies can be performed where necessary, for instance by using stochastic variables. The system for quantitative risk assessment has been implemented as a decision supporting expert system called SIEFE: Stepwise and Interactive Evaluation of Food safety by an Expert System. SIEFE performs bacterial risk assessments in a structured manner, using various information sources. Because all steps are transparent, every step can easily be scrutinized. In the current study the effectiveness of SIEFE is shown for a cheese spread. With this product, quantitative data concerning the major risk-determining factors were not completely available to carry out a full detailed assessment. However, this did not necessarily hamper adequate risk estimation. Using ranges of values instead helped to identify the quantitatively most important parameters and the magnitude of their impact. This example shows that SIEFE provides quantitative insights into production processes and their risk-determining factors to both risk assessors and decision makers, and highlights critical gaps in knowledge.
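
    The "characteristic numbers" of the first, order-of-magnitude pass can be kept as log10 bookkeeping along the process chain; a sketch with hypothetical steps and values, not those of the cheese-spread case study:

```python
import math

def log10_level_through_process(initial_count, steps):
    """Track order-of-magnitude microbial levels through a process chain.
    Each step is (name, log10_change): growth positive, reduction negative."""
    level = math.log10(initial_count)
    trace = [("raw material", level)]
    for name, delta in steps:
        level += delta
        trace.append((name, level))
    return trace

# Hypothetical process: the heat step dominates, storage growth comes next
trace = log10_level_through_process(1e2, [
    ("heat treatment", -6.0),
    ("recontamination", 1.0),
    ("chilled storage", 2.0),
])
```

    Reading the trace immediately shows which steps are risk-determining and which are negligible, which is exactly the screening role the characteristic numbers play before detailed modelling.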

  19. Assessing the use of Quantitative Light-induced Fluorescence-Digital as a clinical plaque assessment.

    PubMed

    Han, Sun-Young; Kim, Bo-Ra; Ko, Hae-Youn; Kwon, Ho-Keun; Kim, Baek-Il

    2016-03-01

    The aims of this study were to compare the relationship between the red fluorescent plaque (RF plaque) area measured by Quantitative Light-induced Fluorescence-Digital (QLF-D) and the disclosed plaque area revealed by two-tone disclosure, and to assess the bacterial composition of the RF plaque by real-time PCR. Fifty healthy subjects were included, and 600 facial surfaces of their anterior teeth were examined. QLF-D images were taken on two separate occasions (before and after disclosing), and the RF plaque area was calculated based on the Plaque Percent Index (PPI). After disclosing, the stained plaque area was analyzed to investigate its relationship with the RF plaque area. The relationship was evaluated using Pearson correlation and the paired t-test. Then, RF and non-red fluorescent (non-RF) plaque samples were obtained from the same subjects for real-time PCR testing. In total, 10 plaque samples were compared for the proportions of 6 bacterial species using the Wilcoxon signed-rank test. In the paired t-test, the blue-staining plaque area (9.3±9.2) did not differ significantly from the RF plaque area (9.1±14.9, p=0.80) at ΔR20, whereas the red-staining plaque area (31.6±20.9) differed from the RF plaque area (p<0.0001). In addition, Prevotella intermedia and Streptococcus anginosus were substantially more abundant in the RF plaque than in the non-RF plaque (p<0.05). The plaque assessment method using QLF-D has the potential to detect mature plaque, and the RF plaque area was associated with the blue-staining area obtained with two-tone disclosure.
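
    The RF plaque area is derived pixel-wise from the red-to-green fluorescence gain; the sketch below uses a simplified stand-in for the ΔR statistic (an assumption, not the QLF-D vendor definition) and invented pixel intensities:

```python
def delta_r(red, green):
    """Percent red-to-green fluorescence gain of a pixel (a simplified
    stand-in for the QLF-D delta-R statistic)."""
    return (red / green - 1.0) * 100.0

def plaque_percent_index(pixels, threshold=20.0):
    """Share of tooth pixels whose delta-R meets a threshold (e.g. dR20)."""
    flagged = sum(1 for r, g in pixels if delta_r(r, g) >= threshold)
    return 100.0 * flagged / len(pixels)

# Hypothetical (red, green) intensities for four tooth pixels
ppi_dr20 = plaque_percent_index([(130, 100), (110, 100), (125, 100), (100, 100)])
```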

  20. Quantitative assessment of human fetal renal blood flow.

    PubMed

    Veille, J C; Hanson, R A; Tatum, K; Kelley, K

    1993-12-01

    Our purpose was to longitudinally quantify human fetal renal blood flow. Twenty-two normal fetuses underwent a color-pulsed Doppler evaluation of the renal artery. The Doppler waveforms were digitized to assess the velocity-time integral. The size of the vessel was determined during systole with color high-resolution two-dimensional ultrasonography. Renal blood flow was estimated by multiplying the time-velocity integral (i.e., area under the curve) by the area of the renal artery. The combined cardiac output was calculated by adding right and left inflow Doppler-derived volumes. Renal artery size, peak flow velocity, time-velocity integral, and renal blood flow significantly increased with advancing gestational age. The resistivity indexes, such as the systolic/diastolic ratio or the Pourcelot index of the fetal renal artery, did not significantly change with advancing gestational age. The pulsatility index, however, was correlated with gestational age. The percentage of the combined cardiac output to the fetal kidney remained constant throughout gestation. Color pulsed Doppler can be used to visualize small and deep vascular structures in the human fetus. Renal blood flow increased with advancing gestational age. This increase seems to be related to the increase in the combined cardiac output.
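
    The flow estimate multiplies the time-velocity integral by the vessel's cross-sectional area; scaling by heart rate (an assumption here) converts the per-beat volume into a per-minute flow. The input values below are illustrative only, not fetal measurements from the study:

```python
import math

def renal_artery_flow(vti_cm, diameter_cm, heart_rate_bpm):
    """Estimate flow (mL/min) as VTI x cross-sectional area x heart rate."""
    area_cm2 = math.pi * (diameter_cm / 2.0) ** 2
    stroke_volume_ml = vti_cm * area_cm2       # 1 cm^3 == 1 mL
    return stroke_volume_ml * heart_rate_bpm

flow = renal_artery_flow(vti_cm=10.0, diameter_cm=0.2, heart_rate_bpm=140)
```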

  1. Validation of a quantitative phosphorus loss assessment tool.

    PubMed

    White, Michael J; Storm, Daniel E; Smolen, Michael D; Busteed, Philip R; Zhang, Hailin; Fox, Garey A

    2014-01-01

    Pasture Phosphorus Management Plus (PPM Plus) is a tool that allows nutrient management and conservation planners to evaluate phosphorus (P) loss from agricultural fields. This tool uses a modified version of the widely used Soil and Water Assessment Tool model with a vastly simplified interface. The development of PPM Plus has been fully described in previous publications; in this article we evaluate the accuracy of PPM Plus using 286 field-years of runoff, sediment, and P validation data from runoff studies at various locations in Oklahoma, Texas, Arkansas, and Georgia. Land uses include pasture, small grains, and row crops, with rainfall ranging from 630 to 1390 mm per year, with and without animal manure application. PPM Plus explained 68% of the variability in total P loss, 56% in runoff, and 73% in sediment yield. An empirical model developed from these data using soil test P, total applied P, slope, and precipitation accounted for only 15% of the variability in total P loss, which implies that a process-based model is required to account for the diversity present in these data. PPM Plus is an easy-to-use conservation planning tool for P loss prediction, which, with modification, could be applicable at regional and national scales.
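
    "Percent of variability explained" is the coefficient of determination of model predictions against observations; a sketch with invented data, not the 286 field-years of the study:

```python
def r_squared(observed, predicted):
    """Coefficient of determination: fraction of observed variability
    explained by the predictions."""
    mean_obs = sum(observed) / len(observed)
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

# Hypothetical observed vs. predicted P losses (kg/ha)
obs  = [1.0, 2.0, 3.0, 4.0, 5.0]
pred = [1.1, 1.8, 3.2, 4.1, 4.6]
r2 = r_squared(obs, pred)
```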

  2. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-10-16

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To chart progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets; partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks.

  3. A Quantitative Measure of Handwriting Dysfluency for Assessing Tardive Dyskinesia

    PubMed Central

    Caligiuri, Michael P.; Teulings, Hans-Leo; Dean, Charles E.; Lohr, James B.

    2015-01-01

    Tardive dyskinesia (TD) is a movement disorder commonly associated with chronic exposure to antidopaminergic medications, which may in some cases be disfiguring and socially disabling. The consensus from a growing body of research on the incidence and prevalence of TD in the modern era of antipsychotics indicates that this disorder has not disappeared and continues to challenge the effective management of psychotic symptoms in patients with schizophrenia. A fundamental component in an effective strategy for managing TD is its reliable and accurate assessment. In the present study, we examined the clinical utility of a brief handwriting dysfluency measure for quantifying TD. Digitized samples of handwritten circles and loops were obtained from 62 psychosis patients with or without TD and from 50 healthy subjects. Two measures of dysfluent pen movements were extracted from each vertical pen stroke, including normalized jerk and the number of acceleration peaks. TD patients exhibited significantly higher dysfluency scores than non-TD patients and controls. Severity of handwriting movement dysfluency was correlated with AIMS severity ratings for some tasks. The procedure yielded high degrees of test-retest reliability. These results suggest that measures of handwriting movement dysfluency may be particularly useful for objectively evaluating the efficacy of pharmacotherapeutic strategies for treating TD. PMID:25679121

  4. Quantitative assessment of protein function prediction from metagenomics shotgun sequences.

    PubMed

    Harrington, E D; Singh, A H; Doerks, T; Letunic, I; von Mering, C; Jensen, L J; Raes, J; Bork, P

    2007-08-28

    To assess the potential of protein function prediction in environmental genomics data, we analyzed shotgun sequences from four diverse and complex habitats. Using homology searches as well as customized gene neighborhood methods that incorporate intergenic and evolutionary distances, we inferred specific functions for 76% of the 1.4 million predicted ORFs in these samples (83% when nonspecific functions are considered). Surprisingly, these fractions are only slightly smaller than the corresponding ones in completely sequenced genomes (83% and 86%, respectively, by using the same methodology) and considerably higher than previously thought. For as many as 75,448 ORFs (5% of the total), only neighborhood methods can assign functions, illustrated here by a previously undescribed gene associated with the well characterized heme biosynthesis operon and a potential transcription factor that might regulate a coupling between fatty acid biosynthesis and degradation. Our results further suggest that, although functions can be inferred for most proteins on earth, many functions remain to be discovered in numerous small, rare protein families.

  5. Quantitative Lateral Flow Assays for Salivary Biomarker Assessment: A Review

    PubMed Central

    Miočević, Olga; Cole, Craig R.; Laughlin, Mary J.; Buck, Robert L.; Slowey, Paul D.; Shirtcliff, Elizabeth A.

    2017-01-01

    Saliva is an emerging biofluid with a significant number of applications in use across research and clinical settings. The present paper explores the reasons why saliva has grown in popularity in recent years, balancing both the potential strengths and weaknesses of this biofluid. Focusing on reasons why saliva is different from other common biological fluids such as blood, urine, or tears, we review how saliva is easily obtained, with minimal risk to the donor, and reduced costs for collection, transportation, and analysis. We then move on to a brief review of the history and progress in rapid salivary testing, again reviewing the strengths and weaknesses of rapid immunoassays (e.g., lateral flow immunoassay) compared to more traditional immunoassays. We consider the potential for saliva as an alternative biofluid in a setting where rapid results are important. We focus the review on salivary tests for small molecule biomarkers using cortisol as an example. Such salivary tests can be applied readily in a variety of settings and for specific measurement purposes, providing researchers and clinicians with opportunities to assess biomarkers in real time with lower transportation, collection, and analysis costs, faster turnaround time, and minimal training requirements. We conclude with a note of cautious optimism that the field will soon gain the ability to collect and analyze salivary specimens at any location and return viable results within minutes. PMID:28660183

  6. Quantitative assessment of the serve speed in tennis.

    PubMed

    Vaverka, Frantisek; Cernosek, Miroslav

    2016-01-01

    A method is presented for assessing the serve speeds of tennis players based on their body height. The research involved a sample of top world players (221 males and 215 females) who participated in the Grand Slam tournaments in 2008 and 2012. The method is based on the linear regression analysis of the association between the player's body height and the serve speed (fastest serve, average first-serve, and second-serve speed). The coefficient of serve speed (CSS) was calculated as the quotient of the measured and the theoretical value of the serve speed on a regression line relative to the player's body height. A CSS of >1, =1, or <1 indicates an above-average, average, or below-average serve speed, respectively, relative to the top world tennis players with the same body height. The CSS adds a new element to the already existing statistics about a tennis match, and provides additional information about the performance of tennis players. The CSS can be utilised, e.g., for setting a target serve speed for a given player based on his/her body height, for choosing the most appropriate match strategy against a particular player, and for long-term monitoring of the effectiveness of training focused on the serve speed.
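
    The CSS computation described above amounts to an ordinary least-squares fit of serve speed on body height, followed by a measured-over-predicted ratio. A minimal sketch (function names and sample data are illustrative, not the study's):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return my - slope * mx, slope

def css(height_cm, speed_kmh, ref_heights, ref_speeds):
    """Coefficient of serve speed: measured speed divided by the
    regression-predicted speed for a player of this body height."""
    a, b = fit_line(ref_heights, ref_speeds)
    return speed_kmh / (a + b * height_cm)
```

A player whose measured serve is faster than the regression line predicts for their height gets a CSS above 1, matching the interpretation given in the abstract.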

  7. Quantitative assessment of gene expression network module-validation methods

    PubMed Central

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-01-01

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To display the framework progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets, and partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). The gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks. PMID:26470848

  8. Quantitative methods for environmental justice assessment of transportation

    PubMed

    Mills; Neuhauser

    2000-06-01

    Application of Executive Order 12898 to risk assessment of highway or rail transport of hazardous materials has proven difficult; in general, the location and conditions affecting the propagation of a plume of hazardous material released in a potential accident are unknown. Therefore, analyses have only been possible in a geographically broad or approximate manner. The advent of geographic information systems and development of software enhancements at Sandia National Laboratories have made kilometer-by-kilometer analysis of populations tallied by U.S. Census blocks along entire routes practicable. Tabulations of total or racially/ethnically distinct populations close to a route, its alternatives, or the broader surrounding area, can then be compared and differences evaluated statistically. This article presents methods of comparing populations and their racial/ethnic compositions using simple tabulations, histograms, and chi-square tests for statistical significance of differences found. Two examples of these methods are presented: comparison of two routes and comparison of a route with its surroundings.
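
    The tabulate-and-test procedure described above reduces to a Pearson chi-square statistic on a contingency table of population counts, e.g. one row per route and one column per racial/ethnic group. A minimal sketch with hypothetical counts (a full analysis would also compare the statistic against a chi-square critical value or p-value):

```python
def chi_square_stat(table):
    """Pearson chi-square statistic for an r x c contingency table.

    table: list of rows of counts (e.g. racial/ethnic population
    counts tallied along each route).  Returns (statistic, dof).
    """
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    grand = sum(row_tot)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_tot[i] * col_tot[j] / grand
            stat += (obs - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof
```

Two routes with identical demographic composition give a statistic of zero; the larger the statistic relative to the chi-square distribution with the returned degrees of freedom, the stronger the evidence that the routes differ.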

  9. A preliminary study for quantitative assessment of upper limb proprioception.

    PubMed

    Contu, Sara; Hussain, Asif; Masia, Lorenzo; Campolo, Domenico

    2016-08-01

    Proprioception, or sense of position and movement of the body, strongly correlates with motor recovery of the hemiplegic arm. The evaluation of the awareness of the location of joints in space involves measuring the accuracy of joint-angle replication. Robotic devices allow an accurate manipulation of joint movements necessary to assess proprioceptive status. This study evaluated the proprioceptive performance of healthy subjects by means of the H-Man, a planar robot designed for upper-limb rehabilitation, in order to gather preliminary normative data for neurorehabilitation applications. Twelve participants were equally divided into Aged and Young groups and were asked to indicate when their dominant hand position matched a predefined target in the contralateral, sagittal and ipsilateral direction. Results indicated a better performance for movements towards the contralateral target in terms of both absolute and signed error while there was not a significant effect of age group. Error variability was not affected by the target location and participants' age. The present study established preliminary proprioceptive metrics that could assist in providing information about the normal range of proprioceptive acuity of healthy subjects of different age.

  10. Quantitative assessment of the retinal microvasculature using optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Chu, Zhongdi; Lin, Jason; Gao, Chen; Xin, Chen; Zhang, Qinqin; Chen, Chieh-Li; Roisman, Luis; Gregori, Giovanni; Rosenfeld, Philip J.; Wang, Ruikang K.

    2016-06-01

    Optical coherence tomography angiography (OCTA) is clinically useful for the qualitative assessment of the macular microvasculature. However, there is a need for comprehensive quantitative tools to help objectively analyze the OCT angiograms. Few studies have reported the use of a single quantitative index to describe vessel density in OCT angiograms. In this study, we introduce a five-index quantitative analysis of OCT angiograms in an attempt to detect and assess vascular abnormalities from multiple perspectives. The indices include vessel area density, vessel skeleton density, vessel diameter index, vessel perimeter index, and vessel complexity index. We show the usefulness of the proposed indices with five illustrative cases. Repeatability is tested on both a healthy case and a stable diseased case, giving interclass coefficients smaller than 0.031. The results demonstrate that our proposed quantitative analysis may be useful as a complement to conventional OCTA for the diagnosis of disease and monitoring of treatment.
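
    Three of the five indices can be sketched on binary masks as simple pixel ratios. This toy version assumes a precomputed skeleton mask; real OCTA pipelines derive the skeleton from the vessel mask by morphological thinning, and the remaining two indices (perimeter and complexity) need boundary extraction that is omitted here:

```python
def vessel_indices(vessel_mask, skeleton_mask):
    """Toy versions of three OCTA quantitative indices on binary
    masks given as lists of 0/1 rows of equal length."""
    total = sum(len(row) for row in vessel_mask)
    vessel_px = sum(map(sum, vessel_mask))
    skel_px = sum(map(sum, skeleton_mask))
    vad = vessel_px / total    # vessel area density
    vsd = skel_px / total      # vessel skeleton density
    vdi = vessel_px / skel_px  # vessel diameter index (mean-width proxy)
    return vad, vsd, vdi
```

For a two-pixel-wide vessel the diameter index comes out as 2, illustrating why the skeleton-normalised index tracks calibre while the area and skeleton densities track how much of the field is vascularised.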

  11. Quantitative assessment of the retinal microvasculature using optical coherence tomography angiography

    PubMed Central

    Chu, Zhongdi; Lin, Jason; Gao, Chen; Xin, Chen; Zhang, Qinqin; Chen, Chieh-Li; Roisman, Luis; Gregori, Giovanni; Rosenfeld, Philip J.; Wang, Ruikang K.

    2016-01-01

    Optical coherence tomography angiography (OCTA) is clinically useful for the qualitative assessment of the macular microvasculature. However, there is a need for comprehensive quantitative tools to help objectively analyze the OCT angiograms. Few studies have reported the use of a single quantitative index to describe vessel density in OCT angiograms. In this study, we introduce a five-index quantitative analysis of OCT angiograms in an attempt to detect and assess vascular abnormalities from multiple perspectives. The indices include vessel area density, vessel skeleton density, vessel diameter index, vessel perimeter index, and vessel complexity index. We show the usefulness of the proposed indices with five illustrative cases. Repeatability is tested on both a healthy case and a stable diseased case, giving interclass coefficients smaller than 0.031. The results demonstrate that our proposed quantitative analysis may be useful as a complement to conventional OCTA for the diagnosis of disease and monitoring of treatment. PMID:27286188

  12. Quantitative assessment of corpus callosum morphology in periventricular nodular heterotopia.

    PubMed

    Pardoe, Heath R; Mandelstam, Simone A; Hiess, Rebecca Kucharsky; Kuzniecky, Ruben I; Jackson, Graeme D

    2015-01-01

    We investigated systematic differences in corpus callosum morphology in periventricular nodular heterotopia (PVNH). Differences in corpus callosum mid-sagittal area and subregional area changes were measured using an automated software-based method. Heterotopic gray matter deposits were automatically labeled and compared with corpus callosum changes. The spatial pattern of corpus callosum changes was interpreted in the context of the characteristic anterior-posterior development of the corpus callosum in healthy individuals. Individuals with periventricular nodular heterotopia were imaged at the Melbourne Brain Center or as part of the multi-site Epilepsy Phenome Genome project. Whole brain T1 weighted MRI was acquired in cases (n=48) and controls (n=663). The corpus callosum was segmented on the mid-sagittal plane using the software "yuki". Heterotopic gray matter and intracranial brain volume were measured using Freesurfer. Differences in corpus callosum area and subregional areas were assessed, as well as the relationship between corpus callosum area and heterotopic GM volume. The anterior-posterior distribution of corpus callosum changes and heterotopic GM nodules were quantified using a novel metric and compared with each other. Corpus callosum area was reduced by 14% in PVNH (p=1.59×10(-9)). The magnitude of the effect was least in the genu (7% reduction) and greatest in the isthmus and splenium (26% reduction). Individuals with higher heterotopic GM volume had a smaller corpus callosum. Heterotopic GM volume was highest in posterior brain regions, however there was no linear relationship between the anterior-posterior position of corpus callosum changes and PVNH nodules. Reduced corpus callosum area is strongly associated with PVNH, and is probably associated with abnormal brain development in this neurological disorder. The primarily posterior corpus callosum changes may inform our understanding of the etiology of PVNH. Our results suggest that

  13. Quantitative assessment of damage growth in graphite epoxy laminates by acousto-ultrasonic measurements

    NASA Technical Reports Server (NTRS)

    Talreja, R.; Govada, A.; Henneke, E. G., II

    1984-01-01

    The acousto-ultrasonic NDT method proposed by Vary (1976, 1978) for the quantitative assessment of damage growth in composite laminates can both respond to the development of damage states and furnish quantitative parameters that monitor this damage development. Attention is presently given to data obtained for the case of quasi-static loading and fatigue testing of graphite-epoxy laminates. The shape parameters of the power spectral density for the ultrasonic signals correlate well with such other indications of damage development as stiffness degradation.

  14. Real Time Quantitative Radiological Monitoring Equipment for Environmental Assessment

    SciTech Connect

    John R. Giles; Lyle G. Roybal; Michael V. Carpenter

    2006-03-01

    The Idaho National Laboratory (INL) has developed a suite of systems that rapidly scan, analyze, and characterize radiological contamination in soil. These systems have been successfully deployed at several Department of Energy (DOE) laboratories and Cold War Legacy closure sites. Traditionally, these systems have been used during the characterization and remediation of radiologically contaminated soils and surfaces; however, subsequent to the terrorist attacks of September 11, 2001, the applications of these systems have expanded to include homeland security operations for first response, continuing assessment and verification of cleanup activities in the event of the detonation of a radiological dispersal device. The core system components are a detector, a spectral analyzer, and a global positioning system (GPS). The system is computer controlled by menu-driven, user-friendly custom software designed for a technician-level operator. A wide variety of detectors have been used including several configurations of sodium iodide (NaI) and high-purity germanium (HPGe) detectors, and a large area proportional counter designed for the detection of x-rays from actinides such as Am-241 and Pu-238. Systems have been deployed from several platforms including a small all-terrain vehicle (ATV), hand-pushed carts, a backpack mounted unit, and an excavator mounted unit used where personnel safety considerations are paramount. The INL has advanced this concept, and expanded the system functionality to create an integrated, field-deployed analytical system through the use of tailored analysis and operations software. Customized, site specific software is assembled from a supporting toolbox of algorithms that streamline the data acquisition, analysis and reporting process. These algorithms include region specific spectral stripping, automated energy calibration, background subtraction, activity calculations based on measured detector efficiencies, and on-line data quality checks.

  15. Towards quantitative ecological risk assessment of elevated carbon dioxide levels in the marine environment.

    PubMed

    de Vries, Pepijn; Tamis, Jacqueline E; Foekema, Edwin M; Klok, Chris; Murk, Albertinka J

    2013-08-30

    The environmental impact of elevated carbon dioxide (CO2) levels has become of more interest in recent years, in relation to globally rising CO2 levels and related considerations of geological CO2 storage as a mitigating measure. In the present study effect data from literature were collected in order to conduct a marine ecological risk assessment of elevated CO2 levels, using a Species Sensitivity Distribution (SSD). It became evident that information currently available from the literature is mostly insufficient for such a quantitative approach. Most studies focus on effects of expected future CO2 levels, testing only one or two elevated concentrations. A full dose-response relationship, a uniform measure of exposure, and standardized test protocols are essential for conducting a proper quantitative risk assessment of elevated CO2 levels. Improvements are proposed to make future tests more valuable and usable for quantitative risk assessment. Copyright © 2013 Elsevier Ltd. All rights reserved.
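
    The SSD approach referred to above is commonly implemented by fitting a log-normal distribution to per-species effect levels and reading off a hazardous concentration such as the HC5, the level expected to affect 5% of species. A generic sketch of that recipe (not the authors' code; the data would be per-species CO2 effect levels in a uniform exposure measure):

```python
from math import exp, log
from statistics import NormalDist, mean, stdev

def ssd_hc5(effect_levels):
    """Fit a log-normal species sensitivity distribution to
    per-species effect levels and return the HC5: the exposure
    level hazardous to 5% of species."""
    logs = [log(c) for c in effect_levels]
    mu, sigma = mean(logs), stdev(logs)
    # 5th percentile of the fitted log-normal distribution
    return exp(NormalDist(mu, sigma).inv_cdf(0.05))
```

The abstract's central point follows directly from this sketch: without full dose-response data on a uniform exposure measure, the per-species effect levels needed as input to the fit simply are not available.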

  16. Quantitative image analysis in the assessment of diffuse large B-cell lymphoma.

    PubMed

    Chabot-Richards, Devon S; Martin, David R; Myers, Orrin B; Czuchlewski, David R; Hunt, Kristin E

    2011-12-01

    Proliferation rates in diffuse large B-cell lymphoma have been associated with conflicting outcomes in the literature, more often with high proliferation associated with poor prognosis. In most studies, the proliferation rate was estimated by a pathologist using an immunohistochemical stain for the monoclonal antibody Ki-67. We hypothesized that a quantitative image analysis algorithm would give a more accurate estimate of the proliferation rate, leading to better associations with survival. In all, 84 cases of diffuse large B-cell lymphoma were selected according to the World Health Organization criteria. Ki-67 percentage positivity estimated by the pathologist was recorded from the original report. The same slides were then scanned using an Aperio ImageScope, and Ki-67 percentage positivity was calculated using a computer-based quantitative immunohistochemistry nuclear algorithm. In addition, chart review was performed and survival time was recorded. The Ki-67 percentage estimated by the pathologist from the original report versus quantitative image analysis was significantly correlated (P<0.001), but pathologist Ki-67 percentages were significantly higher than quantitative image analysis values (P=0.021). There was less agreement at lower Ki-67 percentages. Comparison of Ki-67 percentage positivity versus survival did not show a significant association with either the pathologist estimate or quantitative image analysis. However, although not significant, there was a trend of worse survival at higher proliferation rates detected by the pathologist but not by quantitative image analysis. Interestingly, our data suggest that the Ki-67 percentage positivity as assessed by the pathologist may be more closely associated with survival outcome than that identified by quantitative image analysis. This may indicate that pathologists are better at selecting appropriate areas of the slide. More cases are needed to assess whether this finding would be statistically significant. Due to

  17. Assessment of pain and other patient symptoms in routine clinical care as quantitative, standardised, "scientific" data.

    PubMed

    Chua, Jacquelin R; Castrejon, Isabel; Pincus, Theodore

    2017-01-01

    Pain is the most common basis for visits to a rheumatologist, and reduction of pain is a primary goal of clinical care. Pain is assessed optimally by the patient on a self-report questionnaire. In clinical trials and other clinical research concerning pain and pain relief, detailed questionnaires are generally completed by patients. However, in routine clinical care, pain is generally assessed only according to narrative descriptions by the physician, and only a minority of settings assess pain using a standard, quantitative measure. Accurate, standard, quantitative assessment of pain in routine care is easily accomplished in all patients with all diagnoses on a 0-10 visual analogue scale (VAS), by asking each patient to complete a 2-page multidimensional health assessment questionnaire/routine assessment of patient index data 3 (MDHAQ/RAPID3) at all visits. The MDHAQ includes VAS for pain, patient global assessment, and fatigue, as well as a quantitative physical function scale, RAPID3, review of systems, and recent medical history. The questionnaire provides the doctor with a 10-15 second overview of medical history data that otherwise would require about 10-15 minutes of conversation, saving time for the doctor and patient to focus on the most prominent concerns for the visit. MDHAQ scores from patients with 10 different rheumatic diagnoses, and specific data indicating similarity of scores in patients with osteoarthritis versus rheumatoid arthritis on the same questionnaire, are presented to illustrate the value of the MDHAQ in routine care.

  18. A framework for organizing and selecting quantitative approaches for benefit-harm assessment

    PubMed Central

    2012-01-01

    Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. Conclusion The choice of

  19. A framework for organizing and selecting quantitative approaches for benefit-harm assessment.

    PubMed

    Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M

    2012-11-19

    Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the

  20. Assessment of extravascular lung water by quantitative ultrasound and CT in isolated bovine lung.

    PubMed

    Corradi, Francesco; Ball, Lorenzo; Brusasco, Claudia; Riccio, Anna Maria; Baroffio, Michele; Bovio, Giulio; Pelosi, Paolo; Brusasco, Vito

    2013-07-01

    Lung ultrasonography (LUS) and computed tomography (CT) were compared for quantitative assessment of extravascular lung water (EVLW) in 10 isolated bovine lung lobes. LUS and CT were obtained at different inflation pressures before and after instillation with known amounts of hypotonic saline. A video-based quantitative LUS analysis was superior to both single-frame quantitative analysis and visual scoring in the assessment of EVLW. Video-based mean LUS intensity was strongly correlated with EVLW density (r(2)=0.87) but weakly correlated with mean CT attenuation (r(2)=0.49) and physical density (r(2)=0.49). Mean CT attenuation was weakly correlated with EVLW density (r(2)=0.62) but strongly correlated with physical density (r(2)=0.99). When the effect of physical density was removed by partial correlation analysis, EVLW density was significantly correlated with video-based LUS intensity (r(2)=0.75) but not mean CT attenuation (r(2)=0.007). In conclusion, these findings suggest that quantitative LUS by video gray-scale analysis can assess EVLW more reliably than LUS visual scoring or quantitative CT. Copyright © 2013 Elsevier B.V. All rights reserved.
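
    The partial correlation analysis used above to remove the effect of physical density follows the standard first-order formula r_xy.z = (r_xy - r_xz*r_yz) / sqrt((1 - r_xz^2)(1 - r_yz^2)). A pure-Python sketch (illustrative; the study's actual variables are LUS intensity, CT attenuation, EVLW density, and physical density):

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

def partial_r(x, y, z):
    """Correlation of x and y with the linear effect of z removed."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / sqrt((1 - rxz ** 2) * (1 - ryz ** 2))
```

When neither x nor y is correlated with the control variable z, the partial correlation reduces to the plain Pearson correlation; when an apparent x-y correlation is carried entirely by z, it collapses toward zero, which is how the study separates EVLW from physical density.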

  1. Assessment of Scientific Literacy: Development and Validation of the Quantitative Assessment of Socio-Scientific Reasoning (QuASSR)

    ERIC Educational Resources Information Center

    Romine, William L.; Sadler, Troy D.; Kinslow, Andrew T.

    2017-01-01

    We describe the development and validation of the Quantitative Assessment of Socio-scientific Reasoning (QuASSR) in a college context. The QuASSR contains 10 polytomous, two-tiered items crossed between two scenarios, and is based on theory suggesting a four-pronged structure for SSR (complexity, perspective taking, inquiry, and skepticism). In…

  2. Assessment of Scientific Literacy: Development and Validation of the Quantitative Assessment of Socio-Scientific Reasoning (QuASSR)

    ERIC Educational Resources Information Center

    Romine, William L.; Sadler, Troy D.; Kinslow, Andrew T.

    2017-01-01

    We describe the development and validation of the Quantitative Assessment of Socio-scientific Reasoning (QuASSR) in a college context. The QuASSR contains 10 polytomous, two-tiered items crossed between two scenarios, and is based on theory suggesting a four-pronged structure for SSR (complexity, perspective taking, inquiry, and skepticism). In…

  3. Reliability of Quantitative Ultrasonic Assessment of Normal-Tissue Toxicity in Breast Cancer Radiotherapy

    SciTech Connect

    Yoshida, Emi J.; Chen Hao; Torres, Mylin; Andic, Fundagul; Liu Haoyang; Chen Zhengjia; Sun, Xiaoyan; Curran, Walter J.; Liu Tian

    2012-02-01

    Purpose: We have recently reported that ultrasound imaging, together with ultrasound tissue characterization (UTC), can provide quantitative assessment of radiation-induced normal-tissue toxicity. This study's purpose is to evaluate the reliability of our quantitative ultrasound technology in assessing acute and late normal-tissue toxicity in breast cancer radiotherapy. Method and Materials: Our ultrasound technique analyzes radiofrequency echo signals and provides quantitative measures of dermal, hypodermal, and glandular tissue toxicities. To facilitate easy clinical implementation, we further refined this technique by developing a semiautomatic ultrasound-based toxicity assessment tool (UBTAT). Seventy-two ultrasound studies of 26 patients (720 images) were analyzed. Images of 8 patients were evaluated for acute toxicity (<6 months postradiotherapy) and those of 18 patients were evaluated for late toxicity (≥6 months postradiotherapy). All patients were treated according to a standard radiotherapy protocol. To assess intraobserver reliability, one observer analyzed 720 images in UBTAT and then repeated the analysis 3 months later. To assess interobserver reliability, three observers (two radiation oncologists and one ultrasound expert) each analyzed 720 images in UBTAT. An intraclass correlation coefficient (ICC) was used to evaluate intra- and interobserver reliability. Ultrasound assessment and clinical evaluation were also compared. Results: Intraobserver ICC was 0.89 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.96 for glandular tissue toxicity. Interobserver ICC was 0.78 for dermal toxicity, 0.74 for hypodermal toxicity, and 0.94 for glandular tissue toxicity. Statistical analysis found significant changes in dermal (p < 0.0001), hypodermal (p = 0.0027), and glandular tissue (p < 0.0001) assessments in the acute toxicity group. Ultrasound measurements correlated with clinical Radiation Therapy Oncology Group (RTOG) toxicity scores of patients.
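    The intra- and interobserver reliability figures above are intraclass correlation coefficients. As an illustrative sketch (not the authors' code), a two-way random-effects, absolute-agreement, single-rater ICC(2,1) can be computed directly from the ANOVA decomposition of a subjects-by-raters matrix:

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_subjects, k_raters) array of scores.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means
    # Two-way ANOVA mean squares
    ms_rows = k * np.sum((row_means - grand) ** 2) / (n - 1)
    ms_cols = n * np.sum((col_means - grand) ** 2) / (k - 1)
    ss_err = (np.sum((x - grand) ** 2)
              - k * np.sum((row_means - grand) ** 2)
              - n * np.sum((col_means - grand) ** 2))
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)
```

    For perfectly agreeing raters the function returns 1.0; on the classic Shrout-Fleiss example data it reproduces the published ICC(2,1) of about 0.29.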

  4. The quantitative assessment of domino effects caused by overpressure. Part I. Probit models.

    PubMed

    Cozzani, Valerio; Salzano, Ernesto

    2004-03-19

    Accidents caused by the domino effect are among the most severe that have taken place in the chemical and process industries. However, a well-established and widely accepted methodology for the quantitative assessment of the contribution of domino accidents to industrial risk is still missing. Hence, available data on damage to process equipment caused by blast waves were revised in the framework of quantitative risk analysis, aiming at the quantitative assessment of domino effects caused by overpressure. Specific probit models were derived for several categories of process equipment and were compared to other literature approaches for predicting the probability of damage to equipment loaded by overpressure. The results demonstrate the importance of using equipment-specific models for the probability of damage and equipment-specific damage threshold values, rather than general equipment correlations, which may lead to errors of up to 500%.
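    A probit model of this kind maps peak overpressure P to a damage probability through Y = a + b·ln(P), with the classic convention that Y = 5 corresponds to 50% probability. A minimal sketch (the coefficients used below are hypothetical placeholders, not the paper's equipment-specific values):

```python
from math import erf, log, sqrt

def damage_probability(overpressure_kpa, a, b):
    """Probit damage model: Y = a + b*ln(P), probability = Phi(Y - 5).

    a, b are equipment-specific probit coefficients (hypothetical here);
    Phi is the standard normal CDF, built from math.erf.
    """
    y = a + b * log(overpressure_kpa)
    return 0.5 * (1.0 + erf((y - 5.0) / sqrt(2.0)))
```

    With illustrative coefficients a = -18, b = 5, the 50% damage point falls near 100 kPa, and the probability rises monotonically with overpressure.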

  5. High-throughput automated image analysis of neuroinflammation and neurodegeneration enables quantitative assessment of virus neurovirulence

    PubMed Central

    Maximova, Olga A.; Murphy, Brian R.; Pletnev, Alexander G.

    2010-01-01

    Historically, the safety of live attenuated vaccine candidates against neurotropic viruses was assessed by semi-quantitative analysis of virus-induced histopathology in the central nervous system of monkeys. We have developed a high-throughput automated image analysis (AIA) for the quantitative assessment of virus-induced neuroinflammation and neurodegeneration. Evaluation of the results generated by AIA showed that quantitative estimates of lymphocytic infiltration, microglial activation, and neurodegeneration strongly and significantly correlated with results of traditional histopathological scoring. In addition, we show that AIA is a targeted, objective, accurate, and time-efficient approach that provides reliable differentiation of virus neurovirulence. As such, it may become a useful tool in establishing consistent analytical standards across research and development laboratories and regulatory agencies, and may improve the safety evaluation of live virus vaccines. The implementation of this high-throughput AIA will markedly advance many fields of research including virology, neuroinflammation, neuroscience, and vaccinology. PMID:20688036

  6. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: 1. Point-of-departure metrics for quantitative dose-response analysis in genetic toxicology; 2. Measurement and estimation of exposures for better extrapolation to humans; and 3. The use of quantitative approaches in genetic toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe, and discuss in silico, in vitro, in vivo, and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  7. Anatomic and Quantitative Temporal Bone CT for Preoperative Assessment of Branchio-Oto-Renal Syndrome.

    PubMed

    Ginat, D T; Ferro, L; Gluth, M B

    2016-12-01

    We describe the temporal bone computed tomography (CT) findings of an unusual case of branchio-oto-renal syndrome with ectopic ossicles that are partially located in the middle cranial fossa. We also describe quantitative temporal bone CT assessment pertaining to cochlear implantation in the setting of anomalous cochlear anatomy associated with this syndrome.

  8. Effect of Teacher Specialized Training on Limited English Speaking Students' Assessment Outcomes: Quantitative Study

    ERIC Educational Resources Information Center

    Palaroan, Michelle A.

    2009-01-01

    The quantitative study was a comparison of Limited English Proficient (LEP) students' assessment outcomes when taught by a teacher with specialized training and when taught by teachers with no specialized training. The comparison of 2007-2008 Northern Nevada LEP third grade student scores in the content areas of English language arts and…

  9. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  10. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  12. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    USDA-ARS?s Scientific Manuscript database

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  14. QUANTITATIVE ASSESSMENT OF CORAL DISEASES IN THE FLORIDA KEYS: STRATEGY AND METHODOLOGY

    EPA Science Inventory

    Most studies of coral disease have focused on the incidence of a single disease within a single location. Our overall objective is to use quantitative assessments to characterize annual patterns in the distribution and frequency of scleractinian and gorgonian coral diseases over ...

  15. Understanding outbreaks of waterborne infectious disease: quantitative microbial risk assessment vs. epidemiology

    USDA-ARS?s Scientific Manuscript database

    Drinking water contaminated with microbial pathogens can cause outbreaks of infectious disease, and these outbreaks are traditionally studied using epidemiologic methods. Quantitative microbial risk assessment (QMRA) can predict – and therefore help prevent – such outbreaks, but it has never been r...

  16. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical garden.…

  19. Quantitative assessment method for computer-generated holograms free from the effect of viewpoint.

    PubMed

    Kiwaki, Taichi; Shimobaba, Tomoyoshi; Masuda, Nobuyuki; Ito, Tomoyoshi

    2010-04-01

    A quantitative assessment method for computer-generated holograms is presented. Our scheme is based on a simple evaluation quantity reflecting the optical radiating power from the holograms; this assures the overall validity of our method as a three-dimensional (3D) display assessment technique. Moreover, the effect of location from which the 3D view is observed is ruled out from the result. This contributes to both economy of computation and conciseness of the result.

  20. A comparison of visual and quantitative assessment of left ventricular ejection fraction by cardiac magnetic resonance.

    PubMed

    Holloway, Cameron J; Edwards, Lindsay M; Rider, Oliver J; Fast, Angela; Clarke, Kieran; Francis, Jane M; Myerson, Saul G; Neubauer, Stefan

    2011-04-01

    To determine the accuracy of visual analysis of left ventricular (LV) function in comparison with the accepted quantitative gold-standard method, cardiac magnetic resonance (CMR). Cine CMR imaging was performed at 1.5 T on 44 patients with a range of ejection fractions (EF, 5-80%). Clinicians (n = 18) were asked to visually assess EF after sequentially being shown cine images of a four-chamber (horizontal long axis; HLA) view, a two-chamber (vertical long axis; VLA) view, and a short-axis stack (SAS), and results were compared to a commercially available analysis package. There were strong correlations between visual and quantitative assessment. However, the EF was underestimated in all categories (by 8.4% for HLA, 8.4% for HLA + VLA, and 7.9% for HLA + VLA + SAS, all P < 0.01) and was particularly underestimated in mild LV impairment (17.4%, P < 0.01), less so for moderate (4.9%), and not for severe impairment (1%). Assessing more than one view of the heart improved visual assessment of LV EF; however, clinicians underestimated EF by 8.4% on average, with particular inaccuracy in those with mild dysfunction. Given the important clinical information provided by LV assessment, quantitative analysis is recommended for accurate assessment.

  1. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.

  2. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161

  4. Photon-tissue interaction model for quantitative assessment of biological tissues

    NASA Astrophysics Data System (ADS)

    Lee, Seung Yup; Lloyd, William R.; Wilson, Robert H.; Chandra, Malavika; McKenna, Barbara; Simeone, Diane; Scheiman, James; Mycek, Mary-Ann

    2014-02-01

    In this study, we describe a direct fit photon-tissue interaction model to quantitatively analyze reflectance spectra of biological tissue samples. The model rapidly extracts biologically-relevant parameters associated with tissue optical scattering and absorption. This model was employed to analyze reflectance spectra acquired from freshly excised human pancreatic pre-cancerous tissues (intraductal papillary mucinous neoplasm (IPMN), a common precursor lesion to pancreatic cancer). Compared to previously reported models, the direct fit model improved fit accuracy and speed. Thus, these results suggest that such models could serve as real-time, quantitative tools to characterize biological tissues assessed with reflectance spectroscopy.

  5. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, qualitative and quantitative risk assessment methods for urban natural gas pipeline networks are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index, and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis, and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion, and unconfined vapor cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision requirements of the risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
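    The qualitative method combines the three indices with their weights into a single risk value. One plausible reading is a weighted sum; a minimal sketch under that assumption (the weights and index values below are hypothetical, not the paper's calibrated values):

```python
def qualitative_risk_value(causation, inherent_risk, consequence,
                           weights=(0.3, 0.3, 0.4)):
    """Weighted combination of the three pipeline indices (each scaled 0-1).

    The default weights are illustrative placeholders; in practice they
    would come from the index system's calibration.
    """
    w_c, w_i, w_q = weights
    assert abs(w_c + w_i + w_q - 1.0) < 1e-9, "weights should sum to 1"
    return w_c * causation + w_i * inherent_risk + w_q * consequence
```

    A pipeline segment scoring 0.5 on every index then gets a risk value of 0.5, and the value scales linearly as any index worsens.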

  6. A novel tolerance range approach for the quantitative assessment of ecosystems.

    PubMed

    Hearnshaw, Edward J S; Hughey, Kenneth F D

    2012-03-15

    This paper develops a novel tolerance range approach that allows for the quantitative assessment of ecosystems with only a minimal amount of information. The quantitative assessment is achieved through the determination of tolerance range scores and indices that indicate the vulnerability of species. To demonstrate the tolerance range approach, an ecosystem assessment is performed on Te Waihora/Lake Ellesmere, a large shallow lake in the Canterbury region of New Zealand. The analysis of tolerance range scores and indices found that brown trout and lake-margin vegetation are the valued species most vulnerable to further degradation. This implies that management actions should prioritize preserving these species to maintain all valued species along sustainable pathways.

  7. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  8. Dirofilariasis Mimicking an Acute Scrotum.

    PubMed

    Bertozzi, Mirko; Rinaldi, Victoria Elisa; Prestipino, Marco; Giovenali, Paolo; Appignani, Antonino

    2015-10-01

    Human infections caused by Dirofilaria repens have been reported in many areas of the world. We describe the case of a 3-year-old child with an intrascrotal mass caused by D repens mimicking an acute scrotum. This represents the first case of scrotal dirofilariasis with such an unusual presentation described in a pediatric patient.

  9. Xanthogranulomatous cholecystitis mimicking gallbladder cancer.

    PubMed

    Ewelukwa, Ofor; Ali, Omair; Akram, Salma

    2014-05-08

    Xanthogranulomatous cholecystitis (XGC) is a benign, uncommon variant of chronic cholecystitis characterised by focal or diffuse destructive inflammatory process of the gallbladder (GB). Macroscopically, it appears like yellowish tumour-like masses in the wall of the GB. This article reports on a 74-year-old woman with XGC mimicking GB cancer.

  10. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications that is especially aimed at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to the selection of the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result was a coherent quantitative consumer phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
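    The workflow described, fitting distributions to survey responses and using the bootstrap to describe uncertainty, can be sketched as follows. The data here are simulated stand-ins for the survey's responses, and the lognormal choice is purely illustrative (the survey selected the most adequate distribution per variable):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical consumer storage times (days), standing in for survey data.
storage_days = rng.lognormal(mean=1.0, sigma=0.5, size=200)

# Lognormal MLE: mean and std of the log-transformed observations.
log_data = np.log(storage_days)
mu_hat = log_data.mean()
sigma_hat = log_data.std(ddof=1)

# Nonparametric bootstrap to describe uncertainty in the fitted parameter.
boot_mu = np.empty(1000)
for i in range(1000):
    resample = rng.choice(storage_days, size=storage_days.size, replace=True)
    boot_mu[i] = np.log(resample).mean()
ci_low, ci_high = np.percentile(boot_mu, [2.5, 97.5])
```

    The resulting point estimates capture variation over individuals, while the bootstrap percentile interval expresses the uncertainty of the fit, mirroring the variation/uncertainty separation the abstract describes.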

  11. Quantitative assessment of the percent fat in domestic animal bone marrow.

    PubMed

    Meyerholtz, Kimberly A; Wilson, Christina R; Everson, Robert J; Hooser, Stephen B

    2011-05-01

    Measurement of the amount of fat in femoral bone marrow can provide a quantitative assessment of the nutritional status of an individual animal. An analytical method is presented for quantitating the percent fat in bone marrow from three domestic species: bovine, canine, and equine. In this procedure, fat is extracted from bone marrow using pentane, and the percent fat recovered is determined gravimetrically. Based on analyses from adult animals (normal body condition scores), the average percentage of fat in the bone marrow was >80%. In cases in which animals have been diagnosed as emaciated or exhibit serous atrophy of fat (body scores of 1 or 2), the femoral bone marrow fat was less than 20%. In domestic animals, bone marrow fat analysis can be a useful, quantitative measure that, when used in conjunction with all other data available, can support a diagnosis of starvation or malnutrition. © 2011 American Academy of Forensic Sciences.
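    The gravimetric calculation itself is a simple mass ratio; a sketch, with the interpretation thresholds taken from the reference values quoted in the abstract:

```python
def percent_marrow_fat(fat_residue_g, marrow_sample_g):
    """Percent fat: mass of extracted fat over starting marrow mass, x100."""
    if marrow_sample_g <= 0:
        raise ValueError("marrow sample mass must be positive")
    return 100.0 * fat_residue_g / marrow_sample_g

def interpret(pct_fat):
    """Rough interpretation per the reported adult reference values."""
    if pct_fat > 80:
        return "consistent with normal body condition"
    if pct_fat < 20:
        return "consistent with emaciation or serous atrophy of fat"
    return "intermediate"
```

    For example, 0.85 g of fat recovered from a 1.0 g marrow sample gives 85% fat, within the normal adult range reported above.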

  12. An assessment of software solutions for the analysis of mass spectrometry based quantitative proteomics data.

    PubMed

    Mueller, Lukas N; Brusniak, Mi-Youn; Mani, D R; Aebersold, Ruedi

    2008-01-01

    Over the past decade, a series of experimental strategies for mass spectrometry based quantitative proteomics and corresponding computational methodology for the processing of the resulting data have been generated. We provide here an overview of the main quantification principles and available software solutions for the analysis of data generated by liquid chromatography coupled to mass spectrometry (LC-MS). Three conceptually different methods to perform quantitative LC-MS experiments have been introduced. In the first, quantification is achieved by spectral counting, in the second via differential stable isotopic labeling, and in the third by using the ion current in label-free LC-MS measurements. We discuss here advantages and challenges of each quantification approach and assess available software solutions with respect to their instrument compatibility and processing functionality. This review therefore serves as a starting point for researchers to choose an appropriate software solution for quantitative proteomic experiments based on their experimental and analytical requirements.
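    Spectral counting, the first quantification principle mentioned, is commonly normalized for protein length; the NSAF (Normalized Spectral Abundance Factor) metric is one widely used form. A sketch of that normalization (NSAF is a standard metric, but it is offered here as an illustration, not as the specific method of any package covered by the review):

```python
def nsaf(spectral_counts, protein_lengths):
    """Normalized Spectral Abundance Factor: (SpC/L) / sum over proteins.

    spectral_counts: spectra matched to each protein; protein_lengths:
    corresponding protein lengths (residues). Returns per-protein NSAF.
    """
    saf = [c / l for c, l in zip(spectral_counts, protein_lengths)]
    total = sum(saf)
    return [s / total for s in saf]
```

    Dividing counts by length corrects for longer proteins yielding more peptides, and the final division makes NSAF values comparable across runs because they sum to 1.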

  13. A quantitative approach to the intraoperative echocardiographic assessment of the mitral valve for repair.

    PubMed

    Mahmood, Feroze; Matyal, Robina

    2015-07-01

    Intraoperative echocardiography of the mitral valve has evolved from a qualitative assessment of flow-dependent variables to quantitative geometric analyses before and after repair. In addition, 3-dimensional echocardiographic data now allow for a precise assessment of mitral valve apparatus. Complex structures, such as the mitral annulus, can be interrogated comprehensively without geometric assumptions. Quantitative analyses of mitral valve apparatus are particularly valuable for identifying indices of left ventricular and mitral remodeling to establish the chronicity and severity of mitral regurgitation. This can help identify patients who may be unsuitable candidates for repair as the result of irreversible remodeling of the mitral valve apparatus. Principles of geometric analyses also have been extended to the assessment of repaired mitral valves. Changes in mitral annular shape and size determine the stress exerted on the mitral leaflets and, therefore, the durability of repair. Given this context, echocardiographers may be expected to diagnose and quantify valvular dysfunction, assess suitability for repair, assist in annuloplasty ring sizing, and determine the success and failure of the repair procedure. As a result, anesthesiologists have progressed from being mere service providers to participants in the decision-making process. It is therefore prudent for them to acquaint themselves with the principles of intraoperative quantitative mitral valve analysis to assist in rational and objective decision making.

  14. Quantitative Assessment of Flow Reduction After Feeder Embolization in Meningioma by Using Pseudocontinuous Arterial Spin Labeling.

    PubMed

    Wanibuchi, Masahiko; Komatsu, Katsuya; Akiyama, Yukinori; Mikami, Takeshi; Iihoshi, Satoshi; Miyata, Kei; Mikuni, Nobuhiro

    2016-09-01

    Meningioma is a hypervascular tumor of the central nervous system. Angiographic disappearance of tumor blush after preoperative feeder embolization allows qualitative, but not quantitative, assessment of flow reduction. Pseudocontinuous arterial spin labeling (PCASL), which has evolved from magnetic resonance imaging techniques, allows noninvasive measurement of cerebral blood flow (CBF) using water protons in the arterial blood flow. We applied PCASL for assessment of blood flow in meningioma and its reduction on preoperative embolization. Forty-one consecutive patients (11 males, 30 females) with histologically proven meningioma were evaluated by PCASL. Quantitative assessment by an absolute value of tumor blood flow (TBF) and a relative value of tumor vascular index (tVI; calculated as TBF divided by CBF) were calculated. In 8 cases, in which preoperative embolization was achieved, flow reduction rate was evaluated. TBF of meningiomas, 155.8 mL/100 g·min(-1) on average, was 2.6 times higher than CBF, 59.9 mL/100 g·min(-1) (P < 0.001). Patients who underwent feeder embolization showed statistically greater flow reduction rate, which was calculated as 42.7% (P < 0.05). Mean tVI before embolization was 4.1, which was reduced to 2.1 after embolization. PCASL could yield quantitative assessment of blood flow in meningioma including flow reduction rate in cases of feeder embolization. Copyright © 2016 Elsevier Inc. All rights reserved.
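    The two derived quantities in this study are simple ratios; a sketch using the group means reported in the abstract:

```python
def tumor_vascular_index(tbf, cbf):
    """tVI: tumor blood flow divided by cerebral blood flow.

    Both flows in mL/100 g/min, as measured by PCASL.
    """
    return tbf / cbf

def flow_reduction_pct(flow_pre, flow_post):
    """Percent flow reduction after feeder embolization."""
    return 100.0 * (flow_pre - flow_post) / flow_pre
```

    With the reported means, tumor_vascular_index(155.8, 59.9) gives roughly 2.6, matching the abstract; a drop from 100 to 57.3 mL/100 g/min corresponds to the 42.7% mean reduction quoted.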

  15. Image coregistration: quantitative processing framework for the assessment of brain lesions.

    PubMed

    Huhdanpaa, Hannu; Hwang, Darryl H; Gasparian, Gregory G; Booker, Michael T; Cen, Yong; Lerner, Alexander; Boyko, Orest B; Go, John L; Kim, Paul E; Rajamohan, Anandh; Law, Meng; Shiroishi, Mark S

    2014-06-01

    The quantitative, multiparametric assessment of brain lesions requires coregistering different parameters derived from MRI sequences. This is followed by analysis of the voxel values of the ROI within the sequences and calculated parametric maps, and by deriving multiparametric models to classify the imaging data. There is a need for an intuitive, automated quantitative processing framework that is generalized and adaptable to different clinical and research questions. As such flexible frameworks have not been previously described, we proceeded to construct a quantitative post-processing framework from commonly available software components. Matlab was chosen as the programming/integration environment, and SPM was chosen as the coregistration component. Matlab routines were created to extract and concatenate the coregistration transforms, take the coregistered MRI sequences as inputs to the process, allow specification of the ROI, and store the voxel values in a database for statistical analysis. The functionality of the framework was validated using brain tumor MRI cases. The implementation of this quantitative post-processing framework enables intuitive creation of multiple parameters for each voxel, facilitating near real-time, in-depth voxel-wise analysis. In our initial empirical evaluation, the framework increased both the use of analyses requiring post-processing and the number of simultaneous research activities by clinicians and researchers with non-technical backgrounds. We show that common software components can be utilized to implement an intuitive, real-time quantitative post-processing framework, resulting in improved scalability and increased adoption of the post-processing needed to answer important diagnostic questions.
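    The voxel-wise step this framework automates, pooling values from coregistered parametric maps within one ROI into a table for analysis, can be sketched with NumPy (the actual framework used Matlab/SPM; array shapes and map names here are hypothetical):

```python
import numpy as np

def roi_voxel_table(maps, roi_mask):
    """Stack coregistered parametric maps into a (n_voxels, n_params) table.

    maps     : dict of name -> 3-D array, all on the same coregistered grid
    roi_mask : boolean 3-D array selecting the region of interest
    """
    names = sorted(maps)
    table = np.column_stack([maps[n][roi_mask] for n in names])
    return names, table

# Toy example: two 4x4x4 "parametric maps" and an 8-voxel ROI
rng = np.random.default_rng(0)
maps = {"ADC": rng.random((4, 4, 4)), "rCBV": rng.random((4, 4, 4))}
mask = np.zeros((4, 4, 4), dtype=bool)
mask[1:3, 1:3, 1:3] = True
names, table = roi_voxel_table(maps, mask)
print(names, table.shape)  # ['ADC', 'rCBV'] (8, 2)
```

    Each row of the table is one voxel across all parameters, the form needed for the multiparametric statistical analysis described above.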

  16. Relationship between Plaque Echo, Thickness and Neovascularization Assessed by Quantitative and Semi-quantitative Contrast-Enhanced Ultrasonography in Different Stenosis Groups.

    PubMed

    Song, Yan; Feng, Jun; Dang, Ying; Zhao, Chao; Zheng, Jie; Ruan, Litao

    2017-09-28

    The aim of this study was to determine the relationship between plaque echo, thickness and neovascularization in different stenosis groups using quantitative and semi-quantitative contrast-enhanced ultrasound (CEUS) in patients with carotid atherosclerosis plaque. A total of 224 plaques were divided into mild stenosis (<50%; 135 plaques, 60.27%), moderate stenosis (50%-69%; 39 plaques, 17.41%) and severe stenosis (70%-99%; 50 plaques, 22.32%) groups. Quantitative and semi-quantitative methods were used to assess plaque neovascularization and determine the relationship between plaque echo, thickness and neovascularization. Correlation analysis revealed no relationship of neovascularization with plaque echo in the groups using either quantitative or semi-quantitative methods. Furthermore, there was no correlation of neovascularization with plaque thickness using the semi-quantitative method. The ratio of areas under the curve (RAUC) was negatively correlated with plaque thickness (r = -0.317, p = 0.001) in the mild stenosis group. With the quartile method, plaque thickness of the mild stenosis group was divided into four groups, with significant differences between the 1.5-2.2 mm and ≥3.5 mm groups (p = 0.002), 2.3-2.8 mm and ≥3.5 mm groups (p < 0.001) and 2.9-3.4 mm and ≥3.5 mm groups (p < 0.001). Both semi-quantitative and quantitative CEUS methods characterizing neovascularization of plaque are equivalent with respect to assessing relationships between neovascularization, echogenicity and thickness. However, the quantitative method could fail for plaque <3.5 mm because of motion artifacts. Copyright © 2017 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.
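    The quartile method used above to split plaque thickness into four groups is a standard percentile cut; a minimal NumPy sketch with hypothetical thickness values:

```python
import numpy as np

def quartile_groups(values):
    """Assign each value to one of four groups (0-3) split at the quartiles."""
    q1, q2, q3 = np.percentile(values, [25, 50, 75])
    return np.digitize(values, [q1, q2, q3])

thickness = np.array([1.5, 2.0, 2.3, 2.6, 2.9, 3.2, 3.5, 4.0])  # mm, hypothetical
groups = quartile_groups(thickness)
print(groups)  # [0 0 1 1 2 2 3 3]
```

    Between-group differences (e.g. the p-values quoted above) would then be tested group against group.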

  17. Quantitative Assessment of In-solution Digestion Efficiency Identifies Optimal Protocols for Unbiased Protein Analysis*

    PubMed Central

    León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.

    2013-01-01

    The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921

  18. Quantitative assessment of RNA-protein interactions with high-throughput sequencing-RNA affinity profiling.

    PubMed

    Ozer, Abdullah; Tome, Jacob M; Friedman, Robin C; Gheba, Dan; Schroth, Gary P; Lis, John T

    2015-08-01

    Because RNA-protein interactions have a central role in a wide array of biological processes, methods that enable a quantitative assessment of these interactions in a high-throughput manner are in great demand. Recently, we developed the high-throughput sequencing-RNA affinity profiling (HiTS-RAP) assay that couples sequencing on an Illumina GAIIx genome analyzer with the quantitative assessment of protein-RNA interactions. This assay is able to analyze interactions between one or possibly several proteins with millions of different RNAs in a single experiment. We have successfully used HiTS-RAP to analyze interactions of the EGFP and negative elongation factor subunit E (NELF-E) proteins with their corresponding canonical and mutant RNA aptamers. Here we provide a detailed protocol for HiTS-RAP that can be completed in about a month (8 d hands-on time). This includes the preparation and testing of recombinant proteins and DNA templates, clustering DNA templates on a flowcell, HiTS and protein binding with a GAIIx instrument, and finally data analysis. We also highlight aspects of HiTS-RAP that can be further improved and points of comparison between HiTS-RAP and two other recently developed methods, quantitative analysis of RNA on a massively parallel array (RNA-MaP) and RNA Bind-n-Seq (RBNS), for quantitative analysis of RNA-protein interactions.

  19. [Application of uncertainty assessment in NIR quantitative analysis of traditional Chinese medicine].

    PubMed

    Xue, Zhong; Xu, Bing; Liu, Qian; Shi, Xin-Yuan; Li, Jian-Yu; Wu, Zhi-Sheng; Qiao, Yan-Jiang

    2014-10-01

    The near infrared (NIR) spectra of Liuyi San samples were collected during the mixing process, and quantitative models were generated by the PLS (partial least squares) method for quantification of the concentration of glycyrrhizin. The PLS quantitative model had good calibration and prediction performance (r(cal) = 0.9985, RMSEC = 0.044 mg·g(-1); r(val) = 0.9474, RMSEP = 0.124 mg·g(-1)), indicating that NIR spectroscopy can be used as a rapid determination method for the concentration of glycyrrhizin in Liuyi San powder. After the validation tests were designed, the Liao-Lin-Iyer approach based on Monte Carlo simulation was used to estimate β-content-γ-confidence tolerance intervals. The uncertainty was then calculated and the uncertainty profile drawn. The NIR analytical method was considered valid when the concentration of glycyrrhizin is above 1.56 mg·g(-1), since the uncertainty fell within the acceptable limits (λ = ± 20%). The results showed that uncertainty assessment can be used in NIR quantitative models of glycyrrhizin at different concentrations and provide a reference for uncertainty assessment of NIR quantitative analysis of other traditional Chinese medicines.
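    The figures of merit quoted above (r, RMSEC, RMSEP) are Pearson correlations and root-mean-square errors between NIR-predicted and reference glycyrrhizin concentrations; a minimal sketch of how they are computed (the concentration values below are hypothetical):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error: RMSEC on the calibration set, RMSEP on the prediction set."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def pearson_r(y_true, y_pred):
    """Correlation between reference and predicted concentrations."""
    return float(np.corrcoef(y_true, y_pred)[0, 1])

# Hypothetical reference vs. PLS-predicted glycyrrhizin (mg/g)
ref  = [1.2, 1.6, 2.0, 2.4, 2.8]
pred = [1.25, 1.55, 2.05, 2.35, 2.85]
print(round(rmse(ref, pred), 3))  # 0.05
print(round(pearson_r(ref, pred), 4))
```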

  20. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, transmission pipelines for natural gas can cause serious environmental pollution and loss of life and property in the event of an accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and establishing an effective quantitative risk assessment model of third party damage is very important for reducing the number of pipeline operation accidents. Because third party damage accidents are characterized by diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). Firstly, the risk sources of third party damage are identified exactly; then the weights of the factors are determined via an improved AHP; finally, the importance of each factor is calculated by the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for the third party damage of natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
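    The two computational steps, AHP weighting and fuzzy comprehensive evaluation, can be sketched as follows (the pairwise-comparison and membership matrices below are hypothetical illustrations, not the paper's data):

```python
import numpy as np

def ahp_weights(pairwise):
    """AHP priority weights: normalized principal eigenvector of the pairwise-comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def fuzzy_evaluate(weights, membership):
    """Fuzzy comprehensive evaluation: normalized weighted aggregation of membership degrees."""
    b = weights @ membership
    return b / b.sum()

# Three hypothetical third-party-damage risk factors, pairwise judgments
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
# Membership of each factor in risk grades (low, medium, high)
R = np.array([[0.1, 0.3, 0.6],
              [0.3, 0.5, 0.2],
              [0.6, 0.3, 0.1]])
w = ahp_weights(A)
b = fuzzy_evaluate(w, R)
print(np.round(w, 3))  # factor weights, largest for the first factor
print(np.round(b, 3))  # overall membership in (low, medium, high) risk
```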

  1. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    PubMed

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist to apply sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used, an ANOVA-like model and Sobol sensitivity indices, to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from consumption of deli meats.
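    The two-dimensional structure, an outer loop over uncertainty and an inner loop over variability, can be sketched as follows (all distributions and the dose-response model are hypothetical placeholders, not the case study's):

```python
import numpy as np

rng = np.random.default_rng(1)

def two_dimensional_mc(n_unc=200, n_var=1000):
    """Outer loop samples uncertain parameters; inner loop samples variability.

    Returns one risk estimate per uncertainty iteration: each estimate
    integrates over variability, and the spread of the returned array
    reflects uncertainty alone.
    """
    risks = np.empty(n_unc)
    for i in range(n_unc):
        # Uncertainty: imperfect knowledge of the mean log10 contamination
        mu = rng.normal(2.0, 0.3)
        # Variability: serving-to-serving spread of the ingested dose
        log_dose = rng.normal(mu, 0.8, size=n_var)
        # Hypothetical exponential dose-response model
        p_ill = 1.0 - np.exp(-1e-4 * 10.0 ** log_dose)
        risks[i] = p_ill.mean()
    return risks

risks = two_dimensional_mc()
print(np.percentile(risks, [5, 50, 95]))  # uncertainty interval on the mean risk
```

    The proposed sensitivity procedure then relates these risk estimates back to where each pseudo-random sample fell within its uncertainty or variability distribution.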

  2. Purity assessment problem in quantitative NMR--impurity resonance overlaps with monitor signal multiplets from stereoisomers.

    PubMed

    Malz, Frank; Jancke, Harald

    2006-06-01

    This paper describes the situation that can emerge when the signals to be evaluated in quantitative NMR measurements, the so-called "monitor signals", consist of several resonance lines from the stereoisomers of the analyte in addition to an impurity signal underneath. The monitor signal problem is demonstrated in the purity assessment of two samples of 2-(isopropylamino)-4-(ethylamino)-6-chloro-1,3,5-triazine (atrazine), a common herbicide that served as analyte in a CCQM intercomparison. It is shown that, in DMSO-d6 solution, a mixture of stereoisomers leads to several individual overlapping singlets, which are further split by spin-spin coupling. A measurement protocol was developed for finding and identifying an impurity whose signal lies precisely beneath the methyl signal chosen as the monitor signal in one of the samples. Quantitative NMR purity assessment is still possible in this special case, but with higher uncertainty.

  3. Rapid quantitative assessment of visible injury to vegetation and visual amenity effects of fluoride air pollution.

    PubMed

    Doley, D

    2010-01-01

    Quantitative measures of visible injury are proposed for the protection of the aesthetic acceptability and health of ecosystems. Visible indications of air pollutant injury symptoms can be assessed rapidly and economically over large areas of mixed species such as native ecosystems. Reliable indication requires close attention to the criteria for assessment, species selection, and the influence of other environmental conditions on plant response to a pollutant. The estimation of fluoride-induced visible injury in dicotyledonous species may require techniques that are more varied than the measurement of necrosis in linear-leaved monocotyledons and conifers. A scheme is described for quantitative estimates of necrosis, chlorosis and deformation of leaves using an approximately geometric series of injury categories that permits rapid and sufficiently consistent determination and recognises degrees of aesthetic offence associated with foliar injury to plants.

  4. The quantitative assessment of domino effect caused by overpressure. Part II. Case studies.

    PubMed

    Cozzani, Valerio; Salzano, Ernesto

    2004-03-19

    A quantitative assessment of the contribution to industrial risk of domino effect due to overpressure was undertaken by using the damage probability models developed in part I. Two case studies derived from the actual lay-out of an oil refinery were analyzed. Individual and societal risk indexes were estimated both in the absence and in the presence of the domino effects caused by overpressure. An increase of individual risk up to an order of magnitude was found when considering domino effects.

  5. Postoperative Quantitative Assessment of Reconstructive Tissue Status in Cutaneous Flap Model using Spatial Frequency Domain Imaging

    PubMed Central

    Yafi, Amr; Vetter, Thomas S; Scholz, Thomas; Patel, Sarin; Saager, Rolf B; Cuccia, David J; Evans, Gregory R; Durkin, Anthony J

    2010-01-01

    Background The purpose of this study is to investigate the capabilities of a novel optical wide-field imaging technology known as Spatial Frequency Domain Imaging (SFDI) to quantitatively assess reconstructive tissue status. Methods Twenty-two cutaneous pedicle flaps were created on eleven rats based on the inferior epigastric vessels. After baseline measurement, all flaps underwent vascular ischemia, induced by clamping the supporting vessels for two hours (either arterio-venous or selective venous occlusion). Normal saline was injected into the control flap, and hypertonic hyperoncotic saline solution into the experimental flap. Flaps were monitored for two hours after reperfusion. The SFDI system was used for quantitative assessment of flap status over the duration of the experiment. Results All flaps demonstrated a significant decline in oxy-hemoglobin and tissue oxygen saturation in response to occlusion. Total hemoglobin and deoxy-hemoglobin were markedly increased in the selective venous occlusion group. After reperfusion and administration of the solutions, oxy-hemoglobin and tissue oxygen saturation in the flaps that survived gradually returned to baseline levels. However, flaps whose oxy-hemoglobin and tissue oxygen saturation did not show any signs of recovery appeared to be compromised and eventually became necrotic within 24-48 hours in both occlusion groups. Conclusion SFDI technology provides a quantitative, objective method to assess tissue status. This study demonstrates the potential of this optical technology to assess tissue perfusion in a precise and quantitative way, enabling wide-field visualization of physiological parameters. The results of this study suggest that SFDI may provide a means for prospectively identifying dysfunctional flaps well in advance of failure. PMID:21200206

  6. Quantitative assessment of radiation force effect at the dielectric air-liquid interface

    PubMed Central

    Capeloto, Otávio Augusto; Zanuto, Vitor Santaella; Malacarne, Luis Carlos; Baesso, Mauro Luciano; Lukasievicz, Gustavo Vinicius Bassi; Bialkowski, Stephen Edward; Astrath, Nelson Guilherme Castelli

    2016-01-01

    We induce nanometer-scale surface deformation by exploiting momentum conservation of the interaction between laser light and dielectric liquids. The effect of radiation force at the air-liquid interface is quantitatively assessed for fluids with different density, viscosity and surface tension. The imparted pressure on the liquids by continuous or pulsed laser light excitation is fully described by the Helmholtz electromagnetic force density. PMID:26856622

  7. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Products Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.

  8. The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses

    NASA Astrophysics Data System (ADS)

    Kirkman, Thomas W.; Jensen, Ellen

    2016-06-01

    The innumeracy of American students and adults is a much lamented educational problem. The quantitative reasoning skills of college students may be particularly addressed and improved in "general education" science courses like Astro 101. Demonstrating improvement requires a standardized instrument. Among the non-proprietary instruments, the Quantitative Literacy and Reasoning Assessment (QLRA) [1] and the Quantitative Reasoning for College Science (QuaRCS) Assessment [2] stand out. Follette et al. developed the QuaRCS in the context of Astro 101 at the University of Arizona. We report on QuaRCS results in different contexts: pre-med physics and pre-nursing microbiology at a liberal arts college. We report on the mismatch between students' contemporaneous reports of a question's difficulty and the actual probability of success. We report correlations between the QuaRCS and other assessments of overall student performance in the class. We report differences in attitude towards mathematics in these two different but health-related student populations. [1] QLRA: Gaze et al., 2014, DOI: http://dx.doi.org/10.5038/1936-4660.7.2.4 [2] QuaRCS: Follette et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2

  9. Synthesized quantitative assessment of human mental fatigue with EEG and HRV

    NASA Astrophysics Data System (ADS)

    Han, Qingpeng; Wang, Li; Wang, Ping; Wen, Bangchun

    2005-12-01

    The electroencephalogram (EEG) signals and heart rate variability (HRV) signals, which are related to mental stress, are analyzed with nonlinear dynamics and chaos theory. Based on three calculated nonlinear parameters, a synthesized quantitative criterion is proposed to assess the body's mental fatigue state. Firstly, the HRV and the α wave of the EEG are extracted from the original signals using the wavelet transform. Then three nonlinear parameters, the largest Lyapunov exponent, complexity and approximate entropy, are calculated for both the HRV and the α wave. These parameters quantitatively reflect human physiological activity and can be used to evaluate the degree of mental workload. Based on computation and statistical analysis of practical EEG and HRV data, a synthesized quantitative assessment criterion for mental fatigue is induced from the three nonlinear parameters of the two rhythms. For 10 measured EEG and HRV datasets, assessment results were obtained with the above criteria for different mental fatigue states. Compared with the actual cases, the accuracy of identifying the presence or absence of mental fatigue reached 100 percent. Furthermore, the accuracies for weak, middle and serious fatigue mental workload are about 94.44, 88.89 and 83.33 percent, respectively.
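    Of the three nonlinear parameters, approximate entropy has the most compact definition; a minimal sketch of the standard ApEn computation (the parameter choices m = 2 and r = 0.2·SD are conventional assumptions, not taken from the paper):

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """ApEn(m, r) of a 1-D signal, with r a fraction of the signal's standard deviation.

    Lower values indicate a more regular, predictable signal.
    """
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])             # embedded vectors
        dist = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)  # Chebyshev distances
        c = (dist <= r).mean(axis=1)                                # fraction of matches
        return np.mean(np.log(c))

    return phi(m) - phi(m + 1)

regular = np.sin(np.linspace(0, 20 * np.pi, 400))          # highly regular rhythm
noisy = np.random.default_rng(0).normal(size=400)          # irregular signal
print(approximate_entropy(regular) < approximate_entropy(noisy))  # True
```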

  10. The cutaneous manifestations and common mimickers of physical child abuse.

    PubMed

    Mudd, Shawna S; Findlay, Jeanne S

    2004-01-01

    The cutaneous manifestations of physical child abuse are some of the most common and easily recognized forms of injury. To make an accurate assessment and diagnosis, it is important to differentiate between inflicted cutaneous injuries and mimickers of physical abuse. Likewise, an understanding of reporting guidelines helps guide practitioners in their decision making.

  11. Dual mode diffraction phase microscopy for quantitative functional assessment of biological cells

    NASA Astrophysics Data System (ADS)

    Talaikova, N. A.; Popov, A. P.; Kalyanov, A. L.; Ryabukho, V. P.; Meglinski, I. V.

    2017-10-01

    A diffraction phase microscopy approach with a combined use of transmission and reflection imaging modes has been developed and applied for non-invasive quantitative assessment of the refractive index of red blood cells (RBCs). We present the theoretical background of signal formation for both imaging modes, accompanied by the results of experimental studies. We demonstrate that simultaneous use of the two modes has great potential for accurate assessment of the refractive index of biological cells, and we perform a reconstruction of spatial distribution of the refractive index of RBC in 3D.
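    In transmission mode, the measured phase shift relates to the cell's integral refractive index through the standard relation Δφ = (2π/λ)(n_cell − n_medium)·h; a minimal sketch of the inversion (all numeric values below are hypothetical, and the medium index is an assumed plasma-like value):

```python
import numpy as np

def cell_refractive_index(phase_shift, thickness, wavelength, n_medium=1.335):
    """Recover a cell's mean refractive index from a transmission phase shift (radians).

    Assumes the cell thickness h is known independently, e.g. from the
    reflection-mode channel of a dual-mode measurement.
    """
    return n_medium + phase_shift * wavelength / (2.0 * np.pi * thickness)

# Hypothetical RBC: 2 um thick, 633 nm illumination, 1 rad phase shift
n = cell_refractive_index(phase_shift=1.0, thickness=2e-6, wavelength=633e-9)
print(round(n, 4))  # 1.3854
```

    This illustrates why a second imaging mode helps: phase alone conflates refractive index and thickness, so an independent thickness estimate is needed to separate them.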

  12. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
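    The algorithm's core, propagating probabilistic resource inputs through a quantified impact relationship by Monte Carlo to obtain probabilistic outputs, can be sketched as follows (all distributions and per-unit factors are hypothetical illustrations in the spirit of the gas/habitat example):

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo iterations

# Probabilistic resource assessment: lognormal undiscovered gas volume (Bcf)
gas_volume = rng.lognormal(mean=np.log(500), sigma=0.5, size=N)

# Development alternative: wells required per Bcf, uniform over an assumed range
wells_per_bcf = rng.uniform(0.05, 0.15, size=N)

# Quantified impact relationship: habitat disturbed per well pad (ha), triangular
ha_per_well = rng.triangular(1.0, 2.0, 4.0, size=N)

# Probabilistic output: distribution of possible future habitat impact
habitat_impact = gas_volume * wells_per_bcf * ha_per_well
p5, p50, p95 = np.percentile(habitat_impact, [5, 50, 95])
print(f"habitat impact (ha): 5th={p5:.0f}  median={p50:.0f}  95th={p95:.0f}")
```

    Reporting percentiles rather than a single number is what lets the output convey the uncertainties inherent in the input data.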

  13. A Framework for Quantitative Assessment of Impacts Related to Energy and Mineral Resource Development

    SciTech Connect

    Haines, Seth S.; Diffendorfer, Jay E.; Balistrieri, Laurie; Berger, Byron; Cook, Troy; DeAngelis, Don; Doremus, Holly; Gautier, Donald L.; Gallegos, Tanya; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen M.; Macknick, Jordan E.; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katie

    2013-05-15

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and pinon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. In conclusion, the framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.

  14. A Framework for Quantitative Assessment of Impacts Related to Energy and Mineral Resource Development

    DOE PAGES

    Haines, Seth S.; Diffendorfer, Jay E.; Balistrieri, Laurie; ...

    2013-05-15

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and pinon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. In conclusion, the framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.

  15. A Framework for Quantitative Assessment of Impacts Related to Energy and Mineral Resource Development

    SciTech Connect

    Macknick, Jordan E; Haines, Seth S.; Diffendorfer, Jay E.; Balistrieri, Laurie; Berger, Byron; Cook, Troy; DeAngelis, Don; Doremus, Holly; Gautier, Donald L.; Gallegos, Tanya; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen M.; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katie

    2013-05-15

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and pinon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.

  16. Sugar concentration in nectar: a quantitative metric of crop attractiveness for refined pollinator risk assessments.

    PubMed

    Knopper, Loren D; Dan, Tereza; Reisig, Dominic D; Johnson, Josephine D; Bowers, Lisa M

    2016-10-01

    Those involved with pollinator risk assessment know that agricultural crops vary in attractiveness to bees. Intuitively, this means that exposure to agricultural pesticides is likely greatest for attractive plants and lowest for unattractive plants. While crop attractiveness in the risk assessment process has been qualitatively remarked on by some authorities, direction on how to refine the process with quantitative metrics of attractiveness is absent. At a high level, attractiveness of crops to bees appears to depend on several key variables, including but not limited to: floral, olfactory, visual and tactile cues; seasonal availability; physical and behavioral characteristics of the bee; pollen and nectar rewards. Notwithstanding the complexities and interactions among these variables, sugar content in nectar stands out as a suitable quantitative metric by which to refine pollinator risk assessments for attractiveness. Provided herein is a proposed way to use nectar sugar concentration to adjust the exposure parameter (with what is called a crop attractiveness factor) in the calculation of risk quotients in order to derive crop-specific tier I assessments. This Perspective is meant to invite discussion on incorporating such changes in the risk assessment process. © 2016 The Authors. Pest Management Science published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.
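A tier I risk quotient adjusted by a crop attractiveness factor, as proposed in the abstract, might look like the following sketch. The function names, the reference-crop normalization, and all numeric values are hypothetical illustrations, not the authors' parameterization.

```python
# Hypothetical tier-I screening: scale the exposure term by a crop
# attractiveness factor (CAF) derived from nectar sugar concentration.
# All numbers below are illustrative, not regulatory values.

def attractiveness_factor(crop_sugar_mg_ul: float, reference_sugar_mg_ul: float) -> float:
    """CAF = crop nectar sugar concentration relative to a highly attractive reference crop."""
    return min(crop_sugar_mg_ul / reference_sugar_mg_ul, 1.0)

def risk_quotient(exposure_ug_bee: float, ld50_ug_bee: float, caf: float) -> float:
    """RQ = (CAF-adjusted exposure) / acute LD50."""
    return (exposure_ug_bee * caf) / ld50_ug_bee

caf = attractiveness_factor(crop_sugar_mg_ul=0.3, reference_sugar_mg_ul=1.2)
rq = risk_quotient(exposure_ug_bee=2.4, ld50_ug_bee=10.0, caf=caf)
print(f"CAF = {caf:.2f}, RQ = {rq:.3f}")  # lower nectar sugar -> lower adjusted RQ
```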

  17. Quantitative assessment of tension in wires of fine-wire external fixators.

    PubMed

    Dong, Yin; Saleh, Micheal; Yang, Lang

    2005-01-01

    Fine-wire fixators are widely used in fracture management. Stable fixation requires that the wires maintain tension throughout treatment. Clinical experience indicates that wire-site complications are related to wire tension; however, no method has been available to assess wire tension quantitatively in the clinic. The objective of this study was to develop a quantitative assessment method for in situ wire tension and to investigate the factors that influence the assessment. An apparatus was developed based on a linear variable differential transformer (LVDT) displacement transducer that measured the deflection of the test wire with respect to a parallel reference wire when a constant transverse force of 30 N was applied to the test wire. The measured wire deflection was correlated with the wire tension measured by the force transducer. The experiment was performed under different conditions to assess the effects of bone-clamp distance, reference wire tension, number of wires, and fracture stiffness. The results showed a significant negative correlation between wire tension and deflection, with bone-clamp distance the most important factor affecting the tension-deflection relationship. The assessment method makes it possible to investigate the relationship between wire tension and wire-site complications in the clinic.
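The tension-from-deflection idea can be illustrated with a simple calibration fit. The calibration data, the 1/deflection model choice, and the function names below are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical bench calibration: measured deflection (mm) under a constant
# 30 N transverse force at several known wire tensions (N).
tension_n = np.array([400., 600., 800., 1000., 1200.])
deflection_mm = np.array([2.10, 1.55, 1.22, 1.01, 0.87])

# The study reports a significant negative tension-deflection correlation;
# a simple empirical model linear in 1/deflection is an illustrative choice.
coeffs = np.polyfit(1.0 / deflection_mm, tension_n, deg=1)

def estimate_tension(measured_deflection_mm: float) -> float:
    """Map an LVDT deflection reading back to an in-situ tension estimate."""
    return float(np.polyval(coeffs, 1.0 / measured_deflection_mm))

print(f"estimated tension at 1.10 mm deflection: {estimate_tension(1.10):.0f} N")
```

In practice the mapping would be calibrated separately for each bone-clamp distance, since the study found that to be the dominant factor.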

  18. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020

  19. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit.

    PubMed

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C

    2015-09-29

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson's disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor.
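The time-frequency tremor-detection step described in the two records above can be illustrated with a minimal spectral sketch on synthetic accelerometer data. The sampling rate, signal parameters, and the 3-7 Hz band are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

fs = 100.0                      # accelerometer sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic rest-tremor accelerometer trace: ~5 Hz oscillation plus noise
accel = 0.8 * np.sin(2 * np.pi * 5.0 * t) + 0.1 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(accel - accel.mean())) ** 2
freqs = np.fft.rfftfreq(accel.size, d=1 / fs)

band = (freqs >= 3.0) & (freqs <= 7.0)          # typical parkinsonian tremor band
tremor_ratio = spectrum[band].sum() / spectrum.sum()
peak_freq = freqs[np.argmax(spectrum)]

print(f"peak frequency: {peak_freq:.1f} Hz, tremor-band power fraction: {tremor_ratio:.2f}")
```

A high in-band power fraction flags a tremor state; amplitude features from the same window would then feed the severity models.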

  20. Missed Appendicitis: Mimicking Urologic Symptoms

    PubMed Central

    Akhavizadegan, Hamed

    2012-01-01

    Appendicitis, a common disease, has varied presentations, which makes its diagnosis difficult. This paper presents two cases of missed appendicitis with completely urologic presentations and describes how the correct diagnosis was reached. The first case, with symptoms fully referable to the kidney, and the second, mimicking epididymo-orchitis, both hindered prompt diagnosis. Right-sided pain, relapsing fever, repeated physical examination, and resistance to medical treatment were the main clues that helped us make the correct diagnosis. PMID:23326748

  1. Measurement of ankle plantar flexor spasticity following stroke: Assessment of a new quantitative tool.

    PubMed

    Chino, Naoichi; Muraoka, Yoshihiro; Ishihama, Hiroki; Ide, Masaru; Ushijima, Riousuke; Basford, Jeffrey R

    2015-09-01

    To assess the ability of a newly developed portable instrument (the Electric Spastic Ankle Measure (E-SAM)) to quantitatively measure ankle plantar flexor muscle tone and spasticity. Comparison of quantitative measurements of the E-SAM with those obtained manually with the Modified Ashworth Scale (MAS). Seven adult men with stroke of more than 8 months' duration with a MAS score of 3, and 7 healthy age-matched control subjects. Quantitative measurements of the reactive and viscoelastic components of muscle tonus and spasticity. Analysis of the pooled data of all subjects revealed 2 components: an initial negative peak (indicating visco-elasticity), and subsequent positive peaks (denoting reactive contractions of the plantar flexor muscles). Positive, reactive contraction, peaks of the subjects with stroke were significantly higher than those of age-matched controls (p<0.01, t-test). The E-SAM appears to provide meaningful information on muscle tone and spasticity that is more specific and quantitative than that obtained with the MAS. While further study is necessary, this instrument shows promise as an easy-to-use clinical and research tool for the measurement of spasticity and muscle viscosity.

  2. Quantitative assessment of p-glycoprotein expression and function using confocal image analysis.

    PubMed

    Hamrang, Zahra; Arthanari, Yamini; Clarke, David; Pluen, Alain

    2014-10-01

    P-glycoprotein is implicated in clinical drug resistance; thus, rapid quantitative analysis of its expression and activity is of paramount importance to the design and success of novel therapeutics. The scope for the application of quantitative imaging and image analysis tools in this field is reported here at "proof of concept" level. P-glycoprotein expression was utilized as a model for quantitative immunofluorescence and subsequent spatial intensity distribution analysis (SpIDA). Following expression studies, p-glycoprotein inhibition as a function of verapamil concentration was assessed in two cell lines using live cell imaging of intracellular calcein retention and a routine monolayer fluorescence assay. Intercellular and subcellular distributions in the expression of the p-glycoprotein transporter between parent and MDR1-transfected Madin-Darby canine kidney cell lines were examined. We have demonstrated that quantitative imaging can provide dose-response parameters while permitting direct microscopic analysis of intracellular fluorophore distributions in live and fixed samples. Analysis with SpIDA offers the ability to detect heterogeneity in the distribution of labeled species and, in conjunction with live cell imaging and immunofluorescence staining, may be applied to the determination of pharmacological parameters or the analysis of biopsies, providing a rapid prognostic tool.
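Extracting dose-response parameters such as an EC50 from a verapamil inhibition series, as described above, can be sketched with a linearized Hill fit. The concentrations, readout values, and fitting shortcut below are illustrative assumptions, not the study's data or method.

```python
import numpy as np

# Hypothetical calcein-retention readout vs. verapamil concentration (µM)
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
signal = np.array([0.06, 0.14, 0.38, 0.70, 0.91, 0.98])   # normalized retention

# The Hill equation y = c^n / (EC50^n + c^n) linearizes as:
# logit(y) = n*ln(c) - n*ln(EC50), so a straight-line fit recovers both parameters
logit = np.log(signal / (1.0 - signal))
n_hill, intercept = np.polyfit(np.log(conc), logit, deg=1)
ec50 = np.exp(-intercept / n_hill)

print(f"EC50 ≈ {ec50:.2f} µM, Hill slope ≈ {n_hill:.2f}")
```

A nonlinear least-squares fit on the raw scale would weight the data differently; the log-linear form is used here only to keep the sketch dependency-free.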

  3. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI

    PubMed Central

    Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2015-01-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2–8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253

  4. Computed tomography-based quantitative assessment of lower extremity lymphedema following treatment for gynecologic cancer

    PubMed Central

    Chung, Seung Hyun; Kim, Young Jae; Kim, Kwang Gi; Hwang, Ji Hye

    2017-01-01

    Objective To develop an algorithmic quantitative protocol for measuring skin and subcutaneous tissue volume in lower extremity lymphedema (LEL) patients using computed tomography (CT), to verify the usefulness of the measurement techniques in LEL patients, and to observe the structural characteristics of subcutaneous tissue according to the progression of LEL in gynecologic cancer. Methods A program for algorithmic quantitative analysis of lower extremity CT scans was developed to measure the skin and subcutaneous volume, muscle compartment volume, and the extent of the peculiar trabecular area with a honeycombed pattern. The CT venographies of 50 lower extremities from 25 subjects were reviewed in two groups (acute and chronic lymphedema). Results A significant increase in the total volume, subcutaneous volume, and extent of the peculiar trabecular area with a honeycombed pattern, but not in muscle volume, was identified in the more-affected limb. The correlations of CT-based total volume and subcutaneous volume measurements with volumetry measurements were strong (correlation coefficients: 0.747 and 0.749, respectively). A larger extent of the peculiar trabecular area with a honeycombed pattern in the subcutaneous tissue was identified in the more-affected limbs of the chronic lymphedema group. Conclusion CT-based quantitative assessments could provide objective volume measurements and information about the structural characteristics of subcutaneous tissue in women with LEL following treatment for gynecologic cancer. PMID:28028991

  5. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI.

    PubMed

    Klohs, Jan; Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2016-09-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2-8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies.

  6. Quantitative digital subtraction radiography for assessment of bone density changes following periodontal guided tissue regeneration.

    PubMed

    Christgau, M; Wenzel, A; Hiller, K A; Schmalz, G

    1996-01-01

    The aim of this study was the quantitative assessment of alveolar bone density changes in periodontal defects following guided tissue regeneration (GTR). Twelve patients with 30 intrabony lesions and 16 furcation defects took part. Standardized radiographic and clinical examinations were carried out immediately before and then 5 and 13 months after surgery. Intra-oral radiographs were evaluated by means of digital subtraction radiography (DSR). Within the subtraction images, a window ('experimental region') was defined covering the visible density changes in the defect area. Background noise was measured by using a similarly sized window ('control region') located in an area not affected by GTR. Bone density changes were quantitatively evaluated by calculation of the mean, standard deviation, and maximum and minimum values of the grey-level histogram within these windows. DSR revealed significant bone density gain after GTR in intrabony and furcation defects. While a continuous increase was observed over the 13 month period in intrabony defects, changes in furcation defects occurred mostly in the 5-13 month period. Clinically, a distinct vertical and horizontal attachment gain was found. The correlation coefficients between changes in radiographic density and clinical parameters were low, indicating a difference in the information obtained by the two diagnostic methods. Quantitative DSR is a valuable, non-invasive, objective method to obtain information on density changes in intrabony and furcation defects treated by GTR. However, a full assessment of soft and hard tissue changes requires both clinical evaluation and DSR.
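The windowed grey-level statistics used in DSR can be sketched on a synthetic subtraction image. The image, window coordinates, and grey-level offsets below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic subtraction image: zero-mean noise everywhere, plus a density
# gain in the defect region (values are illustrative grey-level offsets)
subtraction = rng.normal(0.0, 4.0, size=(256, 256))
subtraction[100:140, 80:130] += 12.0          # simulated bone fill after GTR

def window_stats(img, rows, cols):
    """Mean/SD/min/max of the grey-level histogram inside a rectangular window."""
    w = img[rows[0]:rows[1], cols[0]:cols[1]]
    return w.mean(), w.std(), w.min(), w.max()

exp_mean, exp_sd, _, _ = window_stats(subtraction, (100, 140), (80, 130))   # 'experimental region'
ctl_mean, ctl_sd, _, _ = window_stats(subtraction, (10, 50), (10, 60))      # 'control region'

print(f"experimental region: {exp_mean:.1f} ± {exp_sd:.1f}")
print(f"control region (background noise): {ctl_mean:.1f} ± {ctl_sd:.1f}")
```

Comparing the experimental-window statistics against the control window separates true density change from registration and exposure noise, as in the study's design.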

  7. Quantitative MRI assessments of white matter in children treated for acute lymphoblastic leukemia

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Helton, Kathleen J.; Li, Chin-Shang; Pui, Ching-Hon

    2005-04-01

    The purpose of this study was to use objective quantitative MR imaging methods to prospectively assess changes in the physiological structure of white matter during the temporal evolution of leukoencephalopathy (LE) in children treated for acute lymphoblastic leukemia. The longitudinal incidence, extent (proportion of white matter affected), and intensity (elevation of T1 and T2 relaxation rates) of LE was evaluated for 44 children. A combined imaging set consisting of T1, T2, PD, and FLAIR MR images and white matter, gray matter and CSF a priori maps from a spatially normalized atlas were analyzed with a neural network segmentation based on a Kohonen Self-Organizing Map (SOM). Quantitative T1 and T2 relaxation maps were generated using a nonlinear parametric optimization procedure to fit the corresponding multi-exponential models. A Cox proportional regression was performed to estimate the effect of intravenous methotrexate (IV-MTX) exposure on the development of LE followed by a generalized linear model to predict the probability of LE in new patients. Additional t-tests of independent samples were performed to assess differences in quantitative measures of extent and intensity at four different points in therapy. Higher doses and more courses of IV-MTX placed patients at a higher risk of developing LE and were associated with more intense changes affecting more of the white matter volume; many of the changes resolved after completion of therapy. The impact of these changes on neurocognitive functioning and quality of life in survivors remains to be determined.
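The relaxation-mapping step, fitting an exponential decay per voxel, can be illustrated for a single noise-free voxel. The echo times and relaxation values are illustrative, and the study used a nonlinear multi-exponential fit rather than this mono-exponential log-linear shortcut.

```python
import numpy as np

# Synthetic multi-echo signal for one voxel: S(TE) = S0 * exp(-TE / T2), TE in ms
te_ms = np.array([20., 40., 60., 80., 100., 120.])
t2_true, s0_true = 90.0, 1500.0
signal = s0_true * np.exp(-te_ms / t2_true)

# Mono-exponential model linearizes: ln S = ln S0 - TE / T2
slope, log_s0 = np.polyfit(te_ms, np.log(signal), deg=1)
t2_fit = -1.0 / slope
r2_rate = 1000.0 / t2_fit          # relaxation rate R2 in s^-1

print(f"T2 ≈ {t2_fit:.1f} ms, R2 ≈ {r2_rate:.1f} s^-1")
```

Repeating this fit at every voxel yields the quantitative T2 (and analogously T1) maps whose elevation defines the intensity of LE.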

  8. What do qualitative rapid assessment collections of macroinvertebrates represent? A comparison with extensive quantitative sampling.

    PubMed

    Gillies, C L; Hose, G C; Turak, E

    2009-02-01

    It is a fundamental tenet of Rapid Biological Assessments (RBA) that the samples collected reflect the community from which they are drawn. As with any biological sampling, RBA collections are subject to sampling error resulting in the omission of some taxa. The aim of this study is to compare the composition of RBA samples with an estimate of community structure based on extensive quantitative sampling. We used logistic regression to explore the relationships between the frequency of a taxon being collected in an RBA sample and its biological and ecological traits, namely its abundance, distribution, body size and habit. RBA samples and quantitative estimates of community structure were made in riffles in the Kangaroo and Nepean Rivers, New South Wales, Australia. Single RBA samples may collect up to 63% of the taxa that are collected by extensive quantitative sampling at a site. The frequency of a taxon being recorded in an RBA sample was significantly and positively related to all traits tested, indicating a bias in the collection methods towards large, abundant and widely distributed taxa. Accordingly, taxa missed by RBA sampling were generally small, narrowly distributed or rare. These findings enhance our understanding of what RBA samples represent, and the biases and sources of error associated with RBA sampling. This study also quantifies the utility of RBA methods for biodiversity assessment.
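The logistic-regression analysis of detection frequency against taxon traits can be sketched on synthetic data. The trait model, coefficients, and the plain gradient-ascent fit below are illustrative assumptions, not the study's data or software.

```python
import numpy as np

rng = np.random.default_rng(5)
n_taxa = 300

# Synthetic trait data: log-abundance drives the probability that a taxon
# is detected in a single RBA-style sample (illustrating the positive bias)
log_abund = rng.normal(0.0, 1.0, n_taxa)
true_p = 1.0 / (1.0 + np.exp(-(0.5 + 1.5 * log_abund)))
detected = rng.random(n_taxa) < true_p

# Logistic regression fitted by plain gradient ascent on the log-likelihood
X = np.column_stack([np.ones(n_taxa), log_abund])
w = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.5 * X.T @ (detected - p) / n_taxa

print(f"fitted slope on log-abundance: {w[1]:.2f} (positive => abundant taxa over-represented)")
```

A positive fitted slope reproduces the paper's qualitative finding that RBA detection is biased toward abundant taxa; in practice one would use a statistics package rather than hand-rolled optimization.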

  9. A remote quantitative Fugl-Meyer assessment framework for stroke patients based on wearable sensor networks.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-05-01

    To extend the use of wearable sensor networks for stroke patient training and assessment in non-clinical settings, this paper proposes a novel remote quantitative Fugl-Meyer assessment (FMA) framework, in which two accelerometers and seven flex sensors were used to monitor the movement function of the upper limb, wrist, and fingers. An extreme learning machine based ensemble regression model was established to map the sensor data to clinical FMA scores, while the RReliefF algorithm was applied to find the optimal feature subset. Because the FMA scale is time-consuming and complicated, seven training exercises were designed to replace the 33 upper-limb-related items in the FMA scale. Twenty-four stroke inpatients participated in the experiments in clinical settings, and 5 of them were involved in the experiments in home settings after they left the hospital. The experimental results in both clinical and home settings showed that the proposed quantitative FMA model can precisely predict FMA scores based on wearable sensor data; the coefficient of determination can reach as high as 0.917. This indicates that the proposed framework can provide a potential approach to remote quantitative rehabilitation training and evaluation.
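The extreme-learning-machine regression at the core of the framework can be sketched as follows. The feature dimensions, score model, and hidden-layer size are toy assumptions, and this sketch reports training fit only, unlike the paper's ensemble model with RReliefF feature selection and held-out evaluation.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy stand-in for the sensor-to-FMA-score mapping: 40 "patients",
# 10 movement features each, scores generated from a hidden nonlinearity
X = rng.normal(size=(40, 10))
y = np.tanh(X @ rng.normal(size=10)) * 30 + 33      # pseudo FMA scores

# Extreme learning machine: random fixed hidden layer, closed-form readout
n_hidden = 64
W = rng.normal(size=(10, n_hidden))
b = rng.normal(size=n_hidden)
H = np.tanh(X @ W + b)                               # hidden activations
beta, *_ = np.linalg.lstsq(H, y, rcond=None)         # least-squares output weights

y_hat = H @ beta
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
print(f"training R^2 = {1 - ss_res / ss_tot:.3f}")
```

The appeal of the ELM is that only the output weights are trained, via a single least-squares solve, which suits small clinical datasets; an ensemble of such models reduces the variance from the random hidden layer.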

  10. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed

    Hertzberg, Richard C; Teuschler, Linda K

    2002-12-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult: the lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions.
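The dose-addition baseline underlying these formulas can be illustrated with a hazard-index calculation. The chemical names, doses, and reference doses below are invented for illustration, not regulatory values.

```python
# Dose addition for a mixture: hazard index = sum of component
# dose/reference-dose ratios. All values are illustrative only.

components = {
    "chem_A": {"dose": 0.02, "rfd": 0.10},   # mg/kg-day
    "chem_B": {"dose": 0.05, "rfd": 0.50},
    "chem_C": {"dose": 0.01, "rfd": 0.02},
}

hazard_quotients = {name: c["dose"] / c["rfd"] for name, c in components.items()}
hazard_index = sum(hazard_quotients.values())

for name, hq in hazard_quotients.items():
    print(f"{name}: HQ = {hq:.2f}")
print(f"Hazard index = {hazard_index:.2f}")   # HI > 1 flags potential concern
```

The weight-of-evidence formulas discussed in the abstract then modify this additive baseline with pairwise interaction terms; in the limit where interactions disappear, they should reduce to exactly this sum.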

  11. Fibrosis assessment: impact on current management of chronic liver disease and application of quantitative invasive tools.

    PubMed

    Wang, Yan; Hou, Jin-Lin

    2016-05-01

    Fibrosis, a common pathogenic pathway of chronic liver disease (CLD), has long been recognized as the factor most strongly associated with poor prognosis. Nowadays, with remarkable advances in the understanding and/or treatment of major CLDs such as hepatitis C, hepatitis B, and nonalcoholic fatty liver disease, there is an unprecedented need for the diagnosis and assessment of liver fibrosis or cirrhosis in various clinical settings. Among the available approaches, liver biopsy remains the one that possibly provides the most direct and reliable information regarding fibrosis patterns and changes in the parenchyma at different clinical stages and with different etiologies. Thus, many endeavors have been undertaken to develop methodologies based on a quantitation strategy for invasive assessment. Here, we analyze the impact of fibrosis assessment on CLD patient care based on data from recent clinical studies. We discuss and update the current invasive tools regarding their technological features and potential for particular clinical applications. Furthermore, we propose potential resolutions, with application of quantitative invasive tools, for some major issues in fibrosis assessment, which appear to be obstacles to the current rapid progress in CLD medicine.

  12. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-02-05

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have the following two drawbacks: (1) they are susceptible to subjective factors; (2) they only have several rating levels and are influenced by a ceiling effect, making it impossible to exactly detect any further improvement in the movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, there are many data that need to be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess the upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. It also indicated that the proposed system can not only reduce the amount of data during the sampling and transmission processes, but also, the reconstructed accelerometer signals can be used for quantitative assessment without any loss of useful information.

  13. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients

    PubMed Central

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-01-01

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have the following two drawbacks: (1) they are susceptible to subjective factors; (2) they only have several rating levels and are influenced by a ceiling effect, making it impossible to exactly detect any further improvement in the movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems since they are often battery-operated. Traditionally, for wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, there are many data that need to be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess the upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. It also indicated that the proposed system can not only reduce the amount of data during the sampling and transmission processes, but also, the reconstructed accelerometer signals can be used for quantitative assessment without any loss of useful information. PMID:26861337
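The compress-then-reconstruct pipeline described in the two records above can be illustrated end to end with a toy sparse signal. The dimensions, the Gaussian measurement matrix, and the orthogonal-matching-pursuit decoder below are standard compressed-sensing ingredients chosen for illustration, not necessarily the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy round trip: a k-sparse "accelerometer feature" vector is sampled with
# m << n random projections and recovered by orthogonal matching pursuit
n, m, k = 256, 80, 5
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)     # measurement matrix
y = A @ x                                    # compressed samples (m/n < 1/3)

def omp(A, y, k):
    """Greedy recovery: repeatedly pick the column most correlated with the residual."""
    residual, idx = y.copy(), []
    for _ in range(k):
        idx.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
        residual = y - A[:, idx] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(A, y, k)
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"compression ratio m/n = {m/n:.2f}, relative reconstruction error = {err:.2e}")
```

This mirrors the paper's claim: the transmitted data can be well under a third of the raw length while the reconstruction remains faithful enough for downstream classification.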

  14. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
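The multiple-strain step can be sketched as a Monte Carlo draw in which the multinomial strain split collapses to a binomial toxigenic/non-toxigenic draw. The strain prevalences and contamination levels below are invented for illustration, not the Lombardy data.

```python
import numpy as np

rng = np.random.default_rng(11)
n_iter = 50_000

# Hypothetical strain structure: prevalences and which strains carry the
# enterotoxin A gene (all values illustrative, not from the paper)
strain_prev = np.array([0.55, 0.30, 0.15])
toxigenic = np.array([True, False, True])
p_tox = strain_prev[toxigenic].sum()            # total toxigenic prevalence

# Per-serving contamination, then the multinomial strain split collapsed to
# a binomial draw of toxigenic vs. non-toxigenic cells
cells = rng.poisson(lam=50, size=n_iter)
tox_cells = rng.binomial(cells, p_tox)

print(f"mean toxigenic load: {tox_cells.mean():.1f} cells/serving")
print(f"fraction of servings with any toxigenic cells: {np.mean(tox_cells > 0):.3f}")
```

In the full model each toxigenic strain would keep its own growth and toxin-production parameters, which is what the multinomial (rather than collapsed binomial) treatment enables.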

  15. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204
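The RECIST 1.1 target-lesion bookkeeping that ePAD automates can be sketched from sums of longest diameters. The thresholds follow the published RECIST 1.1 criteria, while the lesion sums are invented and the nodal-lesion and non-target rules are omitted for brevity.

```python
# Sketch of RECIST 1.1 target-lesion response from sums of longest
# diameters (mm); nodal and non-target rules omitted.

def recist_response(baseline_sum: float, nadir_sum: float, current_sum: float) -> str:
    if current_sum == 0:
        return "CR"   # complete response: all target lesions disappeared
    if current_sum >= nadir_sum * 1.20 and current_sum - nadir_sum >= 5.0:
        return "PD"   # progressive disease: >=20% and >=5 mm over nadir
    if current_sum <= baseline_sum * 0.70:
        return "PR"   # partial response: >=30% decrease from baseline
    return "SD"       # stable disease

# Baseline sum 100 mm, then three follow-up studies (hypothetical)
sums = [100.0, 65.0, 62.0, 80.0]
nadir, responses = sums[0], []
for s in sums[1:]:
    nadir = min(nadir, s)
    responses.append(recist_response(sums[0], nadir, s))
print(responses)  # → ['PR', 'PR', 'PD']
```

Recording each measurement in a machine-readable form, as ePAD does, is what makes this computation, and reanalysis with alternative biomarkers such as cross-sectional area, automatic.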

  16. Quantitative assessment of spinal cord injury using circularly polarized coherent anti-Stokes Raman scattering microscopy

    NASA Astrophysics Data System (ADS)

    Bae, Kideog; Zheng, Wei; Huang, Zhiwei

    2017-08-01

    We report the quantitative assessment of spinal cord injury using the circularly polarized coherent anti-Stokes Raman scattering (CP-CARS) technique together with Stokes parameters in the Poincaré sphere. The pump and Stokes excitation beams are circularly polarized to suppress both the linear polarization-dependent artifacts and the nonresonant background of tissue CARS imaging, enabling quantitative CP-CARS image analysis. This study shows that CP-CARS imaging uncovers significantly increased phase retardance of injured spinal cord tissue as compared to normal tissue, suggesting that CP-CARS is an appealing label-free imaging tool for determining the degree of tissue phase retardance, which could serve as a unique diagnostic parameter associated with nervous tissue injury.
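
As a rough illustration of the underlying polarimetry (not the authors' analysis code), the sketch below propagates circularly polarized light through the Mueller matrix of a linear retarder and recovers the retardance from the output Stokes parameters, mirroring how a phase retardance maps to a rotation on the Poincaré sphere. The retardance value, fast-axis orientation, and sign convention are assumptions.

```python
import numpy as np

# Minimal polarimetry sketch: a linear retarder (fast axis horizontal,
# one common sign convention) acting on right-circular input light;
# the retardance is then recovered from the output Stokes vector.

def retarder_mueller(delta):
    """Mueller matrix of a linear retarder with horizontal fast axis."""
    c, s = np.cos(delta), np.sin(delta)
    return np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, c, s],
                     [0, 0, -s, c]])

S_in = np.array([1.0, 0.0, 0.0, 1.0])       # right-circular polarization
delta_true = 0.4                             # radians, assumed retardance
S_out = retarder_mueller(delta_true) @ S_in
delta_est = np.arctan2(S_out[2], S_out[3])   # recover retardance
print(round(delta_est, 6))   # 0.4
```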

  17. Experimental assessment of bone mineral density using quantitative computed tomography in holstein dairy cows

    PubMed Central

    MAETANI, Ayami; ITOH, Megumi; NISHIHARA, Kahori; AOKI, Takahiro; OHTANI, Masayuki; SHIBANO, Kenichi; KAYANO, Mitsunori; YAMADA, Kazutaka

    2016-01-01

    The aim of this study was to assess the measurement of bone mineral density (BMD) by quantitative computed tomography (QCT), comparing the relationships of BMD between QCT and dual-energy X-ray absorptiometry (DXA) and between QCT and radiographic absorptiometry (RA) in the metacarpal bone of Holstein dairy cows (n=27). A significant positive correlation was found between QCT and DXA measurements (r=0.70, P<0.01), and a significant correlation was found between QCT and RA measurements (r=0.50, P<0.01). We conclude that QCT provides quantitative evaluation of BMD in dairy cows, because BMD measured by QCT showed positive correlations with BMD measured by the two conventional methods: DXA and RA. PMID:27075115
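
The correlation analysis reported here is straightforward to reproduce. The snippet below is a sketch using synthetic BMD values (the study's data for its 27 cows are not reproduced in this record) and a standard Pearson correlation coefficient.

```python
import numpy as np

# Illustrative only: Pearson correlation of the kind reported between
# QCT and DXA BMD measurements; the data below are synthetic.
rng = np.random.default_rng(0)
bmd_qct = rng.normal(1.0, 0.15, 27)                  # n = 27 animals
bmd_dxa = 0.8 * bmd_qct + rng.normal(0, 0.08, 27)    # correlated measure
r = np.corrcoef(bmd_qct, bmd_dxa)[0, 1]
print(round(r, 3))
```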

  18. A quantitative collagen fibers orientation assessment using birefringence measurements: Calibration and application to human osteons

    PubMed Central

    Spiesz, Ewa M.; Kaminsky, Werner; Zysset, Philippe K.

    2011-01-01

Although mechanical properties depend strongly on the arrangement of collagen fibers in mineralized tissues, that arrangement is not yet well resolved. Only a few semi-quantitative evaluations of the fiber arrangement in bone, such as spectroscopic techniques or circularly polarized light microscopy, are available. In this study, the out-of-plane collagen arrangement angle was calibrated to the linear birefringence of a longitudinally fibered mineralized turkey leg tendon cut at a variety of angles to the main axis. The calibration curve was then applied to human cortical bone osteons to quantify the out-of-plane collagen fiber arrangement. The proposed calibration curve is normalized to sample thickness and to the wavelength of the probing light, enabling a universally applicable quantitative assessment. This approach may improve our understanding of the fibrillar structure of bone and its implications for mechanical properties. PMID:21970947

  19. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data on low-dose hepatocarcinogenicity studies for three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations occurring during the initiation stage of carcinogenesis. For the establishment of points of departure (PoD) from which the cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed at low doses. Moreover, treatment with DEN at low doses had no effect on development of GST-P positive foci in the liver. These data on PoDs for the markers help clarify whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to low dose-response assessment must be chosen on the basis of scientific judgment.
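
A benchmark-dose calculation of the kind mentioned can be sketched as follows. The linear dose-response model, the doses, and the foci counts below are all hypothetical (not the study's MeIQx data), and real benchmark-dose software fits richer models and reports confidence limits.

```python
import numpy as np

# Minimal benchmark-dose (BMD) sketch under an assumed linear
# dose-response for GST-P positive foci; all numbers are hypothetical.
doses = np.array([0.0, 1.0, 10.0, 100.0])     # ppm, hypothetical
foci = np.array([0.20, 0.21, 0.35, 1.80])     # foci per cm^2, hypothetical

slope, intercept = np.polyfit(doses, foci, 1)  # least-squares line
background = intercept                          # response at zero dose
bmr = 0.10 * background                         # 10% added response (BMR)
bmd10 = bmr / slope                             # dose giving the BMR
print(round(bmd10, 3))
```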

  20. Preparing for the unprecedented - Towards quantitative oil risk assessment in the Arctic marine areas.

    PubMed

    Nevalainen, Maisa; Helle, Inari; Vanhatalo, Jarno

    2017-01-15

The probability of major oil accidents in Arctic seas is increasing alongside increasing maritime traffic. Hence, there is a growing need to understand the risks posed by oil spills to these unique and sensitive areas. So far these risks have mainly been acknowledged in terms of qualitative descriptions. We introduce a probabilistic framework, based on a general food web approach, to analyze the ecological impacts of oil spills. We argue that a food web approach based on key functional groups is more appropriate for providing a holistic view of the risks involved than assessments based on single species. We discuss the issues characteristic of the Arctic that need special attention in risk assessment, and provide examples of how to proceed towards quantitative risk estimates. The conceptual model presented in the paper helps to identify the most important risk factors and can be used as a template for more detailed risk assessments. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Valuation of ecotoxicological impacts from tributyltin based on a quantitative environmental assessment framework.

    PubMed

    Noring, Maria; Håkansson, Cecilia; Dahlgren, Elin

    2016-02-01

In the scientific literature, few valuations of biodiversity and ecosystem services following the impacts of toxicity are available, hampered by the lack of ecotoxicological documentation. Here, tributyltin is used to conduct a contingent valuation study as well as a cost-benefit analysis (CBA) of measures for improving the environmental status of Swedish coastal waters of the Baltic Sea. Benefits of considering different dimensions when assessing environmental status are highlighted, and a quantitative environmental assessment framework based on available technology, ecological conditions, and economic valuation methodology is developed. Two scenarios are used in the valuation study: (a) achieving good environmental status by 2020 in accordance with EU legislation (USD 119 household⁻¹ year⁻¹) and (b) achieving visible improvements by 2100 due to natural degradation (USD 108 household⁻¹ year⁻¹ during 8 years). The latter scenario was used to illustrate an application of the assessment framework. The CBA results indicate that both scenarios might generate a welfare improvement.

  2. Elongated left lobe of the liver mimicking a subcapsular hematoma of the spleen on the focused assessment with sonography for trauma exam.

    PubMed

    Jones, Robert; Tabbut, Matthew; Gramer, Diane

    2014-07-01

The focused assessment with sonography for trauma (FAST) examination has assumed the role of initial screening examination for the presence or absence of hemoperitoneum in patients with blunt abdominal trauma. Sonographic pitfalls associated with the examination have primarily been related to mistaking contained fluid collections for hemoperitoneum. We present a case in which an elongated left lobe of the liver was misdiagnosed as a splenic subcapsular hematoma. It is imperative that emergency physicians and trauma surgeons be familiar with this normal variant of the liver and its sonographic appearance on the perisplenic window in order to prevent nontherapeutic laparotomies or embolizations.

  3. Quantitative muscle strength assessment in duchenne muscular dystrophy: longitudinal study and correlation with functional measures

    PubMed Central

    2012-01-01

Background The aim of this study was to perform a longitudinal assessment using Quantitative Muscle Testing (QMT) in a cohort of ambulant boys affected by Duchenne muscular dystrophy (DMD) and to correlate the results of QMT with functional measures. This study is to date the most thorough long-term evaluation of QMT in a cohort of DMD patients correlated with other measures, such as the North Star Ambulatory Assessment (NSAA) or the 6-min walk test (6MWT). Methods This is a single-centre, prospective, non-randomised study assessing QMT using the Kin Com® 125 machine in a study cohort of 28 ambulant DMD boys, aged 5 to 12 years. This cohort was assessed longitudinally over a 12-month period, with QMT assessments every 3 months and assessment of functional abilities, using the NSAA and the 6MWT, at baseline and at 12 months only. QMT was also used in a control group of 13 healthy age-matched boys examined at baseline and at 12 months. Results There was an increase in QMT over 12 months in boys below the age of 7.5 years, while in boys above the age of 7.5 years QMT showed a significant decrease. All the average one-year changes were significantly different from those experienced by healthy controls. We also found a good correlation between quantitative tests and the other measures, which was more obvious in the stronger children. Conclusion Our longitudinal data using QMT in a cohort of DMD patients suggest that it could be used as an additional tool to monitor changes, providing additional information on segmental strength. PMID:22974002

  4. Variables influencing the accuracy of 2-dimensional and real-time 3-dimensional echocardiography for assessment of small volumes, areas, and distances: an in vitro study using static tissue-mimicking phantoms.

    PubMed

    Herberg, Ulrike; Brand, Manuel; Bernhardt, Christine; Trier, Hans Georg; Breuer, Johannes

    2011-07-01

The aim of this study was to assess the validity, accuracy, and reproducibility of real-time 3-dimensional (3D) echocardiography for small distances, areas, and volumes. Real-time 3D echocardiography using matrix technology was performed in small calibrated tissue-mimicking phantoms and compared with 2-dimensional (2D) echocardiography. By systematically varying the conditions of data acquisition and analysis, including different 3D workstations (manual disk summation versus semiautomatic border detection), the relative contributions of error sources were determined. The clinical relevance of the in vitro findings was assessed in 5 neonates and infants. Distance calculation was valid (mean relative error ± SD, -0.15% ± 1.2%). Underestimation of areas and volumes was significant for both 2D and 3D echocardiography (area: 2D, -7.0% ± 2.9%; 3D, -6.0% ± 2.8%; volume: 2D, -13.1% ± 4.5%; 3D, -6.7% ± 2.5%; P < .05). Adjustment of compression and gain during data acquisition (difference of the means: 2D, 11.6%; 3D, 17.9%), gain during postprocessing (3D, 3.4%), and the border detection algorithm during analysis (2D, 4.8%; 3D, 16.6%) had a highly significant effect on volume and area calculations (P < .001). In vivo, compression and gain during acquisition (3D, 19.1%) and the 3D workstation used for analysis (3D, 22.2%) had a highly significant impact on left ventricular volumetry (P < .001). Real-time 3D echocardiography is a reliable method for calculating small distances, areas, and volumes comparable to the size of the neonatal and infant heart. Variables influencing boundary identification during image acquisition and analysis have a significant impact on 2D and 3D area and volume calculations. Standardized protocols are mandatory to avoid these sources of error in both clinical practice and research.

  5. Repeatability and reproducibility of quantitative contrast-enhanced ultrasonography for assessing duodenal perfusion in healthy dogs.

    PubMed

    Nisa, Khoirun; Lim, Sue Yee; Shinohara, Masayoshi; Nagata, Noriyuki; Sasaoka, Kazuyoshi; Dermlim, Angkhana; Leela-Arporn, Rommaneeya; Morita, Tomoya; Yokoyama, Nozomu; Osuga, Tatsuyuki; Sasaki, Noboru; Morishita, Keitaro; Nakamura, Kensuke; Ohta, Hiroshi; Takiguchi, Mitsuyoshi

    2017-09-29

Contrast-enhanced ultrasonography (CEUS) with microbubbles as a contrast agent allows the visualization and quantification of tissue perfusion. The assessment of canine intestinal perfusion by quantitative CEUS may provide valuable information for diagnosing and monitoring chronic intestinal disorders. This study aimed to assess the repeatability (intraday variability) and reproducibility (interday variability) of quantitative duodenal CEUS in healthy dogs. Six healthy beagles underwent CEUS three times within one day (4-hr intervals) and on two different days (1-week interval). All dogs were sedated with a combination of butorphanol (0.2 mg/kg) and midazolam (0.1 mg/kg) prior to CEUS. The contrast agent (Sonazoid®) was administered as an intravenous bolus (0.01 ml/kg) for imaging of the duodenum. Time-intensity curves (TICs) were created by drawing multiple regions of interest (ROIs) in the duodenal mucosa, and perfusion parameters, including the time-to-peak (TTP), peak intensity (PI), area under the curve (AUC), and wash-in and wash-out rates (WiR and WoR, respectively), were generated. Intraday and interday coefficients of variation (CVs) for TTP, PI, AUC, WiR and WoR were <25% (range, 2.27-23.41%), which indicated that CEUS was feasible for assessing duodenal perfusion in healthy sedated dogs. A further study of CEUS in dogs with chronic intestinal disorders is necessary to evaluate its clinical applicability.
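
The repeatability metric used here, the coefficient of variation (CV = 100 × SD / mean), can be computed directly from repeated measurements. The three peak-intensity values below are invented for illustration; only the <25% acceptance threshold comes from the abstract.

```python
import numpy as np

# Coefficient of variation for repeated perfusion measurements;
# the example values are hypothetical, the <25% threshold is the
# abstract's criterion for acceptable repeatability.
def cv_percent(values):
    v = np.asarray(values, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

peak_intensity = [52.1, 48.7, 50.3]   # hypothetical repeated measurements
print(round(cv_percent(peak_intensity), 2))   # 3.38
```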

  6. Quantitative risk assessment for human salmonellosis through the consumption of pork sausage in Porto Alegre, Brazil.

    PubMed

    Mürmann, Lisandra; Corbellini, Luis Gustavo; Collor, Alexandre Ávila; Cardoso, Marisa

    2011-04-01

A quantitative microbial risk assessment was conducted to evaluate the risk of Salmonella infection to consumers of fresh pork sausages prepared at barbecues in Porto Alegre, Brazil. For the analysis, a prevalence of 24.4% positive pork sausages with a contamination level between 0.03 and 460 CFU g⁻¹ was assumed. Data related to frequency and habits of consumption were obtained by a questionnaire survey given to 424 people. A second-order Monte Carlo simulation separating the uncertain parameter of cooking time from the variable parameters was run. Of the people interviewed, 87.5% consumed pork sausage, and 85.4% ate it at barbecues. The average risk of salmonellosis per barbecue at a minimum cooking time of 15.6 min (worst-case scenario) was 6.24 × 10⁻⁴, and the risk assessed per month was 1.61 × 10⁻³. Cooking for 19 min would fully inactivate Salmonella in 99.9% of the cases. At this cooking time, the sausage reached a mean internal temperature of 75.7°C. The results of the quantitative microbial risk assessment revealed that the consumption of fresh pork sausage is safe when cooking time is approximately 19 min, whereas undercooked pork sausage may represent a nonnegligible health risk for consumers.
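
The cooking-time conclusion rests on first-order thermal inactivation, where a decimal reduction time D (minutes for a 10-fold kill at a given temperature) sets the log reduction. A minimal sketch, assuming a hypothetical D-value and ignoring the come-up time before the sausage core reaches lethal temperature:

```python
# First-order thermal inactivation: survivors = N0 * 10**(-t / D).
# The D-value is an assumption for illustration; 460 CFU/g is the
# worst-case contamination level from the abstract. Come-up time
# (heating to lethal temperature) is ignored here.
def survivors(n0_cfu_per_g, t_min, d_min):
    return n0_cfu_per_g * 10 ** (-t_min / d_min)

n0 = 460.0       # worst-case contamination, CFU/g
d_value = 0.8    # minutes, hypothetical D-value at cooking temperature
print(survivors(n0, 19.0, d_value) < 1)   # True: below 1 CFU/g at 19 min
```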

  7. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
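
The core idea behind mc2d, sampling uncertainty and variability in separate, nested dimensions, can be sketched outside R. The Python sketch below uses an exponential dose-response model with invented parameters; it is a conceptual illustration of second-order Monte Carlo, not a port of the package.

```python
import numpy as np

# Conceptual two-dimensional (second-order) Monte Carlo: the outer
# loop samples uncertain parameters, the inner loop samples
# variability, keeping the two dimensions separate as mc2d does in R.
# The dose-response model and all numbers are illustrative.
rng = np.random.default_rng(1)
n_unc, n_var = 100, 1000

risk_percentiles = []
for _ in range(n_unc):                        # uncertainty dimension
    r_param = max(rng.normal(1e-3, 2e-4), 0)  # uncertain dose-response slope
    doses = rng.lognormal(2.0, 1.0, n_var)    # variable ingested dose
    risk = 1 - np.exp(-r_param * doses)       # exponential dose-response
    risk_percentiles.append(np.percentile(risk, 95))

# Uncertainty interval around the 95th percentile of variability:
lo, hi = np.percentile(risk_percentiles, [2.5, 97.5])
print(lo <= hi)
```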

  8. Quantitative microbial risk assessment for Staphylococcus aureus and Staphylococcus enterotoxin A in raw milk.

    PubMed

    Heidinger, Joelle C; Winter, Carl K; Cullor, James S

    2009-08-01

A quantitative microbial risk assessment was constructed to determine consumer risk from Staphylococcus aureus and staphylococcal enterotoxin in raw milk. A Monte Carlo simulation model was developed to assess the risk from raw milk consumption using data on levels of S. aureus in milk collected by the University of California-Davis Dairy Food Safety Laboratory from 2,336 California dairies from 2005 to 2008 and using U.S. milk consumption data from the National Health and Nutrition Examination Survey of 2003 and 2004. Four modules were constructed to simulate pathogen growth and staphylococcal enterotoxin A production scenarios to quantify consumer risk levels under various time and temperature storage conditions. The three growth modules predicted that S. aureus levels could surpass the 10⁵ CFU/ml level of concern at the 99.9th or 99.99th percentile of servings and therefore may represent a potential consumer risk. Results obtained from the staphylococcal enterotoxin A production module predicted that exposure at the 99.99th percentile could represent a dose capable of eliciting staphylococcal enterotoxin intoxication in all consumer age groups. This study illustrates the utility of quantitative microbial risk assessments for identifying potential food safety issues.

  9. Quantitative sensory testing for assessment of somatosensory function in human oral mucosa: a review.

    PubMed

    Zhou, Pin; Chen, Yaming; Zhang, Jinglu; Wang, Kelun; Svensson, Peter

    2017-09-20

This narrative review provides an overview of quantitative sensory testing (QST) for assessing somatosensory function in the human oral mucosa. A literature search was conducted in the PubMed database to identify in vivo studies of the human oral mucosa using QST methods. A list of 149 articles was obtained and screened; 36 relevant articles remained and were read in full text. A manual search of the reference lists identified eight additional relevant studies, giving a total of 44 articles for final assessment. The included studies were divided into six categories according to their content and objectives. In each category, there was great variety in aims, methods, participants and outcome measures. The application of QST has nevertheless helped to monitor somatosensory function in experimental models of intraoral pain, under the effects of local anesthesia, after oral and maxillofacial surgery, and after prosthodontic and orthodontic treatment. QST has proved sufficiently stable and reliable, and valuable information has been obtained regarding somatosensory function in healthy volunteers, special populations and orofacial pain patients. However, as most of the studies were highly heterogeneous, the results are difficult to compare quantitatively. A standardized intraoral QST protocol is recommended and expected to help advance a mechanism-based assessment of neuropathies and other intraoral pain conditions.

  10. Visual and semi-quantitative assessment of brain tumors using ²⁰¹Tl-SPECT.

    PubMed

    Nose, Ayumi; Otsuka, Hideki; Nose, Hayato; Otomi, Yoichi; Terazawa, Kaori; Harada, Masafumi

    2013-01-01

To evaluate the usefulness of ²⁰¹Tl-SPECT in differentiating benign from malignant brain tumors, eighty-eight patients (44 males and 44 females) with 58 high-grade (WHO grade III-IV) and 30 low-grade (WHO grade I-II) tumors were examined with ²⁰¹Tl-SPECT. (1) Visual assessment was performed by board-certified radiologists using ²⁰¹Tl-SPECT. Tumors were classified into two groups (Tl-positive and Tl-negative) and scored using a five-grade evaluation system. Receiver operating characteristic (ROC) analysis was performed in the Tl-positive group. (2) Semi-quantitative assessment involved measurement of early and delayed ²⁰¹Tl uptake, and the retention index (RI) was applied as follows: RI = delayed uptake ratio/early uptake ratio. Three combinations of RI using mean and maximum values of the region of interest were calculated. (1) There were 74 Tl-positive and 14 Tl-negative tumors. The area under the ROC curve (AUC) estimated by the three radiologists exceeded 0.7, and was greater for the more experienced radiologist. (2) In all RIs, the difference between high-grade and low-grade tumors was statistically significant. Visual and semi-quantitative assessment using ²⁰¹Tl-SPECT was found to be useful for differentiating benign from malignant brain tumors.
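
The retention index defined in the abstract is a simple ratio; the sketch below computes it from hypothetical tumor-to-normal uptake ratios (the study's measured values are not reproduced in this record).

```python
# Retention index as defined in the abstract:
# RI = delayed uptake ratio / early uptake ratio.
# The uptake ratios below are hypothetical, for illustration only.
def retention_index(early_ratio, delayed_ratio):
    return delayed_ratio / early_ratio

print(round(retention_index(2.5, 2.2), 2))   # 0.88
```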

  11. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

The purpose of this study was to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods are proposed to be able to assess haemodynamically significant coronary artery stenosis, their limitations are recognized, and there is still a need to develop new techniques. Experiments were performed on five closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT correlates well (y = 0.07245 + 0.09963x, r² = 0.898) with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, with an Area Under the Curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.

  12. Investigation of the feasibility of non-invasive optical sensors for the quantitative assessment of dehydration.

    PubMed

    Visser, Cobus; Kieser, Eduard; Dellimore, Kiran; van den Heever, Dawie; Smith, Johan

    2017-10-01

This study explores the feasibility of prospectively assessing infant dehydration using four non-invasive optical sensors based on the quantitative and objective measurement of various clinical markers of dehydration. The sensors were designed to objectively and unobtrusively assess the hydration state of an infant based on the quantification of capillary refill time (CRT), skin recoil time (SRT), skin temperature profile (STP) and skin tissue hydration by means of infrared spectrometry (ISP). To evaluate the performance of the sensors, a clinical study was conducted on a cohort of 10 infants (aged 6-36 months) with acute gastroenteritis. High sensitivity and specificity were exhibited by the sensors, in particular the STP and SRT sensors when combined into a fusion regression model (sensitivity: 0.90, specificity: 0.78). The SRT and STP sensors and the fusion model all outperformed the commonly used "gold standard" clinical dehydration scales, including the Gorelick scale (sensitivity: 0.56, specificity: 0.56), the CDS scale (sensitivity: 1.0, specificity: 0.2) and the WHO scale (sensitivity: 0.13, specificity: 0.79). These results suggest that objective and quantitative assessment of infant dehydration may be possible using the sensors investigated. However, further evaluation of the sensors on a larger sample population is needed before deploying them in a clinical setting. Copyright © 2017 IPEM. Published by Elsevier Ltd. All rights reserved.
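
The sensitivity and specificity figures compared throughout follow directly from confusion-matrix counts. The counts below are invented, chosen only so that the output matches the fusion model's reported 0.90/0.78 for illustration.

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
# The confusion-matrix counts are hypothetical, picked to reproduce
# the fusion model's reported 0.90 / 0.78 for illustration.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

sens, spec = sens_spec(tp=9, fn=1, tn=7, fp=2)
print(round(sens, 2), round(spec, 2))   # 0.9 0.78
```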

  13. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent of disease in the lungs. Several measures have been introduced to quantify the extent of disease directly from CT data in order to complement the qualitative assessments made by radiologists. In this paper we compare the emphysema index, mean lung density, histogram percentiles, and the fractal dimension against visual grade, in order to evaluate how well quantitative scores can predict radiologists' visual scoring of emphysema from low-dose CT scans and to determine which measures can serve as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole-lung scans. In addition, a visual grade of each section was given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved using quantitative score to predict visual grade, rising to 73% when mild and moderate cases were treated as a single class.
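
Two of the measures compared here are simple functions of the lung-voxel histogram: the emphysema index (the fraction of voxels below a fixed attenuation threshold, commonly -950 HU) and a low histogram percentile. The sketch below computes both on synthetic voxel values; the threshold choice is a common convention, not necessarily the one used in this paper.

```python
import numpy as np

# Histogram-based emphysema measures on synthetic lung voxels (HU):
# emphysema index (% voxels below -950 HU, a common threshold),
# mean lung density, and the 15th-percentile density.
rng = np.random.default_rng(2)
lung_hu = rng.normal(-850, 60, 10_000)           # synthetic lung voxels, HU

emphysema_index = np.mean(lung_hu < -950) * 100   # % voxels below -950 HU
mean_lung_density = lung_hu.mean()
perc15 = np.percentile(lung_hu, 15)               # 15th-percentile density
print(round(emphysema_index, 2))
```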

  14. Quantitative Passive Diffusive Sampling for Assessing Soil Vapor Intrusion to Indoor Air

    DTIC Science & Technology

    2012-03-28

Todd McAlary and Hester Groenevelt, Geosyntec

  15. Quantitative Assessment of the Effects of Oxidants on Antigen-Antibody Binding In Vitro

    PubMed Central

    Han, Shuang; Wang, Guanyu; Xu, Naijin; Liu, Hui

    2016-01-01

    Objective. We quantitatively assessed the influence of oxidants on antigen-antibody-binding activity. Methods. We used several immunological detection methods, including precipitation reactions, agglutination reactions, and enzyme immunoassays, to determine antibody activity. The oxidation-reduction potential was measured in order to determine total serum antioxidant capacity. Results. Certain concentrations of oxidants resulted in significant inhibition of antibody activity but had little influence on total serum antioxidant capacity. Conclusions. Oxidants had a significant influence on interactions between antigen and antibody, but minimal effect on the peptide of the antibody molecule. PMID:27313823

  16. Quantitative assessment of autonomic dysreflexia with combined spectroscopic and perfusion probes

    NASA Astrophysics Data System (ADS)

    Ramella-Roman, Jessica C.; Pfefer, Allison; Hidler, Joseph

    2009-02-01

Autonomic dysreflexia (AD) is an uncontrolled sympathetic response occurring in individuals with a spinal cord injury at or above the sixth thoracic (T6) neurologic level. Any noxious stimulus below the injury level can trigger an AD episode. Progression of an AD attack can result in severe vasoconstriction below the injury level, and skin oxygenation can decrease by up to 40% during an AD event. We present a quantitative, non-invasive method of assessing the progression of an AD event by measuring the patient's skin oxygen levels and blood flow using a fiber-optic-based system.

  17. Impact assessment of abiotic resources in LCA: quantitative comparison of selected characterization models.

    PubMed

    Rørbech, Jakob T; Vadenbo, Carl; Hellweg, Stefanie; Astrup, Thomas F

    2014-10-07

Resources have received significant attention in recent years, resulting in the development of a wide range of resource depletion indicators within life cycle assessment (LCA). Understanding the differences in the assessment principles used to derive these indicators, and their effects on impact assessment results, is critical for indicator selection and for interpretation of the results. Eleven resource depletion methods were evaluated quantitatively with respect to resource coverage, characterization factors (CF), impact contributions from individual resources, and total impact scores. We included 2247 individual market inventory data sets covering a wide range of societal activities (ecoinvent database v3.0). Log-linear regression analysis was carried out for all pairwise combinations of the 11 methods to identify correlations in CFs (resources) and total impacts (inventory data sets) between methods. Significant differences in resource coverage were observed (9-73 resources), revealing a trade-off between resource coverage and model complexity. High correlation in CFs between methods did not necessarily translate into high correlation in total impacts, indicating that resource coverage may also be critical for impact assessment results. Although no consistent correlations between methods applying similar assessment models were observed, all methods showed relatively high correlation in their assessment of energy resources. Finally, we classify the existing methods into three groups, according to method focus and modeling approach, to aid method selection within LCA.

  18. Quantitative sensory testing in patients with postthoracotomy pain syndrome: Part 2: variability in thermal threshold assessments.

    PubMed

    Wildgaard, Kim; Ringsted, Thomas K; Kehlet, Henrik; Werner, Mads U

    2013-09-01

    Quantitative sensory testing is a reference method for characterization of postsurgical neuropathic components. Correct interpretation of the data requires detailed information concerning the validity of the testing methods. The objective of this study was to assess the test-retest variability of thermal thresholds in patients (n = 14) with the postthoracotomy pain syndrome. Sensory mapping with a metal roller (25°C) on the surgical side delineated an area of cool sensory dysfunction. In this area and in a contralateral area, 4 prespecified sites (2.6 cm) were outlined, in addition to the maximum pain site on the surgical side. At these 9 sites in total, warmth detection threshold, cool detection threshold, and heat pain threshold were assessed. Comparisons of thermal test-retest assessments did not demonstrate any significant intraside differences. The SDs of the thermal assessments ranged from 1.9 to 2.5°C in nonpain sites and from 3.5 to 6.9°C at the maximum pain site. The estimated within-patient and between-patient variances were 5% to 28% and 72% to 95%, respectively, of the total variances. Although test-retest agreement was generally poor, the much lower within-patient than between-patient variance enabled estimation of highly statistically significant within-patient differences in thermal thresholds. In patients with postthoracotomy pain syndrome, several statistical methods indicated an excessively high variability in thermal thresholds, questioning the use of single quantitative sensory testing assessments to characterize patients with chronic pain states.
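
    The within-/between-patient variance split reported above can be sketched with a one-way random-effects decomposition over test-retest data. The data below are synthetic (assumed Gaussian thresholds), not the study's.

```python
import numpy as np

def variance_components(measurements):
    """measurements: (n_patients, n_repeats) array of thresholds.
    Returns (within, between) variance estimates from a one-way
    random-effects decomposition."""
    measurements = np.asarray(measurements, dtype=float)
    n, k = measurements.shape
    within = measurements.var(axis=1, ddof=1).mean()       # mean within-patient variance
    grand_between = measurements.mean(axis=1).var(ddof=1)  # variance of patient means
    between = max(grand_between - within / k, 0.0)         # remove repeat-noise share
    return within, between

rng = np.random.default_rng(0)
true_between, true_within = 9.0, 1.0                  # hypothetical variances (degC^2)
patient_means = rng.normal(32.0, np.sqrt(true_between), size=200)
data = patient_means[:, None] + rng.normal(0.0, np.sqrt(true_within), size=(200, 2))
w, b = variance_components(data)
share_between = b / (w + b)
```

    With these simulated test-retest thresholds, the between-patient share comes out near 0.9, echoing the finding that between-patient variance dominates (72% to 95%).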

  19. Quantitative assessment of multiple sclerosis using inertial sensors and the TUG test.

    PubMed

    Greene, Barry R; Healy, Michael; Rutledge, Stephanie; Caulfield, Brian; Tubridy, Niall

    2014-01-01

    Multiple sclerosis (MS) is a progressive neurological disorder affecting between 2 and 2.5 million people globally. Tests of mobility form part of clinical assessments of MS. Quantitative assessment of mobility using inertial sensors has the potential to provide objective, longitudinal monitoring of disease progression in patients with MS. The mobility of 21 patients (aged 25-59 years, 8 M, 13 F) diagnosed with relapsing-remitting MS was assessed using the Timed Up and Go (TUG) test, while patients wore shank-mounted inertial sensors. This exploratory, cross-sectional study aimed to examine the reliability of quantitative measures derived from inertial sensors during the TUG test in patients with MS. Furthermore, we aimed to determine whether disease status (as measured by the Multiple Sclerosis Impact Scale (MSIS-29) and the Expanded Disability Status Score (EDSS)) can be predicted by assessment using a TUG test and inertial sensors. Reliability analysis showed that 32 of 52 inertial sensor parameters obtained during the TUG showed excellent intrasession reliability, while 11 of 52 showed moderate reliability. Using the inertial sensor parameters, regression models of the EDSS and MSIS-29 scales were derived using the elastic net procedure. Under cross validation, an elastic net regularized regression model of MSIS-29 yielded a mean square error (MSE) of 334.6 with 25 degrees of freedom (DoF); similarly, an elastic net regularized regression model of EDSS yielded a cross-validated MSE of 1.5 with 6 DoF. Results suggest that inertial sensor parameters derived from MS patients while completing the TUG test are reliable and may have utility in assessing disease state as measured by the EDSS and MSIS-29.
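
    The cross-validated MSE figures above can be illustrated with a plain k-fold loop. As a sketch, ordinary least squares stands in for the elastic net, and the sensor parameters and outcome below are synthetic.

```python
import numpy as np

def kfold_mse(X, y, k=5):
    """Mean squared prediction error under k-fold cross validation,
    using ordinary least squares as the (stand-in) regression model."""
    n = len(y)
    idx = np.arange(n)
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        Xtr = np.column_stack([np.ones(len(train)), X[train]])
        Xte = np.column_stack([np.ones(len(fold)), X[fold]])
        beta, *_ = np.linalg.lstsq(Xtr, y[train], rcond=None)
        errs.append(np.mean((y[fold] - Xte @ beta) ** 2))
    return float(np.mean(errs))

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4))  # e.g. 4 hypothetical sensor-derived TUG parameters
y = X @ np.array([2.0, -1.0, 0.5, 0.0]) + rng.normal(0.0, 0.1, 100)
mse = kfold_mse(X, y)
```

    On this nearly noiseless synthetic data the cross-validated MSE is close to the noise variance (0.01), illustrating how the reported MSE values quantify out-of-sample prediction error.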

  20. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2016-11-01

    Climate change has greatly affected agriculture, which faces increasing risks owing to its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help society deal actively with climate change and ensure food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence. The degree of loss can be expressed by the yield change amplitude, and the probability of occurrence can be calculated using the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested. The method is shown to be feasible and practical using spring wheat in Wuchuan County of Inner Mongolia as a test example. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase of 88.3%, the maximum yield decrease and its probability were 3.5 and 64.6%, respectively, giving a risk of 2.2%. For the maximum precipitation decrease of 35.2%, the maximum yield decrease and its probability were 14.1 and 56.1%, respectively, giving a risk of 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. Without appropriate adaptation strategies, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow markedly.
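
    The paper's core definition, risk as the product of the degree of loss and its probability of occurrence, reproduces the reported Wuchuan County figures directly:

```python
def climate_risk(loss_fraction, probability):
    """Risk = degree of loss x probability of occurrence (both as fractions)."""
    return loss_fraction * probability

# figures from the spring-wheat example above
temp_risk = climate_risk(0.035, 0.646)      # temperature impact
precip_risk = climate_risk(0.141, 0.561)    # precipitation impact
combined_risk = climate_risk(0.176, 0.534)  # combined impacts
```

    The three products come out near 2.2%, 7.9%, and 9.4%, matching the reported risks to rounding.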

  1. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) is used more frequently in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast (i.e., non-contrast) axial DCE-MRI images using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images acquired for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ = 0.66 (p < 0.0001)]. Within precision medicine, our method may be useful for monitoring high-risk populations.
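
    Steps (b) and (c) of the pipeline (Otsu separation and the dense-voxel ratio) can be sketched on synthetic voxel intensities. The bimodal toy data, 30% dense by construction, are an assumption for illustration; real pre-contrast MRI would first need step (a)'s breast segmentation.

```python
import numpy as np

def otsu_threshold(values, nbins=128):
    """Threshold maximizing between-class variance (Otsu's method)."""
    hist, edges = np.histogram(values, bins=nbins)
    centers = (edges[:-1] + edges[1:]) / 2
    w = hist / hist.sum()
    best_t, best_var = centers[0], -1.0
    for i in range(1, nbins):
        w0, w1 = w[:i].sum(), w[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (w[:i] * centers[:i]).sum() / w0   # class means below/above cut
        m1 = (w[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

rng = np.random.default_rng(3)
fatty = rng.normal(0.2, 0.05, size=7000)   # toy fatty-tissue intensities
dense = rng.normal(0.8, 0.05, size=3000)   # toy dense-tissue intensities
voxels = np.concatenate([fatty, dense])
t = otsu_threshold(voxels)
density = float((voxels >= t).mean())      # volumetric density, step (c)
```

    The threshold lands between the two modes and the recovered density is close to the constructed 30%.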

  2. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography

    PubMed Central

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-01-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on the automatic selection of a spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters, including pore size, pore shape, strut size, surface area, porosity, and interconnectivity, were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D; the locations and dimensions of each of the fabrication defects were also defined. We conclude that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation into the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
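
    Two of the morphological parameters named above, porosity and interconnectivity, can be sketched on a binarised volume. This assumes SciPy's `ndimage.label` for connected components; the toy scaffold of thin vertical struts is invented for illustration.

```python
import numpy as np
from scipy import ndimage

def porosity_and_interconnectivity(material):
    """material: 3D boolean array, True = scaffold strut material.
    Porosity = void fraction; interconnectivity = fraction of void
    volume belonging to the largest connected void component."""
    void = ~material
    porosity = float(void.mean())
    labels, n = ndimage.label(void)
    if n == 0:
        return porosity, 0.0
    sizes = ndimage.sum(void, labels, index=range(1, n + 1))
    return porosity, float(sizes.max() / void.sum())

# toy scaffold: a 5x5 grid of 1-voxel struts running along z,
# leaving a single fully connected void space
vol = np.zeros((40, 40, 40), dtype=bool)
vol[::8, ::8, :] = True
porosity, interconnect = porosity_and_interconnectivity(vol)
```

    For this toy volume the void space is one connected component, so interconnectivity is 1.0 and porosity is 1 - 1000/64000.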

  3. Quantitative photoacoustic characterization of blood clot in blood: A mechanobiological assessment through spectral information

    NASA Astrophysics Data System (ADS)

    Biswas, Deblina; Vasudevan, Srivathsan; Chen, George C. K.; Sharma, Norman

    2017-02-01

    Formation of blood clots, called thrombi, can occur due to hyper-coagulation of blood. Thrombi moving through blood vessels can impede blood flow, an important factor in many critical diseases such as deep vein thrombosis and heart attacks. Understanding the mechanical properties of clot formation is vital for assessing the severity of thrombosis and planning proper treatment; however, the biomechanics of thrombi is little known to clinicians and not well investigated. Photoacoustic (PA) spectral response, a non-invasive technique, is proposed to investigate the mechanism of blood clot formation through elasticity and also to differentiate clots from blood. A distinct shift (increase in frequency) of the dominant frequency of the PA response during clot formation is reported. In addition, quantitative differentiation of blood clots from blood has been achieved through parameters such as the dominant frequency and spectral energy of the PA spectral response. A nearly twofold increase in dominant frequency was found in blood clots compared to blood. Significant changes in spectral energy also help to quantitatively differentiate clots from the surrounding blood. Our results reveal that the increase in density during clot formation is reflected in the PA spectral response, a significant step towards understanding the mechanobiology of thrombus formation. Hence, the proposed tool, in addition to detecting thrombus formation, could reveal mechanical properties of the sample through quantitative photoacoustic spectral parameters.
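
    The two spectral parameters used above, dominant frequency and spectral energy, can be sketched with an FFT. The sampling rate and the toy single-tone traces (with the clot's response placed about twice as high in frequency) are assumptions, not the paper's data.

```python
import numpy as np

def spectral_features(signal, fs):
    """Dominant frequency (Hz) and total spectral energy of a PA time trace."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    dominant = freqs[np.argmax(spectrum)]
    energy = float(np.sum(spectrum ** 2))
    return dominant, energy

fs = 100e6                              # 100 MHz sampling, hypothetical
t = np.arange(2048) / fs
blood = np.sin(2 * np.pi * 4e6 * t)     # toy 4 MHz blood response
clot = np.sin(2 * np.pi * 8e6 * t)      # toy clot response, ~2x higher
f_blood, _ = spectral_features(blood, fs)
f_clot, _ = spectral_features(clot, fs)
```

    The recovered dominant frequencies sit at the constructed tones, reproducing the roughly twofold shift described in the abstract.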

  4. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    NASA Astrophysics Data System (ADS)

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J.

    2008-08-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict crawling wave displacement fields, which were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model were in agreement with that obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates as compared to the relaxed state. Overall, preliminary results are encouraging and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle.
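
    The nonlinear least-squares fitting step can be sketched as follows. A simple power-law dispersion model and all the numbers below are illustrative assumptions, not the paper's viscoelastic model (which yields frequency-independent shear modulus and viscosity estimates).

```python
import numpy as np
from scipy.optimize import curve_fit

def dispersion_model(freq_hz, c1, n):
    """Illustrative power-law dispersion for shear wave speed (m/s)."""
    return c1 * freq_hz ** n

freqs = np.linspace(100.0, 450.0, 8)            # vibration frequencies (Hz)
measured = dispersion_model(freqs, 0.6, 0.35)   # noiseless synthetic speeds
params, _ = curve_fit(dispersion_model, freqs, measured, p0=(1.0, 0.5))
c1_fit, n_fit = params
```

    On noiseless synthetic data the fit recovers the generating parameters, which is the same fitting pattern one would apply with a Kelvin-Voigt dispersion relation and real crawling-wave speed data.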

  5. Quantitative MRI and strength measurements in the assessment of muscle quality in Duchenne muscular dystrophy.

    PubMed

    Wokke, B H; van den Bergen, J C; Versluis, M J; Niks, E H; Milles, J; Webb, A G; van Zwet, E W; Aartsma-Rus, A; Verschuuren, J J; Kan, H E

    2014-05-01

    The purpose of this study was to assess leg muscle quality and give a detailed description of leg muscle involvement in a series of Duchenne muscular dystrophy patients using quantitative MRI and strength measurements. Fatty infiltration, as well as total and contractile (not fatty infiltrated) cross sectional areas of various leg muscles were determined in 16 Duchenne patients and 11 controls (aged 8-15). To determine specific muscle strength, four leg muscle groups (quadriceps femoris, hamstrings, anterior tibialis and triceps surae) were measured and related to the amount of contractile tissue. In patients, the quadriceps femoris showed decreased total and contractile cross sectional area, attributable to muscle atrophy. The total, but not the contractile, cross sectional area of the triceps surae was increased in patients, corresponding to hypertrophy. Specific strength decreased in all four muscle groups of Duchenne patients, indicating reduced muscle quality. This suggests that muscle hypertrophy and fatty infiltration are two distinct pathological processes, differing between muscle groups. Additionally, the quality of remaining muscle fibers is severely reduced in the legs of Duchenne patients. The combination of quantitative MRI and quantitative muscle testing could be a valuable outcome parameter in longitudinal studies and in the follow-up of therapeutic effects.
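
    The specific-strength calculation described above (strength normalized to contractile, i.e. non-fatty, cross-sectional area) is a one-liner; the force, CSA, and fat-fraction values below are hypothetical.

```python
def specific_strength(force_n, total_csa_cm2, fat_fraction):
    """Strength per unit contractile cross-sectional area (N/cm^2).
    contractile CSA = total CSA x (1 - fat fraction)."""
    contractile_csa = total_csa_cm2 * (1.0 - fat_fraction)
    return force_n / contractile_csa

# hypothetical numbers: equal total CSA, higher fatty infiltration and
# lower force in the patient
control = specific_strength(300.0, 30.0, 0.05)
patient = specific_strength(120.0, 30.0, 0.40)
```

    Even after crediting the patient with a smaller contractile area, specific strength remains lower, mirroring the paper's finding of reduced quality in the remaining muscle fibers.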

  6. The Quantitative Ideas and Methods in Assessment of Four Properties of Chinese Medicinal Herbs.

    PubMed

    Fu, Jialei; Pang, Jingxiang; Zhao, Xiaolei; Han, Jinxiang

    2015-04-01

    The purpose of this review is to summarize and reflect on the current status and problems of research on the properties of Chinese medicinal herbs. Hot, warm, cold, and cool are the four properties/natures of Chinese medicinal herbs; they are defined based on the interaction between the herbs and the human body. How to quantitatively assess the therapeutic effect of Chinese medicinal herbs within the theoretical system of Traditional Chinese Medicine (TCM) remains a challenge. Previous studies on the topic from several perspectives are presented, and their results and problems discussed. New ideas based on the technology of biophoton radiation detection are proposed. With the development of biophoton detection technology, detection and characterization of human biophoton emission has led to potential applications in TCM. The possibility of using a biophoton analysis system to study the interaction of Chinese medicinal herbs with the human body and to quantitatively determine their effects is entirely consistent with the holistic concept of TCM theory. The statistical entropy of electromagnetic radiations from biological systems may characterize the four properties of Chinese medicinal herbs, and the spectrum may characterize their meridian tropism. We therefore hypothesize that, by use of a biophoton analysis system, the four properties and meridian tropism of Chinese medicinal herbs can be quantitatively expressed.

  7. An Analytical Pipeline for Quantitative Characterization of Dietary Intake: Application To Assess Grape Intake.

    PubMed

    Garcia-Perez, Isabel; Posma, Joram M; Chambers, Edward S; Nicholson, Jeremy K; C Mathers, John; Beckmann, Manfred; Draper, John; Holmes, Elaine; Frost, Gary

    2016-03-23

    The lack of accurate dietary assessment in free-living populations requires discovery of new biomarkers reflecting food intake qualitatively and quantitatively, so that the effects of diet on health can be evaluated objectively. We provide a proof-of-principle for an analytical pipeline to identify quantitative dietary biomarkers. Tartaric acid was identified by nuclear magnetic resonance spectroscopy as a dose-responsive urinary biomarker of grape intake and subsequently quantified in volunteers following a series of 4-day dietary interventions incorporating 0 g/day, 50 g/day, 100 g/day, and 150 g/day of grapes in standardized diets from a randomized controlled clinical trial. The most accurate quantitative predictions of grape intake were obtained from 24 h urine samples, which showed the strongest linear relationship between grape intake and tartaric acid excretion (r² = 0.90). This new methodological pipeline for estimating nutritional intake, based on coupling dietary intake information with quantified nutritional biomarkers, was developed and validated in a controlled dietary intervention study, showing that this approach can improve the accuracy of estimating nutritional intakes.
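
    The linear dose-response fit and its r² can be sketched as below. The excretion values are invented to mimic a strong linear relationship; they are not the study's measurements.

```python
import numpy as np

def linear_dose_response(intake_g, excretion):
    """Least-squares line and coefficient of determination r^2."""
    slope, intercept = np.polyfit(intake_g, excretion, 1)
    pred = slope * np.asarray(intake_g) + intercept
    ss_res = np.sum((np.asarray(excretion) - pred) ** 2)
    ss_tot = np.sum((np.asarray(excretion) - np.mean(excretion)) ** 2)
    return slope, intercept, 1 - ss_res / ss_tot

# hypothetical 24 h urinary tartaric acid (mg) at the four grape doses (g/day)
intake = [0, 50, 100, 150, 0, 50, 100, 150]
excretion = [2, 110, 205, 330, 5, 95, 220, 310]
slope, intercept, r2 = linear_dose_response(intake, excretion)
```

    Inverting the fitted line (intake = (excretion - intercept) / slope) is then the basis for predicting grape intake from a measured biomarker level.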

  8. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Technical Performance Assessment

    PubMed Central

    2017-01-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers (QIBs) to measure changes in these features. Critical to the performance of a QIB in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in their designs, analysis methods, and the metrics used to assess a QIB for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design new studies. The Radiological Society of North America (RSNA) and the Quantitative Imaging Biomarkers Alliance (QIBA), together with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of QIB performance studies so that results from multiple studies can be compared, contrasted or combined. PMID:24919831
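
    Two of the metrology quantities named above can be sketched directly: bias against a ground truth, and a repeatability coefficient from replicate scans (RC = 2.77 x within-subject SD, the difference two repeats stay within with ~95% probability). The phantom values below are invented.

```python
import numpy as np

def bias(measured, truth):
    """Mean difference between measurements and ground truth."""
    return float(np.mean(np.asarray(measured) - np.asarray(truth)))

def repeatability_coefficient(replicates):
    """replicates: (n_subjects, n_repeats). RC = 2.77 x within-subject SD."""
    replicates = np.asarray(replicates, dtype=float)
    wsd = np.sqrt(replicates.var(axis=1, ddof=1).mean())
    return 2.77 * wsd

truth = [10.0, 20.0, 30.0]          # hypothetical phantom values
scan1 = [10.5, 20.4, 30.6]          # test
scan2 = [9.9, 19.8, 30.0]           # retest
b = bias(scan1, truth)
rc = repeatability_coefficient(np.column_stack([scan1, scan2]))
```

    Here each test-retest pair differs by 0.6, giving a within-subject variance of 0.18 and hence RC = 2.77 x sqrt(0.18).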

  9. Monitoring and quantitative assessment of tumor burden using in vivo bioluminescence imaging

    NASA Astrophysics Data System (ADS)

    Chen, Chia-Chi; Hwang, Jeng-Jong; Ting, Gann; Tseng, Yun-Long; Wang, Shyh-Jen; Whang-Peng, Jaqueline

    2007-02-01

    In vivo bioluminescence imaging (BLI) is a sensitive imaging modality that is rapid and accessible, and may comprise an ideal tool for evaluating tumor growth. In this study, the kinetics of tumor growth were assessed in a C26 colon carcinoma-bearing BALB/c mouse model. The ability of BLI to noninvasively quantitate the growth of subcutaneous tumors was evaluated using C26 cells genetically engineered to stably express firefly luciferase and herpes simplex virus type-1 thymidine kinase (C26/tk-luc). A good correlation (R² = 0.998) of photon emission to cell number was found in vitro. Tumor burden and tumor volume were monitored in vivo over time by quantitation of photon emission using a Xenogen IVIS 50 and standard external caliper measurement, respectively. At various time intervals, tumor-bearing mice were imaged to determine the correlation of in vivo BLI to tumor volume; a correlation was observed when tumor volume was smaller than 1000 mm³ (R² = 0.907). γ-Scintigraphy combined with [¹³¹I]FIAU was used as a second imaging modality to verify these results. In conclusion, this study showed that bioluminescence imaging is a powerful and quantitative tool for directly monitoring tumor growth in vivo. The dual reporter gene-transfected tumor-bearing animal model can be applied in evaluating the efficacy of newly developed anticancer drugs.

  10. Megakaryocytes mimicking metastatic breast carcinoma.

    PubMed

    Hoda, Syed A; Resetkova, Erika; Yusuf, Yasmin; Cahan, Anthony; Rosen, Paul P

    2002-05-01

    False-positive diagnosis of lymph nodes occurs when a benign element in a lymph node, or in its capsule, is interpreted as metastatic carcinoma. This report describes a patient with breast carcinoma who had megakaryocytes in axillary sentinel lymph nodes mimicking metastatic carcinoma. The patient had no history of a hematologic disease, and we found no evidence of a concurrent hematopoietic disorder. The megakaryocytes were reactive for CD31, CD61, and von Willebrand factor, but not for cytokeratin (AE1/AE3). Megakaryocytes should be added to the list of benign histologic abnormalities that may simulate metastatic carcinoma in a sentinel lymph node.

  11. Norwegian scabies mimicking rupioid psoriasis

    PubMed Central

    Costa, Juliana Bastos; de Sousa, Virna Lygia Lobo Rocha; da Trindade Neto, Pedro Bezerra; Paulo Filho, Thomás de Aquino; Cabral, Virgínia Célia Dias Florêncio; Pinheiro, Patrícia Moura Rossiter

    2012-01-01

    Norwegian scabies is a highly contagious skin infestation caused by an ectoparasite, Sarcoptes scabiei var. hominis, which mainly affects immunosuppressed individuals. Clinically, it may simulate various dermatoses such as psoriasis, Darier's disease, and seborrheic dermatitis, among others. This is a case report of a 33-year-old immunocompetent woman, diagnosed with generalized anxiety disorder (cancer phobia), who had erythematous, well-defined plaques covered with rupioid crusts on her neck, axillary folds, breast, periumbilical region, and groin area, as well as her upper back and elbows, mimicking an extremely rare variant of psoriasis denominated rupioid psoriasis. PMID:23197214

  12. Tinea capitis mimicking folliculitis decalvans.

    PubMed

    Tangjaturonrusamee, C; Piraccini, B M; Vincenzi, C; Starace, M; Tosti, A

    2011-01-01

    We report on an adult patient with tinea capitis caused by Microsporum canis, who presented with diffuse alopecia and follicular pustules, mimicking folliculitis decalvans. Examination of the scalp showed severe alopecia with prominent involvement of the frontal and vertex scalp: the skin was markedly erythematous with pustules and brownish crusts. Videodermoscopy revealed visible follicular ostia, numerous pustular lesions and several comma hairs. Fluconazole 150 mg a week for 8 weeks associated with ketoconazole shampoo cleared the inflammatory lesions and produced complete hair regrowth. © 2009 Blackwell Verlag GmbH.

  13. The great mimickers of rosacea.

    PubMed

    Olazagasti, Jeannette; Lynch, Peter; Fazel, Nasim

    2014-07-01

    Although rosacea is one of the most common conditions treated by dermatologists, it also is one of the most misunderstood. It is a chronic disorder affecting the central parts of the face and is characterized by frequent flushing; persistent erythema (ie, lasting for at least 3 months); telangiectasia; and interspersed episodes of inflammation with swelling, papules, and pustules. Understanding the clinical variants and disease course of rosacea is important to differentiate this entity from other conditions that can mimic rosacea. Herein we present several mimickers of rosacea that physicians should consider when diagnosing this condition.

  14. Splenic inflammatory pseudotumor mimicking angiosarcoma.

    PubMed

    Hsu, Chao-Wen; Lin, Chieh-Hsin; Yang, Tsung-Lung; Chang, Hong-Tai

    2008-11-07

    Splenic tumors are rare. Differentiation of the tumors before operation is of great value regarding the outcome. A case of a 32-year-old man with a splenic inflammatory pseudotumor (IPT) mimicking splenic angiosarcoma is described. The tumor was highly suspected of being splenic angiosarcoma based on radiological findings preoperatively. However, after splenectomy, histopathological examinations revealed splenic IPT. Splenic IPT and angiosarcoma are rare and often pose diagnostic difficulties because the clinical and radiological findings are obscure. Due to large differences in prognosis, we briefly reviewed the clinical, radiological, and pathological features of both of the tumors.

  15. Splenic inflammatory pseudotumor mimicking angiosarcoma

    PubMed Central

    Hsu, Chao-Wen; Lin, Chieh-Hsin; Yang, Tsung-Lung; Chang, Hong-Tai

    2008-01-01

    Splenic tumors are rare. Differentiation of the tumors before operation is of great value regarding the outcome. A case of a 32-year-old man with a splenic inflammatory pseudotumor (IPT) mimicking splenic angiosarcoma is described. The tumor was highly suspected of being splenic angiosarcoma based on radiological findings preoperatively. However, after splenectomy, histopathological examinations revealed splenic IPT. Splenic IPT and angiosarcoma are rare and often pose diagnostic difficulties because the clinical and radiological findings are obscure. Due to large differences in prognosis, we briefly reviewed the clinical, radiological, and pathological features of both of the tumors. PMID:19009664

  16. Quantitative Microbial Risk Assessment Tutorial Installation of Software for Watershed Modeling in Support of QMRA - Updated 2017

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • QMRA Installation • SDMProjectBuilder (which includes the Mi...

  17. Disc Degeneration Assessed by Quantitative T2* (T2 star) Correlated with Functional Lumbar Mechanics

    PubMed Central

    Ellingson, Arin M.; Mehta, Hitesh; Polly, David W.; Ellermann, Jutta; Nuckley, David J.

    2013-01-01

    Study Design: Experimental correlation study to quantify features of disc health, including signal intensity and the distinction between the annulus fibrosus (AF) and nucleus pulposus (NP), with T2* magnetic resonance imaging (MRI), and to correlate them with the functional mechanics of the corresponding motion segments. Objective: Establish the relationship between disc health assessed by quantitative T2* MRI and functional lumbar mechanics. Summary of Background Data: Degeneration leads to altered biochemistry in the disc, affecting its mechanical competence. Routine clinical MRI sequences are not adequate for detecting early changes in degeneration and fail to correlate with pain or improve patient stratification. Quantitative T2* relaxation time mapping probes biochemical features and may offer more sensitivity in assessing disc degeneration. Methods: Cadaveric lumbar spines were imaged using quantitative T2* mapping, as well as conventional T2-weighted MRI sequences. Discs were graded by the Pfirrmann scale, and features of disc health, including signal intensity (T2* Intensity Area) and the distinction between the AF and NP (Transition Zone Slope), were quantified by T2*. Each motion segment was subjected to pure moment bending to determine range of motion (ROM), neutral zone (NZ), and bending stiffness. Results: T2* Intensity Area and Transition Zone Slope were significantly correlated with flexion ROM (p=0.015; p=0.002), ratio of NZ/ROM (p=0.010; p=0.028), and stiffness (p=0.044; p=0.026), as well as lateral bending NZ/ROM (p=0.005; p=0.010) and stiffness (p=0.022; p=0.029). T2* Intensity Area was also correlated with lateral bending ROM (p=0.023). Pfirrmann grade was only correlated with lateral bending NZ/ROM (p=0.001) and stiffness (p=0.007). Conclusions: T2* mapping is a sensitive quantitative method capable of detecting changes associated with disc degeneration. Features of disc health quantified with T2* predicted altered functional mechanics of the lumbar spine better than
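
    Although the abstract does not spell it out, T2* maps are conventionally computed by fitting a mono-exponential decay across echo times; a log-linear sketch on simulated data follows. The echo times, signal scale, and T2* value are hypothetical.

```python
import numpy as np

def fit_t2star(echo_times_ms, signal):
    """Mono-exponential T2* via log-linear least squares:
    S(TE) = S0 * exp(-TE / T2*)  =>  ln S = ln S0 - TE / T2*."""
    slope, log_s0 = np.polyfit(echo_times_ms, np.log(signal), 1)
    return -1.0 / slope, float(np.exp(log_s0))

te = np.array([4.6, 9.2, 13.8, 18.4, 23.0])   # hypothetical echo times (ms)
signal = 1000.0 * np.exp(-te / 25.0)          # simulated disc, T2* = 25 ms
t2star, s0 = fit_t2star(te, signal)
```

    On noiseless simulated decays the fit recovers T2* exactly; voxel-wise application of this fit yields the T2* maps from which parameters like T2* Intensity Area are derived.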

  18. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    Accidents attributable to in-flight loss of control are the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to identify quantitative standards for assessing upset recovery performance. This review covers current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, along with whether that input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading, are reviewed as well.
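
    A metric like time to first input can be extracted from a recorded control trace with a simple threshold crossing. The sampling rate, threshold, and trace below are hypothetical, purely to show the mechanics.

```python
import numpy as np

def time_to_first_input(control_trace, fs_hz, threshold):
    """Seconds from upset onset (t = 0) to the first control deflection
    exceeding `threshold`; None if no input occurs."""
    idx = np.flatnonzero(np.abs(np.asarray(control_trace)) > threshold)
    return idx[0] / fs_hz if idx.size else None

fs = 50                           # 50 Hz recording, hypothetical
trace = np.zeros(200)
trace[60:] = 8.0                  # pilot input begins at sample 60
t_first = time_to_first_input(trace, fs, threshold=2.0)
```

    Here the first above-threshold sample is index 60, so the metric evaluates to 60/50 = 1.2 s.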

  19. Combining qualitative and quantitative imaging evaluation for the assessment of genomic DNA integrity: The SPIDIA experience.

    PubMed

    Ciniselli, Chiara Maura; Pizzamiglio, Sara; Malentacchi, Francesca; Gelmini, Stefania; Pazzagli, Mario; Hartmann, Christina C; Ibrahim-Gawel, Hady; Verderio, Paolo

    2015-06-15

    In this note, we present an ad hoc procedure that combines qualitative (visual evaluation) and quantitative (ImageJ software) evaluations of Pulsed-Field Gel Electrophoresis (PFGE) images to assess the genomic DNA (gDNA) integrity of analyzed samples. This procedure could be suitable for the analysis of a large number of images by taking into consideration both the expertise of researchers and the objectiveness of the software. We applied this procedure on the first SPIDIA DNA External Quality Assessment (EQA) samples. Results show that the classification obtained by this ad hoc procedure allows a more accurate evaluation of gDNA integrity with respect to a single approach. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Quantitative application of biodegradation data to environmental risk and exposure assessments

    SciTech Connect

    Larson, R.J.; Cowan, C.E.

    1995-08-01

    Biodegradation is an important removal mechanism for natural and synthetic organic chemicals released to aquatic, benthic, and terrestrial ecosystems. It results in a decrease in the overall mass or load of chemicals present in the environment and is key in preventing the accumulation and persistence of chemicals in specific environmental compartments. Although biodegradation is an important process for minimizing potential adverse impacts on environmental systems, it has not been traditionally considered in a quantitative fashion in environmental risk assessments. This article outlines an approach and provides simple kinetic criteria for incorporating biodegradation rate data into environmental exposure and risk assessments. The approach is a generic one that relates biodegradation half-lives to chemical residence times in specific environmental compartments. It is broadly applicable to any organic chemical in a range of environmental compartments and has potential use as a technical and regulatory tool to better quantify environmental exposure and risk.
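
    The kinetic criterion described above, comparing a biodegradation half-life to a compartment residence time, can be sketched with first-order decay; the half-lives and residence time below are hypothetical.

```python
import math

def fraction_remaining(half_life_days, residence_time_days):
    """First-order decay: C/C0 = exp(-k t), with k = ln 2 / half-life."""
    k = math.log(2) / half_life_days
    return math.exp(-k * residence_time_days)

# a chemical whose half-life is much shorter than the compartment residence
# time is largely removed before it can accumulate or be exported downstream
fast = fraction_remaining(2.0, 30.0)    # half-life 2 d, river residence 30 d
slow = fraction_remaining(60.0, 30.0)   # half-life 60 d, same residence time
```

    With a 2-day half-life essentially nothing survives a 30-day residence, while a 60-day half-life leaves about 71% remaining, illustrating why the half-life-to-residence-time ratio works as a persistence screen.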

  1. Quantitative assessment of Cerenkov luminescence for radioguided brain tumor resection surgery

    NASA Astrophysics Data System (ADS)

    Klein, Justin S.; Mitchell, Gregory S.; Cherry, Simon R.

    2017-05-01

    Cerenkov luminescence imaging (CLI) is a developing imaging modality that detects radiolabeled molecules via visible light emitted during the radioactive decay process. We used a Monte Carlo-based computer simulation to quantitatively investigate CLI compared to direct detection of the ionizing radiation itself as an intraoperative imaging tool for assessment of brain tumor margins. Our brain tumor model consisted of a 1 mm spherical tumor remnant embedded up to 5 mm in depth below the surface of normal brain tissue. Tumor-to-background contrasts ranging from 2:1 to 10:1 were considered. We quantified all decay signals (e±, gamma photon, Cerenkov photons) reaching the brain volume surface. CLI proved to be the most sensitive method for detecting the tumor volume in both imaging and non-imaging strategies, as assessed by contrast-to-noise ratio and by the receiver operating characteristic output of a channelized Hotelling observer.
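    As a rough illustration of the contrast-to-noise ratio used above to score detectability, here is a minimal sketch; the sample numbers are hypothetical, whereas the paper's CNR is computed from simulated detector signals:

```python
from statistics import mean, stdev

def contrast_to_noise(tumor_signal, background_signal):
    """CNR: difference of mean signals, normalized by the background noise
    (sample standard deviation of the background)."""
    return (mean(tumor_signal) - mean(background_signal)) / stdev(background_signal)

# Hypothetical photon counts over tumor and background regions.
cnr = contrast_to_noise([20.0, 22.0, 24.0], [10.0, 12.0, 14.0, 12.0, 12.0])
```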

  2. [Quantitative diagnosis of hypernasality in cleft lip and palate patients by computerized nasal quality assessment].

    PubMed

    Bressmann, T; Sader, R; Awan, S; Busch, R; Zeilhofer, H F; Horch, H H

    1999-05-01

    In patients with cleft lip and palate (CLP), the assessment of velopharyngeal morphology and function and the quantitative analysis of the perceptual consequences of velopharyngeal insufficiency are of major importance for the effective planning of velopharyngoplasties for speech improvement. The NasalView, a new instrument for the objective assessment of rhinophonia, is presented. The NasalView measures nasalance, the relative sound pressure level of the nasal signal in speech, expressed as a percentage. To evaluate the effectiveness of the computerised measurement of nasalance, 156 patients with surgically treated CLP were examined. The NasalView differentiated with high sensitivity and specificity between patients with normal nasal resonance and patients with varying degrees of hypernasality. To illustrate the importance of the NasalView in making the decision for a velopharyngoplasty, a single case is presented.

  3. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI), or other diseases. In this work, we use a robotic device to obtain information about the interaction occurring between patient and robot, and use it as a measure for assessing the patient. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque measured while the subject performs a given periodic movement with the robot. IMPA is proposed as a quantitative measure of the subject's level of motor impairment. The method is indirectly tested by asking healthy subjects to lift a barbell to degrade their motor function. The experimental results show that the IMPA has the potential to provide proper information on the subject's motor function level.
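    The IMPA metric as defined, the root mean square of the interactive torque over a movement, reduces to a one-line computation. This sketch assumes a plain list of sampled torque values; the robot-side data acquisition is not reproduced here:

```python
import math

def impa(torques):
    """Root mean square of sampled interactive torque values (N·m)."""
    return math.sqrt(sum(tau * tau for tau in torques) / len(torques))

# Hypothetical torque samples over one movement cycle.
score = impa([3.0, -4.0])
```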

  4. Quantitative assessment of Cerenkov luminescence for radioguided brain tumor resection surgery.

    PubMed

    Klein, Justin S; Mitchell, Gregory; Cherry, Simon

    2017-03-13

    Cerenkov luminescence imaging (CLI) is a developing imaging modality that detects radiolabeled molecules via visible light emitted during the radioactive decay process. We used a Monte Carlo-based computer simulation to quantitatively investigate CLI compared to direct detection of the ionizing radiation itself as an intraoperative imaging tool for assessment of brain tumor margins. Our brain tumor model consisted of a 1 mm spherical tumor remnant embedded up to 5 mm in depth below the surface of normal brain tissue. Tumor-to-background contrasts ranging from 2:1 to 10:1 were considered. We quantified all decay signals (e+/-, gamma photon, Cerenkov photons) reaching the brain volume surface. CLI proved to be the most sensitive method for detecting the tumor volume in both imaging and non-imaging strategies, as assessed by contrast-to-noise ratio and by the receiver operating characteristic output of a channelized Hotelling observer.

  5. Quantitative assessment of groundwater vulnerability using index system and transport simulation, Huangshuihe catchment, China.

    PubMed

    Yu, Cheng; Yao, Yingying; Hayes, Gregory; Zhang, Baoxiang; Zheng, Chunmiao

    2010-11-15

    Groundwater vulnerability assessment has become an increasingly important environmental management tool. Most existing vulnerability assessment approaches are index systems, which have significant disadvantages; quantitative studies of vulnerability indicators grounded in objective physical process modeling are needed. In this study, we assessed vulnerability in the Huangshuihe catchment in Shandong province, China, using both contaminant transport simulations and index system approaches. The transit time of 75% of a hypothetically injected contaminant concentration was taken as the vulnerability indicator. First, we collected field data for the Huangshuihe catchment and divided the catchment into 34 sub-areas, each of which can be treated as a transport sub-model. Next, we constructed a Hydrus1D transport model of the catchment, with different input values for different sub-areas. Third, we used Monte Carlo simulation to augment the collected data and performed the vulnerability assessment using statistics of the contaminant transit time as the indicator. Finally, for comparison with the simulation-based result, we applied two index systems to the Huangshuihe catchment: the DRASTIC system, and a system we tentatively constructed by examining the relationships between transit time and the input parameters through simple variation of the input values. The comparison between the two index systems and the transport simulation approach partially validated DRASTIC, and the construction of the new tentative index system represents an attempt to build index approaches on physical process simulation.

  6. Quantitative assessment of participant knowledge and evaluation of participant satisfaction in the CARES training program.

    PubMed

    Goodman, Melody S; Si, Xuemei; Stafford, Jewel D; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2012-01-01

    The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members on evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from quantitative questions on the assessments, pre- and post-tests, and evaluations. CARES fellows' knowledge increased at follow-up (75% of questions answered correctly, on average) compared with the baseline assessment (38% answered correctly, on average); post-test scores were higher than pre-test scores in 9 of 11 sessions. Fellows enjoyed the training and rated all sessions highly on the evaluations. The CARES fellows training program succeeded in satisfying participants and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community-academic research partnerships.

  7. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by considering (1) the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to the skin. This assessment of potential exposure is then combined with data from standard safety assessments of components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments for selected diaper ingredient chemicals for which the establishment of acceptable and safe exposure levels was demonstrated.
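    The margin-of-safety arithmetic follows the standard QRA pattern sketched above: an exposure dose estimated from product-use assumptions is compared against a toxicological benchmark. The parameter names and example values below are hypothetical placeholders, not figures from the study:

```python
def daily_skin_dose(conc_mg_per_g, material_g, transfer_fraction, body_weight_kg):
    """Hypothetical exposure estimate: amount of a component reaching the skin
    per day, normalized by body weight (mg/kg/day)."""
    return conc_mg_per_g * material_g * transfer_fraction / body_weight_kg

def margin_of_safety(noael_mg_per_kg_day, exposure_mg_per_kg_day):
    """MOS: toxicological benchmark divided by the estimated exposure."""
    return noael_mg_per_kg_day / exposure_mg_per_kg_day

# Illustrative numbers only: 0.5 mg/g component concentration, 10 g of material
# contacting skin, 1% transfer to skin, 8 kg infant, NOAEL of 5 mg/kg/day.
dose = daily_skin_dose(0.5, 10.0, 0.01, 8.0)
mos = margin_of_safety(5.0, dose)
```

    A large MOS (here, several hundred) indicates that estimated exposure sits far below the no-observed-adverse-effect level.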

  8. Real-Time, In Vivo Monitoring, and Quantitative Assessment of Intra-Arterial Vasospasm Therapy.

    PubMed

    Gölitz, Philipp; Kaschka, Iris; Lang, Stefan; Roessler, Karl; Knossalla, Frauke; Doerfler, Arnd

    2016-08-01

    Our study aimed to evaluate whether the effect of intra-arterial vasospasm therapy can be assessed quantitatively by in vivo blood flow analysis using the postprocessing algorithm parametric color coding (PCC). We evaluated 17 patients presenting with acute clinical deterioration due to vasospasm following subarachnoid hemorrhage who were treated with intra-arterial nimodipine application. Pre- and post-interventional DSA series were post-processed by PCC. The relative time to maximum opacification (rTmax) was calculated in 14 arterially and venously located points of interest, and from these data the pre- and post-interventional cerebral circulation time (CirT) was calculated. Additionally, the arterial vessel diameters were measured. Pre- and post-interventional values were compared and tested for significance. Flow analysis revealed a statistically nonsignificant prolongation of rTmax in all arterial vessel segments after treatment. The mean CirT was 5.62 s (±1.19 s) pre-interventionally and 5.16 s (±0.81 s) post-interventionally, a statistically significant difference (p = 0.039). A significantly increased diameter was measurable in all arterial segments post-interventionally. PCC is a fast, readily applicable imaging technique that allows quantitative assessment of the effect of intra-arterial vasospasm therapy via real-time, in vivo blood flow analysis. Our results appear to validate in vivo that intra-arterial nimodipine application not only induces vasodilatation of the larger vessels but also improves microcirculatory flow, leading to a shortened cerebral CirT that reaches the normal range post-interventionally. Procedural monitoring via PCC offers the option of quantitatively comparing different therapy regimes, allowing optimization of existing approaches and implementation of individualized treatment strategies.
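    The circulation-time calculation implied above, venous minus arterial time to maximum opacification, can be sketched as follows; the example rTmax values are invented, whereas the study averaged over 14 defined points of interest:

```python
from statistics import mean

def circulation_time(arterial_rtmax_s, venous_rtmax_s):
    """Cerebral circulation time: mean venous rTmax minus mean arterial rTmax,
    both in seconds."""
    return mean(venous_rtmax_s) - mean(arterial_rtmax_s)

# Hypothetical rTmax measurements (s) at arterial and venous points of interest.
cirt = circulation_time([1.0, 1.2], [6.4, 6.8])
```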

  9. Changes in transmural distribution of myocardial perfusion assessed by quantitative intravenous myocardial contrast echocardiography in humans

    PubMed Central

    Fukuda, S; Muro, T; Hozumi, T; Watanabe, H; Shimada, K; Yoshiyama, M; Takeuchi, K; Yoshikawa, J

    2002-01-01

    Objective: To clarify whether changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous myocardial contrast echocardiography (MCE) in humans. Methods: 31 patients underwent dipyridamole stress MCE and quantitative coronary angiography. Intravenous MCE was performed by continuous infusion of Levovist. Images were obtained from the apical four chamber view with alternating pulsing intervals both at rest and after dipyridamole infusion. Images were analysed offline by placing regions of interest over both endocardial and epicardial sides of the mid-septum. The background subtracted intensity versus pulsing interval plots were fitted to an exponential function, y = A(1 − e^(−βt)), where A is the plateau level and β is the rate of rise. Results: Of the 31 patients, 16 had significant stenosis (> 70%) in the left anterior descending artery (group A) and 15 did not (group B). At rest, there were no differences in the A endocardial to epicardial ratio (A-EER) and β-EER between the two groups (mean (SD) 1.2 (0.6) v 1.2 (0.8) and 1.2 (0.7) v 1.1 (0.6), respectively, NS). During hyperaemia, β-EER in group A was significantly lower than that in group B (1.0 (0.5) v 1.4 (0.5), p < 0.05) and A-EER did not differ between the two groups (1.0 (0.5) v 1.2 (0.4), NS). Conclusions: Changes in transmural distribution of myocardial perfusion under significant coronary artery stenosis can be assessed by quantitative intravenous MCE in humans. PMID:12231594
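    The replenishment model y = A(1 − e^(−βt)) can be fitted without specialized software by scanning β and solving A in closed form at each candidate. This is an illustrative least-squares sketch on synthetic data, not the offline analysis used in the study:

```python
import math

def fit_replenishment(t, y):
    """Least-squares fit of y = A * (1 - exp(-beta * t)): scan candidate beta
    values and solve A in closed form for each, keeping the best fit."""
    best = None                                  # (A, beta, sse)
    for b in range(1, 501):
        beta = b / 100.0                         # candidate rates 0.01 .. 5.00
        g = [1.0 - math.exp(-beta * ti) for ti in t]
        denom = sum(gi * gi for gi in g)
        if denom == 0.0:
            continue
        # For fixed beta, the optimal A minimizes sum((y - A*g)^2).
        A = sum(yi * gi for yi, gi in zip(y, g)) / denom
        sse = sum((yi - A * gi) ** 2 for yi, gi in zip(y, g))
        if best is None or sse < best[2]:
            best = (A, beta, sse)
    return best[0], best[1]

# Recover the parameters from a noiseless synthetic curve (A = 10, beta = 0.5).
t = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [10.0 * (1.0 - math.exp(-0.5 * ti)) for ti in t]
A, beta = fit_replenishment(t, y)
```

    The endocardial-to-epicardial ratios (A-EER, β-EER) reported above are then simply the ratios of the fitted parameters from the two regions of interest.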

  10. Enantioselective reductive transformation of climbazole: A concept towards quantitative biodegradation assessment in anaerobic biological treatment processes.

    PubMed

    Brienza, Monica; Chiron, Serge

    2017-06-01

    A chiral analytical method using liquid chromatography–high resolution mass spectrometry has been validated for the determination of climbazole (CBZ) enantiomers in wastewater and sludge, with quantification limits below 1 ng/L and 2 ng/g, respectively. On the basis of this newly developed analytical method, the stereochemistry of CBZ was investigated over time in biotic and sterile sludge batch experiments under anoxic dark and light conditions, and during biological wastewater treatment by subsurface flow constructed wetlands. Stereoselective degradation of CBZ was observed exclusively under biotic conditions, confirming that variations in enantiomeric fraction are specific to biodegradation processes. Abiotic enantiomerization of CBZ was insignificant at circumneutral pH, and CBZ was always biotransformed into CBZ-alcohol through the specific, enantioselective reduction of its ketone function to a secondary alcohol. This transformation was almost quantitative, and biodegradation of both enantiomers followed good first-order kinetics. The possibility of applying the Rayleigh equation to enantioselective CBZ biodegradation processes was investigated. The enantiomeric enrichment results allowed a quantitative assessment of in situ biodegradation processes, owing to a good fit (R² > 0.96) of the anoxic/anaerobic CBZ biodegradation to the Rayleigh dependency in all biotic microcosms; the approach was also applied in subsurface flow constructed wetlands. This work extends the concept of applying the Rayleigh equation for quantitative biodegradation assessment of organic contaminants to enantioselective processes operating under anoxic/anaerobic conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.
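    One common form of the Rayleigh dependency for enantioselective degradation relates the enantiomeric ratio to the residual substrate fraction, ln(ER_t/ER_0) = (α − 1)·ln f, with α the fractionation factor. The sketch below assumes this parameterization, which may differ in detail from the form used in the paper:

```python
def residual_fraction(er_t, er_0, alpha):
    """Residual substrate fraction f implied by the Rayleigh dependency
    ln(ER_t / ER_0) = (alpha - 1) * ln(f)."""
    return (er_t / er_0) ** (1.0 / (alpha - 1.0))

def biodegraded_percent(er_t, er_0, alpha):
    """Extent of in situ biodegradation implied by enantiomeric enrichment."""
    return 100.0 * (1.0 - residual_fraction(er_t, er_0, alpha))

# Hypothetical values: alpha = 1.02, initial ER = 1.0 (racemic), and an
# observed ER consistent with half the substrate remaining.
extent = biodegraded_percent(0.5 ** 0.02, 1.0, 1.02)
```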

  11. Quantitative assessments of burn degree by high-frequency ultrasonic backscattering and statistical model

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Hsun; Huang, Chih-Chung; Wang, Shyh-Hau

    2011-02-01

    An accurate and quantitative modality for assessing burn degree is crucial for determining the further treatments to be applied to burn injury patients. Ultrasound at frequencies higher than 20 MHz has been applied to dermatological diagnosis owing to its high resolution and noninvasive capability. Yet a reliable means of sensitively and quantitatively correlating burn degree with ultrasonic measurements is still lacking. Thus, a 50 MHz ultrasound system was developed and implemented to measure ultrasonic signals backscattered from burned skin tissues. Various burn degrees were produced by placing a 100 °C brass plate onto the dorsal skin of anesthetized rats for durations ranging from 5 to 20 s. The burn degrees were correlated with ultrasonic parameters, including the integrated backscatter (IB) and the Nakagami parameter (m), calculated from ultrasonic signals acquired from the burned tissues over a 5 × 1.4 mm (width × depth) area. Results demonstrated that both IB and m decreased exponentially with increasing burn degree. Specifically, an IB of -79.0 ± 2.4 (mean ± standard deviation) dB for normal skin tissues tended to decrease to -94.0 ± 1.3 dB for those burned for 20 s, while the corresponding Nakagami parameters tended to decrease from 0.76 ± 0.08 to 0.45 ± 0.04. The variation of both IB and m was partially associated with changes in the properties of collagen fibers in the burned tissues, as verified by tissue histological sections. The m parameter may be particularly sensitive for differentiating burned skin because it changes at a greater rate with respect to different burn durations. These ultrasonic parameters, in conjunction with high-frequency B-mode and Nakagami images, could have the potential to assess burn degree quantitatively.
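    The Nakagami parameter m mentioned above is commonly estimated from the backscattered envelope by the moment (inverse normalized variance) estimator, m = E[R²]² / Var(R²). A minimal sketch on an arbitrary envelope sample (illustrative data, not signals from the study):

```python
from statistics import mean

def nakagami_m(envelope):
    """Moment estimator of the Nakagami shape parameter:
    m = E[R^2]^2 / Var(R^2), using the population variance of R^2."""
    r2 = [r * r for r in envelope]
    mu = mean(r2)
    var = mean([(x - mu) ** 2 for x in r2])
    return mu * mu / var

# Small hypothetical envelope sample; real estimates use many samples per window.
m = nakagami_m([1.0, 1.0, 1.0, 3.0])
```

    Values of m below 1 indicate pre-Rayleigh (more heterogeneous) scattering, consistent with the decrease reported for burned tissue.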

  12. Assessment of a semi-quantitative screening method for diagnosis of ethylene glycol poisoning.

    PubMed

    Sankaralingam, Arun; Thomas, Annette; James, David R; Wierzbicki, Anthony S

    2017-07-01

    Background Ethylene glycol poisoning remains a rare but important presentation to acute toxicology units. Guidelines recommend that ethylene glycol be available as an 'urgent' test within 4 h, but this is difficult to deliver in practice. This study assessed a semi-quantitative enzymatic spectrophotometric assay for ethylene glycol compatible with automated platforms. Methods The ethylene glycol method was assessed in 21 samples from patients with an increased anion gap and metabolic acidosis not due to ethylene glycol ingestion, and seven samples known to contain ethylene glycol. All samples were analysed in random order, blinded to their origin, on a laboratory spectrophotometer. Results Seven samples were known to contain ethylene glycol at concentrations >100 mg/L; the method correctly identified all seven as containing ethylene glycol. No false-positives were observed. Thirteen samples gave clearly negative results. Ethylene glycol was present at <20 mg/L in one sample, but this sample remained within the limits of the negative control. Passing–Bablok regression of estimated ethylene glycol concentrations against results obtained when the samples were analysed using the quantitative method on an automated analyser showed good correlation (R = 0.84) but an apparent under-recovery. Conclusions A semi-quantitative assay for ethylene glycol discriminated well between samples containing ethylene glycol and those with other causes of acidosis. It is a practical small-scale assay for the rapid identification of ethylene glycol poisoning.

  13. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    PubMed Central

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-01-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a second derivative ratio (SDR) parameter is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 at high temporal resolution with reduced measurement artifacts induced by different skin conditions, in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive, quantitative assessment of skin tissue oxygenation at high temporal resolution. PMID:25734405
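    The second derivative ratio idea can be illustrated with central finite differences over equally spaced wavelength triplets. The triplet grouping, spacing, and spectra below are illustrative assumptions; the paper's calibration of SDR to StO2 is not reproduced:

```python
def second_derivative(y_minus, y0, y_plus, h):
    """Central finite-difference second derivative at the middle point."""
    return (y_plus - 2.0 * y0 + y_minus) / (h * h)

def sdr(spectrum, h=8.0):
    """Ratio of second derivatives over two wavelength triplets.
    `spectrum` holds absorbance at six wavelengths ordered as two equally
    spaced triplets; `h` is the assumed spacing in nm."""
    d1 = second_derivative(spectrum[0], spectrum[1], spectrum[2], h)
    d2 = second_derivative(spectrum[3], spectrum[4], spectrum[5], h)
    return d1 / d2

# Hypothetical absorbance values at the six wavelengths.
ratio = sdr([1.0, 2.0, 4.0, 1.0, 3.0, 4.0])
```

    Because the ratio divides one curvature by another, multiplicative effects common to both triplets (e.g., overall attenuation) tend to cancel, which is the motivation for using an SDR rather than raw intensities.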

  14. Using Non-Invasive Multi-Spectral Imaging to Quantitatively Assess Tissue Vasculature

    SciTech Connect

    Vogel, A; Chernomordik, V; Riley, J; Hassan, M; Amyot, F; Dasgeb, B; Demos, S G; Pursley, R; Little, R; Yarchoan, R; Tao, Y; Gandjbakhche, A H

    2007-10-04

    This research describes a non-invasive, non-contact method used to quantitatively analyze the functional characteristics of tissue. Multi-spectral images collected at several near-infrared wavelengths are input into a mathematical optical skin model that considers the contributions from different analytes in the epidermis and dermis skin layers. Through a reconstruction algorithm, we can quantify the percentage of blood in a given area of tissue and the fraction of that blood that is oxygenated. Imaging normal tissue confirms previously reported values for the percentage of blood in tissue and the percentage of blood that is oxygenated in tissue and surrounding vasculature, both in the normal state and when ischemia is induced. This methodology has been applied to assess vascular Kaposi's sarcoma lesions and the surrounding tissue before and during experimental therapies. The multi-spectral imaging technique has been combined with laser Doppler imaging to gain additional information. Results indicate that these techniques are able to provide quantitative and functional information about tissue changes during experimental drug therapy and to investigate progression of disease before changes are visibly apparent, suggesting a potential for them to be used as complementary imaging techniques in clinical assessment.

  15. Consistencies and inconsistencies underlying the quantitative assessment of leukemia risk from benzene exposure

    SciTech Connect

    Lamm, S.H.; Walters, A.S.; Wilson, R.; Byrd, D.M.; Grunwald, H.

    1989-07-01

    This paper examines recent risk assessments for benzene and observes a number of inconsistencies within individual studies and consistencies between studies that should affect the quantitative determination of the risk from benzene exposure. Comparisons across studies show that only acute myeloid leukemia (AML) is consistently found in excess with significant benzene exposure. The data from the Pliofilm study that forms the basis of most quantitative assessments reveal that all the AML cases came from only one of the three studied plants, while all the benzene exposure data came from the other plants. Hematological data from the 1940s from the plant from which almost all of the industrial hygiene exposure data come do not correlate well with the originally published exposure estimates, but do correlate well with an alternative set of exposure estimates that are much greater than those originally published. Temporal relationships within the study are not consistent with those of other studies. The dose-response relationship is strongly nonlinear. Other data suggest that the leukemogenic effect of benzene is nonlinear and may derive from a threshold toxicity.

  16. Quantitative instruments used to assess children's sense of smell: a review article.

    PubMed

    Moura, Raissa Gomes Fonseca; Cunha, Daniele Andrade; Gomes, Ana Carolina de Lima Gusmão; Silva, Hilton Justino da

    2014-01-01

    To systematically gather, from the available literature, the quantitative instruments used to assess the sense of smell in studies carried out with children. The study surveyed the PubMed and Bireme platforms and the MedLine, Lilacs, regional SciELO, and Web of Science databases, followed by selection and critical analysis of the articles found. We selected original articles related to the topic in question, conducted exclusively with children and written in Portuguese, English, or Spanish. We excluded studies addressing other phases of human development, whether exclusively or concurrently with the pediatric population; studies on animals; literature review articles; dissertations; book chapters; case study articles; and editorials. A data extraction protocol was created for this study, including the following information: author, department, year, location, population/sample, age, purpose of the study, methods, and main results. We found 8,451 articles by searching keywords and identifiers. Of this total, 5,928 were excluded by title, 2,366 by abstract, and 123 after reading the full text. Thus, 34 articles were selected, of which 28 were duplicated across the databases, leaving 6 articles analyzed in this review. We observed a lack of standardization of the quantitative instruments used to assess children's sense of smell, with great variability in test methodology, which reduces the effectiveness and reliability of the results.

  17. Quantitative safety assessment of computer based I and C systems via modular Markov analysis

    SciTech Connect

    Elks, C. R.; Yu, Y.; Johnson, B. W.

    2006-07-01

    This paper gives a brief overview of a methodology, based on quantitative metrics, for evaluating digital I and C systems that has been under development at the Univ. of Virginia for a number of years. Our quantitative assessment methodology is based on three well understood and extensively practiced disciplines in the dependability assessment field: (1) system level fault modeling and fault injection, (2) safety and coverage based dependability modeling methods, and (3) statistical estimation of model parameters used for safety prediction. This paper makes two contributions. The first relates to incorporating design flaw information into homogeneous Markov models when such data are available. The second is to introduce a Markov modeling method, called Modular Markov Chain analysis, for managing the modeling complexity of large distributed I and C systems in the prediction of safety and reliability. This method allows Markov models of the system to be composed in a modular manner. In doing so, it addresses two important issues: (1) the models are more visually representative of the functional structure of the system, and (2) important failure dependencies that naturally occur in complex systems are modeled accurately. (authors)
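    To give a minimal flavor of the coverage-based Markov modeling named above: a module fails with per-step probability λ, and fault coverage c decides whether the failure is detected (fail-safe) or undetected (fail-unsafe). The numbers and the simple discrete-time formulation are illustrative assumptions, not the paper's model:

```python
def step(dist, P):
    """One step of a discrete-time Markov chain: dist' = dist * P."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# States: 0 = operational, 1 = fail-safe, 2 = fail-unsafe.
lam, c = 0.01, 0.99                      # per-step failure probability, coverage
P = [
    [1 - lam, c * lam, (1 - c) * lam],   # failures split by coverage
    [0.0, 1.0, 0.0],                     # fail-safe is absorbing
    [0.0, 0.0, 1.0],                     # fail-unsafe is absorbing
]

dist = [1.0, 0.0, 0.0]                   # start fully operational
for _ in range(1000):
    dist = step(dist, P)
unsafe_prob = dist[2]                    # probability of the unsafe state
```

    The long-run unsafe probability approaches (1 − c), which is why coverage estimation (via fault injection) dominates the safety prediction.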

  18. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the semi-quantitative assessment of pulmonary perfusion acquired from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The proposed automatic analysis approach is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function to assess the local hemodynamic parameters, i.e., mean transit time, pulmonary blood volume, and pulmonary blood flow. The discrete deconvolution method implemented here uses a truncated singular value decomposition (tSVD). Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, to be considered at least as a preliminary diagnostic step given the wide availability and non-invasive nature of the technique.
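    Under indicator-dilution theory, the mean transit time is the first moment of the concentration-time curve, MTT = Σ t·C(t) / Σ C(t). A minimal sketch on a hypothetical curve follows; the tSVD deconvolution step itself is omitted:

```python
def mean_transit_time(times_s, conc):
    """Discrete first moment of a concentration-time curve:
    MTT = sum(t * C(t)) / sum(C(t)), assuming uniform sampling."""
    return sum(t * c for t, c in zip(times_s, conc)) / sum(conc)

# Hypothetical tissue concentration curve sampled at 1, 2, 3 s.
mtt = mean_transit_time([1.0, 2.0, 3.0], [1.0, 2.0, 1.0])
```

    By the central volume principle, pulmonary blood flow then follows as blood volume divided by MTT, which is how the three parametric maps relate.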

  19. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation.

    PubMed

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K; Xu, Ronald X

    2015-03-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO₂). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO₂ reconstruction. Based on the forward simulation results, a second derivative ratio (SDR) parameter is derived as a function of cutaneous tissue StO₂. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO₂ can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO₂ imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO₂ at high temporal resolution with reduced measurement artifacts induced by different skin conditions, in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO₂ imaging technique has the potential for noninvasive, quantitative assessment of skin tissue oxygenation at high temporal resolution.

  20. Quantitative assessment of wound healing using high-frequency ultrasound image analysis.

    PubMed

    Mohafez, H; Ahmad, S A; Hadizadeh, M; Moghimi, S; Roohi, S A; Marhaban, M H; Saripan, M I; Rampal, S

    2017-05-29

    We aimed to develop a method for the quantitative assessment of wound healing in ulcerated diabetic feet. High-frequency ultrasound (HFU) images of 30 wounds were acquired in a controlled environment on post-debridement days 7, 14, 21, and 28. Meaningful features portraying changes in the structure and intensity of echoes during healing were extracted from the images, their relevance and discriminatory power verified by analysis of variance. Relative analysis of tissue healing was conducted by developing a feature-based healing function, optimised using the pattern-search method. Its performance was investigated through the leave-one-out cross-validation technique and reconfirmed using principal component analysis. The constructed healing function depicted tissue changes during healing with 87.8% accuracy. The first principal component derived from the extracted features demonstrated a pattern similar to the constructed healing function, accounting for 86.3% of the data variance. The developed wound analysis technique could be a viable tool for the quantitative assessment of diabetic foot ulcers during healing. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  1. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    NASA Astrophysics Data System (ADS)

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-03-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a second derivative ratio (SDR) parameter is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 at high temporal resolution with reduced measurement artifacts induced by different skin conditions in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation at high temporal resolution.
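A minimal numerical sketch of the second-derivative step behind the SDR follows. The abstract does not define the ratio explicitly, so grouping the six published wavelengths into two triplets and taking the ratio of their second finite differences is an illustrative assumption, not the paper's definition; `second_diff` is the standard three-point second-derivative formula for non-uniformly spaced samples.

```python
def second_diff(x, f):
    """Three-point second derivative for non-uniformly spaced wavelengths.
    Exact for any quadratic f(x)."""
    (x0, x1, x2), (f0, f1, f2) = x, f
    return 2.0 * (f0 * (x2 - x1) - f1 * (x2 - x0) + f2 * (x1 - x0)) / (
        (x1 - x0) * (x2 - x1) * (x2 - x0))

def sdr(attenuation):
    """Hypothetical second derivative ratio (SDR) from apparent attenuation
    measured at the six published wavelengths. The triplet grouping below
    is an assumption made for illustration only."""
    a = attenuation  # dict: wavelength in nm -> apparent attenuation
    num = second_diff((544, 552, 568), (a[544], a[552], a[568]))
    den = second_diff((576, 592, 600), (a[576], a[592], a[600]))
    return num / den
```

In the published algorithm the measured SDR is then inverted against forward-model simulations to read off StO2.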

  2. Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories

    NASA Astrophysics Data System (ADS)

    Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly

    The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In so doing, successful project health checks and monitoring of software process quality require strong project management skills, well-built onshore-offshore coordination, and often regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories, to facilitate CMMI implementation and reduce its cost for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.

  3. Quantitative assessment on soil enzyme activities of heavy metal contaminated soils with various soil properties.

    PubMed

    Xian, Yu; Wang, Meie; Chen, Weiping

    2015-11-01

    Soil enzyme activities are greatly influenced by soil properties and could be significant indicators of heavy metal toxicity in soil for bioavailability assessment. Two groups of experiments were conducted to determine the joint effects of heavy metals and soil properties on soil enzyme activities. Results showed that arylsulfatase was the most sensitive soil enzyme and could be used as an indicator to study the enzymatic toxicity of heavy metals under various soil properties. Soil organic matter (SOM) was the dominant factor affecting the activity of arylsulfatase in soil. A quantitative model was derived to predict the changes of arylsulfatase activity with SOM content. When the soil organic matter content was less than the critical point A (1.05% in our study), the arylsulfatase activity dropped rapidly. When the soil organic matter content was greater than the critical point A, the arylsulfatase activity gradually rose to higher levels, showing that soil microbial activities were enhanced rather than harmed. The SOM content needs to be over the critical point B (2.42% in our study) to protect the soil microbial community from harm due to severe Pb pollution (500 mg kg(-1) in our study). The quantitative model revealed the pattern of variation of enzymatic toxicity due to heavy metals under various SOM contents. The applicability of the model under a wider range of soil properties needs to be tested. The model, however, may provide a methodological basis for ecological risk assessment of heavy metals in soil.
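The abstract gives the two critical points but not the functional form of the model, so the sketch below only reproduces the qualitative pattern (rapid decline below A, gradual recovery above it) with assumed piecewise-linear slopes; treat everything except A and B as placeholders.

```python
def arylsulfatase_activity(som_percent, a_crit=1.05, b_crit=2.42):
    """Illustrative relative arylsulfatase activity (0-1) as a function of
    soil organic matter (SOM, %) under heavy-metal stress. Critical points
    A and B are taken from the abstract; the piecewise-linear shape and
    slopes are assumptions of this sketch."""
    if som_percent < a_crit:
        # below A: activity drops rapidly toward zero
        return max(0.0, 0.5 * som_percent / a_crit)
    # above A: activity gradually recovers, saturating near B
    return min(1.0, 0.5 + 0.5 * (som_percent - a_crit) / (b_crit - a_crit))
```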

  4. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of the genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic, and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with the greatest impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10(-6) and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (PPinf) (r = 0.44) and the storage time (Storage) (r = 0.08). This model allowed us to assess the impact of different factors influencing the risk of acquiring trichinellosis. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production.
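A quantitative risk assessment of this kind is typically evaluated by Monte Carlo simulation over the exposure chain. The sketch below shows the structure only: every distribution and parameter value is an illustrative assumption, not a value from the published model.

```python
import math
import random

def per_serving_risk(n_iter=100_000, seed=1):
    """Monte Carlo sketch of the per-serving risk of trichinellosis from
    pork sausage: prevalence -> larval load -> survival of storage ->
    dose-response. All numbers below are placeholders for illustration."""
    rng = random.Random(seed)
    cases = 0
    for _ in range(n_iter):
        if rng.random() < 1e-3:                        # serving from an infected pig
            larvae = rng.expovariate(1 / 50)           # larvae initially in the serving
            surviving = larvae * rng.uniform(0.0, 0.05)  # fraction surviving storage
            p_inf = 1 - math.exp(-0.01 * surviving)      # exponential dose-response
            if rng.random() < p_inf:
                cases += 1
    return cases / n_iter
```

Sensitivity coefficients like the reported r = 0.89 for larval survival are then obtained by correlating each sampled input with the simulated risk.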

  5. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation and assessment are nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R + D + I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence Model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data, and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to environmental research centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article.

  6. Quantitative photoacoustic assessment of ex-vivo lymph nodes of colorectal cancer patients

    NASA Astrophysics Data System (ADS)

    Sampathkumar, Ashwin; Mamou, Jonathan; Saegusa-Beercroft, Emi; Chitnis, Parag V.; Machi, Junji; Feleppa, Ernest J.

    2015-03-01

    Staging of cancers and selection of appropriate treatment require histological examination of multiple dissected lymph nodes (LNs) per patient, so that a staggering number of nodes require histopathological examination, and the finite resources of pathology facilities create a severe processing bottleneck. Histologically examining the entire 3D volume of every dissected node is not feasible; therefore, only the central region of each node is examined histologically, which results in severe sampling limitations. In this work, we assess the feasibility of using quantitative photoacoustics (QPA) to overcome the limitations imposed by current procedures and eliminate the resulting undersampling in node assessments. QPA is emerging as a new hybrid modality that assesses tissue properties and classifies tissue type based on multiple estimates derived from spectrum analysis of photoacoustic (PA) radiofrequency (RF) data and from statistical analysis of envelope-signal data derived from the RF signals. Our study seeks to use QPA to distinguish cancerous from non-cancerous regions of dissected LNs and hence serve as a reliable means of imaging and detecting small but clinically significant cancerous foci that would be missed by current methods. Dissected lymph nodes were placed in a water bath and PA signals were generated using a wavelength-tunable (680-950 nm) laser. A 26-MHz, f/2 transducer was used to sense the PA signals. We present an overview of our experimental setup; provide a statistical analysis of multi-wavelength classification parameters (mid-band fit, slope, intercept) obtained from the PA signal spectrum generated in the LNs; and compare QPA performance with our established quantitative ultrasound (QUS) techniques in distinguishing metastatic from non-cancerous tissue in dissected LNs. QPA-QUS methods offer a novel general means of tissue typing and evaluation in a broad range of disease-assessment applications, e.g., cardiac, intravascular
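The three classification parameters named in the abstract (mid-band fit, slope, intercept) come from a straight-line fit to the calibrated power spectrum of the RF data. A generic sketch, with band limits and units as placeholders:

```python
import numpy as np

def spectral_params(freq_mhz, spectrum_db):
    """Linear-fit spectral parameters used in QUS/QPA tissue typing:
    slope (dB/MHz), intercept (dB, extrapolated to 0 MHz), and mid-band
    fit (the fitted value at the centre of the analysis band)."""
    slope, intercept = np.polyfit(freq_mhz, spectrum_db, 1)
    midband_fit = slope * (freq_mhz[0] + freq_mhz[-1]) / 2.0 + intercept
    return slope, intercept, midband_fit
```

In the study, distributions of these parameters over node regions feed a classifier that separates metastatic from non-cancerous tissue.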

  7. Quantitative rock-fall hazard and risk assessment for Yosemite Valley, California

    NASA Astrophysics Data System (ADS)

    Stock, G. M.; Luco, N.; Collins, B. D.; Harp, E.; Reichenbach, P.; Frankel, K. L.

    2011-12-01

    Rock falls are a considerable hazard in Yosemite Valley, California, with more than 835 rock falls and other slope movements documented since 1857. Thus, rock falls pose potentially significant risk to the nearly four million annual visitors to Yosemite National Park. Building on earlier hazard assessment work by the U.S. Geological Survey, we performed a quantitative rock-fall hazard and risk assessment for Yosemite Valley. This work was aided by several new data sets, including precise Geographic Information System (GIS) maps of rock-fall deposits, airborne and terrestrial LiDAR-based point cloud data and digital elevation models, and numerical ages of talus deposits. Using the Global Positioning System (GPS), we mapped the positions of over 500 boulders on the valley floor and measured their distance relative to the mapped base of talus. Statistical analyses of these data yielded an initial hazard zone based on the 90th percentile distance of rock-fall boulders beyond the talus edge. This distance was subsequently scaled (either inward or outward from the 90th percentile line) based on rock-fall frequency information derived from a combination of cosmogenic beryllium-10 exposure dating of boulders beyond the edge of the talus and computer model simulations of rock-fall runout. The scaled distances provide the basis for a new hazard zone on the floor of Yosemite Valley. Once this zone was delineated, we assembled visitor, employee, and resident use data for each structure within the hazard zone to quantitatively assess risk exposure. Our results identify areas within the new hazard zone that may warrant more detailed study, for example of rock-fall susceptibility, which can be assessed through examination of high-resolution photographs, structural measurements on the cliffs, and empirical calculations derived from LiDAR point cloud data. 
This hazard and risk information is used to inform placement of existing and potential future infrastructure in Yosemite Valley.
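The percentile step described above is straightforward to sketch; representing the frequency-based inward/outward adjustment as a single scale factor is a simplification of this sketch, not the study's procedure.

```python
import numpy as np

def hazard_line_distance(boulder_distances_m, percentile=90.0, scale=1.0):
    """Distance of the hazard-zone boundary beyond the mapped talus edge:
    the 90th-percentile runout distance of mapped boulders, optionally
    scaled to reflect rock-fall frequency information (scale > 1 moves
    the line outward, scale < 1 inward)."""
    return scale * float(np.percentile(boulder_distances_m, percentile))
```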

  8. BTEX exposure assessment and quantitative risk assessment among petroleum product distributors.

    PubMed

    Heibati, Behzad; Pollitt, Krystal J Godri; Karimi, Ali; Yazdani Charati, Jamshid; Ducatman, Alan; Shokrzadeh, Mohammad; Mohammadyan, Mahmoud

    2017-10-01

    The aim of this study was to evaluate benzene, toluene, ethylbenzene, and xylene (BTEX) exposure among workers at four stations of a major oil distribution company. Personal BTEX exposure samples were collected over the working shift (8 h) for 50 workers at four stations of a major oil distribution company in Iran. Measured mean values for workers across the four sites were benzene (2437, 992, 584, and 2788 μg/m(3), respectively), toluene (4415, 2830, 1289, and 9407 μg/m(3)), ethylbenzene (781, 522, 187, and 533 μg/m(3)), and xylene (1134, 678, 322, and 525 μg/m(3)). The maximum mean concentration measured across sites was 2788 μg/m(3) for benzene (Station 4), 9407 μg/m(3) for toluene (Station 4), 781 μg/m(3) for ethylbenzene (Station 1) and 1134 μg/m(3) for xylene (Station 1). The 8-h averaged personal benzene exposure concentration exceeded the recommended value of 1600 μg/m(3) established by the Iranian Committee for Review and Collection of Occupational Exposure Limits and the American Conference of Governmental Industrial Hygienists. Mean values for excess lifetime cancer risk for exposure to benzene were then calculated across workers at each site. Estimates of excess risk ranged from 1.74 ± 4.05 (Station 4) to 8.31 ± 25.81 (Station 3). Risk was assessed by calculation of hazard quotients and hazard indexes, which indicated that xylene and particularly benzene were the strongest contributors. Tanker loading was the highest-risk occupation at these facilities. Risk management approaches to reducing exposures to BTEX compounds, especially benzene, will be important to the health of workers in Iran. Copyright © 2017 Elsevier Inc. All rights reserved.
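Excess lifetime cancer risk for inhaled benzene is conventionally computed as ELCR = CA × IUR, with the workplace concentration time-adjusted to a continuous lifetime exposure. The unit risk of ~6e-6 per μg/m(3) is the WHO figure for benzene; the exposure-factor defaults below are illustrative assumptions, not the study's values.

```python
def excess_lifetime_cancer_risk(benzene_ug_m3, unit_risk=6e-6,
                                hours_per_day=8.0, days_per_year=240.0,
                                exposure_years=30.0, lifetime_years=70.0):
    """ELCR = CA * IUR. CA adjusts the measured shift concentration to a
    continuous lifetime-average concentration; unit_risk is the inhalation
    unit risk per ug/m3 (WHO value for benzene used as the default)."""
    ca = (benzene_ug_m3 * (hours_per_day / 24.0)
          * (days_per_year / 365.0) * (exposure_years / lifetime_years))
    return ca * unit_risk
```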

  9. Quantitative and qualitative assessment of structural magnetic resonance imaging data in a two-center study.

    PubMed

    Chalavi, Sima; Simmons, Andrew; Dijkstra, Hildebrand; Barker, Gareth J; Reinders, A A T Simone

    2012-08-06

    Multi-center magnetic resonance imaging (MRI) studies present an opportunity to advance research by pooling data. However, brain measurements derived from MR-images are susceptible to differences in MR-sequence parameters. It is therefore necessary to determine whether there is an interaction between the sequence parameters and the effect of interest, and to minimise any such interaction by careful choice of acquisition parameters. As an exemplar of the issues involved in multi-center studies, we present data from a study in which we aimed to optimize a set of volumetric MRI-protocols to define a protocol giving data that are consistent and reproducible across two centers and over time. Optimization was achieved based on data quality and quantitative measures, in our case using FreeSurfer and Voxel Based Morphometry approaches. Our approach consisted of a series of five comparisons. Firstly, a single-center dataset was collected, using a range of candidate pulse-sequences and parameters chosen on the basis of previous literature. Based on initial results, a number of minor changes were implemented to optimize the pulse-sequences, and a second single-center dataset was collected. FreeSurfer data quality measures were compared between datasets in order to determine the best performing sequence(s), which were taken forward to the next stage of testing. We subsequently acquired short-term and long-term two-center reproducibility data, and quantitative measures were again assessed to determine the protocol with the highest reproducibility across centers. Effects of a scanner software and hardware upgrade on the reproducibility of the protocols at one of the centers were also evaluated. Assessing the quality measures from the first two datasets allowed us to define artefact-free protocols, all with high image quality as assessed by FreeSurfer. Comparing the quantitative test and retest measures, we found high within-center reproducibility for all protocols, but lower

  10. An electronic portfolio for quantitative assessment of surgical skills in undergraduate medical education.

    PubMed

    Sánchez Gómez, Serafín; Ostos, Elisa María Cabot; Solano, Juan Manuel Maza; Salado, Tomás Francisco Herrero

    2013-05-06

    We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summed to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Thirty-seven of 112 students (33%) used the e-Portfolio, of whom 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. 
For complex skills, 62% of students achieved the maximum score for participation as an independent operator, and 98% achieved

  11. An electronic portfolio for quantitative assessment of surgical skills in undergraduate medical education

    PubMed Central

    2013-01-01

    Background We evaluated a newly designed electronic portfolio (e-Portfolio) that provided quantitative evaluation of surgical skills. Medical students at the University of Seville used the e-Portfolio on a voluntary basis for evaluation of their performance in undergraduate surgical subjects. Methods Our new web-based e-Portfolio was designed to evaluate surgical practical knowledge and skills targets. Students recorded each activity on a form, attached evidence, and added their reflections. Students self-assessed their practical knowledge using qualitative criteria (yes/no), and graded their skills according to complexity (basic/advanced) and participation (observer/assistant/independent). A numerical value was assigned to each activity, and the values of all activities were summated to obtain the total score. The application automatically displayed quantitative feedback. We performed qualitative evaluation of the perceived usefulness of the e-Portfolio and quantitative evaluation of the targets achieved. Results Thirty-seven of 112 students (33%) used the e-Portfolio, of which 87% reported that they understood the methodology of the portfolio. All students reported an improved understanding of their learning objectives resulting from the numerical visualization of progress, all students reported that the quantitative feedback encouraged their learning, and 79% of students felt that their teachers were more available because they were using the e-Portfolio. Only 51.3% of students reported that the reflective aspects of learning were useful. Individual students achieved a maximum of 65% of the total targets and 87% of the skills targets. The mean total score was 345 ± 38 points. For basic skills, 92% of students achieved the maximum score for participation as an independent operator, and all achieved the maximum scores for participation as an observer and assistant. For complex skills, 62% of students achieved the maximum score for participation as an

  12. Comprehensive, Quantitative Risk Assessment of CO{sub 2} Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO{sub 2} capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. 
Comprehensive, quantitative risk assessments
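The ranking criteria listed above (probability of failure, severity, difficulty of early detection, fatality potential) correspond, in conventional FMEA practice, to a multiplicative risk priority number (RPN). Folding fatality potential in as a fourth multiplicative factor is an assumption of this sketch, and the example risk names and ratings are hypothetical.

```python
def risk_priority_number(probability, severity, detectability, fatality=1):
    """Rank a failure mode by multiplying 1-10 ratings, as in conventional
    FMEA. Higher detectability means failure is harder to detect early."""
    if not all(1 <= x <= 10 for x in (probability, severity, detectability, fatality)):
        raise ValueError("ratings must be on a 1-10 scale")
    return probability * severity * detectability * fatality

# rank a few hypothetical sequestration risks, highest RPN first
risks = {
    "caprock leakage": risk_priority_number(3, 9, 7),
    "wellbore failure": risk_priority_number(5, 8, 4),
    "induced seismicity": risk_priority_number(2, 6, 5),
}
ranking = sorted(risks, key=risks.get, reverse=True)
```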

  13. Quantitative Assessment of Breast Cosmetic Outcome After Whole-Breast Irradiation.

    PubMed

    Reddy, Jay P; Lei, Xiudong; Huang, Sheng-Cheng; Nicklaus, Krista M; Fingeret, Michelle C; Shaitelman, Simona F; Hunt, Kelly K; Buchholz, Thomas A; Merchant, Fatima; Markey, Mia K; Smith, Benjamin D

    2017-04-01

    To measure, by quantitative analysis of digital photographs, breast cosmetic outcome within the setting of a randomized trial of conventionally fractionated (CF) and hypofractionated (HF) whole-breast irradiation (WBI), and to identify how quantitative cosmesis metrics were associated with patient- and physician-reported cosmesis and whether they differed by treatment arm. From 2011 to 2014, 287 women aged ≥40 years with ductal carcinoma in situ or early invasive breast cancer were randomized to HF-WBI (42.56 Gy/16 fractions [fx] + 10-12.5 Gy/4-5 fx boost) or CF-WBI (50 Gy/25 fx + 10-14 Gy/5-7 fx). At 1 year after treatment we collected digital photographs, patient-reported cosmesis using the Breast Cancer Treatment and Outcomes Scale, and physician-reported cosmesis using the Radiation Therapy Oncology Group scale. Six quantitative measures of breast symmetry, labeled M1-M6, were calculated from anteroposterior digital photographs. For each measure, values closer to 1 imply greater symmetry, and values closer to 0 imply greater asymmetry. Associations between M1-M6 and patient- and physician-reported cosmesis and treatment arm were evaluated using the Kruskal-Wallis test. Among 245 evaluable patients, patient-reported cosmesis was strongly associated with M1 (vertical symmetry measure) (P<.01). Physician-reported cosmesis was similarly correlated with M1 (P<.01) and also with M2 (vertical symmetry, P=.01) and M4 (horizontal symmetry, P=.03). At 1 year after treatment, HF-WBI resulted in better values of M2 (P=.02) and M3 (P<.01) than CF-WBI; treatment arm was not significantly associated with M1, M4, M5, or M6 (P≥.12). Quantitative assessment of breast photographs reveals similar or improved cosmetic outcome with HF-WBI compared with CF-WBI 1 year after treatment. Assessing cosmetic outcome using these measures could be useful for future comparative effectiveness studies and outcome reporting. Copyright © 2016 Elsevier Inc. All rights reserved.
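The abstract does not define M1-M6, only their scale (1 = symmetric, near 0 = asymmetric). A generic index with that property, shown here purely for illustration, is the smaller-to-larger ratio of a paired left/right photographic measurement:

```python
def symmetry_index(left, right):
    """Generic symmetry index in the spirit of the study's M1-M6 measures
    (whose exact definitions are not given in the abstract): the ratio of
    the smaller to the larger of a paired left/right measurement, e.g. a
    hypothetical nipple-to-sternal-notch distance. Returns 1 for perfect
    symmetry, values near 0 for strong asymmetry."""
    lo, hi = sorted((left, right))
    return lo / hi if hi else 1.0
```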

  14. Quantitative assessment of fibrosis and steatosis in liver biopsies from patients with chronic hepatitis C

    PubMed Central

    Zaitoun, A; Al, M; Awad, S; Ukabam, S; Makadisi, S; Record, C

    2001-01-01

    Background—Hepatic fibrosis is one of the main consequences of liver disease. Both fibrosis and steatosis may be seen in some patients with chronic hepatitis C and alcoholic liver disease (ALD). Aims—To quantitate fibrosis and steatosis by stereological and morphometric techniques in patients with chronic hepatitis C and compare the results with a control group of patients with ALD; in addition, to correlate the quantitative features of fibrosis with the modified Ishak histological score. Materials and methods—Needle liver biopsies from 86 patients with chronic hepatitis C and from 32 patients with alcoholic liver disease (disease controls) were analysed by stereological and morphometric analyses using the Prodit 5.2 system. Haematoxylin and eosin and Picro-Mallory stained sections were used. The area fractions (AA) of fibrosis, steatosis, parenchyma, and other structures (bile duct and central vein areas) were assessed by the stereological method. The mean diameters of fat globules were determined by morphometric analysis. Results—Significant differences were found in the AA of fibrosis, including fibrosis within portal tract areas, between chronic hepatitis C patients and those with ALD (mean (SD): 19.14 (10.59) v 15.97 (12.51)). Portal and periportal (zone 1) fibrosis was significantly higher (p = 0.00004) in patients with chronic hepatitis C compared with the control group (mean (SD): 9.04 (6.37) v 3.59 (3.16)). Pericentral (zone 3) fibrosis occurred in both groups but was significantly more pronounced in patients with ALD. These results correlate well with the modified Ishak scoring system. However, in patients with cirrhosis (stage 6) and chronic hepatitis C the AA of fibrosis varied between 20% and 74%. The diameter of fat globules was significantly lower in patients with hepatitis C (p = 0.00002) than in the ALD group (mean (SD): 14.44 (3.45) v 18.4 (3.32)). Microglobules were more frequent in patients with chronic hepatitis C than in patients with ALD
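The area fractions (AA) reported above rest on the textbook stereological point-counting estimator: superimpose a uniform grid on the section and count grid points hitting the component, since the point fraction P_P is an unbiased estimator of the area fraction A_A. The internals of the Prodit system are not described in the abstract; this is the generic estimator only.

```python
def area_fraction_percent(points_on_component, total_points):
    """Stereological point-counting estimate of an area fraction:
    AA (%) ~ 100 * P_P, where P_P is the fraction of uniform grid points
    falling on the component of interest (fibrosis, steatosis, ...)."""
    return 100.0 * points_on_component / total_points

# e.g. 191 of 1000 grid points on fibrous tissue gives AA = 19.1%,
# comparable in magnitude to the 19.14% mean reported for hepatitis C
```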

  15. A qualitative and quantitative needs assessment of pain management for hospitalized orthopedic patients.

    PubMed

    Cordts, Grace A; Grant, Marian S; Brandt, Lynsey E; Mears, Simon C

    2011-08-08

    Despite advances in pain management, little formal teaching is given to practitioners and nurses in its use for postoperative orthopedic patients. The goal of our study was to determine the educational needs for orthopedic pain management of our residents, nurses, and physical therapists using a quantitative and qualitative assessment. The needs analysis was conducted in a 10-bed orthopedic unit at a teaching hospital and included a survey given to 20 orthopedic residents, 9 nurses, and 6 physical therapists, followed by focus groups addressing barriers to pain control and knowledge of pain management. Key challenges for nurses included not always having breakthrough pain medication orders and the gap in pain management between cessation of patient-controlled analgesia and ordering and administering oral medications. Key challenges for orthopedic residents included treating pain in patients with a history of substance abuse, assessing pain, and determining when to use long-acting vs short-acting opioids. Focus group assessments revealed a lack of training in pain management and the need for better coordination of care between nurses and practitioners and improved education about special needs groups (the elderly and those with substance abuse issues). This needs assessment showed that orthopedic residents and nurses receive little formal education on pain management, despite having to address pain on a daily basis. This information will be used to develop an educational program to improve pain management for postoperative orthopedic patients. An integrated educational program with orthopedic residents, nurses, and physical therapists would promote understanding of issues for each discipline.

  16. Quantitative Integration of Ndt with Probabilistic Fracture Mechanics for the Assessment of Fracture Risk in Pipelines

    NASA Astrophysics Data System (ADS)

    Kurz, J. H.; Cioclov, D.; Dobmann, G.; Boller, C.

    2010-02-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented that is based on the integration of Quantitative Non-destructive Inspection and Probabilistic Fracture Mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The fracture risk is evaluated in probabilistic terms. The probabilistic pattern is superposed over the deterministic one via Monte Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. The ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI) is an important feature of this approach. It is achieved by algorithmically integrating probabilistic FAD analysis and the Probability of Detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in the probability of failure when POD-characterized NDI is applied can be ascertained. Therefore, this procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. This enables better substantiation of both component reliability management and the cost-effectiveness of NDI timing.
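The core mechanism, POD-filtered Monte Carlo sampling of flaw sizes, can be sketched compactly. Lognormal flaw sizes, a fixed critical size standing in for the full FAD check, and a log-logistic POD(a) curve are all illustrative assumptions of this sketch, not the paper's models.

```python
import math
import random

def failure_probability(use_ndi, n=100_000, seed=7):
    """Monte Carlo sketch of how POD-characterized NDI reduces the
    probability of failure: sampled flaws that are detected are assumed
    repaired; undetected flaws larger than a critical size fail."""
    rng = random.Random(seed)
    failures = 0
    a_crit = 8.0                                  # critical flaw size, mm (assumed)
    for _ in range(n):
        a = rng.lognormvariate(1.0, 0.6)          # flaw size, mm (assumed distribution)
        if use_ndi:
            # log-logistic POD with a50 = 3 mm (assumed inspection quality)
            pod = 1.0 / (1.0 + math.exp(-(math.log(a) - math.log(3.0)) / 0.25))
            if rng.random() < pod:
                continue                          # flaw detected and repaired
        if a > a_crit:
            failures += 1
    return failures / n
```

Rerunning with different POD curves reproduces, in miniature, the paper's sensitivity analysis over inspection qualities.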

  17. Quantitative assessment of properties of make-up products by video imaging: application to lipsticks.

    PubMed

    Korichi, Rodolphe; Provost, Robin; Heusèle, Catherine; Schnebert, Sylvianne

    2000-11-01

    BACKGROUND/AIMS: The different properties and visual effects of lipstick have been studied by image analysis directly on volunteers. METHODS: After controlling the volunteer's position mechanically using an ophthalmic table and visually using an acquisition mask providing luminance indicators and guide marks, we acquired colour video images of the make-up area. From these images, we quantified the colour, gloss, covering power, long-lasting effect and streakiness using computer programs. RESULTS/CONCLUSION: Quantitative colorimetric assessment requires the transformation of the RGB components obtained by a colour video camera into CIELAB colorimetric space. The expression of each coordinate of the L*a*b* space as a function of R, G, B was carried out by a statistical method of polynomial approximations. A study using 24 colour images extracted from a Pantone(R) palette showed a very good correlation with a Minolta Colorimeter(R) CR 300. Colour assessment on volunteers required a segmentation method based on entropy maximization. The aim was to separate the colour information returned by the skin from that of the make-up area. This was very useful for precisely delimiting the contour between the skin and the product when the two colours were almost identical, and for evaluating streakiness. From this colour segmentation, an algorithm was developed to search for the shades most represented in the overall colour of the make-up area. The capacity to replicate what the consumer perceives of the make-up product, the ability to carry out studies without any contact with the skin surface, and the constant improvement of software and video acquisition systems all make video imaging a very useful tool in the quantitative assessment of the properties and visual effects of a make-up product.
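The "statistical method of polynomial approximations" is a least-squares fit of each L*, a*, b* coordinate against the camera's R, G, B values on a calibration set (such as the 24 Pantone patches). The second-degree polynomial with cross terms below is an assumed form; the paper does not state the degree.

```python
import numpy as np

def _design(rgb):
    """Polynomial design matrix: 1, R, G, B, squares, and cross terms."""
    r, g, b = np.asarray(rgb, dtype=float).T
    return np.column_stack([np.ones_like(r), r, g, b,
                            r * r, g * g, b * b, r * g, r * b, g * b])

def fit_rgb_to_lab(rgb, lab):
    """Fit one polynomial per Lab channel on calibration patches.
    rgb: (n, 3) camera values; lab: (n, 3) reference L*, a*, b*.
    Returns a (10, 3) coefficient matrix."""
    coefs, *_ = np.linalg.lstsq(_design(rgb), np.asarray(lab, dtype=float), rcond=None)
    return coefs

def rgb_to_lab(coefs, rgb):
    """Apply the fitted mapping to new RGB pixels."""
    return _design(rgb) @ coefs
```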

  18. Assessment of viable periodontal pathogens by reverse transcription quantitative polymerase chain reaction.

    PubMed

    Polonyi, M; Prenninger, N; Arweiler, N B; Haririan, H; Winklehner, P; Kierstein, S

    2013-10-01

    Molecular biological methods for the detection of periodontitis-associated bacteria based on DNA amplification have many advantages over classical culture techniques. However, when it comes to assessing immediate therapeutic success, e.g. reduction of viable bacteria, DNA-based polymerase chain reaction is unsuitable because it does not distinguish between live and dead bacteria. Our objective was to establish a simple RNA-based method that is easily set up and allows reliable assessment of the live bacterial load. We compared conventional quantitative real-time PCR (qPCR), propidium monoazide-qPCR and reverse transcription qPCR (RT-qPCR) for the detection of periodontal pathogens after antibiotic treatment in vitro. Applicability was tested using clinical samples of subgingival plaque obtained from patients at different treatment stages. The bacterial load was remarkably stable over prolonged periods when assessed by conventional qPCR, whereas both propidium monoazide intercalation and cDNA quantitation showed a decline that tracked the decreasing numbers of viable bacteria after antibiotic treatment. Clinical samples of subgingival plaque were subjected directly to DNase I treatment and reverse transcription without prior extraction or purification steps. While the results of the DNA- and RNA-based methods were comparable in untreated patients, classical qPCR frequently detected a substantial bacterial load in treated patients where RT-qPCR no longer indicated the presence of those pathogens. Disagreement rates ranged between 4 and 20% in first-visit patients and between 8 and 50% in the group of currently treated patients. We propose using RNA-based detection methods to verify the successful eradication of periodontal pathogens. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
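
Both the qPCR and RT-qPCR arms of such a comparison ultimately rest on standard-curve quantitation: a dilution series of known copy numbers yields a line Ct = m·log10(copies) + b, which is inverted for unknown samples. A generic sketch (illustrative parameter values, not the authors' calibration; m = −3.32 corresponds to 100% amplification efficiency):

```python
# Generic qPCR standard-curve quantitation (illustrative, not the study's pipeline).

def copies_from_ct(ct, slope=-3.32, intercept=38.0):
    """Invert a standard curve Ct = slope*log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def amplification_efficiency(slope):
    """Efficiency from the standard-curve slope: E = 10^(-1/slope) - 1."""
    return 10 ** (-1.0 / slope) - 1.0
```

With these assumed parameters a Ct of 38 corresponds to a single template copy, and each ~3.32-cycle decrease corresponds to a tenfold higher starting load.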

  19. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    PubMed

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

    The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved by using different soap formulations after diaper changing, using a quantitative microbial risk assessment approach. To achieve this, a probability-of-infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorhexidine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. For asymptomatic shigellosis, those who used a non-antibacterial control soap had a risk between 49/100,000 and 53/100, those who used the 4% chlorhexidine-containing soap had a risk between 43/100,000 and 51/100, and those who used a 1.5% triclosan soap had a risk between 21/100,000 and 43/100. Adequate hand washing after diapering reduces risk, and the risk can be lowered by a further 20% through the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.
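
Probability-of-infection models of this kind are typically beta-Poisson dose-response curves. A minimal sketch follows; the parameter values (alpha, N50) are illustrative assumptions in the range reported for Shigella in the QMRA literature, not the values fitted in this study:

```python
# Beta-Poisson dose-response sketch (illustrative parameters, not the study's fit).

def beta_poisson(dose, alpha=0.265, n50=1120.0):
    """Probability of infection for an ingested dose of organisms.

    N50 is the median infectious dose; by construction P(n50) = 0.5.
    """
    return 1.0 - (1.0 + (dose / n50) * (2.0 ** (1.0 / alpha) - 1.0)) ** (-alpha)
```

The effect of a soap is then modeled as a log-reduction of the transferred dose before this curve is applied, which is how the reductions in infection probability above arise.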

  20. Assessment of involuntary choreatic movements in Huntington's disease--toward objective and quantitative measures.

    PubMed

    Reilmann, Ralf; Bohlen, Stefan; Kirsten, Florian; Ringelstein, E Bernd; Lange, Herwig W

    2011-10-01

    Objective measures of motor impairment may improve the sensitivity and reliability of motor end points in clinical trials. In Huntington's disease, involuntary choreatic movements are one of the hallmarks of motor dysfunction. Chorea is commonly assessed by subitems of the Unified-Huntington's Disease Rating Scale. However, clinical rating scales are limited by inter- and intrarater variability, subjective error, and categorical design. We hypothesized that assessment of position and orientation changes interfering with a static upper extremity holding task may provide objective and quantitative measures of involuntary movements in patients with Huntington's disease. Subjects with symptomatic Huntington's disease (n = 19), premanifest gene carriers (n = 15; Unified-Huntington's Disease Rating Scale total motor score ≤ 3), and matched controls (n = 19) were asked to grasp and lift a device (250 and 500 g) equipped with an electromagnetic sensor. While subjects were instructed to hold the device as stable as possible, changes in position (x, y, z) and orientation (roll, pitch, yaw) were recorded. These were used to calculate a position index and an orientation index, both depicting the amount of choreatic movement interfering with task performance. Both indices were increased in patients with symptomatic Huntington's disease compared with controls and premanifest gene carriers for both weights, whereas only the position index with 500 g was increased in premanifest gene carriers compared with controls. Correlations were observed with the Disease Burden Score based on CAG-repeat length and age and with the Unified-Huntington's Disease Rating Scale. We conclude that quantitative assessment of chorea is feasible in Huntington's disease. The method is safe, noninvasive, and easily applicable and can be used repeatedly in outpatient settings. Its use in clinical trials should be explored further in larger cohorts and follow-up studies.
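
One plausible formalization of such indices (hypothetical; the paper's exact definitions may differ) is the mean Euclidean excursion of the held device from its centroid position, and the mean absolute angular deviation from the mean orientation:

```python
import math

def position_index(samples):
    """Mean distance from the centroid. samples: list of (x, y, z) readings."""
    n = len(samples)
    cx, cy, cz = (sum(s[i] for s in samples) / n for i in range(3))
    return sum(math.dist(s, (cx, cy, cz)) for s in samples) / n

def orientation_index(angles):
    """Mean absolute deviation from the mean orientation.

    angles: list of (roll, pitch, yaw) readings in degrees.
    """
    n = len(angles)
    means = [sum(a[i] for a in angles) / n for i in range(3)]
    return sum(sum(abs(a[i] - means[i]) for i in range(3)) for a in angles) / n
```

A perfectly still hold yields zero for both indices, while choreatic interference raises them in proportion to the excursions it causes.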

  1. Quantitative imaging of cartilage and bone for functional assessment of gene therapy approaches in experimental arthritis.

    PubMed

    Stok, Kathryn S; Noël, Danièle; Apparailly, Florence; Gould, David; Chernajovsky, Yuti; Jorgensen, Christian; Müller, Ralph

    2010-07-01

    Anti-inflammatory gene therapy can inhibit inflammation driven by TNFalpha in experimental models of rheumatoid arthritis. However, assessment of the therapeutic effect on cartilage and bone quality has been either missing or unsatisfactory. A multimodal imaging approach, using confocal laser scanning microscopy (CLSM) and micro-computed tomography (microCT), was used for gathering 3D quantitative image data on diseased and treated murine joints. As proof of concept, the efficacy of anti-TNF-based gene therapy was assessed, comparing imaging techniques with classical investigations. The knees of SCID mice were injected with human synoviocytes overexpressing TNFalpha. Two days later, electric pulse-mediated DNA transfer was performed after injection of the pGTRTT-plasmid containing a dimeric soluble-TNF receptor (dsTNFR) under the control of a doxycycline-inducible promoter. After 21 days the mice were sacrificed, TNFalpha levels were measured and the joints assessed for cartilage and bone degradation, using CLSM, microCT and histology. TNFalpha levels were decreased in the joints of mice treated with the plasmid in the presence of doxycycline. Concomitantly, histological analysis showed an increase in cartilage thickness and a decrease in synovial hyperplasia and cartilage erosion. Bone morphometry revealed that groups with the plasmid in the presence of doxycycline displayed a higher cortical thickness and decreased porosity. Using an anti-TNF gene therapy approach, known to reduce inflammation, as proof of concept, 3D imaging allowed quantitative evaluation of its benefits to joint architecture. It showed that local delivery of a regulated anti-TNF vector decreased arthritis severity through TNFalpha inhibition. These tools are valuable for understanding the efficacy of gene therapy on whole-joint morphometry.

  2. Quantitative assessment of biological impact using transcriptomic data and mechanistic network models

    SciTech Connect

    Thomson, Ty M.; Sewer, Alain; Martin, Florian; Belcastro, Vincenzo; Frushour, Brian P.; Gebel, Stephan; Park, Jennifer; Schlage, Walter K.; Talikka, Marja; Vasilyev, Dmitry M.; Westra, Jurjen W.; Hoeng, Julia; Peitsch, Manuel C.

    2013-11-01

    Exposure to biologically active substances such as therapeutic drugs or environmental toxicants can impact biological systems at various levels, affecting individual molecules, signaling pathways, and overall cellular processes. The ability to derive mechanistic insights from the resulting system responses requires the integration of experimental measures with a priori knowledge about the system and the interacting molecules therein. We developed a novel systems biology-based methodology that leverages mechanistic network models and transcriptomic data to quantitatively assess the biological impact of exposures to active substances. Hierarchically organized network models were first constructed to provide a coherent framework for investigating the impact of exposures at the molecular, pathway and process levels. We then validated our methodology using novel and previously published experiments. For both in vitro systems with simple exposure and in vivo systems with complex exposures, our methodology was able to recapitulate known biological responses matching expected or measured phenotypes. In addition, the quantitative results were in agreement with experimental endpoint data for many of the mechanistic effects that were assessed, providing further objective confirmation of the approach. We conclude that our methodology evaluates the biological impact of exposures in an objective, systematic, and quantifiable manner, enabling the computation of a systems-wide and pan-mechanistic biological impact measure for a given active substance or mixture. Our results suggest that various fields of human disease research, from drug development to consumer product testing and environmental impact analysis, could benefit from using this methodology. - Highlights: • The impact of biologically active substances is quantified at multiple levels. • The systems-level impact integrates the perturbations of individual networks. • The networks capture the relationships between
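
A toy sketch of the hierarchical scoring idea (illustrative only; the published network perturbation methodology is considerably richer): each backbone node carries signed edges to measured genes, node activity is the signed mean log2 fold-change of its downstream genes, and a network-level score aggregates squared node activities.

```python
# Toy network-impact scoring from transcriptomics (not the published algorithm).

def node_activity(edges, log2fc):
    """edges: {gene: +1/-1 expected direction}; log2fc: {gene: measured value}."""
    terms = [sign * log2fc.get(gene, 0.0) for gene, sign in edges.items()]
    return sum(terms) / len(terms)

def network_score(network, log2fc):
    """network: {node: edges dict}. Sum of squared node activities."""
    return sum(node_activity(edges, log2fc) ** 2 for edges in network.values())
```

Genes that move in their expected direction reinforce a node's activity, while contradictory movements cancel, so the score rewards coherent pathway-level perturbation rather than raw differential expression.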

  4. Inverse heat mimicking of given objects

    NASA Astrophysics Data System (ADS)

    Alwakil, Ahmed; Zerrad, Myriam; Bellieud, Michel; Amra, Claude

    2017-03-01

    We address a general inverse mimicking problem in heat conduction. The objects to cloak and mimic are chosen beforehand; these objects identify a specific set of space transformations. The shapes that can be mimicked are derived from the conductivity matrices. Numerical calculation confirms all of the analytical predictions. The technique provides key advantages for applications and can be extended to the field of waves.

  5. Challenging mimickers of primary systemic vasculitis.

    PubMed

    Miloslavsky, Eli M; Stone, John H; Unizony, Sebastian H

    2015-01-01

    The need to distinguish true primary systemic vasculitis from its multiple potential mimickers is one of the most challenging diagnostic conundrums in clinical medicine. This article reviews 9 challenging vasculitis mimickers: fibromuscular dysplasia, calciphylaxis, segmental arterial mediolysis, antiphospholipid syndrome, hypereosinophilic syndrome, lymphomatoid granulomatosis, malignant atrophic papulosis, livedoid vasculopathy, and immunoglobulin G4-related disease.

  6. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5×10⁶ km² at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models, as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.
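
The three parts are usually combined by Monte Carlo simulation: draw a deposit count from the subjective probability mass function, then draw tonnage and grade for each deposit from (commonly lognormal) models. The sketch below shows that bookkeeping; every distribution parameter is an invented illustration, not from any real grade-tonnage model.

```python
import random

def simulate_contained_metal(n_trials=10000, seed=1):
    """Monte Carlo total contained metal across undiscovered deposits."""
    rng = random.Random(seed)
    # Part 3: subjective pmf over the number of undiscovered deposits
    counts = [0, 1, 2, 3]
    probs = [0.2, 0.4, 0.3, 0.1]
    totals = []
    for _ in range(n_trials):
        n = rng.choices(counts, weights=probs)[0]
        metal = 0.0
        for _ in range(n):
            # Part 2: lognormal grade and tonnage models (illustrative parameters)
            tonnage = rng.lognormvariate(16.0, 1.5)  # tonnes of ore
            grade = rng.lognormvariate(-5.0, 0.6)    # metal fraction
            metal += tonnage * grade
        totals.append(metal)
    return totals
```

The resulting empirical distribution carries the explicit uncertainty the abstract emphasizes: its quantiles, not a single number, are the assessment output.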

  7. Optimization of Dual-Energy Xenon-CT for Quantitative Assessment of Regional Pulmonary Ventilation

    PubMed Central

    Fuld, Matthew K.; Halaweish, Ahmed; Newell, John D.; Krauss, Bernhard; Hoffman, Eric A.

    2013-01-01

    Objective Dual-energy X-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study we seek to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. Materials and Methods The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon-oxygen gas mixtures (0, 20, 25, 33, 50, 66, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved three-material decomposition calibration parameters. Additionally, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine in order to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Results Attenuation curves for xenon were obtained from the syringe test objects and were used to develop improved three-material decomposition parameters (HU enhancement per percent xenon: Within the chest phantom: 2.25 at 80kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; In open air: 2.5 at 80kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally non-dependent portion of the airway tree test-object, while not affecting quantitation of xenon in the three-material decomposition DECT. 
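
Given the linear enhancement calibration reported above (e.g. 2.25 HU per percent xenon at 80 kVp in the chest phantom), a measured HU enhancement inverts directly to a regional xenon concentration. A minimal sketch of that inversion, using the study's chest-phantom slopes:

```python
# Chest-phantom calibration slopes from the study, in HU per %Xe at each kVp.
HU_PER_PCT_XE = {80: 2.25, 100: 1.70, 140: 0.76}

def xenon_percent(delta_hu, kvp=80):
    """Regional xenon concentration (%) from measured HU enhancement."""
    return delta_hu / HU_PER_PCT_XE[kvp]
```

For example, a 45 HU enhancement at 80 kVp corresponds to 20% xenon, which is in the range of the delivered gas mixtures tested.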

  8. Assessment of Nutritional Status of Nepalese Hemodialysis Patients by Anthropometric Examinations and Modified Quantitative Subjective Global Assessment

    PubMed Central

    Sedhain, Arun; Hada, Rajani; Agrawal, Rajendra Kumar; Bhattarai, Gandhi R; Baral, Anil

    2015-01-01

    OBJECTIVE To assess the nutritional status of patients on maintenance hemodialysis by using modified quantitative subjective global assessment (MQSGA) and anthropometric measurements. METHOD We conducted a cross-sectional descriptive analytical study to assess the nutritional status of fifty-four patients with chronic kidney disease undergoing maintenance hemodialysis, using MQSGA and different anthropometric and laboratory measurements such as body mass index (BMI), mid-arm circumference (MAC), mid-arm muscle circumference (MAMC), triceps skin fold (TSF) and biceps skin fold (BSF), serum albumin, C-reactive protein (CRP) and lipid profile, in a government tertiary hospital at Kathmandu, Nepal. RESULTS Based on MQSGA criteria, 66.7% of the patients suffered from mild to moderate malnutrition and 33.3% were well nourished. None of the patients were severely malnourished. CRP was positive in 56.3% of patients. Serum albumin, MAC and BMI were (mean ± SD) 4.0 ± 0.3 mg/dl, 22 ± 2.6 cm and 19.6 ± 3.2 kg/m² respectively. MQSGA showed a negative correlation with MAC (r = −0.563; P < 0.001), BMI (r = −0.448; P < 0.001), MAMC (r = −0.506; P < 0.0001), TSF (r = −0.483; P < 0.0002), and BSF (r = −0.508; P < 0.0001). A negative correlation of MQSGA was also found with total cholesterol, triglyceride, LDL cholesterol and HDL cholesterol, without statistical significance. CONCLUSION Mild to moderate malnutrition was present in two thirds of the patients undergoing hemodialysis. Anthropometric measurements such as BMI, MAC, MAMC, BSF and TSF were negatively correlated with MQSGA. Anthropometric and laboratory assessment tools could be used for nutritional assessment as they are relatively easy, cheap and practical markers of nutritional status. PMID:26327781

  9. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium- to regional-scale studies in the scientific literature, or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local-scale models of five debris flows from 2003) and the probability values of the Flow-R software. This made it possible to assign flow depth to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve specifically for our case study area, were used to determine the damage for different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) with building area and number of floors
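
The quantitative risk chain described above reduces, per building, to probability of impact times depth-dependent vulnerability times building value. A minimal sketch of that aggregation; the piecewise-linear vulnerability curve is a stand-in for the published curves, and the saturation depth is an assumed illustration:

```python
def vulnerability(depth_m, saturation_depth=3.0):
    """Damage fraction as a clamped linear function of debris-flow depth."""
    return max(0.0, min(1.0, depth_m / saturation_depth))

def annual_risk(buildings):
    """Expected annual loss.

    buildings: list of (impact_probability, flow_depth_m, value_eur) tuples.
    """
    return sum(p * vulnerability(d) * v for p, d, v in buildings)
```

Summing over all exposed buildings for each epoch's inventory and DEM yields the temporal change in total risk that the study tracks.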

  10. Quantitative Assessment of Upstream Source Influences on TGM Observations at Three CAMNet Sites

    NASA Astrophysics Data System (ADS)

    Wen, D.; Lin, J. C.; Meng, F.; Gbor, P. K.; He, Z.; Sloan, J. J.

    2009-05-01

    Mercury is a persistent and toxic substance in the environment. Exposure to high levels of mercury can cause a range of adverse health effects, including damage to the nervous system, the reproductive system and childhood development. Proper recognition and prediction of atmospheric levels of mercury can help avoid the adverse effects of Hg; however, they cannot be achieved without accurate and quantitative identification of source influences, which is a great challenge due to the complexity of Hg in the air. The objective of this study is to present a new method to simulate Hg concentrations at the location of a monitoring site and quantitatively assess its upstream source influences. Hourly total gaseous mercury (TGM) concentrations at three CAMNet monitoring sites (receptors) in Ontario were predicted for four selected periods using the Stochastic Time-Inverted Lagrangian Transport (STILT) model, which is capable of representing near-field influences that are not resolved by typical grid sizes in transport models. The model was modified to deal with Hg depositions and point-source Hg emissions. The model-predicted Hg concentrations were compared with observations, as well as with the results from a CMAQ-Hg simulation in which the same emission and meteorology inputs were used. The comparisons show that STILT-predicted Hg concentrations agree well with observations, and are generally closer to the observations than those predicted by CMAQ-Hg. The better performance of the STILT simulation can be attributed to its ability to account for near-field influences. STILT was also applied to assess quantitatively the relative importance of different upstream source regions for the selected episodes. The assessment was made based on emission fluxes and STILT footprints, i.e., sensitivities of atmospheric concentrations to upstream surface fluxes. The results indicated that the main source regions of observed low Hg concentrations were in Northeastern Ontario, whereas
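
The footprint-based attribution amounts to a grid-cell-by-grid-cell product of footprint sensitivity and surface flux, summed and added to a background concentration. A minimal sketch of that bookkeeping (grids, units and the background value are illustrative, not from the study):

```python
def receptor_concentration(footprint, flux, background=1.5):
    """Receptor concentration from a footprint grid and an emission-flux grid.

    footprint: sensitivity of receptor concentration to each cell's flux;
    flux: surface emission flux per cell. Units are illustrative.
    """
    total = background
    for frow, erow in zip(footprint, flux):
        total += sum(f * e for f, e in zip(frow, erow))
    return total
```

Restricting the sum to the cells of one upstream region gives that region's contribution, which is exactly how the relative importance of source regions is quantified.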

  11. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    NASA Astrophysics Data System (ADS)

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-05-01

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the use of resolution
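
The attenuation losses described above follow narrow-beam exponential attenuation through water. A sketch illustrating why a centered source in a rat-sized cylinder reads low, especially for the low-energy iodine-125 photons; the linear attenuation coefficients are approximate textbook values for water, not the study's fitted numbers:

```python
import math

# Approximate linear attenuation coefficients of water (cm^-1):
# ~140 keV for Tc-99m, ~28-35 keV for I-125.
MU_WATER = {"tc99m": 0.15, "i125": 0.38}

def surviving_fraction(depth_cm, isotope="tc99m"):
    """Fraction of photons surviving `depth_cm` of water (narrow-beam model)."""
    return math.exp(-MU_WATER[isotope] * depth_cm)
```

For a source at the center of a cylinder of a few centimeters' radius, this model already predicts the qualitative ordering in the abstract: losses are much larger for iodine-125 than for technetium-99m, and grow with object size.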

  12. Quantitative assessment of cancer cell morphology and movement using telecentric digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Nguyen, Thanh C.; Nehmetallah, George; Lam, Van; Chung, Byung Min; Raub, Christopher

    2017-02-01

    Digital holographic microscopy (DHM) provides label-free and real-time quantitative phase information relevant to the analysis of dynamic biological systems. A DHM based on a telecentric configuration optically mitigates phase aberrations due to the microscope objective and linear high-frequency fringes due to the reference beam, thus minimizing the digital aberration correction needed for distortion-free 3D reconstruction. The purpose of this work is to quantitatively assess growth and migratory behavior of invasive cancer cells using a telecentric DHM system. Together, the height and lateral shape features of individual cells, determined from time-lapse series of phase reconstructions, should reveal aspects of cell migration, cell-matrix adhesion, and cell cycle phase transitions. To test this, MDA-MB-231 breast cancer cells were cultured on collagen-coated or uncoated glass, and 3D holograms were reconstructed over 2 hours. Cells on collagen-coated glass had an average 14% larger spread area than cells on uncoated glass (n=18-22 cells/group). The spread area of cells on uncoated glass was 15-21% larger than that of cells seeded on collagen hydrogels (n=18-22 cells/group). Premitotic cell rounding was observed, with average phase height increasing 57% over 10 minutes. Following cell division, phase height decreased linearly (R²=0.94) to 58% of the original height pre-division. Phase objects consistent with lamellipodia were apparent from the reconstructions at the leading edge of migrating cells. These data demonstrate the ability to track quantitative phase parameters and relate them to cell morphology during cell migration and division on adherent substrates, using telecentric DHM. The technique enables future studies of cell-matrix interactions relevant to cancer.
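
The "phase height" tracked above comes from the basic DHM relation h = φλ / (2π·Δn), converting the reconstructed phase delay of a transparent cell into physical thickness. A minimal sketch; the wavelength and refractive indices below are typical assumed values, not the ones used in the study:

```python
import math

def phase_to_height_um(phi_rad, wavelength_um=0.633, n_cell=1.38, n_medium=1.335):
    """Optical height of a transparent cell from its phase delay (radians)."""
    return phi_rad * wavelength_um / (2 * math.pi * (n_cell - n_medium))
```

With these assumptions, a full 2π of phase corresponds to roughly 14 µm of cell height, which is why mitotic rounding shows up so prominently in the phase maps.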

  13. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    SciTech Connect

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the

  14. Quantitative assessment of rest and acetazolamide CBF using quantitative SPECT reconstruction and sequential administration of (123)I-iodoamphetamine: comparison among data acquired at three institutions.

    PubMed

    Yamauchi, Miho; Imabayashi, Etsuko; Matsuda, Hiroshi; Nakagawara, Jyoji; Takahashi, Masaaki; Shimosegawa, Eku; Hatazawa, Jun; Suzuki, Michiyasu; Iwanaga, Hideyuki; Fukuda, Kenji; Iihara, Koji; Iida, Hidehiro

    2014-11-01

    A recently developed technique which reconstructs quantitative images from original projection data acquired using existing single-photon emission computed tomography (SPECT) devices has enabled quantitative assessment of cerebral blood flow (CBF) at rest and after acetazolamide challenge. This study was intended to generate a normal database and to investigate its inter-institutional consistency. Three institutions carried out a series of SPECT scans on 32 healthy volunteers, following a recently proposed method that involves dual administration of (123)I-iodoamphetamine during a single SPECT scan. Intra-institutional and inter-institutional variations of regional CBF values were evaluated both at rest and after acetazolamide challenge. Functional images were pooled for both rest and acetazolamide CBF, and inter-institutional differences were evaluated among these images using two independent software programs. Quantitative assessment of CBF images at rest and after acetazolamide was successfully achieved with the given protocol in all institutions. Intra-institutional variation of CBF values at rest and after acetazolamide was consistent with previously reported values. Quantitative CBF values showed no significant difference among institutions in all regions, except for a posterior cerebral artery region after acetazolamide challenge in one institution, which employed the SPECT device with the lowest spatial resolution. Pooled CBF images at rest and after acetazolamide generated using the two software programs showed no institutional differences after equalization of the spatial resolution. SPECT can provide reproducible images from projection data acquired using different SPECT devices. A common database acquired at different institutions may be shared among institutions, if images are reconstructed using a quantitative reconstruction program and acquired by following a standardized protocol.

  15. Quantitative assessment of BAX transcript and flow cytometric expression in acute myeloid leukemia: a prospective study.

    PubMed

    Sharawat, Surender Kumar; Raina, Vinod; Kumar, Lalit; Sharma, Atul; Bakhshi, Radhika; Vishnubhatla, Sreenivas; Gupta, Ritu; Bakhshi, Sameer

    2014-10-01

This study quantitatively assessed BAX transcripts and protein in acute myeloid leukemia (AML). We quantitatively evaluated BAX gene transcripts by real-time polymerase chain reaction (TaqMan probe chemistry) and protein expression by flow cytometry. One hundred twelve consecutive AML patients with a median age of 16 (range, 1-59) years were recruited into the study. By flow cytometry, the percentage expression was in linear correlation with relative median fluorescent intensity (RMFI; R = 0.4425; P < 0.001). However, there was no linear relationship between BAX transcript copies and RMFI (R = -0.0559; P = 0.586). BAX expression at both the protein and transcript level was significantly higher in AML patients than in normal controls. BAX RMFI was higher in the cohort with a lower white blood cell count (P = 0.029). None of the other baseline characteristics correlated with either the BAX transcript level or the RMFI. BAX expression did not correlate with complete remission rate, event-free, disease-free, or overall survival. This is the first evaluation of BAX gene expression in AML by two different methods; neither measure correlated with survival outcomes.
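The linear correlations reported above (e.g. R = 0.4425 between percentage expression and RMFI) are ordinary Pearson coefficients. A minimal sketch of the computation; the paired `expr`/`rmfi` values below are invented for illustration, not study data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired measurements (percentage expression vs. RMFI):
expr = [10, 20, 30, 40, 50]
rmfi = [1.1, 1.9, 3.2, 3.8, 5.1]
r = pearson_r(expr, rmfi)
```

Plotting percentage expression against RMFI and checking r like this is how a "linear correlation" claim of the kind above is usually backed up.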

  16. Specific and Quantitative Assessment of Naphthalene and Salicylate Bioavailability by Using a Bioluminescent Catabolic Reporter Bacterium

    PubMed Central

    Heitzer, Armin; Webb, Oren F.; Thonnard, Janeen E.; Sayler, Gary S.

    1992-01-01

A bioassay was developed and standardized for the rapid, specific, and quantitative assessment of naphthalene and salicylate bioavailability by use of bioluminescence monitoring of catabolic gene expression. The bioluminescent reporter strain Pseudomonas fluorescens HK44, which carries a transcriptional nahG-luxCDABE fusion for naphthalene and salicylate catabolism, was used. The physiological state of the reporter cultures as well as the intrinsic regulatory properties of the naphthalene degradation operon must be taken into account to obtain a high specificity at low target substrate concentrations. Experiments have shown that the use of exponentially growing reporter cultures has advantages over the use of carbon-starved, resting cultures. In aqueous solutions for both substrates, naphthalene and salicylate, linear relationships between initial substrate concentration and bioluminescence response were found over concentration ranges of 1 to 2 orders of magnitude. Naphthalene could be detected at a concentration of 45 ppb. Studies conducted under defined conditions with extracts and slurries of experimentally contaminated sterile soils and identical uncontaminated soil controls demonstrated that this method can be used for specific and quantitative estimations of target pollutant presence and bioavailability in soil extracts and for specific and qualitative estimations of naphthalene in soil slurries. PMID:16348717
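Linear concentration-response relationships spanning 1 to 2 orders of magnitude, as described above, are conveniently handled as a log-log calibration. A sketch under assumed data: the calibration points are invented, and `fit_loglog`/`predict_conc` are our helper names, not from the paper:

```python
import math

def fit_loglog(conc, signal):
    """Least-squares fit of log10(signal) = slope * log10(conc) + intercept."""
    xs = [math.log10(c) for c in conc]
    ys = [math.log10(s) for s in signal]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def predict_conc(signal, slope, intercept):
    """Invert the calibration: substrate concentration from a light reading."""
    return 10 ** ((math.log10(signal) - intercept) / slope)

# Hypothetical calibration points (ppb naphthalene vs. relative light units):
conc = [50, 100, 500, 1000, 5000]
signal = [12, 25, 120, 260, 1200]
slope, intercept = fit_loglog(conc, signal)
```

Once calibrated, an unknown sample's bioluminescence reading maps back to an estimated bioavailable concentration via `predict_conc`.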

  17. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences in the different caries lesions focused on the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of wavebands at 565-750 nm to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and image component ratio presented a linear relation. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was proposed. This method can be applied to similar imaging systems.
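The grading rule above can be captured directly in code. The cut-off values come from the abstract; the function names are ours:

```python
def component_ratio(r, g, b):
    """Image component ratio R/(G+B) from mean RGB values of a tooth region."""
    return r / (g + b)

def caries_grade(ratio):
    """Grade a region using the cut-offs reported in the abstract:
    <0.66 sound, 0.66-1.06 early decay, 1.06-1.62 established decay,
    >1.62 severe decay."""
    if ratio < 0.66:
        return "sound"
    if ratio < 1.06:
        return "early decay"
    if ratio < 1.62:
        return "established decay"
    return "severe decay"
```

In practice the R, G, B values would be averaged over the segmented tooth region of the fluorescence image before grading.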

  18. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment

    PubMed Central

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A.; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species, using a UPLC-UV-MS/MS method. Ten of 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4′-MDI: 0.52 to 140.1 pg/mg; 2,4′-MDI: 0.01 to 4.48 pg/mg). The 4,4′-MDI species had the highest measured concentration (280 pg/mg). Commonly used medical devices/products contain low, but measurable, concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation. PMID:27773989

  19. Noninvasive Qualitative and Quantitative Assessment of Spoilage Attributes of Chilled Pork Using Hyperspectral Scattering Technique.

    PubMed

    Zhang, Leilei; Peng, Yankun

    2016-08-01

The objective of this research was to develop a rapid noninvasive method for quantitative and qualitative determination of chilled pork spoilage. Microbiological, physicochemical, and organoleptic characteristics such as the total viable count (TVC), Pseudomonas spp., total volatile basic nitrogen (TVB-N), pH value, and color parameter L* were determined to appraise pork quality. The hyperspectral scattering characteristics of 54 meat samples were accurately fitted by a four-parameter modified Gompertz function. Support vector machines (SVM) were applied to establish quantitative prediction models between scattering fitting parameters and reference values. In addition, partial least squares discriminant analysis (PLS-DA) and Bayesian analysis were utilized as supervised and unsupervised techniques for the qualitative identification of meat spoilage. All stored chilled meat samples were classified into three grades: "fresh," "semi-fresh," and "spoiled." The Bayesian classification model was superior to PLS-DA, with an overall classification accuracy of 92.86%. The results demonstrated that the hyperspectral scattering technique combined with SVM and Bayesian analysis provides a powerful capability for rapid, noninvasive meat spoilage assessment. © The Author(s) 2016.
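A four-parameter modified Gompertz function of the kind used to fit scattering profiles can take the form below. This is one common parameterisation (with generic names `alpha`, `beta`, `eps`, `delta`); the paper's exact form may differ:

```python
import math

def gompertz(x, alpha, beta, eps, delta):
    """One common four-parameter modified Gompertz profile:
    R(x) = alpha + beta * exp(-exp(eps + delta * x)).
    With beta > 0 and delta > 0 this decays monotonically from
    alpha + beta (near the incident point) toward the asymptote alpha."""
    return alpha + beta * math.exp(-math.exp(eps + delta * x))
```

Fitting the four parameters to each sample's radial scattering profile (e.g. with a nonlinear least-squares routine) reduces a whole profile to four numbers, which then serve as the SVM inputs.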

  20. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-04-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen.
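Relating the shear wave speed to clot stiffness rests on the elastodynamic identity G = ρv². A minimal sketch, assuming a typical whole-blood density of 1060 kg/m³ (our assumption, not a value from the paper):

```python
def shear_modulus(shear_wave_speed_m_s, density_kg_m3=1060.0):
    """Shear modulus G = rho * v^2 (Pa) from the shear wave propagation
    speed measured by the OCT system. 1060 kg/m^3 is a typical whole-blood
    density, assumed here for illustration."""
    return density_kg_m3 * shear_wave_speed_m_s ** 2
```

As the clot stiffens during coagulation, the measured wave speed rises and the inferred G rises quadratically, which is what makes the velocity a sensitive readout of clot formation kinetics.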

  1. Quantitative MR assessment of structural changes in white matter of children treated for ALL

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Mulhern, Raymond K.

    2001-07-01

Our research builds on the hypothesis that white matter damage resulting from therapy spans a continuum of severity that can be reliably probed using non-invasive MR technology. This project focuses on children treated for ALL with a regimen containing seven courses of high-dose methotrexate (HDMTX), which is known to cause leukoencephalopathy. Axial FLAIR, T1-, T2-, and PD-weighted images were acquired, registered, and then analyzed with a hybrid neural network segmentation algorithm to identify normal brain parenchyma and leukoencephalopathy. Quantitative T1 and T2 maps were also analyzed at the level of the basal ganglia and the centrum semiovale. The segmented images were used as masks to identify regions of normal appearing white matter (NAWM) and leukoencephalopathy in the quantitative T1 and T2 maps. We assessed the longitudinal changes in volume, T1 and T2 in NAWM and leukoencephalopathy for 42 patients. The segmentation analysis revealed that 69% of patients had leukoencephalopathy after receiving seven courses of HDMTX. The leukoencephalopathy affected approximately 17% of the patients' white matter volume on average (range 2% - 38%). Relaxation rates in the NAWM were not significantly changed between the 1st and 7th courses. Regions of leukoencephalopathy exhibited a 13% elevation in T1 and a 37% elevation in T2 relaxation rates.

  2. Quantitative assessment of locomotive syndrome by the loco-check questionnaire in older Japanese females

    PubMed Central

    Noge, Sachiko; Ohishi, Tatsuo; Yoshida, Takuya; Kumagai, Hiromichi

    2017-01-01

[Purpose] Locomotive syndrome (LS) is a condition in which older people may require care services because of problems with locomotive organs. This study examined whether the loco-check, a 7-item questionnaire, is useful for quantitatively assessing the severity of LS. [Subjects and Methods] Seventy-one community-dwelling Japanese females aged 64–96 years (81.7 ± 8.0 years) participated in this study. The associations of the loco-check with thigh muscle mass measured by X-ray CT, physical performance, nutritional status, and quality of life (QOL) were investigated. [Results] The results showed that the number of times that “yes” was selected in the loco-check was significantly correlated with thigh muscle mass, major measures of physical performance, nutritional status, and QOL. This number was also significantly larger in the participants who had experienced falling, fracture, and lumbar pain than in those without these episodes. [Conclusion] These results suggest that the loco-check might be useful for quantitatively evaluating LS. PMID:28932003

  3. Application of Bayesian networks in quantitative risk assessment of subsea blowout preventer operations.

    PubMed

    Cai, Baoping; Liu, Yonghong; Liu, Zengkai; Tian, Xiaojie; Zhang, Yanzhen; Ji, Renjie

    2013-07-01

    This article proposes a methodology for the application of Bayesian networks in conducting quantitative risk assessment of operations in offshore oil and gas industry. The method involves translating a flow chart of operations into the Bayesian network directly. The proposed methodology consists of five steps. First, the flow chart is translated into a Bayesian network. Second, the influencing factors of the network nodes are classified. Third, the Bayesian network for each factor is established. Fourth, the entire Bayesian network model is established. Lastly, the Bayesian network model is analyzed. Subsequently, five categories of influencing factors, namely, human, hardware, software, mechanical, and hydraulic, are modeled and then added to the main Bayesian network. The methodology is demonstrated through the evaluation of a case study that shows the probability of failure on demand in closing subsea ram blowout preventer operations. The results show that mechanical and hydraulic factors have the most important effects on operation safety. Software and hardware factors have almost no influence, whereas human factors are in between. The results of the sensitivity analysis agree with the findings of the quantitative analysis. The three-axiom-based analysis partially validates the correctness and rationality of the proposed Bayesian network model. © 2012 Society for Risk Analysis.
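One simple way to combine independent influencing-factor categories into a probability of failure on demand is a noisy-OR node. The sketch below uses purely illustrative probabilities; the paper's actual network structure and conditional probability tables are more elaborate:

```python
def noisy_or(probs):
    """Noisy-OR combination: probability that at least one independent
    failure cause fires, 1 - prod(1 - p_i)."""
    p_none = 1.0
    for p in probs:
        p_none *= (1.0 - p)
    return 1.0 - p_none

# Hypothetical per-category contributions to failure on demand of a
# 'close subsea ram BOP' operation (illustrative numbers only):
factors = {
    "human": 1e-3,
    "hardware": 1e-5,
    "software": 1e-6,
    "mechanical": 5e-3,
    "hydraulic": 4e-3,
}
p_failure = noisy_or(factors.values())
```

With these made-up inputs the mechanical and hydraulic terms dominate, mirroring the qualitative finding above that those factors matter most while software and hardware contribute almost nothing.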

  4. Quantitative assessment of thermodynamic constraints on the solution space of genome-scale metabolic models.

    PubMed

    Hamilton, Joshua J; Dwivedi, Vivek; Reed, Jennifer L

    2013-07-16

Constraint-based methods provide powerful computational techniques to allow understanding and prediction of cellular behavior. These methods rely on physicochemical constraints to eliminate infeasible behaviors from the space of available behaviors. One such constraint is thermodynamic feasibility, the requirement that intracellular flux distributions obey the laws of thermodynamics. The past decade has seen several constraint-based methods that interpret this constraint in different ways, including those that are limited to small networks, rely on predefined reaction directions, and/or neglect the relationship between reaction free energies and metabolite concentrations. In this work, we utilize one such approach, thermodynamics-based metabolic flux analysis (TMFA), to make genome-scale, quantitative predictions about metabolite concentrations and reaction free energies in the absence of prior knowledge of reaction directions, while accounting for uncertainties in thermodynamic estimates. We applied TMFA to a genome-scale network reconstruction of Escherichia coli and examined the effect of thermodynamic constraints on the flux space. We also assessed the predictive performance of TMFA against gene essentiality and quantitative metabolomics data, under both aerobic and anaerobic, and optimal and suboptimal growth conditions. Based on these results, we propose that TMFA is a useful tool for validating phenotypes and generating hypotheses, and that additional types of data and constraints can improve predictions of metabolite concentrations.
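The core thermodynamic constraint in TMFA is that a reaction can only carry flux in the direction of negative transformed free energy, ΔG = ΔG′° + RT ln Q, which couples reaction directions to metabolite concentrations. A toy feasibility check, assuming unit stoichiometry for simplicity (function names are ours):

```python
import math

R = 8.314462618e-3  # gas constant, kJ mol^-1 K^-1

def reaction_dg(dg0_kj_mol, substrates_M, products_M, temp_K=298.15):
    """Transformed reaction free energy dG = dG'0 + RT ln(Q), where Q is
    the ratio of product to substrate concentrations (unit stoichiometric
    coefficients assumed for this sketch)."""
    q = 1.0
    for c in products_M:
        q *= c
    for c in substrates_M:
        q /= c
    return dg0_kj_mol + R * temp_K * math.log(q)

def is_feasible(dg):
    """TMFA-style directionality constraint: forward flux requires dG < 0."""
    return dg < 0.0
```

This is why a reaction with an unfavorable standard free energy can still be feasible in vivo: keeping product concentrations low (or substrates high) drives ln Q negative enough to flip the sign of ΔG.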

  5. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    PubMed Central

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-01-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen. PMID:27090437

  6. Enantiomeric fractionation as a tool for quantitative assessment of biodegradation: The case of metoprolol.

    PubMed

    Souchier, Marine; Benali-Raclot, Dalel; Casellas, Claude; Ingrand, Valérie; Chiron, Serge

    2016-05-15

    An efficient chiral liquid chromatography high resolution mass spectrometry method has been developed for the determination of metoprolol (MTP) and three of its major metabolites, namely O-desmethylmetoprolol (O-DMTP), α-hydroxymetoprolol (α-HMTP) and metoprolol acid (MTPA) in wastewater treatment plant (WWTP) influents and effluents. The optimized analytical method has been validated with good quality parameters including resolution >1.3 and method quantification limits down to the ng/L range except for MTPA. On the basis of this newly developed analytical method, the stereochemistry of MTP and its metabolites was studied over time in effluent/sediment biotic and sterile microcosms under dark and light conditions and in influents and effluents of 5 different WWTPs. MTP stereoselective degradation was exclusively observed under biotic conditions, confirming the specificity of enantiomeric fraction variations to biodegradation processes. MTP was always biotransformed into MTPA with a (S)-enantiomer enrichment. The results of enantiomeric enrichment pointed the way for a quantitative assessment of in situ biodegradation processes due to a good fit (R(2) > 0.98) of the aerobic MTP biodegradation to the Rayleigh dependency in all the biotic microcosms and in WWTPs because both MTP enantiomers followed the same biodegradation kinetic profiles. These results demonstrate that enantiomeric fractionation constitutes a very interesting quantitative indicator of MTP biodegradation in WWTPs and probably in the environment. Copyright © 2016 Elsevier Ltd. All rights reserved.
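Fitting the Rayleigh dependency allows the extent of biodegradation to be estimated from enantiomeric enrichment alone. A sketch under assumed conventions; the enrichment factor `epsilon` and the enantiomer-ratio values in the test are hypothetical, not the paper's fitted values:

```python
import math

def biodegraded_fraction(er_t, er_0, epsilon):
    """Extent of biodegradation B = 1 - f from the Rayleigh equation
    ln(ER_t / ER_0) = epsilon * ln(f), where ER is the enantiomer ratio
    (e.g. (S)/(R)) and epsilon the enantiomeric enrichment factor
    (negative when the measured ratio grows as degradation proceeds)."""
    f = math.exp(math.log(er_t / er_0) / epsilon)
    return 1.0 - f
```

Given a field sample's enantiomer ratio and a laboratory-derived epsilon, this yields the in situ biodegraded fraction without needing a conservative tracer, which is the practical appeal noted above.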

  7. Quantitative assessment of developmental levels in overarm throwing using wearable inertial sensing technology.

    PubMed

    Grimpampi, Eleni; Masci, Ilaria; Pesce, Caterina; Vannozzi, Giuseppe

    2016-09-01

Motor proficiency in childhood has been recently recognised as a public health determinant, having a potential impact on the physical activity level and possible sedentary behaviour of the child later in life. Among fundamental motor skills, there is a growing need in the motor development community for ballistic skill assessment based on in-field quantitative observation. The aim of this study was to propose an in-field quantitative approach to identify different developmental levels in overarm throwing. Fifty-eight children aged 5-10 years performed an overarm throwing task while wearing three inertial sensors located at the wrist, trunk, and pelvis, and were then categorised using a developmental sequence of overarm throwing. A set of biomechanical parameters was defined and analysed using multivariate statistics to evaluate whether they can be used as developmental indicators. Trunk and pelvis angular velocities and time durations before ball release showed increasing/decreasing trends with increasing developmental level. Significant differences between developmental level pairs were observed for selected biomechanical parameters. The results support the suitability and feasibility of objective developmental measures in ecological learning contexts, suggesting their potential to support motor learning experiences in educational and youth sports training settings.

  8. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment.

    PubMed

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species, using a UPLC-UV-MS/MS method. Ten of 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4'-MDI: 0.52 to 140.1 pg/mg; 2,4'-MDI: 0.01 to 4.48 pg/mg). The 4,4'-MDI species had the highest measured concentration (280 pg/mg). Commonly used medical devices/products contain low, but measurable, concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation.

  9. Specific and quantitative assessment of naphthalene and salicylate bioavailability by using a bioluminescent catabolic reporter bacterium

    SciTech Connect

Heitzer, A.; Thonnard, J.E.; Sayler, G.S.; Webb, O.F.

    1992-06-01

A bioassay was developed and standardized for the rapid, specific, and quantitative assessment of naphthalene and salicylate bioavailability by use of bioluminescence monitoring of catabolic gene expression. The bioluminescent reporter strain Pseudomonas fluorescens HK44, which carries a transcriptional nahG-luxCDABE fusion for naphthalene and salicylate catabolism, was used. The physiological state of the reporter cultures as well as the intrinsic regulatory properties of the naphthalene degradation operon must be taken into account to obtain a high specificity at low target substrate concentrations. Experiments have shown that the use of exponentially growing reporter cultures has advantages over the use of carbon-starved, resting cultures. In aqueous solutions for both substrates, naphthalene and salicylate, linear relationships between initial substrate concentration and bioluminescence response were found over concentration ranges of 1 to 2 orders of magnitude. Naphthalene could be detected at a concentration of 45 ppb. Studies conducted under defined conditions with extracts and slurries of experimentally contaminated sterile soils and identical uncontaminated soil controls demonstrated that this method can be used for specific and quantitative estimations of target pollutant presence and bioavailability in soil extracts and for specific and qualitative estimations of naphthalene in soil slurries.

  10. Quantitative assessment of airborne exposures generated during common cleaning tasks: a pilot study

    PubMed Central

    2010-01-01

Background A growing body of epidemiologic evidence suggests an association between exposure to cleaning products and asthma and other respiratory disorders. Thus far, these studies have conducted only limited quantitative exposure assessments. Exposures from cleaning products are difficult to measure because they are complex mixtures of chemicals with a range of physicochemical properties, thus requiring multiple measurement techniques. We conducted a pilot exposure assessment study to identify methods for assessing short term, task-based airborne exposures and to quantitatively evaluate airborne exposures associated with cleaning tasks simulated under controlled work environment conditions. Methods Sink, mirror, and toilet bowl cleaning tasks were simulated in a large ventilated bathroom and a small unventilated bathroom using a general purpose, a glass, and a bathroom cleaner. All tasks were performed for 10 minutes. Airborne total volatile organic compounds (TVOC) generated during the tasks were measured using a direct reading instrument (DRI) with a photoionization detector. Volatile organic ingredients of the cleaning mixtures were assessed utilizing an integrated sampling and analytic method, EPA TO-17. Ammonia air concentrations were also measured with an electrochemical sensor embedded in the DRI. Results Average TVOC concentrations calculated for the 10 minute tasks ranged from 0.02 to 6.49 ppm and the highest peak concentrations observed ranged from 0.14 to 11 ppm. TVOC time-concentration profiles indicated that exposures above background level remained present for about 20 minutes after cessation of the tasks. Among several targeted VOC compounds from the cleaning mixtures, only 2-BE was detectable with the EPA method. The ten minute average 2-BE concentrations ranged from 0.30 to 21 ppm between tasks. The DRI underestimated 2-BE exposures compared to the results from the integrated method. The highest concentration of ammonia, 2.8 ppm, occurred during mirror cleaning.

  11. Black hole mimickers: Regular versus singular behavior

    SciTech Connect

    Lemos, Jose P. S.; Zaslavskii, Oleg B.

    2008-07-15

Black hole mimickers are possible alternatives to black holes; they would look observationally almost like black holes but would have no horizon. The properties in the near-horizon region where gravity is strong can be quite different for the two types of objects, but at infinity it could be difficult to discern black holes from their mimickers. To disentangle this possible confusion, we examine the near-horizon properties, and their connection with faraway asymptotic properties, of some candidate black hole mimickers. We study spherically symmetric uncharged or charged but nonextremal objects, as well as spherically symmetric charged extremal objects. Within the uncharged or charged but nonextremal black hole mimickers, we study nonextremal ε-wormholes on the threshold of the formation of an event horizon, of which a subclass are called black foils, and gravastars. Within the charged extremal black hole mimickers we study extremal ε-wormholes on the threshold of the formation of an event horizon, quasi-black holes, and wormholes on the basis of quasi-black holes from Bonnor stars. We elucidate whether or not the objects belonging to these two classes remain regular in the near-horizon limit. The requirement of full regularity, i.e., finite curvature and absence of naked behavior, up to an arbitrary neighborhood of the gravitational radius of the object enables one to rule out potential mimickers in most of the cases. A list ranking the best black hole mimickers down to the worst, both nonextremal and extremal, is as follows: wormholes on the basis of extremal black holes or on the basis of quasi-black holes, quasi-black holes, wormholes on the basis of nonextremal black holes (black foils), and gravastars. Since in observational astrophysics it is difficult to find extremal configurations (the best mimickers in the ranking), whereas nonextremal configurations are really bad mimickers, the task of distinguishing black holes from their mimickers seems to

  12. Pediatric myositis ossificans mimicking osteosarcoma.

    PubMed

    Yamaga, Kensaku; Kobayashi, Eisuke; Kubota, Daisuke; Setsu, Nokitaka; Tanaka, Yuya; Minami, Yusuke; Tanzawa, Yoshikazu; Nakatani, Fumihiko; Kawai, Akira; Chuman, Hirokazu

    2015-10-01

    Myositis ossificans (MO) is a rare benign cause of heterotopic bone formation in soft tissue that most commonly affects young adults, typically following trauma. We report the case of an 11-year-old girl who developed MO mimicking osteosarcoma in her right shoulder. Plain radiography and computed tomography showed poorly defined flocculated densities in the soft tissue and a periosteal reaction along the proximal humerus. On magnetic resonance imaging, the mass displayed an ill-defined margin and inhomogeneous signal change. Histologically, the mass had a pseudosarcomatous appearance. Based on these findings, the patient was initially misdiagnosed with osteosarcoma at another hospital. The diagnosis was difficult because the patient was 11 years old and had no trauma history, with atypical radiographic changes and a predilection for the site of origin for osteosarcomas. We finally made the correct diagnosis of MO by carefully reviewing and reflecting on the pathological differences between stages. © 2015 Japan Pediatric Society.

  13. Diseases mimicking intussusception: diagnostic dilemma.

    PubMed

    Karakus, Suleyman Cuneyt; Ozokutan, Bulent Hayri; Ceylan, Haluk

    2014-10-01

Intussusception is a common abdominal emergency in early childhood. The aim of this study was to describe the diseases mimicking intussusception and to discuss the causes and management of these conditions. Seven patients who were initially diagnosed as having intussusception on abdominal ultrasonography but who had a final diagnosis of diseases other than intussusception were reviewed retrospectively. Two patients with ileocolic intussusception underwent ultrasonography-guided reduction with a hydrostatic method but the ultrasonographic findings persisted. At surgery, only an edematous ileocecal valve and mesenteric lymphadenopathy were observed. In three patients with Henoch-Schönlein purpura, initial abdominal ultrasonography showed intussusception. The patients with no sign of obstructive symptoms were managed conservatively with a diagnosis of intramural hemorrhage, and on follow-up the ultrasonographic findings of intussusception were resolved. One patient with the target sign on computed tomography and ultrasonography of the abdomen underwent ileocolic resection and end-to-end anastomosis due to a tumor in the cecum. There was no evidence of intussusception. One patient with a cyst in the right lower quadrant accompanying intussusception on ultrasonography of the abdomen underwent ultrasonography-guided reduction but the ultrasonographic findings persisted. On exploration, only a cecal duplication cyst without intussusception was detected. Cecal resection including the cyst and end-to-end ileocolic anastomosis were performed. Ultrasonography, color Doppler ultrasonography, barium or hydrostatic enema and computed tomography are helpful in diagnosing intussusception, but patients with radiologic findings of intussusception should be evaluated based on symptoms and clinical findings before surgical intervention. Also, other diseases mimicking intussusception should be kept in mind in the differential diagnosis. © 2014 Japan Pediatric Society.

  14. Quantitative assessment based on kinematic measures of functional impairments during upper extremity movements: A review.

    PubMed

    de los Reyes-Guzmán, Ana; Dimbwadyo-Terrer, Iris; Trincado-Alonso, Fernando; Monasterio-Huelin, Félix; Torricelli, Diego; Gil-Agudo, Angel

    2014-08-01

Quantitative measures of human movement quality are important for discriminating healthy and pathological conditions and for expressing the outcomes and clinically important changes in subjects' functional state. However, the most frequently used instruments for upper extremity functional assessment are clinical scales, which, although previously standardized and validated, have a high subjective component that depends on the observer who scores the test. They are also insufficient to assess the motor strategies used during movements, so their use in combination with other, more objective measures is necessary. The objective of the present review is to provide an overview of objective metrics found in the literature that aim to quantify upper extremity performance during functional tasks, regardless of the equipment or system used for registering kinematic data. A search in the Medline, Google Scholar and IEEE Xplore databases was performed using combinations of a series of keywords. The full scientific papers that fulfilled the inclusion criteria were included in the review. A set of kinematic metrics was found in the literature relating to joint displacements, analysis of hand trajectories, and velocity profiles. These metrics were classified into different categories according to the movement characteristic being measured. These kinematic metrics provide the starting point for proposed objective metrics for the functional assessment of the upper extremity in people with movement disorders resulting from neurological injuries. Potential areas of future and further research are presented in the Discussion section. Copyright © 2014 Elsevier Ltd. All rights reserved.
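As a concrete example of the trajectory-based metrics this review surveys, the hand-path straightness index (path length over straight-line distance between start and end) is among the simplest; a minimal sketch, with our own function names:

```python
import math

def path_length(points):
    """Total length of a sampled 3-D hand trajectory (list of (x, y, z))."""
    return sum(
        math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)
    )

def straightness_index(points):
    """Hand-path ratio: trajectory length divided by the straight-line
    distance from start to end. 1.0 means a perfectly straight reach;
    larger values indicate curved or corrective paths."""
    return path_length(points) / math.dist(points[0], points[-1])
```

Applied to motion-capture samples of a reaching task, a rising straightness index over a session can flag the curved, segmented paths typical of impaired movement.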

  15. Quantitative Gait Measurement With Pulse-Doppler Radar for Passive In-Home Gait Assessment

    PubMed Central

    Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E.

    2014-01-01

In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual-radar setup with one radar placed at foot level and the other at torso level is necessary. Excellent absolute agreement, with an intraclass correlation coefficient of 0.97, was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, each has a systematic offset relative to the ground truth caused by the walking direction with respect to the radar beam. The torso-level radar performs better in speed estimation (9% offset on average) than the foot-level radar (13%–18% offset). Quantitative analysis has been performed to compute the angles causing this systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment. PMID:24771566

  16. Quantitative assessment of resilience of a water supply system under rainfall reduction due to climate change

    NASA Astrophysics Data System (ADS)

    Amarasinghe, Pradeep; Liu, An; Egodawatta, Prasanna; Barnes, Paul; McGree, James; Goonetilleke, Ashantha

    2016-09-01

A water supply system can be impacted by rainfall reduction due to climate change, thereby reducing its supply potential. This highlights the need to understand system resilience, which refers to the ability to maintain service under various pressures (or disruptions). Currently, the concept of resilience has not yet been widely applied in managing water supply systems. This paper proposes three technical resilience indicators to assess the resilience of a water supply system. A case study analysis of the Water Grid system of Queensland, Australia, was undertaken to showcase how the proposed indicators can be applied to assess resilience. The research outcomes confirmed that the resilience indicators are capable of identifying critical conditions in the operation of a water supply system, such as the maximum allowable rainfall reduction for the system to maintain its operation without failure. Additionally, the resilience indicators provided useful insight into the sensitivity of the water supply system to a changing rainfall pattern in the context of climate change, which represents the system's stability under pressure. The study outcomes will help in the quantitative assessment of resilience and provide improved guidance to system operators to enhance the efficiency and reliability of a water supply system.

  17. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin-sensitizing potency of contact allergens, a model for quantitative risk assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no expected sensitisation induction level (NESIL), (b) incorporation of sensitization assessment factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of a consumer exposure level (CEL). Based on these elements, an acceptable exposure level (AEL) can be calculated by dividing the NESIL of the product by the individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to the risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions, referred to as the measured exposure level (MEL). A direct comparison is therefore possible between the NESIL and the MEL as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement over a hazard-based classification of hair dye ingredients. Copyright © 2012 Elsevier Inc. All rights reserved.
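The arithmetic behind this QRA scheme is simple enough to sketch. The snippet below is a minimal illustration of the NESIL/SAF/AEL relation described above; the NESIL, SAF and exposure values are made up for the example and are not taken from the paper.

```python
def acceptable_exposure_level(nesil, safs):
    """AEL = NESIL divided by the product of the sensitization assessment factors (SAFs)."""
    ael = nesil
    for saf in safs:
        ael /= saf
    return ael

def risk_acceptable(ael, exposure):
    """Risk is judged acceptable when the exposure level (CEL or MEL)
    does not exceed the acceptable exposure level."""
    return exposure <= ael

# Illustrative values only: NESIL of 10000 ug/cm2 and SAFs of 10
# (inter-individual variability), 1 (product matrix) and 10 (use pattern).
ael = acceptable_exposure_level(10000.0, [10.0, 1.0, 10.0])
print(ael)                                   # 100.0
print(risk_acceptable(ael, exposure=50.0))   # True
```

Using a directly measured exposure level (MEL), as the authors propose, simply swaps the estimated CEL for the measured value in the final comparison.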

  18. Quantitative Assessment of Nipple Perfusion with Near-Infrared Fluorescence Imaging

    PubMed Central

    Ashitate, Yoshitomo; Lee, Bernard T.; Ngo, Long H.; Laurence, Rita G.; Hutteman, Merlijn; Oketokoun, Rafiou; Lunsford, Elaine; Choi, Hak Soo; Frangioni, John V.

    2011-01-01

Preserving the nipple areolar complex with a nipple-sparing mastectomy improves cosmesis compared to skin-sparing mastectomy. However, complications such as necrosis of the nipple areolar complex significantly impact cosmetic outcome. Many factors influence nipple areolar perfusion, and no consensus currently exists on the optimal incisional choice. This study evaluates 2 nipple-sparing mastectomy incision models using near-infrared (NIR) fluorescence to assess perfusion quantitatively. The periareolar and radial incisions were compared with 2 control models in Yorkshire pigs (N = 6). Methylene blue (MB) and indocyanine green (ICG) were injected intravenously, and NIR fluorescence images were recorded at 3 time points: before surgery, immediately after (0 h), and 3 d postoperatively. The contrast-to-background ratio (CBR) was used to assess perfusion. At 72 h, radial incisions showed statistically significantly higher perfusion than periareolar incisions (P < 0.05). Based on our findings, radial incisions for nipple-sparing mastectomy may be preferable due to higher perfusion; however, clinical trials are necessary for further assessment. PMID:21862913

  19. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  20. QMRAspot: a tool for Quantitative Microbial Risk Assessment from surface water to potable water.

    PubMed

    Schijven, Jack F; Teunis, Peter F M; Rutjes, Saskia A; Bouwknegt, Martijn; de Roda Husman, Ana Maria

    2011-11-01

In the Netherlands, a health-based target for microbially safe drinking water is set at less than one infection per 10,000 persons per year. For the assessment of the microbial safety of drinking water, Dutch drinking water suppliers must conduct a Quantitative Microbial Risk Assessment (QMRA) at least every three years for the so-called index pathogens enterovirus, Campylobacter, Cryptosporidium and Giardia. In order to collect raw data in the proper format and to automate the process of QMRA, an interactive, user-friendly computational tool, QMRAspot, was developed to analyze and conduct QMRA for drinking water produced from surface water. This paper gives a description of the raw data requirements for QMRA as well as a functional description of the tool. No extensive prior knowledge of QMRA modeling is required, because QMRAspot guides the user on the quantity, type and format of raw data and performs a complete analysis of the raw data to yield a risk outcome for drinking water consumption that can be compared with other production locations, a legislative standard or an acceptable health-based target. The uniform approach promotes proper collection and usage of raw data, warrants the quality of the risk assessment, and enhances efficiency, i.e., less time is required. QMRAspot may facilitate QMRA for drinking water suppliers worldwide. The tool aids policy makers and other involved parties in formulating mitigation strategies, and in the prioritization and evaluation of effective preventive measures as an integral part of water safety plans.

  1. A rapid, non-invasive procedure for quantitative assessment of drought survival using chlorophyll fluorescence

    PubMed Central

    Woo, Nick S; Badger, Murray R; Pogson, Barry J

    2008-01-01

    Background Analysis of survival is commonly used as a means of comparing the performance of plant lines under drought. However, the assessment of plant water status during such studies typically involves detachment to estimate water shock, imprecise methods of estimation or invasive measurements such as osmotic adjustment that influence or annul further evaluation of a specimen's response to drought. Results This article presents a procedure for rapid, inexpensive and non-invasive assessment of the survival of soil-grown plants during drought treatment. The changes in major photosynthetic parameters during increasing water deficit were monitored via chlorophyll fluorescence imaging and the selection of the maximum efficiency of photosystem II (Fv/Fm) parameter as the most straightforward and practical means of monitoring survival is described. The veracity of this technique is validated through application to a variety of Arabidopsis thaliana ecotypes and mutant lines with altered tolerance to drought or reduced photosynthetic efficiencies. Conclusion The method presented here allows the acquisition of quantitative numerical estimates of Arabidopsis drought survival times that are amenable to statistical analysis. Furthermore, the required measurements can be obtained quickly and non-invasively using inexpensive equipment and with minimal expertise in chlorophyll fluorometry. This technique enables the rapid assessment and comparison of the relative viability of germplasm during drought, and may complement detailed physiological and water relations studies. PMID:19014425
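The core parameter in this protocol is easy to compute from the two dark-adapted fluorescence readings. The sketch below shows the standard Fv/Fm relation plus a hypothetical survival-scoring rule; the numeric threshold is an illustrative assumption, not a value from the paper.

```python
def fv_over_fm(f0, fm):
    """Maximum quantum efficiency of photosystem II from dark-adapted
    minimal (F0) and maximal (Fm) fluorescence: Fv/Fm = (Fm - F0) / Fm."""
    if fm <= 0:
        raise ValueError("Fm must be positive")
    return (fm - f0) / fm

def survival_time(daily_fvfm, threshold=0.2):
    """Days until Fv/Fm first drops below a viability threshold.
    The 0.2 cutoff is a hypothetical example, not the paper's criterion."""
    for day, value in enumerate(daily_fvfm):
        if value < threshold:
            return day
    return None  # still viable at the end of the series

# A healthy, well-watered leaf typically sits near 0.83:
print(round(fv_over_fm(f0=300.0, fm=1765.0), 2))  # 0.83
print(survival_time([0.82, 0.80, 0.55, 0.10]))    # 3
```

Scoring each plant's series this way yields the numerical survival-time estimates that the abstract says are amenable to statistical comparison across lines.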

  2. Quantitative Assessment of Eye Phenotypes for Functional Genetic Studies Using Drosophila melanogaster

    PubMed Central

    Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S.; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R.; Isaacs, Adrian M.; Partridge, Linda; Lu, Bingwei; Kumar, Justin P.; Girirajan, Santhosh

    2016-01-01

    About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292

  3. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

Adams, S.M.; Brown, A.M.; Goede, R.W.

    1993-01-01

The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina), which is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina), which receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing this index to other types of fish health measures (contaminant, bioindicator, and reproductive analyses) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as did each of the other, more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.
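Because the HAI is a sum of numeric severity values assigned per organ or tissue, the scoring reduces to a few lines of arithmetic. The variables and severity codings below are hypothetical stand-ins, not the published field protocol.

```python
# Hypothetical severity codings: 0 for a normal organ and increasing
# values (e.g. 10, 20, 30) for progressively worse damage.
def health_assessment_index(scores):
    """HAI for one fish: the sum of severity values across all index variables."""
    return sum(scores.values())

def population_mean_hai(fishes):
    """Mean HAI across a sample of fish, for between-site statistical comparison."""
    return sum(health_assessment_index(f) for f in fishes) / len(fishes)

fish_a = {"gills": 0, "liver": 30, "spleen": 10, "kidney": 0}
fish_b = {"gills": 10, "liver": 0, "spleen": 0, "kidney": 0}
print(health_assessment_index(fish_a))         # 40
print(population_mean_hai([fish_a, fish_b]))   # 25.0
```

Higher mean values flag sites with poorer overall fish condition, which is the between-site comparison the abstract describes.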

  4. A rapid murine coma and behavior scale for quantitative assessment of murine cerebral malaria.

    PubMed

    Carroll, Ryan W; Wainwright, Mark S; Kim, Kwang-Youn; Kidambi, Trilokesh; Gómez, Noé D; Taylor, Terrie; Haldar, Kasturi

    2010-10-01

Cerebral malaria (CM) is a neurological syndrome that includes coma and seizures following malaria parasite infection. The pathophysiology is not fully understood and cannot be accounted for by infection alone: patients still succumb to CM even if the underlying parasite infection has resolved. Moreover, there is no known adjuvant therapy for CM. Current murine CM (MCM) models do not allow for rapid clinical identification of affected animals following infection. An animal model that more closely mimics the clinical features of human CM would be helpful in elucidating potential mechanisms of disease pathogenesis and evaluating new adjuvant therapies. A quantitative, rapid murine coma and behavior scale (RMCBS) comprising 10 parameters was developed to assess MCM manifested in C57BL/6 mice infected with Plasmodium berghei ANKA (PbA). Using this method, a single mouse can be completely assessed within 3 minutes. The RMCBS enables the operator to follow the evolution of the clinical syndrome, validated here by correlations with intracerebral hemorrhages. It provides a tool by which subjects can be identified as symptomatic prior to the initiation of trial treatment. Since the RMCBS enables an operator to rapidly follow the course of disease, label a subject as affected or not, and correlate the level of illness with neuropathologic injury, it can ultimately be used to guide the initiation of treatment after the onset of cerebral disease (thus emulating the situation in the field). The RMCBS is a tool by which an adjuvant therapy can be objectively assessed.

  5. Quantitative gait measurement with pulse-Doppler radar for passive in-home gait assessment.

    PubMed

    Wang, Fang; Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E

    2014-09-01

In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual-radar setup with one radar placed at foot level and the other at torso level is necessary. Excellent absolute agreement, with an intraclass correlation coefficient of 0.97, was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, each has a systematic offset relative to the ground truth caused by the walking direction with respect to the radar beam. The torso-level radar performs better in speed estimation (9% offset on average) than the foot-level radar (13%-18% offset). Quantitative analysis has been performed to compute the angles causing this systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment.
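The direction-dependent speed offset has a simple geometric origin: a Doppler radar senses only the velocity component along its beam. The sketch below shows that projection and a naive way to back out the implied beam angle; the inversion rule (offset = 1 − cos θ) is our simplifying assumption, not the paper's exact analysis.

```python
import math

def radial_speed(true_speed, angle_deg):
    """A Doppler radar measures only the velocity component along its beam:
    v_radial = v_true * cos(theta)."""
    return true_speed * math.cos(math.radians(angle_deg))

def angle_from_offset(offset_fraction):
    """Beam angle implied by a fractional speed underestimate, assuming the
    offset comes purely from walking direction: offset = 1 - cos(theta)."""
    return math.degrees(math.acos(1.0 - offset_fraction))

# Under this simple model, the reported 9% average torso-radar offset
# corresponds to a beam angle of roughly 24-25 degrees.
print(round(angle_from_offset(0.09), 1))  # ~24.5
```

The same projection explains why the foot-level radar, with its steeper viewing geometry, shows the larger 13%–18% offset.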

  6. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    SciTech Connect

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-07-15

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of certain specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here results of the application of QSTRs to assess rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  7. Computer-aided analysis with Image J for quantitatively assessing psoriatic lesion area.

    PubMed

    Sun, Z; Wang, Y; Ji, S; Wang, K; Zhao, Y

    2015-11-01

Body surface area is important in determining the severity of psoriasis. However, an objective, reliable, and practical method for this purpose is still needed. We performed computer image analysis (CIA) of psoriatic area using the ImageJ freeware to determine whether this method could be used for objective evaluation of psoriatic area. Fifteen psoriasis patients were randomized to be treated with adalimumab or placebo in a clinical trial. At each visit, the psoriasis area of each body site was estimated by two physicians (E-method), and standard photographs were taken. The psoriasis area in the pictures was assessed with CIA using semi-automatic threshold selection (T-method) or manual selection (M-method, gold standard). The results assessed by the three methods were analyzed, with reliability and affecting factors evaluated. Both the T- and E-methods correlated strongly with the M-method, with the T-method showing a slightly stronger correlation. Both the T- and E-methods had good consistency between evaluators. All three methods were able to detect the change in psoriatic area after treatment, although the E-method tended to overestimate it. CIA with the ImageJ freeware is reliable and practicable for quantitatively assessing the lesion area of psoriasis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  8. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection.

    PubMed

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-07-20

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations.
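A QMRA of this kind chains an exposure estimate (airborne concentration × breathing rate × exposure time) to a dose-response model. The exponential model below is a common QMRA choice rather than necessarily the exact model used in this study, and every number in the example is illustrative.

```python
import math

def inhaled_dose(conc_per_m3, inhalation_rate_m3_per_h, hours):
    """Dose = airborne pathogen concentration x breathing rate x exposure time."""
    return conc_per_m3 * inhalation_rate_m3_per_h * hours

def p_infection_exponential(dose, r):
    """Exponential dose-response model: P = 1 - exp(-r * dose).
    The pathogen-specific parameter r below is purely illustrative."""
    return 1.0 - math.exp(-r * dose)

# Illustrative scenario: 10 infectious units/m3, light-activity breathing
# rate of 1.5 m3/h, 15-minute exposure (e.g. a toilet visit).
dose = inhaled_dose(conc_per_m3=10.0, inhalation_rate_m3_per_h=1.5, hours=0.25)
print(round(p_infection_exponential(dose, r=0.4), 3))  # 0.777
```

Running the same chain with the measured concentrations for each setting is what ranks toilets above wastewater plants and landfills in the study's results.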

  9. Vulnerability of Russian regions to natural risk: experience of quantitative assessment

    NASA Astrophysics Data System (ADS)

    Petrova, E.

    2006-01-01

One of the important tracks leading to natural risk prevention, disaster mitigation or the reduction of losses due to natural hazards is the vulnerability assessment of an "at-risk" region. The majority of researchers propose to assess vulnerability through expert evaluation of several qualitative characteristics, usually scoring each of them with three ratings: low, average, and high. Unlike these investigations, we attempted a quantitative vulnerability assessment using multidimensional statistical methods. Cluster analysis of all 89 Russian regions revealed five types of region, each characterized by a single (rarely two) prevailing factor that increases vulnerability. These factors are: the sensitivity of the technosphere to unfavorable influences; a "human factor"; a high volume of stored toxic waste, which increases the possibility of NDs with serious consequences; a low per capita GRP, which determines reduced prevention and protection costs; and a heightened exposure to natural disasters, which can be aggravated by unfavorable social processes. The proposed methods allowed us to identify the prevailing risk (vulnerability) factor for each region type, which helps to show where risk management should focus.

  10. Quantitative assessment of reactive hyperemia using laser speckle contrast imaging at multiple wavelengths

    NASA Astrophysics Data System (ADS)

    Young, Anthony; Vishwanath, Karthik

    2016-03-01

Reactive hyperemia refers to an increase of blood flow in tissue after release of an occlusion in the local vasculature. Measuring the temporal response of reactive hyperemia post-occlusion has the potential to shed light on microvascular diseases such as systemic sclerosis and diabetes. Laser speckle contrast imaging (LSCI) is an imaging technique capable of sensing superficial blood flow in tissue, which can be used to quantitatively assess reactive hyperemia. Here, we employ LSCI using coherent sources at blue, green and red wavelengths to evaluate reactive hyperemia in healthy human volunteers. Blood flow in the forearms of subjects was measured using LSCI to assess the time course of reactive hyperemia triggered by a pressure cuff applied to the biceps. Raw speckle images were acquired and processed to yield blood-flow parameters from a region of interest before, during and after application of occlusion. Reactive hyperemia was quantified via two measures: (1) the difference between the peak LSCI flow during hyperemia and the baseline flow, and (2) the time elapsed between the release of the occlusion and the peak flow. These measurements were acquired in three healthy human participants under the three laser wavelengths employed. The studies shed light on the utility of in vivo LSCI-based flow sensing for non-invasive assessment of reactive hyperemia responses and on how the choice of source wavelength influences the measured parameters.
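The two hyperemia measures reduce to a peak-minus-baseline difference and a time-to-peak. A minimal sketch over a toy relative-flow trace (all sampling times and flow values are invented; real traces come from the processed speckle images):

```python
def hyperemia_metrics(times, flows, release_time, baseline_window_end):
    """Return (peak flow - baseline flow, time from occlusion release to peak).
    `flows` are relative LSCI flow values sampled at `times` (seconds);
    the baseline is the mean of samples before `baseline_window_end`."""
    baseline = [f for t, f in zip(times, flows) if t < baseline_window_end]
    base = sum(baseline) / len(baseline)
    post = [(t, f) for t, f in zip(times, flows) if t >= release_time]
    peak_t, peak_f = max(post, key=lambda tf: tf[1])
    return peak_f - base, peak_t - release_time

# Toy trace: baseline near 1.0, cuff released at t=60 s, peak at t=62 s.
times = [0, 1, 2, 60, 61, 62, 63, 64]
flows = [1.0, 1.1, 0.9, 0.4, 2.6, 3.0, 2.2, 1.5]
delta, t_peak = hyperemia_metrics(times, flows, release_time=60, baseline_window_end=3)
print(round(delta, 6), t_peak)  # ~2.0, 2
```

Repeating this per wavelength is how the study compares what blue, green and red illumination each report for the same physiological response.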

  11. Decorticate spasticity: a re-examination using quantitative assessment in the primate.

    PubMed

    Tasker, R R; Gentili, F; Sogabe, K; Shanlin, M; Hawrylyshyn, P

    1975-08-01

    Decorticate spasticity in the squirrel monkey was chosen as a convenient laboratory model of spasticity capable of quantitative assessment upon which to evaluate various currently popular clinical spasmolytic measures. The effects of a wide variety of cortical lesions were studied involving primary and supplementary motor, premotor and parietal cortex unilaterally and bilaterally, measuring muscle tone with the evoked integrated E.M.G. technique. Measurable spasticity resulted only if primary motor cortex was ablated bilaterally usually but not always preferentially involving biceps brachii and quadriceps. Resulting postures were variable offering no justification for the term "decorticate posture". The integrated evoked E.M.G. was proportional to rate of stretch and chiefly phasic in type as in hemiplegic man. Stereotactic dentatectomy resulted in profound ipsilateral reduction in this spasticity, but was without effect in intercollicular or anemic decerebrate cats. The mechanism of the spasticity and of the cerebellar effects are discussed.

  12. Quantitative assessment of the benefits of specific information technologies applied to clinical studies in developing countries.

    PubMed

    Avilés, William; Ortega, Oscar; Kuan, Guillermina; Coloma, Josefina; Harris, Eva

    2008-02-01

    Clinical studies and trials require accessibility of large amounts of high-quality information in a timely manner, often daily. The integrated application of information technologies can greatly improve quality control as well as facilitate compliance with established standards such as Good Clinical Practice (GCP) and Good Laboratory Practice (GLP). We have customized and implemented a number of information technologies, such as personal data assistants (PDAs), geographic information system (GIS), and barcode and fingerprint scanning, to streamline a pediatric dengue cohort study in Managua, Nicaragua. Quantitative data was obtained to assess the actual contribution of each technology in relation to processing time, accuracy, real-time access to data, savings in consumable materials, and time to proficiency in training sessions. In addition to specific advantages, these information technologies benefited not only the study itself but numerous routine clinical and laboratory processes in the health center and laboratories of the Nicaraguan Ministry of Health.

  13. The challenge of measuring lung structure. On the "Standards for the Quantitative Assessment of Lung Structure".

    PubMed

    Weibel, Ewald R

    2010-09-01

    The purpose of this review is to call attention of respiratory scientists to an Official Policy Statement jointly issued by the American Thoracic Society and the European Respiratory Society on "Standards for the Quantitative Assessment of Lung Structure", based on an extended report of a joint task force of 20 experts, and recently published in the Am. J. Respir. Crit. Care Med. This document provides investigators of normal and diseased lung structure with a review of the stereological methods that allow measurements to be done on sections. It critically discusses the preparation procedures, the conditions for unbiased sampling of the lung for microscopic study, and the potential applications of such methods. Here we present some case studies that underpin the importance of using accurate methods of structure quantification and outline paths into the future for structure-function studies on lung diseases.

  14. A quantitative assessment of using the Kinect for Xbox 360 for respiratory surface motion tracking

    NASA Astrophysics Data System (ADS)

    Alnowami, M.; Alnwaimi, B.; Tahavori, F.; Copland, M.; Wells, K.

    2012-02-01

This paper describes a quantitative assessment of the Microsoft Kinect for Xbox 360™ for potential application in tracking respiratory and body motion in diagnostic imaging and external beam radiotherapy. However, the results can also be used in many other biomedical applications. We consider the performance of the Kinect under controlled conditions and find mm precision at depths of 0.8-1.5 m. We also demonstrate the use of the Kinect for monitoring respiratory motion of the anterior surface. To improve the performance of respiratory monitoring, we fit a spline model of the chest surface through the depth data as a marker-less method of monitoring respiratory motion. In addition, a comparison of the Kinect camera, with and without a zoom lens, against a marker-based system was used to evaluate the accuracy of the Kinect as a respiratory tracking system.
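A marker-less respiratory surrogate from depth data can be as simple as the mean depth over a chest region of interest per frame. The paper fits a spline surface through the depth data, so the ROI mean below is a simplified stand-in, and the toy depth frames are invented.

```python
def chest_depth_signal(frames, roi):
    """Mean depth (e.g. mm) inside a rectangular chest ROI for each frame.
    `frames` is a list of 2-D depth images (lists of rows);
    roi = (row_start, row_stop, col_start, col_stop), half-open ranges."""
    r0, r1, c0, c1 = roi
    signal = []
    for frame in frames:
        vals = [frame[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        signal.append(sum(vals) / len(vals))
    return signal

# Two 3x3 toy depth frames: the chest sits ~5 mm further away on exhale.
frames = [
    [[900, 900, 900], [900, 905, 905], [900, 905, 905]],
    [[900, 900, 900], [900, 900, 900], [900, 900, 900]],
]
print(chest_depth_signal(frames, roi=(1, 3, 1, 3)))  # [905.0, 900.0]
```

Tracking this per-frame mean over time yields a breathing waveform; a smooth surface fit such as the authors' spline model reduces the influence of per-pixel depth noise on that waveform.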

  15. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears that treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasi-experimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  16. Quantitative risk assessment & leak detection criteria for a subsea oil export pipeline

    NASA Astrophysics Data System (ADS)

    Zhang, Fang-Yuan; Bai, Yong; Badaruddin, Mohd Fauzi; Tuty, Suhartodjo

    2009-06-01

A quantitative risk assessment (QRA) based on leak detection criteria (LDC) for the design of a proposed subsea oil export pipeline is presented in this paper. The objective of this QRA/LDC study was to determine whether current leak detection methodologies were sufficient, based on QRA results, while excluding the use of statistical leak detection; if not, an appropriate LDC for the leak detection system would need to be established. The well-known UK PARLOC database was used to calculate pipeline failure rates, and the POSVCM software from MMS was used for oil spill simulations. QRA results revealed that installing a statistically based leak detection system (LDS) can significantly reduce the time to leak detection, thereby mitigating the consequences of leakage. A sound LDC was defined, based on the QRA results and comments from various LDS vendors, to help the emergency response team (ERT) quickly identify and locate leakage and employ the most effective measures to contain damage.

  17. Development of neutron imaging quantitative data treatment to assess conservation products in cultural heritage.

    PubMed

    Realini, Marco; Colombo, Chiara; Conti, Claudia; Grazzi, Francesco; Perelli Cippo, Enrico; Hovind, Jan

    2017-08-14

Distribution, penetration depth, and the amount of new mineralogical phases formed by the interaction between an inorganic treatment and a matrix are key factors in evaluating the behaviour of a conservation treatment. Conventional analytical methodologies, such as vibrational spectroscopies, scanning electron microscopy, and X-ray diffraction, provide only qualitative, spot information. Here we report, for the first time, a proof of concept of a neutron-imaging methodology able to provide quantitative data for assessing the formation of calcium oxalate in a porous carbonatic stone treated with ammonium oxalate. Starting from the neutron attenuation coefficients of treated Noto stone specimens, the concentration of newly formed calcium oxalate and the diffusion coefficient were calculated for both sound and decayed substrates. These outcomes were also used in a comparative study of different treatment modalities. Graphical abstract: horizontal slice at 300 mm depth and CaOx molar density profile from the NEUTRA output.

  18. Attribution of human VTEC O157 infection from meat products: a quantitative risk assessment approach.

    PubMed

    Kosmider, Rowena D; Nally, Pádraig; Simons, Robin R L; Brouwer, Adam; Cheung, Susan; Snary, Emma L; Wooldridge, Marion

    2010-05-01

To address the risk posed to human health by the consumption of VTEC O157 in contaminated pork, lamb, and beef products in Great Britain, a quantitative risk assessment model has been developed. The model aims to simulate the prevalence and amount of VTEC O157 in different meat products at consumption within a single framework by adapting previously developed models. The model is stochastic in nature, enabling both variability (natural variation between animals, carcasses, and products) and uncertainty (lack of knowledge) in the input parameters to be modeled. Based on the model assumptions and data, it is concluded that the prevalence of VTEC O157 in meat products (joints and mince) at consumption is low (i.e., <0.04%). Beef products, particularly beef burgers, present the highest estimated risk, with an estimated eight out of 100,000 servings on average resulting in human infection with VTEC O157.
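A stochastic model of this kind samples variability in contamination and dose per serving, then applies a dose-response relation. The sketch below uses an exponential dose-response model and invented parameter values (prevalence, dose distribution, and the coefficient `r` are all illustrative assumptions, not the values fitted in the study, which may well use a different dose-response form):

```python
import numpy as np

def simulate_servings(n, prev, r, rng):
    """Monte Carlo over n servings: each serving is contaminated with
    probability `prev`; contaminated servings carry a lognormally distributed
    dose (cfu); infection follows an exponential dose-response model
    P(ill) = 1 - exp(-r * dose). Returns the simulated number of infections."""
    contaminated = rng.random(n) < prev
    dose = np.where(contaminated, 10 ** rng.normal(1.0, 0.8, n), 0.0)
    p_ill = 1.0 - np.exp(-r * dose)
    return int((rng.random(n) < p_ill).sum())

rng = np.random.default_rng(42)
cases = simulate_servings(1_000_000, prev=0.0004, r=0.002, rng=rng)
print(cases, "infections per 1,000,000 servings")
```

Separating variability (the sampled doses) from uncertainty (distributions over the parameters themselves) would add an outer sampling loop over `prev` and `r`, which is the two-dimensional structure the abstract describes.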

  19. Quantitative Framework for Retrospective Assessment of Interim Decisions in Clinical Trials

    PubMed Central

    Stanev, Roger

    2016-01-01

    This article presents a quantitative way of modeling the interim decisions of clinical trials. While statistical approaches tend to focus on the epistemic aspects of statistical monitoring rules, often overlooking ethical considerations, ethical approaches tend to neglect the key epistemic dimension. The proposal is a second-order decision-analytic framework. The framework provides means for retrospective assessment of interim decisions based on a clear and consistent set of criteria that combines both ethical and epistemic considerations. The framework is broadly Bayesian and addresses a fundamental question behind many concerns about clinical trials: What does it take for an interim decision (e.g., whether to stop the trial or continue) to be a good decision? Simulations illustrating the modeling of interim decisions counterfactually are provided. PMID:27353825

  20. Quantitative indices for the assessment of the repeatability of distortion product otoacoustic emissions in laboratory animals.

    PubMed

    Parazzini, Marta; Galloni, Paolo; Brazzale, Alessandra R; Tognola, Gabriella; Marino, Carmela; Ravazzani, Paolo

    2006-08-01

Distortion product otoacoustic emissions (DPOAE) can be used to study cochlear function in an objective and non-invasive manner. One practical and essential aspect of any investigative measure is the consistency of its results upon repeated testing of the same individual or animal (i.e., its test/retest repeatability). The goal of the present work is to propose two indices for quantitatively assessing the repeatability of DPOAE in laboratory animals. The methodology is illustrated using two data sets consisting of DPOAE collected sequentially from Sprague-Dawley rats. The results of these experiments showed that the proposed indices are capable of estimating both the repeatability of the true emission level and the inconsistencies associated with measurement error. These indices could be a useful tool for identifying real, even small, changes in cochlear function exerted by potential ototoxic agents.

  1. Quantitative assessment of manual and robotic microcannulation for eye surgery using new eye model.

    PubMed

    Tanaka, Shinichi; Harada, Kanako; Ida, Yoshiki; Tomita, Kyohei; Kato, Ippei; Arai, Fumihito; Ueta, Takashi; Noda, Yasuo; Sugita, Naohiko; Mitsuishi, Mamoru

    2015-06-01

    Microcannulation, a surgical procedure for the eye that requires drug injection into a 60-90 µm retinal vein, is difficult to perform manually. Robotic assistance has been proposed; however, its effectiveness in comparison to manual operation has not been quantified. An eye model has been developed to quantify the performance of manual and robotic microcannulation. The eye model, which is implemented with a force sensor and microchannels, also simulates the mechanical constraints of the instrument's movement. Ten subjects performed microcannulation using the model, with and without robotic assistance. The results showed that the robotic assistance was useful for motion stability when the drug was injected, whereas its positioning accuracy offered no advantage. An eye model was used to quantitatively assess the robotic microcannulation performance in comparison to manual operation. This approach could be valid for a better evaluation of surgical robotic assistance. Copyright © 2014 John Wiley & Sons, Ltd.

  2. Quantitative Framework for Retrospective Assessment of Interim Decisions in Clinical Trials.

    PubMed

    Stanev, Roger

    2016-11-01

    This article presents a quantitative way of modeling the interim decisions of clinical trials. While statistical approaches tend to focus on the epistemic aspects of statistical monitoring rules, often overlooking ethical considerations, ethical approaches tend to neglect the key epistemic dimension. The proposal is a second-order decision-analytic framework. The framework provides means for retrospective assessment of interim decisions based on a clear and consistent set of criteria that combines both ethical and epistemic considerations. The framework is broadly Bayesian and addresses a fundamental question behind many concerns about clinical trials: What does it take for an interim decision (e.g., whether to stop the trial or continue) to be a good decision? Simulations illustrating the modeling of interim decisions counterfactually are provided.

  3. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

Background: The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members in evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives: We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure that learning objectives were met through mutually beneficial CBPR approaches. Methods: A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from the quantitative questions on the assessments, pre- and post-tests, and evaluations. Results: CARES fellows' knowledge increased from baseline (38% of questions answered correctly on average) to follow-up (75% answered correctly on average); post-test scores were higher than pre-test scores in 9 of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions: The CARES fellows training program succeeded in satisfying participants and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community-academic research partnerships. PMID:22982849

  4. Hydrologic connectivity: Quantitative assessments of hydrologic-enforced drainage structures in an elevation model

    USGS Publications Warehouse

    Poppenga, Sandra; Worstell, Bruce B.

    2016-01-01

Elevation data derived from light detection and ranging (lidar) present challenges for hydrologic modeling because the elevation surface includes bridge decks and elevated road features overlying culvert drainage structures. In reality, water is carried through these structures; in the elevation surface, however, these features impede modeled overland surface flow. Thus, a hydrologically enforced elevation surface is needed for hydrodynamic modeling. In the Delaware River Basin, hydrologic-enforcement techniques were used to modify elevations to simulate how constructed drainage structures allow overland surface flow. By calculating residuals between unfilled and filled elevation surfaces, artificially pooled depressions that formed upstream of constructed drainage structures were identified, and elevation values were adjusted by generating transects at the locations of the drainage structures. An assessment of each hydrologically enforced drainage structure was conducted using field-surveyed culvert and bridge coordinates obtained from numerous public agencies, but it was discovered that the disparate drainage-structure datasets were not comprehensive enough to assess all remotely located depressions in need of hydrologic enforcement. Alternatively, orthoimagery was interpreted to define drainage structures near each depression, and these locations were used as reference points for a quantitative hydrologic-enforcement assessment. The orthoimagery-interpreted reference points resulted in a larger corresponding sample size than the assessment between hydrologically enforced transects and field-surveyed data. This assessment demonstrates the viability of rules-based hydrologic enforcement, which is needed to achieve hydrologic connectivity and is valuable for hydrodynamic models in sensitive coastal regions. Hydrologically enforced elevation data are also essential for merging with topographic/bathymetric elevation data that extend over vulnerable urbanized areas and dynamic coastal
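The residual step described above (fill depressions, subtract the unfilled surface to find artificial pooling, then cut the elevation at the drainage structure) can be illustrated on a one-dimensional elevation profile, a deliberately simplified stand-in for the 2-D DEM workflow; the profile values are made up:

```python
def fill_depressions(elev):
    """1-D depression filling: each cell's water level is the lower of the
    highest barrier to its left and to its right (a simple analogue of
    filling sinks in a DEM before flow routing)."""
    left, m = [], float("-inf")
    for z in elev:
        m = max(m, z); left.append(m)
    right, m = [], float("-inf")
    for z in reversed(elev):
        m = max(m, z); right.append(m)
    right.reverse()
    return [max(z, min(l, r)) for z, l, r in zip(elev, left, right)]

# Terrain sloping toward a culvert under a road embankment (index 4)
profile = [5, 4, 3, 2, 6, 1, 0]
residual = [f - z for f, z in zip(fill_depressions(profile), profile)]
# residual == [0, 1, 2, 3, 0, 0, 0]: artificial pooling upstream of the embankment

enforced = list(profile)
enforced[4] = 1                     # cut a transect through the embankment
drained = [f - z for f, z in zip(fill_depressions(enforced), enforced)]
# drained == [0, 0, 0, 0, 0, 0, 0]: modeled flow now passes through the structure
```

The nonzero residuals flag exactly the cells where hydrologic enforcement is needed, which is how the depressions upstream of bridge decks and culverts were located.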

  5. Coherent and consistent decision making for mixed hazardous waste management: The application of quantitative assessment techniques

    SciTech Connect

    Smith, G.M.; Little, R.H.; Torres, C.

    1994-12-31

This paper focuses on predictive modelling capacity for post-disposal safety assessments of land-based disposal facilities, illustrated by the development and application of a comprehensive yet practicable assessment framework. The issues addressed include: (1) land-based disposal practice; (2) the conceptual and mathematical representation of processes leading to release, migration, and accumulation of contaminants; (3) the identification and evaluation of relevant assessment end-points, including human health, the health of non-human biota and ecosystems, and property and resource effects; (4) the gap between data requirements and data availability; and (5) the application of results in decision making, given the uncertainties in assessment results and the difficulty of comparing qualitatively different impacts arising on different temporal and spatial scales. The paper illustrates these issues with examples based on disposal of metals and radionuclides to shallow facilities. The types of disposal facility considered include features consistent with facilities for radioactive wastes as well as other designs more typical of hazardous wastes. The intention is to raise the question of whether radioactive and other hazardous wastes are being consistently managed, and to show that assessment methods are being developed which can provide quantitative information on levels of environmental impact and a consistent approach for different types of waste; such methods can then be applied to mixed hazardous wastes containing radionuclides as well as other contaminants. The remaining question is whether the will exists to employ them. The discussion and worked illustrations are based on a methodology developed, and being extended, within the European Atomic Energy Community's cost-sharing research program on radioactive waste management and disposal, with co-funding support from Empresa Nacional de Residuos Radiactivos SA, Spain.

  6. Quantitative ventilation-perfusion lung scans in infants and children: utility of a submicronic radiolabeled aerosol to assess ventilation

    SciTech Connect

    O'Brodovich, H.M.; Coates, G.

    1984-09-01

The quantitative assessment of regional pulmonary ventilation and perfusion provides useful information about lung function. Its use in infants and young children, however, has been minimal because of practical and technical limitations when the distribution of ventilation is assessed with radioactive gases. In 16 infants and children we used an inexpensive, commercially available nebulizer to produce a submicronic aerosol labeled with (99m)Tc-diethylenetriamine pentaacetic acid to assess ventilation quantitatively, and intravenous injections of (99m)Tc-labeled macroaggregates of albumin to assess pulmonary perfusion quantitatively. Studies were safely completed in both ambulatory and critically ill patients, including two premature infants who had endotracheal tubes in place for ventilatory support. No sedation or patient cooperation is required. This technique enables any department of nuclear medicine to measure regional pulmonary ventilation and perfusion in infants and children.

  7. Quantitative assessment of intragenic receptor tyrosine kinase deletions in primary glioblastomas: their prevalence and molecular correlates.

    PubMed

    Kastenhuber, Edward R; Huse, Jason T; Berman, Samuel H; Pedraza, Alicia; Zhang, Jianan; Suehara, Yoshiyuki; Viale, Agnes; Cavatore, Magali; Heguy, Adriana; Szerlip, Nicholas; Ladanyi, Marc; Brennan, Cameron W

    2014-05-01

Intragenic deletion is the most common form of activating mutation among receptor tyrosine kinases (RTKs) in glioblastoma. However, these events are not detected by the conventional DNA sequencing methods commonly used for tumor genotyping. To comprehensively assess the frequency, distribution, and expression levels of common RTK deletion mutants in glioblastoma, we analyzed RNA from a set of 192 glioblastoma samples from The Cancer Genome Atlas for the expression of EGFRvIII, EGFRvII, EGFRvV (carboxyl-terminal deletion), and PDGFRAΔ8,9. These mutations were detected in 24%, 1.6%, 4.7%, and 1.6% of cases, respectively. Overall, 29% (55/189) of glioblastomas expressed at least one RTK intragenic deletion transcript in this panel. For EGFRvIII, samples were analyzed both by quantitative real-time PCR (qRT-PCR) and by single mRNA molecule counting on the NanoString nCounter platform. NanoString proved to be highly sensitive, specific, and linear, with sensitivity comparable to or exceeding that of RNA-seq. We evaluated the prognostic significance and molecular correlates of RTK rearrangements. EGFRvIII was detectable only in tumors with focal amplification of the gene. Moreover, we found that EGFRvIII expression was not prognostic of poor outcome and that neither recurrent copy number alterations nor global changes in gene expression differentiate EGFRvIII-positive tumors from tumors with amplification of wild-type EGFR. The wide range of expression of mutant alleles and the co-expression of multiple EGFR variants suggest that quantitative RNA-based clinical assays will be important for assessing the relative expression of intragenic deletions as therapeutic targets and/or candidate biomarkers. To this end, we demonstrate the performance of the NanoString assay on RNA derived from routinely collected formalin-fixed, paraffin-embedded tissue.

  8. MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?

    SciTech Connect

    Giger, M; Petrick, N; Obuchowski, N; Kinahan, P

    2014-06-15

    The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.

  9. Quantitative thallium-201 myocardial imaging in assessing right ventricular pressure in patients with congenital heart defects.

    PubMed Central

    Rabinovitch, M; Fischer, K C; Treves, S

    1981-01-01

Thallium-201 myocardial scintigraphy was performed in patients with congenital heart defects to determine whether, by quantification of right ventricular isotope uptake, one could assess the degree of right ventricular hypertrophy and so predict the level of right ventricular pressure. A total of 24 patients ranging in age from 7 months to 30 years were studied; 18 were studied before corrective surgery and six after operation. All but three had congenital heart defects which had resulted in pressure- and/or volume-overload of the right ventricle. At routine cardiac catheterisation, 20 microCi/kg thallium-201 as thallous chloride was injected through the venous catheter and myocardial images were recorded in anterior and left anterior oblique projections; these were subsequently analysed quantitatively and qualitatively. Insignificant right ventricular thallium-201 counts, judged as less than 1 per cent of the injected dose or less than 0.3 of the left ventricular counts, were present in six patients, all with right ventricular peak systolic pressure less than 30 mmHg. In the remaining 18 patients there was a good correlation between the right ventricular/left ventricular peak systolic pressure ratio and the right ventricular/left ventricular thallium-201 counts ratio. All patients with right ventricular/left ventricular peak systolic pressure less than 0.5 had right ventricular/left ventricular thallium-201 counts less than 0.4. Qualitative evaluation of right ventricular isotope intensity proved helpful mainly in distinguishing the patients with right ventricular pressures at or above systemic levels. Thus quantitative analysis of myocardial imaging with thallium-201 is of use clinically in patients with congenital heart defects, in assessing the severity of pulmonary stenosis or the presence of pulmonary artery hypertension. PMID:7459178

  10. Exploring a new quantitative image marker to assess benefit of chemotherapy to ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Mirniaharikandehei, Seyedehnafiseh; Patil, Omkar; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin

    2017-03-01

Accurately assessing the potential benefit of chemotherapy to cancer patients is an important prerequisite to developing precision medicine in cancer treatment. A previous study showed that total psoas area (TPA) measured on preoperative cross-sectional CT images might be a good image marker for predicting the long-term outcome of pancreatic cancer patients after surgery. However, accurate and automated segmentation of TPA from CT images is difficult because of its fuzzy boundary and its connection to other muscle areas. In this study, we developed a new interactive computer-aided detection (ICAD) scheme that aims to segment TPA from abdominal CT images more accurately, and assessed the feasibility of using this new quantitative image marker to predict the benefit to ovarian cancer patients of receiving Bevacizumab-based chemotherapy. The ICAD scheme identifies a CT image slice of interest located at the level of L3 (vertebral spine). The cross-sections of the right and left TPA are segmented using a set of adaptively adjusted boundary conditions, and TPA is then quantitatively measured. In addition, recent studies have suggested that muscle radiation attenuation, which reflects fat deposition in the tissue, might be a good image feature for predicting the survival rate of cancer patients. The scheme and the TPA measurement task were applied to a large national clinical trial database involving 1,247 ovarian cancer patients. By comparison with manual segmentation results, we found that the ICAD scheme yields higher accuracy and consistency for this task. The new ICAD scheme gives clinical researchers a useful tool to extract TPA and muscle radiation attenuation more efficiently and accurately as new image markers, and allows them to investigate their discriminatory power for predicting progression-free survival and/or overall survival of cancer patients before and after chemotherapy.

  11. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

Waters, Michael; Jackson, Marcus

    2008-11-15

The Workshop on the Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed, with federal support, several databases that have been used in hazard identification and risk assessment. The first of these, the EPA Gene-Tox Database, was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants, and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed (1) the NIEHS Genetic Alterations in Cancer (GAC) Database, which quantifies specific mutations found in cancers induced by environmental agents, and (2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase, which integrates genomic and other biological data including

  12. Purity assessment of ginsenoside Rg1 using quantitative (1)H nuclear magnetic resonance.

    PubMed

    Huang, Bao-Ming; Xiao, Sheng-Yuan; Chen, Ting-Bo; Xie, Ying; Luo, Pei; Liu, Liang; Zhou, Hua

    2017-05-30

Ginseng herbs comprise a group of the most popular herbs, including Panax ginseng, P. notoginseng, and P. quinquefolius (family Araliaceae), which are used in traditional Chinese medicine (TCM) and are among the best-selling natural products in the world. Accurate quantification of ginsenoside Rg1 is a major aspect of their quality control. However, the purity of the commercial Rg1 chemical reference substance (CRS) is often measured with high-performance liquid chromatography coupled with an ultraviolet detector (HPLC-UV), a selective detector with unequal responses to different compounds, which introduces probable error into purity assessments. In the present study, quantitative nuclear magnetic resonance (qNMR), owing to its capability for absolute quantification, was applied to accurately assess the purity of Rg1 CRS. Phenylmethyl phthalate was used as the internal standard (IS) to calibrate the purity of Rg1 CRS. The proton signal of Rg1 CRS in methanol-d4 at 4.37 ppm was selected to avoid interfering signals, enabling accurate quantitative analysis. The relaxation delay, number of scans, and NMR windowing were optimized for data acquisition. For post-processing, the Lorentz/Gauss deconvolution method was employed to increase signal accuracy by separating the impurities and noise in the integrated region of the quantitative proton. Method validation showed that the developed method has acceptable sensitivity, linearity, precision, and accuracy. The purity of the commercial Rg1 CRS examined with this method was 90.34±0.21%, obviously lower than that reported by the manufacturer (>98.0%, HPLC-UV). Cross-method validation shows that the commonly used HPLC-UV, HPLC-ELSD (evaporative light scattering detector), and even LC-MS (mass spectrometry) methods provide significantly higher purity values for Rg1 CRS than the qNMR method, and the accuracy of these LC-based methods largely depends on the
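The absolute quantification underlying internal-standard qNMR reduces to one ratio equation: analyte purity follows from the integrals of the analyte and internal-standard signals, normalized by proton counts, molar masses, and weighed masses. A sketch of the standard relation (the example inputs are hypothetical, not the paper's data; only the molar mass of Rg1, about 801 g/mol, is a literature value):

```python
def qnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s):
    """Internal-standard qNMR purity:
    P_a = (I_a/I_s) * (N_s/N_a) * (M_a/M_s) * (m_s/m_a) * P_s
    I: integrated signal area, N: number of protons giving the signal,
    M: molar mass (g/mol), m: weighed mass (mg), P: purity (fraction)."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

# Hypothetical weighings and integrals, for illustration only
p = qnmr_purity(I_a=1.00, I_s=2.10, N_a=1, N_s=2,
                M_a=801.0, M_s=256.3, m_a=10.0, m_s=3.0, P_s=0.998)
```

Because every factor is either weighed or counted, the result does not depend on detector response factors, which is why qNMR can expose an inflated HPLC-UV purity value.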

  13. Quantitative crystalline silica exposure assessment for a historical cohort epidemiologic study in the German porcelain industry.

    PubMed

    Birk, Thomas; Guldner, Karlheinz; Mundt, Kenneth A; Dahmann, Dirk; Adams, Robert C; Parsons, William

    2010-09-01

A time-dependent quantitative assessment of silica exposure among nearly 18,000 German porcelain workers was conducted. The results will be used to evaluate exposure-response disease risks. Over 8,000 historical industrial hygiene (IH) measurements with original sampling and analysis protocols from 1954-2006 were obtained from the German Berufsgenossenschaft der keramischen- und Glas-Industrie (BGGK) and used to construct a job exposure matrix (JEM). Early measurements from different devices were converted to modern gravimetric-equivalent values. Conversion factors were derived from parallel historical measurements and from new side-by-side measurements using historical and modern devices in laboratory dust tunnels and active workplace locations. Exposure values were summarized and smoothed using LOESS regression; estimates for early years were derived by backward extrapolation. Employee work histories were merged with JEM values to determine cumulative crystalline silica exposures for cohort members. Average silica concentrations were derived for six primary similar exposure groups (SEGs) for 1938-2006. Over 40% of the cohort accumulated <0.5 mg/m³-years; just over one-third accumulated >1 mg/m³-years. Nearly 5,000 workers had cumulative crystalline silica estimates >1.5 mg/m³-years. Similar numbers of men and women fell into each cumulative exposure category, except for 1,113 women and 1,567 men in the highest category. Over half of those hired before 1960 accumulated >3 mg/m³-years of crystalline silica, compared with 4.9% of those hired after 1960. Among those who ever worked in the materials preparation area, half accumulated >3 mg/m³-years, compared with 12% of those who never worked in this area. Quantitative respirable silica exposures were estimated for each member of this cohort, including employment periods for which sampling used now-obsolete technologies. Although individual cumulative exposure estimates ranged from background to about 40 mg/m³-years
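Merging work histories with a JEM reduces to summing concentration × duration over each employment interval. A minimal sketch, in which the job titles, periods, and concentrations are invented for illustration:

```python
# Job-exposure matrix: mean respirable silica concentration (mg/m^3) for each
# (similar exposure group, period) -- values here are invented for illustration.
jem = {("preparation", 1955): 0.30,
       ("preparation", 1965): 0.12,
       ("glazing", 1965): 0.05}

def cumulative_exposure(history):
    """history: list of (job, period, years_worked); returns mg/m^3-years."""
    return sum(jem[(job, period)] * years for job, period, years in history)

worker = [("preparation", 1955, 5), ("glazing", 1965, 10)]
print(cumulative_exposure(worker))   # 0.30*5 + 0.05*10 = 2.0 mg/m^3-years
```

In the actual study the JEM cells carry the LOESS-smoothed, device-converted concentrations, but the merge-and-sum step is exactly this.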

  14. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study.

    PubMed

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R

    2017-01-01

Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of bone pathologies such as osteoporosis. Axial-transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g., geometry, porosity, anisotropy) on the measured wave propagation. To address this entanglement problem, we propose a systematic bottom-up approach in which only one bone property is varied before interaction effects are addressed. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity and in individual pore characteristics using specifically designed cortical bone phantoms. Fourteen bone phantoms were designed with varying pore size, axial pore number, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured by laser sintering, measured using axial-transmission acoustics, and analysed using a full-wave approach. Experimental results were compared with theoretical predictions based on a modified Timoshenko theory. A clear dependence of phase velocity on frequency and on porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by 2-5 m/s per percent of additional porosity, corresponding to -0.5% to -1.0% of wave speed. While the change in phase velocity due to axial pore number was consistent with the results for pore size and radial pore number, the relative uncertainties of the estimates were too high to draw conclusions for this parameter. This work has shown the

  15. Quantitative Assessment of Regional Wall Motion Abnormalities Using Dual-Energy Digital Subtraction Intravenous Ventriculography

    NASA Astrophysics Data System (ADS)

    McCollough, Cynthia H.

    Healthy portions of the left ventricle (LV) can often compensate for regional dysfunction, thereby masking regional disease when global indices of LV function are employed. Thus, quantitation of regional function provides a more useful method of assessing LV function, especially in diseases with regional effects such as coronary artery disease. This dissertation studied the ability of a phase-matched dual-energy digital subtraction angiography (DE-DSA) technique to quantitate changes in regional LV systolic volume. The potential benefits and a theoretical description of the DE imaging technique are detailed. A correlated noise reduction algorithm is also presented which raises the signal-to-noise ratio of DE images by a factor of 2-4. Ten open-chest dogs were instrumented with transmural ultrasonic crystals to assess regional LV function in terms of systolic normalized wall-thickening rate (NWTR) and percent systolic thickening (PST). A pneumatic occluder was placed on the left anterior descending (LAD) coronary artery to temporarily reduce myocardial blood flow, thereby changing regional LV function in the LAD bed. DE-DSA intravenous left ventriculograms were obtained at control and at four levels of graded myocardial ischemia, as determined by reductions in PST. Phase-matched images displaying changes in systolic contractile function were created by subtracting an end-systolic (ES) control image from ES images acquired at each level of myocardial ischemia. The resulting wall-motion difference signal (WMD), which represents a change in regional systolic volume between the control and ischemic states, was quantitated by videodensitometry and compared with changes in NWTR and PST. Regression analysis of 56 data points from 10 animals shows a linear relationship between WMD and both NWTR and PST: WMD = -2.46 NWTR + 13.9, r = 0.64, p < 0.001; WMD = -2.11 PST + 18.4, r = 0.54, p < 0.001.
Thus, changes in regional ES LV volume between rest and ischemic states, as
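The two reported regression relations can be written out directly. The coefficients are taken verbatim from the abstract; this is a sketch for evaluating the fitted lines, not the study's analysis code.

```python
# Regression relations reported in the abstract (56 data points, 10 animals):
#   WMD = -2.46 * NWTR + 13.9   (r = 0.64, p < 0.001)
#   WMD = -2.11 * PST  + 18.4   (r = 0.54, p < 0.001)
# WMD: wall-motion difference signal; NWTR: normalized wall-thickening rate;
# PST: percent systolic thickening.

def wmd_from_nwtr(nwtr):
    """Fitted WMD as a function of normalized wall-thickening rate."""
    return -2.46 * nwtr + 13.9

def wmd_from_pst(pst):
    """Fitted WMD as a function of percent systolic thickening."""
    return -2.11 * pst + 18.4

# The negative slopes encode the physiology: as thickening (function)
# falls under ischemia, the wall-motion difference signal grows.
print(wmd_from_nwtr(0.0), wmd_from_pst(0.0))
```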

  16. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track, and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.

  17. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures and at the times of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log cfu/g), and a pert distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 - exp(-7.64 × 10⁻⁸ × N), where N = dose] was deemed appropriate for hazard characterization. The mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean probability of illness per person per day was higher for processed cheese (mean: 2.24 × 10⁻⁹; maximum: 7.97 × 10⁻⁶) than for natural cheese (mean: 7.84 × 10⁻¹⁰; maximum: 2.32 × 10⁻⁶). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese.
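The hazard-characterization step can be sketched in a few lines. The rate constant r = 7.64 × 10⁻⁸ is taken from the abstract, and the formula follows the standard exponential dose-response form P = 1 - exp(-rN); the example dose is invented for illustration.

```python
import math

# Exponential dose-response model for S. aureus (rate constant from the
# abstract): P = 1 - exp(-r * N), where N is the ingested dose in cfu.
R = 7.64e-8

def p_illness(dose_cfu, r=R):
    """Probability of illness for a single exposure of dose_cfu organisms."""
    return 1.0 - math.exp(-r * dose_cfu)

# At small doses the response is near-linear (P ~ r * N), which is why
# the very low contamination levels found translate into tiny daily risks.
print(p_illness(1e6))  # hypothetical dose of one million cfu
```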

  18. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of residual adhesive after bracket debonding is frequently assessed in a qualitative manner using the adhesive remnant index (ARI). This study investigated whether quantitative assessment of the adhesive remnant yields more precise results than qualitative methods based on the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on the bracket bases were performed consecutively using (1) qualitative assessment (visual scoring) and (2) quantitative measurement (image analysis) on digital photographs. Image analysis was performed on scanning electron micrographs (SEM) and on high-precision elemental maps of the adhesive remnant determined by energy-dispersive X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using the Friedman test followed by the Wilcoxon signed-rank test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was consistent with the SEM assessments but indicated significantly less adhesive remnant than quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded results similar to those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI can generate results similar to those obtained by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield results comparable to both SEM analysis and elemental mapping.
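The inter-examiner agreement statistic used above can be illustrated with a minimal sketch. The abstract does not state which ICC variant was computed, so the one-way random-effects form ICC(1,1) is an assumption here, and the example scores are invented.

```python
# Illustrative one-way random-effects intraclass correlation, ICC(1,1):
#   ICC = (MSB - MSW) / (MSB + (k - 1) * MSW)
# where MSB/MSW are the between- and within-subject mean squares.
# Which ICC variant the study used is not stated; this is an assumption.

def icc_oneway(ratings):
    """ratings: one list per subject (bracket), one score per rater."""
    n = len(ratings)     # subjects (debonded brackets)
    k = len(ratings[0])  # raters (examiners)
    grand = sum(sum(row) for row in ratings) / (n * k)
    row_means = [sum(row) / k for row in ratings]
    msb = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(ratings, row_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical ARI scores from two examiners with perfect agreement:
scores = [[3, 3], [1, 1], [4, 4], [2, 2], [0, 0]]
print(icc_oneway(scores))  # perfect agreement gives ICC = 1.0
```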

  19. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography

    NASA Astrophysics Data System (ADS)

    Mobberley, Sean David

    Accurate, cross-scanner assessment of in-vivo air density, used to quantitatively assess the amount and distribution of emphysema in COPD subjects, has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners employing dedicated scatter correction techniques, it is of interest to evaluate how quantitative measures of lung density compare between dual-source and single-source scan modes. This study sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography, where the nature of the technology has required adjustments to scatter correction. Anesthetized ovine (N = 6) and swine (N = 13; the latter with a more human-like rib cage shape) subjects, a lung phantom, and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water, and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin-filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near-zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU
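The air-calibration check described above can be sketched as follows. This is an illustration, not the study's pipeline: the voxel values are invented, and the function simply reports how far a tracheal-air region's median sits from the ideal -1000 HU.

```python
import statistics

# Illustrative air-calibration check: given HU samples from a segmented
# tracheal-air region of interest, report the median and its offset from
# the ideal -1000 HU assumed by quantitative lung-density measures.

def air_calibration_offset(hu_samples):
    """Return (median HU, deviation from -1000 HU) for an air ROI."""
    med = statistics.median(hu_samples)
    return med, med - (-1000.0)

hu = [-995, -990, -1002, -987, -993]  # hypothetical ROI voxel values
med, offset = air_calibration_offset(hu)
print(med, offset)  # a positive offset means air reads "denser" than ideal
```

A scan mode whose scatter correction works well should drive this offset toward zero, which is the sense in which the DSDE-mode medians were "consistently closer to -1000 HU".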