Science.gov

Sample records for accurate quantitative assessment

  1. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341
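
    The abstract calls for abundance statistics that estimate a meaningful parameter of the community. As a minimal illustration of one such statistic, the sketch below computes gene-length-normalized relative abundance from read counts; the function and all input values are hypothetical, not from the paper.

    ```python
    # Sketch: length-normalized relative abundance for one metagenomic sample.
    # `counts` maps gene families to mapped-read counts; `lengths` gives gene
    # lengths in bp. All names and values are illustrative.

    def relative_abundance(counts: dict, lengths: dict) -> dict:
        """Normalize read counts by gene length, then scale to sum to 1."""
        coverage = {g: counts[g] / lengths[g] for g in counts}
        total = sum(coverage.values())
        return {g: c / total for g, c in coverage.items()}

    sample = relative_abundance(
        counts={"geneA": 900, "geneB": 300},
        lengths={"geneA": 3000, "geneB": 1000},
    )
    print(sample)  # equal coverage -> {'geneA': 0.5, 'geneB': 0.5}
    ```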

  2. Groundtruth approach to accurate quantitation of fluorescence microarrays

    SciTech Connect

    Mascio-Kegelmeyer, L; Tomascik-Cheeseman, L; Burnett, M S; van Hummelen, P; Wyrobek, A J

    2000-12-01

    To more accurately measure fluorescent signals from microarrays, we calibrated our acquisition and analysis systems by using ground-truth samples composed of known quantities of red and green gene-specific DNA probes hybridized to cDNA targets. We imaged the slides with a full-field, white-light CCD imager and analyzed them with our custom analysis software. Here we compare, for multiple genes, results obtained with and without preprocessing (alignment, color crosstalk compensation, dark-field subtraction, and integration time). We also evaluate the accuracy of various image processing and analysis techniques (background subtraction, segmentation, quantitation and normalization). This methodology calibrates and validates our system for accurate quantitative measurement of microarrays. Specifically, we show that preprocessing the images produces results significantly closer to the known ground truth for these samples.
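
    One preprocessing step named above is color-crosstalk compensation. A minimal sketch of that step, assuming a 2×2 crosstalk matrix calibrated from single-dye control spots; the matrix and pixel values here are invented for illustration.

    ```python
    import numpy as np

    # Sketch of color-crosstalk compensation for two-channel microarray pixels.
    # In practice the crosstalk matrix is calibrated from single-dye control
    # spots (ground-truth samples, as in the paper); these values are made up.
    crosstalk = np.array([[1.00, 0.08],   # measured_red   = red + 0.08 * green
                          [0.05, 1.00]])  # measured_green = 0.05 * red + green

    measured = np.array([1200.0, 800.0])            # (red, green) intensities
    true_rg = np.linalg.solve(crosstalk, measured)  # invert the mixing
    print(true_rg)
    ```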

  3. Quantitative proteomic analysis by accurate mass retention time pairs.

    PubMed

    Silva, Jeffrey C; Denny, Richard; Dorschel, Craig A; Gorenstein, Marc; Kass, Ignatius J; Li, Guo-Zhong; McKenna, Therese; Nold, Michael J; Richardson, Keith; Young, Phillip; Geromanos, Scott

    2005-04-01

    Current methodologies for protein quantitation include 2-dimensional gel electrophoresis techniques, metabolic labeling, and stable isotope labeling methods, to name only a few. The current literature illustrates both pros and cons for each of these methodologies. In keeping with the teachings of William of Ockham ("with all things being equal, the simplest solution tends to be correct"), a simple LC/MS-based methodology is presented that allows relative changes in abundance of proteins in highly complex mixtures to be determined. Utilizing a reproducible chromatographic separations system along with the high mass resolution and mass accuracy of an orthogonal time-of-flight mass spectrometer, the quantitative comparison of tens of thousands of ions emanating from identically prepared control and experimental samples can be made. Using this configuration, we can determine the change in relative abundance of a small number of ions between the two conditions solely by accurate mass and retention time. Employing standard operating procedures for both sample preparation and ESI-mass spectrometry, one typically obtains better than 5 ppm mass precision and quantitative variations between 10 and 15%. The principal focus of this paper is to demonstrate the quantitative aspects of the methodology, followed by a discussion of the associated, complementary qualitative capabilities.
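
    Pairing ions across runs "solely by accurate mass and retention time" reduces to a tolerance test. A sketch follows; the 5 ppm tolerance comes from the abstract, while the 0.2 min retention-time window and the input values are assumptions.

    ```python
    # Sketch: pair ions from control and experimental runs by accurate mass
    # and retention time. The 5 ppm figure is from the abstract; the 0.2 min
    # retention-time window and the sample ions are illustrative assumptions.

    def match(control, experiment, ppm_tol=5.0, rt_tol=0.2):
        pairs = []
        for mz_c, rt_c, inten_c in control:
            for mz_e, rt_e, inten_e in experiment:
                ppm = abs(mz_e - mz_c) / mz_c * 1e6
                if ppm <= ppm_tol and abs(rt_e - rt_c) <= rt_tol:
                    pairs.append((mz_c, rt_c, inten_e / inten_c))  # fold change
        return pairs

    control = [(785.8421, 34.12, 1.0e5)]     # (m/z, RT in min, intensity)
    experiment = [(785.8418, 34.18, 1.5e5)]
    print(match(control, experiment))        # ~0.4 ppm apart -> ratio 1.5
    ```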

  4. Quantitative microbiological risk assessment.

    PubMed

    Hoornstra, E; Notermans, S

    2001-05-21

    The production of safe food is increasingly based on the use of risk analysis, and this process is now in use to establish national and international food safety objectives. It is also being used more frequently to guarantee that safety objectives are met and that such guarantees are achieved in a cost-effective manner. One part of the overall risk analysis procedure, risk assessment, is the scientific process in which the hazards and risk factors are identified and the risk estimate or risk profile is determined. Risk assessment is an especially important tool for governments when food safety objectives have to be developed for 'new' contaminants in known products or for known contaminants causing trouble in 'new' products. Risk assessment is also an important approach for food companies (i) during product development, (ii) during (hygienic) process optimization, and (iii) as an extension (validation) of the more qualitative HACCP plan. This paper discusses these two different types of risk assessment and uses probability distribution functions to assess the risks posed by Escherichia coli O157:H7 in each case. Such approaches are essential elements of risk management, as they draw on all available information to derive accurate and realistic estimates of the risk posed. The paper also discusses the potential of scenario analysis in simulating the impact of different or modified risk factors during the consideration of new or improved control measures.
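
    A Monte Carlo sketch of the kind of probabilistic exposure and dose-response calculation the paper describes, using a beta-Poisson dose-response model; every parameter value below is an illustrative placeholder, not taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Illustrative exposure model: lognormal concentration (CFU/g) times a
    # normally distributed serving size (g). Parameters are assumptions.
    conc = rng.lognormal(mean=-2.0, sigma=1.0, size=n)        # CFU per gram
    serving = rng.normal(loc=50.0, scale=10.0, size=n).clip(min=0)
    dose = conc * serving

    # Beta-Poisson dose-response; alpha and beta are placeholder values.
    alpha, beta = 0.16, 48.0
    p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)

    print(f"mean risk per serving: {p_ill.mean():.4f}")
    ```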

  5. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  6. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare but devastating disease associated with high morbidity and mortality. It is characterized by systemic medial calcification of the arteries, yielding necrotic skin ulcerations. In this paper, we aim to support the installation of multi-center registries for calciphylaxis, which include photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color card that is placed into the field of view, segmented from the image, and analyzed; in total, 24 color fields are printed on the card. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with regard to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced; the coefficients of variation are 5-20% for geometry and 2-10% for color. Hence, quantitative assessment of calciphylaxis becomes practicable and will support a better understanding of this rare but fatal disease.
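
    The color normalization described above amounts to a least-squares fit of an affine transform from the card's measured patch colors to its known reference colors. A sketch, with simulated values standing in for real measurements.

    ```python
    import numpy as np

    # Sketch of the least-squares affine color transform. `reference` are the
    # card's 24 known RGB values; `measured` are the values segmented from the
    # photograph. Both are simulated here for illustration.
    rng = np.random.default_rng(1)
    reference = rng.uniform(0, 255, size=(24, 3))
    measured = reference @ np.array([[0.90, 0.05, 0.00],
                                     [0.00, 1.10, 0.02],
                                     [0.03, 0.00, 0.95]]) + 4.0  # simulated cast

    # Augment with a column of ones so the fit includes an offset (affine).
    X = np.hstack([measured, np.ones((24, 1))])
    A, *_ = np.linalg.lstsq(X, reference, rcond=None)  # (4, 3) transform

    corrected = X @ A
    print(np.abs(corrected - reference).max())  # ~0 for this noiseless example
    ```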

  7. Quantitative risk assessment system (QRAS)

    NASA Technical Reports Server (NTRS)

    Weinstock, Robert M (Inventor); Smidts, Carol S (Inventor); Mosleh, Ali (Inventor); Chang, Yung-Hsien (Inventor); Swaminathan, Sankaran (Inventor); Groen, Francisco J (Inventor); Tan, Zhibin (Inventor)

    2001-01-01

    A quantitative risk assessment system (QRAS) builds a risk model of a system for which risk of failure is being assessed, then analyzes the risk of the system corresponding to the risk model. The QRAS performs sensitivity analysis of the risk model by altering fundamental components and quantifications built into the risk model, then re-analyzes the risk of the system using the modifications. More particularly, the risk model is built by building a hierarchy, creating a mission timeline, quantifying failure modes, and building/editing event sequence diagrams. Multiplicities, dependencies, and redundancies of the system are included in the risk model. For analysis runs, a fixed baseline is first constructed and stored. This baseline contains the lowest-level scenarios, preserved in event-tree structure. The analysis runs, at any level of the hierarchy and below, access this baseline for quantitative risk computation as well as ranking of particular risks. A standalone Tool Box capability exists, allowing the user to store application programs within QRAS.
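
    The baseline's event-tree scenarios are quantified as products of branch probabilities. A toy sketch of that computation; the tree and the numbers are illustrative, not a QRAS model.

    ```python
    # Sketch: quantify event-tree scenarios as products of branch
    # probabilities. The two pivotal events below are invented examples.
    from itertools import product

    branches = {                     # P(failure) at each pivotal event
        "valve_fails": 0.01,
        "backup_fails": 0.05,
    }

    scenarios = {}
    for outcome in product([True, False], repeat=len(branches)):
        p = 1.0
        for (name, pf), failed in zip(branches.items(), outcome):
            p *= pf if failed else (1.0 - pf)
        scenarios[outcome] = p

    # End-state risk: both the valve and its backup fail.
    print(scenarios[(True, True)])   # 0.01 * 0.05 = 5e-4
    ```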

  8. Quantitative assessment of growth plate activity

    SciTech Connect

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

    In the immature skeleton the physis, or growth plate, is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone-seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer-acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique is able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through the quantitative analysis of serial studies.

  9. Foresight begins with FMEA. Delivering accurate risk assessments.

    PubMed

    Passey, R D

    1999-03-01

    If sufficient factors are taken into account and two- or three-stage analysis is employed, failure mode and effect analysis represents an excellent technique for delivering accurate risk assessments for products and processes, and for relating them to legal liability. This article describes a format that facilitates easy interpretation.

  10. SILAC-Based Quantitative Strategies for Accurate Histone Posttranslational Modification Profiling Across Multiple Biological Samples.

    PubMed

    Cuomo, Alessandro; Soldi, Monica; Bonaldi, Tiziana

    2017-01-01

    Histone posttranslational modifications (hPTMs) play a key role in regulating chromatin dynamics and fine-tuning DNA-based processes. Mass spectrometry (MS) has emerged as a versatile technology for the analysis of histones, contributing to the dissection of hPTMs, with special strength in the identification of novel marks and in the assessment of modification cross talks. Stable isotope labeling by amino acids in cell culture (SILAC), when adapted to histones, permits the accurate quantification of PTM changes among distinct functional states; however, its application has been mainly confined to actively dividing cell lines. A spike-in strategy based on SILAC can be used to overcome this limitation and profile hPTMs across multiple samples. We describe here the adaptation of SILAC to the analysis of histones, in both standard and spike-in setups. We also illustrate its coupling to an implemented "shotgun" workflow, by which heavy arginine-labeled histone peptides, produced upon Arg-C digestion, are qualitatively and quantitatively analyzed in an LC-MS/MS system that combines ultrahigh-pressure liquid chromatography (UHPLC) with a new-generation high-resolution Orbitrap instrument.

  11. FANSe: an accurate algorithm for quantitative mapping of large scale sequencing reads

    PubMed Central

    Zhang, Gong; Fedyunin, Ivan; Kirchner, Sebastian; Xiao, Chuanle; Valleriani, Angelo; Ignatova, Zoya

    2012-01-01

    The most crucial step in data processing from high-throughput sequencing applications is the accurate and sensitive alignment of the sequencing reads to reference genomes or transcriptomes. The accurate detection of insertions and deletions (indels) and of errors introduced by the sequencing platform or by misreading of modified nucleotides is essential for the quantitative processing of RNA-based sequencing (RNA-Seq) datasets and for the identification of genetic variations and modification patterns. We developed a new, fast and accurate algorithm for nucleic acid sequence analysis, FANSe, with adjustable mismatch allowance settings and the ability to handle indels, to accurately and quantitatively map millions of reads to small or large reference genomes. It is a seed-based algorithm that uses the whole-read information for mapping; high sensitivity and low ambiguity are achieved by using short, non-overlapping seeds. Furthermore, FANSe uses a hotspot score to prioritize the processing of highly probable matches and implements a modified Smith–Waterman refinement with a reduced scoring matrix to accelerate the calculation without compromising its sensitivity. The FANSe algorithm stably processes datasets from various sequencing platforms, masked or unmasked, and small or large genomes. It shows a remarkable coverage of low-abundance mRNAs, which is important for the quantitative processing of RNA-Seq datasets. PMID:22379138
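
    A toy sketch of seed-based mapping with non-overlapping seeds and whole-read verification under a mismatch allowance, in the spirit of the description above; this is an illustration, not the FANSe implementation.

    ```python
    # Toy sketch of seed-based read mapping: index k-mers of the genome, use
    # non-overlapping seeds of the read to find candidates, then verify the
    # whole read against a mismatch allowance. Not the actual FANSe code.

    def build_index(genome: str, k: int) -> dict:
        index = {}
        for i in range(len(genome) - k + 1):
            index.setdefault(genome[i:i + k], []).append(i)
        return index

    def map_read(read: str, genome: str, index: dict, k: int, max_mm: int):
        hits = set()
        for s in range(0, len(read) - k + 1, k):        # non-overlapping seeds
            for pos in index.get(read[s:s + k], []):
                start = pos - s
                if 0 <= start <= len(genome) - len(read):
                    window = genome[start:start + len(read)]
                    mm = sum(a != b for a, b in zip(read, window))
                    if mm <= max_mm:                     # whole-read check
                        hits.add((start, mm))
        return sorted(hits)

    genome = "ACGTACGTTAGCCGATACGGA"
    print(map_read("TAGCCGAT", genome, build_index(genome, 4), k=4, max_mm=1))
    ```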

  12. Integrated Environmental Modeling: Quantitative Microbial Risk Assessment

    EPA Science Inventory

    The presentation discusses the need for microbial assessments and presents a road map associated with quantitative microbial risk assessments, through an integrated environmental modeling approach. A brief introduction and the strengths of the current knowledge are illustrated. W...

  13. Quantitative assessment of scientific quality

    NASA Astrophysics Data System (ADS)

    Heinzl, Harald; Bloching, Philipp

    2012-09-01

    Scientific publications, authors, and journals are commonly evaluated with quantitative bibliometric measures. Frequently used measures will be reviewed and their strengths and weaknesses will be highlighted. Reflections about conditions for a new, research paper-specific measure will be presented.

  14. Environmental probabilistic quantitative assessment methodologies

    USGS Publications Warehouse

    Crovelli, R.A.

    1995-01-01

    In this paper, four petroleum resource assessment methodologies are presented as possible pollution assessment methodologies, even though petroleum as a resource is desirable, whereas pollution is undesirable. A methodology is defined in this paper to consist of a probability model and a probabilistic method, where the method is used to solve the model. The following four basic types of probability models are considered: 1) direct assessment, 2) accumulation size, 3) volumetric yield, and 4) reservoir engineering. Three of the four petroleum resource assessment methodologies were written as microcomputer systems, viz. TRIAGG for direct assessment, APRAS for accumulation size, and FASPU for reservoir engineering. A fourth microcomputer system termed PROBDIST supports the three assessment systems. The three assessment systems have different probability models but the same type of probabilistic method. The advantages of the analytic method are computational speed and flexibility, making it ideal for a microcomputer.

  15. Risk Assessment: A Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Baert, K.; Francois, K.; de Meulenaer, B.; Devlieghere, F.

    A risk can be defined as a function of the probability of an adverse health effect and the severity of that effect, consequential to a hazard in food (Codex Alimentarius, 1999). During a risk assessment, an estimate of the risk is obtained. The goal is to estimate the likelihood and the extent of adverse effects occurring to humans due to possible exposure(s) to hazards. Risk assessment is a scientifically based process consisting of the following steps: (1) hazard identification, (2) hazard characterization, (3) exposure assessment and (4) risk characterization (Codex Alimentarius, 1999).

  16. QUANTITATIVE PROCEDURES FOR NEUROTOXICOLOGY RISK ASSESSMENT

    EPA Science Inventory

    In this project, previously published information on biologically based dose-response model for brain development was used to quantitatively evaluate critical neurodevelopmental processes, and to assess potential chemical impacts on early brain development. This model has been ex...

  17. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and in the early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. To date, the biological auto-fluorescence background upon UV-Vis excitation and the severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as the energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may lead to relatively large uncertainty in the results. Owing to the efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing has been achieved, featuring a ratiometric response of 3.56 per pHi unit over the range 3.0–7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
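
    The probe reads pHi directly from the ratio of the two upconversion bands against a calibration curve. A sketch of that readout; the calibration points are invented for illustration, not taken from the paper.

    ```python
    import numpy as np

    # Sketch: self-ratiometric readout. I475 is the pH-sensitive band (FRET
    # to FITC) and I645 the reference band. The calibration curve below is
    # an invented placeholder, not data from the paper.
    cal_pH = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
    cal_ratio = np.array([0.35, 0.62, 1.05, 1.70, 2.45])  # I475 / I645

    def pH_from_ratio(i475: float, i645: float) -> float:
        """Interpolate pH from the band ratio using the calibration curve."""
        return float(np.interp(i475 / i645, cal_ratio, cal_pH))

    print(pH_from_ratio(1200.0, 1000.0))  # ratio 1.2 -> pH between 5 and 6
    ```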

  18. A fluorescence-based quantitative real-time PCR assay for accurate Pocillopora damicornis species identification

    NASA Astrophysics Data System (ADS)

    Thomas, Luke; Stat, Michael; Evans, Richard D.; Kennington, W. Jason

    2016-09-01

    Pocillopora damicornis is one of the most extensively studied coral species globally, but high levels of phenotypic plasticity within the genus make species identification based on morphology alone unreliable. As a result, there is a compelling need to develop cheap and time-effective molecular techniques capable of accurately distinguishing P. damicornis from other congeneric species. Here, we develop a fluorescence-based quantitative real-time PCR (qPCR) assay to genotype a single nucleotide polymorphism that accurately distinguishes P. damicornis from other morphologically similar Pocillopora species. We trial the assay across colonies representing multiple Pocillopora species and then apply the assay to screen samples of Pocillopora spp. collected at regional scales along the coastline of Western Australia. This assay offers a cheap and time-effective alternative to Sanger sequencing and has broad applications including studies on gene flow, dispersal, recruitment and physiological thresholds of P. damicornis.

  19. Quantitative Assessment of Fluorescent Proteins

    PubMed Central

    Cranfill, Paula J.; Sell, Brittney R.; Baird, Michelle A.; Allen, John R.; Lavagnino, Zeno; de Gruiter, H. Martijn; Kremers, Gert-Jan; Davidson, Michael W.; Ustione, Alessandro; Piston, David W.

    2016-01-01

    The advent of fluorescent proteins (FPs) for genetic labeling of molecules and cells has revolutionized fluorescence microscopy. Genetic manipulations have created a vast array of bright and stable FPs spanning the blue to red spectral regions. Common to autofluorescent FPs is their tight β-barrel structure, which provides the rigidity and chemical environment needed for effectual fluorescence. Despite the common structure, each FP has its own unique photophysical properties. Thus, there is no single “best” fluorescent protein for every circumstance, and each FP has advantages and disadvantages. To guide decisions about which FP is right for any given application, we have quantitatively characterized over 40 different FPs for their brightness, photostability, pH stability, and monomeric properties, which permits easy apples-to-apples comparisons between these FPs. We report the values for all of the FPs measured, but focus the discussion on the more popular and/or best-performing FPs in each spectral region. PMID:27240257

  20. Quantitative proteomics using the high resolution accurate mass capabilities of the quadrupole-orbitrap mass spectrometer.

    PubMed

    Gallien, Sebastien; Domon, Bruno

    2014-08-01

    High resolution/accurate mass hybrid mass spectrometers have considerably advanced shotgun proteomics, and the recent introduction of fast sequencing capabilities has expanded their use in targeted approaches. More specifically, the quadrupole-orbitrap instrument has a unique configuration and its new features enable a wide range of experiments. An overview of the analytical capabilities of this instrument is presented, with a focus on its application to quantitative analyses. The high resolution, the trapping capability and the versatility of the instrument have allowed quantitative proteomic workflows to be redefined and new data acquisition schemes to be developed. The initial proteomic applications have shown an improvement in analytical performance. However, as quantification relies on ion trapping instead of a continuous ion beam, further refinement of the technique can be expected.

  21. CT Scan Method Accurately Assesses Humeral Head Retroversion

    PubMed Central

    Boileau, P.; Mazzoleni, N.; Walch, G.; Urien, J. P.

    2008-01-01

    Humeral head retroversion is not well described, and the literature is controversial regarding the accuracy of measurement methods and the ranges of normal values. We therefore determined normal humeral head retroversion and assessed the measurement methods. We measured retroversion in 65 cadaveric humeri, including 52 paired specimens, using four methods: radiographic, computed tomography (CT) scan, computer-assisted, and direct methods. We also assessed the distance between the humeral head central axis and the bicipital groove. CT scan methods accurately measure humeral head retroversion, while radiographic methods do not. The retroversion was 17.9° with respect to the transepicondylar axis and 21.5° with respect to the trochlear tangent axis. The difference between the right and left humeri was 8.9°. The distance between the central axis of the humeral head and the bicipital groove was 7.0 mm and was consistent between right and left humeri. Humeral head retroversion may be most accurately obtained using the patient’s own anatomic landmarks or, if these are not identifiable, retroversion as measured by those landmarks on the contralateral side or by the bicipital groove. PMID:18264854

  22. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization-sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging, used to improve the signal-to-noise ratio (SNR) and the contrast of the phase retardation (or birefringence) images, introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for a quantitative study. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved the image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome. So the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we had developed a maximum a posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator which takes into account the stochastic property of SNR. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of birefringence and SNR. The PDF was pre-computed by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT before the measurement. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and
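
    The estimator described above selects the true retardation that maximizes a precomputed PDF given the measured retardation and SNR. A sketch of that lookup over a hypothetical table, filled with random values purely so the code runs.

    ```python
    import numpy as np

    # Sketch of a MAP lookup in the spirit of the estimator described above:
    # pdf_table[i, j, k] ~ P(measured_ret = r_j, measured_snr = s_k | true_ret = t_i),
    # precomputed by Monte-Carlo simulation of the noise model. The table
    # here is random filler so the example is self-contained.
    true_ret = np.linspace(0.0, np.pi, 64)
    meas_ret_grid = np.linspace(0.0, np.pi, 64)
    snr_grid = np.linspace(5.0, 40.0, 32)
    pdf_table = np.random.default_rng(0).random((64, 64, 32))

    def map_estimate(measured_ret: float, measured_snr: float) -> float:
        j = np.abs(meas_ret_grid - measured_ret).argmin()
        k = np.abs(snr_grid - measured_snr).argmin()
        return true_ret[pdf_table[:, j, k].argmax()]  # argmax over true values

    print(map_estimate(0.8, 12.0))
    ```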

  23. The accurate assessment of small-angle X-ray scattering data

    SciTech Connect

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; Matsui, Tsutomu; Weiss, Thomas M.; Martel, Anne; Snell, Edward H.

    2015-01-01

    A set of quantitative techniques is suggested for assessing SAXS data quality. These are applied in the form of a script, SAXStats, to a test set of 27 proteins, showing that these techniques are more sensitive than manual assessment of data quality. Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics to quantitatively describe SAXS data in an objective manner using statistical evaluations are defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. The studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.
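
    One check such metrics enable is a statistical test for radiation damage: successive exposures of the same sample should agree within counting error. A sketch using a reduced chi-square comparison, a generic test and not necessarily the SAXStats implementation.

    ```python
    import numpy as np

    # Sketch: compare two successive SAXS exposures of the same sample with
    # a reduced chi-square statistic; values near 1 suggest the frames agree
    # within counting error (no damage). The data below are synthetic.
    def reduced_chi2(i1, i2, sigma1, sigma2):
        chi2 = np.sum((i1 - i2) ** 2 / (sigma1 ** 2 + sigma2 ** 2))
        return chi2 / len(i1)

    rng = np.random.default_rng(2)
    q = np.linspace(0.01, 0.3, 200)
    profile = np.exp(-(q * 25.0) ** 2 / 3.0)        # Guinier-like curve
    sigma = 0.01 * np.ones_like(q)
    frame1 = profile + rng.normal(0, 0.01, q.size)
    frame2 = profile + rng.normal(0, 0.01, q.size)

    print(reduced_chi2(frame1, frame2, sigma, sigma))  # ~1.0
    ```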

  24. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    Visual soil assessment (VSA) is a method to assess soil quality visually, when standing in the field. VSA is increasingly used by farmers, farm organisations and companies, because it is rapid and cost-effective, and because looking at soil provides understanding about soil functioning. Often VSA is regarded as subjective, so there is a need to verify VSA. Also, many VSAs have not been fine-tuned for contrasting soil types. This could lead to wrong interpretation of soil quality and soil functioning when contrasting sites are compared to each other. We wanted to assess the accuracy of VSA, while taking into account soil type. The first objective was to test whether quantitative visual field observations, which form the basis of many VSAs, could be validated with standardized field or laboratory measurements. The second objective was to assess whether quantitative visual field observations are reproducible when used by observers with contrasting backgrounds. For the validation study, we made quantitative visual observations at 26 cattle farms. Farms were located on sand, clay and peat soils in the North Friesian Woodlands, the Netherlands. The quantitative visual observations evaluated were grass cover, number of biopores, number of roots, soil colour, soil structure, number of earthworms, number of gley mottles and soil compaction. Linear regression analysis showed that four out of eight quantitative visual observations could be well validated with standardized field or laboratory measurements. The following quantitative visual observations correlated well with standardized field or laboratory measurements: grass cover with classified images of surface cover; number of roots with root dry weight; amount of large structure elements with mean weight diameter; and soil colour with soil organic matter content. Correlation coefficients were greater than 0.3, and half of the correlations were significant. For the reproducibility study, a group of 9 soil scientists and 7

  25. Economic Value Of Accurate Assessments Of Hydrological Uncertainty

    NASA Astrophysics Data System (ADS)

    Ajami, N. K.; Sunding, D. L.; Hornberger, G. M.

    2008-12-01

    The improvement of techniques to assist in the sustainable management of water resource systems is a crucial issue since our limited resources are under ever increasing pressure. A proper understanding of the sources and effects of uncertainty is needed to achieve goals related to improvements in reliability and sustainability in water resource management and planning. To date, many hydrological techniques have been developed to improve the quality and accuracy of hydrological forecasts and to assess the uncertainty associated with these forecasts. The economic value of improvements in calculations of uncertainty associated with hydrological forecasts from the water supply and demand management perspective remains largely unknown. We first explore the effect of more accurate assessments of hydrological uncertainty on the management of water resources by using an integrated approach to identify and quantify the sources of uncertainty. Subsequently, we analyze the value of a more reliable water supply forecast by studying the change in moments of the distribution of final surface water deliveries. This allows us to calculate the economic value of improving the information about uncertainty provided to stakeholders, especially during drought spells.

  26. Optimization of sample preparation for accurate results in quantitative NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Yamazaki, Taichi; Nakamura, Satoe; Saito, Takeshi

    2017-04-01

    Quantitative nuclear magnetic resonance (qNMR) spectroscopy has received high marks as an excellent measurement tool that does not require the same reference standard as the analyte. Measurement parameters have been discussed in detail, and high-resolution balances have been used for sample preparation. However, high-resolution balances, such as an ultra-microbalance, are not general-purpose analytical tools, and many analysts may find them difficult to use, thereby hindering accurate sample preparation for qNMR measurement. In this study, we examined the relationship between the resolution of the balance and the amount of sample weighed during sample preparation. We were able to confirm the accuracy of the assay results for samples weighed on a high-resolution balance, such as the ultra-microbalance. Furthermore, when an appropriate tare and amount of sample were weighed on a given balance, accurate assay results were obtained with another high-resolution balance. Although this is a fundamental result, it offers important evidence that would enhance the versatility of the qNMR method.

  27. Renal Cortical Lactate Dehydrogenase: A Useful, Accurate, Quantitative Marker of In Vivo Tubular Injury and Acute Renal Failure

    PubMed Central

    Zager, Richard A.; Johnson, Ali C. M.; Becker, Kirsten

    2013-01-01

    Studies of experimental acute kidney injury (AKI) are critically dependent on having precise methods for assessing the extent of tubular cell death. However, the most widely used techniques either provide indirect assessments (e.g., BUN, creatinine), suffer from the need for semi-quantitative grading (renal histology), or reflect the status of residual viable, not the number of lost, renal tubular cells (e.g., NGAL content). Lactate dehydrogenase (LDH) release is a highly reliable test for assessing degrees of in vitro cell death. However, its utility as an in vivo AKI marker has not been defined. Towards this end, CD-1 mice were subjected to graded renal ischemia (0, 15, 22, 30, 40, or 60 min) or to nephrotoxic (glycerol; maleate) AKI. Sham-operated mice, or mice with AKI in the absence of acute tubular necrosis (ureteral obstruction; endotoxemia), served as negative controls. Renal cortical LDH or NGAL levels were assayed 2 or 24 hrs later. Ischemic, glycerol-, and maleate-induced AKI were each associated with striking, steep, inverse correlations (r = −0.89) between renal injury severity and renal LDH content. With severe AKI, >65% LDH declines were observed. Corresponding prompt plasma and urinary LDH increases were observed. These observations, coupled with the maintenance of normal cortical LDH mRNA levels, indicated that renal LDH efflux, not decreased LDH synthesis, caused the falling cortical LDH levels. Renal LDH content was well maintained with sham surgery, ureteral obstruction or endotoxemic AKI. In contrast to LDH, renal cortical NGAL levels did not correlate with AKI severity. In sum, the above results indicate that renal cortical LDH assay is a highly accurate quantitative technique for gauging the extent of experimental acute ischemic and toxic renal injury. That it avoids the limitations of more traditional AKI markers implies great potential utility in experimental studies that require precise quantitation of tubule cell death. PMID:23825563

  28. Physiologic basis for understanding quantitative dehydration assessment.

    PubMed

    Cheuvront, Samuel N; Kenefick, Robert W; Charkoudian, Nisha; Sawka, Michael N

    2013-03-01

    Dehydration (body water deficit) is a physiologic state that can have profound implications for human health and performance. Unfortunately, dehydration can be difficult to assess, and there is no single, universal gold standard for decision making. In this article, we review the physiologic basis for understanding quantitative dehydration assessment. We highlight how phenomenologic interpretations of dehydration depend critically on the type (dehydration compared with volume depletion) and magnitude (moderate compared with severe) of dehydration, which in turn influence the osmotic (plasma osmolality) and blood volume-dependent compensatory thresholds for antidiuretic and thirst responses. In particular, we review new findings regarding the biological variation in osmotic responses to dehydration and discuss how this variation can help provide a quantitative and clinically relevant link between the physiology and phenomenology of dehydration. Practical measures with empirical thresholds are provided as a starting point for improving the practice of dehydration assessment.

  29. Quantitative methods in assessment of neurologic function.

    PubMed

    Potvin, A R; Tourtellotte, W W; Syndulko, K; Potvin, J

    1981-01-01

    Traditionally, neurologists have emphasized qualitative techniques for assessing results of clinical trials. However, in recent years qualitative evaluations have been increasingly augmented by quantitative tests for measuring neurologic functions pertaining to mental state, strength, steadiness, reactions, speed, coordination, sensation, fatigue, gait, station, and simulated activities of daily living. Quantitative tests have long been used by psychologists for evaluating asymptomatic function, assessing human information processing, and predicting proficiency in skilled tasks; however, their methodology has never been directly assessed for validity in a clinical environment. In this report, relevant contributions from the literature on asymptomatic human performance and that on clinical quantitative neurologic function are reviewed and assessed. While emphasis is focused on tests appropriate for evaluating clinical neurologic trials, evaluations of tests for reproducibility, reliability, validity, and examiner training procedures, and for effects of motivation, learning, handedness, age, and sex are also reported and interpreted. Examples of statistical strategies for data analysis, scoring systems, data reduction methods, and data display concepts are presented. Although investigative work still remains to be done, it appears that carefully selected and evaluated tests of sensory and motor function should be an essential factor for evaluating clinical trials in an objective manner.

  30. Accurate detection and quantitation of heteroplasmic mitochondrial point mutations by pyrosequencing.

    PubMed

    White, Helen E; Durston, Victoria J; Seller, Anneke; Fratter, Carl; Harvey, John F; Cross, Nicholas C P

    2005-01-01

    Disease-causing mutations in mitochondrial DNA (mtDNA) are typically heteroplasmic, and therefore the interpretation of genetic tests for mitochondrial disorders can be problematic. Detection of low-level heteroplasmy is technically demanding, and it is often difficult to discriminate between the absence of a mutation and the failure of a technique to detect the mutation in a particular tissue. The reliable measurement of heteroplasmy in different tissues may help identify individuals who are at risk of developing specific complications and allow improved prognostic advice for patients and family members. We have evaluated Pyrosequencing technology for the detection and estimation of heteroplasmy for six mitochondrial point mutations associated with the following diseases: Leber's hereditary optic neuropathy (LHON), G3460A, G11778A, and T14484C; mitochondrial encephalopathy with lactic acidosis and stroke-like episodes (MELAS), A3243G; myoclonus epilepsy with ragged red fibers (MERRF), A8344G; and neurogenic muscle weakness, ataxia, and retinitis pigmentosa (NARP)/Leigh syndrome: T8993G/C. Results obtained from the Pyrosequencing assays for 50 patients with presumptive mitochondrial disease were compared to those obtained using the commonly used diagnostic technique of polymerase chain reaction (PCR) and restriction enzyme digestion. The Pyrosequencing assays provided accurate genotyping and quantitative determination of mutational load with a sensitivity and specificity of 100%. The MELAS A3243G mutation was detected reliably at a level of 1% heteroplasmy. We conclude that Pyrosequencing is a rapid and robust method for detecting heteroplasmic mitochondrial point mutations.
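
    On a pyrosequencer, heteroplasmy is quantified from the allele-specific peak heights at the variant position. A minimal sketch; the peak values are illustrative, not from the study.

    ```python
    # Sketch: percent heteroplasmy from pyrosequencing peak heights at the
    # variant position. The peak values below are illustrative.
    def heteroplasmy_percent(mutant_peak: float, wildtype_peak: float) -> float:
        return 100.0 * mutant_peak / (mutant_peak + wildtype_peak)

    # e.g. MELAS A3243G: G (mutant) vs A (wild-type) signal at position 3243
    print(heteroplasmy_percent(mutant_peak=12.0, wildtype_peak=388.0))  # 3.0
    ```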

  31. How accurate are chronic wound assessments using interactive video technology?

    PubMed

    Gardner, S E; Frantz, R A; Specht, J K; Johnson-Mekota, J L; Buresh, K A; Wakefield, B; Flanagan, J

    2001-01-01

    This project examined the accuracy of chronic wound assessments made using an interactive video telecommunications system (Teledoc 5000, NEC America, Inc., Irving, TX) by comparing a nurse expert's in-person wound assessments with wound assessments made from taped Teledoc sessions. Wound assessments determined the absence or presence of nine wound characteristics instrumental in guiding treatment (i.e., tunneling, undermining, granulation tissue, necrotic tissue, epithelial tissue, purulent exudate, erythema, edema, and induration). A sample of 13 paired wound observations was analyzed. The accuracy of the Teledoc technology was examined by calculating the amount of agreement between the in-person assessments and the taped Teledoc assessments for each of the nine characteristics. Agreement for eight of the nine wound characteristics exceeded 75%, suggesting this telehealth medium does not alter wound assessment data, which are essential in guiding treatment decisions. In addition to connecting the remotely based nurse with nursing expertise to improve patient care, telehealth technology seemed to increase the remotely based nurses' knowledge of wound assessment and treatment as well.

  32. Internal Medicine Residents Do Not Accurately Assess Their Medical Knowledge

    ERIC Educational Resources Information Center

    Jones, Roger; Panda, Mukta; Desbiens, Norman

    2008-01-01

    Background: Medical knowledge is essential for appropriate patient care; however, the accuracy of internal medicine (IM) residents' assessment of their medical knowledge is unknown. Methods: IM residents predicted their overall percentile performance 1 week (on average) before and after taking the in-training exam (ITE), an objective and well…

  33. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis.

    PubMed

    Champain, Sabina; Mazel, Christian; Mitulescu, Anca; Skalli, Wafa

    2007-08-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantified radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon-Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantifying vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistics were performed to assess the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters revealed adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases, doubtful in 16%, and pseudarthrosis seemed to occur in 4% of cases (2). In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem to be appropriate and in agreement with the surgeon's qualitative grading in 87% of cases.

  34. Numerical assessment of accurate measurements of laminar flame speed

    NASA Astrophysics Data System (ADS)

    Goulier, Joules; Bizon, Katarzyna; Chaumeix, Nabiha; Meynet, Nicolas; Continillo, Gaetano

    2016-12-01

    In combustion, the laminar flame speed constitutes an important parameter that reflects the chemistry of oxidation for a given fuel, along with its transport and thermal properties. Laminar flame speeds are used (i) in turbulent combustion models in CFD codes and (ii) to validate detailed or reduced mechanisms, often derived from studies using ideal reactors and diluted conditions, as in jet-stirred reactors and shock tubes. End-users of such mechanisms need an assessment of their capability to predict the correct heat release by combustion in realistic conditions. In this view, the laminar flame speed constitutes a very convenient parameter, and it is therefore very important to have a good knowledge of the experimental errors involved in its determination. Stationary configurations (Bunsen burners, counter-flow flames, heat flux burners) or moving flames (tubes, spherical vessels, soap bubbles) can be used. The spherical expanding flame configuration has recently become popular, since it can be used at high pressures and temperatures. With this method, the flame speed is not measured directly, but derived through the recording of the flame radius. The method used to process the radius history will have an impact on the estimated flame speed. The aim of this work is to propose a way to derive the laminar flame speed from experimental recordings of expanding flames, and to assess the error magnitude.
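
    For a spherically expanding flame, the stretched speed is the time derivative of the radius, and the unstretched speed follows from extrapolation to zero stretch. A sketch using the classical linear stretch model on synthetic data; real radius histories come from schlieren or shadowgraph imaging.

    ```python
    import numpy as np

    # Sketch: flame speed from a spherically expanding flame radius history.
    # Linear stretch model: S_b = S_b0 - L_b * K, with stretch K = (2/R) dR/dt.
    # The radius history below is synthetic, purely for illustration.
    t = np.linspace(0.002, 0.010, 40)            # time, s
    R = 2.0 * (t * 1000) ** 1.05 / 1000          # radius, m (slightly nonlinear)

    S_b = np.gradient(R, t)                      # stretched burned-gas speed
    K = (2.0 / R) * S_b                          # stretch rate, 1/s

    # Extrapolate to K = 0 by fitting S_b = S_b0 - L_b * K.
    slope, S_b0 = np.polyfit(K, S_b, 1)
    print(f"S_b0 = {S_b0:.3f} m/s, Markstein length L_b = {-slope*1000:.2f} mm")

    # The laminar flame speed S_L is S_b0 divided by the burned/unburned
    # density ratio (not computed in this sketch).
    ```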

  35. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

    Steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and of the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using the feature values of colors, shapes and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.

  36. Quantitative Assessment of Abdominal Aortic Aneurysm Geometry

    PubMed Central

    Shum, Judy; Martufi, Giampaolo; Di Martino, Elena; Washington, Christopher B.; Grisafi, Joseph; Muluk, Satish C.; Finol, Ender A.

    2011-01-01

    Recent studies have shown that the maximum transverse diameter of an abdominal aortic aneurysm (AAA) and expansion rate are not entirely reliable indicators of rupture potential. We hypothesize that aneurysm morphology and wall thickness are more predictive of rupture risk and can be the deciding factors in the clinical management of the disease. A non-invasive, image-based evaluation of AAA shape was implemented on a retrospective study of 10 ruptured and 66 unruptured aneurysms. Three-dimensional models were generated from segmented, contrast-enhanced computed tomography images. Geometric indices and regional variations in wall thickness were estimated based on novel segmentation algorithms. A model was created using a J48 decision tree algorithm and its performance was assessed using ten-fold cross validation. Feature selection was performed using the χ2-test. The model correctly classified 65 datasets and had an average prediction accuracy of 86.6% (κ = 0.37). The highest ranked features were sac length, sac height, volume, surface area, maximum diameter, bulge height, and intra-luminal thrombus volume. Given that individual AAAs have complex shapes with local changes in surface curvature and wall thickness, the assessment of AAA rupture risk should be based on the accurate quantification of aneurysmal sac shape and size. PMID:20890661
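
    The classification step pairs a decision tree with ten-fold cross-validation. A sketch with scikit-learn; J48 is WEKA's C4.5 implementation, so sklearn's CART tree is used here as a stand-in, and the features are random placeholders for the geometric indices named above.

    ```python
    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Sketch: decision-tree classification of ruptured vs unruptured AAAs
    # with ten-fold cross-validation. The feature matrix is random filler
    # standing in for indices such as sac length, volume and bulge height.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(76, 7))            # 76 aneurysms, 7 geometric indices
    y = np.r_[np.ones(10), np.zeros(66)]    # 10 ruptured, 66 unruptured

    clf = DecisionTreeClassifier(random_state=0)
    scores = cross_val_score(clf, X, y, cv=10)
    print(f"10-fold CV accuracy: {scores.mean():.3f}")
    ```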

  2. Hydrogen quantitative risk assessment workshop proceedings.

    SciTech Connect

    Groth, Katrina M.; Harris, Aaron P.

    2013-09-01

    The Quantitative Risk Assessment (QRA) Toolkit Introduction Workshop was held at Energetics on June 11-12. The workshop was co-hosted by Sandia National Laboratories (Sandia) and HySafe, the International Association for Hydrogen Safety. The objective of the workshop was twofold: (1) Present a hydrogen-specific methodology and toolkit (currently under development) for conducting QRA to support the development of codes and standards and safety assessments of hydrogen-fueled vehicles and fueling stations, and (2) Obtain feedback on the needs of early-stage users (hydrogen as well as potential leveraging for Compressed Natural Gas [CNG], and Liquefied Natural Gas [LNG]) and set priorities for "Version 1" of the toolkit in the context of the commercial evolution of hydrogen fuel cell electric vehicles (FCEV). The workshop consisted of an introduction and three technical sessions: Risk Informed Development and Approach; CNG/LNG Applications; and Introduction of a Hydrogen Specific QRA Toolkit.

  3. A toolbox for rockfall Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Agliardi, F.; Mavrouli, O.; Schubert, M.; Corominas, J.; Crosta, G. B.; Faber, M. H.; Frattini, P.; Narasimhan, H.

    2012-04-01

    Rockfall Quantitative Risk Analysis for mitigation design and implementation requires evaluating the probability of rockfall events, the probability and intensity of impacts on structures (elements at risk and countermeasures), their vulnerability, and the related expected costs for different scenarios. A sound theoretical framework has been developed in recent years for both spatially distributed and local (i.e., single element at risk) analyses. Nevertheless, the practical application of existing methodologies remains challenging, due to difficulties in collecting the required data and to the lack of simple, dedicated analysis tools. In order to fill this gap, specific tools have been developed in the form of Excel spreadsheets within the framework of the SafeLand EU project. These tools can be used by stakeholders, practitioners and other interested parties for the quantitative calculation of rockfall risk through its key components (probabilities, vulnerability, loss), using combinations of deterministic and probabilistic approaches. Three tools have been developed, namely: QuRAR (by UNIMIB), VulBlock (by UPC), and RiskNow-Falling Rocks (by ETH Zurich). QuRAR implements a spatially distributed, quantitative assessment methodology of rockfall risk for individual buildings or structures in a multi-building context (urban area). Risk is calculated in terms of expected annual cost, through the evaluation of rockfall event probability, propagation and impact probability (by 3D numerical modelling of rockfall trajectories), and empirical vulnerability for different risk protection scenarios. VulBlock allows a detailed, analytical calculation of the vulnerability of reinforced concrete frame buildings to rockfalls and related fragility curves, both as functions of block velocity and block size. The calculated vulnerability can be integrated in other methodologies/procedures based on the risk equation, by incorporating the uncertainty of the impact location of the rock
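
    To fix ideas, the sketch below spells out the risk equation that such spreadsheet tools implement: expected annual cost as frequency, reach probability, vulnerability, and value multiplied together and summed over block-magnitude scenarios. All numbers are invented for illustration and do not come from QuRAR, VulBlock, or RiskNow.

```python
# Sketch: rockfall risk as expected annual cost, summed over magnitude
# scenarios. All numbers are invented for illustration.
scenarios = [
    # (annual frequency, P(block reaches element), vulnerability, value in EUR)
    (0.10, 0.05, 0.10, 500_000),   # small, relatively frequent blocks
    (0.01, 0.20, 0.60, 500_000),   # large, rare blocks
]

expected_annual_cost = sum(freq * p_reach * vuln * value
                           for freq, p_reach, vuln, value in scenarios)
print(f"expected annual cost: {expected_annual_cost:,.0f} EUR/yr")
```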

  4. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations

    NASA Astrophysics Data System (ADS)

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-01

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.

  5. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.

  6. How accurate is the Kubelka-Munk theory of diffuse reflection? A quantitative answer

    NASA Astrophysics Data System (ADS)

    Joseph, Richard I.; Thomas, Michael E.

    2012-10-01

    The (heuristic) Kubelka-Munk theory of diffuse reflectance and transmittance of a film on a substrate, which is widely used because it gives simple analytic results, is compared to the rigorous radiative transfer model of Chandrasekhar. The rigorous model must be solved numerically and is thus less intuitive. The Kubelka-Munk theory uses an absorption coefficient and a scatter coefficient as inputs, similar to the rigorous model of Chandrasekhar; the relationship between these two sets of coefficients is addressed. It is shown that the Kubelka-Munk theory is remarkably accurate if one uses the proper albedo parameter.
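
    For reference, the relations at the heart of the Kubelka-Munk theory can be stated compactly in their standard textbook form (general background, not notation taken from the paper itself): the remission function links the diffuse reflectance of an optically thick layer to the ratio of the K-M absorption and scattering coefficients.

```latex
% Kubelka-Munk remission function and the reflectance of an optically
% thick layer; K and S are the K-M absorption and scattering coefficients.
F(R_\infty) = \frac{(1 - R_\infty)^2}{2 R_\infty} = \frac{K}{S},
\qquad
R_\infty = 1 + \frac{K}{S} - \sqrt{\frac{K}{S}\left(\frac{K}{S} + 2\right)}.
```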

  7. Highly sensitive capillary electrophoresis-mass spectrometry for rapid screening and accurate quantitation of drugs of abuse in urine.

    PubMed

    Kohler, Isabelle; Schappler, Julie; Rudaz, Serge

    2013-05-30

    The combination of capillary electrophoresis (CE) and mass spectrometry (MS) is particularly well adapted to bioanalysis due to its high separation efficiency, selectivity, and sensitivity; its short analytical time; and its low solvent and sample consumption. For clinical and forensic toxicology, a two-step analysis is usually performed: first, a screening step for compound identification, and second, confirmation and/or accurate quantitation in cases of presumed positive results. In this study, a fast and sensitive CE-MS workflow was developed for the screening and quantitation of drugs of abuse in urine samples. A CE with a time-of-flight MS (CE-TOF/MS) screening method was developed using a simple urine dilution and on-line sample preconcentration with pH-mediated stacking. The sample stacking allowed for a high loading capacity (20.5% of the capillary length), leading to limits of detection as low as 2 ng mL⁻¹ for drugs of abuse. Compound quantitation of positive samples was performed by CE-MS/MS with a triple quadrupole MS equipped with an adapted triple-tube sprayer and an electrospray ionization (ESI) source. The CE-ESI-MS/MS method was validated for two model compounds, cocaine (COC) and methadone (MTD), according to the Guidance of the Food and Drug Administration. The quantitative performance was evaluated for selectivity, response function, the lower limit of quantitation, trueness, precision, and accuracy. COC and MTD detection in urine samples was determined to be accurate over the range of 10-1000 ng mL⁻¹ and 21-1000 ng mL⁻¹, respectively.

  8. Quantitative Risk Assessment for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Lowry, T. S.; McKenna, S. A.; Hadgu, T.; Kalinina, E.

    2011-12-01

    This study uses a quantitative risk-assessment approach to place the uncertainty associated with enhanced geothermal systems (EGS) development into meaningful context and to identify points of attack that can reduce risk the most. Using the integrated geothermal assessment tool, GT-Mod, we calculate the complementary cumulative distribution function of the levelized cost of electricity (LCOE) that results from uncertainty in a variety of geologic and economic input parameter values. EGS is a developing technology that taps deep (2-10km) geologic heat sources for energy production by "enhancing" non-permeable hot rock through hydraulic stimulation. Despite the promise of EGS, uncertainties in predicting the physical and economic performance of a site have hindered its development. To address this, we apply a quantitative risk-assessment approach that calculates risk as the sum of the consequence, C, multiplied by the range of the probability, ΔP, over all estimations of a given exceedance probability, n, over time, t. The consequence here is defined as the deviation from the best estimate LCOE, which is calculated using the 'best-guess' input parameter values. The analysis assumes a realistic but fictitious EGS site with uncertainties in the exploration success rate, the sub-surface thermal gradient, the reservoir fracture pattern, and the power plant performance. Uncertainties in the exploration, construction, O&M, and drilling costs are also included. The depth to the resource is calculated from the thermal gradient and a target resource temperature of 225 °C. Thermal performance is simulated using the Gringarten analytical solution. The mass flow rate is set to produce 30 MWe of power for the given conditions and is adjusted over time to maintain that rate over the plant lifetime of 30 years. Simulations are conducted using GT-Mod, which dynamically links the physical systems of a geothermal site to simulate, as an integrated, multi-system component, the
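
    A minimal sketch of the risk measure described above, under simplifying assumptions: LCOE realizations are drawn by Monte Carlo, sorted into a complementary cumulative distribution, and the consequence (deviation from the best-estimate LCOE) is summed over probability increments. The lognormal input is an arbitrary stand-in for GT-Mod's propagated uncertainties.

```python
# Sketch: CCDF of LCOE from Monte Carlo sampling, and a scalar risk measure
# summing consequence (deviation from the best-estimate LCOE) times the
# probability increment. The lognormal input is an arbitrary stand-in.
import numpy as np

rng = np.random.default_rng(1)
lcoe_best = 0.10                                  # $/kWh from 'best-guess' inputs
lcoe = np.sort(rng.lognormal(np.log(lcoe_best), 0.3, 10_000))

# Plotting ccdf against lcoe reproduces the exceedance curve described above.
ccdf = 1.0 - np.arange(1, lcoe.size + 1) / lcoe.size  # P(LCOE > value)
dP = np.full(lcoe.size, 1.0 / lcoe.size)              # equal probability increments
risk = np.sum((lcoe - lcoe_best) * dP)                # sum of C * dP over samples
print(f"risk measure: {risk:.4f} $/kWh (mean deviation from best estimate)")
```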

  9. Novel micelle PCR-based method for accurate, sensitive and quantitative microbiota profiling.

    PubMed

    Boers, Stefan A; Hays, John P; Jansen, Ruud

    2017-04-05

    In the last decade, many researchers have embraced 16S rRNA gene sequencing techniques, which has led to a wealth of publications and documented differences in the composition of microbial communities derived from many different ecosystems. However, comparison between different microbiota studies is currently very difficult due to the lack of a standardized 16S rRNA gene sequencing protocol. Here we report on a novel approach employing micelle PCR (micPCR) in combination with an internal calibrator that allows for standardization of microbiota profiles via their absolute abundances. The addition of an internal calibrator allows the researcher to express the resulting operational taxonomic units (OTUs) as a measure of 16S rRNA gene copies by correcting the number of sequences of each individual OTU in a sample for efficiency differences in the NGS process. Additionally, accurate quantification of OTUs obtained from negative extraction control samples allows for the subtraction of contaminating bacterial DNA derived from the laboratory environment or chemicals/reagents used. Using equimolar synthetic microbial community samples and low biomass clinical samples, we demonstrate that the calibrated micPCR/NGS methodology possesses a much higher precision and a lower limit of detection compared with traditional PCR/NGS, resulting in more accurate microbiota profiles suitable for multi-study comparison.
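
    The calibrator arithmetic described above reduces to a simple scaling, sketched below under the assumption that a known number of calibrator 16S copies is spiked into each sample before amplification; function and variable names are illustrative.

```python
# Sketch: converting OTU read counts to absolute 16S copies with a
# spiked-in internal calibrator of known copy number (illustrative values).
def absolute_abundance(otu_reads, calibrator_reads, calibrator_copies):
    """16S gene copies for one OTU, corrected for per-sample NGS efficiency."""
    return otu_reads * (calibrator_copies / calibrator_reads)

sample_reads = {"OTU_1": 12_000, "OTU_2": 800}    # raw read counts per OTU
cal_reads, cal_copies = 5_000, 1.0e5              # calibrator reads, spiked copies

profile = {otu: absolute_abundance(reads, cal_reads, cal_copies)
           for otu, reads in sample_reads.items()}

# Contamination correction: quantify a negative extraction control the same
# way, then subtract its per-OTU copy numbers (clipping at zero).
control = {"OTU_1": 0.0, "OTU_2": 3_000.0}
corrected = {otu: max(copies - control.get(otu, 0.0), 0.0)
             for otu, copies in profile.items()}
```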

  10. Novel micelle PCR-based method for accurate, sensitive and quantitative microbiota profiling

    PubMed Central

    Boers, Stefan A.; Hays, John P.; Jansen, Ruud

    2017-01-01

    In the last decade, many researchers have embraced 16S rRNA gene sequencing techniques, which has led to a wealth of publications and documented differences in the composition of microbial communities derived from many different ecosystems. However, comparison between different microbiota studies is currently very difficult due to the lack of a standardized 16S rRNA gene sequencing protocol. Here we report on a novel approach employing micelle PCR (micPCR) in combination with an internal calibrator that allows for standardization of microbiota profiles via their absolute abundances. The addition of an internal calibrator allows the researcher to express the resulting operational taxonomic units (OTUs) as a measure of 16S rRNA gene copies by correcting the number of sequences of each individual OTU in a sample for efficiency differences in the NGS process. Additionally, accurate quantification of OTUs obtained from negative extraction control samples allows for the subtraction of contaminating bacterial DNA derived from the laboratory environment or chemicals/reagents used. Using equimolar synthetic microbial community samples and low biomass clinical samples, we demonstrate that the calibrated micPCR/NGS methodology possesses a much higher precision and a lower limit of detection compared with traditional PCR/NGS, resulting in more accurate microbiota profiles suitable for multi-study comparison. PMID:28378789

  11. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis

    PubMed Central

    Hilbert, David W.; Smith, William L.; Chadwick, Sean G.; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E.; Aguin, Tina J.; Sobel, Jack D.; Gygax, Scott E.

    2016-01-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes of a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. Therefore, a highly accurate molecular assay for the diagnosis of BV would be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We analyzed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms for which quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV. PMID:26818677
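
    A minimal sketch of the shape of such a diagnostic model, assuming log-transformed qPCR quantities as features; the data are random placeholders, not the study's specimens, and the scikit-learn logistic regression merely mirrors the approach described.

```python
# Sketch: logistic-regression diagnosis from qPCR quantities of the four
# informative organisms, with the usual diagnostic metrics (placeholder data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(2)
# Columns: log10 copies of G. vaginalis, A. vaginae, Megasphaera types 1 and 2.
X = rng.normal(loc=5.0, scale=2.0, size=(385, 4))
y = rng.integers(0, 2, size=385)          # 1 = BV by Amsel's criteria/Nugent

model = LogisticRegression().fit(X, y)
tn, fp, fn, tp = confusion_matrix(y, model.predict(X)).ravel()
sensitivity, specificity = tp / (tp + fn), tn / (tn + fp)
ppv, npv = tp / (tp + fp), tn / (tn + fn)
```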

  12. Quantitative Assessment of Autistic Symptomatology in Preschoolers

    ERIC Educational Resources Information Center

    Pine, Elyse; Luby, Joan; Abbacchi, Anna; Constantino, John N.

    2006-01-01

    Given a growing emphasis on early intervention for children with autism, valid quantitative tools for measuring treatment response are needed. The Social Responsiveness Scale (SRS) is a brief (15-20 minute) quantitative measure of autistic traits in 4-to 18-year-olds, for which a version for 3-year-olds was recently developed. We obtained serial…

  13. Sensitive Quantitative Assessment of Balance Disorders

    NASA Technical Reports Server (NTRS)

    Paloski, William H.

    2007-01-01

    Computerized dynamic posturography (CDP) has become a standard technique for objectively quantifying balance control performance, diagnosing the nature of functional impairments underlying balance disorders, and monitoring clinical treatment outcomes. We have long used CDP protocols to assess recovery of sensory-motor function in astronauts following space flight. The most reliable indicators of post-flight crew performance are the sensory organization tests (SOTs), particularly SOTs 5 and 6, which are sensitive to changes in availability and/or utilization of vestibular cues. We have noted, however, that some astronauts exhibiting obvious signs of balance impairment after flight are able to score within clinical norms on these tests, perhaps as a result of adopting competitive strategies or by their natural skills at substituting alternate sensory information sources. This insensitivity of the CDP protocol could lead to underestimation of the degree of impairment and, perhaps, to premature release of those crewmembers to normal duties. To improve the sensitivity of the CDP protocol we have introduced static and dynamic head tilt SOT trials into our protocol. The pattern of postflight recovery quantified by the enhanced CDP protocol appears to more aptly track the re-integration of sensory-motor function, with recovery time increasing as the complexity of the sensory-motor/biomechanical task increases. The new CDP protocol therefore seems more suitable for monitoring post-flight sensory-motor recovery and for indicating to crewmembers and flight surgeons fitness for return to duty and/or activities of daily living. There may be classes of patients (e.g., athletes, pilots) having motivation and/or performance characteristics similar to astronauts whose sensory-motor treatment outcomes would also be more accurately monitored using the enhanced CDP protocol. Furthermore, the enhanced protocol may be useful in early detection of age-related balance disorders.

  14. Quantitative spectroscopy of hot stars: accurate atomic data applied on a large scale as driver of recent breakthroughs

    NASA Astrophysics Data System (ADS)

    Przybilla, N.; Schaffenroth, V.; Nieva, M. F.; Butler, K.

    2016-10-01

    OB-type stars present hotbeds for non-LTE physics because of their strong radiation fields that drive the atmospheric plasma out of local thermodynamic equilibrium. We report on recent breakthroughs in the quantitative analysis of the optical and UV-spectra of OB-type stars that were facilitated by application of accurate and precise atomic data on a large scale. An astrophysicist's dream has come true, by bringing observed and model spectra into close match over wide parts of the observed wavelength ranges. This allows tight observational constraints to be derived from OB-type stars for a wide range of applications in astrophysics. However, despite the progress made, many details of the modelling may be improved further. We discuss atomic data needs in terms of laboratory measurements and also ab-initio calculations. Particular emphasis is given to quantitative spectroscopy in the near-IR, which will be the focus in the era of the upcoming extremely large telescopes.

  15. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at tens of thousands of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  16. There's plenty of gloom at the bottom: the many challenges of accurate quantitation in size-based oligomeric separations.

    PubMed

    Striegel, André M

    2013-11-01

    There is a variety of small-molecule species (e.g., tackifiers, plasticizers, oligosaccharides) the size-based characterization of which is of considerable scientific and industrial importance. Likewise, quantitation of the amount of oligomers in a polymer sample is crucial for the import and export of substances into the USA and European Union (EU). While the characterization of ultra-high molar mass macromolecules by size-based separation techniques is generally considered a challenge, it is this author's contention that a greater challenge is encountered when trying to perform, for quantitation purposes, separations in and of the oligomeric region. The latter thesis is expounded herein, by detailing the various obstacles encountered en route to accurate, quantitative oligomeric separations by entropically dominated techniques such as size-exclusion chromatography, hydrodynamic chromatography, and asymmetric flow field-flow fractionation, as well as by methods which are, principally, enthalpically driven such as liquid adsorption and temperature gradient interaction chromatography. These obstacles include, among others, the diminished sensitivity of static light scattering (SLS) detection at low molar masses, the non-constancy of the response of SLS and of commonly employed concentration-sensitive detectors across the oligomeric region, and the loss of oligomers through the accumulation wall membrane in asymmetric flow field-flow fractionation. The battle is not lost, however, because, with some care and given a sufficient supply of sample, the quantitation of both individual oligomeric species and of the total oligomeric region is often possible.

  17. Quantitation of Insulin-Like Growth Factor 1 in Serum by Liquid Chromatography High Resolution Accurate-Mass Mass Spectrometry.

    PubMed

    Ketha, Hemamalini; Singh, Ravinder J

    2016-01-01

    Insulin-like growth factor 1 (IGF-1) is a 70-amino-acid peptide hormone which acts as the principal mediator of the effects of growth hormone (GH). Due to the wide variability in the circulating concentration of GH, IGF-1 quantitation is the first step in the diagnosis of GH excess or deficiency. The majority (>95%) of IGF-1 circulates as a ternary complex along with its principal binding protein, insulin-like growth factor binding protein 3 (IGFBP-3), and the acid-labile subunit. The assay design approach for IGF-1 quantitation therefore has to include a step to dissociate IGF-1 from its ternary complex. Several commercial assays employ a buffer containing acidified ethanol to achieve this. Despite several modifications, commercially available immunoassays have been shown to have challenges with interference from IGFBP-3. Additionally, inter-method comparison between IGF-1 immunoassays has been shown to be suboptimal. Mass spectrometry has been utilized for quantitation of IGF-1. In this chapter, a liquid chromatography high-resolution accurate-mass mass spectrometry (LC-HRAMS) based method for IGF-1 quantitation is described.

  18. A Quantitative Approach to Assessing System Evolvability

    NASA Technical Reports Server (NTRS)

    Christian, John A., III

    2004-01-01

    When selecting a system from multiple candidates, the customer seeks the one that best meets his or her needs. Recently the desire for evolvable systems has become more important and engineers are striving to develop systems that accommodate this need. In response to this search for evolvability, we present a historical perspective on evolvability, propose a refined definition of evolvability, and develop a quantitative method for measuring this property. We address this quantitative methodology from both a theoretical and practical perspective. This quantitative model is then applied to the problem of evolving a lunar mission to a Mars mission as a case study.

  19. Quantitative risk assessment of Listeria monocytogenes in French cold-smoked salmon: I. Quantitative exposure assessment.

    PubMed

    Pouillot, Régis; Miconnet, Nicolas; Afchain, Anne-Laure; Delignette-Muller, Marie Laure; Beaufort, Annie; Rosso, Laurent; Denis, Jean-Baptiste; Cornu, Marie

    2007-06-01

    A quantitative assessment of the exposure to Listeria monocytogenes from cold-smoked salmon (CSS) consumption in France is developed. The general framework is a second-order (or two-dimensional) Monte Carlo simulation, which characterizes the uncertainty and variability of the exposure estimate. The model takes into account the competitive bacterial growth between L. monocytogenes and the background competitive flora from the end of the production line to the consumer phase. An original algorithm is proposed to integrate this growth under conditions of varying temperature. As part of a more general project led by the French Food Safety Agency (Afssa), specific data were acquired and modeled for this quantitative exposure assessment, in particular time-temperature profiles, prevalence data, and contamination-level data. The sensitivity analysis points out the main influence of the mean temperature in household refrigerators and of the prevalence of contaminated CSS on the exposure level. The outputs of this model can be used as inputs for further risk assessment.
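
    One way to integrate growth over a varying temperature profile is sketched below with a Ratkowsky-type square-root model; this is a generic illustration, not the original algorithm of the paper, and all parameter values are invented.

```python
# Sketch: stepwise integration of a square-root growth-rate model over a
# piecewise-constant time-temperature profile (illustrative parameters).
import numpy as np

T_MIN = -2.9            # deg C, notional minimum growth temperature
B = 0.03                # square-root model constant, illustrative

def log10_increase(times_h, temps_c):
    """Integrate mu(T) over a piecewise-constant T(t) profile."""
    growth = 0.0
    for dt, T in zip(np.diff(times_h), temps_c[:-1]):
        mu = (B * max(T - T_MIN, 0.0)) ** 2   # growth rate in 1/h (ln units)
        growth += mu * dt
    return growth / np.log(10)                # convert ln to log10 units

profile_t = np.array([0.0, 24.0, 48.0, 120.0])  # h: retail, transport, fridge
profile_T = np.array([4.0, 8.0, 6.0, 6.0])      # deg C during each interval
print(f"log10 increase: {log10_increase(profile_t, profile_T):.2f}")
```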

  20. Quantitative analysis in outcome assessment of instrumented lumbosacral arthrodesis

    PubMed Central

    Mazel, Christian; Mitulescu, Anca

    2007-01-01

    The outcome assessment in instrumented lumbosacral fusion mostly focuses on clinical criteria, complications and scores, with a high variability of imaging means, methods of fusion grading and parameters describing degenerative changes, making comparisons between studies difficult. The aim of this retrospective evaluation was to assess the value of quantitative radiographic analysis of the lumbar spine in global outcome assessment and to highlight the key biomechanical factors involved. Clinical data and Beaujon–Lassale scores were collected for 49 patients who underwent lumbosacral arthrodesis after prior lumbar discectomy (mean follow-up: 5 years). Sagittal standing and lumbar flexion-extension X-ray films allowed quantification of vertebral, lumbar, pelvic and kinematic parameters of the lumbar spine, which were compared to reference values. Statistical analyses were performed to assess the evolution of all variables. At long-term follow-up, 90% of patients presented satisfactory clinical outcomes, associated with normal sagittal alignment; vertebral parameters revealed adjacent-level degeneration in four cases (8%). Clinical outcome was correlated (r = 0.8) with fusion, which was confirmed in 80% of cases and doubtful in 16%, while pseudarthrosis appeared to occur in 2 cases (4%). In addition to clinical data (outcomes comparable to the literature), quantitative analysis accurately described lumbar spine geometry and kinematics, highlighting parameters related to adjacent-level degeneration and a significant correlation between clinical outcome and fusion. Furthermore, the criteria proposed to quantitatively evaluate fusion from lumbar dynamic radiographs seem appropriate and agree with the surgeon's qualitative grading in 87% of cases. PMID:17216227

  1. Quantitatively accurate activity measurements with a dedicated cardiac SPECT camera: Physical phantom experiments

    SciTech Connect

    Pourmoghaddas, Amir; Wells, R. Glenn

    2016-01-15

    Purpose: Recently, there has been increased interest in dedicated cardiac single photon emission computed tomography (SPECT) scanners with pinhole collimation and improved detector technology due to their improved count sensitivity and resolution over traditional parallel-hole cameras. With traditional cameras, energy-based approaches are often used in the clinic for scatter compensation because they are fast and easily implemented. Some of the cardiac cameras use cadmium-zinc-telluride (CZT) detectors which can complicate the use of energy-based scatter correction (SC) due to the low-energy tail—an increased number of unscattered photons detected with reduced energy. Modified energy-based scatter correction methods can be implemented, but their level of accuracy is unclear. In this study, the authors validated by physical phantom experiments the quantitative accuracy and reproducibility of easily implemented correction techniques applied to 99mTc myocardial imaging with a CZT-detector-based gamma camera with multiple heads, each with a single-pinhole collimator. Methods: Activity in the cardiac compartment of an Anthropomorphic Torso phantom (Data Spectrum Corporation) was measured through 15 99mTc-SPECT acquisitions. The ratio of activity concentrations in organ compartments resembled a clinical 99mTc-sestamibi scan and was kept consistent across all experiments (1.2:1 heart to liver and 1.5:1 heart to lung). Two background activity levels were considered: no activity (cold) and an activity concentration 1/10th of the heart (hot). A plastic “lesion” was placed inside of the septal wall of the myocardial insert to simulate the presence of a region without tracer uptake and contrast in this lesion was calculated for all images. The true net activity in each compartment was measured with a dose calibrator (CRC-25R, Capintec, Inc.). A 10 min SPECT image was acquired using a dedicated cardiac camera with CZT detectors (Discovery NM530c, GE
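
    For orientation, the sketch below shows the dual-energy-window form of an easily implemented energy-based scatter correction, together with a cold-lesion contrast calculation; the scatter multiplier k and all counts are illustrative, and suitability for a specific CZT camera is an assumption that would have to be established experimentally.

```python
# Sketch: dual-energy-window scatter correction and cold-lesion contrast.
# The scatter multiplier k and all counts are illustrative.
import numpy as np

def dew_correct(photopeak_counts, scatter_window_counts, k=0.5):
    """Estimate primary counts: photopeak minus scaled scatter-window counts."""
    return np.maximum(photopeak_counts - k * scatter_window_counts, 0.0)

def lesion_contrast(mean_myocardium, mean_lesion):
    """Contrast of a cold (uptake-free) lesion against surrounding myocardium."""
    return (mean_myocardium - mean_lesion) / mean_myocardium

primary = dew_correct(np.array([1200.0, 950.0]), np.array([400.0, 310.0]))
print(primary, lesion_contrast(1000.0, 650.0))
```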

  2. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis.

    PubMed

    Hilbert, David W; Smith, William L; Chadwick, Sean G; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E; Aguin, Tina J; Sobel, Jack D; Gygax, Scott E

    2016-04-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes of a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. Therefore, a highly accurate molecular assay for the diagnosis of BV would be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We analyzed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms for which quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV.

  3. Assessing Quantitative Reasoning in Young Children

    ERIC Educational Resources Information Center

    Nunes, Terezinha; Bryant, Peter; Evans, Deborah; Barros, Rossana

    2015-01-01

    Before starting school, many children reason logically about concepts that are basic to their later mathematical learning. We describe a measure of quantitative reasoning that was administered to children at school entry (mean age 5.8 years) and accounted for more variance in a mathematical attainment test than general cognitive ability 16 months…

  4. Quantitative Assessment of Robot-Generated Maps

    NASA Astrophysics Data System (ADS)

    Scrapper, C.; Madhavan, R.; Lakaemper, R.; Censi, A.; Godil, A.; Wagan, A.; Jacoff, A.

    Mobile robotic mapping is now considered to be a sufficiently mature field with demonstrated successes in various domains. While much progress has been made in the development of computationally efficient and consistent mapping schemes, it remains unclear, at best, how these maps can be evaluated. We are motivated by the absence of an accepted standard for quantitatively measuring the performance of robotic mapping systems against user-defined requirements. It is our belief that the development of standardized methods for quantitatively evaluating existing robotic technologies will improve the utility of mobile robots in already established application areas, such as vacuum cleaning, robot surveillance, and bomb disposal. This approach will also enable the proliferation and acceptance of such technologies in emerging markets. This chapter summarizes our preliminary efforts by bringing together the research community towards addressing this important problem, which has ramifications not only from the researchers' perspective but also from consumers', robot manufacturers', and developers' viewpoints.

  5. A Global Approach to Accurate and Automatic Quantitative Analysis of NMR Spectra by Complex Least-Squares Curve Fitting

    NASA Astrophysics Data System (ADS)

    Martin, Y. L.

    The performance of quantitative analysis of 1D NMR spectra depends greatly on the choice of the NMR signal model. Complex least-squares analysis is well suited for optimizing the quantitative determination of spectra containing a limited number of signals (<30) obtained under satisfactory conditions of signal-to-noise ratio (>20). From a general point of view it is concluded, on the basis of mathematical considerations and numerical simulations, that, in the absence of truncation of the free-induction decay, complex least-squares curve fitting either in the time or in the frequency domain and linear-prediction methods are in fact nearly equivalent and give identical results. However, in the situation considered, complex least-squares analysis in the frequency domain is more flexible since it enables the quality of convergence to be appraised at every resonance position. An efficient data-processing strategy has been developed which makes use of an approximate conjugate-gradient algorithm. All spectral parameters (frequency, damping factors, amplitudes, phases, initial delay associated with intensity, and phase parameters of a baseline correction) are simultaneously managed in an integrated approach which is fully automatable. The behavior of the error as a function of the signal-to-noise ratio is theoretically estimated, and the influence of apodization is discussed. The least-squares curve fitting is theoretically proved to be the most accurate approach for quantitative analysis of 1D NMR data acquired with reasonable signal-to-noise ratio. The method enables complex spectral residuals to be sorted out. These residuals, which can be cumulated thanks to the possibility of correcting for frequency shifts and phase errors, extract systematic components, such as isotopic satellite lines, and characterize the shape and the intensity of the spectral distortion with respect to the Lorentzian model. This distortion is shown to be nearly independent of the chemical species
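
    A minimal sketch of complex least-squares fitting in the time domain, assuming a single damped complex exponential and stacking real and imaginary residuals so a real-valued optimizer can be used; the parameterization is illustrative and far simpler than the integrated strategy described above.

```python
# Sketch: complex least-squares fit of one damped complex exponential to a
# synthetic FID, with complex residuals stacked into a real vector.
import numpy as np
from scipy.optimize import least_squares

def model(p, t):
    a, phi, f, lam = p                     # amplitude, phase, Hz, damping 1/s
    return a * np.exp(1j * phi) * np.exp((2j * np.pi * f - lam) * t)

def residuals(p, t, data):
    r = model(p, t) - data
    return np.concatenate([r.real, r.imag])   # complex -> real residual vector

t = np.arange(1024) * 1e-3                     # 1 ms dwell time
rng = np.random.default_rng(3)
fid = model([1.0, 0.3, 50.0, 5.0], t) + 0.01 * (
    rng.normal(size=t.size) + 1j * rng.normal(size=t.size))

fit = least_squares(residuals, x0=[0.8, 0.0, 49.0, 4.0], args=(t, fid))
amplitude, phase, freq_hz, damping = fit.x
```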

  6. Simple, fast, and accurate methodology for quantitative analysis using Fourier transform infrared spectroscopy, with bio-hybrid fuel cell examples.

    PubMed

    Mackie, David M; Jahnke, Justin P; Benyamin, Marcus S; Sumner, James J

    2016-01-01

    The standard methodologies for quantitative analysis (QA) of mixtures using Fourier transform infrared (FTIR) instruments have evolved until they are now more complicated than necessary for many users' purposes. We present a simpler methodology, suitable for widespread adoption of FTIR QA as a standard laboratory technique across disciplines by occasional users.
    • The algorithm is straightforward and intuitive, yet fast, accurate, and robust.
    • It relies on component spectra, minimization of errors, and local adaptive mesh refinement.
    • It was tested successfully on real mixtures of up to nine components.
    We show that our methodology is robust to challenging experimental conditions such as similar substances, component percentages differing by three orders of magnitude, and imperfect (noisy) spectra. As examples, we analyze biological, chemical, and physical aspects of bio-hybrid fuel cells.
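
    A minimal sketch of the core idea, quantitation from measured component spectra by error minimization, here implemented with non-negative least squares on synthetic Gaussian bands; the local adaptive mesh refinement of the original method is not reproduced.

```python
# Sketch: mixture composition from pure-component spectra via non-negative
# least squares (synthetic Gaussian bands, illustrative only).
import numpy as np
from scipy.optimize import nnls

wn = np.linspace(800.0, 1800.0, 500)           # wavenumber axis, cm^-1

def band(center, width):
    return np.exp(-((wn - center) / width) ** 2)

components = np.column_stack([band(1050, 40), band(1250, 30), band(1600, 50)])
true_fractions = np.array([0.6, 0.3, 0.1])
noise = 0.002 * np.random.default_rng(4).normal(size=wn.size)
mixture = components @ true_fractions + noise

coeffs, _ = nnls(components, mixture)           # error-minimizing decomposition
fractions = coeffs / coeffs.sum()               # normalized composition
```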

  7. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  8. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  9. A Quantitative Software Risk Assessment Model

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    2002-01-01

    This slide presentation reviews a risk assessment model as applied to software development. The presentation uses graphs to demonstrate basic concepts of software reliability and discusses the application of the risk model to the software development life cycle.

  10. Modeling logistic performance in quantitative microbial risk assessment.

    PubMed

    Rijgersberg, Hajo; Tromp, Seth; Jacxsens, Liesbeth; Uyttendaele, Mieke

    2010-01-01

    In quantitative microbial risk assessment (QMRA), food safety in the food chain is modeled and simulated. In general, prevalences, concentrations, and numbers of microorganisms in media are investigated in the different steps from farm to fork. The underlying rates and conditions (such as storage times, temperatures, gas conditions, and their distributions) are determined. However, the logistic chain with its queues (storages, shelves) and mechanisms for ordering products is usually not taken into account. As a consequence, storage times, mutually dependent in successive steps in the chain, cannot be described adequately. This may have a great impact on the tails of risk distributions. Because food safety risks are generally very small, it is crucial to model the tails of (underlying) distributions as accurately as possible. Logistic performance can be modeled by describing the underlying planning and scheduling mechanisms in discrete-event modeling. This is common practice in operations research, specifically in supply chain management. In this article, we present the application of discrete-event modeling in the context of a QMRA for Listeria monocytogenes in fresh-cut iceberg lettuce. We show the potential value of discrete-event modeling in QMRA by calculating logistic interventions (modifications in the logistic chain) and determining their significance with respect to food safety.
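
    A minimal sketch of what discrete-event modeling adds: instead of sampling a storage-time distribution directly, events (deliveries, purchases) generate mutually dependent waiting times on a FIFO shelf. All rates and cycle times are invented for illustration.

```python
# Sketch: a discrete-event view of one retail storage step. Pack waiting
# times emerge from the ordering mechanism, not from an assumed distribution.
import heapq
import random

random.seed(5)
events = [(0.0, "deliver")]                  # (time in h, event kind)
shelf, waits = [], []
t_end, demand_rate, lot_size = 7 * 24.0, 0.8, 24

heapq.heappush(events, (random.expovariate(demand_rate), "buy"))
while events:
    t, kind = heapq.heappop(events)
    if t > t_end:
        break
    if kind == "deliver":
        shelf.extend([t] * lot_size)         # record arrival time of each pack
        heapq.heappush(events, (t + 48.0, "deliver"))  # fixed ordering cycle
    else:                                    # a purchase
        if shelf:
            waits.append(t - shelf.pop(0))   # FIFO: oldest pack sold first
        heapq.heappush(events, (t + random.expovariate(demand_rate), "buy"))

# 'waits' then feeds the growth model; its tail reflects the logistics.
```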

  11. QUANTITATIVE RISK ASSESSMENT FOR MICROBIAL AGENTS

    EPA Science Inventory

    Compared to chemical risk assessment, the process for microbial agents and infectious disease is more complex because of host factors and the variety of settings in which disease transmission can occur. While the National Academy of Science has established a paradigm for performi...

  12. CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA

    EPA Science Inventory

    The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...

  13. SPECT-OPT multimodal imaging enables accurate evaluation of radiotracers for β-cell mass assessments

    PubMed Central

    Eter, Wael A.; Parween, Saba; Joosten, Lieke; Frielink, Cathelijne; Eriksson, Maria; Brom, Maarten; Ahlgren, Ulf; Gotthardt, Martin

    2016-01-01

    Single Photon Emission Computed Tomography (SPECT) has become a promising experimental approach to monitor changes in β-cell mass (BCM) during diabetes progression. SPECT imaging of pancreatic islets is most commonly cross-validated by stereological analysis of histological pancreatic sections after insulin staining. Typically, stereological methods do not accurately determine the total β-cell volume, which is inconvenient when correlating total pancreatic tracer uptake with BCM. Alternative methods are therefore warranted to cross-validate β-cell imaging using radiotracers. In this study, we introduce multimodal SPECT - optical projection tomography (OPT) imaging as an accurate approach to cross-validate radionuclide-based imaging of β-cells. Uptake of a promising radiotracer for β-cell imaging by SPECT, 111In-exendin-3, was measured by ex vivo SPECT and cross-evaluated by 3D quantitative OPT imaging as well as by histology within healthy and alloxan-treated Brown Norway rat pancreata. SPECT signal was in excellent linear correlation with OPT data as compared to histology. While histological determination of islet spatial distribution was challenging, SPECT and OPT revealed similar distribution patterns of 111In-exendin-3 and insulin-positive β-cell volumes between different pancreatic lobes, both visually and quantitatively. We propose ex vivo SPECT-OPT multimodal imaging as a highly accurate strategy for validating the performance of β-cell radiotracers. PMID:27080529

  14. SPECT-OPT multimodal imaging enables accurate evaluation of radiotracers for β-cell mass assessments.

    PubMed

    Eter, Wael A; Parween, Saba; Joosten, Lieke; Frielink, Cathelijne; Eriksson, Maria; Brom, Maarten; Ahlgren, Ulf; Gotthardt, Martin

    2016-04-15

    Single Photon Emission Computed Tomography (SPECT) has become a promising experimental approach to monitor changes in β-cell mass (BCM) during diabetes progression. SPECT imaging of pancreatic islets is most commonly cross-validated by stereological analysis of histological pancreatic sections after insulin staining. Typically, stereological methods do not accurately determine the total β-cell volume, which is inconvenient when correlating total pancreatic tracer uptake with BCM. Alternative methods are therefore warranted to cross-validate β-cell imaging using radiotracers. In this study, we introduce multimodal SPECT - optical projection tomography (OPT) imaging as an accurate approach to cross-validate radionuclide-based imaging of β-cells. Uptake of a promising radiotracer for β-cell imaging by SPECT, (111)In-exendin-3, was measured by ex vivo SPECT and cross-evaluated by 3D quantitative OPT imaging as well as by histology within healthy and alloxan-treated Brown Norway rat pancreata. SPECT signal was in excellent linear correlation with OPT data as compared to histology. While histological determination of islet spatial distribution was challenging, SPECT and OPT revealed similar distribution patterns of (111)In-exendin-3 and insulin-positive β-cell volumes between different pancreatic lobes, both visually and quantitatively. We propose ex vivo SPECT-OPT multimodal imaging as a highly accurate strategy for validating the performance of β-cell radiotracers.

  15. A High Resolution/Accurate Mass (HRAM) Data-Dependent MS3 Neutral Loss Screening, Classification, and Relative Quantitation Methodology for Carbonyl Compounds in Saliva

    NASA Astrophysics Data System (ADS)

    Dator, Romel; Carrà, Andrea; Maertens, Laura; Guidolin, Valeria; Villalta, Peter W.; Balbo, Silvia

    2016-10-01

    Reactive carbonyl compounds (RCCs) are ubiquitous in the environment and are generated endogenously as a result of various physiological and pathological processes. These compounds can react with biological molecules, inducing deleterious processes believed to be at the basis of their toxic effects. Several of these compounds are implicated in neurotoxic processes, aging disorders, and cancer. Therefore, a method characterizing exposures to these chemicals will provide insights into how they may influence overall health and contribute to disease pathogenesis. Here, we have developed a high resolution accurate mass (HRAM) screening strategy allowing simultaneous identification and relative quantitation of DNPH-derivatized carbonyls in human biological fluids. The screening strategy involves the diagnostic neutral loss of a hydroxyl radical, which triggers MS3 fragmentation and is observed only in the positive ionization mode of DNPH-derivatized carbonyls. Unique fragmentation pathways were used to develop a classification scheme for characterizing known and unanticipated/unknown carbonyl compounds present in saliva. Furthermore, a relative quantitation strategy was implemented to assess variations in the levels of carbonyl compounds before and after exposure using deuterated d3-DNPH. This relative quantitation method was tested on human samples before and after exposure to specific amounts of alcohol. Nano-electrospray ionization (nano-ESI) in positive mode afforded excellent sensitivity, with on-column detection limits at the high-attomole level. To the best of our knowledge, this is the first report of a method using HRAM neutral-loss screening of carbonyl compounds. In addition, the method allows simultaneous characterization and relative quantitation of DNPH-derivatized compounds using nano-ESI in positive mode.
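
    Two of the building blocks described above reduce to simple arithmetic, sketched here with illustrative tolerances, masses, and peak areas: flagging the hydroxyl-radical neutral loss that would trigger MS3, and relative quantitation as a light/heavy (DNPH vs. d3-DNPH) peak-area ratio.

```python
# Sketch: (1) flagging the diagnostic hydroxyl-radical neutral loss that
# triggers MS3, and (2) relative quantitation as a DNPH/d3-DNPH area ratio.
OH_RADICAL = 17.00274                        # Da, monoisotopic mass of .OH

def is_oh_neutral_loss(precursor_mz, fragment_mz, charge=1, tol_ppm=5.0):
    """True if the precursor->fragment transition loses one hydroxyl radical."""
    loss = (precursor_mz - fragment_mz) * charge
    return abs(loss - OH_RADICAL) / OH_RADICAL * 1e6 <= tol_ppm

def relative_level(area_light_dnph, area_heavy_d3_dnph):
    """Carbonyl level after vs. before exposure via the light/heavy ratio."""
    return area_light_dnph / area_heavy_d3_dnph

assert is_oh_neutral_loss(327.1093, 310.1066)    # a 17.0027 Da loss
print(relative_level(4.2e6, 2.1e6))              # -> 2.0, a two-fold increase
```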

  16. Asbestos exposure--quantitative assessment of risk

    SciTech Connect

    Hughes, J.M.; Weill, H.

    1986-01-01

    Methods for deriving quantitative estimates of asbestos-associated health risks are reviewed and their numerous assumptions and uncertainties described. These methods involve extrapolation of risks observed at past, relatively high asbestos concentration levels down to the usually much lower concentration levels of interest today, in some cases orders of magnitude lower. These models are used to calculate estimates of the potential risk to workers manufacturing asbestos products and to students enrolled in schools containing asbestos products. The potential risk to workers exposed for 40 yr to 0.5 fibers per milliliter (f/ml) of mixed asbestos fiber type (a permissible workplace exposure limit under consideration by the Occupational Safety and Health Administration (OSHA)) is estimated as 82 lifetime excess cancers per 10,000 exposed. The risk to students exposed to an average asbestos concentration of 0.001 f/ml of mixed asbestos fiber types for an average enrollment period of 6 school years is estimated as 5 lifetime excess cancers per one million exposed. If the school exposure is to chrysotile asbestos only, then the estimated risk is 1.5 lifetime excess cancers per million. Risks from other causes are presented for comparison; e.g., annual rates (per million) of 10 deaths from high school football, 14 from bicycling (10-14 yr of age), and 5 to 20 for whooping cough vaccination. Decisions concerning asbestos products require participation of all parties involved and should only be made after a scientifically defensible estimate of the associated risk has been obtained. In many cases to date, such decisions have been made without adequate consideration of the level of risk or the cost-effectiveness of attempts to lower the potential risk. 73 references.

  17. Quantitative performance assessments for neuromagnetic imaging systems.

    PubMed

    Koga, Ryo; Hiyama, Ei; Matsumoto, Takuya; Sekihara, Kensuke

    2013-01-01

    We have developed a Monte-Carlo simulation method to assess the performance of neuromagnetic imaging systems using two kinds of performance metrics: the A-prime metric and spatial resolution. We compute these performance metrics for virtual sensor systems having 80, 160, 320, and 640 sensors, and discuss how the system performance improves with the number of sensors. We also compute these metrics for existing whole-head MEG systems: MEGvision™ (Yokogawa Electric Corporation, Tokyo, Japan), which uses axial-gradiometer sensors, and TRIUX™ (Elekta, Stockholm, Sweden), which uses planar-gradiometer and magnetometer sensors. We discuss performance comparisons between these significantly different systems.

  18. Accurate, fast and cost-effective diagnostic test for monosomy 1p36 using real-time quantitative PCR.

    PubMed

    Cunha, Pricila da Silva; Pena, Heloisa B; D'Angelo, Carla Sustek; Koiffmann, Celia P; Rosenfeld, Jill A; Shaffer, Lisa G; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans, accounting for 0.5-0.7% of all cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we chose two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescence in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we were able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, with reagent costs of less than 10 US dollars.
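
    A minimal sketch of how a deletion call can be made from such an assay, assuming relative quantification by the 2^-ΔΔCq method against a reference locus and a normal control; the Cq values and the 0.75 call threshold are illustrative, not the study's analysis parameters.

```python
# Sketch: heterozygous-deletion calling from relative qPCR by the
# 2^-ddCq method against a reference locus and a normal control sample.
def copy_ratio(cq_target, cq_ref, cq_target_control, cq_ref_control):
    ddcq = (cq_target - cq_ref) - (cq_target_control - cq_ref_control)
    return 2.0 ** (-ddcq)

ratio_prkcz = copy_ratio(26.8, 24.9, 25.7, 24.8)   # ~0.50 -> one copy lost
ratio_ski = copy_ratio(27.1, 24.9, 26.0, 24.7)     # ~0.54 -> one copy lost
is_1p36_deletion = ratio_prkcz < 0.75 and ratio_ski < 0.75
```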

  19. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans, accounting for 0.5–0.7% of all cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we chose two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescence in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we were able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, with reagent costs of less than 10 US dollars. PMID:24839341

  20. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role both in identifying the hazards of air pollution and in supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by the air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  1. Quantitative Assessments of the Martian Hydrosphere

    NASA Astrophysics Data System (ADS)

    Lasue, Jeremie; Mangold, Nicolas; Hauber, Ernst; Clifford, Steve; Feldman, William; Gasnault, Olivier; Grima, Cyril; Maurice, Sylvestre; Mousis, Olivier

    2013-01-01

    In this paper, we review current estimates of the global water inventory of Mars, potential loss mechanisms, and the thermophysical characteristics of the different reservoirs in which water may currently be stored, and assess how the planet's hydrosphere and cryosphere evolved with time. First, we summarize the water inventory quantified from geological analyses of surface features related to both liquid water erosion and ice-related landscapes. They indicate that, throughout most of Martian geologic history (and possibly continuing through to the present day), water was present to substantial depths, with a total inventory ranging from several hundred to as much as 1000 m Global Equivalent Layer (GEL). We then review the most recent estimates of water content based on subsurface detection by orbital and landed instruments, including deep penetrating radars such as SHARAD and MARSIS. We show that the total amount of water measured so far is about 30 m GEL, although a far larger amount of water may be stored below the sounding depths of currently operational instruments. Finally, a global picture of the current state of the subsurface water reservoirs and their evolution is discussed.

  2. A comparison of risk assessment techniques from qualitative to quantitative

    SciTech Connect

    Altenbach, T.J.

    1995-02-13

    Risk assessment techniques vary from purely qualitative approaches, through a regime of semi-qualitative approaches, to the more traditional quantitative ones. Constraints such as time, money, manpower, skills, management perceptions, communication of risk results to the public, and political pressures all affect the manner in which risk assessments are carried out. This paper surveys some risk matrix techniques, examining the uses and applicability of each. Limitations and problems of each technique are presented and compared to the others. Risk matrix approaches vary from purely qualitative axis descriptions of accident frequency versus consequences, to fully quantitative axis definitions using multi-attribute utility theory to equate different types of risk from the same operation.
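
    As a concrete illustration of the two ends of this spectrum, the hedged sketch below contrasts a toy qualitative matrix lookup with a quantitative expected-loss product. The category names, bin values, and scoring rule are invented for illustration and are not taken from the paper.

    # Toy contrast of qualitative vs quantitative risk-matrix cells.
    FREQ_BINS = {"remote": 1e-6, "unlikely": 1e-4, "likely": 1e-2, "frequent": 1.0}        # events/yr (assumed)
    CONS_BINS = {"negligible": 1e3, "marginal": 1e5, "critical": 1e7, "catastrophic": 1e9}  # $ per event (assumed)

    def qualitative_risk(freq_cat, cons_cat):
        """Purely qualitative matrix: a risk class from ordered category indices."""
        order_f = ["remote", "unlikely", "likely", "frequent"]
        order_c = ["negligible", "marginal", "critical", "catastrophic"]
        score = order_f.index(freq_cat) + order_c.index(cons_cat)
        return "low" if score <= 2 else "medium" if score <= 4 else "high"

    def quantitative_risk(freq_cat, cons_cat):
        """Fully quantitative cell: expected loss in $/yr from binned values."""
        return FREQ_BINS[freq_cat] * CONS_BINS[cons_cat]

    print(qualitative_risk("unlikely", "critical"))   # "medium"
    print(quantitative_risk("unlikely", "critical"))  # 1e-4 * 1e7 = 1000.0 $/yr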

  3. Automated and quantitative headspace in-tube extraction for the accurate determination of highly volatile compounds from wines and beers.

    PubMed

    Zapata, Julián; Mateo-Vivaracho, Laura; Lopez, Ricardo; Ferreira, Vicente

    2012-03-23

    An automatic headspace in-tube extraction (ITEX) method for the accurate determination of acetaldehyde, ethyl acetate, diacetyl and other volatile compounds from wine and beer has been developed and validated. Method accuracy is based on the nearly quantitative transference of volatile compounds from the sample to the ITEX trap. For achieving that goal most methodological aspects and parameters have been carefully examined. The vial and sample sizes and the trapping materials were found to be critical due to the pernicious saturation effects of ethanol. Small 2 mL vials containing very small amounts of sample (20 μL of 1:10 diluted sample) and a trap filled with 22 mg of Bond Elut ENV resins could guarantee a complete trapping of sample vapors. The complete extraction requires 100 × 0.5 mL pumping strokes at 60 °C and takes 24 min. Analytes are further desorbed at 240 °C into the GC injector under a 1:5 split ratio. The proportion of analytes finally transferred to the trap ranged from 85 to 99%. The validation of the method showed satisfactory figures of merit. Determination coefficients were better than 0.995 in all cases and good repeatability was also obtained (better than 7% in all cases). Reproducibility was better than 8.3% except for acetaldehyde (13.1%). Detection limits were below the odor detection thresholds of these target compounds in wine and beer and well below their normal ranges of occurrence. Recoveries were not significantly different from 100%, except in the case of acetaldehyde. In that case it could be determined that the method is not able to break some of the adducts that this compound forms with sulfites. However, this problem was avoided by incubating the sample with glyoxal. The method can constitute a general and reliable alternative for the analysis of very volatile compounds in other difficult matrices.

  4. Quantitative health impact assessment: current practice and future directions

    PubMed Central

    Veerman, J; Barendregt, J; Mackenbach, J

    2005-01-01

    Study objective: To assess what methods are used in quantitative health impact assessment (HIA), and to identify areas for future research and development. Design: HIA reports were assessed for (1) methods used to quantify the effects of policy on determinants of health (exposure impact assessment) and (2) methods used to quantify health outcomes resulting from changes in exposure to determinants (outcome assessment). Main results: Of 98 prospective HIA studies, 17 reported quantitative estimates of change in exposure to determinants, and 16 gave quantified health outcomes. Eleven (categories of) determinants were quantified up to the level of health outcomes. Methods for exposure impact assessment were: estimation on the basis of routine data and measurements, and various kinds of modelling of traffic-related and environmental factors, supplemented with experts' estimates and authors' assumptions. Some studies used estimates from other documents pertaining to the policy. For the calculation of health outcomes, variants of epidemiological and toxicological risk assessment were used, in some cases in mathematical models. Conclusions: Quantification is comparatively rare in HIA. Methods are available in the areas of environmental health and, to a lesser extent, traffic accidents, infectious diseases, and behavioural factors. The methods are diverse and their reliability and validity are uncertain. Research and development in the following areas could benefit quantitative HIA: methods to quantify the effect of socioeconomic and behavioural determinants; user-friendly simulation models; the use of summary measures of public health, expert opinion and scenario building; and empirical research into validity and reliability. PMID:15831683

  5. Quantitative wearable sensors for objective assessment of Parkinson's disease.

    PubMed

    Maetzler, Walter; Domingos, Josefa; Srulijes, Karin; Ferreira, Joaquim J; Bloem, Bastiaan R

    2013-10-01

    There is a rapidly growing interest in the quantitative assessment of Parkinson's disease (PD)-associated signs and disability using wearable technology. Both persons with PD and their clinicians see advantages in such developments. Specifically, quantitative assessments using wearable technology may allow for continuous, unobtrusive, objective, and ecologically valid data collection. Also, this approach may improve patient-doctor interaction, influence therapeutic decisions, and ultimately ameliorate patients' global health status. In addition, such measures have the potential to be used as outcome parameters in clinical trials, allowing for frequent assessments; e.g., in the home setting. This review discusses promising wearable technology, addresses which parameters should be prioritized in such assessment strategies, and reports on studies that have already investigated daily life issues in PD using this new technology.

  6. Quantitative image analysis in the assessment of diffuse large B-cell lymphoma.

    PubMed

    Chabot-Richards, Devon S; Martin, David R; Myers, Orrin B; Czuchlewski, David R; Hunt, Kristin E

    2011-12-01

    Proliferation rates in diffuse large B-cell lymphoma have been associated with conflicting outcomes in the literature, more often with high proliferation associated with poor prognosis. In most studies, the proliferation rate was estimated by a pathologist using an immunohistochemical stain for the monoclonal antibody Ki-67. We hypothesized that a quantitative image analysis algorithm would give a more accurate estimate of the proliferation rate, leading to better associations with survival. In all, 84 cases of diffuse large B-cell lymphoma were selected according to the World Health Organization criteria. Ki-67 percentage positivity estimated by the pathologist was recorded from the original report. The same slides were then scanned using an Aperio ImageScope, and Ki-67 percentage positivity was calculated using a computer-based quantitative immunohistochemistry nuclear algorithm. In addition, chart review was performed and survival time was recorded. The Ki-67 percentages estimated by the pathologist from the original report and by quantitative image analysis were significantly correlated (P<0.001), but pathologist Ki-67 percentages were significantly higher than those from quantitative image analysis (P=0.021). There was less agreement at lower Ki-67 percentages. Comparison of Ki-67 percentage positivity with survival showed no significant association for either the pathologist estimate or quantitative image analysis. However, although not significant, there was a trend of worse survival at higher proliferation rates detected by the pathologist but not by quantitative image analysis. Interestingly, our data suggest that the Ki-67 percentage positivity as assessed by the pathologist may be more closely associated with survival outcome than that identified by quantitative image analysis. This may indicate that pathologists are better at selecting appropriate areas of the slide. More cases are needed to assess whether this finding would be statistically significant. Due to
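
    As a sketch of the two statistical comparisons reported above (a significant correlation between the paired estimates, and a significant systematic offset between methods), the following uses invented Ki-67 percentages; scipy is assumed to be available, and the data are not from the study.

    # Paired comparison of two Ki-67 raters: correlation + systematic offset.
    import numpy as np
    from scipy.stats import pearsonr, ttest_rel

    pathologist = np.array([90, 80, 70, 60, 40, 30, 95, 85])   # % positive (visual estimate)
    image_algo  = np.array([85, 72, 66, 50, 28, 20, 92, 78])   # % positive (automated)

    r, p_corr = pearsonr(pathologist, image_algo)   # are the two estimates correlated?
    t, p_diff = ttest_rel(pathologist, image_algo)  # is one systematically higher?

    print(f"correlation r={r:.2f} (p={p_corr:.3g}); paired offset p={p_diff:.3g}")
    print(f"mean difference = {np.mean(pathologist - image_algo):.1f} percentage points")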

  7. [The method of quantitative assessment of dentition aesthetic parameters].

    PubMed

    Ryakhovsky, A N; Kalacheva, Ya A

    2016-01-01

    This article describes a formula for calculating an aesthetic index of treatment outcome. The formula was derived from regression equations relating visual assessment scores to the magnitude of aesthetic impairment. It can be used for objective quantitative evaluation of the aesthetics of the teeth when smiling, before and after dental treatment.

  8. Qualitative and Quantitative Hippocampal MRI Assessments in Intractable Epilepsy

    PubMed Central

    Singh, Paramdeep; Kaur, Rupinderjeet; Saggar, Kavita; Singh, Gagandeep; Kaur, Amarpreet

    2013-01-01

    Aims. To acquire normative data on hippocampal volumes and T2 relaxation times, to evaluate and compare qualitative and quantitative assessments of the hippocampi in patients with different durations of intractable epilepsy, and to propose an imaging protocol based on the performance of these techniques. Methods. MRI analysis was done in 50 nonepileptic controls and 30 patients with intractable epilepsy on a 1.5T scanner. Visual assessment and hippocampal volumetry were done on oblique coronal IR/T2W and T1W MP-RAGE images, respectively. T2 relaxation times were measured using a 16-echo Carr-Purcell-Meiboom-Gill sequence. Volumetric data were normalized for variation in head size between individuals. Patients were divided into temporal (n = 20) and extratemporal (n = 10) groups based on clinical and EEG localization. Results. In controls, right hippocampal volume was slightly greater than the left, with no effect of age or gender. In temporal lobe epilepsy (TLE) patients, hippocampal volumetry provided maximum concordance with EEG. Visual assessment of unilateral pathology concurred well with measured quantitative values, but poorly in cases with bilateral pathologies. There were no significant differences in mean values between the extratemporal group and the control group. Quantitative techniques detected mild abnormalities that were undetected on visual assessment. Conclusions. Quantitative techniques are more sensitive for diagnosing bilateral and mild unilateral hippocampal abnormalities. PMID:23984369
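
    The abstract notes that volumetric data were normalized for head-size variation but does not restate the formula. A common choice is the regression (covariance) approach sketched below, in which volumes are adjusted by the slope of volume on intracranial volume (ICV) estimated in controls; the formula, variable names, and data here are assumptions for illustration, not the study's exact method.

    # Hedged sketch: head-size normalization of hippocampal volumes via the
    # regression (covariance) method, with invented control data.
    import numpy as np

    def normalize_volumes(vol, icv, vol_ctrl, icv_ctrl):
        """Adjusted volume = raw volume - b * (ICV - mean control ICV)."""
        b = np.polyfit(icv_ctrl, vol_ctrl, 1)[0]        # slope estimated from controls
        return vol - b * (icv - icv_ctrl.mean())

    rng = np.random.default_rng(2)
    icv_ctrl = rng.normal(1400, 120, 50)                          # mL, 50 controls
    vol_ctrl = 2.5 + 0.0012 * icv_ctrl + rng.normal(0, 0.15, 50)  # mL, hippocampus

    print(normalize_volumes(vol=2.9, icv=1550.0,
                            vol_ctrl=vol_ctrl, icv_ctrl=icv_ctrl))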

  9. Application of an Effective Statistical Technique for an Accurate and Powerful Mining of Quantitative Trait Loci for Rice Aroma Trait

    PubMed Central

    Golestan Hashemi, Farahnaz Sadat; Rafii, Mohd Y.; Ismail, Mohd Razi; Mohamed, Mahmud Tengku Muda; Rahim, Harun A.; Latif, Mohammad Abdul; Aslani, Farzad

    2015-01-01

    When a phenotype of interest is associated with an external/internal covariate, covariate inclusion in quantitative trait loci (QTL) analyses can diminish residual variation and subsequently enhance the ability to detect QTL. In the in vitro synthesis of 2-acetyl-1-pyrroline (2AP), the main fragrance compound in rice, thermal processing during the Maillard-type reaction between proline and carbohydrate reduction products produces a roasted, popcorn-like aroma. Hence, for the first time, we included the amino acid proline, an important precursor of 2AP, as a covariate in our QTL mapping analyses to precisely explore the genetic factors affecting natural variation for rice scent. Consequently, two QTLs were traced on chromosomes 4 and 8. They explained from 20% to 49% of the total aroma phenotypic variance. Additionally, by saturating the interval harboring the major QTL using gene-based primers, a putative allele of fgr (the major genetic determinant of fragrance) was mapped in the QTL on the 8th chromosome in the interval RM223-SCU015RM (1.63 cM). These loci supported previous studies of different accessions. Such QTLs can be widely used by breeders in crop improvement programs and for further fine mapping. Moreover, no previous findings were available on the simultaneous assessment of the relationship among 2AP, proline and fragrance QTLs. Therefore, our findings can help further our understanding of the metabolomic and genetic basis of 2AP biosynthesis in aromatic rice. PMID:26061689

  10. Quantitative phylogenetic assessment of microbial communities in diverse environments

    SciTech Connect

    von Mering, C.; Hugenholtz, P.; Raes, J.; Tringe, S.G.; Doerks,T.; Jensen, L.J.; Ward, N.; Bork, P.

    2007-01-01

    The taxonomic composition of environmental communities is an important indicator of their ecology and function. Here, we use a set of protein-coding marker genes, extracted from large-scale environmental shotgun sequencing data, to provide a more direct, quantitative and accurate picture of community composition than traditional rRNA-based approaches using polymerase chain reaction (PCR). By mapping marker genes from four diverse environmental data sets onto a reference species phylogeny, we show that certain communities evolve faster than others, determine preferred habitats for entire microbial clades, and provide evidence that such habitat preferences are often remarkably stable over time.

  11. Detailed behavioral assessment promotes accurate diagnosis in patients with disorders of consciousness

    PubMed Central

    Gilutz, Yael; Lazary, Avraham; Karpin, Hana; Vatine, Jean-Jacques; Misha, Tamar; Fortinsky, Hadassah; Sharon, Haggai

    2015-01-01

    Introduction: The awareness level in patients with disorders of consciousness (DOC) is assessed on the basis of exhibited behaviors. However, since motor signs of awareness (i.e., non-reflex motor responses) can be very subtle, differentiating the vegetative from minimally conscious states (which is in itself not clear-cut) is often challenging. Even the careful clinician relying on standardized scales may arrive at a wrong diagnosis. Aim: To report our experience in tackling this problem by using two in-house assessment procedures developed at Reuth Rehabilitation Hospital, and to demonstrate their clinical significance by reviewing two cases. Methods: (1) Reuth DOC Response Assessment (RDOC-RA) – administered in addition to the standardized tools, it emphasizes the importance of assessing a wide range of motor responses. In our experience, in some patients the only evidence for awareness may be a private specific movement that is not assessed by standard assessment tools. (2) Reuth DOC Periodic Intervention Model (RDOC-PIM) – current literature regarding assessment and diagnosis in DOC refers mostly to the acute phase of up to 1 year post-injury. However, we have found major changes in responsiveness occurring 1 year or more post-injury in many patients. Therefore, we conduct periodic assessments at predetermined time points to ensure patients are not misdiagnosed or neurological changes overlooked. Results: In the first case the RDOC-RA promoted a more accurate diagnosis than that based on standardized scales alone. The second case shows how the RDOC-PIM allowed us to recognize late recovery and promoted reinstatement of treatment with good results. Conclusion: Adding a detailed periodic assessment of DOC patients to existing scales can yield critical information, promoting better diagnosis, treatment, and clinical outcomes. We discuss the implications of this observation for the future development and validation of assessment tools in DOC patients

  12. Some suggested future directions of quantitative resource assessments

    USGS Publications Warehouse

    Singer, D.A.

    2001-01-01

    Future quantitative assessments will be expected to estimate quantities, values, and locations of undiscovered mineral resources in a form that conveys both economic viability and the uncertainty associated with the resources. Historically, declining metal prices point to the need for larger deposits over time. Sensitivity analysis demonstrates that the greatest opportunity for reducing uncertainty in assessments lies in lowering the uncertainty associated with tonnage estimates. Of all errors possible in assessments, those affecting tonnage estimates are by far the most important. Selecting the correct deposit model is the most important way of controlling errors because of the dominance of tonnage; deposit models are the best-known predictors of tonnage. In many large regions, much of the surface is covered with apparently barren rocks and sediments. Because many exposed mineral deposits are believed to have been found, a prime concern is the presence of possible mineralized rock under cover. Assessments of areas with resources under cover must rely on extrapolation from surrounding areas, new geologic maps of rocks under cover, or analogy with other well-explored areas that can be considered training tracts. Cover has a profound effect on uncertainty and on the methods and procedures of assessments, because the geology is seldom known and geophysical methods typically have attenuated responses. Many earlier assessment methods were based on relationships of geochemical and geophysical variables to deposits learned from deposits exposed at the surface; these will need to be relearned from covered deposits. Mineral-deposit models are important in quantitative resource assessments for two reasons: (1) grades and tonnages of most deposit types are significantly different, and (2) deposit types are present in different geologic settings that can be identified from geologic maps. Mineral-deposit models are the keystone in combining the diverse geoscience information on geology, mineral

  13. Quantitative risk assessment methods for cancer and noncancer effects.

    PubMed

    Baynes, Ronald E

    2012-01-01

    Human health risk assessments have evolved from more qualitative approaches to more quantitative approaches over the past decade. This has been facilitated by improvements in computer hardware and software capability and by the gradual recognition of novel computational approaches by regulatory agencies. These events have helped reduce the reliance on experimental animals as well as allowing better utilization of published animal toxicology data in deriving quantitative toxicity indices that may be useful for risk management purposes. This chapter briefly describes some of the approaches set out in the guidance documents of several regulatory agencies as they pertain to hazard identification and dose-response assessment of a chemical. These approaches are contrasted with more novel computational approaches that provide a better grasp of the uncertainty often associated with chemical risk assessments.

  14. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    In light of the fact that stress (vulnerability) assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system was proposed based on representative contaminants and their corresponding emission quantities, derived through the analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards of the representative contaminants depended greatly on the chosen research emphasis. There were also differences between the ranking of the three representative contaminants' hazards and the rankings of their corresponding properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculated results. In addition, normalizing the three properties by rank order and unifying the quantified property results can magnify or shrink the relative characteristics of the different representative contaminants.
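
    The paper names the analytic hierarchy process (AHP) as the weighting tool. A minimal AHP sketch follows, assuming an invented pairwise-comparison matrix for three contaminant properties; the principal-eigenvector weights and the Saaty consistency ratio are standard AHP steps, but the matrix values are not the paper's.

    # Minimal AHP sketch: priority weights from a pairwise-comparison matrix.
    import numpy as np

    # Saaty-style comparison of three properties (e.g., toxicity, mobility,
    # emission quantity): A[i, j] = importance of property i relative to j.
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalized priority weights

    # Consistency check: CI = (lambda_max - n) / (n - 1), vs a random index
    n = A.shape[0]
    ci = (eigvals.real[k] - n) / (n - 1)
    cr = ci / 0.58                              # RI = 0.58 for n = 3 (Saaty)
    print(f"weights = {np.round(w, 3)}, consistency ratio = {cr:.3f}")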

  15. Sunlight exposure assessment: can we accurately assess vitamin D exposure from sunlight questionnaires?

    PubMed

    McCarty, Catherine A

    2008-04-01

    The purpose of this review is to summarize the peer-reviewed literature in relation to sunlight exposure assessment and the validity of using sunlight exposure questionnaires to quantify vitamin D status. There is greater variability in personal ultraviolet (UV) light exposure as the result of personal behavior than as the result of ambient UV light exposure. Although statistically significant, the correlation coefficients for the relation between personal report of sun exposure and ambient UV light measured by dosimetry (assessment of radiation dose) are relatively low. Moreover, the few studies to assess the relation between sunlight measures and serum 25-hydroxyvitamin D show low correlations. These low correlations may not be surprising given that personal factors like melanin content in skin and age also influence cutaneous synthesis of vitamin D. In summary, sunlight exposure questionnaires currently provide imprecise estimates of vitamin D status. Research should be directed to develop more objective, nonintrusive, and economical measures of sunlight exposure to quantify personal vitamin D status.

  16. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This will help determine what useful structural quantitative and qualitative data may be provided, from raw materials to vehicle refurbishment. The assessment considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and are presented along with a description of the structures or standards from which the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations are provided. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic methods. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end item structure, and refurbishment operations.

  17. Status and future of Quantitative Microbiological Risk Assessment in China.

    PubMed

    Dong, Q L; Barker, G C; Gorris, L G M; Tian, M S; Song, X Y; Malakar, P K

    2015-03-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009, the use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria that cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, the effectiveness of microbial risk assessment utility for risk management decision making, and the application of QMRA to establish appropriate Food Safety Objectives.

  18. Status and future of Quantitative Microbiological Risk Assessment in China

    PubMed Central

    Dong, Q.L.; Barker, G.C.; Gorris, L.G.M.; Tian, M.S.; Song, X.Y.; Malakar, P.K.

    2015-01-01

    Since the implementation of the Food Safety Law of the People's Republic of China in 2009, the use of Quantitative Microbiological Risk Assessment (QMRA) has increased. QMRA is used to assess the risk posed to consumers by pathogenic bacteria that cause the majority of foodborne outbreaks in China. This review analyses the progress of QMRA research in China from 2000 to 2013 and discusses 3 possible improvements for the future. These improvements include planning and scoping to initiate QMRA, the effectiveness of microbial risk assessment utility for risk management decision making, and the application of QMRA to establish appropriate Food Safety Objectives. PMID:26089594

  19. Home Circadian Phase Assessments with Measures of Compliance Yield Accurate Dim Light Melatonin Onsets

    PubMed Central

    Burgess, Helen J.; Wyatt, James K.; Park, Margaret; Fogg, Louis F.

    2015-01-01

    Study Objectives: There is a need for the accurate assessment of circadian phase outside of the clinic/laboratory, particularly with the gold standard dim light melatonin onset (DLMO). We tested a novel kit designed to assist in saliva sampling at home for later determination of the DLMO. The home kit includes objective measures of compliance to the requirements for dim light and half-hourly saliva sampling. Design: Participants were randomized to one of two 10-day protocols. Each protocol consisted of two back-to-back home and laboratory phase assessments in counterbalanced order, separated by a 5-day break. Setting: Laboratory or participants' homes. Participants: Thirty-five healthy adults, age 21–62 y. Interventions: N/A. Measurements and Results: Most participants received at least one 30-sec epoch of light > 50 lux during the home phase assessments (average light intensity 4.5 lux), but on average for < 9 min of the required 8.5 h. Most participants collected every saliva sample within 5 min of the scheduled time. Ninety-two percent of home DLMOs were not affected by light > 50 lux or sampling errors. There was no significant difference between the home and laboratory DLMOs (P > 0.05); on average the home DLMOs occurred 9.6 min before the laboratory DLMOs. The home DLMOs were highly correlated with the laboratory DLMOs (r = 0.91, P < 0.001). Conclusions: Participants were reasonably compliant to the home phase assessment procedures. The good agreement between the home and laboratory dim light melatonin onsets (DLMOs) demonstrates that including objective measures of light exposure and sample timing during home saliva sampling can lead to accurate home DLMOs. Clinical Trial Registration: Circadian Phase Assessments at Home, http://clinicaltrials.gov/show/NCT01487252, NCT01487252. Citation: Burgess HJ, Wyatt JK, Park M, Fogg LF. Home circadian phase assessments with measures of compliance yield accurate dim light melatonin onsets. SLEEP 2015;38(6):889–897
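
    For readers unfamiliar with how a DLMO is computed from half-hourly samples, the sketch below finds the first upward threshold crossing by linear interpolation. The 3 pg/mL threshold and the melatonin profile are illustrative assumptions; the study's exact criterion is not restated in the abstract.

    # Hedged sketch: dim light melatonin onset (DLMO) as the interpolated
    # first upward crossing of a fixed salivary melatonin threshold.
    import numpy as np

    times = np.arange(19.0, 24.0, 0.5)                      # sampling clock times (h)
    melatonin = np.array([0.8, 1.1, 1.4, 2.1, 2.6, 3.9,     # pg/mL (invented)
                          6.5, 9.8, 14.0, 18.2])

    def dlmo(times, conc, threshold=3.0):
        """First upward crossing of the threshold, linearly interpolated."""
        for i in range(1, len(conc)):
            if conc[i - 1] < threshold <= conc[i]:
                frac = (threshold - conc[i - 1]) / (conc[i] - conc[i - 1])
                return times[i - 1] + frac * (times[i] - times[i - 1])
        return None   # no crossing found within the sampling window

    print(f"DLMO = {dlmo(times, melatonin):.2f} h")  # ~21.15 h, i.e. about 21:09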

  20. Accurate assessment of Congo basin forest carbon stocks requires forest type specific assessments

    NASA Astrophysics Data System (ADS)

    Moonen, Pieter C. J.; Van Ballaert, Siege; Verbist, Bruno; Boyemba, Faustin; Muys, Bart

    2014-05-01

    carbon stocks despite poorer physical and chemical soil properties. Soil organic carbon stocks (0-100cm) did not significantly differ between forest types and were estimated at 109 ± 35 Mg C ha-1. Our results confirm recent findings of significantly lower carbon stocks in the Central Congo Basin as compared to the outer regions and of the importance of local tree height-diameter relationships for accurate carbon stock estimations.

  1. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve, among other purposes, as an initial assessment of dermal exposure, resulting in a ranking of tasks and subsequently of jobs. DREAM consists of an inventory part and an evaluation part. Two examples of dermal exposure of workers at a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on the skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where, and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine the most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative dermal exposure assessment.

  2. Quantitative assessment of regional right ventricular function with color kinesis.

    PubMed

    Vignon, P; Weinert, L; Mor-Avi, V; Spencer, K T; Bednarz, J; Lang, R M

    1999-06-01

    We used color kinesis, a recent echocardiographic technique that provides regional information on the magnitude and timing of endocardial wall motion, to quantitatively assess regional right ventricular (RV) systolic and diastolic properties in 76 subjects who were divided into five groups, as follows: normal (n = 20), heart failure (n = 15), pressure/volume overload (n = 14), pressure overload (n = 12), and RV hypertrophy (n = 15). Quantitative segmental analysis of color kinesis images was used to obtain regional fractional area change (RFAC), which was displayed in the form of stacked histograms to determine patterns of endocardial wall motion. Time curves of integrated RFAC were used to objectively identify asynchrony of diastolic endocardial motion. When compared with normal subjects, patients with pressure overload or heart failure exhibited significantly decreased endocardial motion along the RV free wall. In the presence of mixed pressure/volume overload, the markedly increased ventricular septal motion compensated for decreased RV free wall motion. Diastolic endocardial wall motion was delayed in 17 of 72 segments (24%) in patients with RV pressure overload, and in 31 of 90 segments (34%) in patients with RV hypertrophy. Asynchrony of diastolic endocardial wall motion was greater in the latter group than in normal subjects (16% versus 10%: p < 0.01). Segmental analysis of color kinesis images allows quantitative assessment of regional RV systolic and diastolic properties.
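
    A hedged sketch of the regional fractional area change (RFAC) arithmetic follows, assuming RFAC is the systolic change in segmental area relative to the end-diastolic segmental area; the definition implemented by the color kinesis software may differ, and the values are invented.

    # Assumed RFAC definition: (end-diastolic - end-systolic) segmental area,
    # as a percentage of the end-diastolic segmental area.
    import numpy as np

    area_ed = np.array([3.2, 2.9, 3.5, 3.1, 2.7, 3.0])   # cm^2, end-diastole (invented)
    area_es = np.array([2.1, 2.0, 2.6, 2.3, 2.2, 2.4])   # cm^2, end-systole (invented)

    rfac = (area_ed - area_es) / area_ed * 100.0          # percent, one value per segment
    print(np.round(rfac, 1))   # input for stacked histograms of endocardial motion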

  3. Numerical system utilising a Monte Carlo calculation method for accurate dose assessment in radiation accidents.

    PubMed

    Takahashi, F; Endo, A

    2007-01-01

    A system utilising radiation transport codes has been developed to derive accurate dose distributions in a human body for radiological accidents. A suitable model is quite essential for a numerical analysis. Therefore, two tools were developed to set up a 'problem-dependent' input file, defining a radiation source and an exposed person, to simulate the radiation transport in an accident with the Monte Carlo calculation codes MCNP and MCNPX. For both tools, the necessary resources are defined through a dialogue method on an ordinary personal computer. The tools prepare human body and source models described in the input file format of the employed Monte Carlo codes. The tools were validated for dose assessment by comparison with a past criticality accident and a hypothesized exposure.

  4. Long maximal incremental tests accurately assess aerobic fitness in class II and III obese men.

    PubMed

    Lanzi, Stefano; Codecasa, Franco; Cornacchia, Mauro; Maestrini, Sabrina; Capodaglio, Paolo; Brunani, Amelia; Fanari, Paolo; Salvadori, Alberto; Malatesta, Davide

    2015-01-01

    This study aimed to compare two maximal incremental tests of different durations [a maximal incremental ramp test with a short duration (8-12 min) (STest) and a maximal incremental test with a longer duration (20-25 min) (LTest)] to investigate whether an LTest accurately assesses aerobic fitness in class II and III obese men. Twenty obese men (BMI ≥ 35 kg/m2) without secondary pathologies (mean±SE; 36.7±1.9 yr; 41.8±0.7 kg/m2) completed an STest (warm-up: 40 W; increment: 20 W/min) and an LTest [warm-up: 20% of the peak power output (PPO) reached during the STest; increment: 10% PPO every 5 min until 70% PPO was reached or until the respiratory exchange ratio reached 1.0, followed by 15 W/min until exhaustion] on a cycle-ergometer to assess the peak oxygen uptake (VO2peak) and peak heart rate (HRpeak) of each test. There were no significant differences in VO2peak (STest: 3.1±0.1 L/min; LTest: 3.0±0.1 L/min) or HRpeak (STest: 174±4 bpm; LTest: 173±4 bpm) between the two tests. Bland-Altman plot analyses showed good agreement, and Pearson product-moment and intra-class correlation coefficients showed a strong correlation between VO2peak (r=0.81 for both; p≤0.001) and HRpeak (r=0.95 for both; p≤0.001) values during both tests. VO2peak and HRpeak assessments were not compromised by test duration in class II and III obese men. Therefore, we suggest that the LTest is a feasible test that accurately assesses aerobic fitness and may allow for the exercise intensity prescription and individualization that will lead to improved therapeutic approaches in treating obesity and severe obesity.
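
    The LTest staging described above is simple arithmetic on the STest peak power output (PPO); the sketch below generates the schedule for an illustrative PPO, assuming a 5-min warm-up block (the warm-up duration is not stated in the abstract).

    # Generate the LTest stage schedule from an STest PPO (warm-up duration assumed).
    def ltest_stages(ppo_watts):
        """Return (start_min, workload) pairs for the long incremental test."""
        stages = [(0, f"{round(0.20 * ppo_watts)} W (warm-up)")]
        t = 5
        for pct in (0.30, 0.40, 0.50, 0.60, 0.70):     # +10% PPO every 5 min
            stages.append((t, f"{round(pct * ppo_watts)} W"))
            t += 5
        # beyond 70% PPO (or RER = 1.0): +15 W per minute until exhaustion
        stages.append((t, f"{round(0.70 * ppo_watts)} W + 15 W/min to exhaustion"))
        return stages

    for start, workload in ltest_stages(240):          # 240 W PPO is illustrative
        print(f"t={start:>2} min: {workload}")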

  5. Quantitative CT: technique dependency of volume assessment for pulmonary nodules

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Richard, Samuel; Barnhart, Huiman; Colsher, James; Amurao, Maxwell; Samei, Ehsan

    2010-04-01

    Current lung nodule size assessment methods typically rely on one-dimensional estimation of lesions. While new 3D volume assessment techniques using MSCT scan data have enabled improved estimation of lesion size, the effect of acquisition and reconstruction parameters on accuracy and precision of such estimation has not been adequately investigated. To characterize such dependencies, we scanned an anthropomorphic thoracic phantom containing synthetic nodules with different protocols, including various acquisition and reconstruction parameters. We also scanned the phantom repeatedly with the same protocol to investigate repeatability. The nodule's volume was estimated by a clinical lung analysis software package, LungVCAR. Accuracy (bias) and precision (variance) of the volume assessment were calculated across the nodules and compared between protocols via Generalized Estimating Equation analysis. Results suggest a strong dependence of accuracy and precision on dose level but little dependence on reconstruction thickness, thus providing possible guidelines for protocol optimization for quantitative tasks.

  6. A new noninvasive method for the accurate and precise assessment of varicose vein diameters.

    PubMed

    Baldassarre, Damiano; Pustina, Linda; Castelnuovo, Samuela; Bondioli, Alighiero; Carlà, Matteo; Sirtori, Cesare R

    2003-01-01

    The feasibility and reproducibility of a new ultrasonic method for the direct assessment of maximal varicose vein diameter (VVD) were evaluated. A study was also performed to demonstrate the capacity of the method to detect changes in venous diameter induced by a pharmacologic treatment. Patients with varicose vein disease were recruited. A method that allows the precise positioning of patient and transducer and performance of scans in a gel-bath was developed. Maximal VVD was recorded both in the standing and supine positions. The intra-assay reproducibility was determined by replicate scans made within 15 minutes in both positions. The interobserver variability was assessed by comparing VVDs measured during the first phase baseline examination with those obtained during baseline examinations in the second phase of the study. The error in reproducibility of VVD determinations was 5.3% when diameters were evaluated in the standing position and 6.4% when assessed in the supine position. The intramethod agreement was high, with a bias between readings of 0.06 ± 0.18 mm and of -0.02 ± 0.19 mm, respectively, in standing and supine positions. Correlation coefficients were better than 0.99 in both positions. The method appears to be sensitive enough to detect small changes in VVDs induced by treatments. The proposed technique provides a potentially valuable tool for the detection and in vivo monitoring of VVD changes in patients with varicose vein disease. The method offers an innovative approach to obtain a quantitative assessment of varicose vein progression and of treatment effects, thus providing a basis for epidemiologic surveys.

  7. Quantitative risk assessment in aerospace: Evolution from the nuclear industry

    SciTech Connect

    Frank, M.V.

    1996-12-31

    In 1987, the National Aeronautics and Space Administration (NASA) and the aerospace industry relied on failure mode and effects analysis (FMEA) and hazards analysis as the primary tools for safety and reliability of their systems. The FMEAs were reviewed to provide critical items using a set of qualitative criteria. Hazards and critical items judged the worst, by a qualitative method, were to be either eliminated by a design change or controlled by the addition of a safeguard. However, it is frequently the case that limitations of space, weight, technical feasibility, and cost left critical items and hazards unable to be eliminated or controlled. In these situations, program management accepted the risk. How much risk was being accepted was unknown because quantitative risk assessment methods were not used. Perhaps the greatest contribution of the nuclear industry to NASA and the aerospace industry was the introduction of modern (i.e., post-WASH-1400) quantitative risk assessment concepts and techniques. The concepts of risk assessment that have been most useful in the aerospace industry are the following: 1. combination of accident sequence diagrams, event trees, and fault trees to model scenarios and their causative factors; 2. use of Bayesian analysis of system and component failure data; 3. evaluation and presentation of uncertainties in the risk estimates.
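
    As a minimal illustration of the fault-tree arithmetic referenced in point 1, the sketch below combines invented basic-event probabilities through independent AND/OR gates to a top event; it is a toy example, not a model of any NASA system.

    # Toy fault tree: basic events -> gates -> top-event probability.
    from functools import reduce

    def gate_and(*p):                 # all inputs must fail (independence assumed)
        return reduce(lambda a, b: a * b, p)

    def gate_or(*p):                  # at least one input fails
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), p, 1.0)

    p_valve = 1e-3                    # valve fails to close (invented)
    p_sensor = 5e-4                   # sensor misses the demand (invented)
    p_backup = 1e-2                   # backup system unavailable (invented)

    # Top event: (valve failure OR sensor failure) AND backup unavailable
    p_top = gate_and(gate_or(p_valve, p_sensor), p_backup)
    print(f"top-event probability ~ {p_top:.2e}")   # ~1.50e-05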

  8. Direct, quantitative clinical assessment of hand function: usefulness and reproducibility.

    PubMed

    Goodson, Alexander; McGregor, Alison H; Douglas, Jane; Taylor, Peter

    2007-05-01

    Methods of assessing functional impairment in arthritic hands include pain assessments and disability scoring scales, which are subjective, variable over time, and fail to take account of patients' need to adapt to deformities. The aim of this study was to evaluate measures of functional strength and joint motion in the assessment of the rheumatoid (RA) and osteoarthritic (OA) hand. Ten control subjects, ten RA and ten OA patients were recruited for the study. All underwent pain and disability scoring and functional assessment of the hand using measures of pinch/grip strength and range of joint motion (ROM). Functional assessments including ROM analyses at interphalangeal (IP), metacarpophalangeal (MCP) and wrist joints along with pinch/grip strength clearly discriminated between patient groups (RA vs OA MCP ROM, P<0.0001), whereas pain and disability scales could not. In the RA group there were demonstrable relationships between ROM measurements and disability (R2=0.31) as well as disease duration (R2=0.37). Intra-patient measures of strength were robust whereas inter-patient comparisons showed variability. In conclusion, pinch/grip strength and ROM are clinically reproducible assessments that may more accurately reflect the functional impairment associated with arthritis.

  9. Quantitative risk assessment for skin sensitization: Success or failure?

    PubMed

    Kimber, Ian; Gerberick, G Frank; Basketter, David A

    2017-02-01

    Skin sensitization is unique in the world of toxicology. There is a combination of reliable, validated predictive test methods for identification of skin sensitizing chemicals, a clearly documented and transparent approach to risk assessment, and effective feedback from dermatology clinics around the world delivering evidence of the success or failure of the hazard identification/risk assessment/management process. Recent epidemics of contact allergy, particularly to preservatives, have raised questions of whether the safety/risk assessment process is working in an optimal manner (or indeed is working at all!). This review has as its focus skin sensitization quantitative risk assessment (QRA). The core toxicological principles of QRA are reviewed, and evidence of use and misuse examined. What becomes clear is that skin sensitization QRA will only function adequately if two essential criteria are met. The first is that QRA is applied rigorously, and the second is that potential exposure to the sensitizing substance is assessed adequately. This conclusion will come as no surprise to any toxicologist who appreciates the basic premise that "risk = hazard × exposure". Accordingly, use of skin sensitization QRA is encouraged, not least because the essential feedback from dermatology clinics can be used as a tool to refine QRA in situations where this risk assessment tool has not been properly used.
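
    The core arithmetic of skin sensitization QRA can be sketched as follows: an acceptable exposure level (AEL) is derived by dividing the no-expected-sensitization induction level (NESIL) by sensitization assessment factors (SAFs), and then compared with the consumer exposure level (CEL). The numbers and the SAF split below are illustrative assumptions, not values from the review.

    # Hedged QRA sketch: AEL = NESIL / (product of SAFs), then compare to CEL.
    def acceptable_exposure_level(nesil_ug_cm2, saf_interindividual=10,
                                  saf_matrix=1, saf_use=1):
        """AEL from the NESIL and three sensitization assessment factors."""
        return nesil_ug_cm2 / (saf_interindividual * saf_matrix * saf_use)

    nesil = 2500.0          # ug/cm^2/day from predictive testing (invented)
    cel = 50.0              # ug/cm^2/day estimated consumer exposure (invented)

    ael = acceptable_exposure_level(nesil, saf_interindividual=10,
                                    saf_matrix=3, saf_use=3)
    print(f"AEL = {ael:.1f} ug/cm2/day; acceptable = {cel <= ael}")  # 27.8; False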

  10. Can a quantitative simulation of an Otto engine be accurately rendered by a simple Novikov model with heat leak?

    NASA Astrophysics Data System (ADS)

    Fischer, A.; Hoffmann, K.-H.

    2004-03-01

    In this case study a complex Otto engine simulation provides data including, but not limited to, effects from losses due to heat conduction, exhaust losses and frictional losses. This data is used as a benchmark to test whether the Novikov engine with heat leak, a simple endoreversible model, can reproduce the complex engine behavior quantitatively by an appropriate choice of model parameters. The reproduction obtained proves to be of high quality.
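
    A minimal sketch of the Novikov model with heat leak follows: heat enters the reversible core through a finite thermal conductance, while a bypass conductance leaks heat directly to the environment. The parameter values are illustrative; the paper's fitted parameters are not restated in the abstract.

    # Endoreversible Novikov engine with a heat leak (illustrative parameters).
    import numpy as np

    def novikov_with_leak(T_hot, T_cold, K, K_leak, Ti):
        """Return (power, overall efficiency) at intermediate temperature Ti."""
        q_in = K * (T_hot - Ti)                  # heat through the hot-side resistance
        power = q_in * (1.0 - T_cold / Ti)       # reversible core between Ti and T_cold
        q_leak = K_leak * (T_hot - T_cold)       # loss that bypasses the engine
        return power, power / (q_in + q_leak)

    T_hot, T_cold, K, K_leak = 2000.0, 300.0, 2.0, 0.1   # K, K, kW/K, kW/K (invented)
    Ti_grid = np.linspace(T_cold + 1.0, T_hot - 1.0, 2000)
    results = [novikov_with_leak(T_hot, T_cold, K, K_leak, Ti) for Ti in Ti_grid]
    p_best, eta_at_pmax = max(results, key=lambda r: r[0])
    print(f"max power = {p_best:.0f} kW at overall efficiency {eta_at_pmax:.2f}")
    # At maximum power the core efficiency approaches 1 - sqrt(T_cold/T_hot).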

  11. Short Course Introduction to Quantitative Mineral Resource Assessments

    USGS Publications Warehouse

    Singer, Donald A.

    2007-01-01

    This is an abbreviated text supplementing the content of three sets of slides used in a short course that has been presented by the author at several workshops. The slides should be viewed in the order of (1) Introduction and models, (2) Delineation and estimation, and (3) Combining estimates and summary. References cited in the slides are listed at the end of this text. The purpose of the three-part form of mineral resource assessments discussed in the accompanying slides is to make unbiased quantitative assessments in a format needed in decision-support systems so that consequences of alternative courses of action can be examined. The three-part form of mineral resource assessments was developed to assist policy makers evaluate the consequences of alternative courses of action with respect to land use and mineral-resource development. The audience for three-part assessments is a governmental or industrial policy maker, a manager of exploration, a planner of regional development, or similar decision-maker. Some of the tools and models presented here will be useful for selection of exploration sites, but that is a side benefit, not the goal. To provide unbiased information, we recommend the three-part form of mineral resource assessments where general locations of undiscovered deposits are delineated from a deposit type's geologic setting, frequency distributions of tonnages and grades of well-explored deposits serve as models of grades and tonnages of undiscovered deposits, and number of undiscovered deposits are estimated probabilistically by type. The internally consistent descriptive, grade and tonnage, deposit density, and economic models used in the design of the three-part form of assessments reduce the chances of biased estimates of the undiscovered resources. What and why quantitative resource assessments: The kind of assessment recommended here is founded in decision analysis in order to provide a framework for making decisions concerning mineral

  12. Improved Surgical Site Infection (SSI) rate through accurately assessed surgical wounds.

    PubMed

    John, Honeymol; Nimeri, Abdelrahman; Ellahham, Samer

    2015-01-01

    assignment was 36%, and the worst rates were in appendectomies (97%). Over time our incorrect wound classification decreased to 22%, while at the same time our actual SSI wound occurrences decreased from an average of six to three per month and our odds ratio of SSI in the department also decreased. We followed the best practice guidelines of the ACS NSQIP. Accurate assessment of wound classification is necessary to make sure the expected SSI rates are not falsely high if wounds are under-classified. The present study shows that accurate wound classification in contaminated and dirty wounds can lead to a lower odds ratio of SSI.

  13. Improved Surgical Site Infection (SSI) rate through accurately assessed surgical wounds

    PubMed Central

    John, Honeymol; Nimeri, Abdelrahman; Ellahham, Samer

    2015-01-01

    assignment was 36%, and the worst rates were in appendectomies (97%). Over time our incorrect wound classification decreased to 22%, while at the same time our actual SSI wound occurrences decreased from an average of six to three per month and our odds ratio of SSI in the department also decreased. We followed the best practice guidelines of the ACS NSQIP. Accurate assessment of wound classification is necessary to make sure the expected SSI rates are not falsely high if wounds are under-classified. The present study shows that accurate wound classification in contaminated and dirty wounds can lead to a lower odds ratio of SSI. PMID:26734358

  14. A novel method for accurate collagen and biochemical assessment of pulmonary tissue utilizing one animal

    PubMed Central

    Kliment, Corrine R; Englert, Judson M; Crum, Lauren P; Oury, Tim D

    2011-01-01

    Aim: The purpose of this study was to develop an improved method for collagen and protein assessment of fibrotic lungs while decreasing animal use. Methods: 8-10-week-old male C57BL/6 mice were given a single intratracheal instillation of crocidolite asbestos or control titanium dioxide. Lungs were collected on day 14 and dried as whole lung, or homogenized in CHAPS buffer, for hydroxyproline analysis. Insoluble and salt-soluble collagen content was also determined in lung homogenates using a modified Sirius red colorimetric 96-well plate assay. Results: The hydroxyproline assay showed significant increases in collagen content in the lungs of asbestos-treated mice. Identical results were obtained whether collagen content was determined on dried whole lung or on whole lung homogenates. The Sirius red plate assay showed a significant increase in collagen content in lung homogenates; however, this assay grossly overestimated the total amount of collagen and underestimated the changes between control and fibrotic lungs. Conclusions: The proposed method provides accurate quantification of collagen content in whole lungs and additional homogenate samples for biochemical analysis from a single animal. The Sirius red colorimetric plate assay provides a complementary method for determining relative changes in lung collagen, but its values tend to overestimate the absolute values obtained by the gold-standard hydroxyproline assay and underestimate the overall fibrotic injury. PMID:21577320
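
    For orientation, hydroxyproline measurements are commonly converted to total collagen by assuming collagen is roughly 13.4% hydroxyproline by mass (a factor near 7.46); the sketch below applies that conversion. The factor and the input value are assumptions for illustration, not the paper's own numbers.

    # Hedged arithmetic sketch: hydroxyproline -> collagen conversion.
    HYP_TO_COLLAGEN = 7.46          # ug collagen per ug hydroxyproline (assumed factor)

    def collagen_from_hyp(hyp_ug, factor=HYP_TO_COLLAGEN):
        """Estimate total collagen mass from a hydroxyproline measurement."""
        return hyp_ug * factor

    print(f"{collagen_from_hyp(150.0):.0f} ug collagen per lung")  # 1119 ug (illustrative)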

  15. Algal productivity modeling: a step toward accurate assessments of full-scale algal cultivation.

    PubMed

    Béchet, Quentin; Chambonnière, Paul; Shilton, Andy; Guizard, Guillaume; Guieysse, Benoit

    2015-05-01

    A new biomass productivity model was parameterized for Chlorella vulgaris using short-term (<30 min) oxygen productivities from algal microcosms exposed to 6 light intensities (20-420 W/m(2)) and 6 temperatures (5-42 °C). The model was then validated against experimental biomass productivities recorded in bench-scale photobioreactors operated under 4 light intensities (30.6-74.3 W/m(2)) and 4 temperatures (10-30 °C), yielding an accuracy of ± 15% over 163 days of cultivation. This modeling approach addresses major challenges associated with the accurate prediction of algal productivity at full-scale. Firstly, while most prior modeling approaches have only considered the impact of light intensity on algal productivity, the model herein validated also accounts for the critical impact of temperature. Secondly, this study validates a theoretical approach to convert short-term oxygen productivities into long-term biomass productivities. Thirdly, the experimental methodology used has the practical advantage of only requiring one day of experimental work for complete model parameterization. The validation of this new modeling approach is therefore an important step for refining feasibility assessments of algae biotechnologies.
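
    The abstract does not give the model's functional form. As a hedged illustration of a coupled light-temperature productivity surface, the sketch below combines a Steele photoinhibition curve with a cardinal-temperature (CTMI) factor; both forms and all parameters are assumptions, not the authors' fitted model.

    # Generic light x temperature productivity surface (illustrative forms).
    import numpy as np

    def light_factor(I, I_opt=250.0):
        """Steele photoinhibition curve; peaks at I_opt (W/m^2)."""
        return (I / I_opt) * np.exp(1.0 - I / I_opt)

    def temperature_factor(T, T_min=2.0, T_opt=30.0, T_max=42.0):
        """Cardinal temperature model with inflexion (CTMI), scaled 0..1."""
        if T <= T_min or T >= T_max:
            return 0.0
        num = (T - T_max) * (T - T_min) ** 2
        den = (T_opt - T_min) * ((T_opt - T_min) * (T - T_opt)
                                 - (T_opt - T_max) * (T_opt + T_min - 2.0 * T))
        return num / den

    def productivity(I, T, P_max=1.2):     # g dry biomass / (m^2 h), invented scale
        return P_max * light_factor(I) * temperature_factor(T)

    print(f"{productivity(I=200.0, T=25.0):.3f} g/m2/h")   # ~1.06 under these assumptions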

  16. Quantitative polymerase chain reaction analysis of DNA from noninvasive samples for accurate microsatellite genotyping of wild chimpanzees (Pan troglodytes verus).

    PubMed

    Morin, P A; Chambers, K E; Boesch, C; Vigilant, L

    2001-07-01

    Noninvasive samples are useful for molecular genetic analyses of wild animal populations. However, the low DNA content of such samples makes DNA amplification difficult, and there is the potential for erroneous results when one of two alleles at heterozygous microsatellite loci fails to be amplified. In this study we describe an assay designed to measure the amount of amplifiable nuclear DNA in low-DNA-concentration extracts from noninvasive samples. We describe the range of DNA amounts obtained from chimpanzee faeces and shed hair samples and formulate a new, efficient approach for accurate microsatellite genotyping. Prescreening of extracts for DNA quantity is recommended for sorting samples by likely success and reliability. Extensive repetition remains necessary for microsatellite amplifications starting from low amounts of DNA, but can be reduced for extracts with higher DNA content.

  17. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    PubMed Central

    2010-01-01

    Background Normalizing through reference genes, or housekeeping genes, can produce more accurate and reliable results from reverse transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, the selection of suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out in plants. Therefore, qPCR studies on important crops such as cotton have been hampered by the lack of suitable reference genes. Results By the use of two distinct algorithms, implemented by geNorm and NormFinder, we have assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development and the floral verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene
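
    The geNorm stability measure used above is well defined: for each candidate gene, M is the average standard deviation of the log2 expression ratios against every other candidate across samples, with lower M indicating a more stable reference. The sketch below implements that calculation on invented data.

    # geNorm M-value sketch on invented relative-expression data.
    import numpy as np

    def genorm_m(expr):
        """expr: (n_samples, n_genes) relative expression. Returns M per gene."""
        n_genes = expr.shape[1]
        log_expr = np.log2(expr)
        m = np.empty(n_genes)
        for j in range(n_genes):
            sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                   for k in range(n_genes) if k != j]
            m[j] = np.mean(sds)                # mean pairwise-ratio variability
        return m

    rng = np.random.default_rng(0)
    expr = rng.lognormal(mean=0.0, sigma=0.3, size=(23, 4))   # 23 samples, 4 genes
    print(np.round(genorm_m(expr), 3))   # lowest M values = best reference genes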

  18. Infectious titres of sheep scrapie and bovine spongiform encephalopathy agents cannot be accurately predicted from quantitative laboratory test results.

    PubMed

    González, Lorenzo; Thorne, Leigh; Jeffrey, Martin; Martin, Stuart; Spiropoulos, John; Beck, Katy E; Lockey, Richard W; Vickery, Christopher M; Holder, Thomas; Terry, Linda

    2012-11-01

    It is widely accepted that abnormal forms of the prion protein (PrP) are the best surrogate marker for the infectious agent of prion diseases and, in practice, the detection of such disease-associated (PrP(d)) and/or protease-resistant (PrP(res)) forms of PrP is the cornerstone of diagnosis and surveillance of the transmissible spongiform encephalopathies (TSEs). Nevertheless, some studies question the consistent association between infectivity and abnormal PrP detection. To address this discrepancy, 11 brain samples of sheep affected with natural scrapie or experimental bovine spongiform encephalopathy were selected on the basis of the magnitude and predominant types of PrP(d) accumulation, as shown by immunohistochemical (IHC) examination; contra-lateral hemi-brain samples were inoculated at three different dilutions into transgenic mice overexpressing ovine PrP and were also subjected to quantitative analysis by three biochemical tests (BCTs). Six samples gave 'low' infectious titres (10^6.5 to 10^6.7 LD50 g^-1) and five gave 'high' titres (10^8.1 to ≥ 10^8.7 LD50 g^-1) and, with the exception of the Western blot analysis, those two groups tended to correspond with samples with lower and higher PrP(d)/PrP(res) results by IHC/BCTs, respectively. However, no statistical association could be confirmed due to high individual sample variability. It is concluded that although detection of abnormal forms of PrP by laboratory methods remains useful to confirm TSE infection, infectivity titres cannot be predicted from quantitative test results, at least for the TSE sources and host PRNP genotypes used in this study. Furthermore, the near inverse correlation between infectious titres and Western blot results (high protease pre-treatment) argues for a dissociation between infectivity and PrP(res).
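
    The abstract reports endpoint titres (LD50 per gram) from mouse bioassay but does not restate the estimator. A standard choice for endpoint titration is the Spearman-Kaerber method, sketched below on invented dilution data; whether the study used this exact estimator is an assumption.

    # Spearman-Kaerber 50% endpoint from serial-dilution response proportions.
    def spearman_kaerber(log10_dilutions, prop_responding):
        """Mean 50% endpoint (log10 dilution); proportions must fall from 1 to 0."""
        m = 0.0
        for i in range(len(log10_dilutions) - 1):
            mid = 0.5 * (log10_dilutions[i] + log10_dilutions[i + 1])
            m += mid * (prop_responding[i] - prop_responding[i + 1])
        return m

    dils = [1, 2, 3, 4, 5, 6, 7, 8]                 # 10^-1 .. 10^-8 dilutions
    p = [1.0, 1.0, 1.0, 1.0, 0.8, 0.4, 0.1, 0.0]    # fraction of mice affected (invented)

    log_titre = spearman_kaerber(dils, p)           # per inoculated dose
    print(f"endpoint = 10^-{log_titre:.1f}; titre ~ 10^{log_titre:.1f} LD50/dose")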

  19. Remotely Sensed Quantitative Drought Risk Assessment in Vulnerable Agroecosystems

    NASA Astrophysics Data System (ADS)

    Dalezios, N. R.; Blanta, A.; Spyropoulos, N. V.

    2012-04-01

    Hazard may be defined as a potential threat to humans and their welfare, and risk (or consequence) as the probability of a hazard occurring and creating loss. Drought is considered one of the major natural hazards, with significant impact on agriculture, environment, economy and society. This paper deals with drought risk assessment, which is the first step, designed to find out what the problems are, and comprises three distinct steps, namely risk identification, risk estimation and risk evaluation; risk management is not covered in this paper, and there should be a fourth step to address the need for feedback and to take post-audits of all risk assessment exercises. In particular, quantitative drought risk assessment is attempted by using statistical methods. For the quantification of drought, the Reconnaissance Drought Index (RDI) is employed, which is a new index based on hydrometeorological parameters, such as precipitation and potential evapotranspiration. The remotely sensed estimation of RDI is based on NOAA-AVHRR satellite data for a period of 20 years (1981-2001). The study area is Thessaly, central Greece, which is a drought-prone agricultural region characterized by vulnerable agriculture. Specifically, the undertaken drought risk assessment processes are as follows: 1. Risk identification: this step involves drought quantification and monitoring based on remotely sensed RDI and the extraction of several features such as severity, duration, areal extent, onset and end time; it also involves a drought early warning system based on the above parameters. 2. Risk estimation: this step includes an analysis of drought severity, frequency and their relationships. 3. Risk evaluation: this step covers drought evaluation based on analysis of RDI images before and after each drought episode, which usually lasts one hydrological year (12 months). The results of this three-step drought assessment process are considered quite satisfactory in a drought-prone region such as Thessaly in central
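
    The RDI named above is defined from the ratio of cumulative precipitation to cumulative potential evapotranspiration; its standardized form is commonly computed from the log of that ratio, as sketched below on invented monthly data (the lognormality convention is the usual one, not a detail from the abstract).

    # Standardized Reconnaissance Drought Index from monthly P and PET.
    import numpy as np

    def rdi_standardized(precip, pet):
        """precip, pet: (n_years, n_months) arrays for the chosen period."""
        alpha = precip.sum(axis=1) / pet.sum(axis=1)      # alpha_k per year
        y = np.log(alpha)                                 # ln(alpha), assumed ~normal
        return (y - y.mean()) / y.std(ddof=1)             # RDI_st per year

    rng = np.random.default_rng(1)
    precip = rng.gamma(shape=2.0, scale=25.0, size=(20, 12))   # mm/month, 20 years (invented)
    pet = rng.normal(loc=90.0, scale=10.0, size=(20, 12))      # mm/month (invented)

    rdi = rdi_standardized(precip, pet)
    print(np.round(rdi, 2))   # values below about -1 flag drought years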

  20. Quantitative assessment of Mycoplasma hemadsorption activity by flow cytometry.

    PubMed

    García-Morales, Luis; González-González, Luis; Costa, Manuela; Querol, Enrique; Piñol, Jaume

    2014-01-01

    A number of adherent mycoplasmas have developed highly complex polar structures that are involved in diverse aspects of the biology of these microorganisms and play a key role as virulence factors by promoting adhesion to host cells in the first stages of infection. The attachment activity of mycoplasma cells has traditionally been investigated by determining their hemadsorption ability to red blood cells, and it is a distinctive trait widely examined when characterizing the different mycoplasma species. Although protocols to qualitatively determine the hemadsorption or hemagglutination of mycoplasmas are straightforward, current methods for investigating hemadsorption at the quantitative level are expensive and poorly reproducible. By using flow cytometry, we have developed a procedure to quantify rapidly and accurately the hemadsorption activity of mycoplasmas in the presence of SYBR Green I, a vital fluorochrome that stains nucleic acids, allowing erythrocytes and mycoplasma cells to be resolved by their different size and fluorescence. This method is very reproducible and permits kinetic analysis of the obtained data and a precise hemadsorption quantification based on standard binding parameters such as the dissociation constant (Kd). The procedure we developed could easily be implemented in a standardized assay to test the hemadsorption activity of the growing number of clinical isolates and mutant strains of different mycoplasma species, providing valuable data about the virulence of these microorganisms.

  1. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated for all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  2. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated according to the probability distributions of evacuation time and onset time of untenable conditions. Fire risk to life safety can thus be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the assessment method in detail, a commercial building is presented as a case study, and a discussion compares the assessment result of the case study with fire statistics.
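
    As a minimal sketch of the Markov-chain idea above (the states and transition probabilities are invented for illustration, not taken from the article), the time-dependent occurrence probabilities of fire scenarios can be evolved in discrete steps:

        import numpy as np

        # States: 0 = fire growing, 1 = suppressed by protection systems,
        # 2 = untenable conditions reached; states 1 and 2 are absorbing.
        # Transition probabilities per 1-minute step (hypothetical values).
        P = np.array([
            [0.90, 0.08, 0.02],
            [0.00, 1.00, 0.00],
            [0.00, 0.00, 1.00],
        ])

        state = np.array([1.0, 0.0, 0.0])   # a fire has just ignited
        for minute in range(30):            # evolve the chain for 30 minutes
            state = state @ P

        print("P(suppressed) =", round(state[1], 3))
        print("P(untenable)  =", round(state[2], 3))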

  3. Quantitative Security Risk Assessment and Management for Railway Transportation Infrastructures

    NASA Astrophysics Data System (ADS)

    Flammini, Francesco; Gaglione, Andrea; Mazzocca, Nicola; Pragliola, Concetta

    Scientists have long been investigating procedures, models and tools for risk analysis in several domains, from economics to computer networks. This paper presents a quantitative method and a tool for security risk assessment and management specifically tailored to the context of railway transportation systems, which are exposed to threats ranging from vandalism to terrorism. The method is based on a reference mathematical model and is supported by a specifically developed tool. The tool allows for the management of data, including attributes of attack scenarios and the effectiveness of protection mechanisms, and for the computation of results, including risk and cost/benefit indices. The main focus is on the design of physical protection systems, but the analysis can be extended to logical threats as well. The cost/benefit analysis allows for the evaluation of the return on investment, which is nowadays an important issue for risk analysts.
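
    A toy illustration of the kind of risk and cost/benefit indices such a tool computes follows; the attack scenarios, frequencies and cost figures are invented, and the simple expected-annual-loss formula is an assumption rather than the paper's reference model:

        # scenario: (name, attacks/year, P(success | attack), loss in EUR)
        scenarios = [
            ("vandalism of trackside equipment", 2.0, 0.9, 5e4),
            ("sabotage of signalling",           0.1, 0.5, 2e6),
        ]

        def expected_annual_loss(freq, p_success, loss):
            return freq * p_success * loss

        baseline = sum(expected_annual_loss(f, p, c) for _, f, p, c in scenarios)

        # A protection measure (e.g., CCTV plus fencing) is assumed to halve
        # P(success) for every scenario, at a given annual cost.
        protected = sum(expected_annual_loss(f, p * 0.5, c) for _, f, p, c in scenarios)
        annual_cost = 3e4
        print("risk reduction / cost =", round((baseline - protected) / annual_cost, 2))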

  4. Assessing the Reliability of Quantitative Imaging of Sm-153

    NASA Astrophysics Data System (ADS)

    Poh, Zijie; Dagan, Maáyan; Veldman, Jeanette; Trees, Brad

    2013-03-01

    Samarium-153 is used for palliation of, and has recently been investigated as a therapy for, bone metastases. Patient-specific dosing of Sm-153 is based on quantitative single-photon emission computed tomography (SPECT) and requires knowledge of the accuracy and precision of image-based estimates of the in vivo activity distribution. Physical phantom studies are useful for estimating these in simple objects, but do not model realistic activity distributions. We are using realistic Monte Carlo simulations combined with a digital phantom modeling human anatomy to assess the accuracy and precision of Sm-153 SPECT. Preliminary data indicate that we can simulate projection images and reconstruct them with compensation for various physical image-degrading factors, such as attenuation and scatter in the body as well as non-idealities in the imaging system, to provide realistic SPECT images.

  5. Is photometry an accurate and reliable method to assess boar semen concentration?

    PubMed

    Camus, A; Camugli, S; Lévêque, C; Schmitt, E; Staub, C

    2011-02-01

    Sperm concentration assessment is a key point to ensure an appropriate sperm number per dose in species subjected to artificial insemination (AI). The aim of the present study was to evaluate the accuracy and reliability of two commercially available photometers, AccuCell™ and AccuRead™, pre-calibrated for boar semen, in comparison to UltiMate™ boar version 12.3D, NucleoCounter SP100 and the Thoma hemacytometer. For each type of instrument, concentration was measured on 34 boar semen samples in quadruplicate, and agreement between measurements and instruments was evaluated. Accuracy for both photometers was expressed as the mean percentage difference from the general mean: -0.6% for AccuCell™ and 0.5% for AccuRead™, with no significant differences found among instruments. Repeatability was 1.8% for AccuCell™ and 3.2% for AccuRead™. Differences between instruments were low (confidence interval 3%) except when the hemacytometer was used as the reference. Even though the hemacytometer is considered worldwide as the gold standard, it was the most variable instrument (confidence interval 7.1%). The conclusion is that routine photometric measures of raw semen concentration are reliable, accurate and precise using AccuRead™ or AccuCell™. There are multiple steps in semen processing that can induce sperm loss and therefore increase differences between theoretical and real sperm numbers in doses. Potential biases that depend on the workflow but not on the initial photometric measure of semen concentration are discussed.

  6. Quantitative Assessment of Eye Phenotypes for Functional Genetic Studies Using Drosophila melanogaster

    PubMed Central

    Iyer, Janani; Wang, Qingyu; Le, Thanh; Pizzo, Lucilla; Grönke, Sebastian; Ambegaokar, Surendra S.; Imai, Yuzuru; Srivastava, Ashutosh; Troisí, Beatriz Llamusí; Mardon, Graeme; Artero, Ruben; Jackson, George R.; Isaacs, Adrian M.; Partridge, Linda; Lu, Bingwei; Kumar, Justin P.; Girirajan, Santhosh

    2016-01-01

    About two-thirds of the vital genes in the Drosophila genome are involved in eye development, making the fly eye an excellent genetic system to study cellular function and development, neurodevelopment/degeneration, and complex diseases such as cancer and diabetes. We developed a novel computational method, implemented as Flynotyper software (http://flynotyper.sourceforge.net), to quantitatively assess the morphological defects in the Drosophila eye resulting from genetic alterations affecting basic cellular and developmental processes. Flynotyper utilizes a series of image processing operations to automatically detect the fly eye and the individual ommatidium, and calculates a phenotypic score as a measure of the disorderliness of ommatidial arrangement in the fly eye. As a proof of principle, we tested our method by analyzing the defects due to eye-specific knockdown of Drosophila orthologs of 12 neurodevelopmental genes to accurately document differential sensitivities of these genes to dosage alteration. We also evaluated eye images from six independent studies assessing the effect of overexpression of repeats, candidates from peptide library screens, and modifiers of neurotoxicity and developmental processes on eye morphology, and show strong concordance with the original assessment. We further demonstrate the utility of this method by analyzing 16 modifiers of sine oculis obtained from two genome-wide deficiency screens of Drosophila and accurately quantifying the effect of its enhancers and suppressors during eye development. Our method will complement existing assays for eye phenotypes, and increase the accuracy of studies that use fly eyes for functional evaluation of genes and genetic interactions. PMID:26994292
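
    The published Flynotyper score is derived from image processing of the detected ommatidia; as a simplified stand-in for the idea of scoring the disorderliness of an ommatidial arrangement (not the actual Flynotyper algorithm), one can measure the variability of nearest-neighbour spacing between ommatidial centroids:

        import numpy as np
        from scipy.spatial import cKDTree

        def disorderliness(centroids):
            """Coefficient of variation of nearest-neighbour distances for
            an (N, 2) array of ommatidial centroids; 0 for a perfect lattice."""
            tree = cKDTree(centroids)
            d, _ = tree.query(centroids, k=2)     # k=1 is each point itself
            nn = d[:, 1]
            return nn.std() / nn.mean()

        # A regular grid scores ~0; a jittered grid (disordered eye) scores higher.
        xx, yy = np.meshgrid(np.arange(20.0), np.arange(20.0))
        grid = np.column_stack([xx.ravel(), yy.ravel()])
        rng = np.random.default_rng(0)
        print(disorderliness(grid))
        print(disorderliness(grid + rng.normal(0, 0.2, grid.shape)))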

  7. Can a Rescuer or Simulated Patient Accurately Assess Motion During Cervical Spine Stabilization Practice Sessions?

    PubMed Central

    Shrier, Ian; Boissy, Patrick; Brière, Simon; Mellette, Jay; Fecteau, Luc; Matheson, Gordon O.; Garza, Daniel; Meeuwisse, Willem H.; Segal, Eli; Boulay, John; Steele, Russell J.

    2012-01-01

    Context: Health care providers must be prepared to manage all potential spine injuries as if they are unstable. Therefore, most sport teams devote resources to training for sideline cervical spine (C-spine) emergencies. Objective: To determine (1) how accurately rescuers and simulated patients can assess motion during C-spine stabilization practice and (2) whether providing performance feedback to rescuers influences their choice of stabilization technique. Design: Crossover study. Setting: Training studio. Patients or Other Participants: Athletic trainers, athletic therapists, and physiotherapists experienced at managing suspected C-spine injuries. Intervention(s): Twelve lead rescuers (at the patient's head) performed both the head-squeeze and trap-squeeze C-spine stabilization maneuvers during 4 test scenarios: lift-and-slide and log-roll placement on a spine board and confused patient trying to sit up or rotate the head. Main Outcome Measure(s): Interrater reliability between rescuer and simulated patient quality scores for subjective evaluation of C-spine stabilization during trials (0 = best, 10 = worst), correlation between rescuers' quality scores and objective measures of motion with inertial measurement units, and frequency of change in preference for the head-squeeze versus trap-squeeze maneuver. Results: Although the weighted κ value for interrater reliability was acceptable (0.71–0.74), scores varied by 2 points or more between rescuers and simulated patients for approximately 10% to 15% of trials. Rescuers' scores correlated with objective measures, but variability was large: 38% of trials scored as 0 or 1 by the rescuer involved more than 10° of motion in at least 1 direction. Feedback did not affect the preference for the lift-and-slide placement. For the log-roll placement, 6 of 8 participants who preferred the head squeeze at baseline preferred the trap squeeze after feedback. For the confused patient, 5 of 5 participants initially preferred
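
    The weighted κ agreement between rescuer and simulated-patient quality scores can be reproduced with the standard formula; the sketch below assumes quadratic weights on the 0-10 scores, since the article does not state the weighting scheme, and the example ratings are invented:

        import numpy as np

        def weighted_kappa(r1, r2, n_cat=11):
            # confusion matrix of the two raters' scores (0..n_cat-1)
            obs = np.zeros((n_cat, n_cat))
            for a, b in zip(r1, r2):
                obs[a, b] += 1
            obs /= obs.sum()
            exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance agreement
            i, j = np.indices((n_cat, n_cat))
            w = ((i - j) / (n_cat - 1)) ** 2                  # quadratic weights
            return 1 - (w * obs).sum() / (w * exp).sum()

        print(weighted_kappa([0, 1, 2, 5, 9], [0, 2, 2, 4, 8]))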

  8. A Framework for General Education Assessment: Assessing Information Literacy and Quantitative Literacy with ePortfolios

    ERIC Educational Resources Information Center

    Hubert, David A.; Lewis, Kati J.

    2014-01-01

    This essay presents the findings of an authentic and holistic assessment, using a random sample of one hundred student General Education ePortfolios, of two of Salt Lake Community College's (SLCC) college-wide learning outcomes: quantitative literacy (QL) and information literacy (IL). Performed by four faculty from biology, humanities, and…

  9. Preschool Temperament Assessment: A Quantitative Assessment of the Validity of Behavioral Style Questionnaire Data

    ERIC Educational Resources Information Center

    Huelsman, Timothy J.; Gagnon, Sandra Glover; Kidder-Ashley, Pamela; Griggs, Marissa Swaim

    2014-01-01

    Research Findings: Child temperament is an important construct, but its measurement has been marked by a number of weaknesses that have diminished the frequency with which it is assessed in practice. We address this problem by presenting the results of a quantitative construct validation study. We calculated validity indices by hypothesizing the…

  10. Can MRI accurately detect pilon articular malreduction? A quantitative comparison between CT and 3T MRI bone models

    PubMed Central

    Radzi, Shairah; Dlaska, Constantin Edmond; Cowin, Gary; Robinson, Mark; Pratap, Jit; Schuetz, Michael Andreas; Mishra, Sanjay

    2016-01-01

    Background Pilon fracture reduction is a challenging surgery. Radiographs are commonly used to assess the quality of reduction, but are limited in revealing the remaining bone incongruities. The study aimed to develop a method for quantifying articular malreductions using 3D computed tomography (CT) and magnetic resonance imaging (MRI) models. Methods CT and MRI data were acquired using three pairs of human cadaveric ankle specimens. Common tibial pilon fractures were simulated by performing osteotomies on the ankle specimens. Five of the created fractures [three AO type-B (43-B1) and two AO type-C (43-C1) fractures] were then reduced and stabilised using titanium implants, then rescanned. All datasets were reconstructed into CT and MRI models and analysed with regard to intra-articular steps and gaps, surface deviations, malrotations and maltranslations of the bone fragments. Results Initial results reveal that type B fracture CT and MRI models differed by ~0.2 mm (step), ~0.18 mm (surface deviation), ~0.56° (rotation) and ~0.4 mm (translation). Type C fracture MRI models showed metal artefacts extending to the articular surface and were thus unsuitable for analysis. Type C fracture CT models differed from their CT and MRI contralateral models by ~0.15 mm (surface deviation), ~1.63° (rotation) and ~0.4 mm (translation). Conclusions Type B fracture MRI models were comparable to CT and may potentially be used for the postoperative assessment of articular reduction on a case-by-case basis. PMID:28090442

  11. Thermography as a quantitative imaging method for assessing postoperative inflammation

    PubMed Central

    Christensen, J; Matzen, LH; Vaeth, M; Schou, S; Wenzel, A

    2012-01-01

    Objective To assess differences in skin temperature between the operated and control sides of the face after mandibular third molar surgery using thermography. Methods 127 patients had 1 mandibular third molar removed. Before the surgery, standardized thermograms were taken of both sides of the patient's face using a Flir ThermaCam™ E320 (Precisions Teknik AB, Halmstad, Sweden). The imaging procedure was repeated 2 days and 7 days after surgery. A region of interest including the third molar region was marked on each image. The mean temperature within each region of interest was calculated. Differences between sides and over time were assessed using paired t-tests. Results No significant difference was found between the operated side and the control side either before or 7 days after surgery (p > 0.3). The temperature of the operated side (mean: 32.39 °C, range: 28.9–35.3 °C) was higher than that of the control side (mean: 32.06 °C, range: 28.5–35.0 °C) 2 days after surgery [0.33 °C, 95% confidence interval (CI): 0.22–0.44 °C, p < 0.001]. No significant difference was found between the pre-operative and the 7-day post-operative temperature (p > 0.1). After 2 days, the operated side was not significantly different from the temperature pre-operatively (p = 0.12), whereas the control side had a lower temperature (0.57 °C, 95% CI: 0.29–0.86 °C, p < 0.001). Conclusions Thermography seems useful for quantitative assessment of inflammation between the intervention side and the control side after surgical removal of mandibular third molars. However, thermography cannot be used to assess absolute temperature changes due to normal variations in skin temperature over time. PMID:22752326

  12. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    SciTech Connect

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; Parks, Jerry M.

    2016-07-08

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for CoIII, CoII, and CoI species, respectively, and the second model features saturation of each vacant axial coordination site on CoII and CoI species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co axial ligand binding.

  13. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    DOE PAGES

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; ...

    2016-07-08

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for CoIII, CoII, and CoI species, respectively, and the second model features saturation of each vacant axial coordination site on CoII and CoI species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co axial ligand binding, leading to substantial errors in predicted pKas and reduction potentials.

  14. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid-Base and Ligand Binding Equilibria of Aquacobalamin.

    PubMed

    Johnston, Ryne C; Zhou, Jing; Smith, Jeremy C; Parks, Jerry M

    2016-08-04

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co-ligand binding equilibrium constants (Kon/off), pKas, and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for Co(III), Co(II), and Co(I) species, respectively, and the second model features saturation of each vacant axial coordination site on Co(II) and Co(I) species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co-axial ligand binding, leading to substantial errors in predicted pKas and reduction potentials.

  15. Quantitative assessment of human body shape using Fourier analysis

    NASA Astrophysics Data System (ADS)

    Friess, Martin; Rohlf, F. J.; Hsiao, Hongwei

    2004-04-01

    Fall protection harnesses are commonly used to reduce the number and severity of injuries. Increasing the efficiency of harness design requires the size and shape variation of the user population to be assessed in as much detail and as accurately as possible. In light of the unsatisfactory performance of traditional anthropometry with respect to such assessments, we propose the use of 3D laser surface scans of whole bodies and the statistical analysis of elliptic Fourier coefficients. Ninety-eight male and female adults were scanned. Key features of each torso were extracted as a 3D curve along the front, the back and the thighs. A 3D extension of elliptic Fourier analysis was used to quantify their shape through multivariate statistics. Shape change as a function of size (allometry) was predicted by regressing the coefficients onto stature, weight and hip circumference. Upper and lower limits of torso shape variation were determined and can be used to redefine the design of the harness so that it will fit most individual body shapes. Observed allometric changes are used for adjustments to the harness shape in each size. Finally, the estimated outline data were used as templates for a free-form deformation of the complete torso surface using NURBS models (non-uniform rational B-splines).
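
    For reference, the classical (2D) elliptic Fourier coefficients of a closed contour can be computed as in the sketch below, a standard Kuhl-Giardina implementation; the paper's 3D extension adds a third coordinate and is not reproduced here:

        import numpy as np

        def elliptic_fourier_coeffs(contour, order=10):
            # contour: (N, 2) array of x, y points on a closed outline
            d = np.diff(np.vstack([contour, contour[:1]]), axis=0)
            dt = np.hypot(d[:, 0], d[:, 1])            # chord lengths
            t = np.concatenate([[0.0], np.cumsum(dt)])
            T = t[-1]                                  # contour perimeter
            coeffs = np.zeros((order, 4))              # columns: a_n, b_n, c_n, d_n
            for n in range(1, order + 1):
                c = np.cos(2 * n * np.pi * t / T)
                s = np.sin(2 * n * np.pi * t / T)
                k = T / (2 * n ** 2 * np.pi ** 2)
                coeffs[n - 1] = [
                    k * np.sum(d[:, 0] / dt * np.diff(c)),
                    k * np.sum(d[:, 0] / dt * np.diff(s)),
                    k * np.sum(d[:, 1] / dt * np.diff(c)),
                    k * np.sum(d[:, 1] / dt * np.diff(s)),
                ]
            return coeffs

        theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
        ellipse = np.column_stack([2 * np.cos(theta), np.sin(theta)])
        print(elliptic_fourier_coeffs(ellipse, order=2).round(3))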

  16. The potential optical coherence tomography in tooth bleaching quantitative assessment

    NASA Astrophysics Data System (ADS)

    Ni, Y. R.; Guo, Z. Y.; Shu, S. Y.; Zeng, C. C.; Zhong, H. Q.; Chen, B. L.; Liu, Z. M.; Bao, Y.

    2011-12-01

    In this paper, we report the outcomes of a pilot study on using an OCT functional imaging method to evaluate and quantify color alteration in human teeth in vitro. Images of the dental tissues, before and after treatment with 35% hydrogen peroxide, were obtained by an OCT system at a 1310 nm central wavelength. One parameter for the quantification of optical properties from OCT measurements is introduced in our study: the attenuation coefficient (μ). A significant decrease (p < 0.001) of the attenuation coefficient in dentine, as well as a significant increase (p < 0.001) in enamel, was observed during the tooth bleaching process. From the experimental results, it is found that the attenuation coefficient could be useful for assessing color alteration of human tooth samples. OCT has the potential to become an effective tool for the assessment of tooth bleaching, and our experiment offers a new method to evaluate color change in the visible region by quantitative analysis of infrared-region information from OCT.
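
    The paper does not spell out its fitting procedure, but a common way to estimate an OCT attenuation coefficient is a log-linear fit of the A-scan intensity to the single-scattering model I(z) = I0·exp(-2μz); the sketch below assumes exactly that, with synthetic data:

        import numpy as np

        def attenuation_coefficient(intensity, dz_mm):
            # slope of log-intensity vs depth gives -2*mu (single scattering)
            z = np.arange(intensity.size) * dz_mm
            slope, _ = np.polyfit(z, np.log(intensity), 1)
            return -slope / 2.0                        # mm^-1

        rng = np.random.default_rng(1)
        z = np.arange(200) * 0.005                     # 5-micron axial pixels
        a_scan = np.exp(-2 * 1.8 * z) * rng.lognormal(0.0, 0.05, z.size)
        print(attenuation_coefficient(a_scan, 0.005))  # recovers ~1.8 mm^-1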

  17. Quantitative computed tomography assessment of lung structure and function in pulmonary emphysema.

    PubMed

    Madani, A; Keyzer, C; Gevenois, P A

    2001-10-01

    Accurate diagnosis and quantification of pulmonary emphysema during life are important to understand the natural history of the disease, to assess the extent of the disease, and to evaluate and follow up therapeutic interventions. Since pulmonary emphysema is defined through pathological criteria, new methods of diagnosis and quantification should be validated by comparison against histological references. Recent studies have addressed the capability of computed tomography (CT) to quantify pulmonary emphysema accurately. The studies reviewed in this article have been based on CT scans obtained after deep inspiration or expiration, on subjective visual grading and on objective measurements of attenuation values. Dedicated software was used for this purpose, providing numerical data based on both two- and three-dimensional approaches, and CT data were compared with pulmonary function tests. More recently, fractal and textural analyses have been applied to CT scans to assess the presence, the extent, and the types of emphysema. Quantitative CT has already been used in patient selection for surgical treatment of pulmonary emphysema and in pharmacotherapeutic trials. However, despite numerous and extensive studies, this technique has not yet been standardized, and important questions about how best to use CT for the quantification of pulmonary emphysema remain unsolved.
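
    As an illustration of the objective attenuation measurements mentioned above, the sketch below computes two indices that are widely used in the emphysema literature, the relative area below a density threshold and a low percentile of the attenuation histogram; the -950 HU threshold and 15th percentile are common choices in that literature, not values taken from this article:

        import numpy as np

        def emphysema_indices(hu, lung_mask, threshold=-950, pct=15):
            voxels = hu[lung_mask]
            relative_area = np.mean(voxels < threshold) * 100  # % of lung voxels
            percentile_density = np.percentile(voxels, pct)    # HU
            return relative_area, percentile_density

        rng = np.random.default_rng(2)
        hu = rng.normal(-870, 60, size=(64, 64, 64))   # toy lung CT, HU values
        mask = np.ones(hu.shape, dtype=bool)           # toy whole-volume lung mask
        print(emphysema_indices(hu, mask))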

  18. Quantitative safety assessment of computer based I and C systems via modular Markov analysis

    SciTech Connect

    Elks, C. R.; Yu, Y.; Johnson, B. W.

    2006-07-01

    This paper gives a brief overview of the methodology, based on quantitative metrics, for evaluating digital I and C systems that has been under development at the Univ. of Virginia for a number of years. Our quantitative assessment methodology is based on three well understood and extensively practiced disciplines in the dependability assessment field: (1) system-level fault modeling and fault injection, (2) safety- and coverage-based dependability modeling methods, and (3) statistical estimation of model parameters used for safety prediction. There are two contributions of this paper. The first is related to incorporating design flaw information into homogeneous Markov models when such data are available. The second is to introduce a Markov modeling method for managing the modeling complexities of large distributed I and C systems for the prediction of safety and reliability. The method is called Modular Markov Chain analysis. It allows Markov models of the system to be composed in a modular manner, which addresses two important issues: (1) the models are more visually representative of the functional structure of the system, and (2) important failure dependencies that naturally occur in complex systems are modeled accurately with our approach. (authors)
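
    A minimal sketch of the per-module Markov idea follows; the three-state module (OK, failed-safe, failed-unsafe), the failure rates and coverages are all hypothetical, and the independent series composition shown is precisely the simplification that a dependency-aware modular composition would refine:

        import numpy as np
        from scipy.linalg import expm

        def module_state_probs(lam, coverage, t):
            # Continuous-time Markov module: OK -> failed-safe at rate lam*c,
            # OK -> failed-unsafe at rate lam*(1-c); failure states absorbing.
            Q = np.array([
                [-lam, lam * coverage, lam * (1 - coverage)],
                [0.0,  0.0,            0.0],
                [0.0,  0.0,            0.0],
            ])
            return expm(Q * t)[0]          # start in the OK state

        t = 1e4                            # mission time, hours
        p_a = module_state_probs(1e-5, 0.99, t)
        p_b = module_state_probs(2e-5, 0.95, t)
        # System unsafe if either module fails unsafely (series composition).
        p_unsafe = 1 - (1 - p_a[2]) * (1 - p_b[2])
        print(f"P(unsafe failure by {t:.0f} h) = {p_unsafe:.2e}")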

  19. The challenge of measuring lung structure. On the "Standards for the Quantitative Assessment of Lung Structure".

    PubMed

    Weibel, Ewald R

    2010-09-01

    The purpose of this review is to call the attention of respiratory scientists to an Official Policy Statement jointly issued by the American Thoracic Society and the European Respiratory Society on "Standards for the Quantitative Assessment of Lung Structure", based on an extended report of a joint task force of 20 experts and recently published in Am. J. Respir. Crit. Care Med. This document provides investigators of normal and diseased lung structure with a review of the stereological methods that allow measurements to be made on sections. It critically discusses the preparation procedures, the conditions for unbiased sampling of the lung for microscopic study, and the potential applications of such methods. Here we present some case studies that underpin the importance of using accurate methods of structure quantification and outline paths into the future for structure-function studies on lung diseases.

  20. Reading Assessment Methods for Middle-School Students: An Investigation of Reading Comprehension Rate and Maze Accurate Response Rate

    ERIC Educational Resources Information Center

    Hale, Andrea D.; Henning, Jaime B.; Hawkins, Renee O.; Sheeley, Wesley; Shoemaker, Larissa; Reynolds, Jennifer R.; Moch, Christina

    2011-01-01

    This study was designed to investigate the validity of four different aloud reading comprehension assessment measures: Maze, comprehension questions, Maze accurate response rate (MARR), and reading comprehension rate (RCR). The criterion measures used in this study were the Woodcock-Johnson III Tests of Achievement (WJ-III ACH) Broad Reading…

  1. Purity assessment of ginsenoside Rg1 using quantitative (1)H nuclear magnetic resonance.

    PubMed

    Huang, Bao-Ming; Xiao, Sheng-Yuan; Chen, Ting-Bo; Xie, Ying; Luo, Pei; Liu, Liang; Zhou, Hua

    2017-05-30

    Ginseng herbs comprise a group of the most popular herbs, including Panax ginseng, P. notoginseng and P. quinquefolius (family Araliaceae), which are used in traditional Chinese medicine (TCM) and are some of the best-selling natural products in the world. Accurate quantification of ginsenoside Rg1 is one of the major aspects of their quality control. However, the purity of the commercial Rg1 chemical reference substance (CRS) is often measured with high-performance liquid chromatography coupled with an ultraviolet detector (HPLC-UV), a selective detector with unequal responses to different compounds, which introduces probable error into purity assessments. In the present study, quantitative nuclear magnetic resonance (qNMR), owing to its absolute quantification ability, was applied to accurately assess the purity of Rg1 CRS. Phenylmethyl phthalate was used as the internal standard (IS) to calibrate the purity of Rg1 CRS. The proton signal of Rg1 CRS in methanol-d4 at 4.37 ppm was selected to avoid interfering signals, enabling accurate quantitative analysis. The relaxation delay, number of scans, and NMR windowing were optimized for data acquisition. For post-processing, the Lorentz/Gauss deconvolution method was employed to increase the signal accuracy by separating the impurities and noise in the integrated region of the quantitative proton. The method validation showed that the developed method has acceptable sensitivity, linearity, precision, and accuracy. The purity of the commercial Rg1 CRS examined with the method developed in this research was 90.34±0.21%, which is clearly lower than that reported by the manufacturer (>98.0%, HPLC-UV). The cross-method validation shows that the commonly used HPLC-UV, HPLC-ELSD (evaporative light scattering detector) and even LC-MS (mass spectrometry) methods provide significantly higher purity values for Rg1 CRS compared with the qNMR method, and the accuracy of these LC-based methods largely depends on the
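
    Internal-standard qNMR purity follows the standard textbook relation P_s = (I_s/I_IS)·(N_IS/N_s)·(M_s/M_IS)·(m_IS/m_s)·P_IS; in the sketch below every number is illustrative (the integrals, masses and the assumption that the 4.37 ppm Rg1 signal integrates one proton are hypothetical, not taken from the paper):

        def qnmr_purity(I_s, I_is, N_s, N_is, M_s, M_is, m_s, m_is, P_is):
            # I: integrated areas; N: protons per integrated signal;
            # M: molar masses (g/mol); m: weighed masses (mg); P_is: IS purity
            return (I_s / I_is) * (N_is / N_s) * (M_s / M_is) * (m_is / m_s) * P_is

        purity = qnmr_purity(
            I_s=1.000, I_is=8.38,      # hypothetical integrals
            N_s=1, N_is=4,             # hypothetical proton counts
            M_s=801.0, M_is=254.3,     # placeholder molar masses
            m_s=10.0, m_is=6.0,        # hypothetical weighed masses, mg
            P_is=0.998,
        )
        print(f"purity = {purity:.2%}")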

  2. PET optimization for improved assessment and accurate quantification of {sup 90}Y-microsphere biodistribution after radioembolization

    SciTech Connect

    Martí-Climent, Josep M.; Prieto, Elena; Elosúa, César; Rodríguez-Fraile, Macarena; Domínguez-Prado, Inés; Vigil, Carmen; García-Velloso, María J.; Arbizu, Javier; Peñuelas, Iván; Richter, José A.

    2014-09-15

    Purpose: ⁹⁰Y-microspheres are widely used for the radioembolization of metastatic liver cancer or hepatocellular carcinoma, and there is growing interest in imaging ⁹⁰Y-microspheres with PET. The aim of this study is to evaluate the performance of a current-generation PET/CT scanner for ⁹⁰Y imaging and to optimize the PET protocol to improve the assessment and quantification of ⁹⁰Y-microsphere biodistribution after radioembolization. Methods: Data were acquired on a Biograph mCT-TrueV scanner with time of flight (TOF) and point spread function (PSF) modeling. Spatial resolution was measured with a ⁹⁰Y point source. Sensitivity was evaluated using the NEMA 70 cm line source filled with ⁹⁰Y. To evaluate the count rate performance, ⁹⁰Y vials with activity ranging from 3.64 to 0.035 GBq were measured in the center of the field of view (CFOV). The energy spectrum was evaluated. Image quality with different reconstructions was studied using the Jaszczak phantom containing six hollow spheres (diameters: 31.3, 28.1, 21.8, 16.1, 13.3, and 10.5 mm), filled with a 207 kBq/ml ⁹⁰Y concentration and a 5:1 sphere-to-background ratio. Acquisition time was adjusted to simulate the quality of a realistic clinical PET acquisition of a patient treated with SIR-Spheres®. The developed methodology was applied to ten patients after SIR-Spheres® treatment, acquiring a 10 min per bed PET. Results: The energy spectrum showed the ⁹⁰Y bremsstrahlung radiation. The ⁹⁰Y transverse resolution, with filtered backprojection reconstruction, was 4.5 mm in the CFOV and degraded to 5.0 mm at 10 cm off-axis. ⁹⁰Y absolute sensitivity was 0.40 kcps/MBq in the center of the field of view. The trends of the true and random count rates as a function of ⁹⁰Y activity could be accurately described using linear and quadratic models, respectively. Phantom studies demonstrated that, due to low count statistics in ⁹⁰Y PET

  3. A Quantitative Measure of Handwriting Dysfluency for Assessing Tardive Dyskinesia

    PubMed Central

    Caligiuri, Michael P.; Teulings, Hans-Leo; Dean, Charles E.; Lohr, James B.

    2015-01-01

    Tardive dyskinesia (TD) is a movement disorder commonly associated with chronic exposure to antidopaminergic medications, which may in some cases be disfiguring and socially disabling. The consensus from a growing body of research on the incidence and prevalence of TD in the modern era of antipsychotics indicates that this disorder has not disappeared and continues to challenge the effective management of psychotic symptoms in patients with schizophrenia. A fundamental component of an effective strategy for managing TD is its reliable and accurate assessment. In the present study, we examined the clinical utility of a brief handwriting dysfluency measure for quantifying TD. Digitized samples of handwritten circles and loops were obtained from 62 psychosis patients with or without TD and from 50 healthy subjects. Two measures of dysfluent pen movements were extracted from each vertical pen stroke: normalized jerk and the number of acceleration peaks. TD patients exhibited significantly higher dysfluency scores than non-TD patients and controls. Severity of handwriting movement dysfluency was correlated with AIMS severity ratings for some tasks. The procedure yielded high degrees of test-retest reliability. These results suggest that measures of handwriting movement dysfluency may be particularly useful for objectively evaluating the efficacy of pharmacotherapeutic strategies for treating TD. PMID:25679121
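
    The two stroke measures named above can be computed from a sampled pen trajectory roughly as follows; the normalization shown is the common dimensionless jerk formulation, which may differ in detail from the authors' handwriting-analysis software, and the test signals are synthetic:

        import numpy as np

        def stroke_dysfluency(y, fs):
            # y: vertical pen position of one stroke, sampled at fs Hz
            dt = 1.0 / fs
            v = np.gradient(y, dt)
            a = np.gradient(v, dt)
            j = np.gradient(a, dt)
            duration = y.size * dt
            length = np.abs(np.diff(y)).sum()
            norm_jerk = np.sqrt(0.5 * np.sum(j ** 2) * dt * duration ** 5 / length ** 2)
            peaks = int(np.sum(np.diff(np.sign(np.diff(np.abs(a)))) < 0))
            return norm_jerk, peaks

        t = np.arange(90) / 300.0                     # a 300-ms stroke at 300 Hz
        smooth = np.sin(np.pi * t / 0.3)              # one smooth stroke
        tremulous = smooth + 0.02 * np.sin(2 * np.pi * 40 * t)
        print(stroke_dysfluency(smooth, 300))         # low jerk, few peaks
        print(stroke_dysfluency(tremulous, 300))      # higher jerk, more peaks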

  4. The U.S. Department of Agriculture Automated Multiple-Pass Method accurately assesses sodium intakes

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Accurate and practical methods to monitor sodium intake of the U.S. population are critical given current sodium reduction strategies. While the gold standard for estimating sodium intake is the 24 hour urine collection, few studies have used this biomarker to evaluate the accuracy of a dietary ins...

  5. How many standard area diagram sets are needed for accurate disease severity assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standard area diagram sets (SADs) are widely used in plant pathology: a rater estimates disease severity by comparing an unknown sample to actual severities in the SADs and interpolates an estimate as accurately as possible (although some SADs have been developed for categorizing disease too). Most ...

  6. Quantitative Assessment of Islets of Langerhans Encapsulated in Alginate

    PubMed Central

    Johnson, Amy S.; O'Sullivan, Esther; D'Aoust, Laura N.; Omer, Abdulkadir; Bonner-Weir, Susan; Fisher, Robert J.; Weir, Gordon C.

    2011-01-01

    Improved methods have recently been developed for assessing islet viability and quantity in human islet preparations for transplantation, and these measurements have proven useful for predicting transplantation outcome. The objectives of this study were to adapt these methods for use with microencapsulated islets, to verify that they provide meaningful quantitative measurements, and to test them with two model systems: (1) barium alginate and (2) barium alginate containing a 70% (w/v) perfluorocarbon (PFC) emulsion, which presents challenges to use of these assays and is of interest in its own right as a means for reducing oxygen supply limitations to encapsulated tissue. Mitochondrial function was assessed by oxygen consumption rate measurements, and the analysis of data was modified to account for the increased solubility of oxygen in the PFC-alginate capsules. Capsules were dissolved and tissue recovered for nuclei counting to measure the number of cells. Capsule volume was determined from alginate or PFC content and used to normalize measurements. After low oxygen culture for 2 days, islets in normal alginate lost substantial viable tissue and displayed necrotic cores, whereas most of the original oxygen consumption rate was recovered with PFC alginate, and little necrosis was observed. All nuclei were recovered with normal alginate, but some nuclei from nonrespiring cells were lost with PFC alginate. Biocompatibility tests revealed toxicity at the islet periphery associated with the lipid emulsion used to provide surfactants during the emulsification process. We conclude that these new assay methods can be applied to islets encapsulated in materials as complex as PFC-alginate. Measurements made with these materials revealed that enhancement of oxygen permeability of the encapsulating material with a concentrated PFC emulsion improves survival of encapsulated islets under hypoxic conditions, but reformulation of the PFC emulsion is needed to reduce toxicity.

  7. Quantitative assessment of computational models for retinotopic map formation

    PubMed Central

    Sterratt, David C; Cutts, Catherine S; Willshaw, David J; Eglen, Stephen J

    2014-01-01

    Molecular and activity-based cues acting together are thought to guide retinal axons to their terminal sites in the vertebrate optic tectum or superior colliculus (SC) to form an ordered map of connections. The details of the mechanisms involved, and the degree to which they might interact, are still not well understood. We have developed a framework within which existing computational models can be assessed in an unbiased and quantitative manner against a set of experimental data curated from the mouse retinocollicular system. Our framework facilitates comparison between models, testing new models against known phenotypes and simulating new phenotypes in existing models. We have used this framework to assess four representative models that combine Eph/ephrin gradients and/or activity-based mechanisms and competition. Two of the models were updated from their original form to fit into our framework. The models were tested against five different phenotypes: wild type, Isl2-EphA3 ki/ki, Isl2-EphA3 ki/+, ephrin-A2,A3,A5 triple knock-out (TKO), and Math5 −/− (Atoh7). Two models successfully reproduced the extent of the Math5 −/− anteromedial projection, but only one of those could account for the collapse point in Isl2-EphA3 ki/+. The models needed a weak anteroposterior gradient in the SC to reproduce the residual order in the ephrin-A2,A3,A5 TKO phenotype, suggesting either an incomplete knock-out or the presence of another guidance molecule. Our article demonstrates the importance of testing retinotopic models against as full a range of phenotypes as possible, and we have made available the MATLAB software we wrote to facilitate this process. © 2014 Wiley Periodicals, Inc. Develop Neurobiol 75: 641–666, 2015 PMID:25367067

  8. Estimating distributions out of qualitative and (semi)quantitative microbiological contamination data for use in risk assessment.

    PubMed

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2010-04-15

    A framework using maximum likelihood estimation (MLE) is used to fit a probability distribution to a set of qualitative (e.g., absence in 25 g), semi-quantitative (e.g., presence in 25 g and absence in 1 g) and/or quantitative test results (e.g., 10 CFU/g). Uncertainty about the parameters of the variability distribution is characterized through a non-parametric bootstrapping method. The resulting distribution function can be used as an input for a second-order Monte Carlo simulation in quantitative risk assessment. As an illustration, the method is applied to two sets of in silico generated data. It is demonstrated that correct interpretation of the data results in an accurate representation of the contamination level distribution. Subsequently, two case studies are analyzed, namely (i) quantitative analyses of Campylobacter spp. in food samples with nondetects, and (ii) combined quantitative, qualitative and semi-quantitative analyses and nondetects of Listeria monocytogenes in smoked fish samples. The first of these case studies is also used to illustrate the influence of the limit of quantification, measurement error, and the number of samples included in the data set. Application of these techniques offers a way for meta-analysis of the many relevant yet diverse data sets that are available in the literature and in (inter)national reports of surveillance or baseline surveys, and therefore increases the information input of a risk assessment and, by consequence, the correctness of its outcome.
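
    A minimal sketch of the censored-data MLE follows. It fits a normal distribution to log10 concentrations, treating 'absence in 25 g' as left-censoring at the corresponding detection limit and 'presence in 25 g, absence in 1 g' as interval-censoring, which is a simplification of the paper's treatment; all data are invented:

        import numpy as np
        from scipy import stats, optimize

        quant = np.log10([12, 40, 3, 150])             # quantitative, log10 CFU/g
        left = np.full(6, np.log10(1 / 25.0))          # absence in 25 g
        interval = [(np.log10(1 / 25.0), 0.0)] * 3     # presence in 25 g, absence in 1 g

        def neg_loglik(theta):
            mu, sig = theta[0], abs(theta[1])
            ll = stats.norm.logpdf(quant, mu, sig).sum()          # exact observations
            ll += stats.norm.logcdf(left, mu, sig).sum()          # left-censored
            ll += sum(np.log(stats.norm.cdf(b, mu, sig) - stats.norm.cdf(a, mu, sig))
                      for a, b in interval)                       # interval-censored
            return -ll

        fit = optimize.minimize(neg_loglik, x0=[0.0, 1.0], method="Nelder-Mead")
        print("mu, sigma (log10 CFU/g):", fit.x.round(2))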

  9. A quantitative risk assessment model for Salmonella and whole chickens.

    PubMed

    Oscar, Thomas P

    2004-06-01

    Existing data and predictive models were used to define the input settings of a previously developed but modified quantitative risk assessment model (QRAM) for Salmonella and whole chickens. The QRAM was constructed in an Excel spreadsheet and was simulated using @Risk. The retail-to-table pathway was modeled as a series of unit operations and associated pathogen events that included initial contamination at retail, growth during consumer transport, thermal inactivation during cooking, cross-contamination during serving, and dose response after consumption. Published data as well as predictive models for growth and thermal inactivation of Salmonella were used to establish input settings. Noncontaminated chickens were simulated so that the QRAM could predict changes in the incidence of Salmonella contamination. The incidence of Salmonella contamination changed from 30% at retail to 0.16% after cooking to 4% at consumption. Salmonella growth on chickens during consumer transport was the only pathogen event that did not impact the risk of salmonellosis. For the scenario simulated, the QRAM predicted 0.44 cases of salmonellosis per 100,000 consumers, which was consistent with recent epidemiological data that indicate a rate of 0.66-0.88 cases of salmonellosis per 100,000 consumers of chicken. Although the QRAM was in agreement with the epidemiological data, surrogate data and models were used, assumptions were made, and potentially important unit operations and pathogen events were not included because of data gaps and thus, further refinement of the QRAM is needed.
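
    A toy Monte Carlo version of such a retail-to-table chain is sketched below; every distribution, parameter and the exponential dose-response are invented for illustration, whereas the published QRAM draws its inputs from data and predictive models:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        contaminated = rng.random(n) < 0.30            # incidence at retail
        log_cfu = rng.normal(1.0, 0.8, n)              # initial log10 CFU/carcass
        log_cfu += rng.uniform(0.0, 0.3, n)            # growth during transport
        log_cfu -= rng.normal(6.0, 1.0, n)             # log reduction from cooking
        recontam = rng.random(n) < 0.04                # cross-contamination, serving

        dose = np.where(contaminated, 10 ** log_cfu, 0.0)
        dose = np.where(contaminated & recontam, np.maximum(dose, 1.0), dose)
        p_ill = 1 - np.exp(-2e-3 * dose)               # toy dose-response
        print("predicted cases per 100,000 servings:", round(p_ill.sum() / n * 1e5, 2))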

  10. Quantitative assessment of the effectiveness of a rockfall warning system

    NASA Astrophysics Data System (ADS)

    Bründl, Michael; Sättele, Martina; Krautblatter, Michael; Straub, Daniel

    2016-04-01

    Rockslides and rockfalls can pose a high risk to human settlements and traffic infrastructure. In addition to structural mitigation measures such as rockfall nets, warning systems are increasingly installed to reduce rockfall risks. Whereas a structured evaluation method exists for structural mitigation measures, which reduce the spatial extent of the hazard, few if any approaches are known for assessing the effectiveness of warning systems. Especially for higher-magnitude rockfalls, structural mitigation measures are not effective, and reliable early warning systems will be essential in the future. In response to this, we developed a classification and a framework to assess the reliability and effectiveness of early warning systems (Sättele et al., 2015a; 2016). Here, we demonstrate an application to the rockfall warning system installed in Preonzo prior to a major rockfall in May 2012 (Sättele et al., 2015b). We show that it is necessary to design such a warning system as a fail-safe construction, which has to incorporate components with low failure probabilities, high redundancy, low warning thresholds, and additional control systems. With a hypothetical probabilistic analysis, we investigate the effect of the risk attitude of decision makers and of the number of sensors on the probability of detecting an event and of initiating a timely evacuation, as well as on the related intervention cost. We conclude that it is possible to quantitatively assess the effectiveness of warning systems, which helps to optimize mitigation strategies against rockfall events. References Sättele, M., Bründl, M., and Straub, D.: Reliability and effectiveness of warning systems for natural hazards: concept and application to debris flow warning, Rel. Eng. Syst. Safety, 142, 192-202, 2015a. Sättele, M., Krautblatter, M., Bründl, M., and Straub, D.: Forecasting rock slope failure: How reliable and effective are warning systems?, Landslides, 605, 1-14, 2015b. Sättele, M., Bründl, M., and

  11. Quantitative assessments of burn degree by high-frequency ultrasonic backscattering and statistical model

    NASA Astrophysics Data System (ADS)

    Lin, Yi-Hsun; Huang, Chih-Chung; Wang, Shyh-Hau

    2011-02-01

    An accurate and quantitative modality for assessing burn degree is crucial for determining the further treatments to be applied to burn injury patients. Ultrasound at frequencies higher than 20 MHz has been applied to dermatological diagnosis due to its high resolution and noninvasive capability. Yet a means of sensitively and quantitatively correlating burn degree with ultrasonic measurements is still lacking. Thus, a 50 MHz ultrasound system was developed and implemented to measure ultrasonic signals backscattered from burned skin tissues. Various burn degrees were achieved by placing a 100 °C brass plate onto the dorsal skin of anesthetized rats for durations ranging from 5 to 20 s. The burn degrees were correlated with ultrasonic parameters, including the integrated backscatter (IB) and the Nakagami parameter (m), calculated from ultrasonic signals acquired from the burned tissues over a 5 × 1.4 mm (width × depth) area. Results demonstrated that both IB and m decreased exponentially with increasing burn degree. Specifically, an IB of -79.0 ± 2.4 (mean ± standard deviation) dB for normal skin tissues tended to decrease to -94.0 ± 1.3 dB for those burned for 20 s, while the corresponding Nakagami parameters tended to decrease from 0.76 ± 0.08 to 0.45 ± 0.04. The variation of both IB and m was partially associated with changes in the properties of collagen fibers in the burned tissues, verified by tissue histological sections. In particular, the m parameter may be more sensitive for differentiating burned skin because it has a greater rate of change with respect to different burn durations. These ultrasonic parameters, in conjunction with high-frequency B-mode and Nakagami images, could have the potential to assess burn degree quantitatively.
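
    Both parameters above have standard moment-based estimators; the sketch below applies them to synthetic envelope data (the authors' exact processing chain, windowing and reference calibration are not reproduced):

        import numpy as np

        def nakagami_m(envelope):
            # inverse normalized variance of the squared envelope
            e2 = envelope.astype(float) ** 2
            return e2.mean() ** 2 / e2.var()

        def integrated_backscatter_db(rf, rf_reference):
            # mean backscattered power relative to a reference, in dB
            return 10 * np.log10(np.mean(rf ** 2) / np.mean(rf_reference ** 2))

        rng = np.random.default_rng(3)
        rayleigh_env = rng.rayleigh(1.0, 20_000)                 # speckle, m ~ 1
        pre_rayleigh = np.abs(rng.normal(0, 1, 20_000)) ** 1.5   # m < 1
        print(nakagami_m(rayleigh_env), nakagami_m(pre_rayleigh))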

  12. The effect of manipulated and accurate assessment feedback on the self-efficacy of dance students.

    PubMed

    García-Dantas, Ana; Quested, Eleanor

    2015-03-01

    Research undertaken with athletes has shown that lower-evaluated feedback is related to low self-efficacy levels. However, the relationship between teacher feedback and self-efficacy has not been studied in the dance setting. In sport and dance contexts, very few studies have manipulated feedback content to examine its impact on performers' self-efficacy in relation to the execution of a specific movement. Therefore, the aim of this investigation was to explore the effect of manipulated upper-evaluated, lower-evaluated, and accurate grade feedback on changes in dancers' self-efficacy levels for the execution of the "Zapateado" (a flamenco foot movement). Sixty-one students (56 female, 5 male, ages 13 to 22 ± 3.25 years) from a Spanish dance conservatory participated in this experimental study. They were randomly divided into four feedback groups: 1. upper-evaluated, 2. objective and informational, 3. lower-evaluated, and 4. no feedback-control. Participants performed three trials during a 1-hour session and completed questionnaires tapping self-efficacy pre-feedback and post-feedback. After each trial, teachers (who were confederates in the study) were first asked to rate their perception of each dancer's competence level at performing the movement according to conventional criteria (scores from 0 to 10). The results were then manipulated, and students were given accurate, lower-evaluated, or upper-evaluated scores. Those in the accurate feedback group reported a positive change in self-efficacy, whereas those in the lower-evaluated group showed no significant change in self-efficacy during the course of the trial. The findings call into question the common perception among teachers that it can be motivating to provide students with inaccurate feedback indicating that their performance level is much better or much worse than they actually perceive it to be. Self-efficacy appears most likely to increase in students when feedback is accurate.

  13. Is there a place for quantitative risk assessment?

    PubMed

    Hall, Eric J

    2009-06-01

    The use of ionising radiations is so well established, especially in the practice of medicine, that it is impossible to imagine contemporary life without them. At the same time, ionising radiations are a known and proven human carcinogen. Exposure to radiation in some contexts elicits fear and alarm (nuclear power for example) while in other situations, until recently at least, it was accepted with alacrity (diagnostic x-rays for example). This non-uniform reaction to the potential hazards of radiation highlights the importance of quantitative risk estimates, which are necessary to help put things into perspective. Three areas will be discussed where quantitative risk estimates are needed and where uncertainties and limitations are a problem. First, the question of diagnostic x-rays. CT usage over the past quarter of a century has increased about 12 fold in the UK and more than 20 fold in the US. In both countries, more than 90% of the collective population dose from diagnostic x-rays comes from the few high dose procedures, such as interventional radiology, CT scans, lumbar spine x-rays and barium enemas. These all involve doses close to the lower limit at which there are credible epidemiological data for an excess cancer incidence. This is a critical question; what is the lowest dose at which there is good evidence of an elevated cancer incidence? Without low dose risk estimates the risk-benefit ratio of diagnostic procedures cannot be assessed. Second, the use of new techniques in radiation oncology. IMRT is widely used to obtain a more conformal dose distribution, particularly in children. It results in a larger total body dose, due to an increased number of monitor units and to the application of more radiation fields. The Linacs used today were not designed for IMRT and are based on leakage standards that were decided decades ago. It will be difficult and costly to reduce leakage from treatment machines, and a necessary first step is to refine the available

  14. Disaster Metrics: A Proposed Quantitative Assessment Tool in Complex Humanitarian Emergencies - The Public Health Impact Severity Scale (PHISS)

    PubMed Central

    Bayram, Jamil D.; Kysia, Rashid; Kirsch, Thomas D.

    2012-01-01

    Background: Complex Humanitarian Emergencies (CHE) result in rapid degradation of population health and quickly overwhelm indigenous health resources. Numerous governmental, non-governmental, national and international organizations and agencies are involved in the assessment of post-CHE affected populations. To date, there is no entirely quantitative assessment tool conceptualized to measure the public health impact of CHE. Methods: Essential public health parameters in CHE were identified based on the Sphere Project "Minimum Standards", and scoring rubrics were proposed based on the prevailing evidence when applicable. Results: 12 quantitative parameters were identified, representing the four categories of “Minimum Standards for Disaster Response” according to the Sphere Project; health, shelter, food and nutrition, in addition to water and sanitation. The cumulative tool constitutes a quantitative scale, referred to as the Public Health Impact Severity Scale (PHISS), and the score on this scale ranges from a minimum of 0 to a maximum of 100. Conclusion: Quantitative measurement of the public health impact of CHE is germane to accurate assessment, in order to identify the scale and scope of the critical response required for the relief efforts of the affected populations. PHISS is a new conceptual metric tool, proposed to add an objective quantitative dimension to the post-CHE assessment arsenal. PHISS has not yet been validated, and studies are needed with prospective data collection to test its validity, feasibility and reliability. Citation: Bayram JD, Kysia R, Kirsch TD. Disaster Metrics: A Proposed Quantitative Assessment Tool in Complex Humanitarian Emergencies – The Public Health Impact Severity Scale (PHISS). PLOS Currents Disasters. 2012 Aug 21. doi: 10.1371/4f7b4bab0d1a3. PMID:22984643

  15. A poultry-processing model for quantitative microbiological risk assessment.

    PubMed

    Nauta, Maarten; van der Fels-Klerx, Ine; Havelaar, Arie

    2005-02-01

    A poultry-processing model for a quantitative microbiological risk assessment (QMRA) of campylobacter is presented, which can also be applied to other QMRAs involving poultry processing. The same basic model is applied in each consecutive stage of industrial processing. It describes the effects of inactivation and removal of the bacteria, and the dynamics of cross-contamination in terms of the transfer of campylobacter from the intestines to the carcass surface and the environment, from the carcasses to the environment, and from the environment to the carcasses. From the model it can be derived that, in general, the effect of inactivation and removal is dominant for those carcasses with high initial bacterial loads, and cross-contamination is dominant for those with low initial levels. In other QMRA poultry-processing models, the input-output relationship between the numbers of bacteria on the carcasses is usually assumed to be linear on a logarithmic scale. By including some basic mechanistics, it is shown that this may not be realistic. As nonlinear behavior may affect the predicted effects of risk mitigations, this finding is relevant for risk management. Good knowledge of the variability of bacterial loads on poultry entering the process is important. The common practice in microbiology of presenting only the geometric mean of bacterial counts is insufficient: arithmetic means are more suitable, in particular to describe the effect of cross-contamination. The effects of logistic slaughter (scheduled processing) as a risk mitigation strategy are predicted to be small. Some additional complications in applying microbiological data obtained in processing plants are discussed.
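
    The stage-level dynamics described above (transfer between carcass and environment plus inactivation/removal) can be sketched as a simple mass-balance update. This is a minimal illustration, not the published model: the parameter names and values below are placeholders, and the point is only that the combined balance makes log output a nonlinear function of log input.

    ```python
    def process_stage(n_carcass, n_env, a_out=0.10, b_in=0.01, r_kill=0.5):
        """One processing stage of a minimal cross-contamination model.

        n_carcass: campylobacter count on a carcass entering the stage
        n_env:     count residing in the stage environment
        a_out:     fraction transferred from carcass to environment
        b_in:      fraction of the environmental load deposited per carcass
        r_kill:    fraction inactivated or removed within the stage
        All parameter values are illustrative, not the model's estimates.
        """
        to_env = a_out * n_carcass
        from_env = b_in * n_env
        n_carcass_out = (n_carcass - to_env + from_env) * (1.0 - r_kill)
        n_env_out = n_env + to_env - from_env
        return n_carcass_out, n_env_out

    # Inactivation dominates for heavily contaminated carcasses; pickup from
    # the environment dominates for lightly contaminated ones, so log(output)
    # versus log(input) is not a straight line.
    for n0 in (1e2, 1e4, 1e6):
        print(n0, process_stage(n0, n_env=1e5)[0])
    ```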

  16. A quantitative assessment method for Ascaris eggs on hands.

    PubMed

    Jeandron, Aurelie; Ensink, Jeroen H J; Thamsborg, Stig M; Dalsgaard, Anders; Sengupta, Mita E

    2014-01-01

    The importance of hands in the transmission of soil transmitted helminths, especially Ascaris and Trichuris infections, is under-researched. This is partly because of the absence of a reliable method to quantify the number of eggs on hands. Therefore, the aim of this study was to develop a method to assess the number of Ascaris eggs on hands and determine the egg recovery rate of the method. Under laboratory conditions, hands were seeded with a known number of Ascaris eggs, air dried and washed in a plastic bag retaining the washing water, in order to determine recovery rates of eggs for four different detergents (cationic [benzethonium chloride 0.1% and cetylpyridinium chloride (CPC) 0.1%], anionic [7X 1% - quadrafos, glycol ether, and dioctyl sulfosuccinate sodium salt] and non-ionic [Tween80 0.1% - polyethylene glycol sorbitan monooleate]) and two egg detection methods (McMaster technique and FLOTAC). A modified concentration McMaster technique showed the highest egg recovery rate from bags. Two of the four diluted detergents (benzethonium chloride 0.1% and 7X 1%) also showed a higher egg recovery rate and were then compared with de-ionized water for recovery of helminth eggs from hands. The highest recovery rate (95.6%) was achieved with a hand rinse performed with 7X 1%. Washing hands with de-ionized water resulted in an egg recovery rate of 82.7%. This washing method performed with a low concentration of detergent offers potential for quantitative investigation of contamination of hands with Ascaris eggs and of their role in human infection. Follow-up studies are needed that validate the hand washing method under field conditions, e.g., including people of different ages, lower levels of contamination and various levels of hand cleanliness.

  17. Quantitative risk assessment of Cryptosporidium in tap water in Ireland.

    PubMed

    Cummins, E; Kennedy, R; Cormican, M

    2010-01-15

    Cryptosporidium species are protozoan parasites associated with gastro-intestinal illness. Following a number of high profile outbreaks worldwide, it has emerged as a parasite of major public health concern. A quantitative Monte Carlo simulation model was developed to evaluate the annual risk of infection from Cryptosporidium in tap water in Ireland. The assessment considers the potential initial contamination levels in raw water, oocyst removal and decontamination events following various process stages, including coagulation/flocculation, sedimentation, filtration and disinfection. A number of scenarios were analysed to represent potential risks from public water supplies, group water schemes and private wells. Where surface water is used additional physical and chemical water treatment is important in terms of reducing the risk to consumers. The simulated annual risk of illness for immunocompetent individuals was below 1 x 10(-4) per year (as set by the US EPA) except under extreme contamination events. The risk for immunocompromised individuals was 2-3 orders of magnitude greater for the scenarios analysed. The model indicates a reduced risk of infection from tap water that has undergone microfiltration, as this treatment is more robust in the event of high contamination loads. The sensitivity analysis highlighted the importance of watershed protection and the importance of adequate coagulation/flocculation in conventional treatment. The frequency of failure of the treatment process is the most important parameter influencing human risk in conventional treatment. The model developed in this study may be useful for local authorities, government agencies and other stakeholders to evaluate the likely risk of infection given some basic input data on source water and treatment processes used.
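
    The simulation chain described above is straightforward to sketch. In the snippet below all distributions and parameter values are stand-ins, since the abstract does not report the fitted inputs; the exponential dose-response parameter is a commonly cited value for Cryptosporidium, used purely for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # Monte Carlo iterations

    # Hypothetical inputs: raw-water oocyst concentration (oocysts/L) and the
    # combined log10 removal across coagulation/flocculation, sedimentation,
    # filtration and disinfection.
    raw_conc = rng.lognormal(mean=np.log(0.5), sigma=1.0, size=n)
    log_removal = rng.normal(loc=3.0, scale=0.5, size=n)
    tap_conc = raw_conc * 10.0 ** (-log_removal)

    daily_intake_l = 1.0                      # unboiled tap water per person-day
    dose = tap_conc * daily_intake_l          # oocysts ingested per day
    r = 0.004                                 # exponential dose-response parameter
    p_daily = 1.0 - np.exp(-r * dose)
    p_annual = 1.0 - (1.0 - p_daily) ** 365   # annual risk of infection

    print(f"median annual risk: {np.median(p_annual):.2e}")
    print(f"fraction above the 1e-4 benchmark: {(p_annual > 1e-4).mean():.3f}")
    ```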

  18. How to Achieve Accurate Peer Assessment for High Value Written Assignments in a Senior Undergraduate Course

    ERIC Educational Resources Information Center

    Jeffery, Daniel; Yankulov, Krassimir; Crerar, Alison; Ritchie, Kerry

    2016-01-01

    The psychometric measures of accuracy, reliability and validity of peer assessment are critical qualities for its use as a supplement to instructor grading. In this study, we seek to determine which factors related to peer review are the most influential on these psychometric measures, with a primary focus on the accuracy of peer assessment or how…

  19. Noninvasive Qualitative and Quantitative Assessment of Spoilage Attributes of Chilled Pork Using Hyperspectral Scattering Technique.

    PubMed

    Zhang, Leilei; Peng, Yankun

    2016-08-01

    The objective of this research was to develop a rapid noninvasive method for quantitative and qualitative determination of chilled pork spoilage. Microbiological, physicochemical, and organoleptic characteristics such as the total viable count (TVC), Pseudomonas spp., total volatile basic-nitrogen (TVB-N), pH value, and color parameter L* were determined to appraise pork quality. The hyperspectral scattering characteristics of 54 meat samples were accurately fitted by a four-parameter modified Gompertz function. A support vector machine (SVM) was applied to establish a quantitative prediction model between the scattering fitting parameters and the reference values. In addition, partial least squares discriminant analysis (PLS-DA) and Bayesian analysis were utilized as supervised and unsupervised techniques for the qualitative identification of meat spoilage. All stored chilled meat samples were classified into three grades: "fresh," "semi-fresh," and "spoiled." The Bayesian classification model was superior to PLS-DA, with an overall classification accuracy of 92.86%. The results demonstrated that the hyperspectral scattering technique combined with SVM and Bayesian classification provides a powerful capability for rapid, noninvasive assessment of meat spoilage.
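
    As a sketch of the profile-fitting step, the snippet below fits one common four-parameter modified Gompertz form to a simulated scattering profile. The exact parameterization used by the authors is not given in the abstract, so the function form here is an assumption; the four fitted parameters would then serve as inputs to the SVM model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def mod_gompertz(x, a, b, c, d):
        # One common four-parameter modified Gompertz form for hyperspectral
        # scattering profiles; the paper's exact parameterization may differ.
        return a + b * (1.0 - np.exp(-np.exp(c - d * x)))

    # Simulated scattering profile: reflectance vs distance from the incident
    # point (mm), with small additive noise.
    x = np.linspace(0.0, 10.0, 50)
    y = mod_gompertz(x, 0.1, 0.8, 2.0, 0.9)
    y = y + np.random.default_rng(1).normal(0.0, 0.01, x.size)

    params, _ = curve_fit(mod_gompertz, x, y, p0=[0.0, 1.0, 1.0, 1.0])
    print(params)  # fitted (a, b, c, d) become the features fed to the SVM
    ```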

  20. Use of Quantitative Microbial Risk Assessment to Improve Interpretation of a Recreational Water Epidemiological Study

    EPA Science Inventory

    We conducted a supplemental water quality monitoring study and quantitative microbial risk assessment (QMRA) to complement the United States Environmental Protection Agency’s (U.S. EPA) National Epidemiological and Environmental Assessment of Recreational Water study at Boq...

  1. A quantitative health assessment index for rapid evaluation of fish condition in the field

    SciTech Connect

    Adams, S.M. ); Brown, A.M. ); Goede, R.W. )

    1993-01-01

    The health assessment index (HAI) is an extension and refinement of a previously published field necropsy system. The HAI is a quantitative index that allows statistical comparisons of fish health among data sets. Index variables are assigned numerical values based on the degree of severity or damage incurred by an organ or tissue from environmental stressors. This approach has been used to evaluate the general health status of fish populations in a wide range of reservoir types in the Tennessee River basin (North Carolina, Tennessee, Alabama, Kentucky), in Hartwell Reservoir (Georgia, South Carolina) that is contaminated by polychlorinated biphenyls, and in the Pigeon River (Tennessee, North Carolina) that receives effluents from a bleached kraft mill. The ability of the HAI to accurately characterize the health of fish in these systems was evaluated by comparing this index to other types of fish health measures (contaminant, bioindicator, and reproductive analysis) made at the same time as the HAI. In all cases, the HAI demonstrated the same pattern of fish health status between sites as did each of the other more sophisticated health assessment methods. The HAI has proven to be a simple and inexpensive means of rapidly assessing general fish health in field situations. 29 refs., 5 tabs.

  2. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    SciTech Connect

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-02-15

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the

  3. Assessment of the sources of error affecting the quantitative accuracy of SPECT imaging in small animals

    NASA Astrophysics Data System (ADS)

    Hwang, Andrew B.; Franc, Benjamin L.; Gullberg, Grant T.; Hasegawa, Bruce H.

    2008-05-01

    Small animal SPECT imaging systems have multiple potential applications in biomedical research. Whereas SPECT data are commonly interpreted qualitatively in a clinical setting, the ability to accurately quantify measurements will increase the utility of the SPECT data for laboratory measurements involving small animals. In this work, we assess the effect of photon attenuation, scatter and partial volume errors on the quantitative accuracy of small animal SPECT measurements, first with Monte Carlo simulation and then confirmed with experimental measurements. The simulations modeled the imaging geometry of a commercially available small animal SPECT system. We simulated the imaging of a radioactive source within a cylinder of water, and reconstructed the projection data using iterative reconstruction algorithms. The size of the source and the size of the surrounding cylinder were varied to evaluate the effects of photon attenuation and scatter on quantitative accuracy. We found that photon attenuation can reduce the measured concentration of radioactivity in a volume of interest in the center of a rat-sized cylinder of water by up to 50% when imaging with iodine-125, and up to 25% when imaging with technetium-99m. When imaging with iodine-125, the scatter-to-primary ratio can reach up to approximately 30%, and can cause overestimation of the radioactivity concentration when reconstructing data with attenuation correction. We varied the size of the source to evaluate partial volume errors, which we found to be a strong function of the size of the volume of interest and the spatial resolution. These errors can result in large (>50%) changes in the measured amount of radioactivity. The simulation results were compared with and found to agree with experimental measurements. The inclusion of attenuation correction in the reconstruction algorithm improved quantitative accuracy. We also found that an improvement of the spatial resolution through the use of resolution
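
    The magnitude of the attenuation losses quoted above can be reproduced with a first-order Beer-Lambert estimate for a source at the center of a water cylinder. The attenuation coefficients and the 2 cm radius below are approximate textbook values and an assumed rat-sized geometry, not the simulation's actual inputs.

    ```python
    import numpy as np

    # Approximate linear attenuation coefficients of water (cm^-1) near the
    # principal emission energies; textbook values, not the paper's inputs.
    MU_WATER = {"I-125 (~28 keV)": 0.38, "Tc-99m (140 keV)": 0.15}

    r_cm = 2.0  # assumed radius of a rat-sized water cylinder
    for nuclide, mu in MU_WATER.items():
        attenuated = 1.0 - np.exp(-mu * r_cm)  # Beer-Lambert along the radius
        print(f"{nuclide}: ~{100 * attenuated:.0f}% of photons attenuated")
    # ~53% for I-125 and ~26% for Tc-99m, consistent with the reported
    # 50% and 25% reductions in measured activity concentration.
    ```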

  4. Assessing framing assumptions in quantitative health impact assessments: a housing intervention example.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Foss, Anna M

    2013-09-01

    Health impact assessment (HIA) is often used to determine ex ante the health impact of an environmental policy or an environmental intervention. Underpinning any HIA is the framing assumption, which defines the causal pathways mapping environmental exposures to health outcomes. The sensitivity of the HIA to the framing assumptions is often ignored. A novel method based on fuzzy cognitive maps (FCM) is developed to quantify the framing assumptions in the assessment stage of an HIA, and is then applied to a housing intervention (tightening insulation) as a case-study. Framing assumptions of the case-study were identified through a literature search of Ovid Medline (1948-2011). The FCM approach was used to identify the key variables that have the most influence in a HIA. Changes in air-tightness, ventilation, indoor air quality and mould/humidity have been identified as having the most influence on health. The FCM approach is widely applicable and can be used to inform the formulation of the framing assumptions in any quantitative HIA of environmental interventions. We argue that it is necessary to explore and quantify framing assumptions prior to conducting a detailed quantitative HIA during the assessment stage.
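
    For readers unfamiliar with the technique, a fuzzy cognitive map is a signed, weighted concept graph iterated to a steady state. The sketch below uses the standard sigmoid-squashed update rule; the four concepts and their weights are hypothetical, since the abstract does not publish the housing-intervention map.

    ```python
    import numpy as np

    def fcm_steady_state(W, x0, lam=1.0, steps=100, tol=1e-6):
        """Iterate a fuzzy cognitive map x <- sigmoid(W^T x) to convergence.

        W[i, j] is the causal weight of concept i on concept j, in [-1, 1].
        """
        x = np.asarray(x0, dtype=float)
        for _ in range(steps):
            x_new = 1.0 / (1.0 + np.exp(-lam * (W.T @ x)))
            if np.max(np.abs(x_new - x)) < tol:
                break
            x = x_new
        return x

    # Hypothetical 4-concept map: air-tightness, ventilation, mould/humidity,
    # health; weights are invented for illustration.
    W = np.array([
        [0.0, -0.7,  0.5,  0.0],   # tighter envelope -> less ventilation, more mould
        [0.0,  0.0, -0.6,  0.3],   # ventilation -> less mould, better health
        [0.0,  0.0,  0.0, -0.8],   # mould/humidity -> worse health
        [0.0,  0.0,  0.0,  0.0],
    ])
    print(fcm_steady_state(W, x0=[1.0, 0.5, 0.2, 0.5]))
    ```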

  5. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    SciTech Connect

    Malik, Afshan N.; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana; Laftah, Abas; Cunningham, Phil

    2011-08-19

    Highlights: • Mitochondrial dysfunction is central to many diseases of oxidative stress. • 95% of the mitochondrial genome is duplicated in the nuclear genome. • Dilution of untreated genomic DNA leads to dilution bias. • Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved and we have identified that current methods have at least one of the following three problems: (1) As much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) use of regions from genes such as β-actin and 18S rRNA which are repetitive and/or highly variable for qPCR of the nuclear genome leads to errors; and (3) the size difference of mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method using unique regions in the human mitochondrial genome not duplicated in the nuclear genome, a unique single copy region in the nuclear genome, and template treatment to remove dilution bias, to accurately quantify MtDNA from human samples.
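
    Once the pseudogene and dilution problems above are controlled, the Mt/N statistic itself reduces to a delta-Ct calculation. A minimal sketch, assuming near-100% amplification efficiency for both targets and a single-copy nuclear reference (the function name is a placeholder):

    ```python
    def mt_to_n_ratio(ct_mito, ct_nuclear, efficiency=2.0):
        """Relative mitochondrial DNA copy number per nuclear genome (Mt/N).

        With equal per-cycle efficiencies E, Mt/N = E**(Ct_nuclear - Ct_mito).
        Assumes a unique mitochondrial amplicon (no nuclear pseudogene
        co-amplification) and a single-copy nuclear target, as the method
        above requires.
        """
        return efficiency ** (ct_nuclear - ct_mito)

    print(mt_to_n_ratio(ct_mito=18.2, ct_nuclear=26.5))  # ~315 copies/genome
    ```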

  6. Accurate, quantitative assays for the hydrolysis of soluble type I, II, and III ³H-acetylated collagens by bacterial and tissue collagenases

    SciTech Connect

    Mallya, S.K.; Mookhtiar, K.A.; Van Wart, H.E.

    1986-11-01

    Accurate and quantitative assays for the hydrolysis of soluble ³H-acetylated rat tendon type I, bovine cartilage type II, and human amnion type III collagens by both bacterial and tissue collagenases have been developed. The assays are carried out at any temperature in the 1-30 °C range in a single reaction tube and the progress of the reaction is monitored by withdrawing aliquots as a function of time, quenching with 1,10-phenanthroline, and quantitation of the concentration of hydrolysis fragments. The latter is achieved by selective denaturation of these fragments by incubation under conditions described in the previous paper of this issue. The assays give percentages of hydrolysis of all three collagen types by neutrophil collagenase that agree well with the results of gel electrophoresis experiments. The initial rates of hydrolysis of all three collagens are proportional to the concentration of both neutrophil and Clostridial collagenases over a 10-fold range of enzyme concentrations. All three assays can be carried out at collagen concentrations that range from 0.06 to 2 mg/ml and give linear double reciprocal plots for both tissue and bacterial collagenases that can be used to evaluate the kinetic parameters Km and kcat or Vmax. The assay developed for the hydrolysis of rat type I collagen by neutrophil collagenase is shown to be more sensitive by at least one order of magnitude than comparable assays that use rat type I collagen fibrils or gels as substrate.

  7. Assessing temporal flux of plant hormones in stored processing potatoes using high definition accurate mass spectrometry

    PubMed Central

    Ordaz-Ortiz, José Juan; Foukaraki, Sofia; Terry, Leon Alexander

    2015-01-01

    Plant hormones are important molecules which at low concentration can regulate various physiological processes. Mass spectrometry has become a powerful technique for the quantification of multiple classes of plant hormones because of its high sensitivity and selectivity. We developed a new ultrahigh pressure liquid chromatography–full-scan high-definition accurate mass spectrometry method, for simultaneous determination of abscisic acid and four metabolites phaseic acid, dihydrophaseic acid, 7′-hydroxy-abscisic acid and abscisic acid glucose ester, cytokinins zeatin, zeatin riboside, gibberellins (GA1, GA3, GA4 and GA7) and indole-3-acetyl-L-aspartic acid. We measured the amount of plant hormones in the flesh and skin of two processing potato cvs. Sylvana and Russet Burbank stored for up to 30 weeks at 6 °C under ambient air conditions. Herein, we report for the first time that abscisic acid glucose ester seems to accumulate in the skin of potato tubers throughout storage time. The method achieved a lowest limit of detection of 0.22 ng g−1 of dry weight and a limit of quantification of 0.74 ng g−1 dry weight (zeatin riboside), and was able to recover, detect and quantify a total of 12 plant hormones spiked on flesh and skin of potato tubers. In addition, the mass accuracy for all compounds (<5 ppm) was evaluated. PMID:26504563

  8. Assessing temporal flux of plant hormones in stored processing potatoes using high definition accurate mass spectrometry.

    PubMed

    Ordaz-Ortiz, José Juan; Foukaraki, Sofia; Terry, Leon Alexander

    2015-01-01

    Plant hormones are important molecules which at low concentration can regulate various physiological processes. Mass spectrometry has become a powerful technique for the quantification of multiple classes of plant hormones because of its high sensitivity and selectivity. We developed a new ultrahigh pressure liquid chromatography-full-scan high-definition accurate mass spectrometry method, for simultaneous determination of abscisic acid and four metabolites phaseic acid, dihydrophaseic acid, 7'-hydroxy-abscisic acid and abscisic acid glucose ester, cytokinins zeatin, zeatin riboside, gibberellins (GA1, GA3, GA4 and GA7) and indole-3-acetyl-L-aspartic acid. We measured the amount of plant hormones in the flesh and skin of two processing potato cvs. Sylvana and Russet Burbank stored for up to 30 weeks at 6 °C under ambient air conditions. Herein, we report for the first time that abscisic acid glucose ester seems to accumulate in the skin of potato tubers throughout storage time. The method achieved a lowest limit of detection of 0.22 ng g(-1) of dry weight and a limit of quantification of 0.74 ng g(-1) dry weight (zeatin riboside), and was able to recover, detect and quantify a total of 12 plant hormones spiked on flesh and skin of potato tubers. In addition, the mass accuracy for all compounds (<5 ppm) was evaluated.

  9. Electroencephalographic Data Analysis With Visibility Graph Technique for Quantitative Assessment of Brain Dysfunction.

    PubMed

    Bhaduri, Susmita; Ghosh, Dipak

    2015-07-01

    Usual techniques for electroencephalographic (EEG) data analysis lack some of the important properties essential for quantitative assessment of the progress of the dysfunction of the human brain. EEG data are essentially nonlinear, and this nonlinear time series has been identified as multi-fractal in nature. We need rigorous techniques for such analysis. In this article, we present the visibility graph as the latest, rigorous technique that can assess the degree of multifractality accurately and reliably. Moreover, it has also been found that this technique can give reliable results with test data of comparatively short length. In this work, the visibility graph algorithm has been used to map a time series (EEG signals) to a graph in order to study the complexity and fractality of the time series. The power of scale-freeness of the visibility graph has been used as an effective measure of fractality in the EEG signal. The scale-freeness of the visibility graph has also been observed after averaging statistically independent samples of the signal. Scale-freeness of the visibility graph has been calculated for 5 sets of EEG data patterns, varying from normal eyes-closed to epileptic. The change in these values was analyzed further, and it was observed that the power of scale-freeness decreases uniformly from normal eyes-closed to epileptic data.
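
    A minimal implementation of the natural visibility graph mapping is shown below (a generic sketch of the published criterion, not the authors' code). The degree distribution of the resulting graph is what yields the power of scale-freeness used as the fractality measure.

    ```python
    import numpy as np

    def natural_visibility_graph(series):
        """Map a 1-D time series to visibility-graph edges.

        Samples i and j are connected when every intermediate sample k lies
        strictly below the straight line joining (i, y_i) and (j, y_j).
        """
        y = np.asarray(series, dtype=float)
        n = len(y)
        edges = []
        for i in range(n - 1):
            edges.append((i, i + 1))  # neighbours always see each other
            for j in range(i + 2, n):
                slope = (y[j] - y[i]) / (j - i)
                if all(y[k] < y[i] + slope * (k - i) for k in range(i + 1, j)):
                    edges.append((i, j))
        return edges

    # Degree distribution: a power-law tail P(k) ~ k**(-gamma) indicates
    # scale-freeness, and gamma serves as the fractality measure.
    edges = natural_visibility_graph(np.random.default_rng(0).normal(size=300))
    degrees = np.bincount(np.asarray(edges).ravel())
    print(degrees.max(), degrees.mean())
    ```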

  10. Assessing the Impact of a Quantitative Skills Course for Undergraduates

    ERIC Educational Resources Information Center

    Andersen, Kristi; Harsell, Dana Michael

    2005-01-01

    This paper evaluates the long-term benefits of a Syracuse University course offering, "Maxwell 201: Quantitative Methods for the Social Sciences" (MAX 201). The authors analyze data collected from class-administered pre- and post-tests and from a questionnaire sent to a random sample MAX 201 alumni to evaluate the extent to which…

  11. CUMULATIVE RISK ASSESSMENT: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS

    EPA Science Inventory

    INTRODUCTION: GETTING FROM TOXICOLOGY TO QUANTITATIVE ANALYSIS FOR CUMULATIVE RISK

    Hugh A. Barton (1) and Carey N. Pope (2)
    (1) US EPA, Office of Research and Development, National Health and Environmental Effects Research Laboratory, Research Triangle Park, NC
    (2) Department of...

  12. A Comparative Assessment of Greek Universities' Efficiency Using Quantitative Analysis

    ERIC Educational Resources Information Center

    Katharaki, Maria; Katharakis, George

    2010-01-01

    In part due to the increased demand for higher education, typical evaluation frameworks for universities often address the key issue of available resource utilisation. This study seeks to estimate the efficiency of 20 public universities in Greece through quantitative analysis (including performance indicators, data envelopment analysis (DEA) and…

  13. Serum Protein Profile at Remission Can Accurately Assess Therapeutic Outcomes and Survival for Serous Ovarian Cancer

    PubMed Central

    Ghamande, Sharad A.; Bush, Stephen; Ferris, Daron; Zhi, Wenbo; He, Mingfang; Wang, Meiyao; Wang, Xiaoxiao; Miller, Eric; Hopkins, Diane; Macfee, Michael; Guan, Ruili; Tang, Jinhai; She, Jin-Xiong

    2013-01-01

    Background: Biomarkers play critical roles in early detection, diagnosis and monitoring of therapeutic outcome and recurrence of cancer. Previous biomarker research on ovarian cancer (OC) has mostly focused on the discovery and validation of diagnostic biomarkers. The primary purpose of this study is to identify serum biomarkers for prognosis and therapeutic outcomes of ovarian cancer. Experimental Design: Forty serum proteins were analyzed in 70 serum samples from healthy controls (HC) and 101 serum samples from serous OC patients at three different disease phases: post diagnosis (PD), remission (RM) and recurrence (RC). The utility of serum proteins as OC biomarkers was evaluated using a variety of statistical methods including survival analysis. Results: Ten serum proteins (PDGF-AB/BB, PDGF-AA, CRP, sFas, CA125, SAA, sTNFRII, sIL-6R, IGFBP6 and MDC) have individually good area-under-the-curve (AUC) values (AUC = 0.69–0.86) and more than 10 three-marker combinations have excellent AUC values (0.91–0.93) in distinguishing active cancer samples (PD & RC) from HC. The mean serum protein levels for RM samples are usually intermediate between HC and OC patients with active cancer (PD & RC). Most importantly, five proteins (sICAM1, RANTES, sgp130, sTNFR-II and sVCAM1) measured at remission can classify, individually and in combination, serous OC patients into two subsets with significantly different overall survival (best HR = 17, p<10−3). Conclusion: We identified five serum proteins which, when measured at remission, can accurately predict the overall survival of serous OC patients, suggesting that they may be useful for monitoring the therapeutic outcomes for ovarian cancer. PMID:24244307
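
    The survival stratification reported above is the output of standard time-to-event modeling. The sketch below runs a Cox proportional-hazards fit on synthetic data with the lifelines package; the column names, marker values, and outcomes are invented, and only the workflow mirrors the study.

    ```python
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 100
    # Synthetic remission-phase marker levels (arbitrary units); purely
    # illustrative, not the study's data.
    df = pd.DataFrame({
        "sICAM1": rng.normal(250.0, 50.0, n),
        "RANTES": rng.normal(20.0, 5.0, n),
    })
    # Higher sICAM1 shortens survival in this toy generator.
    risk = 0.5 * (df["sICAM1"] - 250.0) / 50.0
    df["time_months"] = rng.exponential(40.0 * np.exp(-risk))
    df["death"] = (rng.random(n) < 0.6).astype(int)

    cph = CoxPHFitter()
    cph.fit(df, duration_col="time_months", event_col="death")
    cph.print_summary()  # per-marker hazard ratios, analogous to the reported HR
    ```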

  14. Chromatography paper as a low-cost medium for accurate spectrophotometric assessment of blood hemoglobin concentration.

    PubMed

    Bond, Meaghan; Elguea, Carlos; Yan, Jasper S; Pawlowski, Michal; Williams, Jessica; Wahed, Amer; Oden, Maria; Tkaczyk, Tomasz S; Richards-Kortum, Rebecca

    2013-06-21

    Anemia affects a quarter of the world's population, and a lack of appropriate diagnostic tools often prevents treatment in low-resource settings. Though the HemoCue 201+ is an appropriate device for diagnosing anemia in low-resource settings, the high cost of disposables ($0.99 per test in Malawi) limits its availability. We investigated using spectrophotometric measurement of blood spotted on chromatography paper as a low-cost (<$0.01 per test) alternative to HemoCue cuvettes. For this evaluation, donor blood was diluted with plasma to simulate anemia, a micropipette spotted blood on paper, and a bench-top spectrophotometer validated the approach before the development of a low-cost reader. We optimized impregnating paper with chemicals to lyse red blood cells, paper type, drying time, wavelengths measured, and sensitivity to variations in volume of blood, and we validated our approach using patient samples. Lysing the blood cells with sodium deoxycholate dried in Whatman Chr4 chromatography paper gave repeatable results, and the absorbance difference between 528 nm and 656 nm was stable over time in measurements taken up to 10 min after sample preparation. The method was insensitive to the amount of blood spotted on the paper over the range of 5 μL to 25 μL. We created a low-cost, handheld reader to measure the transmission of paper cuvettes at these optimal wavelengths. Training and validating our method with patient samples on both the spectrometer and the handheld reader showed that both devices are accurate to within 2 g dL(-1) of the HemoCue device for 98% and 95% of samples, respectively.
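
    The quantitation step amounts to a two-wavelength Beer-Lambert calibration. The sketch below fits a linear model from the 528 nm/656 nm absorbance difference to reference hemoglobin; the calibration points are synthetic, so the fitted coefficients are illustrative rather than the paper's.

    ```python
    import numpy as np

    def fit_hb_calibration(abs_diff, hb_ref):
        """Fit Hb (g/dL) = m * (A528 - A656) + b by least squares.

        abs_diff: absorbance differences measured on training samples
        hb_ref:   paired reference values from a validated analyzer
        """
        m, b = np.polyfit(abs_diff, hb_ref, 1)
        return lambda d: m * d + b

    # Synthetic calibration set, for illustration only.
    abs_diff = np.array([0.20, 0.45, 0.70, 0.95, 1.20])
    hb_ref = np.array([4.0, 8.0, 12.1, 15.9, 20.1])
    predict_hb = fit_hb_calibration(abs_diff, hb_ref)
    print(round(float(predict_hb(0.60)), 1))  # ~10.4 g/dL on this toy calibration
    ```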

  15. Detection and quantitation of trace phenolphthalein (in pharmaceutical preparations and in forensic exhibits) by liquid chromatography-tandem mass spectrometry, a sensitive and accurate method.

    PubMed

    Sharma, Kakali; Sharma, Shiba P; Lahiri, Sujit C

    2013-01-01

    Phenolphthalein, an acid-base indicator and laxative, is important as a constituent of widely used weight-reducing multicomponent food formulations. Phenolphthalein is a useful reagent in forensic science for the identification of blood stains of suspected victims and for apprehending erring officials accepting bribes in graft or trap cases. The pink-colored alkaline hand washes originating from phenolphthalein-smeared notes can easily be determined spectrophotometrically. But in many cases, the colored solution turns colorless with time, which renders the genuineness of bribe cases doubtful to the judiciary. Until now, no method has been available for the detection and identification of phenolphthalein in colorless forensic exhibits with positive proof. Liquid chromatography-tandem mass spectrometry was found to be the most sensitive and accurate method capable of detection and quantitation of trace phenolphthalein in commercial formulations and colorless forensic exhibits with positive proof. The detection limit of phenolphthalein was found to be 1.66 pg/L or ng/mL, and the calibration curve shows good linearity (r(2) = 0.9974).

  16. Allele Specific Locked Nucleic Acid Quantitative PCR (ASLNAqPCR): An Accurate and Cost-Effective Assay to Diagnose and Quantify KRAS and BRAF Mutation

    PubMed Central

    Morandi, Luca; de Biase, Dario; Visani, Michela; Cesari, Valentina; De Maglio, Giovanna; Pizzolitto, Stefano; Pession, Annalisa; Tallini, Giovanni

    2012-01-01

    The use of tyrosine kinase inhibitors (TKIs) requires testing for hot spot mutations of the molecular effectors downstream of the membrane-bound tyrosine kinases, since their wild-type status is expected for response to TKI therapy. We report a novel assay that we have called Allele Specific Locked Nucleic Acid quantitative PCR (ASLNAqPCR). The assay uses LNA-modified allele specific primers and LNA-modified beacon probes to increase sensitivity and specificity and to accurately quantify mutations. We designed primers specific for codon 12/13 KRAS mutations and BRAF V600E, and validated the assay with 300 routine samples from a variety of sources, including cytology specimens. All were analyzed by ASLNAqPCR and Sanger sequencing. Discordant cases were pyrosequenced. ASLNAqPCR correctly identified BRAF and KRAS mutations in all discordant cases, and all had a mutated/wild-type DNA ratio below the analytical sensitivity of the Sanger method. ASLNAqPCR was 100% specific, with greater accuracy and higher positive and negative predictive values compared with Sanger sequencing. The analytical sensitivity of ASLNAqPCR is 0.1%, allowing quantification of mutated DNA in small neoplastic cell clones. ASLNAqPCR can be performed in any laboratory with real-time PCR equipment, is very cost-effective, and can easily be adapted to detect hot spot mutations in other oncogenes.

  17. Aggregate versus individual-level sexual behavior assessment: how much detail is needed to accurately estimate HIV/STI risk?

    PubMed

    Pinkerton, Steven D; Galletly, Carol L; McAuliffe, Timothy L; DiFranceisco, Wayne; Raymond, H Fisher; Chesson, Harrell W

    2010-02-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis, in aggregate (i.e., total numbers of sex acts, collapsed across partners), or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate). There is a natural trade-off between the level of sexual behavior detail and the precision of HIV/STI acquisition risk estimates. The results of this study indicate that relatively simple aggregate data collection techniques suffice to adequately estimate HIV risk. For highly infectious STIs, in contrast, accurate STI risk assessment requires more intensive partner-by-partner methods.

  18. Assessing Student Teachers' Reflective Writing through Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Poldner, Eric; Van der Schaaf, Marieke; Simons, P. Robert-Jan; Van Tartwijk, Jan; Wijngaards, Guus

    2014-01-01

    Students' reflective essay writing can be stimulated by the formative assessments provided to them by their teachers. Such assessments contain information about the quality of students' reflective writings and offer suggestions for improvement. Despite the importance of formatively assessing students' reflective writings in teacher education…

  19. A Cost-Benefit and Accurate Method for Assessing Microalbuminuria: Single versus Frequent Urine Analysis.

    PubMed

    Hemmati, Roholla; Gharipour, Mojgan; Khosravi, Alireza; Jozan, Mahnaz

    2013-01-01

    Background. The purpose of this study was to answer the question of whether a single test for microalbuminuria yields a reliable conclusion, leading to cost savings. Methods. This cross-sectional study included a total of 126 consecutive persons. Microalbuminuria was assessed by collection of two fasting random urine specimens, one on arrival at the clinic and one a week later in the morning. Results. Overall, 17 of the 126 participants had microalbuminuria; of these, 12 were also identified by a single assessment, giving a sensitivity of 70.6%, a specificity of 100%, a PPV of 100%, a NPV of 95.6%, and an accuracy of 96.0%. The measured sensitivity, specificity, PPV, NPV, and accuracy in hypertensive patients were 73.3%, 100%, 100%, 94.8%, and 95.5%, respectively. These rates in the nonhypertensive group were 50.0%, 100%, 100%, 97.3%, and 97.4%, respectively. According to the ROC curve analysis, a single measurement of UACR had high discriminative value for distinguishing defective from normal renal function (c = 0.989). Urinary albumin concentration in a single measurement also had high discriminative value for the diagnosis of kidney damage (c = 0.995). Conclusion. Single testing of both UACR and urine albumin level, rather than frequent testing, provides high diagnostic sensitivity, specificity, and accuracy, as well as high predictive values, in the total population and in hypertensive subgroups.
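
    The reported statistics follow directly from the 2×2 counts implied by the abstract (17 true cases, 12 detected by the single test, and no false positives among the remaining 109 participants). A quick check:

    ```python
    def diagnostic_metrics(tp, fp, fn, tn):
        """Standard 2x2 diagnostic-accuracy metrics."""
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / (tp + fp + fn + tn),
        }

    # Counts reconstructed from the abstract: 17 true microalbuminuria cases,
    # 12 detected by the single test (TP), 5 missed (FN), no false positives.
    print(diagnostic_metrics(tp=12, fp=0, fn=5, tn=109))
    # -> sensitivity 0.706, specificity 1.0, ppv 1.0, npv 0.956, accuracy 0.960
    ```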

  20. Quantitative assessment of scatter correction techniques incorporated in next generation dual-source computed tomography

    NASA Astrophysics Data System (ADS)

    Mobberley, Sean David

    Accurate, cross-scanner assessment of in-vivo air density, used to quantitatively assess the amount and distribution of emphysema in COPD subjects, has remained elusive. Hounsfield units (HU) within tracheal air can be considerably more positive than -1000 HU. With the advent of new dual-source scanners which employ dedicated scatter correction techniques, it is of interest to evaluate how the quantitative measures of lung density compare between dual-source and single-source scan modes. This study has sought to characterize in-vivo and phantom-based air metrics using dual-energy computed tomography technology where the nature of the technology has required adjustments to scatter correction. Anesthetized sheep (N=6) and swine (N=13; swine having a more human-like rib cage shape), a lung phantom, and a thoracic phantom were studied using a dual-source MDCT scanner (Siemens Definition Flash). Multiple dual-source dual-energy (DSDE) and single-source (SS) scans taken at different energy levels and scan settings were acquired for direct quantitative comparison. Density histograms were evaluated for the lung, tracheal, water and blood segments. Image data were obtained at 80, 100, 120, and 140 kVp in the SS mode (B35f kernel) and at 80, 100, 140, and 140-Sn (tin filtered) kVp in the DSDE mode (B35f and D30f kernels), in addition to variations in dose, rotation time, and pitch. To minimize the effect of cross-scatter, the phantom scans in the DSDE mode were obtained by reducing the tube current of one of the tubes to its minimum (near zero) value. When using image data obtained in the DSDE mode, the median HU values in the tracheal regions of all animals and the phantom were consistently closer to -1000 HU regardless of reconstruction kernel (chapters 3 and 4). Similarly, HU values of water and blood were consistently closer to their nominal values of 0 HU and 55 HU respectively. When using image data obtained in the SS mode, the air CT numbers demonstrated a consistent positive shift of up to 35 HU.

  1. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals' Behaviour.

    PubMed

    Barnard, Shanis; Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs' behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals' quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog's shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  2. Quick, Accurate, Smart: 3D Computer Vision Technology Helps Assessing Confined Animals’ Behaviour

    PubMed Central

    Calderara, Simone; Pistocchi, Simone; Cucchiara, Rita; Podaliri-Vulpiani, Michele; Messori, Stefano; Ferri, Nicola

    2016-01-01

    Mankind directly controls the environment and lifestyles of several domestic species for purposes ranging from production and research to conservation and companionship. These environments and lifestyles may not offer these animals the best quality of life. Behaviour is a direct reflection of how the animal is coping with its environment. Behavioural indicators are thus among the preferred parameters to assess welfare. However, behavioural recording (usually from video) can be very time consuming and the accuracy and reliability of the output rely on the experience and background of the observers. The outburst of new video technology and computer image processing gives the basis for promising solutions. In this pilot study, we present a new prototype software able to automatically infer the behaviour of dogs housed in kennels from 3D visual data and through structured machine learning frameworks. Depth information acquired through 3D features, body part detection and training are the key elements that allow the machine to recognise postures, trajectories inside the kennel and patterns of movement that can be later labelled at convenience. The main innovation of the software is its ability to automatically cluster frequently observed temporal patterns of movement without any pre-set ethogram. Conversely, when common patterns are defined through training, a deviation from normal behaviour in time or between individuals could be assessed. The software accuracy in correctly detecting the dogs’ behaviour was checked through a validation process. An automatic behaviour recognition system, independent from human subjectivity, could add scientific knowledge on animals’ quality of life in confinement as well as saving time and resources. This 3D framework was designed to be invariant to the dog’s shape and size and could be extended to farm, laboratory and zoo quadrupeds in artificial housing. The computer vision technique applied to this software is innovative in non

  3. Distinguishing nanomaterial particles from background airborne particulate matter for quantitative exposure assessment

    NASA Astrophysics Data System (ADS)

    Ono-Ogasawara, Mariko; Serita, Fumio; Takaya, Mitsutoshi

    2009-10-01

    As the production of engineered nanomaterials expands, the chance that workers involved in the manufacturing process will be exposed to nanoparticles also increases. A risk management system based on the precautionary principle is needed for workplaces in the nanomaterial industry. One of the problems in such a risk management system is the difficulty of exposure assessment. In this article, examples of exposure assessment in nanomaterial industries are reviewed, with a focus on distinguishing engineered nanomaterial particles from background nanoparticles in the workplace atmosphere. An approach by JNIOSH (Japan National Institute of Occupational Safety and Health) to quantitatively measure exposure to carbonaceous nanomaterials is also introduced. In addition to real-time measurements and qualitative analysis by electron microscopy, quantitative chemical analysis is necessary for quantitatively assessing exposure to nanomaterials. Chemical analysis is suitable for quantitative exposure measurement, especially at facilities with high levels of background nanoparticles.

  4. Optimization of Dual-Energy Xenon-CT for Quantitative Assessment of Regional Pulmonary Ventilation

    PubMed Central

    Fuld, Matthew K.; Halaweish, Ahmed; Newell, John D.; Krauss, Bernhard; Hoffman, Eric A.

    2013-01-01

    Objective: Dual-energy X-ray computed tomography (DECT) offers visualization of the airways and quantitation of regional pulmonary ventilation using a single breath of inhaled xenon gas. In this study we seek to optimize scanning protocols for DECT xenon gas ventilation imaging of the airways and lung parenchyma and to characterize the quantitative nature of the developed protocols through a series of test-object and animal studies. Materials and Methods: The Institutional Animal Care and Use Committee approved all animal studies reported here. A range of xenon-oxygen gas mixtures (0, 20, 25, 33, 50, 66, 100%; balance oxygen) were scanned in syringes and balloon test-objects to optimize the delivered gas mixture for assessment of regional ventilation while allowing for the development of improved three-material decomposition calibration parameters. Additionally, to alleviate gravitational effects on xenon gas distribution, we replaced a portion of the oxygen in the xenon/oxygen gas mixture with helium and compared gas distributions in a rapid-prototyped human central-airway test-object. Additional syringe tests were performed to determine if the introduction of helium had any effect on xenon quantitation. Xenon gas mixtures were delivered to anesthetized swine in order to assess airway and lung parenchymal opacification while evaluating various DECT scan acquisition settings. Results: Attenuation curves for xenon were obtained from the syringe test objects and were used to develop improved three-material decomposition parameters (HU enhancement per percent xenon: within the chest phantom, 2.25 at 80 kVp, 1.7 at 100 kVp, and 0.76 at 140 kVp with tin filtration; in open air, 2.5 at 80 kVp, 1.95 at 100 kVp, and 0.81 at 140 kVp with tin filtration). The addition of helium improved the distribution of xenon gas to the gravitationally non-dependent portion of the airway tree test-object, while not affecting quantitation of xenon in the three-material decomposition DECT. 40%Xe
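
    Using the chest-phantom calibration slopes quoted above, an enhancement map converts to xenon concentration by a simple linear inversion. The single-energy sketch below is for illustration only; the scanner's actual three-material decomposition solves the coupled dual-energy system.

    ```python
    # Chest-phantom calibration slopes reported in the abstract
    # (HU enhancement per percent xenon); 140 kVp is with tin filtration.
    SLOPE_HU_PER_PCT_XE = {80: 2.25, 100: 1.7, 140: 0.76}

    def xenon_percent(hu_enhancement, kvp):
        """Estimate regional xenon concentration from CT enhancement.

        A single-energy linear inversion for illustration; not the full
        three-material decomposition performed by the scanner.
        """
        return hu_enhancement / SLOPE_HU_PER_PCT_XE[kvp]

    print(xenon_percent(45.0, 80))  # ~20% xenon for a 45 HU enhancement
    ```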

  5. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    ERIC Educational Resources Information Center

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory…

  6. Quantitative Assessment of Neuromotor Function in Adolescents with High Functioning Autism and Asperger Syndrome

    ERIC Educational Resources Information Center

    Freitag, Christine M.; Kleser, Christina; Schneider, Marc; von Gontard, Alexander

    2007-01-01

    Background: Motor impairment in children with Asperger Syndrome (AS) or High functioning autism (HFA) has been reported previously. This study presents results of a quantitative assessment of neuromotor skills in 14-22 year old HFA/AS. Methods: 16 HFA/AS and 16 IQ-matched controls were assessed by the Zurich Neuromotor Assessment (ZNA). Results:…

  7. Personal Exposure Monitoring Wearing Protocol Compliance: An Initial Assessment of Quantitative Measurements

    EPA Science Inventory

    Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...

  8. Sensors on the humerus are not necessary for an accurate assessment of humeral kinematics in constrained movements.

    PubMed

    Lin, Yin-Liang; Karduna, Andrew R

    2013-08-01

    The measurement of humeral kinematics with a sensor on the humerus is susceptible to large errors due to skin motion artifacts. An alternative approach is to use data from a forearm sensor, combined with data from either a scapular or thoracic sensor. We used three tasks to assess the errors of these approaches: humeral elevation, elbow flexion and humeral internal rotation. Compared with the humeral method, the forearm methods (using either a scapular or thoracic sensor) demonstrated significantly smaller root mean square errors in humeral elevation and humeral internal rotation tasks. Although the errors of the forearm methods were significantly larger than those of the humeral method during elbow flexion, the errors of the forearm methods still were below 3°. Therefore, these forearm methods may be able to accurately measure humeral motion. In addition, since no difference was found between the forearm methods using the scapular or thoracic sensor, it may be possible to accurately assess both shoulder and elbow kinematics with only two sensors: one on the forearm and one on the scapula.

  9. Duplicate portion sampling combined with spectrophotometric analysis affords the most accurate results when assessing daily dietary phosphorus intake.

    PubMed

    Navarro-Alarcon, Miguel; Zambrano, Esmeralda; Moreno-Montoro, Miriam; Agil, Ahmad; Olalla, Manuel

    2012-08-01

    The assessment of daily dietary phosphorus (P) intake is a major concern in human nutrition because of its relationship with Ca and Mg metabolism and osteoporosis. Within this context, we hypothesized that several of the methods available for the assessment of daily dietary intake of P are equally accurate and reliable, although few studies have been conducted to confirm this. The aim of this study then was to evaluate daily dietary P intake, which we did by 3 methods: duplicate portion sampling of 108 hospital meals, combined either with spectrophotometric analysis or the use of food composition tables, and 24-hour dietary recall for 3 consecutive days plus the use of food composition tables. The mean daily dietary P intakes found were 1106 ± 221, 1480 ± 221, and 1515 ± 223 mg/d, respectively. Daily dietary intake of P determined by spectrophotometric analysis was significantly lower (P < .001) and closer to dietary reference intakes for adolescents aged from 14 to 18 years (88.5%) and adult subjects (158.1%) compared with the other 2 methods. Duplicate portion sampling with P analysis takes into account the influence of technological and cooking processes on the P content of foods and meals and therefore afforded the most accurate and reliable daily dietary P intakes. The use of food composition tables overestimated daily dietary P intake. No adverse effects in relation to P nutrition (deficiencies or toxic effects) were encountered.

  10. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, an...

  11. Using Integrated Environmental Modeling to Automate a Process-Based Quantitative Microbial Risk Assessment (presentation)

    EPA Science Inventory

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and...

  12. Quantitative Microbial Risk Assessment Tutorial: Installation of Software for Watershed Modeling in Support of QMRA

    EPA Science Inventory

    This tutorial provides instructions for accessing, retrieving, and downloading the following software to install on a host computer in support of Quantitative Microbial Risk Assessment (QMRA) modeling: • SDMProjectBuilder (which includes the Microbial Source Module as part...

  13. Using integrated environmental modeling to automate a process-based Quantitative Microbial Risk Assessment

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Integrated Environmental Modeling (IEM) organizes multidisciplinary knowledge that explains and predicts environmental-system response to stressors. A Quantitative Microbial Risk Assessment (QMRA) is an approach integrating a range of disparate data (fate/transport, exposure, and human health effect...

  14. Assessment of metabolic bone diseases by quantitative computed tomography

    SciTech Connect

    Richardson, M.L.; Genant, H.K.; Cann, C.E.; Ettinger, B.; Gordan, G.S.; Kolb, F.O.; Reiser, U.J.

    1985-05-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. They then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements.

  15. Quantitative Assessment of Neurite Outgrowth in PC12 Cells

    EPA Science Inventory

    In vitro test methods can provide a rapid approach for the screening of large numbers of chemicals for their potential to produce toxicity. In order to identify potential developmental neurotoxicants, assessment of critical neurodevelopmental processes such as neuronal differenti...

  16. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    DOE PAGES

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; ...

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.
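
    As a rough illustration of the image-analysis step described above (not the authors' published algorithm), a minimal sketch that thresholds a stained-surface photograph and reports the fouled-area fraction; the image and threshold below are invented:

    ```python
    import numpy as np

    def coverage_fraction(image: np.ndarray, threshold: float) -> float:
        """Fraction of pixels whose stain intensity exceeds `threshold`.

        `image` is a 2-D grayscale array in [0, 1]; in practice the threshold
        would be derived from an unfouled control surface.
        """
        return float(np.mean(image > threshold))

    # Toy example: a 100x100 surface with a stained (brighter) patch.
    rng = np.random.default_rng(0)
    img = rng.uniform(0.0, 0.3, size=(100, 100))   # clean background
    img[20:40, 30:70] += 0.5                       # stained biofilm patch
    print(f"estimated coverage: {coverage_fraction(img, 0.4):.1%}")
    ```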

  17. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis

    SciTech Connect

    Larimer, Curtis J.; Winder, Eric M.; Jeters, Robert T.; Prowant, Matthew S.; Nettleship, Ian; Addleman, Raymond S.; Bonheyo, George T.

    2015-12-07

    Here, the accumulation of bacteria in surface attached biofilms, or biofouling, can be detrimental to human health, dental hygiene, and many industrial processes. A critical need in identifying and preventing the deleterious effects of biofilms is the ability to observe and quantify their development. Analytical methods capable of assessing early stage fouling are cumbersome or lab-confined, subjective, and qualitative. Herein, a novel photographic method is described that uses biomolecular staining and image analysis to enhance contrast of early stage biofouling. A robust algorithm was developed to objectively and quantitatively measure surface accumulation of Pseudomonas putida from photographs and results were compared to independent measurements of cell density. Results from image analysis quantified biofilm growth intensity accurately and with approximately the same precision of the more laborious cell counting method. This simple method for early stage biofilm detection enables quantifiable measurement of surface fouling and is flexible enough to be applied from the laboratory to the field. Broad spectrum staining highlights fouling biomass, photography quickly captures a large area of interest, and image analysis rapidly quantifies fouling in the image.

  18. A quantitative assessment of Arctic shipping in 2010-2014.

    PubMed

    Eguíluz, Victor M; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M

    2016-08-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011-2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far.

  19. A quantitative assessment of Arctic shipping in 2010–2014

    NASA Astrophysics Data System (ADS)

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-08-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far.

  20. Assessing the Phagosome Proteome by Quantitative Mass Spectrometry.

    PubMed

    Peltier, Julien; Härtlova, Anetta; Trost, Matthias

    2017-01-01

    Phagocytosis is the process that engulfs particles in vesicles called phagosomes that are trafficked through a series of maturation steps, culminating in the destruction of the internalized cargo. Because phagosomes are in direct contact with the particle and undergo constant fusion and fission events with other organelles, characterization of the phagosomal proteome is a powerful tool to understand mechanisms controlling innate immunity as well as vesicle trafficking. The ability to isolate highly pure phagosomes through the use of latex beads led to an extensive use of proteomics to study phagosomes under different stimuli. Thousands of different proteins have been identified and quantified, revealing new properties and shedding new light on the dynamics and composition of maturing phagosomes and innate immunity mechanisms. In this chapter, we describe how quantitative proteomic methods such as label-free quantitation, dimethyl labeling, or Tandem Mass Tag (TMT) labeling can be applied for the characterization of protein composition and translocation during maturation of phagosomes in macrophages.
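
    For readers unfamiliar with the quantitation step, a minimal label-free sketch of how per-sample normalization and log2 fold changes are typically computed; the intensities and the |log2 FC| > 1 cutoff are synthetic illustrations, not the chapter's protocol:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    early = rng.lognormal(10, 1, size=500)          # protein intensities, 0 min
    late = early * rng.lognormal(0, 0.3, size=500)  # 60 min, most unchanged
    late[:25] *= 4.0                                # 25 proteins recruited

    # Normalize out global loading differences, then compare.
    early /= np.median(early)
    late /= np.median(late)
    log2_fc = np.log2(late / early)
    print(f"{np.sum(np.abs(log2_fc) > 1)} proteins with |log2 FC| > 1")
    ```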

  1. Uncertainty in environmental health impact assessment: quantitative methods and perspectives.

    PubMed

    Mesa-Frias, Marco; Chalabi, Zaid; Vanni, Tazio; Foss, Anna M

    2013-01-01

    Environmental health impact assessment models are subject to great uncertainty due to the complex associations between environmental exposures and health. Quantifying the impact of uncertainty is important if the models are used to support health policy decisions. We conducted a systematic review to identify and appraise current methods used to quantify the uncertainty in environmental health impact assessment. In the 19 studies meeting the inclusion criteria, several methods were identified. These were grouped into random sampling methods, second-order probability methods, Bayesian methods, fuzzy sets, and deterministic sensitivity analysis methods. All 19 studies addressed the uncertainty in the parameter values but only 5 of the studies also addressed the uncertainty in the structure of the models. None of the articles reviewed considered conceptual sources of uncertainty associated with the framing assumptions or the conceptualisation of the model. Future research should attempt to broaden the way uncertainty is taken into account in environmental health impact assessments.
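
    A minimal sketch of the random-sampling (Monte Carlo) family of methods identified in the review; the exposure-response model and all distributions below are illustrative assumptions, not values from any reviewed study:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    exposure = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n)  # ug/m3
    slope = rng.normal(loc=0.006, scale=0.002, size=n)              # risk per ug/m3
    slope = np.clip(slope, 0.0, None)       # negative slopes deemed implausible
    baseline_rate = 0.01                    # assumed baseline annual incidence

    attributable_rate = baseline_rate * slope * exposure
    lo, med, hi = np.percentile(attributable_rate, [2.5, 50, 97.5])
    print(f"attributable rate: median {med:.2e} (95% interval {lo:.2e}-{hi:.2e})")
    ```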

  2. Developing a Quantitative Tool for Sustainability Assessment of HEIs

    ERIC Educational Resources Information Center

    Waheed, Bushra; Khan, Faisal I.; Veitch, Brian

    2011-01-01

    Purpose: Implementation of a sustainability paradigm demands new choices and innovative ways of thinking. The main objective of this paper is to provide a meaningful sustainability assessment tool for making informed decisions, which is applied to higher education institutions (HEIs). Design/methodology/approach: The objective is achieved by…

  3. Quantitative Assessment of Spray Deposition with Water-Sensitive Paper

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Spray droplets, discharged from the lower six nozzles of an airblast sprayer, were sampled on pairs of absorbent filter and water-sensitive papers at nine distances from sprayer. Spray deposition on filter targets was measured by fluorometry and spray distribution on WSP targets was assessed by t...

  4. Quantitative Assessments of Sensitivity to Reinforcement Contingencies in Mental Retardation.

    ERIC Educational Resources Information Center

    Dube, William V.; McIlvane, William J.

    2002-01-01

    Sensitivity to reinforcement contingencies was examined in six individuals with mental retardation using a concurrent operants procedure in the context of a computer game. Results included individual differences in sensitivity and differential sensitivity to rate and magnitude variation. Results suggest that comprehensive assessments of potential…

  5. INCORPORATION OF MOLECULAR ENDPOINTS INTO QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    The U.S. Environmental Protection Agency has recently released its Guidelines for Carcinogen Risk Assessment. These new guidelines benefit from the significant progress that has been made in understanding the cancer process and also from the more than 20 years experience that EPA...

  6. High Resolution Peripheral Quantitative Computed Tomography for Assessment of Bone Quality

    NASA Astrophysics Data System (ADS)

    Kazakia, Galateia

    2014-03-01

    The study of bone quality is motivated by the high morbidity, mortality, and societal cost of skeletal fractures. Over 10 million people are diagnosed with osteoporosis in the US alone, suffering 1.5 million osteoporotic fractures and costing the health care system over $17 billion annually. Accurate assessment of fracture risk is necessary to ensure that pharmacological and other interventions are appropriately administered. Currently, areal bone mineral density (aBMD) based on 2D dual-energy X-ray absorptiometry (DXA) is used to determine osteoporotic status and predict fracture risk. Though aBMD is a significant predictor of fracture risk, it does not completely explain bone strength or fracture incidence. The major limitation of aBMD is the lack of 3D information, which is necessary to distinguish between cortical and trabecular bone and to quantify bone geometry and microarchitecture. High resolution peripheral quantitative computed tomography (HR-pQCT) enables in vivo assessment of volumetric BMD within specific bone compartments as well as quantification of geometric and microarchitectural measures of bone quality. HR-pQCT studies have documented that trabecular bone microstructure alterations are associated with fracture risk independent of aBMD. ... Cortical bone microstructure, specifically porosity, is a major determinant of strength, stiffness, and fracture toughness of cortical tissue and may further explain the aBMD-independent effect of age on bone fragility and fracture risk. The application of finite element analysis (FEA) to HR-pQCT data permits estimation of patient-specific bone strength, shown to be associated with fracture incidence independent of aBMD. This talk will describe the HR-pQCT scanner, established metrics of bone quality derived from HR-pQCT data, and novel analyses of bone quality currently in development. Cross-sectional and longitudinal HR-pQCT studies investigating the impact of aging, disease, injury, gender, race, and

  7. Quantitative Evaluation of MODIS Fire Radiative Power Measurement for Global Smoke Emissions Assessment

    NASA Technical Reports Server (NTRS)

    Ichoku, Charles; Ellison, Luke

    2011-01-01

    Satellite remote sensing is providing us tremendous opportunities to measure the fire radiative energy (FRE) release rate or power (FRP) from open biomass burning, which affects many vegetated regions of the world on a seasonal basis. Knowledge of the biomass burning characteristics and emission source strengths of different (particulate and gaseous) smoke constituents is one of the principal ingredients upon which the assessment, modeling, and forecasting of their distribution and impacts depend. This knowledge can be gained through accurate measurement of FRP, which has been shown to have a direct relationship with the rates of biomass consumption and emissions of major smoke constituents. Over the last decade or so, FRP has been routinely measured from space by both the MODIS sensors aboard the polar orbiting Terra and Aqua satellites, and the SEVIRI sensor aboard the Meteosat Second Generation (MSG) geostationary satellite. During the last few years, FRP has steadily gained increasing recognition as an important parameter for facilitating the development of various scientific studies and applications relating to the quantitative characterization of biomass burning and their emissions. To establish the scientific integrity of the FRP as a stable quantity that can be measured consistently across a variety of sensors and platforms, with the potential of being utilized to develop a unified long-term climate data record of fire activity and impacts, it needs to be thoroughly evaluated, calibrated, and validated. Therefore, we are conducting a detailed analysis of the FRP products from MODIS to evaluate the uncertainties associated with them, such as those due to the effects of satellite variable observation geometry and other factors, in order to establish their error budget for use in diverse scientific research and applications. In this presentation, we will show recent results of the MODIS FRP uncertainty analysis and error mitigation solutions, and demonstrate

  8. Quantitative Assessment of Parametric Uncertainty in Northern Hemisphere PAH Concentrations.

    PubMed

    Thackray, Colin P; Friedman, Carey L; Zhang, Yanxu; Selin, Noelle E

    2015-08-04

    We quantitatively examine the relative importance of uncertainty in emissions and physicochemical properties (including reaction rate constants) to Northern Hemisphere (NH) and Arctic polycyclic aromatic hydrocarbon (PAH) concentrations, using a computationally efficient numerical uncertainty technique applied to the global-scale chemical transport model GEOS-Chem. Using polynomial chaos (PC) methods, we propagate uncertainties in physicochemical properties and emissions for the PAHs benzo[a]pyrene, pyrene and phenanthrene to simulated spatially resolved concentration uncertainties. We find that the leading contributors to parametric uncertainty in simulated concentrations are the black carbon-air partition coefficient and oxidation rate constant for benzo[a]pyrene, and the oxidation rate constants for phenanthrene and pyrene. NH geometric average concentrations are more sensitive to uncertainty in the atmospheric lifetime than to emissions rate. We use the PC expansions and measurement data to constrain parameter uncertainty distributions to observations. This narrows a priori parameter uncertainty distributions for phenanthrene and pyrene, and leads to higher values for OH oxidation rate constants and lower values for European PHE emission rates.
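
    A toy illustration of the polynomial chaos idea, assuming a one-parameter steady-state box model C = E/k with a lognormal loss rate; this is not the paper's GEOS-Chem configuration, and all parameter values are invented:

    ```python
    import numpy as np
    from numpy.polynomial import hermite_e as He
    from math import factorial, sqrt, pi

    E_rate = 1.0          # emission rate (arbitrary units), assumed known
    k0, sigma = 0.2, 0.5  # median loss rate and log-sd, assumed

    def model(x):
        """Box-model concentration as a function of a standard-normal germ x."""
        return E_rate / (k0 * np.exp(sigma * x))

    order = 6
    nodes, weights = He.hermegauss(order + 4)     # Gauss-Hermite_e quadrature
    norm = 1.0 / sqrt(2.0 * pi)                   # weight -> standard normal pdf

    # Spectral coefficients c_j = E[f(X) He_j(X)] / j!
    coeffs = []
    for j in range(order + 1):
        basis = He.hermeval(nodes, [0] * j + [1])
        coeffs.append(norm * np.sum(weights * model(nodes) * basis) / factorial(j))

    mean_pc = coeffs[0]
    var_pc = sum(factorial(j) * coeffs[j] ** 2 for j in range(1, order + 1))
    print(f"PC mean {mean_pc:.3f}, PC std {np.sqrt(var_pc):.3f}")

    # Cross-check against brute-force Monte Carlo.
    x_mc = np.random.default_rng(1).standard_normal(200_000)
    print(f"MC mean {model(x_mc).mean():.3f}, MC std {model(x_mc).std():.3f}")
    ```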

  9. A quantitative assessment of Arctic shipping in 2010–2014

    PubMed Central

    Eguíluz, Victor M.; Fernández-Gracia, Juan; Irigoien, Xabier; Duarte, Carlos M.

    2016-01-01

    Rapid loss of sea ice is opening up the Arctic Ocean to shipping, a practice that is forecasted to increase rapidly by 2050 when many models predict that the Arctic Ocean will largely be free of ice toward the end of summer. These forecasts carry considerable uncertainty because Arctic shipping was previously considered too sparse to allow for adequate validation. Here, we provide quantitative evidence that the extent of Arctic shipping in the period 2011–2014 is already significant and that it is concentrated (i) in the Norwegian and Barents Seas, and (ii) predominantly accessed via the Northeast and Northwest Passages. Thick ice along the forecasted direct trans-Arctic route was still present in 2014, preventing transit. Although Arctic shipping remains constrained by the extent of ice coverage, during every September, this coverage is at a minimum, allowing the highest levels of shipping activity. Access to Arctic resources, particularly fisheries, is the most important driver of Arctic shipping thus far. PMID:27477878

  10. Quantitative assessment of rabbit alveolar macrophage function by chemiluminescence

    SciTech Connect

    Brennan, P.C.; Kirchner, F.R.

    1985-08-01

    Rabbit alveolar macrophages (RAM) were cultured for 24 hr with concentrations ranging from 3 to 12 µg/ml of vanadium oxide (V₂O₅), a known cytotoxic agent, or with high-molecular-weight organic by-products from coal gasification processes. After culture the cells were harvested and tested for functional capacity using three types of indicators: (1) luminol-amplified chemiluminescence (CL), which quantitatively detects photon emission due to respiratory burst activity measured in a newly designed instrument with standardized reagents; (2) the reduction of nitro blue tetrazolium-saturated polyacrylamide beads, a semiquantitative measure of respiratory burst activity; and (3) phagocytic efficiency, defined as percentage of cells incorporating immunoglobulin-coated polyacrylamide beads. Chemiluminescence declined linearly with increasing concentrations of V₂O₅ over the dose range tested. Dye reduction and phagocytic efficiency similarly decreased with increasing V₂O₅ concentration, but were less sensitive indicators of functional impairment than CL as measured by the amount required to reduce the response to 50% of untreated cells. The effect of coal gasification condensates on RAM function varied, but in general these tests also indicated that the CL response was the most sensitive indicator.
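
    A small worked sketch of reading a 50%-response dose off a linear dose-response fit, in the spirit of the chemiluminescence result above; the data points are invented:

    ```python
    import numpy as np

    dose = np.array([0.0, 3.0, 6.0, 9.0, 12.0])     # ug/ml V2O5 (illustrative)
    cl = np.array([100.0, 83.0, 67.0, 48.0, 33.0])  # CL, % of untreated cells

    slope, intercept = np.polyfit(dose, cl, deg=1)  # CL declines linearly
    dose_50 = (50.0 - intercept) / slope            # dose giving 50% response
    print(f"fit: CL = {intercept:.1f} {slope:+.2f} * dose; 50% at {dose_50:.1f} ug/ml")
    ```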

  11. Quantitative statistical assessment of conditional models for synthetic aperture radar.

    PubMed

    DeVore, Michael D; O'Sullivan, Joseph A

    2004-02-01

    Many applications of object recognition in the presence of pose uncertainty rely on statistical models, conditioned on pose, for observations. The image statistics of three-dimensional (3-D) objects are often assumed to belong to a family of distributions with unknown model parameters that vary with one or more continuous-valued pose parameters. Many methods for statistical model assessment, for example the tests of Kolmogorov-Smirnov and K. Pearson, require that all model parameters be fully specified or that sample sizes be large. Assessing pose-dependent models from a finite number of observations over a variety of poses can violate these requirements. However, a large number of small samples, corresponding to unique combinations of object, pose, and pixel location, are often available. We develop methods for model testing which assume a large number of small samples and apply them to the comparison of three models for synthetic aperture radar images of 3-D objects with varying pose. Each model is directly related to the Gaussian distribution and is assessed both in terms of goodness-of-fit and underlying model assumptions, such as independence, known mean, and homoscedasticity. Test results are presented in terms of the functional relationship between a given significance level and the percentage of samples that would fail a test at that level.
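
    A minimal sketch of the "many small samples" testing strategy with a fully specified Gaussian model; the samples are synthetic and the setup is generic, not the paper's SAR-specific procedure:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    alpha = 0.05
    mu, sd = 0.0, 1.0                    # fully specified model under test

    # 2000 small samples (e.g., one per object/pose/pixel combination).
    samples = [rng.normal(mu, sd, size=8) for _ in range(2000)]
    pvals = np.array([stats.kstest(s, "norm", args=(mu, sd)).pvalue
                      for s in samples])

    pct_fail = 100.0 * np.mean(pvals < alpha)
    print(f"{pct_fail:.1f}% of samples fail at alpha={alpha} "
          f"(expect ~5% if the model holds)")
    ```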

  12. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration.
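
    A schematic example of HU-range classification in the spirit of the review; the thresholds below are rough assumptions, not the authors' calibration:

    ```python
    import numpy as np

    def composition(hu: np.ndarray) -> dict:
        """Percent of classified voxels falling in each HU-defined tissue class."""
        classes = {
            "fat":                 (hu >= -200) & (hu < -10),
            "connective/atrophic": (hu >= -10)  & (hu < 30),
            "normal muscle":       (hu >= 30)   & (hu <= 150),
        }
        total = sum(mask.sum() for mask in classes.values())
        return {name: 100.0 * mask.sum() / total for name, mask in classes.items()}

    rng = np.random.default_rng(3)
    volume = rng.normal(loc=40, scale=45, size=(64, 64, 32))  # synthetic HU volume
    for tissue, pct in composition(volume).items():
        print(f"{tissue}: {pct:.1f}%")
    ```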

  13. Shrinking the Psoriasis Assessment Gap: Early Gene-Expression Profiling Accurately Predicts Response to Long-Term Treatment.

    PubMed

    Correa da Rosa, Joel; Kim, Jaehwan; Tian, Suyan; Tomalin, Lewis E; Krueger, James G; Suárez-Fariñas, Mayte

    2017-02-01

    There is an "assessment gap" between the moment a patient's response to treatment is biologically determined and when a response can actually be determined clinically. Patients' biochemical profiles are a major determinant of clinical outcome for a given treatment. It is therefore feasible that molecular-level patient information could be used to decrease the assessment gap. Thanks to clinically accessible biopsy samples, high-quality molecular data for psoriasis patients are widely available. Psoriasis is therefore an excellent disease for testing the prospect of predicting treatment outcome from molecular data. Our study shows that gene-expression profiles of psoriasis skin lesions, taken in the first 4 weeks of treatment, can be used to accurately predict (>80% area under the receiver operating characteristic curve) the clinical endpoint at 12 weeks. This could decrease the psoriasis assessment gap by 2 months. We present two distinct prediction modes: a universal predictor, aimed at forecasting the efficacy of untested drugs, and specific predictors aimed at forecasting clinical response to treatment with four specific drugs: etanercept, ustekinumab, adalimumab, and methotrexate. We also develop two forms of prediction: one from detailed, platform-specific data and one from platform-independent, pathway-based data. We show that key biomarkers are associated with responses to drugs and doses and thus provide insight into the biology of pathogenesis reversion.

  14. Accurate dose assessment system for an exposed person utilising radiation transport calculation codes in emergency response to a radiological accident.

    PubMed

    Takahashi, F; Shigemori, Y; Seki, A

    2009-01-01

    A system has been developed to assess radiation dose distribution inside the body of exposed persons in a radiological accident by utilising radiation transport calculation codes, MCNP and MCNPX. The system consists mainly of two parts, pre-processor and post-processor of the radiation transport calculation. Programs for the pre-processor are used to set up a 'problem-dependent' input file, which defines the accident condition and dosimetric quantities to be estimated. The program developed for the post-processor part can effectively indicate dose information based upon the output file of the code. All of the programs in the dosimetry system can be executed with a generally used personal computer and accurately give the dose profile for an exposed person in a radiological accident without complicated procedures. An experiment using a physical phantom was carried out to verify the performance of the dosimetry system with the developed programs in a gamma ray irradiation field.

  15. A quantitative epigenetic approach for the assessment of cigarette consumption

    PubMed Central

    Philibert, Robert; Hollenbeck, Nancy; Andersen, Eleanor; Osborn, Terry; Gerrard, Meg; Gibbons, Frederick X.; Wang, Kai

    2015-01-01

    Smoking is the largest preventable cause of morbidity and mortality in the world. Despite the development of numerous preventive and treatment interventions, the rate of daily smoking in the United States is still approximately 22%. Effective psychosocial interventions and pharmacologic agents exist for the prevention and treatment of smoking. Unfortunately, both approaches are hindered by our inability to accurately quantify the amount of cigarette consumption from the point of initial experimentation to the point of total dependency. Recently, we and others have demonstrated that smoking is associated with genome-wide changes in DNA methylation. However, whether this advance in basic science can be employed as a reliable assay that is useful for clinical diagnosis and treatment has not been shown. In this communication, we determine the sensitivity and specificity of five of the most consistently replicated CpG loci with respect to smoking status using data from a publicly available dataset. We show that methylation status at a CpG locus in the aryl hydrocarbon receptor repressor, cg05575921, is both sensitive and specific for smoking status in adults with a receiver operating characteristic area under the curve of 0.99. Given recent demonstrations that methylation at this locus reflects both intensity of smoking and the degree of smoking cessation, we conclude that a methylation-based diagnostic at this locus could have a prominent role in understanding the impact of new products, such as e-cigarettes, on initiation of cigarette smoking among adolescents, while improving the prevention and treatment of smoking, and smoking related disorders.
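
    A minimal sketch of evaluating a single methylation marker as a smoking classifier via ROC analysis; the beta values are synthetic and the Youden's J cutoff is an illustrative choice, not the paper's procedure:

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score, roc_curve

    rng = np.random.default_rng(5)
    beta_nonsmokers = rng.normal(0.85, 0.05, size=200)  # methylation beta values
    beta_smokers = rng.normal(0.60, 0.10, size=200)     # locus hypomethylated in smokers

    y = np.r_[np.zeros(200), np.ones(200)]              # 1 = smoker
    score = -np.r_[beta_nonsmokers, beta_smokers]       # lower beta -> smoker

    print(f"AUC = {roc_auc_score(y, score):.3f}")
    fpr, tpr, thr = roc_curve(y, score)
    best = np.argmax(tpr - fpr)                         # Youden's J cutoff
    print(f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
    ```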

  16. Quantitative phase imaging technologies to assess neuronal activity (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Thouvenin, Olivier; Fink, Mathias; Boccara, Claude

    2016-03-01

    Active neurons tend to exhibit different dynamical behavior compared to resting ones: among other changes, vesicular transport toward the synapses increases, while axonal growth slows. Previous studies also reported small phase variations occurring simultaneously with the action potential. Such changes exhibit time scales ranging from milliseconds to several seconds on spatial scales smaller than the optical diffraction limit. Therefore, QPI systems are of particular interest for measuring neuronal activity without labels. Here, we report the development of two new QPI systems that should enable the detection of such activity. Both systems can acquire full-field phase images with sub-nanometer sensitivity at a few hundred frames per second. The first setup is a synchronous combination of Full Field Optical Coherence Tomography (FF-OCT) and fluorescence wide-field imaging; the latter modality enables the measurement of neuronal electrical activity using calcium indicators. In cultures, FF-OCT exhibits features similar to Digital Holographic Microscopy (DHM), apart from the complex computational reconstruction. However, FF-OCT is of particular interest for measuring phase variations in tissues. The second setup is based on a Quantitative Differential Interference Contrast setup mounted in an epi-illumination configuration with spectrally incoherent illumination. Such a common-path interferometer exhibits very good mechanical stability, and thus enables the measurement of phase images over hours. Additionally, such a setup can measure not only a height change but also an optical index change for both polarizations. Hence, one can simultaneously measure a phase change and a birefringence change.

  17. Assessment of Quantitative Precipitation Forecasts from Operational NWP Models (Invited)

    NASA Astrophysics Data System (ADS)

    Sapiano, M. R.

    2010-12-01

    Previous work has shown that satellite and numerical model estimates of precipitation have complementary strengths, with satellites having greater skill at detecting convective precipitation events and model estimates having greater skill at detecting stratiform precipitation. This is due in part to the challenges associated with retrieving stratiform precipitation from satellites and the difficulty in resolving sub-grid scale processes in models. These complementary strengths can be exploited to obtain new merged satellite/model datasets, and several such datasets have been constructed using reanalysis data. Whilst reanalysis data are stable in a climate sense, they also have relatively coarse resolution compared to the satellite estimates (many of which are now commonly available at quarter degree resolution) and they necessarily use fixed forecast systems that are not state-of-the-art. An alternative to reanalysis data is to use Operational Numerical Weather Prediction (NWP) model estimates, which routinely produce precipitation with higher resolution and using the most modern techniques. Such estimates have not been combined with satellite precipitation and their relative skill has not been sufficiently assessed beyond model validation. The aim of this work is to assess the information content of the models relative to satellite estimates with the goal of improving techniques for merging these data types. To that end, several operational NWP precipitation forecasts have been compared to satellite and in situ data and their relative skill in forecasting precipitation has been assessed. In particular, the relationship between precipitation forecast skill and other model variables will be explored to see if these other model variables can be used to estimate the skill of the model at a particular time. Such relationships would provide a basis for determining weights and errors of any merged products.
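
    A minimal sketch of a standard categorical skill assessment for precipitation forecasts (a 2x2 contingency table with POD, FAR, and equitable threat score); the forecast/observation series are synthetic, and these particular scores are conventional choices rather than ones named in the abstract:

    ```python
    import numpy as np

    def skill_scores(forecast: np.ndarray, observed: np.ndarray) -> dict:
        """POD, FAR, and ETS from boolean rain/no-rain series."""
        hits = np.sum(forecast & observed)
        misses = np.sum(~forecast & observed)
        false_alarms = np.sum(forecast & ~observed)
        correct_neg = np.sum(~forecast & ~observed)
        n = hits + misses + false_alarms + correct_neg
        hits_random = (hits + misses) * (hits + false_alarms) / n
        return {
            "POD": hits / (hits + misses),
            "FAR": false_alarms / (hits + false_alarms),
            "ETS": (hits - hits_random)
                   / (hits + misses + false_alarms - hits_random),
        }

    rng = np.random.default_rng(11)
    obs = rng.random(10_000) < 0.2                    # rain 20% of the time
    fcst = obs ^ (rng.random(10_000) < 0.15)          # imperfect forecast
    print(skill_scores(fcst, obs))
    ```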

  18. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  19. An integrated environmental modeling framework for performing quantitative microbial risk assessments

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail potential human-heal...

  20. An integrated environmental modeling framework for performing Quantitative Microbial Risk Assessments

    EPA Science Inventory

    Standardized methods are often used to assess the likelihood of a human-health effect from exposure to a specified hazard, and inform opinions and decisions about risk management and communication. A Quantitative Microbial Risk Assessment (QMRA) is specifically adapted to detail ...

  1. Disability and occupational assessment: objective diagnosis and quantitative impairment rating.

    PubMed

    Williams, C Donald

    2010-01-01

    Industrial insurance originated in Europe in the nineteenth century and replaced the old system of negligence liability in the United States between 1910 and 1940. Today psychiatric disability assessments are performed by psychiatrists in the context of Social Security Disability Insurance applications, workers' compensation claims, private disability insurance claims, and fitness for duty evaluations. Expertise in the performance of psychiatric disability evaluations is required, but general psychiatric residency programs provide experience only with treatment evaluations, which differ fundamentally from independent medical evaluations as to role boundaries and the focus of assessment. Psychiatrists offer opinions regarding psychiatric impairments, but administrative or judicial tribunals make the actual determinations of disability. Social Security Disability Insurance evaluations and workers' compensation evaluations are discussed, as is the distinction between diagnoses, which are categorical, and impairment ratings, which are dimensional. Inconsistency in impairment ratings has been problematic in the United States and elsewhere in the workers' compensation arena. A protocol for achieving more consistent impairment ratings is proposed, one that correlates three commonly used global rating scales in a 3 × 5 grid, supplemented by objective psychological test data.

  2. A comprehensive reliability assessment of quantitative diffusion tensor tractography.

    PubMed

    Wang, Jun Yi; Abdi, Hervé; Bakhadirov, Khamid; Diaz-Arrastia, Ramon; Devous, Michael D

    2012-04-02

    Diffusion tensor tractography is increasingly used to examine structural connectivity in the brain in various conditions, but its test-retest reliability is understudied. The main purposes of this study were to evaluate 1) the reliability of quantitative measurements of diffusion tensor tractography and 2) the effect on reliability of the number of gradient sampling directions and scan repetition. Images were acquired from ten healthy participants. Ten fiber regions of nine major fiber tracts were reconstructed and quantified using six fiber variables. Intra- and inter-session reliabilities were estimated using intraclass correlation coefficient (ICC) and coefficient of variation (CV), and were compared to pinpoint major error sources. Additional pairwise comparisons were made between the reliability of images with 30 directions and NEX 2 (DTI30-2), 30 directions and NEX 1 (DTI30-1), and 15 directions and NEX 2 (DTI15-2) to determine whether increasing gradient directions and scan repetition improved reliability. Of the 60 tractography measurements, 43 showed intersession CV ≤ 10%, ICC ≥ .70, or both for DTI30-2, 40 measurements for DTI30-1, and 37 for DTI15-2. Most of the reliable measurements were associated with the tracts corpus callosum, cingulum, cerebral peduncular fibers, uncinate fasciculus, and arcuate fasciculus. These reliable measurements included fractional anisotropy (FA) and mean diffusivity of all 10 fiber regions. Intersession reliability was significantly worse than intra-session reliability for FA, mean length, and tract volume measurements from DTI15-2, indicating that the combination of MRI signal variation and physiological noise/change over time was the major error source for this sequence. Increasing the number of gradient directions from 15 to 30 while controlling the scan time significantly affected values for all six variables and reduced intersession variability for mean length and tract volume measurements. Additionally, while

  3. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  4. Quantitative Assessment of Workload and Stressors in Clinical Radiation Oncology

    SciTech Connect

    Mazur, Lukasz M.; Mosaly, Prithima R.; Jackson, Marianne; Chang, Sha X.; Burkhardt, Katharin Deschesne; Adams, Robert D.; Jones, Ellen L.; Hoyle, Lesley; Xu, Jing; Rockwell, John; Marks, Lawrence B.

    2012-08-01

    Purpose: Workload level and sources of stressors have been implicated as sources of error in multiple settings. We assessed workload levels and sources of stressors among radiation oncology professionals. Furthermore, we explored the potential association between workload and the frequency of radiotherapy incidents reported by the World Health Organization (WHO). Methods and Materials: Data collection was aimed at various tasks performed by 21 study participants from different radiation oncology professional subgroups (simulation therapists, radiation therapists, physicists, dosimetrists, and physicians). Workload was assessed using the National Aeronautics and Space Administration Task-Load Index (NASA TLX). Sources of stressors were quantified using observational methods and segregated using a standard taxonomy. Comparisons between professional subgroups and tasks were made using analysis of variance (ANOVA), multivariate ANOVA, and the Duncan test. An association between workload levels (NASA TLX) and the frequency of radiotherapy incidents (WHO incidents) was explored (Pearson correlation test). Results: A total of 173 workload assessments were obtained. Overall, simulation therapists had relatively low workloads (NASA TLX range, 30-36), and physicists had relatively high workloads (NASA TLX range, 51-63). NASA TLX scores for physicians, radiation therapists, and dosimetrists ranged from 40-52. There was marked intertask/professional subgroup variation (P<.0001). Mental demand (P<.001), physical demand (P=.001), and effort (P=.006) significantly differed among professional subgroups. Typically, there were 3-5 stressors per cycle of analyzed tasks with the following distribution: interruptions (41.4%), time factors (17%), technical factors (13.6%), teamwork issues (11.6%), patient factors (9.0%), and environmental factors (7.4%). A positive association between workload and the frequency of radiotherapy incidents reported by the WHO was found (r = 0.87, P value = .045).

  5. Quantitation of carboxyhaemoglobin in blood: external quality assessment of techniques.

    PubMed

    Barnett, K; Wilson, J F

    1998-06-01

    The performance of four dedicated carbon monoxide (CO)-oximeters (AVL, Chiron, IL, Radiometer), spectrophotometry with and without dithionite, spectrophotometry by second derivative, and the Whitehead and Worthington precipitation technique for the measurement of carboxyhaemoglobin in blood was compared by a mean of 136 participants in the United Kingdom National External Quality Assessment Scheme across 21 samples formulated to contain from 4% to 48% carboxyhaemoglobin. The dedicated instruments and spectrophotometry by second derivative were of significantly higher precision than the other techniques, producing fewer measurements rejected as being > 3 standard deviations from the sample mean and having a lower standard deviation for non-rejected measurements. The AVL instrument and spectrophotometry by second derivative had a significant positive bias compared to the other techniques. The Whitehead and Worthington method was of unacceptably low precision.
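
    A minimal sketch of the scheme's stated outlier handling (rejecting measurements more than 3 standard deviations from the sample mean) followed by a precision summary; the participant values are invented, and real schemes may iterate the rejection:

    ```python
    import numpy as np

    def trimmed_summary(values, n_sd: float = 3.0):
        """Single-pass 3-SD rejection, then consensus mean and CV (%)."""
        values = np.asarray(values, dtype=float)
        keep = np.abs(values - values.mean()) <= n_sd * values.std(ddof=1)
        kept = values[keep]
        cv = 100.0 * kept.std(ddof=1) / kept.mean()
        return kept.mean(), cv, int((~keep).sum())

    rng = np.random.default_rng(9)
    reported = np.append(rng.normal(24.0, 1.2, size=130), [35.5, 3.1])  # %COHb
    mean, cv, rejected = trimmed_summary(reported)
    print(f"consensus {mean:.1f}% COHb, CV {cv:.1f}%, {rejected} rejected")
    ```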

  6. Quantitative assessment of impedance tomography for temperature measurements in hyperthermia.

    PubMed

    Blad, B; Persson, B; Lindström, K

    1992-01-01

    The objective of this study is a non-invasive assessment of the thermal dose in hyperthermia. Electrical impedance tomography (EIT) has previously been given a first trial as a temperature monitoring method together with microwave-induced hyperthermia treatment, but it has not been thoroughly investigated. In the present work we have examined this method in order to investigate the correlation in vitro between the true spatial temperature distribution and the corresponding measured relative resistivity changes. Different hyperthermia techniques, such as interstitial water tubing, microwave-induced and laser-induced heating, and ferromagnetic seeds, have been used. The results show that it is possible to find a correlation between the measured temperature values and the tomographically measured relative resistivity changes in tissue-equivalent phantoms. However, the observed uncertainty in the temperature coefficients shows that the method must be improved before it can be applied clinically in vivo.

  7. Compressed natural gas bus safety: a quantitative risk assessment.

    PubMed

    Chamberlain, Samuel; Modarres, Mohammad

    2005-04-01

    This study assesses the fire safety risks associated with compressed natural gas (CNG) vehicle systems, comprising primarily a typical school bus and supporting fuel infrastructure. The study determines the sensitivity of the results to variations in component failure rates and consequences of fire events. The components and subsystems that contribute most to fire safety risk are determined. Finally, the results are compared to fire risks of the present generation of diesel-fueled school buses. Direct computation of the safety risks associated with diesel-powered vehicles is possible because these are mature technologies for which historical performance data are available. Because of limited experience, fatal accident data for CNG bus fleets are minimal. Therefore, this study uses the probabilistic risk assessment (PRA) approach to model and predict fire safety risk of CNG buses. Generic failure data, engineering judgments, and assumptions are used in this study. This study predicts the mean fire fatality risk for typical CNG buses as approximately 0.23 fatalities per 100-million miles for all people involved, including bus passengers. The study estimates mean values of 0.16 fatalities per 100-million miles for bus passengers only. Based on historical data, diesel school bus mean fire fatality risk is 0.091 and 0.0007 per 100-million miles for all people and bus passengers, respectively. One can therefore conclude that CNG buses carry roughly 2.5 times the fire fatality risk of diesel buses, with bus passengers at over two orders of magnitude greater risk. The study estimates a mean fire risk frequency of 2.2 × 10⁻⁵ fatalities/bus per year. The 5% and 95% uncertainty bounds are 9.1 × 10⁻⁶ and 4.0 × 10⁻⁵, respectively. The risk result was found to be affected most by failure rates of pressure relief valves, CNG cylinders, and fuel piping.
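
    A hedged illustration of how a PRA's mean and 5%/95% bounds relate under a lognormal uncertainty assumption; the median and geometric standard deviation below are chosen to resemble, not reproduce, the figures quoted above:

    ```python
    import numpy as np

    rng = np.random.default_rng(2024)
    median = 2.0e-5                      # fatalities/bus-year, assumed median
    gsd = 1.6                            # geometric standard deviation, assumed

    freq = rng.lognormal(np.log(median), np.log(gsd), size=500_000)
    p5, p95 = np.percentile(freq, [5, 95])
    print(f"mean {freq.mean():.2e}, 5% {p5:.2e}, 95% {p95:.2e}")
    ```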

  8. An educationally inspired illustration of two-dimensional Quantitative Microbiological Risk Assessment (QMRA) and sensitivity analysis.

    PubMed

    Vásquez, G A; Busschaert, P; Haberbeck, L U; Uyttendaele, M; Geeraerd, A H

    2014-11-03

    Quantitative Microbiological Risk Assessment (QMRA) is a structured methodology used to assess the risk involved in ingestion of a pathogen. It applies mathematical models combined with an accurate exploitation of data sets, represented by distributions and, in the case of two-dimensional Monte Carlo simulations, their hyperparameters. This research aims to highlight background information, assumptions and truncations of a two-dimensional QMRA and advanced sensitivity analysis. We believe that such a detailed listing is not always clearly presented in actual risk assessment studies, while it is essential to ensure reliable and realistic simulations and interpretations. As a case-study, we are considering the occurrence of listeriosis in smoked fish products in Belgium during the period 2008-2009, using two-dimensional Monte Carlo and two sensitivity analysis methods (Spearman correlation and Sobol sensitivity indices) to estimate the most relevant factors of the final risk estimate. A risk estimate of 0.018% per consumption of contaminated smoked fish by an immunocompromised person was obtained. The final estimate of listeriosis cases (23) is consistent with the actual reported result for the same period and population. Variability on the final risk estimate is determined by the variability regarding (i) consumer refrigerator temperatures, (ii) the reference growth rate of L. monocytogenes, (iii) the minimum growth temperature of L. monocytogenes and (iv) consumer portion size. Variability regarding the initial contamination level of L. monocytogenes tends to appear as a determinant of risk variability only when the minimum growth temperature is not included in the sensitivity analysis; when it is included, the impact of variability in the initial contamination level of L. monocytogenes disappears. Uncertainty determinants of the final risk indicated the need of gathering more information on the reference growth rate and the minimum
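
    A minimal two-dimensional Monte Carlo sketch: an outer loop samples parameter uncertainty, an inner loop samples variability, here with a generic exponential dose-response model; all numbers are illustrative, not the case-study's L. monocytogenes inputs:

    ```python
    import numpy as np

    rng = np.random.default_rng(13)
    n_uncertainty, n_variability = 200, 10_000

    risks = np.empty(n_uncertainty)
    for i in range(n_uncertainty):
        r = rng.lognormal(np.log(1e-12), 0.5)                  # uncertain parameter
        dose = rng.lognormal(np.log(1e6), 1.5, n_variability)  # variable CFU/serving
        risks[i] = np.mean(1.0 - np.exp(-r * dose))            # exponential model

    lo, med, hi = np.percentile(risks, [2.5, 50, 97.5])
    print(f"per-serving risk: median {med:.2e} "
          f"(95% uncertainty {lo:.2e}-{hi:.2e})")
    ```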

  9. Stepwise quantitative risk assessment as a tool for characterization of microbiological food safety.

    PubMed

    van Gerwen, S J; te Giffel, M C; van't Riet, K; Beumer, R R; Zwietering, M H

    2000-06-01

    This paper describes a system for the microbiological quantitative risk assessment for food products and their production processes. The system applies a stepwise risk assessment, allowing the main problems to be addressed before focusing on less important problems. First, risks are assessed broadly, using order of magnitude estimates. Characteristic numbers are used to quantitatively characterize microbial behaviour during the production process. These numbers help to highlight the major risk-determining phenomena, and to find negligible aspects. Second, the risk-determining phenomena are studied in more detail. Both general and/or specific models can be used for this and varying situations can be simulated to quantitatively describe the risk-determining phenomena. Third, even more detailed studies can be performed where necessary, for instance by using stochastic variables. The system for quantitative risk assessment has been implemented as a decision supporting expert system called SIEFE: Stepwise and Interactive Evaluation of Food safety by an Expert System. SIEFE performs bacterial risk assessments in a structured manner, using various information sources. Because all steps are transparent, every step can easily be scrutinized. In the current study the effectiveness of SIEFE is shown for a cheese spread. With this product, quantitative data concerning the major risk-determining factors were not completely available to carry out a full detailed assessment. However, this did not necessarily hamper adequate risk estimation. Using ranges of values instead helped identifying the quantitatively most important parameters and the magnitude of their impact. This example shows that SIEFE provides quantitative insights into production processes and their risk-determining factors to both risk assessors and decision makers, and highlights critical gaps in knowledge.

  10. Quantitative assessment of the retinal microvasculature using optical coherence tomography angiography

    NASA Astrophysics Data System (ADS)

    Chu, Zhongdi; Lin, Jason; Gao, Chen; Xin, Chen; Zhang, Qinqin; Chen, Chieh-Li; Roisman, Luis; Gregori, Giovanni; Rosenfeld, Philip J.; Wang, Ruikang K.

    2016-06-01

    Optical coherence tomography angiography (OCTA) is clinically useful for the qualitative assessment of the macular microvasculature. However, there is a need for comprehensive quantitative tools to help objectively analyze the OCT angiograms. Few studies have reported the use of a single quantitative index to describe vessel density in OCT angiograms. In this study, we introduce a five-index quantitative analysis of OCT angiograms in an attempt to detect and assess vascular abnormalities from multiple perspectives. The indices include vessel area density, vessel skeleton density, vessel diameter index, vessel perimeter index, and vessel complexity index. We show the usefulness of the proposed indices with five illustrative cases. Repeatability is tested on both a healthy case and a stable diseased case, giving interclass coefficients smaller than 0.031. The results demonstrate that our proposed quantitative analysis may be useful as a complement to conventional OCTA for the diagnosis of disease and monitoring of treatment.
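
    A sketch of two of the five indices (vessel area density and vessel skeleton density) computed from an already-binarized angiogram; the toy mask and the use of scikit-image skeletonization are assumptions, not the authors' implementation:

    ```python
    import numpy as np
    from skimage.morphology import skeletonize

    def vessel_indices(vessel_mask: np.ndarray) -> dict:
        """`vessel_mask` is a 2-D boolean map of vessel pixels."""
        skeleton = skeletonize(vessel_mask)
        n_pixels = vessel_mask.size
        return {
            "vessel_area_density": vessel_mask.sum() / n_pixels,
            "vessel_skeleton_density": skeleton.sum() / n_pixels,
        }

    # Toy angiogram: a few horizontal "vessels" on a 128x128 field.
    mask = np.zeros((128, 128), dtype=bool)
    mask[20:24, :] = mask[60:63, :] = mask[100:102, :] = True
    print(vessel_indices(mask))
    ```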

  11. Validation of a quantitative phosphorus loss assessment tool.

    PubMed

    White, Michael J; Storm, Daniel E; Smolen, Michael D; Busteed, Philip R; Zhang, Hailin; Fox, Garey A

    2014-01-01

    Pasture Phosphorus Management Plus (PPM Plus) is a tool that allows nutrient management and conservation planners to evaluate phosphorus (P) loss from agricultural fields. This tool uses a modified version of the widely used Soil and Water Assessment Tool model with a vastly simplified interface. The development of PPM Plus has been fully described in previous publications; in this article we evaluate the accuracy of PPM Plus using 286 field-years of runoff, sediment, and P validation data from runoff studies at various locations in Oklahoma, Texas, Arkansas, and Georgia. Land uses include pasture, small grains, and row crops with rainfall ranging from 630 to 1390 mm yr⁻¹, with and without animal manure application. PPM Plus explained 68% of the variability in total P loss, 56% of runoff, and 73% of the variability of sediment yield. An empirical model developed from these data using soil test P, total applied P, slope, and precipitation only accounted for 15% of the variability in total P loss, which implies that a process-based model is required to account for the diversity present in these data. PPM Plus is an easy-to-use conservation planning tool for P loss prediction, which, with modification, could be applicable at the regional and national scales.

  12. Quantitative assessment of protein function prediction from metagenomics shotgun sequences.

    PubMed

    Harrington, E D; Singh, A H; Doerks, T; Letunic, I; von Mering, C; Jensen, L J; Raes, J; Bork, P

    2007-08-28

    To assess the potential of protein function prediction in environmental genomics data, we analyzed shotgun sequences from four diverse and complex habitats. Using homology searches as well as customized gene neighborhood methods that incorporate intergenic and evolutionary distances, we inferred specific functions for 76% of the 1.4 million predicted ORFs in these samples (83% when nonspecific functions are considered). Surprisingly, these fractions are only slightly smaller than the corresponding ones in completely sequenced genomes (83% and 86%, respectively, by using the same methodology) and considerably higher than previously thought. For as many as 75,448 ORFs (5% of the total), only neighborhood methods can assign functions, illustrated here by a previously undescribed gene associated with the well characterized heme biosynthesis operon and a potential transcription factor that might regulate a coupling between fatty acid biosynthesis and degradation. Our results further suggest that, although functions can be inferred for most proteins on earth, many functions remain to be discovered in numerous small, rare protein families.

  13. Quantitative assessment of gene expression network module-validation methods.

    PubMed

    Li, Bing; Zhang, Yingying; Yu, Yanan; Wang, Pengqian; Wang, Yongcheng; Wang, Zhong; Wang, Yongyan

    2015-10-16

    Validation of pluripotent modules in diverse networks holds enormous potential for systems biology and network pharmacology. An arising challenge is how to assess the accuracy of discovering all potential modules from multi-omic networks and validating their architectural characteristics based on innovative computational methods beyond function enrichment and biological validation. To survey progress in this domain, we systematically divided the existing Computational Validation Approaches based on Modular Architecture (CVAMA) into topology-based approaches (TBA) and statistics-based approaches (SBA). We compared the available module validation methods based on 11 gene expression datasets, and partially consistent results in the form of homogeneous models were obtained with each individual approach, whereas contradictory results were found between TBA and SBA. The TBA of the Zsummary value had a higher Validation Success Ratio (VSR) (51%) and a higher Fluctuation Ratio (FR) (80.92%), whereas the SBA of the approximately unbiased (AU) p-value had a lower VSR (12.3%) and a lower FR (45.84%). A gray-area simulation study revealed a consistent result for these two models and indicated a lower Variation Ratio (VR) (8.10%) of TBA at 6 simulated levels. Despite facing many novel challenges and evidence limitations, CVAMA may offer novel insights into modular networks.

  14. Quantitative assessment of damage growth in graphite epoxy laminates by acousto-ultrasonic measurements

    NASA Technical Reports Server (NTRS)

    Talreja, R.; Govada, A.; Henneke, E. G., II

    1984-01-01

    The acoustoultrasonic NDT method proposed by Vary (1976, 1978) for quantitative assessment of damage growth in composite laminates can both respond to the development of damage states and furnish quantitative parameters that monitor this damage development. Attention is presently given to data obtained for the case of quasi-static loading and fatigue testing of graphite-epoxy laminates. The shape parameters of the power spectral density for the ultrasonic signals correlate well with such other indications of damage development as stiffness degradation.
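
    A generic sketch of extracting shape parameters from a signal's power spectral density, in the spirit of the acousto-ultrasonic parameters mentioned above; the synthetic burst and the centroid/bandwidth choice are illustrative assumptions:

    ```python
    import numpy as np
    from scipy.signal import welch

    rng = np.random.default_rng(6)
    fs = 1_000_000.0                                  # 1 MHz sampling (assumed)
    t = np.arange(4096) / fs
    # Synthetic "acousto-ultrasonic" burst: damped tone plus noise.
    signal = np.exp(-t * 2e4) * np.sin(2 * np.pi * 150e3 * t)
    signal += 0.05 * rng.standard_normal(t.size)

    f, pxx = welch(signal, fs=fs, nperseg=1024)
    centroid = np.sum(f * pxx) / np.sum(pxx)          # first spectral moment
    bandwidth = np.sqrt(np.sum((f - centroid) ** 2 * pxx) / np.sum(pxx))
    print(f"spectral centroid {centroid/1e3:.0f} kHz, "
          f"RMS bandwidth {bandwidth/1e3:.0f} kHz")
    ```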

  15. Quantitative assessment of corpus callosum morphology in periventricular nodular heterotopia.

    PubMed

    Pardoe, Heath R; Mandelstam, Simone A; Hiess, Rebecca Kucharsky; Kuzniecky, Ruben I; Jackson, Graeme D

    2015-01-01

    We investigated systematic differences in corpus callosum morphology in periventricular nodular heterotopia (PVNH). Differences in corpus callosum mid-sagittal area and subregional area changes were measured using an automated software-based method. Heterotopic gray matter deposits were automatically labeled and compared with corpus callosum changes. The spatial pattern of corpus callosum changes was interpreted in the context of the characteristic anterior-posterior development of the corpus callosum in healthy individuals. Individuals with periventricular nodular heterotopia were imaged at the Melbourne Brain Center or as part of the multi-site Epilepsy Phenome Genome project. Whole brain T1 weighted MRI was acquired in cases (n=48) and controls (n=663). The corpus callosum was segmented on the mid-sagittal plane using the software "yuki". Heterotopic gray matter and intracranial brain volume was measured using Freesurfer. Differences in corpus callosum area and subregional areas were assessed, as well as the relationship between corpus callosum area and heterotopic GM volume. The anterior-posterior distribution of corpus callosum changes and heterotopic GM nodules were quantified using a novel metric and compared with each other. Corpus callosum area was reduced by 14% in PVNH (p = 1.59 × 10⁻⁹). The magnitude of the effect was least in the genu (7% reduction) and greatest in the isthmus and splenium (26% reduction). Individuals with higher heterotopic GM volume had a smaller corpus callosum. Heterotopic GM volume was highest in posterior brain regions, however there was no linear relationship between the anterior-posterior position of corpus callosum changes and PVNH nodules. Reduced corpus callosum area is strongly associated with PVNH, and is probably associated with abnormal brain development in this neurological disorder. The primarily posterior corpus callosum changes may inform our understanding of the etiology of PVNH. Our results suggest that

  16. Rapid and accurate assessment of seizure liability of drugs by using an optimal support vector machine method.

    PubMed

    Zhang, Hui; Li, Wei; Xie, Yang; Wang, Wen-Jing; Li, Lin-Li; Yang, Sheng-Yong

    2011-12-01

    Drug-induced seizures are a serious adverse effect, and assessment of seizure risk usually takes place at a late stage of the drug discovery process, which does not allow sufficient time to reduce the risk by chemical modification. Early identification of chemicals with seizure liability using rapid and inexpensive approaches would therefore be preferable. In this study, an optimal support vector machine (SVM) modeling method was employed to develop a prediction model of the seizure liability of chemicals. A set of 680 compounds was used to train the SVM model. The established SVM model was then validated on an independent test set comprising 175 compounds, which gave a prediction accuracy of 86.9%. Further, the SVM-based prediction model of seizure liability was compared with various preclinical seizure assays, including the in vitro rat hippocampal brain slice, the in vivo zebrafish larvae assay, the mouse spontaneous seizure model, and the mouse EEG model. In terms of predictability, the SVM model ranked just behind the mouse EEG model, but better than the rat brain slice and zebrafish models. Nevertheless, the SVM model has considerable advantages over the preclinical seizure assays in speed and cost. In summary, the SVM-based prediction model of seizure liability established here offers potential as a cheap, rapid and accurate assessment of the seizure liability of drugs, which could be used in seizure risk assessment at the early stage of drug discovery. The prediction model is freely available online at http://www.sklb.scu.edu.cn/lab/yangsy/download/ADMET/seizure_pred.tar.
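
    A minimal sketch of the train/validate pattern described above, using scikit-learn's SVC on synthetic data; the descriptors, labels, kernel, and hyperparameters are assumptions, since the paper's actual feature set and optimized settings are not reproduced here.

```python
# Sketch: training and externally validating an SVM classifier for a
# binary liability label, loosely mirroring the 680-train / 175-test
# design above. Features and labels are synthetic placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(855, 20))                  # 680 + 175 mock compounds, 20 mock descriptors
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic "seizure liability" label

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=175, random_state=0)

model = SVC(kernel="rbf", C=1.0, gamma="scale")
model.fit(X_train, y_train)

print(f"external test accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```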

  17. Real Time Quantitative Radiological Monitoring Equipment for Environmental Assessment

    SciTech Connect

    John R. Giles; Lyle G. Roybal; Michael V. Carpenter

    2006-03-01

    The Idaho National Laboratory (INL) has developed a suite of systems that rapidly scan, analyze, and characterize radiological contamination in soil. These systems have been successfully deployed at several Department of Energy (DOE) laboratories and Cold War Legacy closure sites. Traditionally, these systems have been used during the characterization and remediation of radiologically contaminated soils and surfaces; however, subsequent to the terrorist attacks of September 11, 2001, the applications of these systems have expanded to include homeland security operations for first response, continuing assessment, and verification of cleanup activities in the event of the detonation of a radiological dispersal device. The core system components are a detector, a spectral analyzer, and a global positioning system (GPS). The system is computer controlled by menu-driven, user-friendly custom software designed for a technician-level operator. A wide variety of detectors have been used, including several configurations of sodium iodide (NaI) and high-purity germanium (HPGe) detectors, and a large-area proportional counter designed for the detection of x-rays from actinides such as Am-241 and Pu-238. Systems have been deployed from several platforms, including a small all-terrain vehicle (ATV), hand-pushed carts, a backpack-mounted unit, and an excavator-mounted unit used where personnel safety considerations are paramount. The INL has advanced this concept and expanded the system functionality to create an integrated, field-deployed analytical system through the use of tailored analysis and operations software. Customized, site-specific software is assembled from a supporting toolbox of algorithms that streamline the data acquisition, analysis, and reporting process. These algorithms include region-specific spectral stripping, automated energy calibration, background subtraction, activity calculations based on measured detector efficiencies, and on-line data quality checks.
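
    The activity calculation mentioned above reduces to a simple relation between background-subtracted count rate, detector efficiency, and gamma emission probability; the sketch below shows that arithmetic with illustrative numbers, not INL's calibration values.

```python
# Sketch: converting background-subtracted gamma counts to activity using
# a measured detector efficiency. All numeric values are illustrative.
def activity_bq(gross_counts: float, background_counts: float,
                live_time_s: float, efficiency: float,
                branching_ratio: float) -> float:
    """Activity (Bq) from net counts, counting time, detector efficiency,
    and gamma emission probability per decay."""
    net_rate = (gross_counts - background_counts) / live_time_s
    return net_rate / (efficiency * branching_ratio)

# Example: Cs-137 via its 662 keV line (branching ratio ~0.851).
print(f"{activity_bq(12500, 800, 300.0, 0.021, 0.851):.1f} Bq")
```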

  18. Towards quantitative ecological risk assessment of elevated carbon dioxide levels in the marine environment.

    PubMed

    de Vries, Pepijn; Tamis, Jacqueline E; Foekema, Edwin M; Klok, Chris; Murk, Albertinka J

    2013-08-30

    The environmental impact of elevated carbon dioxide (CO2) levels has become of greater interest in recent years, in relation to globally rising CO2 levels and the related consideration of geological CO2 storage as a mitigating measure. In the present study, effect data from the literature were collected in order to conduct a marine ecological risk assessment of elevated CO2 levels, using a Species Sensitivity Distribution (SSD). It became evident that the information currently available from the literature is mostly insufficient for such a quantitative approach. Most studies focus on effects of expected future CO2 levels, testing only one or two elevated concentrations. A full dose-response relationship, a uniform measure of exposure, and standardized test protocols are essential for conducting a proper quantitative risk assessment of elevated CO2 levels. Improvements are proposed to make future tests more valuable and usable for quantitative risk assessment.
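
    A minimal sketch of the SSD approach the authors call for: fit a distribution to per-species effect concentrations and read off a hazardous concentration such as the HC5. The log-normal choice and all toxicity values below are invented for illustration.

```python
# Sketch: fitting a species sensitivity distribution (SSD) and deriving
# the hazardous concentration for 5% of species (HC5).
import numpy as np
from scipy import stats

# Hypothetical per-species effect concentrations (e.g., EC50s) for elevated CO2.
ec50 = np.array([1.2, 2.5, 3.1, 4.8, 6.0, 7.4, 9.9, 12.1, 15.3, 20.7])

# Classic SSD: fit a log-normal distribution to species sensitivities.
shape, loc, scale = stats.lognorm.fit(ec50, floc=0)

hc5 = stats.lognorm.ppf(0.05, shape, loc=loc, scale=scale)
print(f"HC5 = {hc5:.2f} (same units as the input effect concentrations)")
```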

  19. Assessment of Scientific Literacy: Development and Validation of the Quantitative Assessment of Socio-Scientific Reasoning (QuASSR)

    ERIC Educational Resources Information Center

    Romine, William L.; Sadler, Troy D.; Kinslow, Andrew T.

    2017-01-01

    We describe the development and validation of the Quantitative Assessment of Socio-scientific Reasoning (QuASSR) in a college context. The QuASSR contains 10 polytomous, two-tiered items crossed between two scenarios, and is based on theory suggesting a four-pronged structure for SSR (complexity, perspective taking, inquiry, and skepticism). In…

  20. Assessment of extravascular lung water by quantitative ultrasound and CT in isolated bovine lung.

    PubMed

    Corradi, Francesco; Ball, Lorenzo; Brusasco, Claudia; Riccio, Anna Maria; Baroffio, Michele; Bovio, Giulio; Pelosi, Paolo; Brusasco, Vito

    2013-07-01

    Lung ultrasonography (LUS) and computed tomography (CT) were compared for quantitative assessment of extravascular lung water (EVLW) in 10 isolated bovine lung lobes. LUS and CT were obtained at different inflation pressures before and after instillation with known amounts of hypotonic saline. A video-based quantitative LUS analysis was superior to both single-frame quantitative analysis and visual scoring in the assessment of EVLW. Video-based mean LUS intensity was strongly correlated with EVLW density (r² = 0.87) but weakly correlated with mean CT attenuation (r² = 0.49) and physical density (r² = 0.49). Mean CT attenuation was weakly correlated with EVLW density (r² = 0.62) but strongly correlated with physical density (r² = 0.99). When the effect of physical density was removed by partial correlation analysis, EVLW density was significantly correlated with video-based LUS intensity (r² = 0.75) but not with mean CT attenuation (r² = 0.007). In conclusion, these findings suggest that quantitative LUS by video gray-scale analysis can assess EVLW more reliably than LUS visual scoring or quantitative CT.
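
    A minimal sketch of the partial-correlation step described above, removing the effect of physical density before correlating EVLW density with LUS intensity; the data are random placeholders, and regressing out the covariate from both variables is one standard way to compute a partial correlation.

```python
# Sketch: partial correlation of EVLW density with video-based LUS
# intensity, controlling for physical density. Data arrays are mock.
import numpy as np
from scipy import stats

def partial_corr(x, y, z):
    """Correlation of x and y after regressing out z from both."""
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return stats.pearsonr(rx, ry)

rng = np.random.default_rng(1)
density = rng.uniform(0.2, 0.8, 50)               # physical density (mock)
evlw = 2.0 * density + rng.normal(0, 0.1, 50)     # EVLW density (mock)
lus = 1.5 * evlw + rng.normal(0, 0.1, 50)         # mean LUS intensity (mock)

r, p = partial_corr(evlw, lus, density)
print(f"partial r^2 = {r**2:.2f} (p = {p:.3g})")
```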

  1. The application of intraoperative transit time flow measurement to accurately assess anastomotic quality in sequential vein grafting

    PubMed Central

    Yu, Yang; Zhang, Fan; Gao, Ming-Xin; Li, Hai-Tao; Li, Jing-Xing; Song, Wei; Huang, Xin-Sheng; Gu, Cheng-Xiong

    2013-01-01

    OBJECTIVES Intraoperative transit time flow measurement (TTFM) is widely used to assess anastomotic quality in coronary artery bypass grafting (CABG). However, in sequential vein grafting, the flow characteristics collected by the conventional TTFM method are usually associated with total graft flow and might not accurately indicate the quality of every distal anastomosis in a sequential graft. The purpose of our study was to examine a new TTFM method that could assess the quality of each distal anastomosis in a sequential graft more reliably than the conventional TTFM approach. METHODS Two TTFM methods were tested in 84 patients who underwent sequential saphenous off-pump CABG in Beijing An Zhen Hospital between April and August 2012. In the conventional TTFM method, normal blood flow in the sequential graft was maintained during the measurement, and the flow probe was placed a few centimetres above the anastomosis to be evaluated. In the new method, blood flow in the sequential graft was temporarily reduced during the measurement by placing an atraumatic bulldog clamp on the graft a few centimetres distal to the anastomosis to be evaluated, while the position of the flow probe remained the same as in the conventional method. This new TTFM method was named the flow reduction TTFM. Graft flow parameters measured by both methods were compared. RESULTS Compared with the conventional TTFM, the flow reduction TTFM resulted in significantly lower mean graft blood flow (P < 0.05) and, in contrast, a significantly higher pulsatility index (P < 0.05). Diastolic filling was not significantly different between the two methods and was >50% in both cases. Interestingly, the flow reduction TTFM identified two defective middle distal anastomoses that the conventional TTFM failed to detect. Graft flows near the defective distal anastomoses improved substantially after revision. CONCLUSIONS In this study, we found that temporary reduction of graft flow during TTFM seemed to

  2. Quantitative micro-computed tomography: a non-invasive method to assess equivalent bone mineral density.

    PubMed

    Nazarian, Ara; Snyder, Brian D; Zurakowski, David; Müller, Ralph

    2008-08-01

    One of the many applications of micro-computed tomography (microCT) is to accurately visualize and quantify cancellous bone microstructure. However, microCT-based assessment of bone mineral density has yet to be thoroughly investigated. Specifically, the effects of varying imaging parameters, such as tube voltage (kVp), current (μA), integration time (ms), object-to-X-ray-source distance (mm), projection number, detector array size and imaging media (surrounding the specimen), on the relationship between equivalent tissue density (ρEQ) and its linear attenuation coefficient (μ) have received little attention. In this study, in-house-manufactured dipotassium hydrogen phosphate (K2HPO4) liquid calibration phantoms were employed in addition to resin-embedded hydroxyapatite solid calibration phantoms supplied by Scanco Medical AG. Variations in current, integration time and projection number had no effect on the conversion relationship between μ and ρEQ for the K2HPO4 and Scanco calibration phantoms [p>0.05 for all cases]. However, as expected, variations in scanning tube voltage, object-to-X-ray-source distance, detector array size and imaging media (referring to the solution that surrounds the specimen in the imaging vial) significantly affected the conversion relationship between μ and ρEQ for the K2HPO4 and Scanco calibration phantoms [p<0.05 for all cases]. A multivariate linear regression approach was used to estimate ρEQ based on attenuation coefficient, tube voltage, object-to-X-ray-source distance, detector array size and imaging media for the K2HPO4 liquid calibration phantoms, explaining 90% of the variation in ρEQ. Furthermore, equivalent density values of bovine cortical bone samples (converted from attenuation coefficient to equivalent density using the K2HPO4 liquid calibration phantoms) correlated highly [R² = 0.92] with the ash densities of the samples. In conclusion, Scanco calibration phantoms can be used to assess equivalent
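
    A minimal sketch of the multivariate linear regression step, predicting equivalent density from the attenuation coefficient and scan settings; all data and coefficients are synthetic stand-ins, where a real calibration would use phantom measurements.

```python
# Sketch: multivariate linear regression estimating equivalent density
# from attenuation coefficient and imaging parameters. Data are mock.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
n = 200
mu = rng.uniform(0.5, 3.0, n)          # linear attenuation coefficient (mock)
kvp = rng.choice([45, 55, 70], n)      # tube voltage (mock)
distance = rng.uniform(80, 140, n)     # object-to-source distance, mm (mock)

X = np.column_stack([mu, kvp, distance])
rho_eq = 120 * mu - 1.5 * kvp + 0.3 * distance + rng.normal(0, 10, n)  # mock density

model = LinearRegression().fit(X, rho_eq)
print(f"R^2 = {model.score(X, rho_eq):.2f}, coefficients = {model.coef_}")
```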

  3. Quantitative and qualitative assessment of the bovine abortion surveillance system in France.

    PubMed

    Bronner, Anne; Gay, Emilie; Fortané, Nicolas; Palussière, Mathilde; Hendrikx, Pascal; Hénaux, Viviane; Calavas, Didier

    2015-06-01

    Bovine abortion is the main clinical sign of bovine brucellosis, a disease of which France has been declared officially free since 2005. To ensure the early detection of any brucellosis outbreak, event-driven surveillance relies on the mandatory notification of bovine abortions and the brucellosis testing of aborting cows. However, under-reporting of abortions appears frequent. Our objectives were to assess the ability of the bovine abortion surveillance system to detect each and every bovine abortion and to identify factors influencing the system's effectiveness. We evaluated five attributes defined by the U.S. Centers for Disease Control, with a method suited to each attribute: (1) data quality was studied quantitatively and qualitatively, as this factor considerably influences data analysis and results; (2) sensitivity and representativeness were estimated using a unilist capture-recapture approach to quantify the surveillance system's effectiveness; (3) acceptability and simplicity were studied through qualitative interviews of actors in the field, given that the surveillance system relies heavily on abortion notifications by farmers and veterinarians. Our analysis showed that (1) data quality was generally satisfactory, even though some errors might be due to actors' lack of awareness of the need to collect accurate data; (2) from 2006 to 2011, the mean annual sensitivity - i.e. the proportion of farmers who reported at least one abortion out of all those who detected such events - was around 34%, but was significantly higher in dairy than in beef cattle herds (highlighting a lack of representativeness); (3) overall, the system's low sensitivity was related to its low acceptability and lack of simplicity. This study showed that, in contrast to policy-makers, most farmers and veterinarians perceived the risk of a brucellosis outbreak as negligible. They did not consider sporadic abortions to be suspected cases of brucellosis and usually reported abortions only to
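
    For illustration, a unilist capture-recapture sensitivity estimate can be obtained with the Chao lower-bound estimator on notification frequency counts; the counts below are invented, and the paper's exact estimator may differ.

```python
# Sketch: unilist capture-recapture via the Chao lower-bound estimator,
# estimating the total number of herds with detectable abortions from
# how many herds notified exactly once or exactly twice. Counts are mock.
f1, f2 = 420, 130      # herds with exactly 1 and exactly 2 notifications (mock)
n_observed = 700       # herds with at least one notification (mock)

n_total = n_observed + f1**2 / (2 * f2)   # Chao lower bound on all detecting herds
sensitivity = n_observed / n_total
print(f"estimated surveillance sensitivity ~ {sensitivity:.0%}")
```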

  4. Genetic toxicology at the crossroads-from qualitative hazard evaluation to quantitative risk assessment.

    PubMed

    White, Paul A; Johnson, George E

    2016-05-01

    Applied genetic toxicology is undergoing a transition from qualitative hazard identification to quantitative dose-response analysis and risk assessment. To facilitate this change, the Health and Environmental Sciences Institute (HESI) Genetic Toxicology Technical Committee (GTTC) sponsored a workshop held in Lancaster, UK on July 10-11, 2014. The event included invited speakers from several institutions, and the content was divided into three themes: (1) Point-of-departure Metrics for Quantitative Dose-Response Analysis in Genetic Toxicology; (2) Measurement and Estimation of Exposures for Better Extrapolation to Humans; and (3) The Use of Quantitative Approaches in Genetic Toxicology for human health risk assessment (HHRA). A host of pertinent issues were discussed relating to the use of in vitro and in vivo dose-response data, the development of methods for in vitro to in vivo extrapolation, and approaches to using in vivo dose-response data to determine human exposure limits for regulatory evaluations and decision-making. This Special Issue, which was inspired by the workshop, contains a series of papers that collectively address topics related to the aforementioned themes. The Issue includes contributions that collectively evaluate, describe and discuss in silico, in vitro, in vivo and statistical approaches that are facilitating the shift from qualitative hazard evaluation to quantitative risk assessment. The use and application of the benchmark dose approach was a central theme in many of the workshop presentations and discussions, and the Special Issue includes several contributions that outline novel applications for the analysis and interpretation of genetic toxicity data. Although the contents of the Special Issue constitute an important step towards the adoption of quantitative methods for regulatory assessment of genetic toxicity, formal acceptance of quantitative methods for HHRA and regulatory decision-making will require consensus regarding the

  5. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical garden.…

  6. A quantitative microbial risk assessment for center pivot irrigation of dairy wastewaters

    Technology Transfer Automated Retrieval System (TEKTRAN)

    In the western United States where livestock wastewaters are commonly land applied, there are concerns over individuals being exposed to airborne pathogens. In response, a quantitative microbial risk assessment (QMRA) was performed to estimate infectious risks from inhaling pathogens aerosolized dur...

  7. DOSIMETRY MODELING OF INHALED FORMALDEHYDE: BINNING NASAL FLUX PREDICTIONS FOR QUANTITATIVE RISK ASSESSMENT

    EPA Science Inventory

    Dosimetry Modeling of Inhaled Formaldehyde: Binning Nasal Flux Predictions for Quantitative Risk Assessment. Kimbell, J.S., Overton, J.H., Subramaniam, R.P., Schlosser, P.M., Morgan, K.T., Conolly, R.B., and Miller, F.J. (2001). Toxicol. Sci. 000, 000:000.

    Interspecies e...

  8. Quantitative Approach to Collaborative Learning: Performance Prediction, Individual Assessment, and Group Composition

    ERIC Educational Resources Information Center

    Cen, Ling; Ruta, Dymitr; Powell, Leigh; Hirsch, Benjamin; Ng, Jason

    2016-01-01

    The benefits of collaborative learning, although widely reported, lack the quantitative rigor and detailed insight into the dynamics of interactions within the group, while individual contributions and their impacts on group members and their collaborative work remain hidden behind joint group assessment. To bridge this gap we intend to address…

  9. QUANTITATIVE ASSESSMENT OF CORAL DISEASES IN THE FLORIDA KEYS: STRATEGY AND METHODOLOGY

    EPA Science Inventory

    Most studies of coral disease have focused on the incidence of a single disease within a single location. Our overall objective is to use quantitative assessments to characterize annual patterns in the distribution and frequency of scleractinian and gorgonian coral diseases over ...

  10. Can Community Health Workers Report Accurately on Births and Deaths? Results of Field Assessments in Ethiopia, Malawi and Mali

    PubMed Central

    Silva, Romesh; Amouzou, Agbessi; Munos, Melinda; Marsh, Andrew; Hazel, Elizabeth; Victora, Cesar; Black, Robert; Bryce, Jennifer

    2016-01-01

    Introduction: Most low-income countries lack complete and accurate vital registration systems. As a result, measures of under-five mortality rates rely mostly on household surveys. In collaboration with partners in Ethiopia, Ghana, Malawi, and Mali, we assessed the completeness and accuracy of reporting of births and deaths by community-based health workers, and the accuracy of annualized under-five mortality rate estimates derived from these data. Here we report on results from Ethiopia, Malawi and Mali. Method: In all three countries, community health workers (CHWs) were trained, equipped and supported to report pregnancies, births and deaths within defined geographic areas over a period of at least fifteen months. In-country institutions collected these data every month. At each study site, we administered a full birth history (FBH) or full pregnancy history (FPH) to women of reproductive age, via a census of households in Mali and via household surveys in Ethiopia and Malawi. Using these FBHs/FPHs as a validation data source, we assessed the completeness of the counts of births and deaths and the accuracy of under-five, infant, and neonatal mortality rates from the community-based method against the retrospective FBH/FPH for rolling twelve-month periods. For each method we calculated total cost, average annual cost per 1,000 population, and average cost per vital event reported. Results: On average, CHWs submitted monthly vital event reports for over 95 percent of catchment areas in Ethiopia and Malawi, and for 100 percent of catchment areas in Mali. The completeness of vital events reporting by CHWs varied: we estimated that 30%-90% of annualized expected births (i.e. the number of births estimated using a FPH) were documented by CHWs, and 22%-91% of annualized expected under-five deaths were documented by CHWs. Resulting annualized under-five mortality rates based on the CHW vital events reporting were, on average, under-estimated by 28% in Ethiopia, 32% in

  11. Sewage sludge toxicity assessment using earthworm Eisenia fetida: can biochemical and histopathological analysis provide fast and accurate insight?

    PubMed

    Babić, S; Barišić, J; Malev, O; Klobučar, G; Popović, N Topić; Strunjak-Perović, I; Krasnići, N; Čož-Rakovac, R; Klobučar, R Sauerborn

    2016-06-01

    Sewage sludge (SS) is a complex organic by-product of wastewater treatment plants. Deposition of large amounts of SS can increase the risk of soil contamination. Therefore, there is an increasing need for fast and accurate assessment of the toxic potential of SS. Toxic effects of SS were tested on earthworm Eisenia fetida tissue at the subcellular and biochemical level. Earthworms were exposed to depot sludge (DS) at concentration ratios of 30% or 70%, and to undiluted, 100-fold diluted and 10-fold diluted active sludge (AS). Exposure to DS lasted 24/48 h (acute exposure), 96 h (semi-acute exposure) and 7/14/28 days (sub-chronic exposure); exposure to AS lasted 48 h. Toxic effects were tested by measurements of multixenobiotic resistance mechanism (MXR) activity and lipid peroxidation levels, as well as by observation of morphological alterations and behavioural changes. Biochemical markers confirmed the presence of MXR inhibitors in the tested AS and DS and highlighted the presence of SS-induced oxidative stress. MXR inhibition and thiobarbituric acid reactive substance (TBARS) concentrations in the whole earthworm body were higher after exposure to the lower concentration of DS. Furthermore, histopathological changes revealed damage to earthworm body wall tissue layers as well as to the epithelial and chloragogen cells in the typhlosole region. These changes were proportional to SS concentration in the tested soils and to exposure duration. The results obtained may contribute to the understanding of SS-induced toxic effects on terrestrial invertebrates exposed through soil contact and help identify the defence mechanisms of earthworms.

  12. Study Protocol - Accurate assessment of kidney function in Indigenous Australians: aims and methods of the eGFR Study

    PubMed Central

    2010-01-01

    Background: There is an overwhelming burden of cardiovascular disease, type 2 diabetes and chronic kidney disease among Indigenous Australians. In this high-risk population, it is vital that we are able to measure kidney function accurately. Glomerular filtration rate is the best overall marker of kidney function. However, differences in body build and body composition between Indigenous and non-Indigenous Australians suggest that creatinine-based estimates of glomerular filtration rate derived for European populations may not be appropriate for Indigenous Australians. The burden of kidney disease is borne disproportionately by Indigenous Australians in central and northern Australia, and there is significant heterogeneity in body build and composition within and amongst these groups. This heterogeneity might differentially affect the accuracy of estimation of glomerular filtration rate between different Indigenous groups. By assessing kidney function in Indigenous Australians from Northern Queensland, Northern Territory and Western Australia, we aim to determine a validated and practical measure of glomerular filtration rate suitable for use in all Indigenous Australians. Methods/Design: A cross-sectional study of Indigenous Australian adults (target n = 600, 50% male) across 4 sites: Top End, Northern Territory; Central Australia; Far North Queensland and Western Australia. The reference measure of glomerular filtration rate was the plasma disappearance rate of iohexol over 4 hours. We will compare the accuracy of the following glomerular filtration rate measures with the reference measure: the Modification of Diet in Renal Disease 4-variable formula, the Chronic Kidney Disease Epidemiology Collaboration equation, the Cockcroft-Gault formula and cystatin C-derived estimates. Detailed assessment of body build and composition was performed using anthropometric measurements, skinfold thicknesses, bioelectrical impedance and a sub-study used dual-energy X-ray absorptiometry. A
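
    Two of the creatinine-based estimates being compared have widely published closed forms; the sketch below implements the Cockcroft-Gault and MDRD 4-variable equations (standard unit conventions noted in comments), purely as a reference for what the study is validating.

```python
# Sketch: two creatinine-based kidney function estimates, as published.
def cockcroft_gault(age_y: float, weight_kg: float,
                    scr_mg_dl: float, female: bool) -> float:
    """Creatinine clearance (mL/min)."""
    crcl = (140 - age_y) * weight_kg / (72 * scr_mg_dl)
    return crcl * (0.85 if female else 1.0)

def mdrd4(age_y: float, scr_mg_dl: float, female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2), MDRD 4-variable equation."""
    egfr = 175 * scr_mg_dl**-1.154 * age_y**-0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

print(f"CG:   {cockcroft_gault(50, 70, 1.1, female=False):.1f} mL/min")
print(f"MDRD: {mdrd4(50, 1.1, female=False, black=False):.1f} mL/min/1.73m^2")
```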

  13. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses.

    PubMed

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning.

  14. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161

  15. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in design, analysis method, and the metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined.
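
    One of the standard repeatability metrics in this setting is the repeatability coefficient (RC), the value below which the difference between two repeated measurements is expected to fall with 95% probability; a minimal sketch from paired test-retest data is shown below, with invented measurements.

```python
# Sketch: repeatability coefficient (RC) from paired test-retest data,
# assuming no systematic bias between the two measurements. Data are mock.
import numpy as np

test = np.array([10.2, 8.9, 12.1, 9.5, 11.0])    # measurement 1 (mock)
retest = np.array([10.6, 8.4, 12.5, 9.1, 11.3])  # measurement 2 (mock)

diff = retest - test
wsd = np.sqrt(np.mean(diff**2) / 2)   # within-subject SD from paired replicates
rc = 1.96 * np.sqrt(2) * wsd          # 95% repeatability coefficient
print(f"within-subject SD = {wsd:.2f}, RC = {rc:.2f}")
```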

  16. A novel tolerance range approach for the quantitative assessment of ecosystems.

    PubMed

    Hearnshaw, Edward J S; Hughey, Kenneth F D

    2012-03-15

    This paper develops a novel tolerance range approach that allows for the quantitative assessment of ecosystems with only a minimal amount of information. The quantitative assessment is achieved through the determination of tolerance range scores and indices that indicate the vulnerability of species. To demonstrate the tolerance range approach, an ecosystem assessment is performed on Te Waihora/Lake Ellesmere, a large shallow lake in the Canterbury region of New Zealand. The analysis of tolerance range scores and indices showed that brown trout and lake-margin vegetation are the valued species most vulnerable to further degradation. This implies that management actions should prioritize preserving these species in order to maintain all valued species along sustainable pathways.

  17. Comparison study on qualitative and quantitative risk assessment methods for urban natural gas pipeline network.

    PubMed

    Han, Z Y; Weng, W G

    2011-05-15

    In this paper, a qualitative and a quantitative risk assessment method for urban natural gas pipeline networks are proposed. The qualitative method comprises an index system that includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequence analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value; for the quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline networks, and the quantitative method takes different accident consequences into consideration, such as toxic gas diffusion, jet flame, fireball combustion and unconfined vapour cloud explosion (UVCE). Two sample urban natural gas pipeline networks are used to demonstrate the two methods. Both methods can be applied in practice; the choice between them depends on the basic data available for the gas pipelines and the precision requirements of the risk assessment.

  18. Photon-tissue interaction model for quantitative assessment of biological tissues

    NASA Astrophysics Data System (ADS)

    Lee, Seung Yup; Lloyd, William R.; Wilson, Robert H.; Chandra, Malavika; McKenna, Barbara; Simeone, Diane; Scheiman, James; Mycek, Mary-Ann

    2014-02-01

    In this study, we describe a direct-fit photon-tissue interaction model to quantitatively analyze reflectance spectra of biological tissue samples. The model rapidly extracts biologically relevant parameters associated with tissue optical scattering and absorption. This model was employed to analyze reflectance spectra acquired from freshly excised human pancreatic pre-cancerous tissues (intraductal papillary mucinous neoplasm (IPMN), a common precursor lesion to pancreatic cancer). Compared with previously reported models, the direct-fit model improved fit accuracy and speed. Thus, these results suggest that such models could serve as real-time, quantitative tools to characterize biological tissues assessed with reflectance spectroscopy.
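
    A minimal sketch of a direct spectral fit with scipy.optimize.curve_fit; the toy model form below is a stand-in for illustration only, not the authors' photon-tissue interaction model.

```python
# Sketch: extracting scattering/absorption-like parameters by direct
# least-squares fitting of a reflectance spectrum. Model and data are toy.
import numpy as np
from scipy.optimize import curve_fit

def reflectance(wl_nm, a, b, c):
    # toy form: power-law scattering attenuated by an absorption-like term
    return a * (wl_nm / 600.0) ** (-b) * np.exp(-c / wl_nm)

wl = np.linspace(450, 700, 120)
true = reflectance(wl, 1.0, 1.2, 80.0)
measured = true + np.random.default_rng(3).normal(0, 0.005, wl.size)

params, _ = curve_fit(reflectance, wl, measured, p0=[1.0, 1.0, 50.0])
print(f"fitted (a, b, c) = {np.round(params, 3)}")
```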

  19. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
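
    A minimal sketch of the NSR figure of merit: under the linear measurement model assumed by such no-gold-standard techniques, measured = a * true + b + noise, and NSR = sigma/a ranks methods by precision. The (slope, noise) values below are treated as already estimated and are invented.

```python
# Sketch: ranking quantitative imaging methods by noise-to-slope ratio
# (NSR) once the linear-model parameters have been estimated. Values mock.
estimates = {
    "method_A": {"slope": 0.95, "noise_sd": 0.10},
    "method_B": {"slope": 1.10, "noise_sd": 0.25},
    "method_C": {"slope": 0.80, "noise_sd": 0.12},
}

nsr = {m: p["noise_sd"] / p["slope"] for m, p in estimates.items()}
for method, value in sorted(nsr.items(), key=lambda kv: kv[1]):
    print(f"{method}: NSR = {value:.3f}")   # smaller NSR = more precise
```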

  20. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    NASA Astrophysics Data System (ADS)

    Jha, Abhinav K.; Caffo, Brian; Frey, Eric C.

    2016-04-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  1. A quantitative microbial risk assessment for meatborne Toxoplasma gondii infection in The Netherlands.

    PubMed

    Opsteegh, Marieke; Prickaerts, Saskia; Frankena, Klaas; Evers, Eric G

    2011-11-01

    Toxoplasma gondii is an important foodborne pathogen, and the cause of a high disease burden due to congenital toxoplasmosis in The Netherlands. The aim of this study was to quantify the relative contribution of sheep, beef and pork products to human T. gondii infections by Quantitative Microbial Risk Assessment (QMRA). Bradyzoite concentration and portion size data were used to estimate the bradyzoite number in infected unprocessed portions for human consumption. Reduction factors for salting, freezing and heating, estimated from published experiments in mice, were subsequently used to estimate the bradyzoite number in processed portions. A dose-response relation for T. gondii infection in mice was used to estimate the human probability of infection due to consumption of these originally infected processed portions. By multiplying these probabilities by the prevalence of T. gondii per livestock species and the number of portions consumed per year, the number of infections per year was calculated for the susceptible Dutch population and the subpopulation of susceptible pregnant women. QMRA results predict high numbers of infections per year, with beef as the most important source. Although many uncertainties were present in the data and the number of congenital infections predicted by the model was almost twenty times higher than the number estimated based on the incidence in newborns, the usefulness of the advice to thoroughly heat meat is confirmed by our results. Forty percent of all predicted infections are due to the consumption of unheated meat products, and sensitivity analysis indicates that heating temperature has the strongest influence on the predicted number of infections. The results also demonstrate that, even with a low prevalence of infection in cattle, consumption of beef remains an important source of infection. Developing this QMRA model has helped identify important gaps of knowledge and resulted in the following recommendations for
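
    A minimal sketch of the QMRA chain for a single product type, using an exponential dose-response form P = 1 - exp(-r * dose); every parameter value below is an illustrative placeholder, not a figure from the study.

```python
# Sketch: dose after processing -> probability of infection -> expected
# infections per year, for one meat product. All parameters are mock.
import numpy as np

r = 0.005                     # mock dose-response parameter from mouse data
bradyzoites_raw = 2000        # mock bradyzoites in an infected unprocessed portion
reduction_freezing = 0.01     # mock surviving fraction after freezing
prevalence = 0.02             # mock fraction of portions infected
portions_per_year = 1.5e9     # mock national consumption

dose = bradyzoites_raw * reduction_freezing
p_infection = 1 - np.exp(-r * dose)
infections = p_infection * prevalence * portions_per_year
print(f"P(infection | infected portion) = {p_infection:.3f}, "
      f"expected infections/year = {infections:,.0f}")
```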

  2. Quantitative Risk Assessment of CO2 Sequestration in a Commercial-Scale EOR Site

    NASA Astrophysics Data System (ADS)

    Pan, F.; McPherson, B. J. O. L.; Dai, Z.; Jia, W.; Lee, S. Y.; Ampomah, W.; Viswanathan, H. S.

    2015-12-01

    Enhanced Oil Recovery with CO2 (CO2-EOR) is perhaps the most feasible option for geologic CO2 sequestration (GCS), if only because of existing infrastructure and the economic opportunities of associated oil production. Probably the most significant source of uncertainty in CO2 storage forecasts is the heterogeneity of reservoir properties. Quantification of storage forecast uncertainty is critical for accurate assessment of the risks associated with GCS in EOR fields. This study employs a response surface methodology (RSM) to quantify the uncertainties of CO2 storage associated with oil production in an active CO2-EOR field. Specifically, the Morrow formation, a clastic reservoir within the Farnsworth EOR Unit (FWU) in Texas, was selected as a case study. The four uncertain parameters (i.e., independent variables) are reservoir permeability, anisotropy ratio of permeability, water-alternating-gas (WAG) time ratio, and initial oil saturation. Cumulative oil production and net CO2 injection are the output dependent variables. A 3-D FWU reservoir model, including a representative 5-spot well pattern, was constructed for CO2-oil-water multiphase flow analysis. A total of 25 permutations of 3-D reservoir simulations were executed using the Eclipse simulator. After stepwise regression analysis, a series of response surface models of the output variables were constructed at each step and verified using appropriate goodness-of-fit measures. The R² values are larger than 0.9 and the NRMSE values are less than 5% between the simulated and predicted oil production and net CO2 injection, suggesting that the response surface (or proxy) models are sufficient for predicting CO2-EOR system behavior in the FWU case. Given the ranges of uncertainty in the independent variables, the cumulative distribution functions (CDFs) of the dependent variables were estimated using the proxy models. The predicted cumulative oil production and net CO2 injection at the 95th percentile after 5 years are about 3.65 times, and 1
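
    A minimal sketch of the proxy-model idea: fit a quadratic response surface to a small set of simulator runs, then sample it heavily to build output CDFs. The inputs, response, and coefficients below are random placeholders standing in for the 25 reservoir simulations.

```python
# Sketch: quadratic response-surface (proxy) model fitted to a handful of
# expensive simulation runs, then sampled cheaply for uncertainty analysis.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
X = rng.uniform(0, 1, size=(25, 4))   # permeability, anisotropy, WAG ratio, So_init (scaled)
y = 3 * X[:, 0] + 2 * X[:, 2] ** 2 + rng.normal(0, 0.05, 25)  # mock net CO2 injection

proxy = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
proxy.fit(X, y)
print(f"proxy R^2 on training runs = {proxy.score(X, y):.3f}")

# The cheap proxy can then be sampled heavily to build output CDFs.
samples = rng.uniform(0, 1, size=(100_000, 4))
predictions = proxy.predict(samples)
print(f"95th percentile of predicted output = {np.percentile(predictions, 95):.2f}")
```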

  3. Assessing exposure to allied ground troops in the Vietnam War: a quantitative evaluation of the Stellman Exposure Opportunity Index model.

    PubMed

    Ginevan, Michael E; Watkins, Deborah K; Ross, John H; O'Boyle, Randy A

    2009-06-01

    The Exposure Opportunity Index (EOI) is a proximity-based model developed to estimate the relative exposure of ground troops in Vietnam to aerially applied herbicides. We conducted a detailed quantitative evaluation of the EOI model using actual herbicide spray missions isolated in time and space. EOI scores were calculated for each of 36 hypothetical receptor location points associated with each of 30 herbicide missions, for two time periods: the day of herbicide application and days 2-3 post-application. Our analysis found an enormous range of EOI predictions, with 500-1000-fold differences across missions directly under the flight path. This quantitative examination of the EOI suggests that extensive testing of the model's code is warranted. Researchers undertaking development of a proximity-based exposure model for epidemiologic studies of either Vietnam veterans or the Vietnamese population should conduct a thorough and realistic analysis of how precise and accurate the model results are likely to be, and then assess whether the model results provide a useful basis for their planned epidemiologic studies.

  4. Food Consumption and Handling Survey for Quantitative Microbiological Consumer Phase Risk Assessments.

    PubMed

    Chardon, Jurgen; Swart, Arno

    2016-07-01

    In the consumer phase of a typical quantitative microbiological risk assessment (QMRA), mathematical equations identify data gaps. To acquire useful data, we designed a food consumption and food handling survey (2,226 respondents) for QMRA applications, aimed especially at obtaining quantitative data. For a broad spectrum of food products, the survey covered the following topics: processing status at retail, consumer storage, preparation, and consumption. Questions were designed to facilitate distribution fitting. In the statistical analysis, special attention was given to selecting the most adequate distribution to describe the data. Bootstrap procedures were used to describe uncertainty. The final result is a coherent quantitative consumer-phase food survey and parameter estimates for food handling and consumption practices in The Netherlands, including variation over individuals and uncertainty estimates.
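
    A minimal sketch of the distribution-fitting-plus-bootstrap workflow described above, with invented storage-time answers and a log-normal candidate distribution standing in for whatever family best describes the real survey data.

```python
# Sketch: fit a distribution to survey responses, then bootstrap the fit
# to express parameter uncertainty separately from individual variation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
storage_days = rng.lognormal(mean=1.0, sigma=0.6, size=300)  # mock survey answers

shape, loc, scale = stats.lognorm.fit(storage_days, floc=0)
print(f"point fit: sigma={shape:.2f}, median={scale:.2f} days")

# Nonparametric bootstrap for parameter uncertainty.
boot_medians = []
for _ in range(1000):
    resample = rng.choice(storage_days, size=storage_days.size, replace=True)
    _, _, sc = stats.lognorm.fit(resample, floc=0)
    boot_medians.append(sc)
lo, hi = np.percentile(boot_medians, [2.5, 97.5])
print(f"95% CI for the median storage time: [{lo:.2f}, {hi:.2f}] days")
```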

  5. An assessment of software solutions for the analysis of mass spectrometry based quantitative proteomics data.

    PubMed

    Mueller, Lukas N; Brusniak, Mi-Youn; Mani, D R; Aebersold, Ruedi

    2008-01-01

    Over the past decade, a series of experimental strategies for mass spectrometry based quantitative proteomics and corresponding computational methodology for the processing of the resulting data have been generated. We provide here an overview of the main quantification principles and available software solutions for the analysis of data generated by liquid chromatography coupled to mass spectrometry (LC-MS). Three conceptually different methods to perform quantitative LC-MS experiments have been introduced. In the first, quantification is achieved by spectral counting, in the second via differential stable isotopic labeling, and in the third by using the ion current in label-free LC-MS measurements. We discuss here advantages and challenges of each quantification approach and assess available software solutions with respect to their instrument compatibility and processing functionality. This review therefore serves as a starting point for researchers to choose an appropriate software solution for quantitative proteomic experiments based on their experimental and analytical requirements.
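
    As a small illustration of the first principle, spectral counting, the sketch below computes normalized spectral abundance factors (NSAF), one common length-corrected spectral-count measure; counts and protein lengths are invented.

```python
# Sketch: spectral counting with NSAF (normalized spectral abundance
# factor): SAF = counts / length, then normalize SAFs to sum to 1.
proteins = {             # protein: (spectral counts, sequence length)
    "P1": (120, 450),
    "P2": (30, 210),
    "P3": (75, 820),
}

saf = {p: counts / length for p, (counts, length) in proteins.items()}
total = sum(saf.values())
nsaf = {p: v / total for p, v in saf.items()}
for p, v in nsaf.items():
    print(f"{p}: NSAF = {v:.3f}")
```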

  6. A quantitative approach to the intraoperative echocardiographic assessment of the mitral valve for repair.

    PubMed

    Mahmood, Feroze; Matyal, Robina

    2015-07-01

    Intraoperative echocardiography of the mitral valve has evolved from a qualitative assessment of flow-dependent variables to quantitative geometric analyses before and after repair. In addition, 3-dimensional echocardiographic data now allow for a precise assessment of mitral valve apparatus. Complex structures, such as the mitral annulus, can be interrogated comprehensively without geometric assumptions. Quantitative analyses of mitral valve apparatus are particularly valuable for identifying indices of left ventricular and mitral remodeling to establish the chronicity and severity of mitral regurgitation. This can help identify patients who may be unsuitable candidates for repair as the result of irreversible remodeling of the mitral valve apparatus. Principles of geometric analyses also have been extended to the assessment of repaired mitral valves. Changes in mitral annular shape and size determine the stress exerted on the mitral leaflets and, therefore, the durability of repair. Given this context, echocardiographers may be expected to diagnose and quantify valvular dysfunction, assess suitability for repair, assist in annuloplasty ring sizing, and determine the success and failure of the repair procedure. As a result, anesthesiologists have progressed from being mere service providers to participants in the decision-making process. It is therefore prudent for them to acquaint themselves with the principles of intraoperative quantitative mitral valve analysis to assist in rational and objective decision making.

  7. Image coregistration: quantitative processing framework for the assessment of brain lesions.

    PubMed

    Huhdanpaa, Hannu; Hwang, Darryl H; Gasparian, Gregory G; Booker, Michael T; Cen, Yong; Lerner, Alexander; Boyko, Orest B; Go, John L; Kim, Paul E; Rajamohan, Anandh; Law, Meng; Shiroishi, Mark S

    2014-06-01

    The quantitative, multiparametric assessment of brain lesions requires coregistering different parameters derived from MRI sequences, followed by analysis of the voxel values within the ROI across the sequences and calculated parametric maps, and derivation of multiparametric models to classify imaging data. There is a need for an intuitive, automated quantitative processing framework that is generalized and adaptable to different clinical and research questions. As such flexible frameworks have not been previously described, we proceeded to construct a quantitative post-processing framework from commonly available software components. Matlab was chosen as the programming/integration environment, and SPM was chosen as the coregistration component. Matlab routines were created to extract and concatenate the coregistration transforms, take the coregistered MRI sequences as inputs to the process, allow specification of the ROI, and store the voxel values in a database for statistical analysis. The functionality of the framework was validated using brain tumor MRI cases. The implementation of this quantitative post-processing framework enables intuitive creation of multiple parameters for each voxel, facilitating near real-time, in-depth voxel-wise analysis. In our initial empirical evaluation, the framework increased both the usage of analyses requiring post-processing and the number of simultaneous research activities by clinicians and researchers with non-technical backgrounds. We show that common software components can be used to implement an intuitive, real-time quantitative post-processing framework, resulting in improved scalability and increased adoption of the post-processing needed to answer important diagnostic questions.
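
    A rough Python analogue of the framework's core step (the original implementation is Matlab/SPM): once sequences share a common grid, multiparametric analysis reduces to stacking per-voxel values inside an ROI mask. The file names below are placeholders.

```python
# Sketch: extracting per-voxel multiparametric features from coregistered
# volumes, using nibabel for I/O. File names are hypothetical.
import numpy as np
import nibabel as nib

t1 = nib.load("t1_coreg.nii.gz").get_fdata()
adc = nib.load("adc_coreg.nii.gz").get_fdata()
mask = nib.load("roi_mask.nii.gz").get_fdata() > 0

# One row per ROI voxel, one column per parameter: ready for statistics.
features = np.column_stack([t1[mask], adc[mask]])
print(features.shape)
```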

  8. [Application of uncertainty assessment in NIR quantitative analysis of traditional Chinese medicine].

    PubMed

    Xue, Zhong; Xu, Bing; Liu, Qian; Shi, Xin-Yuan; Li, Jian-Yu; Wu, Zhi-Sheng; Qiao, Yan-Jiang

    2014-10-01

    The near infrared (NIR) spectra of Liuyi San samples were collected during the mixing process, and quantitative models were generated by the partial least squares (PLS) method for quantification of the concentration of glycyrrhizin. The PLS quantitative model had good calibration and prediction performance (r_cal = 0.9985, RMSEC = 0.044 mg·g⁻¹; r_val = 0.9474, RMSEP = 0.124 mg·g⁻¹), indicating that NIR spectroscopy can be used for rapid determination of the concentration of glycyrrhizin in Liuyi San powder. After the validation tests were designed, the Liao-Lin-Iyer approach based on Monte Carlo simulation was used to estimate β-content, γ-confidence tolerance intervals. The uncertainty was then calculated, and the uncertainty profile was drawn. The NIR analytical method was considered valid when the concentration of glycyrrhizin is above 1.56 mg·g⁻¹, since the uncertainty falls within the acceptable limits (λ = ±20%). The results showed that uncertainty assessment can be used in NIR quantitative models of glycyrrhizin at different concentrations, and they provide a reference for uncertainty assessment in NIR quantitative analysis of other traditional Chinese medicines.
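
    A minimal sketch of a PLS calibration with RMSEC/RMSEP computed on calibration and validation splits, using scikit-learn on synthetic spectra; the component count and all data are assumptions, not the study's settings.

```python
# Sketch: PLS regression of an analyte concentration on NIR-like spectra,
# reporting calibration and prediction RMSE. Spectra are synthetic.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
spectra = rng.normal(size=(120, 400))  # mock absorbances at 400 wavelengths
conc = spectra[:, 50] * 0.8 + spectra[:, 200] * 0.5 + rng.normal(0, 0.05, 120)

X_cal, X_val, y_cal, y_val = train_test_split(spectra, conc, test_size=0.3,
                                              random_state=0)
pls = PLSRegression(n_components=5).fit(X_cal, y_cal)

rmsec = np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2))
rmsep = np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2))
print(f"RMSEC = {rmsec:.3f}, RMSEP = {rmsep:.3f}")
```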

  9. Differential label-free quantitative proteomic analysis of Shewanella oneidensis cultured under aerobic and suboxic conditions by accurate mass and time tag approach.

    PubMed

    Fang, Ruihua; Elias, Dwayne A; Monroe, Matthew E; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D; Callister, Stephen J; Moore, Ronald J; Gorby, Yuri A; Adkins, Joshua N; Fredrickson, Jim K; Lipton, Mary S; Smith, Richard D

    2006-04-01

    We describe the application of LC-MS without the use of stable isotope labeling for differential quantitative proteomic analysis of whole cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and suboxic conditions. LC-MS/MS was used to initially identify peptide sequences, and LC-FTICR was used to confirm these identifications as well as measure relative peptide abundances. 2343 peptides covering 668 proteins were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly using statistical approaches such as statistical analysis of microarrays, whereas another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis was transitioned from aerobic to suboxic conditions.

  10. Differential Label-free Quantitative Proteomic Analysis of Shewanella oneidensis Cultured under Aerobic and Suboxic Conditions by Accurate Mass and Time Tag Approach

    SciTech Connect

    Fang, Ruihua; Elias, Dwayne A.; Monroe, Matthew E.; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D.; Callister, Stephen J.; Moore, Ronald J.; Gorby, Yuri A.; Adkins, Joshua N.; Fredrickson, Jim K.; Lipton, Mary S.; Smith, Richard D.

    2006-04-01

    We describe the application of liquid chromatography coupled to mass spectrometry (LC/MS) without the use of stable isotope labeling for differential quantitative proteomics analysis of whole cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and sub-oxic conditions. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) was used to initially identify peptide sequences, and LC coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR) was used to confirm these identifications, as well as measure relative peptide abundances. 2343 peptides, covering 668 proteins were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly using statistical approaches such as SAM, while another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis is transitioned from aerobic to sub-oxic conditions.

  11. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    PubMed

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and where possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis indicates to which input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist for applying sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used (an ANOVA-like model and Sobol sensitivity indices) to obtain and compare the impact of the variability and of the uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis due to consumption of deli meats.
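
    A minimal sketch of first-order Sobol index estimation with the pick-and-freeze (Saltelli) scheme on a toy model; the model and input distributions are invented, and the paper's ANOVA-like companion method is not reproduced here.

```python
# Sketch: first-order Sobol indices via the pick-and-freeze estimator
# S_i = E[f(B) * (f(AB_i) - f(A))] / Var(f), for a toy risk model.
import numpy as np

def model(x):
    # mock risk model: output depends strongly on x0, weakly on x1*x2
    return np.exp(x[:, 0]) + 0.3 * x[:, 1] * x[:, 2]

rng = np.random.default_rng(7)
n, d = 100_000, 3
A = rng.uniform(0, 1, (n, d))
B = rng.uniform(0, 1, (n, d))

fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]          # freeze input i at B's values
    s1 = np.mean(fB * (model(ABi) - fA)) / var
    print(f"S{i + 1} = {s1:.3f}")
```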

  12. Purity assessment problem in quantitative NMR--impurity resonance overlaps with monitor signal multiplets from stereoisomers.

    PubMed

    Malz, Frank; Jancke, Harald

    2006-06-01

    This paper describes the situation that can emerge when the signals to be evaluated in quantitative NMR measurements (so-called "monitor signals") consist of several resonance lines from the stereoisomers of the analyte, with an impurity signal underneath. The monitor signal problem is demonstrated in the purity assessment of two samples of 2-(isopropylamino)-4-(ethylamino)-6-chloro-1,3,5-triazine (atrazine), a common herbicide that served as the analyte in a CCQM intercomparison. It is shown that, in DMSO-d6 solution, a mixture of stereoisomers leads to several individual overlapping singlets, which are further split by spin-spin coupling. A measurement protocol was developed for finding and identifying an impurity whose signal lies precisely beneath the methyl signal chosen as the monitor signal in one of the samples. Quantitative NMR purity assessment is still possible in this special case, but with higher uncertainty.

  13. Quantitative assessment of brain microvascular and tissue oxygenation during cardiac arrest and resuscitation in pigs.

    PubMed

    Yu, J; Ramadeen, A; Tsui, A K Y; Hu, X; Zou, L; Wilson, D F; Esipova, T V; Vinogradov, S A; Leong-Poi, H; Zamiri, N; Mazer, C D; Dorian, P; Hare, G M T

    2013-07-01

    Cardiac arrest is associated with a very high rate of mortality, in part due to inadequate tissue perfusion during attempts at resuscitation. Parameters such as mean arterial pressure and end-tidal carbon dioxide may not accurately reflect adequacy of tissue perfusion during cardiac resuscitation. We hypothesised that quantitative measurements of tissue oxygen tension would more accurately reflect adequacy of tissue perfusion during experimental cardiac arrest. Using oxygen-dependent quenching of phosphorescence, we made measurements of oxygen in the microcirculation and in the interstitial space of the brain and muscle in a porcine model of ventricular fibrillation and cardiopulmonary resuscitation. Measurements were performed at baseline, during untreated ventricular fibrillation, during resuscitation and after return of spontaneous circulation. After achieving stable baseline brain tissue oxygen tension, as measured using an Oxyphor G4-based phosphorescent microsensor, ventricular fibrillation resulted in an immediate reduction in all measured parameters. During cardiopulmonary resuscitation, brain oxygen tension remained unchanged. After the return of spontaneous circulation, all measured parameters including brain oxygen tension recovered to baseline levels. Muscle tissue oxygen tension followed a trend similar to that in the brain, but with slower response times. We conclude that measurements of brain tissue oxygen tension, which more accurately reflect adequacy of tissue perfusion during cardiac arrest and resuscitation, may contribute to the development of new strategies to optimise perfusion during cardiac resuscitation and improve patient outcomes after cardiac arrest.

  14. Quantitative assessment of short amplicons in FFPE-derived long-chain RNA

    PubMed Central

    Kong, Hui; Zhu, Mengou; Cui, Fengyun; Wang, Shuyang; Gao, Xue; Lu, Shaohua; Wu, Ying; Zhu, Hongguang

    2014-01-01

    Formalin-fixed paraffin-embedded (FFPE) tissues are important resources for molecular medical research. However, long-chain RNA analysis is restricted in FFPE tissues due to high levels of degradation. To explore the possibility of long RNA quantification in FFPE tissues, we selected 14 target RNAs (8 mRNAs and 6 long noncoding RNAs) from the literature, and designed short (~60 bp) and long (~200 bp) amplicons for each of them. Colorectal carcinomas with adjacent normal tissues were subjected to quantitative reverse-transcription PCR (quantitative RT-PCR) in 3 cohorts, including 18 snap-frozen and 83 FFPE tissues. We found that short amplicons were amplified more efficiently than long amplicons in both snap-frozen (P = 0.0006) and FFPE (P = 0.0152) tissues. Nonetheless, comparison of colorectal carcinomas with their adjacent normal tissues demonstrated that the consistency of fold-change trends for a single short amplicon between snap-frozen and FFPE tissues was only 36%. We therefore performed quantitative RT-PCR with 3 non-overlapping short amplicons for each of the 14 target RNAs in FFPE tissues. All target RNAs showed concordant fold-change trends in at least two short amplicons, which offers sufficient information for accurate quantification of target RNAs. Our findings demonstrate the feasibility of long-chain RNA analysis with 3 non-overlapping short amplicons in FFPE tissues preserved under standardized conditions.
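
    As an illustration of the three-amplicon design, the short Python sketch below computes 2^-ddCt fold changes for three non-overlapping amplicons of one target RNA and calls the trend only when at least two amplicons agree; all Ct values and the helper function are hypothetical.

```python
def fold_change(ct_target, ct_ref):
    # ct_target, ct_ref: (tumor, normal) Ct pairs for the target RNA and a
    # reference gene measured in the same samples; returns 2^-ddCt
    ddct = (ct_target[0] - ct_ref[0]) - (ct_target[1] - ct_ref[1])
    return 2.0 ** -ddct

# hypothetical Ct values for three non-overlapping short amplicons of one target
amplicons = [((26.1, 27.9), (18.0, 18.1)),
             ((25.8, 27.5), (18.0, 18.1)),
             ((26.4, 27.7), (18.0, 18.1))]

fcs = [fold_change(t, r) for t, r in amplicons]
n_up = sum(fc > 1.0 for fc in fcs)
# accept the trend only if at least two of the three amplicons agree
trend = "up" if n_up >= 2 else ("down" if len(fcs) - n_up >= 2 else "discordant")
print([round(f, 2) for f in fcs], trend)
```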

  15. Quantitative assessment of radiation force effect at the dielectric air-liquid interface

    PubMed Central

    Capeloto, Otávio Augusto; Zanuto, Vitor Santaella; Malacarne, Luis Carlos; Baesso, Mauro Luciano; Lukasievicz, Gustavo Vinicius Bassi; Bialkowski, Stephen Edward; Astrath, Nelson Guilherme Castelli

    2016-01-01

    We induce nanometer-scale surface deformation by exploiting momentum conservation of the interaction between laser light and dielectric liquids. The effect of radiation force at the air-liquid interface is quantitatively assessed for fluids with different density, viscosity and surface tension. The imparted pressure on the liquids by continuous or pulsed laser light excitation is fully described by the Helmholtz electromagnetic force density. PMID:26856622
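
    For reference, the Helmholtz electromagnetic force density invoked above is commonly written, including the electrostriction term, in the textbook form below (not an equation quoted from the paper):

$$
\mathbf{f} \;=\; -\,\frac{\varepsilon_0}{2}\,E^{2}\,\nabla \varepsilon_r \;+\; \frac{\varepsilon_0}{2}\,\nabla\!\left[\rho\,\frac{\partial \varepsilon_r}{\partial \rho}\,E^{2}\right]
$$

    where E is the electric field amplitude, ε_r the relative permittivity and ρ the fluid density; at a sharp air-liquid interface the first (gradient) term dominates and pulls the surface toward the lower-permittivity side, producing the nanometer-scale bulge measured here.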

  16. Postoperative Quantitative Assessment of Reconstructive Tissue Status in Cutaneous Flap Model using Spatial Frequency Domain Imaging

    PubMed Central

    Yafi, Amr; Vetter, Thomas S; Scholz, Thomas; Patel, Sarin; Saager, Rolf B; Cuccia, David J; Evans, Gregory R; Durkin, Anthony J

    2010-01-01

    Background The purpose of this study is to investigate the capabilities of a novel optical wide-field imaging technology known as Spatial Frequency Domain Imaging (SFDI) to quantitatively assess reconstructive tissue status. Methods Twenty-two cutaneous pedicle flaps based on the inferior epigastric vessels were created on eleven rats. After baseline measurement, all flaps underwent two hours of vascular ischemia, induced by clamping the supporting vessels (either arterio-venous or selective venous occlusion); normal saline was then injected into the control flap, and hypertonic hyperoncotic saline solution into the experimental flap. Flaps were monitored for two hours after reperfusion. The SFDI system was used for quantitative assessment of flap status over the duration of the experiment. Results All flaps demonstrated a significant decline in oxy-hemoglobin and tissue oxygen saturation in response to occlusion. Total hemoglobin and deoxy-hemoglobin were markedly increased in the selective venous occlusion group. After reperfusion and administration of the solutions, oxy-hemoglobin and tissue oxygen saturation in the flaps that survived gradually returned to baseline levels. However, flaps in which oxy-hemoglobin and tissue oxygen saturation showed no signs of recovery appeared to be compromised and became necrotic within 24-48 hours in both occlusion groups. Conclusion SFDI technology provides a quantitative, objective method to assess tissue status. This study demonstrates the potential of this optical technology to assess tissue perfusion precisely and quantitatively, enabling wide-field visualization of physiological parameters. The results suggest that SFDI may provide a means for prospectively identifying dysfunctional flaps well in advance of failure.

  17. The Quantitative Reasoning for College Science (QuaRCS) Assessment in non-Astro 101 Courses

    NASA Astrophysics Data System (ADS)

    Kirkman, Thomas W.; Jensen, Ellen

    2016-06-01

    The innumeracy of American students and adults is a much lamented educational problem. The quantitative reasoning skills of college students may be particularly addressed and improved in "general education" science courses like Astro 101. Demonstrating improvement requires a standardized instrument. Among the non-proprietary instruments, the Quantitative Literacy and Reasoning Assessment[1] (QLRA) and the Quantitative Reasoning for College Science (QuaRCS) Assessment[2] stand out. Follette et al. developed the QuaRCS in the context of Astro 101 at the University of Arizona. We report on QuaRCS results in different contexts: pre-med physics and pre-nursing microbiology at a liberal arts college. We report on the mismatch between students' contemporaneous report of a question's difficulty and the actual probability of success. We report correlations between QuaRCS and other assessments of overall student performance in the class. We report differences in attitude towards mathematics in these two different but health-related student populations. [1] QLRA: Gaze et al., 2014, DOI: http://dx.doi.org/10.5038/1936-4660.7.2.4 [2] QuaRCS: Follette et al., 2015, DOI: http://dx.doi.org/10.5038/1936-4660.8.2.2

  18. Summary of the workshop on issues in risk assessment: quantitative methods for developmental toxicology.

    PubMed

    Mattison, D R; Sandler, J D

    1994-08-01

    This report summarizes the proceedings of a conference on quantitative methods for assessing the risks of developmental toxicants. The conference was planned by a subcommittee of the National Research Council's Committee on Risk Assessment Methodology in conjunction with staff from several federal agencies, including the U.S. Environmental Protection Agency, U.S. Food and Drug Administration, U.S. Consumer Product Safety Commission, and Health and Welfare Canada. Issues discussed at the workshop included computerized techniques for hazard identification, use of human and animal data for defining risks in a clinical setting, relationships between end points in developmental toxicity testing, reference dose calculations for developmental toxicology, analysis of quantitative dose-response data, mechanisms of developmental toxicity, physiologically based pharmacokinetic models, and structure-activity relationships. Although a formal consensus was not sought, many participants favored the evolution of quantitative techniques for developmental toxicology risk assessment, including the replacement of lowest observed adverse effect levels (LOAELs) and no observed adverse effect levels (NOAELs) with the benchmark dose methodology.

  19. Synthesized quantitative assessment of human mental fatigue with EEG and HRV

    NASA Astrophysics Data System (ADS)

    Han, Qingpeng; Wang, Li; Wang, Ping; Wen, Bangchun

    2005-12-01

    The electroencephalograph (EEG) and heart rate variability (HRV) signals, which reflect human mental stress, are analyzed using nonlinear dynamics and chaos theory. Based on three calculated nonlinear parameters, a synthesized quantitative criterion is proposed to assess mental fatigue states. Firstly, the HRV signal and the α wave of the EEG are extracted from the original signals using the wavelet transform. Then three nonlinear parameters, the largest Lyapunov exponent, complexity and approximate entropy, are calculated for both the HRV and the α wave. These parameters quantitatively reflect human physiological activity and can be used to evaluate the degree of mental workload. Based on computation and statistical analysis of practical EEG and HRV data, a synthesized quantitative assessment criterion for mental fatigue is derived from the three nonlinear parameters of the two rhythms. For 10 known measured EEG and HRV data sets, assessment results were obtained by applying these rules to different mental fatigue states. Compared with the actual cases, the accuracy in identifying the presence or absence of mental fatigue is 100 percent. Furthermore, the accuracies for weak, middle and serious fatigue workloads are all relatively high, at about 94.44, 88.89, and 83.33 percent, respectively.
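
    Of the three parameters, approximate entropy is the most compact to state; a minimal Python implementation of the standard Pincus (1991) algorithm is sketched below. The embedding dimension and tolerance are the conventional defaults, not values taken from this paper.

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    # ApEn(m, r) of a 1-D signal (Pincus 1991); r is a fraction of the signal SD
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])             # embedded windows
        dist = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)  # Chebyshev distance
        c = (dist <= r).mean(axis=1)     # fraction of windows within tolerance r
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

t = np.linspace(0, 10 * np.pi, 500)
rng = np.random.default_rng(0)
print(approximate_entropy(np.sin(t)))                 # regular signal: low ApEn
print(approximate_entropy(rng.standard_normal(500)))  # irregular signal: high ApEn
```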

  20. The Utility of Maze Accurate Response Rate in Assessing Reading Comprehension in Upper Elementary and Middle School Students

    ERIC Educational Resources Information Center

    McCane-Bowling, Sara J.; Strait, Andrea D.; Guess, Pamela E.; Wiedo, Jennifer R.; Muncie, Eric

    2014-01-01

    This study examined the predictive utility of five formative reading measures: words correct per minute, number of comprehension questions correct, reading comprehension rate, number of maze correct responses, and maze accurate response rate (MARR). Broad Reading cluster scores obtained via the Woodcock-Johnson III (WJ III) Tests of Achievement…

  1. A framework for quantitative assessment of impacts related to energy and mineral resource development

    USGS Publications Warehouse

    Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine

    2013-01-01

    Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example, one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
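
    A minimal Python sketch of the algorithm's structure, using the habitat example: probabilistic resource inputs and quantified impact relationships are combined by Monte Carlo sampling to give probabilistic impact outputs. All distributions and coefficients below are hypothetical placeholders, not USGS assessment values.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000

# probabilistic resource assessment input (hypothetical): technically
# recoverable gas, modeled lognormally from assessment fractiles
gas_tcf = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=N)

# quantified development-impact relationships (hypothetical triangular ranges)
wells_per_tcf = rng.triangular(400, 600, 900, size=N)    # wells needed per TCF
pad_area_km2 = rng.triangular(0.02, 0.03, 0.05, size=N)  # disturbed area per well

# probabilistic output conveys the uncertainty inherent in the inputs
habitat_loss_km2 = gas_tcf * wells_per_tcf * pad_area_km2
print("P5, P50, P95 habitat loss (km^2):",
      np.percentile(habitat_loss_km2, [5, 50, 95]).round(1))
```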

  2. Quantitative assessment of tension in wires of fine-wire external fixators.

    PubMed

    Dong, Yin; Saleh, Micheal; Yang, Lang

    2005-01-01

    Fine-wire fixators are widely used in fracture management. Stable fixation requires that the wires maintain tension throughout treatment. Clinical experience indicates that wire site complications relate to wire tension; however, no method has been available to assess wire tension quantitatively in the clinic. The objective of this study was to develop a quantitative assessment method for in situ wire tension and to investigate the factors that influence the assessment. An apparatus was developed based on a linear variable differential transformer (LVDT) displacement transducer that measured the deflection of the test wire with respect to a parallel reference wire when a constant transverse force of 30 N was applied to the test wire. The measured wire deflection was correlated with the wire tension measured by a force transducer. The experiment was performed under different conditions to assess the effect of bone-clamp distance, reference wire tension, number of wires, and fracture stiffness. The results showed a significant negative correlation between wire tension and deflection, and the bone-clamp distance was the most important factor affecting the wire tension-deflection relationship. The assessment method makes it possible to investigate the relationship between wire tension and wire site complications in the clinic.

  3. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit.

    PubMed

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C

    2015-09-29

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson's disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor.

  5. Quantitative assessment of historical coastal landfill contamination using in-situ field portable XRF (FPXRF)

    NASA Astrophysics Data System (ADS)

    O'Shea, Francis; Spencer, Kate; Brasington, James

    2014-05-01

    Field portable XRF (FPXRF) was used in the field to determine the presence, location and extent of the sub-surface contaminant plume. Although XRF analysis has gained acceptance in the study of in-situ metal contamination (Kalnicky and Singhvi 2001; Martin Peinado et al. 2010), field moisture content and sample heterogeneity can suppress X-ray signals. Therefore, sediment samples were also collected, returned to the laboratory and analysed by ICP-OES for comparison. Both wet and dry certified reference materials were also analysed in the laboratory using XRF and ICP-OES to observe the impact of moisture content and to produce a correction factor allowing quantitative data to be collected in the field. In-situ raw XRF data identified the location of contamination plumes in the field in agreement with ICP data, although the data were systematically suppressed compared to ICP data, under-estimating the levels of contamination. Applying a correction factor for moisture content provided accurate measurements of concentration. The use of field portable XRF with the application of a moisture content correction factor enables the rapid screening of sediment fronting coastal landfill sites, goes some way towards providing a national baseline dataset and can contribute to the development of risk assessments.
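
    A minimal sketch of the correction step, assuming a simple element-wise linear suppression model calibrated from certified reference materials (CRMs) measured wet and dry; all readings below are hypothetical.

```python
# FPXRF readings on the same CRM, field-moist and dried (mg/kg, hypothetical)
wet_crm = {"Pb": 310.0, "Zn": 540.0}
dry_crm = {"Pb": 420.0, "Zn": 735.0}

# one correction factor per element: ratio of dry to wet response
cf = {el: dry_crm[el] / wet_crm[el] for el in wet_crm}

# in-situ readings on moist sediment, corrected toward dry-equivalent values
field_reading = {"Pb": 250.0, "Zn": 410.0}
corrected = {el: round(field_reading[el] * cf[el], 1) for el in field_reading}
print(corrected)  # estimates comparable with laboratory ICP-OES results
```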

  6. ICSH recommendations for assessing automated high-performance liquid chromatography and capillary electrophoresis equipment for the quantitation of HbA2.

    PubMed

    Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J

    2015-10-01

    Automated high performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between people who are carriers and people who are not, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed.

  7. Depth-dependent displacement sensitivity analysis and the influence of Doppler angle for quantitative assessment of mechanical properties using phase-sensitive spectral domain optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Lynch, Gillian; Subhash, Hrebesh; Alexandrov, Sergey; Leahy, Martin

    2016-03-01

    Optical coherence elastography (OCE) assesses the mechanical properties of samples by applying a mechanical stimulus and detecting the resulting sample displacement using optical coherence tomography (OCT). OCE methods which utilise the phase of the OCT signal offer the potential to detect displacements on the sub-nanometre scale. However, the displacement sensitivity achievable is directly related to the signal-to-noise ratio and phase stability of the underlying OCT system. Furthermore, estimation of the Doppler angle is imperative for accurately measuring the sample displacement. This work evaluates the contributions of each of these parameters for quantitative assessment of mechanical properties using phase-sensitive spectral domain OCT.

  8. Validation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in strawberry fruits using different cultivars and osmotic stresses.

    PubMed

    Galli, Vanessa; Borowski, Joyce Moura; Perin, Ellen Cristina; Messias, Rafael da Silva; Labonde, Julia; Pereira, Ivan dos Santos; Silva, Sérgio Delmar Dos Anjos; Rombaldi, Cesar Valmor

    2015-01-10

    The increasing demand for strawberry (Fragaria×ananassa Duch) fruits is associated mainly with their sensorial characteristics and content of antioxidant compounds. Nevertheless, strawberry production has been hampered by the plant's sensitivity to abiotic stresses. Understanding the molecular mechanisms underlying the stress response is therefore of great importance for genetic engineering approaches aiming to improve strawberry tolerance. However, the study of gene expression in strawberry requires suitable reference genes. In the present study, seven traditional and novel candidate reference genes were evaluated for transcript normalization in fruits of ten strawberry cultivars and under two abiotic stresses, using RefFinder, which integrates the four major currently available software programs: geNorm, NormFinder, BestKeeper and the comparative delta-Ct method. The results indicate that expression stability depends on the experimental conditions. The candidate reference gene DBP (DNA binding protein) was the most suitable for normalizing expression data across strawberry cultivars and under drought stress, and the candidate reference gene HISTH4 (histone H4) was the most stable under osmotic and salt stresses. The traditional genes GAPDH (glyceraldehyde-3-phosphate dehydrogenase) and 18S (18S ribosomal RNA) were the most unstable genes under all conditions. Expression of the phenylalanine ammonia lyase (PAL) and 9-cis epoxycarotenoid dioxygenase (NCED1) genes was used to further confirm the validated candidate reference genes, showing that use of an inappropriate reference gene may produce erroneous results. This study is the first survey of the stability of reference genes in strawberry cultivars under osmotic stresses and provides guidelines for obtaining more accurate RT-qPCR results in future breeding efforts.

  9. Computed tomography-based quantitative assessment of lower extremity lymphedema following treatment for gynecologic cancer

    PubMed Central

    Chung, Seung Hyun; Kim, Young Jae; Kim, Kwang Gi; Hwang, Ji Hye

    2017-01-01

    Objective To develop an algorithmic quantitative skin and subcutaneous tissue volume measurement protocol for lower extremity lymphedema (LEL) patients using computed tomography (CT), to verify the usefulness of the measurement techniques in LEL patients, and to observe the structural characteristics of subcutaneous tissue according to the progression of LEL in gynecologic cancer. Methods A program for algorithmic quantitative analysis of lower extremity CT scans was developed to measure the skin and subcutaneous volume, muscle compartment volume, and the extent of the peculiar trabecular area with a honeycombed pattern. The CT venographies of 50 lower extremities from 25 subjects were reviewed in two groups (acute and chronic lymphedema). Results A significant increase in total volume, subcutaneous volume, and the extent of the peculiar trabecular area with a honeycombed pattern, but not in muscle compartment volume, was identified in the more-affected limb. The correlation of CT-based total volume and subcutaneous volume measurements with volumetry measurements was strong (correlation coefficients 0.747 and 0.749, respectively). A larger extent of the peculiar trabecular area with a honeycombed pattern in the subcutaneous tissue was identified in the more-affected limb of the chronic lymphedema group. Conclusion CT-based quantitative assessments could provide objective volume measurements and information about the structural characteristics of subcutaneous tissue in women with LEL following treatment for gynecologic cancer.

  10. Quantitative assessment of microvasculopathy in arcAβ mice with USPIO-enhanced gradient echo MRI

    PubMed Central

    Deistung, Andreas; Ielacqua, Giovanna D; Seuwen, Aline; Kindler, Diana; Schweser, Ferdinand; Vaas, Markus; Kipar, Anja; Reichenbach, Jürgen R; Rudin, Markus

    2015-01-01

    Magnetic resonance imaging employing administration of iron oxide-based contrast agents is widely used to visualize cellular and molecular processes in vivo. In this study, we investigated the ability of R2* and quantitative susceptibility mapping to quantitatively assess the accumulation of ultrasmall superparamagnetic iron oxide (USPIO) particles in the arcAβ mouse model of cerebral amyloidosis. Gradient-echo data of mouse brains were acquired at 9.4 T after injection of USPIO. Focal areas with increased magnetic susceptibility and R2* values were discernible across several brain regions in 12-month-old arcAβ compared to 6-month-old arcAβ mice and to non-transgenic littermates, indicating accumulation of particles after USPIO injection. This was concomitant with higher R2* and increased magnetic susceptibility differences relative to cerebrospinal fluid measured in USPIO-injected compared to non-USPIO-injected 12-month-old arcAβ mice. No differences in R2* and magnetic susceptibility were detected in USPIO-injected compared to non-injected 12-month-old non-transgenic littermates. Histological analysis confirmed focal uptake of USPIO particles in perivascular macrophages adjacent to small caliber cerebral vessels with radii of 2–8 µm that showed no cerebral amyloid angiopathy. USPIO-enhanced R2* and quantitative susceptibility mapping constitute quantitative tools to monitor such functional microvasculopathies. PMID:26661253

  12. Quantitative assessment of p-glycoprotein expression and function using confocal image analysis.

    PubMed

    Hamrang, Zahra; Arthanari, Yamini; Clarke, David; Pluen, Alain

    2014-10-01

    P-glycoprotein is implicated in clinical drug resistance; thus, rapid quantitative analysis of its expression and activity is of paramount importance to the design and success of novel therapeutics. The scope for the application of quantitative imaging and image analysis tools in this field is reported here at "proof of concept" level. P-glycoprotein expression was utilized as a model for quantitative immunofluorescence and subsequent spatial intensity distribution analysis (SpIDA). Following expression studies, p-glycoprotein inhibition as a function of verapamil concentration was assessed in two cell lines using live cell imaging of intracellular Calcein retention and a routine monolayer fluorescence assay. Intercellular and sub-cellular distributions of p-glycoprotein transporter expression were examined in parent and MDR1-transfected Madin-Darby Canine Kidney cell lines. We have demonstrated that quantitative imaging can provide dose-response parameters while permitting direct microscopic analysis of intracellular fluorophore distributions in live and fixed samples. Analysis with SpIDA offers the ability to detect heterogeneity in the distribution of labeled species, and in conjunction with live cell imaging and immunofluorescence staining may be applied to the determination of pharmacological parameters or the analysis of biopsies, providing a rapid prognostic tool.

  13. Evaluation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in Pyrus pyrifolia using different tissue samples and seasonal conditions.

    PubMed

    Imai, Tsuyoshi; Ubi, Benjamin E; Saito, Takanori; Moriguchi, Takaya

    2014-01-01

    We have evaluated suitable reference genes for real-time quantitative PCR (RT-qPCR) analysis in Japanese pear (Pyrus pyrifolia). We tested the genes most frequently used in the literature, such as β-Tubulin, Histone H3, Actin, Elongation factor-1α and Glyceraldehyde-3-phosphate dehydrogenase, together with the newly added genes Annexin, SAND and TIP41. A total of 17 primer combinations for these eight genes were evaluated using cDNAs synthesized from 16 tissue samples from four groups, namely: flower bud, flower organ, fruit flesh and fruit skin. Gene expression stabilities were analyzed using the geNorm and NormFinder software packages or by the ΔCt method. geNorm analysis indicated that the three best performing genes are sufficient for reliable normalization of RT-qPCR data. Suitable reference genes differed among sample groups, underlining the importance of validating the expression stability of reference genes in the samples of interest. Stability rankings were basically similar between geNorm and NormFinder, suggesting the usefulness of these programs, which are based on different algorithms. The ΔCt method gave somewhat different results in some groups, such as flower organ or fruit skin, though the overall results correlated well with geNorm and NormFinder. Expression of two cold-inducible genes, PpCBF2 and PpCBF4, was quantified using the three most and the three least stable reference genes suggested by geNorm. Although the normalized quantities differed between them, the relative quantities within a group of samples were similar even when the least stable reference genes were used. Our data suggest that using the geometric mean of three reference genes for normalization is a reliable approach to evaluating gene expression by RT-qPCR. We propose that initial evaluation of gene expression stability by the ΔCt method, followed by evaluation with geNorm or NormFinder for a limited number of superior candidates, is a practical way of identifying suitable reference genes.
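
    For concreteness, a minimal Python sketch of the geNorm stability measure M (the mean pairwise variation of a gene's log ratios with all other candidates; lower M means more stable) is given below, with synthetic data standing in for real Ct-derived quantities.

```python
import numpy as np

def genorm_m(quantities):
    # quantities: (n_samples, n_genes) relative expression values (e.g. 2**-Ct);
    # returns the geNorm stability measure M for each candidate gene
    logq = np.log2(quantities)
    n_genes = logq.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        # pairwise variation V_jk: SD across samples of the log ratio gene j / gene k
        v = [np.std(logq[:, j] - logq[:, k], ddof=1)
             for k in range(n_genes) if k != j]
        m[j] = np.mean(v)
    return m

rng = np.random.default_rng(2)
q = 2.0 ** -rng.normal(20, 1, size=(16, 8))   # 16 tissue samples, 8 candidates
print(genorm_m(q).round(2))  # rank candidates; keep e.g. the three lowest-M genes
```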

  14. Evaluating quantitative formulas for dose-response assessment of chemical mixtures.

    PubMed

    Hertzberg, Richard C; Teuschler, Linda K

    2002-12-01

    Risk assessment formulas are often distinguished from dose-response models by being rough but necessary. The evaluation of these rough formulas is described here, using the example of mixture risk assessment. Two conditions make the dose-response part of mixture risk assessment difficult: lack of data on mixture dose-response relationships, and the need to address risk from combinations of chemicals because of public demands and statutory requirements. Consequently, the U.S. Environmental Protection Agency has developed methods for carrying out quantitative dose-response assessment for chemical mixtures that require information only on the toxicity of single chemicals and of chemical pair interactions. These formulas are based on plausible ideas and default parameters but minimal supporting data on whole mixtures. Because of this lack of mixture data, the usual evaluation of accuracy (predicted vs. observed) cannot be performed. Two approaches to the evaluation of such formulas are to consider the fundamental biological concepts that support the quantitative formulas (e.g., toxicologic similarity) and to determine how well the proposed method performs under simplifying constraints (e.g., as the toxicologic interactions disappear). These ideas are illustrated using dose addition and two weight-of-evidence formulas for incorporating toxicologic interactions.
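
    Dose addition in its simplest screening form is the hazard index, the sum of hazard quotients; the weight-of-evidence formulas mentioned above modify this sum using pairwise interaction data. A minimal sketch with hypothetical chemical names, exposures and reference doses:

```python
# hazard index HI = sum of hazard quotients E_i / RfD_i; HI > 1 flags concern
exposures = {"chem_A": 0.02, "chem_B": 0.15, "chem_C": 0.01}   # mg/kg-day
rfd       = {"chem_A": 0.05, "chem_B": 0.50, "chem_C": 0.004}  # reference doses

hq = {c: exposures[c] / rfd[c] for c in exposures}
hi = sum(hq.values())
print(hq, "HI =", round(hi, 2))
```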

  15. Fibrosis assessment: impact on current management of chronic liver disease and application of quantitative invasive tools.

    PubMed

    Wang, Yan; Hou, Jin-Lin

    2016-05-01

    Fibrosis, a common pathogenic pathway of chronic liver disease (CLD), has long been recognized as significantly, and most importantly, associated with severe prognosis. Nowadays, with remarkable advances in the understanding and/or treatment of major CLDs such as hepatitis C, hepatitis B, and nonalcoholic fatty liver disease, there is an unprecedented requirement for the diagnosis and assessment of liver fibrosis or cirrhosis in various clinical settings. Among the available approaches, liver biopsy remains the one that provides the most direct and reliable information regarding fibrosis patterns and changes in the parenchyma at different clinical stages and with different etiologies. Thus, many endeavors have been undertaken to develop methodologies based on a quantitation strategy for the invasive assessment. Here, we analyze the impact of fibrosis assessment on CLD patient care based on data from recent clinical studies. We discuss and update the current invasive tools regarding their technological features and potential for particular clinical applications. Furthermore, we propose potential resolutions, using quantitative invasive tools, for some major issues in fibrosis assessment that remain obstacles to the current rapid progress in CLD medicine.

  16. A Compressed Sensing-Based Wearable Sensor Network for Quantitative Assessment of Stroke Patients.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-02-05

    Clinical rehabilitation assessment is an important part of the therapy process because it is the premise for prescribing suitable rehabilitation interventions. However, the commonly used assessment scales have two drawbacks: (1) they are susceptible to subjective factors; and (2) they have only a few rating levels and are influenced by a ceiling effect, making it impossible to detect further improvement in movement. Meanwhile, energy constraints are a primary design consideration in wearable sensor network systems, since they are often battery-operated. Traditionally, in wearable sensor network systems that follow the Shannon/Nyquist sampling theorem, large amounts of data must be sampled and transmitted. This paper proposes a novel wearable sensor network system to monitor and quantitatively assess upper limb motion function, based on compressed sensing technology. With the sparse representation model, less data is transmitted to the computer than with traditional systems. The experimental results show that the accelerometer signals of Bobath handshake and shoulder touch exercises can be compressed, and the length of the compressed signal is less than 1/3 of the raw signal length. More importantly, the reconstruction errors have no influence on the predictive accuracy of the Brunnstrom stage classification model. The results also indicate that the proposed system can not only reduce the amount of data during the sampling and transmission processes, but the reconstructed accelerometer signals can also be used for quantitative assessment without any loss of useful information.
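
    To illustrate the recover-from-fewer-samples idea, the Python sketch below compresses a synthetic sparse signal to fewer than a third of its samples with a random sensing matrix and reconstructs it with orthogonal matching pursuit. This is a generic compressed sensing demonstration, not the paper's system; the basis, matrix sizes and sparsity level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, k = 256, 80, 8          # signal length, measurements (< n/3), sparsity

# build a signal that is k-sparse in a random orthonormal basis Psi
Psi = np.linalg.qr(rng.standard_normal((n, n)))[0]
s = np.zeros(n)
s[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
x = Psi @ s

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
y = Phi @ x                                      # compressed measurements
A = Phi @ Psi                                    # effective dictionary

# orthogonal matching pursuit: greedily pick atoms, re-fit by least squares
support, r = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ r))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    r = y - A[:, support] @ coef

s_hat = np.zeros(n)
s_hat[support] = coef
x_hat = Psi @ s_hat
print("relative error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```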

  18. Quantitative MRI assessments of white matter in children treated for acute lymphoblastic leukemia

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Helton, Kathleen J.; Li, Chin-Shang; Pui, Ching-Hon

    2005-04-01

    The purpose of this study was to use objective quantitative MR imaging methods to prospectively assess changes in the physiological structure of white matter during the temporal evolution of leukoencephalopathy (LE) in children treated for acute lymphoblastic leukemia. The longitudinal incidence, extent (proportion of white matter affected), and intensity (elevation of T1 and T2 relaxation rates) of LE were evaluated for 44 children. A combined imaging set consisting of T1, T2, PD, and FLAIR MR images, together with white matter, gray matter and CSF a priori maps from a spatially normalized atlas, was analyzed with a neural network segmentation based on a Kohonen Self-Organizing Map (SOM). Quantitative T1 and T2 relaxation maps were generated using a nonlinear parametric optimization procedure to fit the corresponding multi-exponential models. A Cox proportional hazards regression was performed to estimate the effect of intravenous methotrexate (IV-MTX) exposure on the development of LE, followed by a generalized linear model to predict the probability of LE in new patients. Additional t-tests of independent samples were performed to assess differences in quantitative measures of extent and intensity at four different points in therapy. Higher doses and more courses of IV-MTX placed patients at higher risk of developing LE and were associated with more intense changes affecting more of the white matter volume; many of the changes resolved after completion of therapy. The impact of these changes on neurocognitive functioning and quality of life in survivors remains to be determined.
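
    As a minimal stand-in for the relaxometry step, the sketch below fits a mono-exponential T2 decay to synthetic multi-echo data with scipy; the study fits multi-exponential models, and all numbers here are illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

# mono-exponential decay: S(TE) = S0 * exp(-TE / T2)
def decay(te, s0, t2):
    return s0 * np.exp(-te / t2)

te = np.array([20, 40, 60, 80, 100, 120.0])   # echo times, ms
rng = np.random.default_rng(4)
signal = decay(te, 1000.0, 85.0) + rng.normal(0, 5, te.size)  # noisy synthetic data

(p_s0, p_t2), _ = curve_fit(decay, te, signal, p0=(signal[0], 50.0))
# relaxation rate R2 = 1/T2, reported in 1/s
print(f"T2 ~ {p_t2:.1f} ms, R2 ~ {1000 / p_t2:.2f} 1/s")
```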

  19. A remote quantitative Fugl-Meyer assessment framework for stroke patients based on wearable sensor networks.

    PubMed

    Yu, Lei; Xiong, Daxi; Guo, Liquan; Wang, Jiping

    2016-05-01

    To extend the use of wearable sensor networks for stroke patient training and assessment in non-clinical settings, this paper proposes a novel remote quantitative Fugl-Meyer assessment (FMA) framework, in which two accelerometers and seven flex sensors were used to monitor the movement function of the upper limb, wrist and fingers. An extreme learning machine based ensemble regression model was established to map the sensor data to clinical FMA scores, while the RRelief algorithm was applied to find the optimal feature subset. Because the FMA scale is time-consuming and complicated, seven training exercises were designed to replace the 33 upper-limb-related items in the FMA scale. Twenty-four stroke inpatients participated in the experiments in clinical settings, and 5 of them were involved in the experiments in home settings after they left the hospital. The experimental results in both clinical and home settings showed that the proposed quantitative FMA model can precisely predict FMA scores from wearable sensor data; the coefficient of determination reached as high as 0.917. This indicates that the proposed framework provides a potential approach to remote quantitative rehabilitation training and evaluation.
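
    The core of an extreme learning machine is a fixed random hidden layer with analytically solved output weights. The Python sketch below shows that core on toy data; it is a single ELM regressor, not the paper's ensemble model, and the feature dimensions and target are invented.

```python
import numpy as np

def elm_fit(X, y, n_hidden=100, seed=0):
    # random input weights are fixed; only output weights are learned (least squares)
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, (X.shape[1], n_hidden))
    b = rng.uniform(-1, 1, n_hidden)
    H = np.tanh(X @ W + b)              # hidden-layer activations
    beta = np.linalg.pinv(H) @ y        # analytic output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 12))      # 12 sensor-derived features (hypothetical)
y = X[:, 0] - 2 * X[:, 3] ** 2 + 0.1 * rng.standard_normal(200)  # toy "FMA score"

W, b, beta = elm_fit(X[:150], y[:150])
pred = elm_predict(X[150:], W, b, beta)
ss_res = ((y[150:] - pred) ** 2).sum()
ss_tot = ((y[150:] - y[150:].mean()) ** 2).sum()
print("R^2 =", round(1 - ss_res / ss_tot, 3))
```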

  20. Automated Tracking of Quantitative Assessments of Tumor Burden in Clinical Trials

    PubMed Central

    Rubin, Daniel L; Willrett, Debra; O'Connor, Martin J; Hage, Cleber; Kurtz, Camille; Moreira, Dilvan A

    2014-01-01

    There are two key challenges hindering effective use of quantitative assessment of imaging in cancer response assessment: 1) Radiologists usually describe the cancer lesions in imaging studies subjectively and sometimes ambiguously, and 2) it is difficult to repurpose imaging data, because lesion measurements are not recorded in a format that permits machine interpretation and interoperability. We have developed a freely available software platform on the basis of open standards, the electronic Physician Annotation Device (ePAD), to tackle these challenges in two ways. First, ePAD facilitates the radiologist in carrying out cancer lesion measurements as part of routine clinical trial image interpretation workflow. Second, ePAD records all image measurements and annotations in a data format that permits repurposing image data for analyses of alternative imaging biomarkers of treatment response. To determine the impact of ePAD on radiologist efficiency in quantitative assessment of imaging studies, a radiologist evaluated computed tomography (CT) imaging studies from 20 subjects having one baseline and three consecutive follow-up imaging studies with and without ePAD. The radiologist made measurements of target lesions in each imaging study using Response Evaluation Criteria in Solid Tumors 1.1 criteria, initially with the aid of ePAD, and then after a 30-day washout period, the exams were reread without ePAD. The mean total time required to review the images and summarize measurements of target lesions was 15% (P < .039) shorter using ePAD than without using this tool. In addition, it was possible to rapidly reanalyze the images to explore lesion cross-sectional area as an alternative imaging biomarker to linear measure. We conclude that ePAD appears promising to potentially improve reader efficiency for quantitative assessment of CT examinations, and it may enable discovery of future novel image-based biomarkers of cancer treatment response. PMID:24772204

  1. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    PubMed

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
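
    The multinomial step can be sketched in a few lines of Python: each Monte Carlo iteration allocates the contaminating cells among strain groups that may or may not carry the enterotoxin A (sea) gene. All prevalences, distributions and thresholds below are hypothetical placeholders, not the study's fitted values.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 5_000

# hypothetical strain structure: relative prevalence of three strain groups and
# whether each group carries the sea gene
prevalence = np.array([0.55, 0.30, 0.15])
produces_sea = np.array([False, True, True])

risk_flags = np.empty(N, dtype=bool)
for i in range(N):
    cfu_ml = 10 ** rng.normal(3.0, 1.0)      # variability in initial contamination
    cells = int(min(cfu_ml, 1e7))
    # multinomial step: allocate cells among strain groups for this iteration
    counts = rng.multinomial(cells, prevalence)
    toxigenic = counts[produces_sea].sum()
    # crude stand-in for the CPM: household storage allows some growth, and
    # toxin formation is flagged above an illustrative cell-density threshold
    growth = 10 ** rng.normal(1.0, 0.5)
    risk_flags[i] = toxigenic * growth > 1e5

print("P(exposure above threshold) ~", risk_flags.mean())
```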

  2. A Quantitative Climate-Match Score for Risk-Assessment Screening of Reptile and Amphibian Introductions

    NASA Astrophysics Data System (ADS)

    van Wilgen, Nicola J.; Roura-Pascual, Núria; Richardson, David M.

    2009-09-01

    Assessing climatic suitability provides a good preliminary estimate of the invasive potential of a species to inform risk assessment. We examined two approaches for bioclimatic modeling for 67 reptile and amphibian species introduced to California and Florida. First, we modeled the worldwide distribution of the biomes found in the introduced range to highlight similar areas worldwide from which invaders might arise. Second, we modeled potentially suitable environments for species based on climatic factors in their native ranges, using three sources of distribution data. Performance of the three datasets and both approaches were compared for each species. Climate match was positively correlated with species establishment success (maximum predicted suitability in the introduced range was more strongly correlated with establishment success than mean suitability). Data assembled from the Global Amphibian Assessment through NatureServe provided the most accurate models for amphibians, while ecoregion data compiled by the World Wide Fund for Nature yielded models which described reptile climatic suitability better than available point-locality data. We present three methods of assigning a climate-match score for use in risk assessment using both the mean and maximum climatic suitabilities. Managers may choose to use different methods depending on the stringency of the assessment and the available data, facilitating higher resolution and accuracy for herpetofaunal risk assessment. Climate-matching has inherent limitations and other factors pertaining to ecological interactions and life-history traits must also be considered for thorough risk assessment.

  3. Aggregate versus Individual-Level Sexual Behavior Assessment: How Much Detail Is Needed to Accurately Estimate HIV/STI Risk?

    ERIC Educational Resources Information Center

    Pinkerton, Steven D.; Galletly, Carol L.; McAuliffe, Timothy L.; DiFranceisco, Wayne; Raymond, H. Fisher; Chesson, Harrell W.

    2010-01-01

    The sexual behaviors of HIV/sexually transmitted infection (STI) prevention intervention participants can be assessed on a partner-by-partner basis, in aggregate (i.e., total numbers of sex acts, collapsed across partners), or using a combination of these two methods (e.g., assessing five partners in detail and any remaining partners in aggregate).…

  4. Experimental assessment of bone mineral density using quantitative computed tomography in Holstein dairy cows

    PubMed Central

    MAETANI, Ayami; ITOH, Megumi; NISHIHARA, Kahori; AOKI, Takahiro; OHTANI, Masayuki; SHIBANO, Kenichi; KAYANO, Mitsunori; YAMADA, Kazutaka

    2016-01-01

    The aim of this study was to assess the measurement of bone mineral density (BMD) by quantitative computed tomography (QCT), comparing the relationships of BMD between QCT and dual-energy X-ray absorptiometry (DXA) and between QCT and radiographic absorptiometry (RA) in the metacarpal bone of Holstein dairy cows (n=27). A significant positive correlation was found between QCT and DXA measurements (r=0.70, P<0.01), and a significant correlation was found between QCT and RA measurements (r=0.50, P<0.01). We conclude that QCT provides quantitative evaluation of BMD in dairy cows, because BMD measured by QCT showed positive correlations with BMD measured by the two conventional methods: DXA and RA. PMID:27075115

  5. A quantitative collagen fibers orientation assessment using birefringence measurements: Calibration and application to human osteons

    PubMed Central

    Spiesz, Ewa M.; Kaminsky, Werner; Zysset, Philippe K.

    2011-01-01

    Even though mechanical properties depend strongly on the arrangement of collagen fibers in mineralized tissues, this arrangement is not yet well resolved. Only a few semi-quantitative evaluations of the fiber arrangement in bone, such as spectroscopic techniques or circularly polarized light microscopy methods, are available. In this study, the out-of-plane collagen arrangement angle was calibrated against the linear birefringence of a longitudinally fibered mineralized turkey leg tendon cut at a variety of angles to the main axis. The calibration curve was applied to human cortical bone osteons to quantify the out-of-plane arrangement of collagen fibers. The proposed calibration curve is normalized to sample thickness and the wavelength of the probing light to enable a universally applicable quantitative assessment. This approach may improve our understanding of the fibrillar structure of bone and its implications for mechanical properties.

  6. Valuation of ecotoxicological impacts from tributyltin based on a quantitative environmental assessment framework.

    PubMed

    Noring, Maria; Håkansson, Cecilia; Dahlgren, Elin

    2016-02-01

    In the scientific literature, few valuations of biodiversity and ecosystem services following the impacts of toxicity are available, hampered by the lack of ecotoxicological documentation. Here, tributyltin is used to conduct a contingent valuation study as well as a cost-benefit analysis (CBA) of measures for improving the environmental status of Swedish coastal waters of the Baltic Sea. Benefits of considering different dimensions when assessing environmental status are highlighted, and a quantitative environmental assessment framework based on available technology, ecological conditions, and economic valuation methodology is developed. Two scenarios are used in the valuation study: (a) achieving good environmental status by 2020 in accordance with EU legislation (USD 119 per household per year) and (b) achieving visible improvements by 2100 due to natural degradation (USD 108 per household per year), in both cases with payments over 8 years. The latter scenario was used to illustrate an application of the assessment framework. The CBA results indicate that both scenarios might generate a welfare improvement.

  7. Accurate quantitative measurements of brachial artery cross-sectional vascular area and vascular volume elastic modulus using automated oscillometric measurements: comparison with brachial artery ultrasound

    PubMed Central

    Tomiyama, Yuuki; Yoshinaga, Keiichiro; Fujii, Satoshi; Ochi, Noriki; Inoue, Mamiko; Nishida, Mutumi; Aziki, Kumi; Horie, Tatsunori; Katoh, Chietsugu; Tamaki, Nagara

    2015-01-01

    Increasing vascular diameter and attenuated vascular elasticity may be reliable markers for atherosclerotic risk assessment. However, previous measurements have been complex, operator-dependent or invasive. Recently, we developed a new automated oscillometric method to measure a brachial artery's estimated area (eA) and volume elastic modulus (VE). The aim of this study was to investigate the reliability of the new automated oscillometric measurement of eA and VE. Resting eA and VE were measured using the recently developed automated detector with the oscillometric method. eA was estimated using pressure/volume curves, and VE was defined as VE = Δpressure/(100 × Δarea/area) mm Hg/%. Sixteen volunteers (age 35.2±13.1 years) underwent oscillometric measurement and brachial ultrasound at rest and under nitroglycerin (NTG) administration. Oscillometric measurement was performed twice on different days. Resting eA correlated with the ultrasound-measured brachial artery area (r=0.77, P<0.001). Resting eA and VE measurements showed good reproducibility (eA: intraclass correlation coefficient (ICC)=0.88; VE: ICC=0.78). Under NTG stress, eA was significantly increased (12.3±3.0 vs. 17.1±4.6 mm2, P<0.001), similar to the ultrasound evaluation (4.46±0.72 vs. 4.73±0.75 mm, P<0.001), and VE was decreased (0.81±0.16 vs. 0.65±0.11 mm Hg/%, P<0.001) after NTG. The cross-sectional vascular area calculated using this automated oscillometric measurement correlated with ultrasound measurement and showed good reproducibility. Therefore, this is a reliable approach, and the modality may have practical application for automatically assessing muscular artery diameter and elasticity in clinical or epidemiological settings.
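
    A worked example of the VE definition above; the input numbers are hypothetical, chosen only to land near the reported resting range.

```python
# VE = d(pressure) / (100 * d(area)/area), in mmHg per percent area change
area = 12.3          # baseline cross-sectional area, mm^2
d_area = 0.9         # pulsatile area change, mm^2
d_pressure = 6.0     # corresponding pressure change, mmHg

ve = d_pressure / (100.0 * d_area / area)
print(f"VE = {ve:.2f} mmHg/%")   # ~0.82, close to the resting values reported
```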

  8. Qualitative and quantitative approaches in the dose-response assessment of genotoxic carcinogens.

    PubMed

    Fukushima, Shoji; Gi, Min; Kakehashi, Anna; Wanibuchi, Hideki; Matsumoto, Michiharu

    2016-05-01

    Qualitative and quantitative approaches are important issues in the field of carcinogenic risk assessment of genotoxic carcinogens. Herein, we provide quantitative data from low-dose hepatocarcinogenicity studies of three genotoxic hepatocarcinogens: 2-amino-3,8-dimethylimidazo[4,5-f]quinoxaline (MeIQx), 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and N-nitrosodiethylamine (DEN). Hepatocarcinogenicity was examined by quantitative analysis of glutathione S-transferase placental form (GST-P) positive foci, which are the preneoplastic lesions in rat hepatocarcinogenesis and the endpoint carcinogenic marker in the rat liver medium-term carcinogenicity bioassay. We also examined DNA damage and gene mutations occurring during the initiation stage of carcinogenesis. To establish points of departure (PoD) from which cancer-related risk can be estimated, we analyzed the above events by quantitative no-observed-effect level and benchmark dose approaches. MeIQx at low doses induced formation of DNA-MeIQx adducts; somewhat higher doses caused elevation of 8-hydroxy-2'-deoxyguanosine levels; at still higher doses gene mutations occurred; and the highest dose induced formation of GST-P positive foci. These data indicate that early genotoxic events in the pathway to carcinogenesis showed the expected trend of lower PoDs for earlier events in the carcinogenic process. Similarly, only the highest dose of IQ caused an increase in the number of GST-P positive foci in the liver, while IQ-DNA adduct formation was observed at low doses. Moreover, treatment with DEN at low doses had no effect on the development of GST-P positive foci in the liver. These PoD data for the markers contribute to understanding whether genotoxic carcinogens have a threshold for their carcinogenicity. The most appropriate approach to low-dose-response assessment must be decided on the basis of scientific judgment.

  9. Quantitative muscle strength assessment in duchenne muscular dystrophy: longitudinal study and correlation with functional measures

    PubMed Central

    2012-01-01

    Background The aim of this study was to perform a longitudinal assessment using Quantitative Muscle Testing (QMT) in a cohort of ambulant boys affected by Duchenne muscular dystrophy (DMD) and to correlate the results of QMT with functional measures. This study is to date the most thorough long-term evaluation of QMT in a cohort of DMD patients correlated with other measures, such as the North Star Ambulatory Assessment (NSAA) or the 6-min walk test (6MWT). Methods This is a single-centre, prospective, non-randomised study assessing QMT using the Kin Com® 125 machine in a study cohort of 28 ambulant DMD boys, aged 5 to 12 years. This cohort was assessed longitudinally over a 12-month period, with QMT performed every 3 months and functional abilities assessed using the NSAA and the 6MWT at baseline and at 12 months only. QMT was also used in a control group of 13 healthy age-matched boys examined at baseline and at 12 months. Results There was an increase in QMT over 12 months in boys below the age of 7.5 years, while in boys above the age of 7.5 years QMT showed a significant decrease. All the average one-year changes were significantly different from those experienced by healthy controls. We also found a good correlation between the quantitative tests and the other measures, which was more obvious in the stronger children. Conclusion Our longitudinal data using QMT in a cohort of DMD patients suggest that it could be used as an additional tool to monitor changes, providing additional information on segmental strength. PMID:22974002

  10. MINKOWSKI FUNCTIONALS FOR QUANTITATIVE ASSESSMENTS OF SHOCK-INDUCED MIXING FLOWS

    SciTech Connect

    STRELITZ, RICHARD A.; KAMM, JAMES R.

    2007-01-22

    We describe the morphological descriptors known as Minkowski functionals (MFs) and apply them to a shock-induced mixing problem. MFs allow accurate and compact characterization of complex images: they characterize the connectivity, size, and shape of disordered structures, and they possess several desirable properties, such as additivity, smoothness, and a direct relationship to certain physical properties. The scalar MFs that we describe can be extended to a moment-based tensor form that allows more thorough image descriptions. We apply MFs to experimental data from shock-induced mixing experiments conducted at the LANL shock tube facility. Those experiments, using low Mach number shock waves in air to induce the Richtmyer-Meshkov instability on air-SF6 interfaces, provide high-resolution, quantitative planar laser-induced fluorescence (PLIF) images. We describe MFs and use them to quantify experimental PLIF images of shock-induced mixing. This method can be used as a tool for validation, i.e., the quantitative comparison of simulation results against experimental data.
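    For readers unfamiliar with the three scalar MFs in 2D (area, boundary length, and Euler characteristic), the sketch below computes them for a thresholded binary image with scikit-image; the random field and the threshold are stand-ins for a real PLIF image, not the LANL pipeline.

        import numpy as np
        from skimage import measure

        rng = np.random.default_rng(0)
        field = rng.random((256, 256))   # stand-in for a PLIF intensity image
        binary = field > 0.6             # threshold to a binary mixing structure

        area = binary.sum()                       # M0: covered area (pixels)
        boundary = measure.perimeter(binary, 4)   # M1: boundary length
        euler = measure.euler_number(binary)      # M2: components minus holes

        print(f"area={area}, perimeter={boundary:.1f}, Euler number={euler}")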

  11. Exploring new quantitative CT image features to improve assessment of lung cancer prognosis

    NASA Astrophysics Data System (ADS)

    Emaminejad, Nastaran; Qian, Wei; Kang, Yan; Guan, Yubao; Lure, Fleming; Zheng, Bin

    2015-03-01

    Due to the promotion of lung cancer screening, more Stage I non-small-cell lung cancers (NSCLC), which usually have favorable prognosis, are currently detected. However, a high percentage of these patients have cancer recurrence after surgery, which reduces the overall survival rate. To achieve optimal efficacy in treating and managing Stage I NSCLC patients, it is important to develop more accurate and reliable biomarkers or tools to predict cancer prognosis. The purpose of this study is to investigate a new quantitative image analysis method to predict the risk of lung cancer recurrence in Stage I NSCLC patients after lung cancer surgery using conventional chest computed tomography (CT) images, and to compare the prediction results with a popular genetic biomarker, namely protein expression of the excision repair cross-complementing 1 (ERCC1) gene. In this study, we developed and tested a new computer-aided detection (CAD) scheme to segment lung tumors and initially compute 35 tumor-related morphologic and texture features from CT images. By applying a machine learning based feature selection method, we identified a set of 8 effective and non-redundant image features. Using these features we trained a naïve Bayesian network based classifier to predict the risk of cancer recurrence. When applied to a test dataset of 79 Stage I NSCLC cases, the computed areas under the ROC curves were 0.77±0.06 and 0.63±0.07 when using the quantitative image based classifier and ERCC1, respectively. The study results demonstrated the feasibility of improving the accuracy of predicting cancer prognosis or recurrence risk using a CAD-based quantitative image analysis method.
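    The classification step described above can be sketched as follows. The pipeline uses a univariate filter as a stand-in for the study's machine-learning feature selection, and random placeholder data rather than the 79-case NSCLC dataset, so the printed AUC is meaningless except as a demonstration of the workflow.

        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.naive_bayes import GaussianNB
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(42)
        X = rng.normal(size=(79, 35))    # 35 tumor morphologic/texture features
        y = rng.integers(0, 2, size=79)  # recurrence (1) vs. no recurrence (0)

        # Reduce 35 features to 8 (as in the study), then fit naive Bayes.
        clf = make_pipeline(SelectKBest(f_classif, k=8), GaussianNB())
        auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
        print(f"cross-validated AUC = {auc.mean():.2f} +/- {auc.std():.2f}")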

  12. Application of a Multiplex Quantitative PCR to Assess Prevalence and Intensity Of Intestinal Parasite Infections in a Controlled Clinical Trial

    PubMed Central

    Llewellyn, Stacey; Inpankaew, Tawin; Nery, Susana Vaz; Gray, Darren J.; Verweij, Jaco J.; Clements, Archie C. A.; Gomes, Santina J.; Traub, Rebecca; McCarthy, James S.

    2016-01-01

    Background Accurate quantitative assessment of infection with soil-transmitted helminths and protozoa is key to the interpretation of epidemiologic studies of these parasites, as well as for monitoring large-scale treatment efficacy and effectiveness studies. As morbidity and transmission of helminth infections are directly related to both the prevalence and intensity of infection, there is particular need for improved techniques for assessing infection intensity for both purposes. The current study aimed to evaluate two multiplex PCR assays for determining prevalence and intensity of intestinal parasite infections, and to compare them to standard microscopy. Methodology/Principal Findings Faecal samples were collected from a total of 680 people, originating from rural communities in Timor-Leste (467 samples) and Cambodia (213 samples). DNA was extracted from stool samples and subjected to two multiplex real-time PCR reactions, the first targeting Necator americanus, Ancylostoma spp., Ascaris spp., and Trichuris trichiura, and the second Entamoeba histolytica, Cryptosporidium spp., Giardia duodenalis, and Strongyloides stercoralis. Samples were also subjected to sodium nitrate flotation for identification and quantification of STH eggs, and to zinc sulphate centrifugal flotation for detection of protozoan parasites. Higher parasite prevalence was detected by multiplex PCR (hookworms 2.9 times higher, Ascaris 1.2, Giardia 1.6), along with superior polyparasitism detection, with this effect magnified as the number of parasites present increased (one: 40.2% vs. 38.1%, two: 30.9% vs. 12.9%, three: 7.6% vs. 0.4%, four: 0.4% vs. 0%). Although all STH-positive samples were low-intensity infections by microscopy as defined by WHO guidelines, the DNA load detected by multiplex PCR suggested higher-intensity infections. Conclusions/Significance Multiplex PCR, in addition to superior sensitivity, enabled more accurate determination of infection intensity for Ascaris, hookworms and

  13. Quantitative risk assessment for human salmonellosis through the consumption of pork sausage in Porto Alegre, Brazil.

    PubMed

    Mürmann, Lisandra; Corbellini, Luis Gustavo; Collor, Alexandre Ávila; Cardoso, Marisa

    2011-04-01

    A quantitative microbiological risk assessment was conducted to evaluate the risk of Salmonella infection to consumers of fresh pork sausages prepared at barbecues in Porto Alegre, Brazil. For the analysis, a 24.4% prevalence of Salmonella-positive pork sausages, with contamination levels between 0.03 and 460 CFU g⁻¹, was assumed. Data on frequency and habits of consumption were obtained from a questionnaire survey of 424 people. A second-order Monte Carlo simulation, separating the uncertain parameter of cooking time from the variable parameters, was run. Of the people interviewed, 87.5% consumed pork sausage, and 85.4% ate it at barbecues. The average risk of salmonellosis per barbecue at a minimum cooking time of 15.6 min (worst-case scenario) was 6.24 × 10⁻⁴, and the risk assessed per month was 1.61 × 10⁻³. Cooking for 19 min would fully inactivate Salmonella in 99.9% of the cases. At this cooking time, the sausage reached a mean internal temperature of 75.7°C. The results of the quantitative microbiological risk assessment revealed that consumption of fresh pork sausage is safe when the cooking time is approximately 19 min, whereas undercooked pork sausage may represent a nonnegligible health risk for consumers.

  14. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    PubMed

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored, as is frequently the case with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two-dimensional (or second-order) Monte Carlo simulations, in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
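    Since mc2d is an R package, the fragment below sketches the same second-order idea in Python: uncertain parameters are drawn in an outer loop and variability in an inner loop, so the two dimensions are never mixed. The exposure model and every distribution here are illustrative assumptions, not the E. coli O157:H7 case study.

        import numpy as np

        rng = np.random.default_rng(1)
        n_unc, n_var = 100, 10_000
        median_risk = np.empty(n_unc)

        for i in range(n_unc):
            # Uncertainty dimension: imperfectly known parameters.
            prevalence = rng.beta(5, 20)          # uncertain contamination prevalence
            r = max(rng.normal(1e-3, 2e-4), 0.0)  # uncertain dose-response slope
            # Variability dimension: person-to-person differences in dose.
            dose = rng.lognormal(mean=2.0, sigma=1.0, size=n_var)
            median_risk[i] = np.median(prevalence * (1 - np.exp(-r * dose)))

        lo, hi = np.percentile(median_risk, [2.5, 97.5])
        print(f"median risk {np.median(median_risk):.2e} "
              f"(95% uncertainty interval {lo:.2e} to {hi:.2e})")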

  15. Quantitative microbial risk assessment for Staphylococcus aureus and Staphylococcus enterotoxin A in raw milk.

    PubMed

    Heidinger, Joelle C; Winter, Carl K; Cullor, James S

    2009-08-01

    A quantitative microbial risk assessment was constructed to determine consumer risk from Staphylococcus aureus and staphylococcal enterotoxin in raw milk. A Monte Carlo simulation model was developed to assess the risk from raw milk consumption using data on levels of S. aureus in milk collected by the University of California-Davis Dairy Food Safety Laboratory from 2,336 California dairies from 2005 to 2008 and using U.S. milk consumption data from the National Health and Nutrition Examination Survey of 2003 and 2004. Four modules were constructed to simulate pathogen growth and staphylococcal enterotoxin A production scenarios to quantify consumer risk levels under various time and temperature storage conditions. The three growth modules predicted that S. aureus levels could surpass the 10⁵ CFU/ml level of concern at the 99.9th or 99.99th percentile of servings and therefore may represent a potential consumer risk. Results obtained from the staphylococcal enterotoxin A production module predicted that exposure at the 99.99th percentile could represent a dose capable of eliciting staphylococcal enterotoxin intoxication in all consumer age groups. This study illustrates the utility of quantitative microbial risk assessments for identifying potential food safety issues.

  16. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study is to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, they have recognized limitations, and new techniques are still needed. Experiments were performed on five closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated from the acquired CT images. Our study shows that HUDRCT correlates well (y = 0.07245 + 0.09963x, r² = 0.898) with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, with an area under the curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.

  17. Impact assessment of abiotic resources in LCA: quantitative comparison of selected characterization models.

    PubMed

    Rørbech, Jakob T; Vadenbo, Carl; Hellweg, Stefanie; Astrup, Thomas F

    2014-10-07

    Resources have received significant attention in recent years, resulting in the development of a wide range of resource depletion indicators within life cycle assessment (LCA). Understanding the differences in the assessment principles used to derive these indicators, and their effects on impact assessment results, is critical for indicator selection and for interpretation of the results. Eleven resource depletion methods were evaluated quantitatively with respect to resource coverage, characterization factors (CF), impact contributions from individual resources, and total impact scores. We included 2247 individual market inventory data sets covering a wide range of societal activities (ecoinvent database v3.0). Log-linear regression analysis was carried out for all pairwise combinations of the 11 methods to identify correlations in CFs (resources) and total impacts (inventory data sets) between methods. Significant differences in resource coverage were observed (9-73 resources), revealing a trade-off between resource coverage and model complexity. High correlation in CFs between methods did not necessarily manifest in high correlation in total impacts, indicating that resource coverage may also be critical for impact assessment results. Although no consistent correlations were observed between methods applying similar assessment models, all methods showed relatively high correlation in their assessment of energy resources. Finally, we classify the existing methods into three groups, according to method focus and modeling approach, to aid method selection within LCA.

  18. Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent of disease in the lungs. Several measures have been introduced to quantify the extent of disease directly from CT data, complementing the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and fractal dimension against visual grade in order to evaluate how well quantitative scores predict radiologist visual scoring of emphysema on low-dose CT scans, and thus which measures can serve as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole-lung scans. In addition, a visual grade of each section was given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved when using quantitative score to predict visual grade, rising to 73% if mild and moderate cases are considered as a single class.
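    Several of the measures named above are simple functions of the Hounsfield-unit histogram. The sketch below computes the emphysema index (fraction of lung voxels below a threshold, conventionally -950 HU), mean lung density, and the 15th percentile over a synthetic stand-in for a masked lung volume; the threshold choice and the synthetic data are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(7)
        lung_hu = rng.normal(-860, 60, size=100_000)  # stand-in for masked lung voxels

        emphysema_index = 100 * np.mean(lung_hu < -950)  # % of voxels below -950 HU
        mean_lung_density = lung_hu.mean()               # mean HU over the lung mask
        perc15 = np.percentile(lung_hu, 15)              # 15th-percentile density

        print(f"EI = {emphysema_index:.1f}%, MLD = {mean_lung_density:.0f} HU, "
              f"Perc15 = {perc15:.0f} HU")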

  19. Quantitative Assessment of the Effects of Oxidants on Antigen-Antibody Binding In Vitro

    PubMed Central

    Han, Shuang; Wang, Guanyu; Xu, Naijin; Liu, Hui

    2016-01-01

    Objective. We quantitatively assessed the influence of oxidants on antigen-antibody binding activity. Methods. We used several immunological detection methods, including precipitation reactions, agglutination reactions, and enzyme immunoassays, to determine antibody activity. The oxidation-reduction potential was measured in order to determine total serum antioxidant capacity. Results. Certain concentrations of oxidants resulted in significant inhibition of antibody activity but had little influence on total serum antioxidant capacity. Conclusions. Oxidants had a significant influence on interactions between antigen and antibody, but minimal effect on the peptide structure of the antibody molecule itself. PMID:27313823

  20. Quantitative Passive Diffusive Sampling for Assessing Soil Vapor Intrusion to Indoor Air

    DTIC Science & Technology

    2012-03-28

    Quantitative Passive Diffusive Sampling for Assessing Soil Vapor Intrusion to Indoor Air. Todd McAlary and Hester Groenevelt, Geosyntec Consultants, 2012. [Only report-documentation-page fields and a fragment of a compound-property table survive extraction; the table listed, per compound (e.g., 1,1,1-trichloroethane, 1,2,4-trimethylbenzene), the concentration at 10⁻⁶ risk (ppb), vapour pressure (atm), and water solubility (g/l).]

  1. A quantitative method for risk assessment of agriculture due to climate change

    NASA Astrophysics Data System (ADS)

    Dong, Zhiqiang; Pan, Zhihua; An, Pingli; Zhang, Jingting; Zhang, Jun; Pan, Yuying; Huang, Lei; Zhao, Hui; Han, Guolin; Wu, Dong; Wang, Jialin; Fan, Dongliang; Gao, Lin; Pan, Xuebiao

    2016-11-01

    Climate change has greatly affected agriculture. Agriculture faces increasing risks owing to its sensitivity and vulnerability to climate change. Scientific assessment of climate change-induced agricultural risks could help in actively dealing with climate change and ensuring food security. However, quantitative assessment of risk is a difficult issue. Here, based on the IPCC assessment reports, a quantitative method for risk assessment of agriculture due to climate change is proposed. Risk is described as the product of the degree of loss and its probability of occurrence; a worked check of this product appears below. The degree of loss can be expressed by the yield change amplitude, and the probability of occurrence can be calculated using the new concept of climate change effect-accumulated frequency (CCEAF). Specific steps of this assessment method are suggested, and the method is shown to be feasible and practical using spring wheat in Wuchuan County, Inner Mongolia, as a test case. The results show that the fluctuation of spring wheat yield increased with the warming and drying climatic trend in Wuchuan County. For the maximum temperature increase (88.3%), the maximum yield decrease and its probability were 3.5 and 64.6%, respectively, giving a risk of 2.2%. For the maximum precipitation decrease (35.2%), the maximum yield decrease and its probability were 14.1 and 56.1%, respectively, giving a risk of 7.9%. For the combined impacts of temperature and precipitation, the maximum yield decrease and its probability were 17.6 and 53.4%, respectively, and the risk increased to 9.4%. If appropriate adaptation strategies are not adopted, both the degree of loss from the negative impacts of multiple climatic factors and its probability of occurrence will increase, and the risk will grow accordingly.
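    The worked check referenced above: the paper's risk definition is a plain product of loss and probability, reproduced here with the reported numbers for the combined temperature and precipitation impacts (no new data).

        loss = 0.176         # maximum yield decrease: 17.6%
        probability = 0.534  # probability of occurrence: 53.4%
        risk = loss * probability
        print(f"risk = {risk:.3f}")  # 0.094, i.e. the 9.4% reported above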

  2. Quantitative assessment of multiple sclerosis using inertial sensors and the TUG test.

    PubMed

    Greene, Barry R; Healy, Michael; Rutledge, Stephanie; Caulfield, Brian; Tubridy, Niall

    2014-01-01

    Multiple sclerosis (MS) is a progressive neurological disorder affecting between 2 and 2.5 million people globally. Tests of mobility form part of clinical assessments of MS. Quantitative assessment of mobility using inertial sensors has the potential to provide objective, longitudinal monitoring of disease progression in patients with MS. The mobility of 21 patients (aged 25-59 years, 8 M, 13 F) diagnosed with relapsing-remitting MS was assessed using the Timed Up and Go (TUG) test while patients wore shank-mounted inertial sensors. This exploratory, cross-sectional study aimed to examine the reliability of quantitative measures derived from inertial sensors during the TUG test in patients with MS. Furthermore, we aimed to determine whether disease status (as measured by the Multiple Sclerosis Impact Scale (MSIS-29) and the Expanded Disability Status Scale (EDSS)) can be predicted from a TUG test with inertial sensors. Reliability analysis showed that 32 of the 52 inertial sensor parameters obtained during the TUG had excellent intrasession reliability, while 11 of 52 showed moderate reliability. Using the inertial sensor parameters, regression models of the EDSS and MSIS-29 scales were derived using the elastic net procedure. Under cross-validation, an elastic net regularized regression model of MSIS yielded a mean square error (MSE) of 334.6 with 25 degrees of freedom (DoF); similarly, an elastic net regularized regression model of EDSS yielded a cross-validated MSE of 1.5 with 6 DoF. Results suggest that inertial sensor parameters derived from MS patients while completing the TUG test are reliable and may have utility in assessing disease state as measured by EDSS and MSIS.
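    The regression step can be sketched with scikit-learn's elastic net; the data below are random placeholders, not the 21-patient TUG dataset, so the printed MSE is not comparable to the values above.

        import numpy as np
        from sklearn.linear_model import ElasticNetCV
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)
        X = rng.normal(size=(21, 52))     # 52 TUG-derived inertial sensor parameters
        y = rng.uniform(0, 100, size=21)  # placeholder MSIS-29 scores

        model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5)
        mse = -cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
        print(f"cross-validated MSE = {mse.mean():.1f}")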

  4. Incentives Increase Participation in Mass Dog Rabies Vaccination Clinics and Methods of Coverage Estimation Are Assessed to Be Accurate.

    PubMed

    Minyoo, Abel B; Steinmetz, Melissa; Czupryna, Anna; Bigambo, Machunde; Mzimbiri, Imam; Powell, George; Gwakisa, Paul; Lankester, Felix

    2015-12-01

    In this study we show that incentives (dog collars and owner wristbands) are effective at increasing owner participation in mass dog rabies vaccination clinics, and we conclude that household questionnaire surveys and the mark-re-sight (transect survey) method for estimating post-vaccination coverage are accurate when all dogs, including puppies, are included. Incentives were distributed during central-point rabies vaccination clinics in northern Tanzania to quantify their effect on owner participation. In villages where incentives were handed out, participation increased, with an average of 34 more dogs being vaccinated. Through economies of scale, this represents a reduction in the cost per dog of $0.47, which is the price threshold below which the cost of the incentive must fall to be economically viable. Additionally, vaccination coverage levels were determined in ten villages through the gold-standard village-wide census technique, as well as through two cheaper and quicker methods (randomized household questionnaire and the transect survey). Cost data were also collected. Both non-gold-standard methods were found to be accurate when puppies were included in the calculations, although the transect survey and the household questionnaire survey over- and under-estimated the coverage, respectively. Given that additional demographic data can be collected through the household questionnaire survey, and that its estimate of coverage is more conservative, we recommend this method. Despite the use of incentives, the average vaccination coverage was below the 70% threshold for eliminating rabies. We discuss the reasons and suggest solutions to improve coverage. Given recent international targets to eliminate rabies, this study provides valuable and timely data to help improve mass dog vaccination programs in Africa and elsewhere.

  5. Feasibility study for image guided kidney surgery: assessment of required intraoperative surface for accurate image to physical space registrations

    NASA Astrophysics Data System (ADS)

    Benincasa, Anne B.; Clements, Logan W.; Herrell, S. Duke; Chang, Sam S.; Cookson, Michael S.; Galloway, Robert L.

    2006-03-01

    Currently, the removal of kidney tumor masses relies only on direct or laparoscopic visualization, resulting in prolonged procedure and recovery times and reduced clear margins. Applying current image guided surgery (IGS) techniques, such as those used in liver cases, to kidney resections (nephrectomies) presents a number of complications. Most notable is the limited field of view of the intraoperative kidney surface, which constrains the ability to obtain a surface delineation that is geometrically descriptive enough to drive a surface-based registration. Two different phantom orientations were used to model the laparoscopic and traditional partial nephrectomy views. For the laparoscopic view, fiducial point sets were compiled from a CT image volume using anatomical features such as the renal artery and vein. For the traditional view, markers attached to the phantom set-up were used as fiducials and targets. The fiducial points were used to perform a point-based registration, which then served as a guide for the surface-based registration. Laser range scanner (LRS) surfaces were registered to each phantom surface using a rigid iterative closest point algorithm. Subsets of each phantom's LRS surface were used in a robustness test to determine how well their registrations predicted the transformation of the entire surface. Results from both orientations suggest that about half of the kidney's surface needs to be obtained intraoperatively for accurate registration between the image surface and the LRS surface, indicating that the obtained kidney surfaces were geometrically descriptive enough to perform accurate registrations. This preliminary work paves the way for further development of kidney IGS systems.
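    The point-based registration that seeds the surface registration reduces to a least-squares rigid alignment, commonly solved in closed form with an SVD (the Kabsch solution); iterative closest point then alternates this step with correspondence search. The sketch below is generic, with hypothetical fiducial coordinates, and is not the authors' implementation.

        import numpy as np

        def rigid_register(moving, fixed):
            """Return R, t minimizing sum ||R @ m_i + t - f_i||^2 over rigid motions."""
            mc, fc = moving.mean(axis=0), fixed.mean(axis=0)
            H = (moving - mc).T @ (fixed - fc)  # 3x3 cross-covariance matrix
            U, _, Vt = np.linalg.svd(H)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
            R = Vt.T @ D @ U.T
            return R, fc - R @ mc

        # Hypothetical fiducials (e.g., renal artery/vein landmarks) in image space,
        # and a translated copy standing in for their physical-space positions.
        rng = np.random.default_rng(5)
        fixed = rng.normal(size=(4, 3))
        moving = fixed + np.array([5.0, -2.0, 1.0])
        R, t = rigid_register(moving, fixed)
        print(np.allclose((R @ moving.T).T + t, fixed))  # True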

  6. Quantitative MRI and strength measurements in the assessment of muscle quality in Duchenne muscular dystrophy.

    PubMed

    Wokke, B H; van den Bergen, J C; Versluis, M J; Niks, E H; Milles, J; Webb, A G; van Zwet, E W; Aartsma-Rus, A; Verschuuren, J J; Kan, H E

    2014-05-01

    The purpose of this study was to assess leg muscle quality and give a detailed description of leg muscle involvement in a series of Duchenne muscular dystrophy patients using quantitative MRI and strength measurements. Fatty infiltration, as well as total and contractile (not fatty infiltrated) cross sectional areas of various leg muscles were determined in 16 Duchenne patients and 11 controls (aged 8-15). To determine specific muscle strength, four leg muscle groups (quadriceps femoris, hamstrings, anterior tibialis and triceps surae) were measured and related to the amount of contractile tissue. In patients, the quadriceps femoris showed decreased total and contractile cross sectional area, attributable to muscle atrophy. The total, but not the contractile, cross sectional area of the triceps surae was increased in patients, corresponding to hypertrophy. Specific strength decreased in all four muscle groups of Duchenne patients, indicating reduced muscle quality. This suggests that muscle hypertrophy and fatty infiltration are two distinct pathological processes, differing between muscle groups. Additionally, the quality of remaining muscle fibers is severely reduced in the legs of Duchenne patients. The combination of quantitative MRI and quantitative muscle testing could be a valuable outcome parameter in longitudinal studies and in the follow-up of therapeutic effects.

  7. Quantitative breast MRI radiomics for cancer risk assessment and the monitoring of high-risk populations

    NASA Astrophysics Data System (ADS)

    Mendel, Kayla R.; Li, Hui; Giger, Maryellen L.

    2016-03-01

    Breast density is routinely assessed qualitatively in screening mammography. However, it is challenging to quantitatively determine a 3D density from a 2D image such as a mammogram. Furthermore, dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) is used increasingly frequently in the screening of high-risk populations. The purpose of our study is to segment parenchyma and to quantitatively determine volumetric breast density on pre-contrast axial DCE-MRI images (i.e., non-contrast) using a semi-automated quantitative approach. In this study, we retrospectively examined 3D DCE-MRI images acquired for breast cancer screening of a high-risk population. We analyzed 66 cases with ages between 28 and 76 (mean 48.8, standard deviation 10.8). DCE-MRIs were obtained on a Philips 3.0 T scanner. Our semi-automated DCE-MRI algorithm includes: (a) segmentation of breast tissue from non-breast tissue using fuzzy c-means clustering, (b) separation of dense and fatty tissues using Otsu's method, and (c) calculation of volumetric density as the ratio of dense voxels to total breast voxels. We examined the relationship between pre-contrast DCE-MRI density and clinical BI-RADS density obtained from radiology reports, and obtained a statistically significant correlation [Spearman ρ-value of 0.66 (p < 0.0001)]. Within precision medicine, our method may be useful for monitoring high-risk populations.
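    Steps (b) and (c) of the algorithm condense to a threshold and a ratio; the sketch below applies Otsu's method to a synthetic bimodal sample of "breast voxels". Step (a), the fuzzy c-means breast segmentation, is omitted, and the intensities are invented for illustration.

        import numpy as np
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(11)
        # Bimodal stand-in for pre-contrast MRI intensities inside the breast mask.
        breast_voxels = np.concatenate([rng.normal(200, 30, 7000),    # fatty
                                        rng.normal(500, 50, 3000)])   # dense

        t = threshold_otsu(breast_voxels)
        density = 100 * np.mean(breast_voxels > t)  # % dense of total breast voxels
        print(f"Otsu threshold = {t:.0f}, volumetric density = {density:.1f}%")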

  8. The Quantitative Ideas and Methods in Assessment of Four Properties of Chinese Medicinal Herbs.

    PubMed

    Fu, Jialei; Pang, Jingxiang; Zhao, Xiaolei; Han, Jinxiang

    2015-04-01

    The purpose of this review is to summarize and reflect on the current status and problems of research on the properties of Chinese medicinal herbs. Hot, warm, cold, and cool are the four properties/natures of Chinese medicinal herbs; they are defined based on the interaction between the herbs and the human body. How to quantitatively assess the therapeutic effect of Chinese medicinal herbs within the theoretical system of Traditional Chinese Medicine (TCM) remains a challenge. Previous studies approaching the topic from several perspectives are presented, and their results and problems discussed. New ideas based on the technology of biophoton radiation detection are proposed. With the development of biophoton detection technology, the detection and characterization of human biophoton emission have led to potential applications in TCM. The possibility of using a biophoton analysis system to study the interaction of Chinese medicinal herbs with the human body and to quantitatively determine the effect of the herbs is entirely consistent with the holistic concept of TCM theory. The statistical entropy of electromagnetic radiation from biological systems can characterize the four properties of Chinese medicinal herbs, and the spectrum can characterize their meridian tropism. Therefore, we hypothesize that, using a biophoton analysis system, the four properties and meridian tropism of Chinese medicinal herbs can be quantitatively expressed.

  9. Quantitative sonoelastography for the in vivo assessment of skeletal muscle viscoelasticity

    NASA Astrophysics Data System (ADS)

    Hoyt, Kenneth; Kneezel, Timothy; Castaneda, Benjamin; Parker, Kevin J.

    2008-08-01

    A novel quantitative sonoelastography technique for assessing the viscoelastic properties of skeletal muscle tissue was developed. Slowly propagating shear wave interference patterns (termed crawling waves) were generated using a two-source configuration vibrating normal to the surface. Theoretical models predict the crawling wave displacement fields, and these predictions were validated through phantom studies. In experiments, a viscoelastic model was fit to dispersive shear wave speed sonoelastographic data using nonlinear least-squares techniques to determine frequency-independent shear modulus and viscosity estimates. Shear modulus estimates derived using the viscoelastic model agreed with those obtained by mechanical testing on phantom samples. Preliminary sonoelastographic data acquired in healthy human skeletal muscles confirm that high-quality quantitative elasticity data can be acquired in vivo. Studies on relaxed muscle indicate discernible differences in both shear modulus and viscosity estimates between different skeletal muscle groups. Investigations into the dynamic viscoelastic properties of (healthy) human skeletal muscles revealed that voluntarily contracted muscles exhibit considerable increases in both shear modulus and viscosity estimates compared with the relaxed state. Overall, preliminary results are encouraging, and quantitative sonoelastography may prove clinically feasible for in vivo characterization of the dynamic viscoelastic properties of human skeletal muscle.
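    The model-fitting step can be sketched as follows, assuming a Kelvin-Voigt material (a common choice in crawling-wave work; the abstract does not name the exact model). Synthetic dispersion data are generated from known parameters and refit with nonlinear least squares.

        import numpy as np
        from scipy.optimize import curve_fit

        RHO = 1000.0  # assumed tissue density, kg/m^3

        def voigt_speed(f, mu, eta):
            """Kelvin-Voigt shear wave speed at frequency f (Hz); mu in Pa, eta in Pa*s."""
            w = 2 * np.pi * f
            m = np.sqrt(mu**2 + (w * eta)**2)
            return np.sqrt(2 * m**2 / (RHO * (mu + m)))

        freqs = np.linspace(100, 400, 7)  # vibration frequencies (Hz)
        rng = np.random.default_rng(2)
        speeds = voigt_speed(freqs, 4e3, 2.0) * (1 + 0.01 * rng.normal(size=7))

        (mu_hat, eta_hat), _ = curve_fit(voigt_speed, freqs, speeds, p0=[3e3, 1.0])
        print(f"shear modulus ~ {mu_hat / 1e3:.1f} kPa, viscosity ~ {eta_hat:.1f} Pa*s")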

  10. Automated quantitative assessment of three-dimensional bioprinted hydrogel scaffolds using optical coherence tomography

    PubMed Central

    Wang, Ling; Xu, Mingen; Zhang, LieLie; Zhou, QingQing; Luo, Li

    2016-01-01

    Reconstructing and quantitatively assessing the internal architecture of opaque three-dimensional (3D) bioprinted hydrogel scaffolds is difficult but vital to the improvement of 3D bioprinting techniques and to the fabrication of functional engineered tissues. In this study, swept-source optical coherence tomography was applied to acquire high-resolution images of hydrogel scaffolds. Novel 3D gelatin/alginate hydrogel scaffolds with six different representative architectures were fabricated using our 3D bioprinting system. Both the scaffold material networks and the interconnected flow channel networks were reconstructed through volume rendering and binarisation processing to provide a 3D volumetric view. An image analysis algorithm was developed based on automatic selection of the spatially-isolated region-of-interest. Via this algorithm, the spatially-resolved morphological parameters, including pore size, pore shape, strut size, surface area, porosity, and interconnectivity, were quantified precisely. Fabrication defects and differences between the designed and as-produced scaffolds were clearly identified in both 2D and 3D, and the locations and dimensions of each fabrication defect were defined. We conclude that this method will be a key tool for non-destructive and quantitative characterization, design optimisation and fabrication refinement of 3D bioprinted hydrogel scaffolds. Furthermore, this method enables investigation of the quantitative relationship between scaffold structure and biological outcome. PMID:27231597
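    Two of the quantified parameters reduce to voxel counting once the OCT volume is binarised; the sketch below computes porosity and an interconnectivity proxy (the fraction of void space lying in the largest connected void component) on a synthetic volume, which stands in for a real binarised scan.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(9)
        material = rng.random((64, 64, 64)) > 0.55  # True = hydrogel strut voxels

        porosity = 1.0 - material.mean()            # void (flow channel) fraction

        labels, n_components = ndimage.label(~material)  # connected void regions
        largest = np.bincount(labels.ravel())[1:].max()  # size of biggest void region
        interconnectivity = largest / (~material).sum()

        print(f"porosity = {porosity:.2f}, interconnectivity = {interconnectivity:.2f}")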

  11. Quantitative photoacoustic characterization of blood clot in blood: A mechanobiological assessment through spectral information

    NASA Astrophysics Data System (ADS)

    Biswas, Deblina; Vasudevan, Srivathsan; Chen, George C. K.; Sharma, Norman

    2017-02-01

    Blood clots (thrombi) can form as a result of hyper-coagulation of blood. Thrombi moving through blood vessels can impede blood flow, a key factor in many critical diseases such as deep vein thrombosis and heart attack. Understanding the mechanical properties of clot formation is vital for assessing the severity of thrombosis and for proper treatment, yet the biomechanics of thrombi is little known to clinicians and not well investigated. Photoacoustic (PA) spectral response, a non-invasive technique, is proposed to investigate the mechanism of blood clot formation through elasticity and also to differentiate clots from blood. A distinct shift (increase in frequency) of the dominant frequency of the PA response during clot formation is reported. In addition, quantitative differentiation of blood clots from blood has been achieved through parameters such as the dominant frequency and spectral energy of the PA spectral response. A nearly twofold increase in dominant frequency was found in blood clots compared to blood. Significant changes in spectral energy also help to quantitatively differentiate clots from the surrounding blood. Our results reveal that the increase in density during clot formation is reflected in the PA spectral response, a significant step towards understanding the mechanobiology of thrombus formation. Hence, the proposed tool, in addition to detecting thrombus formation, could reveal mechanical properties of the sample through quantitative photoacoustic spectral parameters.

  12. Monitoring and quantitative assessment of tumor burden using in vivo bioluminescence imaging

    NASA Astrophysics Data System (ADS)

    Chen, Chia-Chi; Hwang, Jeng-Jong; Ting, Gann; Tseng, Yun-Long; Wang, Shyh-Jen; Whang-Peng, Jaqueline

    2007-02-01

    In vivo bioluminescence imaging (BLI) is a sensitive imaging modality that is rapid and accessible, and may comprise an ideal tool for evaluating tumor growth. In this study, the kinetics of tumor growth was assessed in a C26 colon carcinoma-bearing BALB/c mouse model. BLI was used to noninvasively quantitate the growth of subcutaneous tumors established from C26 cells genetically engineered to stably express firefly luciferase and herpes simplex virus type-1 thymidine kinase (C26/tk-luc). A good correlation (R² = 0.998) between photon emission and cell number was found in vitro. Tumor burden and tumor volume were monitored in vivo over time by quantitation of photon emission using a Xenogen IVIS 50 and by standard external caliper measurement, respectively. At various time intervals, tumor-bearing mice were imaged to determine the correlation of in vivo BLI to tumor volume; a correlation was observed when tumor volume was smaller than 1000 mm³ (R² = 0.907). γ-Scintigraphy combined with [¹³¹I]FIAU was used as an additional imaging modality to verify these results. In conclusion, this study showed that bioluminescence imaging is a powerful and quantitative tool for directly monitoring tumor growth in vivo. The dual-reporter-gene transfected tumor-bearing animal model can be applied in evaluating the efficacy of newly developed anti-cancer drugs.

  13. Lung extraction, lobe segmentation and hierarchical region assessment for quantitative analysis on high resolution computed tomography images.

    PubMed

    Ross, James C; Estépar, Raúl San José; Díaz, Alejandro; Westin, Carl-Fredrik; Kikinis, Ron; Silverman, Edwin K; Washko, George R

    2009-01-01

    Regional assessment of lung disease (such as chronic obstructive pulmonary disease) is a critical component of accurate patient diagnosis, and software tools that enable such analysis are also important for clinical research studies. In this work, we present an image segmentation and data representation framework that enables quantitative analysis specific to different lung regions on high resolution computed tomography (HRCT) datasets. We present an offline, fully automatic image processing chain that generates airway, vessel, and lung mask segmentations in which the left and right lung are delineated. We describe a novel lung lobe segmentation tool that produces reproducible results with minimal user interaction. A usability study performed across twenty datasets (inspiratory and expiratory exams including a range of disease states) demonstrates the tool's ability to generate results within five to seven minutes on average. We also describe a data representation scheme that involves compact encoding of label maps such that both "regions" (such as lung lobes) and "types" (such as emphysematous parenchyma) can be simultaneously represented at a given location in the HRCT.

  14. Assessment of diffuse coronary artery disease by quantitative analysis of coronary morphology based upon 3-D reconstruction from biplane angiograms

    SciTech Connect

    Wahle, A.; Wellnhofer, E.; Mugaragu, I.; Sauer, H.U.; Oswald, H.; Fleck, E.

    1995-06-01

    Quantitative evaluations of coronary vessel systems are of increasing importance in cardiovascular diagnosis, therapy planning, and surgical verification. Whereas local evaluations, such as stenosis analysis, are already available with sufficient accuracy, global evaluations of vessel segments or vessel subsystems are not yet common. Targeting in particular the diagnosis of diffuse coronary artery disease, the authors combined a 3-D reconstruction system operating on biplane angiograms with a length/volume calculation. The 3-D reconstruction results in a 3-D model of the coronary vessel system, consisting of the vessel skeleton and a discrete number of contours. To obtain as accurate a model as possible, the authors focused on exact geometry determination; several algorithms for calculating missing geometric parameters and correcting remaining geometry errors were implemented and verified. The length/volume evaluation can be performed on single vessel segments, on a set of segments, or on subtrees. A volume model based on generalized elliptical conic sections is created for the selected segments, and the volumes and lengths (measured along the vessel course) of those elements are summed. In this way, the morphological parameters of a vessel subsystem can be set in relation to the parameters of the proximal segment supplying it. These relations allow objective assessment of diffuse coronary artery disease.

  15. Safety evaluation of disposable baby diapers using principles of quantitative risk assessment.

    PubMed

    Rai, Prashant; Lee, Byung-Mu; Liu, Tsung-Yun; Yuhui, Qin; Krause, Edburga; Marsman, Daniel S; Felter, Susan

    2009-01-01

    Baby diapers are complex products consisting of multiple layers of materials, most of which are not in direct contact with the skin. The safety profile of a diaper is determined by the biological properties of the individual components and the extent to which the baby is exposed to each component during use. Rigorous evaluation of the toxicological profile and realistic exposure conditions of each material is important to ensure the overall safety of the diaper under normal and foreseeable use conditions. Quantitative risk assessment (QRA) principles may be applied to the safety assessment of diapers and similar products. Exposure to component materials is determined by considering (1) the conditions of product use, (2) the degree to which individual layers of the product are in contact with the skin during use, and (3) the extent to which some components may be extracted by urine and delivered to the skin. This assessment of potential exposure is then combined with data from standard safety assessments of the components to determine the margin of safety (MOS). This study examined the application of QRA to the safety evaluation of baby diapers, including risk assessments of some diaper ingredient chemicals for which the establishment of acceptable and safe exposure levels was demonstrated.

  16. Quantitative assessment of groundwater vulnerability using index system and transport simulation, Huangshuihe catchment, China.

    PubMed

    Yu, Cheng; Yao, Yingying; Hayes, Gregory; Zhang, Baoxiang; Zheng, Chunmiao

    2010-11-15

    Groundwater vulnerability assessment has become an increasingly important environmental management tool. Existing vulnerability assessment approaches are mostly index systems, which have significant disadvantages; quantitative studies of vulnerability indicators grounded in objective physical process modeling are needed. In this study, we performed vulnerability assessment in the Huangshuihe catchment in Shandong province, China, using both contaminant transport simulations and an index system approach. The transit time of 75% of a hypothetically injected contaminant concentration was taken as the vulnerability indicator. First, we collected field data for the Huangshuihe catchment, and the catchment was divided into 34 sub-areas that could each be treated as a transport sub-model. Next, we constructed a Hydrus1D transport model of the catchment, with different input values for the different sub-areas. Thirdly, we used Monte Carlo simulation to augment the collected data and performed the vulnerability assessment using the statistics of the contaminant transit time as the indicator. Finally, for comparison with the transport-simulation assessment, we applied two index systems to the Huangshuihe catchment: the DRASTIC system, and a system we tentatively constructed by examining the relationships between transit time and the input parameters through simple changes of the input values. The comparison between the two index systems and the transport simulation approach suggested partial validation of DRASTIC, while the construction of the new tentative index system was an attempt to build index approaches on physical process simulation.

  17. [Quantitative diagnosis of hypernasality in cleft lip and palate patients by computerized nasal quality assessment].

    PubMed

    Bressmann, T; Sader, R; Awan, S; Busch, R; Zeilhofer, H F; Horch, H H

    1999-05-01

    In patients with cleft lip and palate (CLP), the assessment of velopharyngeal morphology and function, and the quantitative analysis of the perceptual consequences of velopharyngeal insufficiency, are of major importance for the effective planning of velopharyngoplasties for speech improvement. The NasalView, a new instrument for the objective assessment of rhinophonia, is presented. The NasalView measures nasalance, the relative sound pressure level of the nasal signal in speech, expressed as a percentage. To evaluate the effectiveness of the computerised measurement of nasalance, 156 patients with surgically treated CLP were examined. The NasalView differentiated with high sensitivity and specificity between patients with normal nasal resonance and patients with varying degrees of hypernasality. To illustrate the importance of the NasalView in deciding on a velopharyngoplasty, a single case is presented.

  18. Intentional Movement Performance Ability (IMPA): a method for robot-aided quantitative assessment of motor function.

    PubMed

    Shin, Sung Yul; Kim, Jung Yoon; Lee, Sanghyeop; Lee, Junwon; Kim, Seung-Jong; Kim, ChangHwan

    2013-06-01

    The purpose of this paper is to propose a new assessment method for evaluating the motor function of patients who are suffering from physical weakness after stroke, incomplete spinal cord injury (iSCI) or other diseases. In this work, we use a robotic device to obtain information on the interaction that occurs between patient and robot, and use it as a measure for assessing the patients. The Intentional Movement Performance Ability (IMPA) is defined as the root mean square of the interactive torque while the subject performs a given periodic movement with the robot. IMPA is proposed to quantitatively determine the level of the subject's impaired motor function. The method is indirectly tested by asking healthy subjects to lift a barbell to disturb their motor function. The experimental results show that the IMPA has potential for providing proper information on the subject's motor function level.
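    The IMPA definition above is simply a root mean square over the sampled interaction torque; a minimal sketch with a hypothetical torque trace:

        import numpy as np

        rng = np.random.default_rng(4)
        tau = rng.normal(0.0, 1.5, size=2000)  # interaction torque samples (N*m)
        impa = np.sqrt(np.mean(tau**2))        # root mean square of the torque
        print(f"IMPA = {impa:.2f} N*m")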

  19. Survey of Quantitative Research Metrics to Assess Pilot Performance in Upset Recovery

    NASA Technical Reports Server (NTRS)

    Le Vie, Lisa R.

    2016-01-01

    In-flight loss of control is the primary cause of fatal commercial jet accidents worldwide. The National Aeronautics and Space Administration (NASA) conducted a literature review to determine and identify quantitative standards for assessing upset recovery performance. This review covers current recovery procedures for both military and commercial aviation and includes the metrics researchers use to assess aircraft recovery performance. Metrics include time to first input, recognition time, and recovery time, and whether that input was correct or incorrect. Other metrics include the state of the autopilot and autothrottle, control wheel/sidestick movement resulting in pitch and roll, and inputs to the throttle and rudder. In addition, airplane state measures, such as roll reversals, altitude loss/gain, maximum vertical speed, maximum/minimum air speed, maximum bank angle and maximum g loading, are reviewed as well.

  20. Quantitative assessment of Cerenkov luminescence for radioguided brain tumor resection surgery.

    PubMed

    Klein, Justin S; Mitchell, Gregory; Cherry, Simon

    2017-03-13

    Cerenkov luminescence imaging (CLI) is a developing imaging modality that detects radiolabeled molecules via the visible light emitted during the radioactive decay process. We used a Monte Carlo based computer simulation to quantitatively investigate CLI, compared with direct detection of the ionizing radiation itself, as an intraoperative imaging tool for assessment of brain tumor margins. Our brain tumor model consisted of a 1 mm spherical tumor remnant embedded up to 5 mm below the surface of normal brain tissue. Tumor-to-background contrasts ranging from 2:1 to 10:1 were considered. We quantified all decay signals (e±, gamma photons, Cerenkov photons) reaching the surface of the brain volume. CLI proved to be the most sensitive method for detecting the tumor volume in both imaging and non-imaging strategies, as assessed by contrast-to-noise ratio and by the receiver operating characteristic output of a channelized Hotelling observer.

  1. A 3D assessment tool for accurate volume measurement for monitoring the evolution of cutaneous leishmaniasis wounds.

    PubMed

    Zvietcovich, Fernando; Castañeda, Benjamin; Valencia, Braulio; Llanos-Cuentas, Alejandro

    2012-01-01

    Clinical assessment and outcome metrics are serious weaknesses identified in systematic reviews of cutaneous leishmaniasis wounds. Methods with high accuracy and low variability are required to standardize study outcomes in clinical trials. This work presents a precise, complete and noncontact 3D assessment tool for monitoring the evolution of cutaneous leishmaniasis (CL) wounds based on a 3D laser scanner and computer vision algorithms. A 3D mesh of the wound is obtained with a commercial 3D laser scanner. Then, a semi-automatic segmentation using active contours is performed to separate the ulcer from the healthy skin. Finally, metrics of volume, area, perimeter and depth are obtained from the mesh. Traditional manual 2D and 3D measurements are obtained as a gold standard. Experiments on phantoms and real CL wounds suggest that the proposed 3D assessment tool provides higher accuracy (error <2%) and precision (error <4%) than conventional manual methods (precision error <35%). This 3D assessment tool provides high-accuracy metrics which deserve more formal prospective study.

  2. The Development of Multiple-Choice Items Consistent with the AP Chemistry Curriculum Framework to More Accurately Assess Deeper Understanding

    ERIC Educational Resources Information Center

    Domyancich, John M.

    2014-01-01

    Multiple-choice questions are an important part of large-scale summative assessments, such as the advanced placement (AP) chemistry exam. However, past AP chemistry exam items often lacked the ability to test conceptual understanding and higher-order cognitive skills. The redesigned AP chemistry exam shows a distinctive shift in item types toward…

  3. Disc Degeneration Assessed by Quantitative T2* (T2 star) Correlated with Functional Lumbar Mechanics

    PubMed Central

    Ellingson, Arin M.; Mehta, Hitesh; Polly, David W.; Ellermann, Jutta; Nuckley, David J.

    2013-01-01

    Study Design Experimental correlation study designed to quantify features of disc health, including signal intensity and the distinction between the annulus fibrosus (AF) and nucleus pulposus (NP), with T2* magnetic resonance imaging (MRI), and to correlate these with the functional mechanics of the corresponding motion segments. Objective Establish the relationship between disc health assessed by quantitative T2* MRI and functional lumbar mechanics. Summary of Background Data Degeneration leads to altered biochemistry in the disc, affecting its mechanical competence. Routine clinical MRI sequences are not adequate for detecting early degenerative changes and fail to correlate with pain or improve patient stratification. Quantitative T2* relaxation time mapping probes biochemical features and may offer more sensitivity in assessing disc degeneration. Methods Cadaveric lumbar spines were imaged using quantitative T2* mapping, as well as conventional T2-weighted MRI sequences. Discs were graded on the Pfirrmann scale, and features of disc health, including signal intensity (T2* Intensity Area) and the distinction between the AF and NP (Transition Zone Slope), were quantified by T2*. Each motion segment was subjected to pure moment bending to determine range of motion (ROM), neutral zone (NZ), and bending stiffness. Results T2* Intensity Area and Transition Zone Slope were significantly correlated with flexion ROM (p=0.015; p=0.002), the ratio of NZ/ROM (p=0.010; p=0.028), and stiffness (p=0.044; p=0.026), as well as lateral bending NZ/ROM (p=0.005; p=0.010) and stiffness (p=0.022; p=0.029). T2* Intensity Area was also correlated with lateral bending ROM (p=0.023). Pfirrmann grade was only correlated with lateral bending NZ/ROM (p=0.001) and stiffness (p=0.007). Conclusions T2* mapping is a sensitive quantitative method capable of detecting changes associated with disc degeneration. Features of disc health quantified with T2* predicted altered functional mechanics of the lumbar spine better than

  4. Quantitative microbial risk assessment for Staphylococcus aureus in natural and processed cheese in Korea.

    PubMed

    Lee, Heeyoung; Kim, Kyunga; Choi, Kyoung-Hee; Yoon, Yohan

    2015-09-01

    This study quantitatively assessed the microbial risk of Staphylococcus aureus in cheese in Korea. The quantitative microbial risk assessment was carried out for natural and processed cheese from factory to consumption. Hazards for S. aureus in cheese were identified through the literature. For exposure assessment, the levels of S. aureus contamination in cheeses were evaluated, and the growth of S. aureus was predicted by predictive models at the surveyed temperatures and times of cheese processing and distribution. For hazard characterization, a dose-response model for S. aureus was found, and the model was used to estimate the risk of illness. With these data, simulation models were prepared with @RISK (Palisade Corp., Ithaca, NY) to estimate the risk of illness per person per day in risk characterization. Staphylococcus aureus cell counts on cheese samples from factories and markets were below detection limits (0.30-0.45 log cfu/g), and a pert distribution showed that the mean temperature at markets was 6.63°C. An exponential model [P = 1 − exp(−7.64 × 10⁻⁸ × N), where N = dose] was deemed appropriate for hazard characterization. Mean temperature of home storage was 4.02°C (log-logistic distribution). The results of risk characterization for S. aureus in natural and processed cheese showed that the mean probability of illness per person per day was higher for processed cheese (mean: 2.24 × 10⁻⁹; maximum: 7.97 × 10⁻⁶) than for natural cheese (mean: 7.84 × 10⁻¹⁰; maximum: 2.32 × 10⁻⁶). These results indicate that the risk of S. aureus-related foodborne illness due to cheese consumption can be considered low under the present conditions in Korea. In addition, the stochastic risk assessment model developed in this study can be useful in establishing microbial criteria for S. aureus in cheese.
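
    The exponential dose-response step is compact enough to reproduce. Below is a minimal Monte Carlo risk-characterization sketch: only the parameter r = 7.64 × 10⁻⁸ comes from the record above; the lognormal dose distribution stands in, purely for illustration, for the paper's full @RISK exposure chain.

```python
import numpy as np

rng = np.random.default_rng(0)
R = 7.64e-8        # exponential dose-response parameter from the record
N_ITER = 100_000

# Illustrative dose distribution (cells ingested per serving); the actual
# model chains contamination level, predicted growth, and serving size.
dose = 10.0 ** rng.normal(loc=1.0, scale=1.0, size=N_ITER)

p_ill = 1.0 - np.exp(-R * dose)    # P = 1 - exp(-r * N)
print(f"mean risk/serving: {p_ill.mean():.2e}, max: {p_ill.max():.2e}")
```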

  5. Quantitative photoacoustic assessment of ex-vivo lymph nodes of colorectal cancer patients

    NASA Astrophysics Data System (ADS)

    Sampathkumar, Ashwin; Mamou, Jonathan; Saegusa-Beercroft, Emi; Chitnis, Parag V.; Machi, Junji; Feleppa, Ernest J.

    2015-03-01

    Staging of cancers and selection of appropriate treatment require histological examination of multiple dissected lymph nodes (LNs) per patient, so that a staggering number of nodes require histopathological examination, and the finite resources of pathology facilities create a severe processing bottleneck. Histologically examining the entire 3D volume of every dissected node is not feasible; therefore, only the central region of each node is examined histologically, which results in severe sampling limitations. In this work, we assess the feasibility of using quantitative photoacoustics (QPA) to overcome the limitations imposed by current procedures and eliminate the resulting undersampling in node assessments. QPA is emerging as a new hybrid modality that assesses tissue properties and classifies tissue type based on multiple estimates derived from spectrum analysis of photoacoustic (PA) radiofrequency (RF) data and from statistical analysis of envelope-signal data derived from the RF signals. Our study seeks to use QPA to distinguish cancerous from non-cancerous regions of dissected LNs and hence serve as a reliable means of imaging and detecting small but clinically significant cancerous foci that would be missed by current methods. Dissected lymph nodes were placed in a water bath, and PA signals were generated using a wavelength-tunable (680-950 nm) laser. A 26-MHz, f/2 transducer was used to sense the PA signals. We present an overview of our experimental setup; provide a statistical analysis of multi-wavelength classification parameters (midband fit, slope, intercept) obtained from the PA signal spectrum generated in the LNs; and compare QPA performance with our established quantitative ultrasound (QUS) techniques in distinguishing metastatic from non-cancerous tissue in dissected LNs. QPA-QUS methods offer a novel general means of tissue typing and evaluation in a broad range of disease-assessment applications, e.g., cardiac, intravascular
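
    The classification parameters named here are conventionally obtained from a straight-line fit to the calibrated power spectrum (in dB) over the transducer's usable band, with midband fit being the fitted value at the band center. A minimal sketch under those assumptions; the analysis band is a guess tied to the 26-MHz transducer, not a value from the record.

```python
import numpy as np

def spectral_parameters(freq_mhz, power_db, band=(13.0, 39.0)):
    """Slope, intercept, and midband fit of a calibrated power spectrum.

    Linear regression of the spectrum (dB) over the analysis band;
    midband fit is the regression value at the band's center frequency.
    """
    lo, hi = band
    sel = (freq_mhz >= lo) & (freq_mhz <= hi)
    slope, intercept = np.polyfit(freq_mhz[sel], power_db[sel], 1)
    midband_fit = slope * (lo + hi) / 2.0 + intercept
    return slope, intercept, midband_fit
```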

  6. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation

    NASA Astrophysics Data System (ADS)

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K.; Xu, Ronald X.

    2015-03-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO2). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO2 reconstruction. Based on the forward simulation results, a second derivative ratio (SDR) parameter is derived as a function of cutaneous tissue StO2. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO2 can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO2 imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO2 at high temporal resolution with reduced measurement artifacts induced by different skin conditions, in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO2 imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution.

  7. Second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygenation.

    PubMed

    Huang, Jiwei; Zhang, Shiwu; Gnyawali, Surya; Sen, Chandan K; Xu, Ronald X

    2015-03-01

    We report a second derivative multispectral algorithm for quantitative assessment of cutaneous tissue oxygen saturation (StO₂). The algorithm is based on a forward model of light transport in multilayered skin tissue and an inverse algorithm for StO₂ reconstruction. Based on the forward simulation results, a second derivative ratio (SDR) parameter is derived as a function of cutaneous tissue StO₂. The SDR function is optimized at a wavelength set of 544, 552, 568, 576, 592, and 600 nm so that cutaneous tissue StO₂ can be derived with minimal artifacts from blood concentration, tissue scattering, and melanin concentration. The proposed multispectral StO₂ imaging algorithm is verified in both benchtop and in vivo experiments. The experimental results show that the proposed multispectral imaging algorithm is able to map cutaneous tissue StO₂ at high temporal resolution with reduced measurement artifacts induced by different skin conditions, in comparison with three other commercial tissue oxygen measurement systems. These results indicate that the multispectral StO₂ imaging technique has the potential for noninvasive and quantitative assessment of skin tissue oxygenation with a high temporal resolution.
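
    A second derivative at unevenly spaced wavelengths (note the 8/16 nm steps in the set above) requires the three-point unequal-spacing formula. The sketch below shows that computation; the grouping of the six wavelengths into two derivative triplets and the ratio definition are our assumptions, not the paper's published SDR.

```python
import numpy as np

# Wavelength set from the two records above (nm).
WL = np.array([544.0, 552.0, 568.0, 576.0, 592.0, 600.0])

def second_derivative(x, y):
    """Three-point second derivative for unevenly spaced samples."""
    h1, h2 = x[1] - x[0], x[2] - x[1]
    return 2.0 * (h2 * y[0] - (h1 + h2) * y[1] + h1 * y[2]) / (h1 * h2 * (h1 + h2))

def sdr(absorbance):
    """Second derivative ratio from absorbance at the six wavelengths in WL.

    Splitting WL into a low and a high triplet is an illustrative guess.
    """
    d_low = second_derivative(WL[:3], absorbance[:3])
    d_high = second_derivative(WL[3:], absorbance[3:])
    return d_low / d_high
```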

  8. Semi-quantitative assessment of pulmonary perfusion in children using dynamic contrast-enhanced MRI

    NASA Astrophysics Data System (ADS)

    Fetita, Catalin; Thong, William E.; Ou, Phalla

    2013-03-01

    This paper addresses the semi-quantitative assessment of pulmonary perfusion acquired from dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) in a study population mainly composed of children with pulmonary malformations. The proposed automatic analysis approach is based on the indicator-dilution theory introduced in 1954. First, a robust method is developed to segment the pulmonary artery and the lungs from anatomical MRI data, exploiting 2D and 3D mathematical morphology operators. Second, the time-dependent contrast signal of the lung regions is deconvolved by the arterial input function for the assessment of the local hemodynamic system parameters, i.e., mean transit time, pulmonary blood volume and pulmonary blood flow. The discrete deconvolution is implemented here with a truncated singular value decomposition (tSVD) method. Parametric images for the entire lungs are generated as additional elements for diagnosis and quantitative follow-up. The preliminary results attest to the feasibility of perfusion quantification in pulmonary DCE-MRI and open an interesting alternative to scintigraphy for this type of evaluation, to be considered at least as a preliminary step in diagnosis, given the wide availability of the technique and its non-invasive character.
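
    The tSVD deconvolution step follows a standard recipe: discretize the convolution with the arterial input function as a lower-triangular matrix, invert via SVD, and zero the small singular values that amplify noise. A minimal sketch, assuming a simple relative threshold as the truncation rule (the paper's exact regularization settings are not given in the record):

```python
import numpy as np

def tsvd_deconvolve(aif, tissue, dt, threshold=0.2):
    """Flow-scaled residue function via truncated-SVD deconvolution.

    aif, tissue: arterial input and lung-region concentration curves.
    threshold: singular values below threshold * s_max are zeroed,
    regularizing the otherwise ill-posed inversion.
    """
    n = len(aif)
    # Lower-triangular (Toeplitz) discretization of the convolution integral.
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1):
            A[i, j] = aif[i - j] * dt
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > threshold * s[0], 1.0 / s, 0.0)
    k = Vt.T @ (s_inv * (U.T @ np.asarray(tissue)))
    flow = k.max()                # blood flow (height of residue function)
    mtt = k.sum() * dt / flow     # mean transit time = volume / flow
    return k, flow, mtt
```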

  9. A quantitative model to assess Social Responsibility in Environmental Science and Technology.

    PubMed

    Valcárcel, M; Lucena, R

    2014-01-01

    The awareness of the impact of human activities on society and the environment is known as "Social Responsibility" (SR). It has been a topic of growing interest in many enterprises since the 1950s, and its implementation and assessment are nowadays supported by international standards. There is a tendency to extend its scope of application to other areas of human activity, such as Research, Development and Innovation (R+D+I). In this paper, a model for the quantitative assessment of Social Responsibility in Environmental Science and Technology (SR EST) is described in detail. This model is based on well-established written standards such as the EFQM Excellence Model and the ISO 26000:2010 Guidance on SR. The definition of five hierarchies of indicators, the transformation of qualitative information into quantitative data, and the dual procedure of self-evaluation and external evaluation are the milestones of the proposed model, which can be applied to environmental research centres and institutions. In addition, a simplified model that facilitates its implementation is presented in the article.

  10. Using Non-Invasive Multi-Spectral Imaging to Quantitatively Assess Tissue Vasculature

    SciTech Connect

    Vogel, A; Chernomordik, V; Riley, J; Hassan, M; Amyot, F; Dasgeb, B; Demos, S G; Pursley, R; Little, R; Yarchoan, R; Tao, Y; Gandjbakhche, A H

    2007-10-04

    This research describes a non-invasive, non-contact method used to quantitatively analyze the functional characteristics of tissue. Multi-spectral images collected at several near-infrared wavelengths are input into a mathematical optical skin model that considers the contributions from different analytes in the epidermis and dermis skin layers. Through a reconstruction algorithm, we can quantify the percentage of blood in a given area of tissue and the fraction of that blood that is oxygenated. Imaging normal tissue confirms previously reported values for the percentage of blood in tissue and the percentage of blood that is oxygenated in tissue and surrounding vasculature, both for the normal state and when ischemia is induced. This methodology has been applied to assess vascular Kaposi's sarcoma lesions and the surrounding tissue before and during experimental therapies. The multi-spectral imaging technique has been combined with laser Doppler imaging to gain additional information. Results indicate that these techniques are able to provide quantitative and functional information about tissue changes during experimental drug therapy and to investigate progression of disease before changes are visibly apparent, suggesting their potential as imaging techniques complementary to clinical assessment.

  11. Quantitative Risk Assessment of Human Trichinellosis Caused by Consumption of Pork Meat Sausages in Argentina.

    PubMed

    Sequeira, G J; Zbrun, M V; Soto, L P; Astesana, D M; Blajman, J E; Rosmini, M R; Frizzo, L S; Signorini, M L

    2016-03-01

    In Argentina, there are three known species of the genus Trichinella; however, Trichinella spiralis is most commonly associated with domestic pigs and is recognized as the main cause of human trichinellosis through the consumption of products made with raw or insufficiently cooked pork meat. In some areas of Argentina, this disease is endemic, and it is thus necessary to develop a more effective programme of prevention and control. Here, we developed a quantitative risk assessment of human trichinellosis following pork meat sausage consumption, which may be used to identify the stages with the greatest impact on the probability of acquiring the disease. The quantitative model was designed to describe the conditions in which the meat is produced, processed, transported, stored, sold and consumed in Argentina. The model predicted a risk of human trichinellosis of 4.88 × 10⁻⁶ and an estimated annual number of trichinellosis cases of 109. The risk of human trichinellosis was sensitive to the number of Trichinella larvae that effectively survived the storage period (r = 0.89), the average probability of infection (Pinf) (r = 0.44) and the storage time (r = 0.08). This model allowed the impact of the different factors influencing the risk of acquiring trichinellosis to be assessed. The model may thus help to select possible strategies to reduce the risk in the chain of by-products of pork production.
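
    Sensitivity coefficients like the r values above are typically rank (Spearman) correlations between sampled model inputs and the simulated risk. A minimal sketch of that analysis; the input distributions and the stand-in risk formula below are entirely hypothetical and are not taken from the record.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical stand-ins for the model's uncertain inputs.
surviving_larvae = rng.lognormal(mean=1.0, sigma=1.0, size=n)
p_inf = rng.beta(2, 50, size=n)
storage_days = rng.uniform(5, 60, size=n)

# Stand-in risk: longer storage kills larvae, reducing the effective dose.
risk = 1 - np.exp(-p_inf * surviving_larvae / (1 + storage_days / 30))

for name, x in [("larvae", surviving_larvae), ("Pinf", p_inf),
                ("storage", storage_days)]:
    r, _ = spearmanr(x, risk)
    print(f"{name:8s} rank correlation with risk: {r:+.2f}")
```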

  12. Quantitative assessment on soil enzyme activities of heavy metal contaminated soils with various soil properties.

    PubMed

    Xian, Yu; Wang, Meie; Chen, Weiping

    2015-11-01

    Soil enzyme activities are greatly influenced by soil properties and could be significant indicators of heavy metal toxicity in soil for bioavailability assessment. Two groups of experiments were conducted to determine the joint effects of heavy metals and soil properties on soil enzyme activities. Results showed that arylsulfatase was the most sensitive soil enzyme and could be used as an indicator to study the enzymatic toxicity of heavy metals under various soil properties. Soil organic matter (SOM) was the dominant factor affecting the activity of arylsulfatase in soil. A quantitative model was derived to predict the changes of arylsulfatase activity with SOM content. When the soil organic matter content was less than the critical point A (1.05% in our study), the arylsulfatase activity dropped rapidly. When the soil organic matter content was greater than the critical point A, the arylsulfatase activity gradually rose to higher levels, showing that the soil microbial activities were enhanced rather than harmed. The SOM content needs to be above the critical point B (2.42% in our study) to protect the soil microbial community from harm due to severe Pb pollution (500 mg kg⁻¹ in our study). The quantitative model revealed the pattern of variation of enzymatic toxicity due to heavy metals under various SOM contents. The applicability of the model under a wider range of soil properties needs to be tested. The model, however, may provide a methodological basis for ecological risk assessment of heavy metals in soil.

  13. A rapid and accurate method for the quantitative estimation of natural polysaccharides and their fractions using high performance size exclusion chromatography coupled with multi-angle laser light scattering and refractive index detector.

    PubMed

    Cheong, Kit-Leong; Wu, Ding-Tao; Zhao, Jing; Li, Shao-Ping

    2015-06-26

    In this study, a rapid and accurate method for quantitative analysis of natural polysaccharides and their different fractions was developed. First, high performance size exclusion chromatography (HPSEC) was utilized to separate natural polysaccharides. Then, the molecular masses of their fractions were determined by multi-angle laser light scattering (MALLS). Finally, quantification of polysaccharides or their fractions was performed based on their response to a refractive index detector (RID) and their universal refractive index increment (dn/dc). The accuracy of the developed method was determined for the quantification of individual and mixed polysaccharide standards, including konjac glucomannan, CM-arabinan, xyloglucan, larch arabinogalactan, oat β-glucan, dextran (410, 270, and 25 kDa), mixed xyloglucan and CM-arabinan, and mixed dextran 270 K and CM-arabinan; average recoveries were between 90.6% and 98.3%. The limits of detection (LOD) and quantification (LOQ) ranged from 10.68 to 20.25 μg/mL and from 42.70 to 68.85 μg/mL, respectively. Compared with the conventional phenol-sulfuric acid assay and HPSEC coupled with evaporative light scattering detection (HPSEC-ELSD), the developed HPSEC-MALLS-RID method based on a universal dn/dc is simpler, more rapid, and more accurate, requiring neither individual polysaccharide standards nor calibration curves. The developed method was also successfully utilized for quantitative analysis of polysaccharides and their different fractions from three medicinal plants of the genus Panax: Panax ginseng, Panax notoginseng and Panax quinquefolius. The results suggested that the HPSEC-MALLS-RID method based on a universal dn/dc could be used as a routine technique for the quantification of polysaccharides and their fractions in natural resources.
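
    The calibration-free quantification rests on the proportionality ΔRI = (dn/dc) · c. A minimal sketch of integrating a baseline-corrected RI peak into eluted mass under that relation; the function name, the instrument constant, and the flow-rate handling are our assumptions, and the often-cited dn/dc of roughly 0.146 mL/g for aqueous polysaccharides should be verified for any real system.

```python
import numpy as np

def fraction_mass_g(ri_trace, dt_min, flow_ml_min, dn_dc_ml_g=0.146, k_ri=1.0):
    """Mass of a polysaccharide fraction from a baseline-corrected RI peak.

    Each chromatogram slice obeys dRI = (dn/dc) * c, so concentration is
    c = k_ri * signal / (dn/dc); integrating c * flow over time gives mass.
    k_ri is the instrument constant converting detector output to dRI.
    """
    conc_g_ml = k_ri * np.asarray(ri_trace) / dn_dc_ml_g
    return conc_g_ml.sum() * flow_ml_min * dt_min
```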

  14. Quantitative risk assessment for zoonotic transmission of Cryptosporidium parvum infection attributable to recreational use of farmland.

    PubMed

    Hill, A; Nally, P; Chalmers, R M; Pritchard, G C; Giles, M

    2011-08-01

    Cryptosporidiosis caused by Cryptosporidium parvum infection is a major cause of enteric illness in man, and there is a significant reservoir in animals, particularly young ruminant species. As a preliminary assessment of the magnitude of the risk posed by contact with faeces produced by infected livestock, two microbiological risk assessments were developed: one for the risk of human infection with C. parvum while camping on contaminated land recently grazed by infected suckler cattle, and a comparable risk assessment for camping on land recently spread with contaminated cattle slurry. Using a worst-case scenario approach, the upper level of risk was estimated to be one infection in every 6211 person-visits for a camping event on land recently grazed by infected cattle. Translated into camping events of 100 persons, this risk estimate would most likely lead to zero (98% likelihood) or one infection (1% likelihood). The results for the cattle slurry model are similar, despite the different pathways. Sensitivity analysis was conducted for the grazing cattle model only. This suggested that the time between grazing and camping was the most important control strategy, but that increasing hand-washing frequency and the removal of cattle faeces before camping would also be beneficial. If the upper level of risk were to be judged unacceptable, then further data would be required to more accurately estimate the risk of infection through these scenarios. Further research would also be required to assess the fraction of cases attributable to camping and/or environmental contact with Cryptosporidium oocysts.
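
    The zero-or-one-infection figures follow from a binomial model with the per-person-visit risk above. The short check below reproduces them; small differences from the published likelihoods reflect rounding.

```python
from math import comb

p = 1 / 6211   # upper-bound risk per person-visit from the assessment
n = 100        # persons attending one camping event

def binom_pmf(k: int) -> float:
    """Probability of exactly k infections among n independent visitors."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

print(f"P(0 infections) = {binom_pmf(0):.3f}")  # ~0.984
print(f"P(1 infection)  = {binom_pmf(1):.3f}")  # ~0.016
```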

  15. A qualitative and quantitative needs assessment of pain management for hospitalized orthopedic patients.

    PubMed

    Cordts, Grace A; Grant, Marian S; Brandt, Lynsey E; Mears, Simon C

    2011-08-08

    Despite advances in pain management, little formal teaching is given to practitioners and nurses in its use for postoperative orthopedic patients. The goal of our study was to determine the educational needs for orthopedic pain management of our residents, nurses, and physical therapists using a quantitative and qualitative assessment. The needs analysis was conducted in a 10-bed orthopedic unit at a teaching hospital and included a survey given to 20 orthopedic residents, 9 nurses, and 6 physical therapists, followed by focus groups addressing barriers to pain control and knowledge of pain management. Key challenges for nurses included not always having breakthrough pain medication orders and the gap in pain management between cessation of patient-controlled analgesia and ordering and administering oral medications. Key challenges for orthopedic residents included treating pain in patients with a history of substance abuse, assessing pain, and determining when to use long-acting vs short-acting opioids. Focus group assessments revealed a lack of training in pain management and the need for better coordination of care between nurses and practitioners and improved education about special needs groups (the elderly and those with substance abuse issues). This needs assessment showed that orthopedic residents and nurses receive little formal education on pain management, despite having to address pain on a daily basis. This information will be used to develop an educational program to improve pain management for postoperative orthopedic patients. An integrated educational program with orthopedic residents, nurses, and physical therapists would promote understanding of issues for each discipline.

  16. Quantitative assessment of fibrosis and steatosis in liver biopsies from patients with chronic hepatitis C

    PubMed Central

    Zaitoun, A; Al, M; Awad, S; Ukabam, S; Makadisi, S; Record, C

    2001-01-01

    Background—Hepatic fibrosis is one of the main consequences of liver disease. Both fibrosis and steatosis may be seen in some patients with chronic hepatitis C and alcoholic liver disease (ALD). Aims—To quantitate fibrosis and steatosis by stereological and morphometric techniques in patients with chronic hepatitis C and compare the results with a control group of patients with ALD; in addition, to correlate the quantitative features of fibrosis with the Ishak modified histological score. Materials and methods—Needle liver biopsies from 86 patients with chronic hepatitis C and from 32 patients with alcoholic liver disease (disease controls) were analysed by stereological and morphometric analyses using the Prodit 5.2 system. Haematoxylin and eosin and Picro-Mallory stained sections were used. The area fractions (AA) of fibrosis, steatosis, parenchyma, and other structures (bile duct and central vein areas) were assessed by the stereological method. The mean diameters of fat globules were determined by morphometric analysis. Results—Significant differences were found in the AA of fibrosis, including fibrosis within portal tract areas, between chronic hepatitis C patients and those with ALD (mean (SD): 19.14 (10.59) v 15.97 (12.51)). Portal and periportal (zone 1) fibrosis was significantly higher (p = 0.00004) in patients with chronic hepatitis C compared with the control group (mean (SD): 9.04 (6.37) v 3.59 (3.16)). Pericentral (zone 3) fibrosis occurred in both groups but was significantly more pronounced in patients with ALD. These results correlate well with the modified Ishak scoring system. However, in patients with cirrhosis (stage 6) and chronic hepatitis C, the AA of fibrosis varied between 20% and 74%. The diameter of fat globules was significantly lower in patients with hepatitis C (p = 0.00002) than in the ALD group (mean (SD): 14.44 (3.45) v 18.4 (3.32)). Microglobules were more frequent in patients with chronic hepatitis C than in patients with ALD.

  17. Comprehensive, Quantitative Risk Assessment of CO₂ Geologic Sequestration

    SciTech Connect

    Lepinski, James

    2013-09-30

    A Quantitative Failure Modes and Effects Analysis (QFMEA) was developed to conduct comprehensive, quantitative risk assessments on CO₂ capture, transportation, and sequestration or use in deep saline aquifers, enhanced oil recovery operations, or enhanced coal bed methane operations. The model identifies and characterizes potential risks; identifies the likely failure modes, causes, effects and methods of detection; lists possible risk prevention and risk mitigation steps; estimates potential damage recovery costs, mitigation costs and cost savings resulting from mitigation; and ranks (prioritizes) risks according to the probability of failure, the severity of failure, the difficulty of early failure detection, and the potential for fatalities. The QFMEA model generates the information needed for effective project risk management. Diverse project information can be integrated into a concise, common format that allows comprehensive, quantitative analysis by a cross-functional team of experts to determine: What can possibly go wrong? How much will damage recovery cost? How can it be prevented or mitigated? What is the cost savings or benefit of prevention or mitigation? Which risks should be given highest priority for resolution? The QFMEA model can be tailored to specific projects and is applicable to new projects as well as mature projects. The model can be revised and updated as new information becomes available. It accepts input from multiple sources, such as literature searches, site characterization, field data, computer simulations, analogues, process influence diagrams, probability density functions, financial analysis models, cost factors, and heuristic best practices manuals, and converts the information into a standardized format in an Excel spreadsheet. Process influence diagrams, geologic models, financial models, cost factors and an insurance schedule were developed to support the QFMEA model. Comprehensive, quantitative risk assessments

  18. Quantitative imaging of cartilage and bone for functional assessment of gene therapy approaches in experimental arthritis.

    PubMed

    Stok, Kathryn S; Noël, Danièle; Apparailly, Florence; Gould, David; Chernajovsky, Yuti; Jorgensen, Christian; Müller, Ralph

    2010-07-01

    Anti-inflammatory gene therapy can inhibit inflammation driven by TNFα in experimental models of rheumatoid arthritis. However, assessment of the therapeutic effect on cartilage and bone quality is either missing or unsatisfactory. A multimodal imaging approach, using confocal laser scanning microscopy (CLSM) and micro-computed tomography (microCT), was used for gathering 3D quantitative image data on diseased and treated murine joints. As proof of concept, the efficacy of anti-TNF-based gene therapy was assessed, comparing imaging techniques with classical investigations. The knees of SCID mice were injected with human synoviocytes overexpressing TNFα. Two days later, electric pulse-mediated DNA transfer was performed after injection of the pGTRTT plasmid containing a dimeric soluble TNF receptor (dsTNFR) under the control of a doxycycline-inducible promoter. After 21 days the mice were sacrificed, TNFα levels were measured, and the joints were assessed for cartilage and bone degradation using CLSM, microCT and histology. TNFα levels were decreased in the joints of mice treated with the plasmid in the presence of doxycycline. Concomitantly, histological analysis showed an increase in cartilage thickness and a decrease in synovial hyperplasia and cartilage erosion. Bone morphometry revealed that groups with the plasmid in the presence of doxycycline displayed a higher cortical thickness and decreased porosity. Using an anti-TNF gene therapy approach known to reduce inflammation as proof of concept, 3D imaging allowed quantitative evaluation of its benefits to joint architecture and showed that local delivery of a regulated anti-TNF vector decreased arthritis severity through TNFα inhibition. These tools are valuable for understanding the efficacy of gene therapy on whole-joint morphometry.

  19. Quantitative assessment of risk reduction from hand washing with antibacterial soaps.

    PubMed

    Gibson, L L; Rose, J B; Haas, C N; Gerba, C P; Rusin, P A

    2002-01-01

    The Centers for Disease Control and Prevention have estimated that there are 3,713,000 cases of infectious disease associated with day care facilities each year. The objective of this study was to examine the risk reduction achieved from using different soap formulations after diaper changing, using a microbial quantitative risk assessment approach. To achieve this, a probability of infection model and an exposure assessment based on micro-organism transfer were used to evaluate the efficacy of different soap formulations in reducing the probability of disease following hand contact with an enteric pathogen. Based on this model, it was determined that the probability of infection ranged from 24/100 to 91/100 for those changing diapers of babies with symptomatic shigellosis who used a control product (soap without an antibacterial ingredient), 22/100 to 91/100 for those who used an antibacterial soap (chlorhexidine 4%), and 15/100 to 90/100 for those who used a triclosan (1.5%) antibacterial soap. Those with asymptomatic shigellosis who used a non-antibacterial control soap had a risk between 49/100,000 and 53/100; those who used the 4% chlorhexidine-containing soap had a risk between 43/100,000 and 51/100; and those who used a 1.5% triclosan soap had a risk between 21/100,000 and 43/100. Adequate washing of hands after diapering reduces risk, and risk can be further reduced by about 20% by the use of an antibacterial soap. Quantitative risk assessment is a valuable tool in the evaluation of household sanitizing agents and low-risk outcomes.

  20. Assessment of involuntary choreatic movements in Huntington's disease--toward objective and quantitative measures.

    PubMed

    Reilmann, Ralf; Bohlen, Stefan; Kirsten, Florian; Ringelstein, E Bernd; Lange, Herwig W

    2011-10-01

    Objective measures of motor impairment may improve the sensitivity and reliability of motor end points in clinical trials. In Huntington's disease, involuntary choreatic movements are one of the hallmarks of motor dysfunction. Chorea is commonly assessed by subitems of the Unified-Huntington's Disease Rating Scale. However, clinical rating scales are limited by inter- and intrarater variability, subjective error, and categorical design. We hypothesized that assessment of position and orientation changes interfering with a static upper extremity holding task may provide objective and quantitative measures of involuntary movements in patients with Huntington's disease. Subjects with symptomatic Huntington's disease (n = 19), premanifest gene carriers (n = 15; Unified-Huntington's Disease Rating Scale total motor score ≤ 3), and matched controls (n = 19) were asked to grasp and lift a device (250 and 500 g) equipped with an electromagnetic sensor. While subjects were instructed to hold the device as stable as possible, changes in position (x, y, z) and orientation (roll, pitch, yaw) were recorded. These were used to calculate a position index and an orientation index, both depicting the amount of choreatic movement interfering with task performance. Both indices were increased in patients with symptomatic Huntington's disease compared with controls and premanifest gene carriers for both weights, whereas only the position index with 500 g was increased in premanifest gene carriers compared with controls. Correlations were observed with the Disease Burden Score based on CAG-repeat length and age and with the Unified-Huntington's Disease Rating Scale. We conclude that quantitative assessment of chorea is feasible in Huntington's disease. The method is safe, noninvasive, and easily applicable and can be used repeatedly in outpatient settings. A use in clinical trials should be further explored in larger cohorts and follow-up studies.

  1. Basic concepts in three-part quantitative assessments of undiscovered mineral resources

    USGS Publications Warehouse

    Singer, D.A.

    1993-01-01

    Since 1975, mineral resource assessments have been made for over 27 areas covering 5 × 10⁶ km² at various scales using what is now called the three-part form of quantitative assessment. In these assessments, (1) areas are delineated according to the types of deposits permitted by the geology, (2) the amount of metal and some ore characteristics are estimated using grade and tonnage models, and (3) the number of undiscovered deposits of each type is estimated. Permissive boundaries are drawn for one or more deposit types such that the probability of a deposit lying outside the boundary is negligible, that is, less than 1 in 100,000 to 1,000,000. Grade and tonnage models combined with estimates of the number of deposits are the fundamental means of translating geologists' resource assessments into a language that economists can use. Estimates of the number of deposits explicitly represent the probability (or degree of belief) that some fixed but unknown number of undiscovered deposits exist in the delineated tracts. Estimates are by deposit type and must be consistent with the grade and tonnage model. Other guidelines for these estimates include (1) frequency of deposits from well-explored areas, (2) local deposit extrapolations, (3) counting and assigning probabilities to anomalies and occurrences, (4) process constraints, (5) relative frequencies of related deposit types, and (6) area spatial limits. In most cases, estimates are made subjectively, as they are in meteorology, gambling, and geologic interpretations. In three-part assessments, the estimates are internally consistent because delineated tracts are consistent with descriptive models, grade and tonnage models are consistent with descriptive models as well as with known deposits in the area, and estimates of number of deposits are consistent with grade and tonnage models. All available information is used in the assessment, and uncertainty is explicitly represented. © 1993 Oxford University Press.
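
    In practice the three parts combine by simulation: draw a number of undiscovered deposits from the elicited probabilities, then draw a tonnage and grade for each and sum the contained metal. A minimal sketch under that assumption; all probabilities and distribution parameters below are hypothetical illustrations, not values from any actual assessment.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical elicited probabilities for the number of undiscovered deposits.
n_deposits = rng.choice([0, 1, 2, 3], size=N, p=[0.3, 0.4, 0.2, 0.1])

def endowment(n):
    """Contained metal (tonnes) for n deposits, from lognormal models."""
    tons = rng.lognormal(mean=np.log(2e6), sigma=1.0, size=n)   # tonnage model
    grade = rng.lognormal(mean=np.log(0.8), sigma=0.4, size=n)  # grade, %
    return (tons * grade / 100).sum()

metal = np.array([endowment(n) for n in n_deposits])
print(f"P(no metal) = {(metal == 0).mean():.2f}, "
      f"median = {np.median(metal):,.0f} t, "
      f"90th pct = {np.percentile(metal, 90):,.0f} t")
```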

  2. Assessment of Nutritional Status of Nepalese Hemodialysis Patients by Anthropometric Examinations and Modified Quantitative Subjective Global Assessment

    PubMed Central

    Sedhain, Arun; Hada, Rajani; Agrawal, Rajendra Kumar; Bhattarai, Gandhi R; Baral, Anil

    2015-01-01

    OBJECTIVE To assess the nutritional status of patients on maintenance hemodialysis by using modified quantitative subjective global assessment (MQSGA) and anthropometric measurements. METHOD We conducted a cross-sectional descriptive analytical study to assess the nutritional status of fifty-four patients with chronic kidney disease undergoing maintenance hemodialysis, using MQSGA and different anthropometric and laboratory measurements, such as body mass index (BMI), mid-arm circumference (MAC), mid-arm muscle circumference (MAMC), triceps skin fold (TSF) and biceps skin fold (BSF), serum albumin, C-reactive protein (CRP) and lipid profile, in a government tertiary hospital at Kathmandu, Nepal. RESULTS Based on MQSGA criteria, 66.7% of the patients suffered from mild to moderate malnutrition and 33.3% were well nourished. None of the patients were severely malnourished. CRP was positive in 56.3% of patients. Serum albumin, MAC and BMI were (mean ± SD) 4.0 ± 0.3 g/dl, 22 ± 2.6 cm and 19.6 ± 3.2 kg/m², respectively. MQSGA showed negative correlation with MAC (r = −0.563; P < 0.001), BMI (r = −0.448; P < 0.001), MAMC (r = −0.506; P < 0.0001), TSF (r = −0.483; P < 0.0002), and BSF (r = −0.508; P < 0.0001). Negative correlation of MQSGA was also found with total cholesterol, triglyceride, LDL cholesterol and HDL cholesterol, without statistical significance. CONCLUSION Mild to moderate malnutrition was found to be present in two thirds of the patients undergoing hemodialysis. Anthropometric measurements like BMI, MAC, MAMC, BSF and TSF were negatively correlated with MQSGA. Anthropometric and laboratory assessment tools could be used for nutritional assessment as they are relatively easy, cheap and practical markers of nutritional status. PMID:26327781

  3. Quantitative assessment of changes in landslide risk using a regional scale run-out model

    NASA Astrophysics Data System (ADS)

    Hussin, Haydar; Chen, Lixia; Ciurean, Roxana; van Westen, Cees; Reichenbach, Paola; Sterlacchini, Simone

    2015-04-01

    The risk of landslide hazard continuously changes in time and space and is rarely a static or constant phenomenon in an affected area. However, one of the main challenges of quantitatively assessing changes in landslide risk is the availability of multi-temporal data for the different components of risk. Furthermore, a truly "quantitative" landslide risk analysis requires the modeling of the landslide intensity (e.g. flow depth, velocities or impact pressures) affecting the elements at risk. Such a quantitative approach is often lacking in medium to regional scale studies in the scientific literature or is left out altogether. In this research we modelled the temporal and spatial changes of debris flow risk in a narrow alpine valley in the North Eastern Italian Alps. The debris flow inventory from 1996 to 2011 and multi-temporal digital elevation models (DEMs) were used to assess the susceptibility of debris flow triggering areas and to simulate debris flow run-out using the Flow-R regional scale model. In order to determine debris flow intensities, we used a linear relationship that was found between back-calibrated, physically based Flo-2D simulations (local scale models of five debris flows from 2003) and the probability values of the Flow-R software. This gave us the possibility to assign flow depths to a total of 10 separate classes on a regional scale. Debris flow vulnerability curves from the literature, and one curve derived specifically for our case study area, were used to determine the damage for the different material and building types associated with the elements at risk. The building values were obtained from the Italian Revenue Agency (Agenzia delle Entrate) and were classified per cadastral zone according to the Real Estate Observatory data (Osservatorio del Mercato Immobiliare, Agenzia Entrate - OMI). The minimum and maximum market value for each building was obtained by multiplying the corresponding land-use value (€/m²) by the building area and the number of floors.

  4. Quantitative assessment of MS plaques and brain atrophy in multiple sclerosis using semiautomatic segmentation method

    NASA Astrophysics Data System (ADS)

    Heinonen, Tomi; Dastidar, Prasun; Ryymin, Pertti; Lahtinen, Antti J.; Eskola, Hannu; Malmivuo, Jaakko

    1997-05-01

    Quantitative magnetic resonance (MR) imaging of the brain is useful in multiple sclerosis (MS) in order to obtain reliable indices of disease progression. The goal of this project was to estimate the total volume of gliotic and non-gliotic plaques in chronic progressive multiple sclerosis with the help of a semiautomatic segmentation method developed at the Ragnar Granit Institute. The developed program, running on a PC, provides displays of the segmented data in addition to the volumetric analyses. The volumetric accuracy of the program was demonstrated by segmenting MR images of fluid-filled syringes. An anatomical atlas is to be incorporated in the segmentation system to estimate the distribution of MS plaques in various neural pathways of the brain. A total package including MS plaque volume estimation, estimation of brain atrophy and ventricular enlargement, and the distribution of MS plaques in different neural segments of the brain has been planned for the near future. Our study confirmed that total lesion volumes in chronic MS show a poor correlation with EDSS scores but a positive correlation with neuropsychological scores. Therefore, accurate total volume measurements of MS plaques using the developed semiautomatic segmentation technique helped us to evaluate the degree of neuropsychological impairment.

  5. Using Modified Contour Deformable Model to Quantitatively Estimate Ultrasound Parameters for Osteoporosis Assessment

    NASA Astrophysics Data System (ADS)

    Chen, Yung-Fu; Du, Yi-Chun; Tsai, Yi-Ting; Chen, Tainsong

    Osteoporosis is a systemic skeletal disease characterized by low bone mass and micro-architectural deterioration of bone tissue, leading to bone fragility. Finding an effective method for prevention and early diagnosis of the disease is very important. Several parameters, including broadband ultrasound attenuation (BUA), speed of sound (SOS), and stiffness index (STI), have been used to measure the characteristics of bone tissues. In this paper, we propose a method, namely the modified contour deformable model (MCDM), based on the active contour model (ACM) and the active shape model (ASM), for automatically detecting the calcaneus contour from quantitative ultrasound (QUS) parametric images. The results show that the difference between the contours detected by the MCDM and the true boundary for the phantom is less than one pixel. By comparing the phantom ROIs, a significant relationship was found between the contour mean and bone mineral density (BMD), with R = 0.99. The influence of selecting different ROI diameters (12, 14, 16 and 18 mm) and different region-selecting methods, including fixed region (ROIfix), automatic circular region (ROIcir) and calcaneal contour region (ROIanat), was evaluated in tests on human subjects. Measurements with large ROI diameters, especially using the fixed region, result in high position errors (10-45%). The precision errors of the measured ultrasonic parameters for ROIanat are smaller than those for ROIfix and ROIcir. In conclusion, ROIanat provides more accurate measurement of ultrasonic parameters for the evaluation of osteoporosis and is useful for clinical application.
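
    For readers unfamiliar with the third parameter, the stiffness index is a linear combination of BUA and SOS. The coefficients in the sketch below are those commonly quoted for one widely deployed heel scanner; they are device-specific assumptions, not values from this record.

```python
def stiffness_index(bua_db_mhz: float, sos_m_s: float) -> float:
    """Heel-QUS stiffness index from BUA (dB/MHz) and SOS (m/s).

    STI = 0.67 * BUA + 0.28 * SOS - 420 is the combination often cited
    for one commercial device; verify coefficients for any real scanner.
    """
    return 0.67 * bua_db_mhz + 0.28 * sos_m_s - 420.0

print(stiffness_index(110.0, 1550.0))  # ~87.7 for typical healthy values
```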

  6. Quantitative ventilation-perfusion lung scans in infants and children: utility of a submicronic radiolabeled aerosol to assess ventilation

    SciTech Connect

    O'Brodovich, H.M.; Coates, G.

    1984-09-01

    The quantitative assessment of regional pulmonary ventilation and perfusion provides useful information regarding lung function. Its use in infants and young children, however, has been minimal because of practical and technical limitations when the distribution of ventilation is assessed by radioactive gases. In 16 infants and children we used an inexpensive, commercially available nebulizer to produce a submicronic aerosol labeled with 99mtechnetium-diethylenetriamine pentaacetic acid to assess ventilation quantitatively, and intravenous injections of 99mtechnetium-labeled macroaggregates of albumin to assess pulmonary perfusion quantitatively. Studies were safely completed in both ambulatory and critically ill patients, including two premature infants who had endotracheal tubes in place for ventilatory support. No sedation or patient cooperation is required. This technique enables any department of nuclear medicine to measure regional pulmonary ventilation and perfusion in infants and children.

  7. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment.

    PubMed

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

    We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species using a UPLC-UV-MS/MS method. Ten of 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4'-MDI = 0.52 to 140.1 pg/mg; 2,4'-MDI = 0.01 to 4.48 pg/mg). The 4,4'-MDI species had the highest measured concentration (280 pg/mg). Commonly used medical devices/products contain low, but measurable, concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation.

  8. Dynamic and quantitative assessment of blood coagulation using optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Xu, Xiangqun; Zhu, Jiang; Chen, Zhongping

    2016-04-01

    Reliable clot diagnostic systems are needed for directing treatment in a broad spectrum of cardiovascular diseases and coagulopathy. Here, we report on non-contact measurement of elastic modulus for dynamic and quantitative assessment of whole blood coagulation using acoustic radiation force orthogonal excitation optical coherence elastography (ARFOE-OCE). In this system, acoustic radiation force (ARF) is produced by a remote ultrasonic transducer, and a shear wave induced by ARF excitation is detected by the optical coherence tomography (OCT) system. During porcine whole blood coagulation, changes in the elastic property of the clots increase the shear modulus of the sample, altering the propagating velocity of the shear wave. Consequently, dynamic blood coagulation status can be measured quantitatively by relating the velocity of the shear wave with clinically relevant coagulation metrics, including reaction time, clot formation kinetics and maximum shear modulus. The results show that the ARFOE-OCE is sensitive to the clot formation kinetics and can differentiate the elastic properties of the recalcified porcine whole blood, blood added with kaolin as an activator, and blood spiked with fibrinogen.
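
    The link between the measured shear wave and the clot's stiffness is the standard elastodynamic relation G = ρv², where v is the shear wave speed and ρ the density. A minimal sketch of that conversion; the default blood density is a typical literature value, assumed rather than taken from the record.

```python
def shear_modulus_pa(shear_wave_speed_m_s: float,
                     density_kg_m3: float = 1060.0) -> float:
    """Shear modulus from shear wave speed: G = rho * v**2.

    density defaults to a typical whole-blood value (~1060 kg/m^3, assumed).
    """
    return density_kg_m3 * shear_wave_speed_m_s ** 2

# As a clot stiffens, the shear wave speeds up and G rises accordingly:
print(shear_modulus_pa(0.6))  # ~382 Pa for a soft, early clot
print(shear_modulus_pa(2.0))  # ~4240 Pa for a mature clot
```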

  9. Quantitative MR assessment of structural changes in white matter of children treated for ALL

    NASA Astrophysics Data System (ADS)

    Reddick, Wilburn E.; Glass, John O.; Mulhern, Raymond K.

    2001-07-01

    Our research builds on the hypothesis that white matter damage resulting from therapy spans a continuum of severity that can be reliably probed using non-invasive MR technology. This project focuses on children treated for ALL with a regimen containing seven courses of high-dose methotrexate (HDMTX), which is known to cause leukoencephalopathy. Axial FLAIR, T1-, T2-, and PD-weighted images were acquired, registered, and then analyzed with a hybrid neural network segmentation algorithm to identify normal brain parenchyma and leukoencephalopathy. Quantitative T1 and T2 maps were also analyzed at the level of the basal ganglia and the centrum semiovale. The segmented images were used as masks to identify regions of normal-appearing white matter (NAWM) and leukoencephalopathy in the quantitative T1 and T2 maps. We assessed the longitudinal changes in volume, T1 and T2 in NAWM and leukoencephalopathy for 42 patients. The segmentation analysis revealed that 69% of patients had leukoencephalopathy after receiving seven courses of HDMTX. The leukoencephalopathy affected approximately 17% of the patients' white matter volume on average (range 2% - 38%). Relaxation rates in the NAWM were not significantly changed between the 1st and 7th courses. Regions of leukoencephalopathy exhibited a 13% elevation in T1 and a 37% elevation in T2 relaxation rates.

  10. Specific and quantitative assessment of naphthalene and salicylate bioavailability by using a bioluminescent catabolic reporter bacterium

    SciTech Connect

    Heitzer, A.; Thonnard, J.E.; Sayler, G.S.; Webb, O.F.

    1992-06-01

    A bioassay was developed and standardized for the rapid, specific, and quantitative assessment of naphthalene and salicylate bioavailability by use of bioluminescence monitoring of catabolic gene expression. The bioluminescent reporter strain Pseudomonas fluorescens HK44, which carries a transcriptional nahG-luxCDABE fusion for naphthalene and salicylate catabolism, was used. The physiological state of the reporter cultures as well as the intrinsic regulatory properties of the naphthalene degradation operon must be taken into account to obtain a high specificity at low target substrate concentrations. Experiments have shown that the use of exponentially growing reporter cultures has advantages over the use of carbon-starved, resting cultures. In aqueous solutions for both substrates, naphthalene and salicylate, linear relationships between initial substrate concentration and bioluminescence response were found over concentration ranges of 1 to 2 orders of magnitude. Naphthalene could be detected at a concentration of 45 ppb. Studies conducted under defined conditions with extracts and slurries of experimentally contaminated sterile soils and identical uncontaminated soil controls demonstrated that this method can be used for specific and quantitative estimations of target pollutant presence and bioavailability in soil extracts and for specific and qualitative estimations of naphthalene in soil slurries.

  11. Specific and Quantitative Assessment of Naphthalene and Salicylate Bioavailability by Using a Bioluminescent Catabolic Reporter Bacterium

    PubMed Central

    Heitzer, Armin; Webb, Oren F.; Thonnard, Janeen E.; Sayler, Gary S.

    1992-01-01

    A bioassay was developed and standardized for the rapid, specific, and quantitative assessment of naphthalene and salicylate bioavailability by use of bioluminescence monitoring of catabolic gene expression. The bioluminescent reporter strain Pseudomonas fluorescens HK44, which carries a transcriptional nahG-luxCDABE fusion for naphthalene and salicylate catabolism, was used. The physiological state of the reporter cultures as well as the intrinsic regulatory properties of the naphthalene degradation operon must be taken into account to obtain a high specificity at low target substrate concentrations. Experiments have shown that the use of exponentially growing reporter cultures has advantages over the use of carbon-starved, resting cultures. In aqueous solutions for both substrates, naphthalene and salicylate, linear relationships between initial substrate concentration and bioluminescence response were found over concentration ranges of 1 to 2 orders of magnitude. Naphthalene could be detected at a concentration of 45 ppb. Studies conducted under defined conditions with extracts and slurries of experimentally contaminated sterile soils and identical uncontaminated soil controls demonstrated that this method can be used for specific and quantitative estimations of target pollutant presence and bioavailability in soil extracts and for specific and qualitative estimations of naphthalene in soil slurries. PMID:16348717

  12. Residual Isocyanates in Medical Devices and Products: A Qualitative and Quantitative Assessment

    PubMed Central

    Franklin, Gillian; Harari, Homero; Ahsan, Samavi; Bello, Dhimiter; Sterling, David A.; Nedrelow, Jonathan; Raynaud, Scott; Biswas, Swati; Liu, Youcheng

    2016-01-01

    We conducted a pilot qualitative and quantitative assessment of residual isocyanates and their potential initial exposures in neonates, as little is known about their contact effect. After a neonatal intensive care unit (NICU) stockroom inventory, polyurethane (PU) and PU foam (PUF) devices and products were qualitatively evaluated for residual isocyanates using Surface SWYPE™. Those containing isocyanates were quantitatively tested for methylene diphenyl diisocyanate (MDI) species using a UPLC-UV-MS/MS method. Ten of 37 products and devices tested indicated both free and bound residual surface isocyanates; PU/PUF pieces contained aromatic isocyanates; one product contained aliphatic isocyanates. Overall, quantified mean MDI concentrations were low (4,4′-MDI = 0.52 to 140.1 pg/mg; 2,4′-MDI = 0.01 to 4.48 pg/mg). The 4,4′-MDI species had the highest measured concentration (280 pg/mg). Commonly used medical devices/products contain low, but measurable, concentrations of residual isocyanates. Quantifying other isocyanate species and neonatal skin exposure to isocyanates from these devices and products requires further investigation. PMID:27773989

  13. Quantitative assessment of thermodynamic constraints on the solution space of genome-scale metabolic models.

    PubMed

    Hamilton, Joshua J; Dwivedi, Vivek; Reed, Jennifer L

    2013-07-16

    Constraint-based methods provide powerful computational techniques to allow understanding and prediction of cellular behavior. These methods rely on physicochemical constraints to eliminate infeasible behaviors from the space of available behaviors. One such constraint is thermodynamic feasibility, the requirement that intracellular flux distributions obey the laws of thermodynamics. The past decade has seen several constraint-based methods that interpret this constraint in different ways, including those that are limited to small networks, rely on predefined reaction directions, and/or neglect the relationship between reaction free energies and metabolite concentrations. In this work, we utilize one such approach, thermodynamics-based metabolic flux analysis (TMFA), to make genome-scale, quantitative predictions about metabolite concentrations and reaction free energies in the absence of prior knowledge of reaction directions, while accounting for uncertainties in thermodynamic estimates. We applied TMFA to a genome-scale network reconstruction of Escherichia coli and examined the effect of thermodynamic constraints on the flux space. We also assessed the predictive performance of TMFA against gene essentiality and quantitative metabolomics data, under both aerobic and anaerobic, and optimal and suboptimal growth conditions. Based on these results, we propose that TMFA is a useful tool for validating phenotypes and generating hypotheses, and that additional types of data and constraints can improve predictions of metabolite concentrations.
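
    The thermodynamic constraint at the heart of such methods can be written as ΔrG′ = ΔrG′° + RT Σᵢ sᵢ ln cᵢ, with forward flux allowed only when ΔrG′ < 0. A minimal sketch of that feasibility check; the example reaction and numbers are illustrative, not drawn from the paper.

```python
import numpy as np

R = 8.314e-3   # gas constant, kJ/(mol*K)
T = 298.15     # temperature, K

def reaction_dg(dg0_kj_mol, stoich, conc_molar):
    """Transformed reaction free energy at given metabolite concentrations.

    dG' = dG'0 + RT * sum(s_i * ln c_i); flux in the forward direction is
    thermodynamically feasible only if dG' < 0.
    """
    return dg0_kj_mol + R * T * np.dot(stoich, np.log(conc_molar))

# Illustrative example: A -> B with dG'0 = +5 kJ/mol becomes feasible
# when A is held 100-fold more concentrated than B.
print(reaction_dg(5.0, np.array([-1.0, 1.0]), np.array([1e-2, 1e-4])))  # < 0
```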

  14. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectra of 39 teeth samples, classified by International Caries Detection and Assessment System level, were investigated at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. A spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of the color components was statistically computed by considering the spectral characteristics (autofluorescence, optical filter, and spectral sensitivity) of our fluorescence color imaging system. Results showed that the spectral parameter and the image component ratio present a linear relation. The image component ratio was therefore graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. A method to determine the numerical grades of caries using a fluorescence imaging system was thus proposed, and it can be applied to similar imaging systems.
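
    The grading rule is simple enough to state as code; the thresholds below are exactly those reported in the record, while the function name and per-pixel framing are our own illustration.

```python
def caries_grade(r: float, g: float, b: float) -> str:
    """Grade a pixel or region by the image component ratio R/(G + B).

    Thresholds (<0.66, 0.66-1.06, 1.06-1.62, >1.62) are from the record.
    """
    ratio = r / (g + b)
    if ratio < 0.66:
        return "sound"
    elif ratio <= 1.06:
        return "early decay"
    elif ratio <= 1.62:
        return "established decay"
    return "severe decay"

print(caries_grade(120, 90, 60))  # ratio 0.80 -> "early decay"
```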

  15. Coherent and consistent decision making for mixed hazardous waste management: The application of quantitative assessment techniques

    SciTech Connect

    Smith, G.M.; Little, R.H.; Torres, C.

    1994-12-31

    This paper focuses on predictive modelling capacity for post-disposal safety assessments of land-based disposal facilities, illustrated by the development and application of a comprehensive yet practicable assessment framework. The issues addressed include: (1) land-based disposal practice; (2) the conceptual and mathematical representation of processes leading to release, migration and accumulation of contaminants; (3) the identification and evaluation of relevant assessment end-points, including human health, the health of non-human biota and ecosystems, and property and resource effects; (4) the gap between data requirements and data availability; and (5) the application of results in decision making, given the uncertainties in assessment results and the difficulty of comparing qualitatively different impacts arising on different temporal and spatial scales. The paper illustrates these issues with examples based on disposal of metals and radionuclides to shallow facilities. The types of disposal facility considered include features consistent with facilities for radioactive wastes as well as other designs more typical of hazardous wastes. The intention is to raise the question of whether radioactive and other hazardous wastes are being consistently managed, and to show that assessment methods are being developed that can provide quantitative information on the levels of environmental impact as well as a consistent approach for different types of waste; such methods can then be applied to mixed hazardous wastes containing radionuclides as well as other contaminants. The remaining question is whether the will exists to employ them. The discussion and worked illustrations are based on a methodology developed, and being extended, within the current European Atomic Energy Community's cost-sharing research programme on radioactive waste management and disposal, with co-funding support from Empresa Nacional de Residuos Radiactivos SA, Spain.

  16. Hydrologic connectivity: Quantitative assessments of hydrologic-enforced drainage structures in an elevation model

    USGS Publications Warehouse

    Poppenga, Sandra; Worstell, Bruce B.

    2016-01-01

    Elevation data derived from light detection and ranging present challenges for hydrologic modeling because the elevation surface includes bridge decks and elevated road features that overlie culvert drainage structures. In reality, water is carried through these structures; in the elevation surface, however, these features impede modeled overland surface flow. Thus, a hydrologically-enforced elevation surface is needed for hydrodynamic modeling. In the Delaware River Basin, hydrologic-enforcement techniques were used to modify elevations to simulate how constructed drainage structures convey overland surface flow. By calculating residuals between unfilled and filled elevation surfaces, artificially pooled depressions that formed upstream of constructed drainage structures were delineated, and elevation values were adjusted by generating transects at the locations of the drainage structures. An assessment of each hydrologically-enforced drainage structure was conducted using field-surveyed culvert and bridge coordinates obtained from numerous public agencies, but it was discovered that the disparate drainage structure datasets were not comprehensive enough to assess all remotely located depressions in need of hydrologic-enforcement. Alternatively, orthoimagery was interpreted to define drainage structures near each depression, and these locations were used as reference points for a quantitative hydrologic-enforcement assessment. The orthoimagery-interpreted reference points resulted in a larger corresponding sample size than the assessment between hydrologic-enforced transects and field-surveyed data. This assessment demonstrates the viability of rules-based hydrologic-enforcement that is needed to achieve hydrologic connectivity, which is valuable for hydrodynamic models in sensitive coastal regions. Hydrologic-enforced elevation data are also essential for merging with topographic/bathymetric elevation data that extend over vulnerable urbanized areas and dynamic coastal
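    The residual step described here is a simple raster subtraction. The numpy sketch below (function name, threshold, and inputs are illustrative assumptions, not the USGS code) shows how filled-minus-unfilled residuals flag artificially pooled depressions.

```python
import numpy as np

def find_artificial_depressions(unfilled: np.ndarray,
                                filled: np.ndarray,
                                min_depth: float = 0.1) -> np.ndarray:
    """Boolean mask of cells where depression filling raised the
    surface by more than `min_depth` (same units as the DEM) --
    candidate pooled areas upstream of culverts and bridge decks.
    The full workflow then cuts enforcement transects through the
    DEM at each drainage structure location."""
    residual = filled - unfilled       # > 0 where pits were filled
    return residual > min_depth
```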

  17. Quantitative Assessment of Participant Knowledge and Evaluation of Participant Satisfaction in the CARES Training Program

    PubMed Central

    Goodman, Melody S.; Si, Xuemei; Stafford, Jewel D.; Obasohan, Adesuwa; Mchunguzi, Cheryl

    2016-01-01

    Background: The purpose of the Community Alliance for Research Empowering Social change (CARES) training program was to (1) train community members in evidence-based public health, (2) increase their scientific literacy, and (3) develop the infrastructure for community-based participatory research (CBPR). Objectives: We assessed participant knowledge and evaluated participant satisfaction with the CARES training program to identify learning needs, obtain valuable feedback about the training, and ensure learning objectives were met through mutually beneficial CBPR approaches. Methods: A baseline assessment was administered before the first training session, and a follow-up assessment and evaluation were administered after the final training session. At each training session a pretest was administered before the session, and a posttest and evaluation were administered at the end of the session. After training session six, a mid-training evaluation was administered. We analyzed results from the quantitative questions on the assessments, pre- and post-tests, and evaluations. Results: CARES fellows' knowledge increased at follow-up (75% of questions answered correctly on average) compared with the baseline assessment (38% answered correctly on average); post-test scores were higher than pre-test scores in 9 out of 11 sessions. Fellows enjoyed the training and rated all sessions well on the evaluations. Conclusions: The CARES fellows training program was successful in participant satisfaction and in increasing community knowledge of public health, CBPR, and research methodology. Engaging and training community members in evidence-based public health research can develop an infrastructure for community–academic research partnerships. PMID:22982849

  18. QMRAspot: a tool for Quantitative Microbial Risk Assessment from surface water to potable water.

    PubMed

    Schijven, Jack F; Teunis, Peter F M; Rutjes, Saskia A; Bouwknegt, Martijn; de Roda Husman, Ana Maria

    2011-11-01

    In the Netherlands, a health based target for microbially safe drinking water is set at less than one infection per 10,000 persons per year. For the assessment of the microbial safety of drinking water, Dutch drinking water suppliers must conduct a Quantitative Microbial Risk Assessment (QMRA) at least every three years for the so-called index pathogens enterovirus, Campylobacter, Cryptosporidium and Giardia. In order to collect raw data in the proper format and to automate the process of QMRA, an interactive user-friendly computational tool, QMRAspot, was developed to analyze and conduct QMRA for drinking water produced from surface water. This paper gives a description of the raw data requirements for QMRA as well as a functional description of the tool. No extensive prior knowledge of QMRA modeling is required of the user, because QMRAspot provides guidance on the quantity, type and format of raw data and performs a complete analysis of the raw data to yield a risk outcome for drinking water consumption that can be compared with other production locations, a legislative standard or an acceptable health based target. The uniform approach promotes proper collection and usage of raw data, warrants the quality of the risk assessment, and enhances efficiency, i.e., less time is required. QMRAspot may facilitate QMRA for drinking water suppliers worldwide. The tool aids policy makers and other involved parties in formulating mitigation strategies, and in the prioritization and evaluation of effective preventive measures as an integral part of water safety plans.

  19. Assessing the toxic effects of ethylene glycol ethers using Quantitative Structure Toxicity Relationship models

    SciTech Connect

    Ruiz, Patricia; Mumtaz, Moiz; Gombar, Vijay

    2011-07-15

    Experimental determination of toxicity profiles consumes a great deal of time, money, and other resources. Consequently, businesses, societies, and regulators strive for reliable alternatives such as Quantitative Structure Toxicity Relationship (QSTR) models to fill gaps in toxicity profiles of compounds of concern to human health. The use of glycol ethers and their health effects have recently attracted the attention of international organizations such as the World Health Organization (WHO). The board members of Concise International Chemical Assessment Documents (CICAD) recently identified inadequate testing as well as gaps in toxicity profiles of ethylene glycol mono-n-alkyl ethers (EGEs). The CICAD board requested the ATSDR Computational Toxicology and Methods Development Laboratory to conduct QSTR assessments of certain specific toxicity endpoints for these chemicals. In order to evaluate the potential health effects of EGEs, CICAD proposed a critical QSTR analysis of the mutagenicity, carcinogenicity, and developmental effects of EGEs and other selected chemicals. We report here results of the application of QSTRs to assess rodent carcinogenicity, mutagenicity, and developmental toxicity of four EGEs: 2-methoxyethanol, 2-ethoxyethanol, 2-propoxyethanol, and 2-butoxyethanol and their metabolites. Neither mutagenicity nor carcinogenicity is indicated for the parent compounds, but these compounds are predicted to be developmental toxicants. The predicted toxicity effects were subjected to reverse QSTR (rQSTR) analysis to identify structural attributes that may be the main drivers of the developmental toxicity potential of these compounds.

  20. Quantitative Gait Measurement With Pulse-Doppler Radar for Passive In-Home Gait Assessment

    PubMed Central

    Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E.

    2014-01-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual-radar setup with one radar placed at foot level and the other at torso level is necessary. Excellent absolute agreement, with an intraclass correlation coefficient of 0.97, was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, they both have a systematic offset relative to the ground truth due to the walking direction with respect to the radar beam. The torso-level radar has better performance (9% offset on average) in speed estimation than the foot-level radar (13%–18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment. PMID:24771566

  1. Quantitative gait measurement with pulse-Doppler radar for passive in-home gait assessment.

    PubMed

    Wang, Fang; Skubic, Marjorie; Rantz, Marilyn; Cuddihy, Paul E

    2014-09-01

    In this paper, we propose a pulse-Doppler radar system for in-home gait assessment of older adults. A methodology has been developed to extract gait parameters, including walking speed and step time, using Doppler radar. The gait parameters have been validated against a Vicon motion capture system in the lab with 13 participants and 158 test runs. The study revealed that for optimal step recognition and walking speed estimation, a dual-radar setup with one radar placed at foot level and the other at torso level is necessary. Excellent absolute agreement, with an intraclass correlation coefficient of 0.97, was found for step time estimation with the foot-level radar. For walking speed, although both radars show excellent consistency, they both have a systematic offset relative to the ground truth due to the walking direction with respect to the radar beam. The torso-level radar has better performance (9% offset on average) in speed estimation than the foot-level radar (13%-18% offset). Quantitative analysis has been performed to compute the angles causing the systematic error. These lab results demonstrate the capability of the system to be used as a daily gait assessment tool in home environments, useful for fall risk assessment and other health care applications. The system is currently being tested in an unstructured home environment.
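    The direction-dependent offset both records describe follows from the standard pulse-Doppler relation; this is textbook radar physics, not an equation quoted from the paper. With carrier frequency $f_0$, propagation speed $c$, and angle $\theta$ between the walking direction and the radar beam, the Doppler shift and the resulting speed estimate are

    $$ f_d = \frac{2 f_0 v \cos\theta}{c} \qquad\Rightarrow\qquad \hat{v} = \frac{c\, f_d}{2 f_0 \cos\theta}. $$

    An estimator that assumes $\theta = 0$ under-reads the true speed by the factor $\cos\theta$; offsets of 9-18% correspond to angles of roughly 25-35 degrees, consistent with the geometry of torso-level versus foot-level mounting.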

  2. A rapid, non-invasive procedure for quantitative assessment of drought survival using chlorophyll fluorescence

    PubMed Central

    Woo, Nick S; Badger, Murray R; Pogson, Barry J

    2008-01-01

    Background Analysis of survival is commonly used as a means of comparing the performance of plant lines under drought. However, the assessment of plant water status during such studies typically involves detachment to estimate water shock, imprecise methods of estimation or invasive measurements such as osmotic adjustment that influence or annul further evaluation of a specimen's response to drought. Results This article presents a procedure for rapid, inexpensive and non-invasive assessment of the survival of soil-grown plants during drought treatment. The changes in major photosynthetic parameters during increasing water deficit were monitored via chlorophyll fluorescence imaging and the selection of the maximum efficiency of photosystem II (Fv/Fm) parameter as the most straightforward and practical means of monitoring survival is described. The veracity of this technique is validated through application to a variety of Arabidopsis thaliana ecotypes and mutant lines with altered tolerance to drought or reduced photosynthetic efficiencies. Conclusion The method presented here allows the acquisition of quantitative numerical estimates of Arabidopsis drought survival times that are amenable to statistical analysis. Furthermore, the required measurements can be obtained quickly and non-invasively using inexpensive equipment and with minimal expertise in chlorophyll fluorometry. This technique enables the rapid assessment and comparison of the relative viability of germplasm during drought, and may complement detailed physiological and water relations studies. PMID:19014425

  3. Quantitative assessment of reactive hyperemia using laser speckle contrast imaging at multiple wavelengths

    NASA Astrophysics Data System (ADS)

    Young, Anthony; Vishwanath, Karthik

    2016-03-01

    Reactive hyperemia refers to the increase of blood flow in tissue after release of an occlusion in the local vasculature. Measuring the temporal response of reactive hyperemia post-occlusion has the potential to yield information about microvascular diseases such as systemic sclerosis and diabetes. Laser speckle contrast imaging (LSCI) is an imaging technique capable of sensing superficial blood flow in tissue, which can be used to quantitatively assess reactive hyperemia. Here, we employ LSCI using coherent sources at blue, green and red wavelengths to evaluate reactive hyperemia in healthy human volunteers. Blood flow in the forearms of the subjects was measured using LSCI to assess the time-course of reactive hyperemia triggered by a pressure cuff applied to the biceps. Raw speckle images were acquired and processed to yield blood-flow parameters from a region of interest before, during and after the occlusion. Reactive hyperemia was quantified via two measures: (1) the difference between the peak LSCI flow during hyperemia and the baseline flow, and (2) the time elapsed between release of the occlusion and the peak flow. These measurements were acquired in three healthy human participants under the three laser wavelengths employed. The studies shed light on the utility of in vivo LSCI-based flow sensing for non-invasive assessment of reactive hyperemia responses and on how the choice of source wavelength influences the measured parameters.
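    The two measures reduce to straightforward arithmetic on a flow time series. The sketch below is a minimal illustration, assuming a sampled LSCI flow trace with known cuff inflation and release times; the argument names are illustrative, not from the paper.

```python
import numpy as np

def hyperemia_metrics(flow: np.ndarray, t: np.ndarray,
                      t_occlude: float, t_release: float):
    """Compute the two measures described above from an LSCI flow
    trace: (1) peak-minus-baseline flow and (2) time from cuff
    release to peak flow. Baseline is taken before cuff inflation."""
    baseline = flow[t < t_occlude].mean()      # pre-occlusion baseline flow
    post = t >= t_release                      # post-release samples
    i_peak = int(np.argmax(flow[post]))
    peak_minus_baseline = flow[post][i_peak] - baseline
    time_to_peak = t[post][i_peak] - t_release
    return peak_minus_baseline, time_to_peak
```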

  4. Quantitative Microbial Risk Assessment in Occupational Settings Applied to the Airborne Human Adenovirus Infection

    PubMed Central

    Carducci, Annalaura; Donzelli, Gabriele; Cioni, Lorenzo; Verani, Marco

    2016-01-01

    Quantitative Microbial Risk Assessment (QMRA) methodology, which has already been applied to drinking water and food safety, may also be applied to risk assessment and management at the workplace. The present study developed a preliminary QMRA model to assess microbial risk that is associated with inhaling bioaerosols that are contaminated with human adenovirus (HAdV). This model has been applied to air contamination data from different occupational settings, including wastewater systems, solid waste landfills, and toilets in healthcare settings and offices, with different exposure times. Virological monitoring showed the presence of HAdVs in all the evaluated settings, thus confirming that HAdV is widespread, but with different average concentrations of the virus. The QMRA results, based on these concentrations, showed that toilets had the highest probability of viral infection, followed by wastewater treatment plants and municipal solid waste landfills. Our QMRA approach in occupational settings is novel, and certain caveats should be considered. Nonetheless, we believe it is worthy of further discussions and investigations. PMID:27447658

  5. Quantitative risk assessment for skin sensitisation: consideration of a simplified approach for hair dye ingredients.

    PubMed

    Goebel, Carsten; Diepgen, Thomas L; Krasteva, Maya; Schlatter, Harald; Nicolas, Jean-Francois; Blömeke, Brunhilde; Coenraads, Pieter Jan; Schnuch, Axel; Taylor, James S; Pungier, Jacquemine; Fautz, Rolf; Fuchs, Anne; Schuh, Werner; Gerberick, G Frank; Kimber, Ian

    2012-12-01

    With the availability of the local lymph node assay, and the ability to evaluate effectively the relative skin sensitizing potency of contact allergens, a model for quantitative-risk-assessment (QRA) has been developed. This QRA process comprises: (a) determination of a no-expected-sensitisation-induction-level (NESIL), (b) incorporation of sensitization-assessment-factors (SAFs) reflecting variations between subjects, product use patterns and matrices, and (c) estimation of the consumer-exposure-level (CEL). Based on these elements, an acceptable-exposure-level (AEL) can be calculated by dividing the NESIL of the product by the individual SAFs. Finally, the AEL is compared with the CEL to judge the risk to human health. We propose a simplified approach to the risk assessment of hair dye ingredients by making use of precise experimental product exposure data. This data set provides firmly established dose/unit area concentrations under relevant consumer use conditions, referred to as the measured-exposure-level (MEL). For that reason, a direct comparison is possible between the NESIL and the MEL as a proof-of-concept quantification of the risk of skin sensitization. This is illustrated here by reference to two specific hair dye ingredients, p-phenylenediamine and resorcinol. Comparison of these robust and toxicologically relevant values is therefore considered an improvement versus a hazard-based classification of hair dye ingredients.
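    In symbols (a standard rendering of the QRA arithmetic the abstract describes; the SAF subscripts are generic labels, not the authors' notation):

    $$ \mathrm{AEL} = \frac{\mathrm{NESIL}}{\mathrm{SAF}_{\text{inter-individual}} \times \mathrm{SAF}_{\text{matrix}} \times \mathrm{SAF}_{\text{use}}} $$

    Exposure is judged acceptable when the consumer exposure estimate does not exceed this level, i.e. $\mathrm{CEL} \le \mathrm{AEL}$; the simplified approach substitutes the measured exposure level, comparing the MEL directly against the NESIL.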

  6. Quantitative assessment of resilience of a water supply system under rainfall reduction due to climate change

    NASA Astrophysics Data System (ADS)

    Amarasinghe, Pradeep; Liu, An; Egodawatta, Prasanna; Barnes, Paul; McGree, James; Goonetilleke, Ashantha

    2016-09-01

    A water supply system can be impacted by rainfall reduction due to climate change, thereby reducing its supply potential. This highlights the need to understand system resilience, which refers to the ability to maintain service under various pressures (or disruptions). Currently, the concept of resilience has not been widely applied in managing water supply systems. This paper proposed three technical resilience indicators to assess the resilience of a water supply system. A case study analysis was undertaken of the Water Grid system of Queensland, Australia, to showcase how the proposed indicators can be applied to assess resilience. The research outcomes confirmed that the resilience indicators are capable of identifying critical conditions in relation to water supply system operation, such as the maximum allowable rainfall reduction for the system to maintain its operation without failure. Additionally, the resilience indicators also provided useful insight into the sensitivity of the water supply system to a changing rainfall pattern in the context of climate change, which represents the system's stability when experiencing pressure. The study outcomes will help in the quantitative assessment of resilience and provide improved guidance to system operators to enhance the efficiency and reliability of a water supply system.

  7. Quantitative Assessment of Mouse Mammary Gland Morphology Using Automated Digital Image Processing and TEB Detection.

    PubMed

    Blacher, Silvia; Gérard, Céline; Gallez, Anne; Foidart, Jean-Michel; Noël, Agnès; Péqueux, Christel

    2016-04-01

    The assessment of rodent mammary gland morphology is widely used to study the molecular mechanisms driving breast development and to analyze the impact of various endocrine disruptors with putative pathological implications. In this work, we propose a methodology relying on fully automated digital image analysis, including image processing and quantification of the whole ductal tree as well as the terminal end buds. It allows accurate and objective measurement of both growth parameters and fine morphological glandular structures. Mammary gland elongation was characterized by 2 parameters: the length and the epithelial area of the ductal tree. Ductal tree fine structures were characterized by: 1) branch end-point density, 2) branching density, and 3) branch length distribution. The proposed methodology was compared with quantification methods classically used in the literature. This procedure can be implemented in a range of software packages and thus be widely used by scientists studying rodent mammary gland morphology.
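    Branch end-points and branch points of a ductal tree can be counted from a binary mask with a standard skeleton-neighbour rule. The sketch below is a generic morphometric recipe using scikit-image and scipy, not the authors' exact pipeline.

```python
import numpy as np
from scipy.ndimage import convolve
from skimage.morphology import skeletonize

def branch_metrics(mask: np.ndarray) -> tuple:
    """Count branch end-points and branching points on a binary mask
    of the ductal tree. On the 1-pixel-wide skeleton, end-points have
    exactly 1 neighbour and branch points have 3 or more."""
    skel = skeletonize(mask > 0)
    kernel = np.ones((3, 3), dtype=int)
    kernel[1, 1] = 0                              # count neighbours only
    nbrs = convolve(skel.astype(int), kernel, mode="constant")
    endpoints = int(np.sum(skel & (nbrs == 1)))
    branchpoints = int(np.sum(skel & (nbrs >= 3)))
    return endpoints, branchpoints
```

    Dividing both counts by the epithelial area of the tree yields densities comparable to the end-point and branching densities named above.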

  8. Markov chain Monte Carlo estimation of a multiparameter decision model: consistency of evidence and the accurate assessment of uncertainty.

    PubMed

    Ades, A E; Cliffe, S

    2002-01-01

    Decision models are usually populated 1 parameter at a time, with 1 item of information informing each parameter. Often, however, data may not be available on the parameters themselves but on several functions of parameters, and there may be more items of information than there are parameters to be estimated. The authors show how in these circumstances all the model parameters can be estimated simultaneously using Bayesian Markov chain Monte Carlo methods. Consistency of the information and/or the adequacy of the model can also be assessed within this framework. Statistical evidence synthesis using all available data should result in more precise estimates of parameters and functions of parameters, and is compatible with the emphasis currently placed on systematic use of evidence. To illustrate this, WinBUGS software is used to estimate a simple 9-parameter model of the epidemiology of HIV in women attending prenatal clinics, using information on 12 functions of parameters, and to thereby compute the expected net benefit of 2 alternative prenatal testing strategies, universal testing and targeted testing of high-risk groups. The authors demonstrate improved precision of estimates, and lower estimates of the expected value of perfect information, resulting from the use of all available data.
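    A toy version of this simultaneous-estimation idea fits in a few lines: below, two parameters are informed by three data items, one of which bears on a function (the product) of both. The counts, priors, and random-walk Metropolis sampler are illustrative stand-ins for the paper's 9-parameter WinBUGS model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three data items (successes y out of n) informing two parameters:
# one on p1, one on p2, and one on the function p1 * p2.
data = [(120, 1000, lambda p: p[0]),
        (45,  500,  lambda p: p[1]),
        (14,  800,  lambda p: p[0] * p[1])]

def log_post(p):
    """Log-posterior with flat priors on (0, 1) and binomial likelihoods."""
    if not (0 < p[0] < 1 and 0 < p[1] < 1):
        return -np.inf
    lp = 0.0
    for y, n, f in data:
        q = f(p)
        lp += y * np.log(q) + (n - y) * np.log(1 - q)
    return lp

# Random-walk Metropolis: all parameters updated jointly from all data.
p, lp, chain = np.array([0.5, 0.5]), -np.inf, []
for _ in range(20000):
    prop = p + rng.normal(0, 0.02, size=2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        p, lp = prop, lp_prop
    chain.append(p.copy())
print(np.mean(chain[5000:], axis=0))   # posterior means after burn-in
```

    Because the third item constrains the product, both parameters end up more precisely estimated than either marginal data item alone would allow, which is the paper's central point about evidence synthesis.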

  9. Assessment of the extended Koopmans' theorem for the chemical reactivity: Accurate computations of chemical potentials, chemical hardnesses, and electrophilicity indices.

    PubMed

    Yildiz, Dilan; Bozkaya, Uğur

    2016-01-30

    The extended Koopmans' theorem (EKT) provides a straightforward way to compute ionization potentials and electron affinities at any level of theory. Although it is widely applied to ionization potentials, the EKT approach has not previously been applied to the evaluation of chemical reactivity. We present the first benchmarking study to investigate the performance of the EKT methods for predictions of chemical potentials (μ) (hence electronegativities), chemical hardnesses (η), and electrophilicity indices (ω). We assess the performance of the EKT approaches for post-Hartree-Fock methods, such as Møller-Plesset perturbation theory, the coupled-electron pair theory, and their orbital-optimized counterparts, for the evaluation of chemical reactivity. In particular, results of the orbital-optimized coupled-electron pair theory method (with the aug-cc-pVQZ basis set) for predictions of chemical reactivity are very promising; the corresponding mean absolute errors are 0.16, 0.28, and 0.09 eV for μ, η, and ω, respectively.
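    For reference, the three reactivity indices follow from the ionization potential (IP) and electron affinity (EA) via the standard finite-difference expressions of conceptual DFT (one common convention; some authors include an extra factor of 1/2 in the hardness):

    $$ \mu \approx -\frac{\mathrm{IP} + \mathrm{EA}}{2}, \qquad \eta \approx \mathrm{IP} - \mathrm{EA}, \qquad \omega = \frac{\mu^2}{2\eta}. $$

    Electronegativity is $\chi = -\mu$, which is why EKT ionization potentials and electron affinities suffice to benchmark all three indices.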

  10. Assessing the Expected Impact of Global Health Treaties: Evidence From 90 Quantitative Evaluations

    PubMed Central

    Røttingen, John-Arne

    2015-01-01

    We assessed what impact can be expected from global health treaties on the basis of 90 quantitative evaluations of existing treaties on trade, finance, human rights, conflict, and the environment. It appears treaties consistently succeed in shaping economic matters and consistently fail in achieving social progress. There are at least 3 differences between these domains that point to design characteristics that new global health treaties can incorporate to achieve positive impact: (1) incentives for those with power to act on them; (2) institutions designed to bring edicts into effect; and (3) interests advocating their negotiation, adoption, ratification, and domestic implementation. Experimental and quasiexperimental evaluations of treaties would provide more information about what can be expected from this type of global intervention. PMID:25393196

  11. Quantitative Framework for Retrospective Assessment of Interim Decisions in Clinical Trials.

    PubMed

    Stanev, Roger

    2016-11-01

    This article presents a quantitative way of modeling the interim decisions of clinical trials. While statistical approaches tend to focus on the epistemic aspects of statistical monitoring rules, often overlooking ethical considerations, ethical approaches tend to neglect the key epistemic dimension. The proposal is a second-order decision-analytic framework. The framework provides means for retrospective assessment of interim decisions based on a clear and consistent set of criteria that combines both ethical and epistemic considerations. The framework is broadly Bayesian and addresses a fundamental question behind many concerns about clinical trials: What does it take for an interim decision (e.g., whether to stop the trial or continue) to be a good decision? Simulations illustrating the modeling of interim decisions counterfactually are provided.

  12. Quantitative indices for the assessment of the repeatability of distortion product otoacoustic emissions in laboratory animals.

    PubMed

    Parazzini, Marta; Galloni, Paolo; Brazzale, Alessandra R; Tognola, Gabriella; Marino, Carmela; Ravazzani, Paolo

    2006-08-01

    Distortion product otoacoustic emissions (DPOAE) can be used to study cochlear function in an objective and non-invasive manner. One practical and essential aspect of any investigative measure is the consistency of its results upon repeated testing of the same individual/animal (i.e., its test/retest repeatability). The goal of the present work is to propose two indices to quantitatively assess the repeatability of DPOAE in laboratory animals. The methodology is illustrated using two data sets consisting of DPOAE collected repeatedly from Sprague-Dawley rats. The results of these experiments showed that the proposed indices are capable of estimating both the repeatability of the true emission level and the inconsistencies associated with measurement error. These indices could be a significantly useful tool to identify real and even small changes in cochlear function exerted by potential ototoxic agents.
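    The paper's two indices are not spelled out in this record, so as a stand-in the sketch below computes the classical one-way random-effects intraclass correlation ICC(1,1), a standard test/retest repeatability index for repeated measurements on the same animals.

```python
import numpy as np

def icc_1_1(x: np.ndarray) -> float:
    """One-way random-effects ICC(1,1) for an (n_subjects, k_repeats)
    matrix of repeated measurements. A generic repeatability index,
    not necessarily the paper's own; values near 1 mean the true
    emission level dominates measurement error."""
    n, k = x.shape
    grand = x.mean()
    msb = k * np.sum((x.mean(axis=1) - grand) ** 2) / (n - 1)              # between-subject
    msw = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))  # within-subject
    return (msb - msw) / (msb + (k - 1) * msw)
```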

  13. A quantitative assessment of using the Kinect for Xbox 360 for respiratory surface motion tracking

    NASA Astrophysics Data System (ADS)

    Alnowami, M.; Alnwaimi, B.; Tahavori, F.; Copland, M.; Wells, K.

    2012-02-01

    This paper describes a quantitative assessment of the Microsoft Kinect for Xbox 360™ for potential application in tracking respiratory and body motion in diagnostic imaging and external beam radiotherapy; the results can also be applied in many other biomedical settings. We consider the performance of the Kinect under controlled conditions and find millimetre precision at depths of 0.8-1.5 m. We also demonstrate the use of the Kinect for monitoring respiratory motion of the anterior surface. To improve the performance of respiratory monitoring, we fit a spline model of the chest surface through the depth data as a method of marker-less monitoring of respiratory motion. In addition, a comparison between the Kinect camera (with and without a zoom lens) and a marker-based system was used to evaluate the accuracy of the Kinect as a respiratory tracking system.
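    As a simpler baseline than the spline model described above, a respiratory waveform can be recovered by averaging depth over a fixed chest region per frame. The sketch below is illustrative (the ROI, array layout, and function name are assumptions, not the authors' code).

```python
import numpy as np

def respiration_trace(depth_frames: np.ndarray,
                      roi: tuple) -> np.ndarray:
    """Recover a respiratory waveform from a stack of Kinect depth
    frames of shape (T, H, W) by averaging depth over a chest ROI
    (a pair of row/column slices) in each frame."""
    rows, cols = roi
    chest = depth_frames[:, rows, cols]
    return chest.reshape(len(depth_frames), -1).mean(axis=1)

# e.g. trace = respiration_trace(frames, (slice(100, 180), slice(120, 200)))
```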

  14. Quantitative Framework for Retrospective Assessment of Interim Decisions in Clinical Trials

    PubMed Central

    Stanev, Roger

    2016-01-01

    This article presents a quantitative way of modeling the interim decisions of clinical trials. While statistical approaches tend to focus on the epistemic aspects of statistical monitoring rules, often overlooking ethical considerations, ethical approaches tend to neglect the key epistemic dimension. The proposal is a second-order decision-analytic framework. The framework provides means for retrospective assessment of interim decisions based on a clear and consistent set of criteria that combines both ethical and epistemic considerations. The framework is broadly Bayesian and addresses a fundamental question behind many concerns about clinical trials: What does it take for an interim decision (e.g., whether to stop the trial or continue) to be a good decision? Simulations illustrating the modeling of interim decisions counterfactually are provided. PMID:27353825

  15. Decorticate spasticity: a re-examination using quantitative assessment in the primate.

    PubMed

    Tasker, R R; Gentili, F; Sogabe, K; Shanlin, M; Hawrylyshyn, P

    1975-08-01

    Decorticate spasticity in the squirrel monkey was chosen as a convenient laboratory model of spasticity, capable of quantitative assessment, upon which to evaluate various currently popular clinical spasmolytic measures. The effects of a wide variety of cortical lesions were studied, involving primary and supplementary motor, premotor and parietal cortex unilaterally and bilaterally, with muscle tone measured by the evoked integrated E.M.G. technique. Measurable spasticity resulted only if primary motor cortex was ablated bilaterally, usually but not always preferentially involving biceps brachii and quadriceps. The resulting postures were variable, offering no justification for the term "decorticate posture". The integrated evoked E.M.G. was proportional to the rate of stretch and chiefly phasic in type, as in hemiplegic man. Stereotactic dentatectomy resulted in a profound ipsilateral reduction of this spasticity, but was without effect in intercollicular or anemic decerebrate cats. The mechanisms of the spasticity and of the cerebellar effects are discussed.

  16. Quantitative assessment of the benefits of specific information technologies applied to clinical studies in developing countries.

    PubMed

    Avilés, William; Ortega, Oscar; Kuan, Guillermina; Coloma, Josefina; Harris, Eva

    2008-02-01

    Clinical studies and trials require access to large amounts of high-quality information in a timely manner, often daily. The integrated application of information technologies can greatly improve quality control as well as facilitate compliance with established standards such as Good Clinical Practice (GCP) and Good Laboratory Practice (GLP). We have customized and implemented a number of information technologies, such as personal digital assistants (PDAs), a geographic information system (GIS), and barcode and fingerprint scanning, to streamline a pediatric dengue cohort study in Managua, Nicaragua. Quantitative data were obtained to assess the actual contribution of each technology in relation to processing time, accuracy, real-time access to data, savings in consumable materials, and time to proficiency in training sessions. In addition to these specific advantages, the information technologies benefited not only the study itself but also numerous routine clinical and laboratory processes in the health center and laboratories of the Nicaraguan Ministry of Health.

  17. MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?

    SciTech Connect

    Giger, M; Petrick, N; Obuchowski, N; Kinahan, P

    2014-06-15

    The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.

  18. Quantitative assessment of intragenic receptor tyrosine kinase deletions in primary glioblastomas: their prevalence and molecular correlates.

    PubMed

    Kastenhuber, Edward R; Huse, Jason T; Berman, Samuel H; Pedraza, Alicia; Zhang, Jianan; Suehara, Yoshiyuki; Viale, Agnes; Cavatore, Magali; Heguy, Adriana; Szerlip, Nicholas; Ladanyi, Marc; Brennan, Cameron W

    2014-05-01

    Intragenic deletion is the most common form of activating mutation among receptor tyrosine kinases (RTK) in glioblastoma. However, these events are not detected by the conventional DNA sequencing methods commonly utilized for tumor genotyping. To comprehensively assess the frequency, distribution, and expression levels of common RTK deletion mutants in glioblastoma, we analyzed RNA from a set of 192 glioblastoma samples from The Cancer Genome Atlas for the expression of EGFRvIII, EGFRvII, EGFRvV (carboxyl-terminal deletion), and PDGFRAΔ8,9. These mutations were detected in 24%, 1.6%, 4.7%, and 1.6% of cases, respectively. Overall, 29% (55/189) of glioblastomas expressed at least one RTK intragenic deletion transcript in this panel. For EGFRvIII, samples were analyzed by both quantitative real-time PCR (QRT-PCR) and single mRNA molecule counting on the Nanostring nCounter platform. Nanostring proved to be highly sensitive, specific, and linear, with sensitivity comparable to or exceeding that of RNA-seq. We evaluated the prognostic significance and molecular correlates of RTK rearrangements. EGFRvIII was only detectable in tumors with focal amplification of the gene. Moreover, we found that EGFRvIII expression was not prognostic of poor outcome and that neither recurrent copy number alterations nor global changes in gene expression differentiate EGFRvIII-positive tumors from tumors with amplification of wild-type EGFR. The wide range of expression of mutant alleles and the co-expression of multiple EGFR variants suggest that quantitative RNA-based clinical assays will be important for assessing the relative expression of intragenic deletions as therapeutic targets and/or candidate biomarkers. To this end, we demonstrate the performance of the Nanostring assay on RNA derived from routinely collected formalin-fixed paraffin-embedded tissue.

  19. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  20. Databases applicable to quantitative hazard/risk assessment-Towards a predictive systems toxicology

    SciTech Connect

    Waters, Michael; Jackson, Marcus

    2008-11-15

    The Workshop on The Power of Aggregated Toxicity Data addressed the requirement for distributed databases to support quantitative hazard and risk assessment. The authors have conceived and constructed, with federal support, several databases that have been used in hazard identification and risk assessment. The first of these databases, the EPA Gene-Tox Database, was developed for the EPA Office of Toxic Substances by the Oak Ridge National Laboratory and is currently hosted by the National Library of Medicine. This public resource is based on the collaborative evaluation, by government, academia, and industry, of short-term tests for the detection of mutagens and presumptive carcinogens. The two-phased evaluation process resulted in more than 50 peer-reviewed publications on test system performance and a qualitative database on thousands of chemicals. Subsequently, the graphic and quantitative EPA/IARC Genetic Activity Profile (GAP) Database was developed in collaboration with the International Agency for Research on Cancer (IARC). A chemical database driven by consideration of the lowest effective dose, GAP has served IARC for many years in support of hazard classification of potential human carcinogens. The Toxicological Activity Profile (TAP) prototype database was patterned after GAP and utilized acute, subchronic, and chronic data from the Office of Air Quality Planning and Standards. TAP demonstrated the flexibility of the GAP format for air toxics, water pollutants and other environmental agents. The GAP format was also applied to developmental toxicants and was modified to represent quantitative results from the rodent carcinogen bioassay. More recently, the authors have constructed: 1) the NIEHS Genetic Alterations in Cancer (GAC) Database, which quantifies specific mutations found in cancers induced by environmental agents, and 2) the NIEHS Chemical Effects in Biological Systems (CEBS) Knowledgebase, which integrates genomic and other biological data including

  1. Assessment and Mission Planning Capability For Quantitative Aerothermodynamic Flight Measurements Using Remote Imaging

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas; Splinter, Scott; Daryabeigi, Kamran; Wood, William; Schwartz, Richard; Ross, Martin

    2008-01-01

    assessment study focused on increasing the probability of returning spatially resolved scientific/engineering thermal imagery. This paper provides an overview of the assessment task and the systematic approach designed to establish confidence in the ability of existing assets to reliably acquire, track and return global quantitative surface temperatures of the Shuttle during entry. A discussion of capability demonstration in support of a potential Shuttle boundary layer transition flight test is presented. Successful demonstration of a quantitative, spatially resolved, global temperature measurement on the proposed Shuttle boundary layer transition flight test could lead to potential future applications with hypersonic flight test programs within the USAF and DARPA, along with flight test opportunities supporting NASA's Project Constellation.

  2. Towards accurate assessments of CH4 and N2O soil-atmosphere exchange rates with the combination of automated systems and new detection techniques

    NASA Astrophysics Data System (ADS)

    Díaz-Pinés, E.; Wolf, B.; Kiese, R.; Butterbach-Bahl, K.

    2012-04-01

    Soils can be either a source or a sink of CH4 and N2O. Accurate assessment of CH4 and N2O soil-atmosphere exchange is necessary in order to estimate the contribution of soils to the global warming potential under current and future conditions. Soil-atmosphere exchange of both CH4 and N2O depends on a combination of soil temperature and soil moisture status, as well as on nutrient availability and various microbial processes. Measuring CH4 and N2O exchange is challenging due to, among other factors, high spatial ("hot spots") and temporal ("hot moments") heterogeneity in the emissions of these species. In addition, accurate determination of CH4 and N2O concentrations is still difficult. So far, this has prevented a full understanding and contributes to a high degree of uncertainty in the assessment of CH4 and N2O soil-atmosphere exchange rates across different ecosystems. Aiming at a deeper understanding of the role of the soil in the GHG balance, we have combined new laser spectroscopy detection techniques (Quantum Cascade Laser, QCL) with automatic and semi-automatic chamber measurement systems. Several applications are presented: a three-month field campaign in a poplar plantation in NE Romania demonstrated the feasibility of the QCL coupled with automatic chambers to accurately estimate soil-atmosphere GHG exchange at high time resolution with a very low detection limit. A new semi-automatic system with relatively low maintenance requirements was tested in a poplar plantation in SW Germany. This system cannot record fine-scale temporal variations of the GHG exchange; however, cumulative fluxes obtained with it were very close to those measured with a high-temporal-resolution automatic system. Within a climate change experiment in grassland ecosystems, an application of the QCL in combination with a robotized chamber
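    Chamber measurements like these convert the initial rate of concentration change inside the closed chamber into a flux. A standard static-chamber relation (general practice, not an equation quoted from this abstract) is

    $$ F = \left.\frac{dC}{dt}\right|_{t=0} \cdot \frac{V}{A}, $$

    where $C$ is the gas concentration (mass per volume) in the chamber headspace, $V$ the chamber volume, and $A$ the enclosed soil area; when $C$ is measured as a mole fraction, an ideal-gas density factor converts it to mass units first. The low detection limit of the QCL matters because $dC/dt$ must be resolved over the short closure times that avoid perturbing the soil-atmosphere gradient.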

  3. Computer-aided quantitative bone scan assessment of prostate cancer treatment response

    PubMed Central

    Brown, Matthew S.; Chu, Gregory H.; Kim, Hyun J.; Allen-Auerbach, Martin; Poon, Cheryce; Bridges, Juliette; Vidovic, Adria; Ramakrishna, Bharath; Ho, Judy; Morris, Michael J.; Larson, Steven M.; Scher, Howard I.; Goldin, Jonathan G.

    2012-01-01

    Objective: To develop and evaluate a computer-aided bone scan analysis technique to quantify changes in tumor burden and assess treatment effects in prostate cancer clinical trials. Methods: We have developed, and report on, a commercial fully automated computer-aided detection system. Using this system, scan images were intensity-normalized, and lesions were then identified and segmented by anatomic region-specific intensity thresholding. Detected lesions were compared against expert markings to assess the accuracy of the computer-aided detection system. The metrics Bone Scan Lesion Area, Bone Scan Lesion Intensity, and Bone Scan Lesion Count were calculated from the identified lesions, and their utility in assessing treatment effects was evaluated by analyzing before and after scans from metastatic castration-resistant prostate cancer patients: 10 treated and 10 untreated. In this study, patients were treated with cabozantinib, a MET/VEGF inhibitor that produces high rates of resolution of bone scan abnormalities. Results: Our automated computer-aided detection system identified bone lesion pixels with 94% sensitivity, 89% specificity, and 89% accuracy. Significant differences in changes from baseline were found between treated and untreated groups in all measurements derived by our system. The most significant measure, Bone Scan Lesion Area, showed a median (interquartile range) change from baseline at week 6 of 7.13% (27.61) in the untreated group compared with −73.76% (45.38) in the cabozantinib-treated group (P = 0.0003). Conclusions: Our system accurately and objectively identified and quantified metastases in bone scans, allowing for interpatient and intrapatient comparison. It demonstrates potential as an objective measurement of treatment effects, laying the foundation for validation against other clinically relevant outcome measures. PMID:22367858
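    The pipeline reduces to thresholding a normalized scan and summarizing the flagged pixels. The sketch below is a simplification under assumed inputs (a per-pixel threshold map encoding the region-specific cutoffs), not the commercial system's code.

```python
import numpy as np

def bone_scan_lesion_area(scan: np.ndarray, thresholds: np.ndarray) -> float:
    """Illustrative Bone Scan Lesion Area: after intensity
    normalization, flag pixels exceeding their anatomic-region-
    specific threshold and sum their area (here, in pixels)."""
    lesion_mask = scan > thresholds
    return float(lesion_mask.sum())

def pct_change_from_baseline(baseline: float, followup: float) -> float:
    """Percent change used to compare treated vs. untreated groups."""
    return 100.0 * (followup - baseline) / baseline
```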

  4. Quantitative safety assessment of air traffic control systems through system control capacity

    NASA Astrophysics Data System (ADS)

    Guo, Jingjing

    Quantitative Safety Assessments (QSA) are essential to safety benefit verification and to the regulation of developmental changes in safety-critical systems like Air Traffic Control (ATC) systems. Effective assessments are particularly desirable today for the safe implementation of revolutionary ATC overhauls like NextGen and SESAR. QSA of ATC systems is, however, challenged by system complexity and a lack of accident data. Extending the idea in the literature that "safety is a control problem", this research proposes to assess system safety from the control perspective, by quantifying a system's "control capacity". A system's safety performance correlates with this "control capacity" in the control of "safety critical processes". To examine this idea in QSA of ATC systems, a Control-capacity Based Safety Assessment Framework (CBSAF) is developed, which includes two control capacity metrics and a procedural method. The two metrics are Probabilistic System Control-capacity (PSC) and Temporal System Control-capacity (TSC); each addresses one aspect of a system's control capacity. The procedural method consists of three general stages: (I) identification of safety critical processes, (II) development of system control models, and (III) evaluation of system control capacity. The CBSAF was tested in two case studies. The first assesses an en-route collision avoidance scenario and compares three hypothetical configurations. The CBSAF was able to capture the uncoordinated behavior between two means of control, as was observed in a historic midair collision accident. The second case study compares CBSAF with an existing risk-based QSA method in assessing the safety benefits of introducing a runway incursion alert system. Similar conclusions are reached by the two methods, while the CBSAF has the advantage of simplicity and provides a new control-based perspective and interpretation to the assessments. The case studies are intended to investigate the

  5. Assessing quantitative resistance against Leptosphaeria maculans (phoma stem canker) in Brassica napus (oilseed rape) in young plants.

    PubMed

    Huang, Yong-Ju; Qi, Aiming; King, Graham J; Fitt, Bruce D L

    2014-01-01

    Quantitative resistance against Leptosphaeria maculans in Brassica napus is difficult to assess in young plants due to the long period of symptomless growth of the pathogen from the appearance of leaf lesions to the appearance of canker symptoms on the stem. By using doubled haploid (DH) lines A30 (susceptible) and C119 (with quantitative resistance), quantitative resistance against L. maculans was assessed in young plants in controlled environments at two stages: stage 1, growth of the pathogen along leaf veins/petioles towards the stem by leaf lamina inoculation; stage 2, growth in stem tissues to produce stem canker symptoms by leaf petiole inoculation. Two types of inoculum (ascospores; conidia) and three assessment methods (extent of visible necrosis; symptomless pathogen growth visualised using the GFP reporter gene; amount of pathogen DNA quantified by PCR) were used. In stage 1 assessments, significant differences were observed between lines A30 and C119 in area of leaf lesions, distance grown along veins/petioles assessed by visible necrosis or by viewing GFP and amount of L. maculans DNA in leaf petioles. In stage 2 assessments, significant differences were observed between lines A30 and C119 in severity of stem canker and amount of L. maculans DNA in stem tissues. GFP-labelled L. maculans spread more quickly from the stem cortex to the stem pith in A30 than in C119. Stem canker symptoms were produced more rapidly by using ascospore inoculum than by using conidial inoculum. These results suggest that quantitative resistance against L. maculans in B. napus can be assessed in young plants in controlled conditions. Development of methods to phenotype quantitative resistance against plant pathogens in young plants in controlled environments will help identification of stable quantitative resistance for control of crop diseases.

  6. Assessing Quantitative Resistance against Leptosphaeria maculans (Phoma Stem Canker) in Brassica napus (Oilseed Rape) in Young Plants

    PubMed Central

    Huang, Yong-Ju; Qi, Aiming; King, Graham J.; Fitt, Bruce D. L.

    2014-01-01

    Quantitative resistance against Leptosphaeria maculans in Brassica napus is difficult to assess in young plants due to the long period of symptomless growth of the pathogen from the appearance of leaf lesions to the appearance of canker symptoms on the stem. By using doubled haploid (DH) lines A30 (susceptible) and C119 (with quantitative resistance), quantitative resistance against L. maculans was assessed in young plants in controlled environments at two stages: stage 1, growth of the pathogen along leaf veins/petioles towards the stem by leaf lamina inoculation; stage 2, growth in stem tissues to produce stem canker symptoms by leaf petiole inoculation. Two types of inoculum (ascospores; conidia) and three assessment methods (extent of visible necrosis; symptomless pathogen growth visualised using the GFP reporter gene; amount of pathogen DNA quantified by PCR) were used. In stage 1 assessments, significant differences were observed between lines A30 and C119 in area of leaf lesions, distance grown along veins/petioles assessed by visible necrosis or by viewing GFP and amount of L. maculans DNA in leaf petioles. In stage 2 assessments, significant differences were observed between lines A30 and C119 in severity of stem canker and amount of L. maculans DNA in stem tissues. GFP-labelled L. maculans spread more quickly from the stem cortex to the stem pith in A30 than in C119. Stem canker symptoms were produced more rapidly by using ascospore inoculum than by using conidial inoculum. These results suggest that quantitative resistance against L. maculans in B. napus can be assessed in young plants in controlled conditions. Development of methods to phenotype quantitative resistance against plant pathogens in young plants in controlled environments will help identification of stable quantitative resistance for control of crop diseases. PMID:24454767

  7. Quantitative risk assessment of human salmonellosis in Canadian broiler chicken breast from retail to consumption.

    PubMed

    Smadi, Hanan; Sargeant, Jan M

    2013-02-01

    The current quantitative risk assessment model followed the framework proposed by the Codex Alimentarius to provide an estimate of the risk of human salmonellosis due to consumption of chicken breasts which were bought from Canadian retail stores and prepared in Canadian domestic kitchens. The model simulated the level of Salmonella contamination on chicken breasts throughout the retail-to-table pathway. The model used Canadian input parameter values, where available, to represent risk of salmonellosis. From retail until consumption, changes in the concentration of Salmonella on each chicken breast were modeled using equations for growth and inactivation. The model predicted an average of 318 cases of salmonellosis per 100,000 consumers per year. Potential reasons for this overestimation were discussed. A sensitivity analysis showed that concentration of Salmonella on chicken breasts at retail and food hygienic practices in private kitchens such as cross-contamination due to not washing cutting boards (or utensils) and hands after handling raw meat along with inadequate cooking contributed most significantly to the risk of human salmonellosis. The outcome from this model emphasizes that responsibility for protection from Salmonella hazard on chicken breasts is a shared responsibility. Data needed for a comprehensive Canadian Salmonella risk assessment were identified for future research.
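    A retail-to-table QMRA of this kind chains growth, inactivation, and dose-response stages in a Monte Carlo simulation. The sketch below is a minimal caricature with entirely hypothetical parameter values (the paper's Canadian inputs are not reproduced here); it shows the structure, not the published model.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 100_000                                    # simulated servings
log_retail = rng.normal(-1.0, 1.0, n)          # log10 CFU per breast at retail (illustrative)
growth = rng.uniform(0.0, 0.5, n)              # log10 growth during storage (illustrative)
cook_kill = rng.normal(6.0, 1.5, n)            # log10 reduction from cooking (illustrative)

dose = 10 ** (log_retail + growth - cook_kill)  # ingested dose per serving
r = 2.1e-4                                      # exponential dose-response parameter (illustrative)
p_ill = 1 - np.exp(-r * dose)                   # per-serving probability of illness

print("mean risk per serving:", p_ill.mean())
```

    Sensitivity of the output to each input (retail concentration, cross-contamination, cooking) is what identifies the shared responsibilities the abstract emphasizes.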

  8. Modelling bacterial growth in quantitative microbiological risk assessment: is it possible?

    PubMed

    Nauta, Maarten J

    2002-03-01

    Quantitative microbiological risk assessment (QMRA), predictive modelling and HACCP may be used as tools to increase food safety and can be integrated fruitfully for many purposes. However, when QMRA is applied for public health issues like the evaluation of the status of public health, existing predictive models may not be suited to model bacterial growth. In this context, precise quantification of risks is more important than in the context of food manufacturing alone. In this paper, the modular process risk model (MPRM) is briefly introduced as a QMRA modelling framework. This framework can be used to model the transmission of pathogens through any food pathway, by assigning one of six basic processes (modules) to each of the processing steps. Bacterial growth is one of these basic processes. For QMRA, models of bacterial growth need to be expressed in terms of probability, for example to predict the probability that a critical concentration is reached within a certain amount of time. In contrast, available predictive models are developed and validated to produce point estimates of population sizes and therefore do not fit with this requirement. Recent experience from a European risk assessment project is discussed to illustrate some of the problems that may arise when predictive growth models are used in QMRA. It is suggested that a new type of predictive models needs to be developed that incorporates modelling of variability and uncertainty in growth.
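    The probabilistic re-expression the abstract calls for can be illustrated in a few lines: instead of a point estimate of population size, return the probability that a critical concentration is reached within a given time when the growth rate varies between units. All numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def p_exceed(n0_log: float, n_crit_log: float, t_hours: float,
             mu_mean: float, mu_sd: float, n_sim: int = 50_000) -> float:
    """Probability that a critical log10 concentration is reached
    within t hours, with between-unit variability in the specific
    growth rate mu (log10 units per hour, truncated at zero)."""
    mu = np.maximum(rng.normal(mu_mean, mu_sd, n_sim), 0.0)
    return float(np.mean(n0_log + mu * t_hours >= n_crit_log))

# e.g. p_exceed(n0_log=2.0, n_crit_log=6.0, t_hours=24,
#               mu_mean=0.15, mu_sd=0.05)
```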

  9. Techniques for rapid quantitative assessment of activity levels in small-group tutorials.

    PubMed

    Prinz, J F; Yip, H Y; Tipoe, G L; Lucas, P W; Lenstrup, M

    1998-07-01

    Two techniques for the rapid quantitative analysis of student participation in small-group teaching were investigated. In the first approach an observer, who also acted as a 'critical friend', recorded the length of individual contributions using a computer keyboard as a simple timing device. In the second approach, small-group sessions were recorded with a portable stereophonic audiotape recorder, the teacher on one channel and all students on the other. A computer program produced an automated analysis of these small-group interactions by computing the relative amount of speech on each channel. Simple analyses produced automatically by the program revealed the overall style of the tutorial: variously 'mini-lectures' by teachers with very little participation by the student body, rapid 'question and answer' sessions with roughly equal teacher/student involvement, or 'mini-presentations' by students with the teacher offering sparse comments in the manner of a facilitator. By presenting results in a graphic format, teachers can be given rapid objective feedback on their teaching style. Coupled with short verbal/non-verbal quizzes at the end of tutorials and information from other assessments, the value of using levels of participation as a measure of the efficiency of such small-group sessions can itself be assessed.
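    The two-channel analysis amounts to framing the recording and thresholding per-channel energy. The sketch below assumes a (samples, 2) array with the teacher on channel 0 and students on channel 1; the frame length and activity threshold are illustrative, not from the paper.

```python
import numpy as np

def channel_activity(stereo: np.ndarray, fs: int,
                     frame_s: float = 0.5, thresh: float = 0.01) -> tuple:
    """Fraction of frames in which teacher (channel 0) and students
    (channel 1) are speaking, via framewise RMS energy against a
    fixed threshold."""
    n = int(fs * frame_s)
    usable = stereo[: len(stereo) // n * n]
    frames = usable.reshape(-1, n, 2)
    rms = np.sqrt((frames ** 2).mean(axis=1))   # (n_frames, 2)
    active = rms > thresh
    return float(active[:, 0].mean()), float(active[:, 1].mean())
```

    Plotting the two fractions per session gives the graphic teaching-style feedback described above (mini-lecture, question-and-answer, or student presentation).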

  10. Quantitative assessment of the probability of bluetongue virus overwintering by horizontal transmission: application to Germany

    PubMed Central

    2011-01-01

    Even though bluetongue virus (BTV) transmission is apparently interrupted during winter, bluetongue outbreaks often reappear in the next season (overwintering). Several mechanisms for BTV overwintering have been proposed, but to date, their relative importance remains unclear. In order to assess the probability of BTV overwintering by persistence in adult vectors, ruminants (through prolonged viraemia) or a combination of both, a quantitative risk assessment model was developed. Furthermore, the model allowed the role played by the residual number of vectors present during winter to be examined, and the effect of a proportion of Culicoides living inside buildings (endophilic behaviour) to be explored. The model was then applied to a real scenario: overwintering in Germany between 2006 and 2007. The results showed that the limited number of vectors active during winter seemed to allow the transmission of BTV during this period, and that while transmission was favoured by the endophilic behaviour of some Culicoides, its effect was limited. Even though transmission was possible, the likelihood of BTV overwintering by the mechanisms studied seemed too low to explain the observed re-emergence of the disease. Therefore, other overwintering mechanisms not considered in the model are likely to have played a significant role in BTV overwintering in Germany between 2006 and 2007. PMID:21314966

  11. Non-destructive assessment of human ribs mechanical properties using quantitative ultrasound.

    PubMed

    Mitton, David; Minonzio, Jean-Gabriel; Talmant, Maryline; Ellouz, Rafaa; Rongieras, Frédéric; Laugier, Pascal; Bruyère-Garnier, Karine

    2014-04-11

    Advanced finite element models of the thorax have been developed to study, for example, the effects of car crashes. While there is a need for material properties to parameterize such models, specific properties are largely missing. Non-destructive techniques applicable in vivo would, therefore, be of interest to support further development of thorax models. The only non-destructive technique available today to derive rib bone properties would be based on quantitative computed tomography that measures bone mineral density. However, this approach is limited by the radiation dose. Bidirectional ultrasound axial transmission was developed on long bones ex vivo and used to assess in vivo health status of the radius. However, it is currently unknown if the ribs are good candidates for such a measurement. Therefore, the goal of this study is to evaluate the relationship between ex vivo ultrasonic measurements (axial transmission) and the mechanical properties of human ribs to determine if the mechanical properties of the ribs can be quantified non-destructively. The results show statistically significant relationships between the ultrasonic measurements and mechanical properties of the ribs. These results are promising with respect to a non-destructive and non-ionizing assessment of rib mechanical properties. This ex vivo study is a first step toward in vivo studies to derive subject-specific rib properties.

  12. Disability adjusted life year (DALY): a useful tool for quantitative assessment of environmental pollution.

    PubMed

    Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan

    2015-04-01

    Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating the global and/or regional burden of diseases. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator to quantify the health impact of environmental pollution related to disease burden. Based on literature reviews, this article aims to give an overview of the applicable methodologies and research directions for using DALY as a tool for quantitative assessment of environmental pollution. With an introduction to the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. Regarding environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which require careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. It can be seen from existing studies that DALY is advantageous over conventional environmental impact assessment for the quantification and comparison of risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation for varied pollutants under varied circumstances before DALY calculation.
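
    In its basic form the indicator is the sum of years of life lost to premature mortality and years lived with disability, DALY = YLL + YLD. A minimal sketch of that arithmetic, without the age-weighting and discounting refinements some DALY formulations add (all numbers illustrative):

```python
def daly(deaths, life_expectancy_remaining, cases, disability_weight,
         duration_years):
    """DALY = YLL + YLD (basic form, no age weighting or discounting).

    YLL: years of life lost = deaths x standard remaining life expectancy.
    YLD: years lived with disability = incident cases x disability weight
         x average duration of the condition.
    """
    yll = deaths * life_expectancy_remaining
    yld = cases * disability_weight * duration_years
    return yll + yld

# Illustrative numbers only: 10 deaths losing 30 years each, plus
# 5,000 cases of a condition with disability weight 0.1 lasting 0.5 years.
print(daly(10, 30, 5000, 0.1, 0.5))  # 300 + 250 = 550 DALYs
```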

  13. Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment.

    PubMed

    David, S; Visvikis, D; Roux, C; Hatt, M

    2011-09-21

    In positron emission tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumor volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis for merging several PET acquisitions to assess tumor metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion, and on clinical datasets it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation of the evaluation, leading to more pertinent measurements. Future work will involve extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on the biological tumor volume definition for radiotherapy applications.

  14. Quantitative assessment of desertification in south of Iran using MEDALUS method.

    PubMed

    Sepehr, A; Hassanli, A M; Ekhtesasi, M R; Jamali, J B

    2007-11-01

    The main aim of this study was the quantitative assessment of the desertification process in the case study area of the Fidoye-Garmosht plain (Southern Iran). Based on the MEDALUS approach and the characteristics of the study area, a regional model was developed using GIS. Six main factors or indicators of desertification, including soil, climate, erosion, plant cover, groundwater and management, were considered for evaluation. Then several sub-indicators affecting the quality of each main indicator were identified. Based on the MEDALUS approach, each sub-indicator was quantified according to its quality and given a weighting of between 1.0 and 2.0. ArcGIS 9 was used to analyze and prepare the layers of quality maps, using the geometric mean to integrate the individual sub-indicator maps. In turn, the geometric mean of all six quality maps was used to generate a single desertification status map. Results showed that 12% of the area is classified as very severe, 81% as severe and 7% as moderately affected by desertification. In addition, the plant cover and groundwater indicators were the most important factors affecting the desertification process in the study area. The model developed may be used to assess the desertification process and distinguish areas sensitive to desertification in the study region and in regions with similar characteristics.
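
    The MEDALUS aggregation step is a per-pixel geometric mean of the quality layers. A minimal sketch with randomly generated layers standing in for the GIS rasters (the class cut-offs below are placeholders; the published MEDALUS tables define the real ones):

```python
import numpy as np

# Each quality layer is a raster of sub-indicator scores in [1.0, 2.0];
# random here for illustration, derived from GIS layers in the study.
rng = np.random.default_rng(1)
soil, climate, erosion, plant, groundwater, management = (
    rng.uniform(1.0, 2.0, (100, 100)) for _ in range(6))

layers = np.stack([soil, climate, erosion, plant, groundwater, management])

# MEDALUS combines n quality indicators with the geometric mean:
# DSI = (q1 * q2 * ... * qn) ** (1/n)
dsi = np.exp(np.log(layers).mean(axis=0))

# Classify with hypothetical cut-offs (0 = low ... 3 = very severe).
classes = np.digitize(dsi, [1.2, 1.4, 1.6])
print(np.bincount(classes.ravel(), minlength=4) / classes.size)
```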

  15. Quantitative microbial risk assessment of distributed drinking water using faecal indicator incidence and concentrations.

    PubMed

    van Lieverloo, J Hein M; Blokker, E J Mirjam; Medema, Gertjan

    2007-01-01

    Quantitative Microbial Risk Assessments (QMRA) have focused on drinking water system components upstream of distribution to customers, for nominal and event conditions. Yet some 15-33% of waterborne outbreaks are reported to be caused by contamination events in distribution systems. In the majority of these cases and probably in all non-outbreak contamination events, no pathogen concentration data was available. Faecal contamination events are usually detected or confirmed by the presence of E. coli or other faecal indicators, although the absence of this indicator is no guarantee of the absence of faecal pathogens. In this paper, the incidence and concentrations of various coliforms and sources of faecal contamination were used to estimate the possible concentrations of faecal pathogens and consequently the infection risks to consumers in event-affected areas. The results indicate that the infection risks may be very high, especially from Campylobacter and enteroviruses, but also that the uncertainties are very high. The high variability of pathogen to thermotolerant coliform ratios estimated in environmental samples severely limits the applicability of the approach described. Importantly, the highest ratios of enteroviruses to thermotolerant coliform were suggested from soil and shallow groundwaters, the most likely sources of faecal contamination that are detected in distribution systems. Epidemiological evaluations of non-outbreak faecal contamination of drinking water distribution systems and thorough tracking and characterisation of the contamination sources are necessary to assess the actual risks of these events.

  16. Multi-observation PET image analysis for patient follow-up quantitation and therapy assessment

    PubMed Central

    David, Simon; Visvikis, Dimitris; Roux, Christian; Hatt, Mathieu

    2011-01-01

    In Positron Emission Tomography (PET) imaging, an early therapeutic response is usually characterized by variations of semi-quantitative parameters restricted to the maximum SUV measured in PET scans during the treatment. Such measurements do not reflect overall tumour volume and radiotracer uptake variations. The proposed approach is based on multi-observation image analysis for merging several PET acquisitions to assess tumour metabolic volume and uptake variations. The fusion algorithm is based on iterative estimation using a stochastic expectation maximization (SEM) algorithm. The proposed method was applied to simulated and clinical follow-up PET images. We compared the multi-observation fusion performance to threshold-based methods proposed for the assessment of the therapeutic response based on functional volumes. On simulated datasets, the adaptive threshold applied independently on both images led to higher errors than the ASEM fusion and on the clinical datasets, it failed to provide coherent measurements for four patients out of seven due to aberrant delineations. The ASEM method demonstrated improved and more robust estimation of the evaluation, leading to more pertinent measurements. Future work will involve extending the methodology and applying it to clinical multi-tracer datasets in order to evaluate its potential impact on the biological tumour volume definition for radiotherapy applications. PMID:21846937

  17. Estimation of undiscovered deposits in quantitative mineral resource assessments: examples from Venezuela and Puerto Rico

    USGS Publications Warehouse

    Cox, D.P.

    1993-01-01

    Quantitative mineral resource assessments used by the United States Geological Survey are based on deposit models. These assessments consist of three parts: (1) selecting appropriate deposit models and delineating on maps areas permissive for each type of deposit; (2) constructing a grade-tonnage model for each deposit model; and (3) estimating the number of undiscovered deposits of each type. In this article, I focus on the estimation of undiscovered deposits using two methods: the deposit density method and the target counting method. In the deposit density method, estimates are made by analogy with well-explored areas that are geologically similar to the study area and that contain a known density of deposits per unit area. The deposit density method is useful for regions where there is little or no data. This method was used to estimate undiscovered low-sulfide gold-quartz vein deposits in Venezuela. Estimates can also be made by counting targets such as mineral occurrences, geophysical or geochemical anomalies, or exploration "plays" and by assigning to each target a probability that it represents an undiscovered deposit that is a member of the grade-tonnage distribution. This method is useful in areas where detailed geological, geophysical, geochemical, and mineral occurrence data exist. Using this method, porphyry copper-gold deposits were estimated in Puerto Rico. © 1993 Oxford University Press.
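
    The deposit density method is a direct proportionality: scale the deposits-per-area of a well-explored analog region to the permissive area delineated in the study region. A minimal sketch with illustrative numbers:

```python
def expected_undiscovered(permissive_area_km2, analog_deposits,
                          analog_area_km2, known_deposits=0):
    """Deposit-density estimate of undiscovered deposits.

    Scales the deposit density of a well-explored, geologically similar
    analog region to the permissive area of the study region, then
    subtracts deposits already known there. Numbers are illustrative.
    """
    density = analog_deposits / analog_area_km2   # deposits per km^2
    return permissive_area_km2 * density - known_deposits

# e.g. analog terrane: 20 deposits in 50,000 km^2; study area: 30,000 km^2
# with 3 known deposits -> about 9 undiscovered deposits expected.
print(expected_undiscovered(30_000, 20, 50_000, known_deposits=3))
```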

  18. Tracking Epidermal Nerve Fiber Changes in Asian Macaques: Tools and Techniques for Quantitative Assessment.

    PubMed

    Mangus, Lisa M; Dorsey, Jamie L; Weinberg, Rachel L; Ebenezer, Gigi J; Hauer, Peter; Laast, Victoria A; Mankowski, Joseph L

    2016-08-01

    Quantitative assessment of epidermal nerve fibers (ENFs) has become a widely used clinical tool for the diagnosis of small fiber neuropathies such as diabetic neuropathy and human immunodeficiency virus-associated sensory neuropathy (HIV-SN). To model and investigate the pathogenesis of HIV-SN using simian immunodeficiency virus (SIV)-infected Asian macaques, we adapted the skin biopsy and immunostaining techniques currently employed in human patients and then developed two unbiased image analysis techniques for quantifying ENF in macaque footpad skin. This report provides detailed descriptions of these tools and techniques for ENF assessment in macaques and outlines important experimental considerations that we have identified in the course of our long-term studies. Although initially developed for studies of HIV-SN in the SIV-infected macaque model, these methods could be readily translated to a range of studies involving peripheral nerve degeneration and neurotoxicity in nonhuman primates as well as preclinical investigations of agents aimed at neuroprotection and regeneration.

  19. Quantitative assessment of human and pet exposure to Salmonella associated with dry pet foods.

    PubMed

    Lambertini, Elisabetta; Buchanan, Robert L; Narrod, Clare; Ford, Randall M; Baker, Robert C; Pradhan, Abani K

    2016-01-04

    Recent Salmonella outbreaks associated with dry pet foods and treats highlight the importance of these foods as previously overlooked exposure vehicles for both pets and humans. In the last decade efforts have been made to raise the safety of this class of products, for instance by upgrading production equipment, cleaning protocols, and finished product testing. However, no comprehensive or quantitative risk profile is available for pet foods, thus limiting the ability to establish safety standards and assess the effectiveness of current and proposed Salmonella control measures. This study sought to develop an ingredients-to-consumer quantitative microbial exposure assessment model to: 1) estimate pet and human exposure to Salmonella via dry pet food, and 2) assess the impact of industry and household-level mitigation strategies on exposure. Data on prevalence and concentration of Salmonella in pet food ingredients, production process parameters, bacterial ecology, and contact transfer in the household were obtained through literature review, industry data, and targeted research. A probabilistic Monte Carlo modeling framework was developed to simulate the production process and basic household exposure routes. Under the range of assumptions adopted in this model, human exposure due to handling pet food is null to minimal if contamination occurs exclusively before extrusion. Exposure increases considerably if recontamination occurs post-extrusion during coating with fat, although mean ingested doses remain modest even at high fat contamination levels, due to the low percentage of fat in the finished product. Exposure is highly variable, with the distribution of doses ingested by adult pet owners spanning 3 log CFU per exposure event. Child exposure due to ingestion of 1 g of pet food leads to significantly higher doses than adult doses associated with handling the food. Recontamination after extrusion and coating, e.g., via dust or equipment surfaces, may also lead to

  20. Differences in quantitative assessment of myocardial scar and gray zone by LGE-CMR imaging using established gray zone protocols.

    PubMed

    Mesubi, Olurotimi; Ego-Osuala, Kelechi; Jeudy, Jean; Purtilo, James; Synowski, Stephen; Abutaleb, Ameer; Niekoop, Michelle; Abdulghani, Mohammed; Asoglu, Ramazan; See, Vincent; Saliaris, Anastasios; Shorofsky, Stephen; Dickfeld, Timm

    2015-02-01

    Late gadolinium enhancement cardiac magnetic resonance (LGE-CMR) imaging is the gold standard for myocardial scar evaluation. Heterogeneous areas of scar ('gray zone') may serve as arrhythmogenic substrate. Various gray zone protocols have been correlated to clinical outcomes and ventricular tachycardia channels. This study assessed the quantitative differences in gray zone and scar core sizes as defined by previously validated signal intensity (SI) threshold algorithms. High-quality LGE-CMR images from 41 cardiomyopathy patients [ischemic (33) or non-ischemic (8)] were analyzed using previously validated SI threshold methods [Full Width at Half Maximum (FWHM), n-standard deviation (NSD) and modified-FWHM]. Myocardial scar was defined as scar core and gray zone using SI thresholds based on these methods. Scar core, gray zone and total scar sizes were then computed and compared among these models. The median gray zone mass was 2-3 times larger with FWHM (15 g, IQR: 8-26 g) compared to NSD or modified-FWHM (5 g, IQR: 3-9 g; and 8 g, IQR: 6-12 g, respectively, p < 0.001). Conversely, infarct core mass was 2.3 times larger with NSD (30 g, IQR: 17-53 g) versus FWHM and modified-FWHM (13 g, IQR: 7-23 g, p < 0.001). The gray zone extent (percentage of total scar that was gray zone) also varied significantly among the three methods: 51 % (IQR: 42-61 %), 17 % (IQR: 11-21 %) versus 38 % (IQR: 33-43 %) for FWHM, NSD and modified-FWHM, respectively (p < 0.001). Considerable variability exists among the current methods for MRI-defined gray zone and scar core. Infarct core and total myocardial scar mass also differ using these methods. Further evaluation of the most accurate quantification method is needed.
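
    The SI-threshold algorithms being compared reduce to simple cut-offs over the myocardial signal intensities. A minimal sketch of FWHM-style and NSD-style core/gray-zone masks (the gray-zone bounds and the number of standard deviations vary between the published protocols; the values below are placeholders):

```python
import numpy as np

def scar_masks(si, remote_mean, remote_sd, n_sd=5.0):
    """Core/gray-zone masks from LGE signal intensities (one slice).

    FWHM: core >= 50% of maximum SI; gray zone between 35% and 50% of
          maximum (a commonly used variant; exact bounds differ by study).
    NSD:  core >= remote_mean + n_sd * remote_sd; gray zone between
          2 SD and n_sd SD above remote myocardium.
    `si` is a 2D array of myocardial signal intensities; `remote_*`
    describe healthy remote myocardium. Thresholds here are illustrative.
    """
    peak = si.max()
    fwhm_core = si >= 0.5 * peak
    fwhm_gray = (si >= 0.35 * peak) & ~fwhm_core

    nsd_core = si >= remote_mean + n_sd * remote_sd
    nsd_gray = (si >= remote_mean + 2.0 * remote_sd) & ~nsd_core
    return fwhm_core, fwhm_gray, nsd_core, nsd_gray
```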

  1. Groundwater availability in the United States: the value of quantitative regional assessments

    USGS Publications Warehouse

    Dennehy, Kevin F.; Reilly, Thomas E.; Cunningham, William L.

    2015-01-01

    The sustainability of water resources is under continued threat from the challenges associated with a growing population, competing demands, and a changing climate. Freshwater scarcity has become a fact in many areas. Much of the United States' surface-water supply is fully apportioned for use; thus, in some areas the only potential alternative freshwater source that can provide the needed quantities is groundwater. Although frequently overlooked, groundwater serves as the principal reserve of freshwater in the US and represents much of the potential supply during periods of drought. Some nations have requirements to monitor and characterize the availability of groundwater, such as the European Union's Water Framework Directive (EPCEU 2000). In the US there is no such national requirement. Quantitative regional groundwater availability assessments, however, are essential to document the status and trends of groundwater availability for the US and make informed water-resource decisions possible now and in the future. Barthel (2014) highlighted that the value of regional groundwater assessments goes well beyond just quantifying the resource so that it can be better managed. The tools and techniques required to evaluate these unique regional systems advance the science of hydrogeology and provide enhanced methods that can benefit local-scale groundwater investigations. In addition, a significant yet under-utilized benefit is the digital spatial and temporal data sets routinely generated as part of these studies. Even though there is no legal or regulatory requirement for regional groundwater assessments in the US, there is a logical basis for their implementation. The purpose of this essay is to articulate the rationale for and reaffirm the value of regional groundwater assessments, primarily in the US; however, the arguments hold for all nations. The importance of the data sets and the methods and model development that occur as part of these assessments is stressed.

  2. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456

  3. Assessing the quality of conformal treatment planning: a new tool for quantitative comparison.

    PubMed

    Menhel, J; Levin, D; Alezra, D; Symon, Z; Pfeffer, R

    2006-10-21

    We develop a novel radiotherapy plan comparison index, the critical organ scoring index (COSI), which is a measure of both target coverage and critical organ overdose. COSI is defined as COSI = 1 - (V_{OAR>tol} / TC), where V_{OAR>tol} is the fraction of the organ-at-risk volume receiving more than the tolerance dose, and TC is the target coverage, TC = V_{T,PI} / V_{T}, where V_{T,PI} is the target volume receiving at least the prescription dose and V_{T} is the total target volume. COSI approaches unity when the critical structure is completely spared and the target coverage is unity. We propose a two-dimensional, graphical representation of COSI versus conformity index (CI), where CI is a measure of normal tissue overdose. We show that this 2D representation is a reliable, visual quantitative tool for evaluating competing plans. We generate COSI-CI plots for three sites: head and neck, cavernous sinus, and pancreas, and evaluate competing non-coplanar 3D and IMRT treatment plans. For all three sites this novel 2D representation assisted the physician in choosing the optimal plan, both in terms of target coverage and in terms of critical organ sparing. We verified each choice by analysing individual DVHs and isodose lines. Comparing our results to the widely used conformation number, we found that in all cases where there were discrepancies in the choice of the best treatment plan, the COSI-CI choice was considered the correct one, in several cases indicating that a non-coplanar 3D plan was superior to the IMRT plans. The choice of plan was quick, simple and accurate using the new graphical representation.
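
    The index itself is a one-line computation once the dose-volume quantities are extracted from a plan's DVH. A minimal sketch (the volume inputs and example numbers are illustrative):

```python
def cosi(v_oar_over_tol, oar_volume, target_in_pi, target_volume):
    """Critical organ scoring index, COSI = 1 - (V_OAR>tol / TC).

    V_OAR>tol: fraction of the organ-at-risk volume above tolerance dose.
    TC:        target coverage = V_T,PI / V_T.
    Volumes may be in any consistent unit (e.g. cc or voxel counts).
    """
    v_frac = v_oar_over_tol / oar_volume
    tc = target_in_pi / target_volume
    return 1.0 - v_frac / tc

# Perfect sparing (no OAR volume over tolerance) with full coverage -> 1.0
print(cosi(0.0, 30.0, 60.0, 60.0))
# 3 cc of a 30 cc OAR over tolerance, 54/60 cc of target covered -> 0.889
print(round(cosi(3.0, 30.0, 54.0, 60.0), 3))
```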

  4. Qualitative and Quantitative Assessment of Hepatitis A Virus in Wastewaters in Tunisia.

    PubMed

    Béji-Hamza, A; Khélifi-Gharbi, H; Hassine-Zaafrane, M; Della Libera, S; Iaconelli, M; Muscillo, M; Petricca, S; Ciccaglione, A R; Bruni, R; Taffon, S; Equestre, M; Aouni, M; La Rosa, G

    2014-12-01

    Hepatitis A causes substantial morbidity in both industrialized and non-industrialized countries and represents an important health problem in several southern Mediterranean countries. The objectives of the study were as follows: (a) to assess the occurrence of hepatitis A virus (HAV) in Tunisia through the monitoring of urban wastewaters collected at wastewater treatment plants (WTPs); (b) to characterize environmental strains; and (c) to estimate the viral load in raw and treated sewage, in order to evaluate the potential impact on surface waters receiving discharges. A total of 150 raw and treated wastewater samples were collected from three WTPs and analyzed by both qualitative (RT-PCR/nested) and quantitative (qRT-PCR) methods. Of these, 100 (66%) were found to be positive for HAV by the qualitative assay: 68.3% of influents and 64.7% of effluents. The vast majority of HAV sequences belonged to sub-genotype IA; 11 of the distinct strains detected were identical to clinical strains isolated from Tunisian patients with acute hepatitis. Five unique variants were also detected that had not previously been reported in clinical cases. Only two IB strains were found, confirming the rarity of this sub-genotype in this country. The results of the present study indicate a wide circulation of the pathogen in the population, most probably in the form of asymptomatic infections, a finding consistent with the classification of the country as having intermediate/high endemicity. Quantitative data showed high viral loads in influents (3.5 × 10⁵ genome copies/liter, mean value) as well as effluents (2.5 × 10⁵ genome copies/liter, mean value), suggesting that contaminated water could be a critical element in transmission.

  5. Quantitative assessment of inhalation exposure and deposited dose of aerosol from nanotechnology-based consumer sprays

    PubMed Central

    Nazarenko, Yevgen; Lioy, Paul J.; Mainelis, Gediminas

    2015-01-01

    This study provides a quantitative assessment of inhalation exposure and deposited aerosol dose in the 14 nm to 20 μm particle size range based on the aerosol measurements conducted during realistic usage simulation of five nanotechnology-based and five regular spray products matching the nano-products by purpose of application. The products were also examined using transmission electron microscopy. In seven out of ten sprays, the highest inhalation exposure was observed for the coarse (2.5–10 μm) particles while being minimal or below the detection limit for the remaining three sprays. Nanosized aerosol particles (14–100 nm) were released, which resulted in low but measurable inhalation exposures from all of the investigated consumer sprays. Eight out of ten products produced high total deposited aerosol doses on the order of 10¹–10³ ng kg⁻¹ bw per application, ~85–88% of which were in the head airways, only <10% in the alveolar region and <8% in the tracheobronchial region. One nano and one regular spray produced substantially lower total deposited doses (by 2–4 orders of magnitude less), only ~52–64% of which were in the head while ~29–40% in the alveolar region. The electron microscopy data showed nanosized objects in some products not labeled as nanotechnology-based and conversely did not find nano-objects in some nano-sprays. We found no correlation between nano-object presence and abundance as per the electron microscopy data and the determined inhalation exposures and deposited doses. The findings of this study and the reported quantitative exposure data will be valuable for the manufacturers of nanotechnology-based consumer sprays to minimize inhalation exposure from their products, as well as for the regulators focusing on protecting the public health. PMID:25621175

  6. Quick, non-invasive and quantitative assessment of small fiber neuropathy in patients receiving chemotherapy.

    PubMed

    Saad, Mehdi; Psimaras, Dimitri; Tafani, Camille; Sallansonnet-Froment, Magali; Calvet, Jean-Henri; Vilier, Alice; Tigaud, Jean-Marie; Bompaire, Flavie; Lebouteux, Marie; de Greslan, Thierry; Ceccaldi, Bernard; Poirier, Jean-Michel; Ferrand, François-Régis; Le Moulec, Sylvestre; Huillard, Olivier; Goldwasser, François; Taillia, Hervé; Maisonobe, Thierry; Ricard, Damien

    2016-04-01

    Chemotherapy-induced peripheral neurotoxicity (CIPN) is a common, potentially severe and dose-limiting adverse effect; however, it is poorly investigated at an early stage due to the lack of a simple assessment tool. As sweat glands are innervated by small autonomic C-fibers, sudomotor function testing has been suggested for early screening of peripheral neuropathy. This study aimed to evaluate Sudoscan, a non-invasive and quantitative method to assess sudomotor function, in the detection and follow-up of CIPN. Eighty-eight patients receiving at least two infusions of Oxaliplatin only (45.4%), Paclitaxel only (14.8%), another drug only (28.4%) or two drugs (11.4%) were enrolled in the study. At each chemotherapy infusion the accumulated dose of chemotherapy was calculated and the Total Neuropathy Score clinical version (TNSc) was carried out. Small fiber neuropathy was assessed using Sudoscan (a 3-min test). The device measures the Electrochemical Skin Conductance (ESC) of the hands and feet expressed in microSiemens (µS). For patients receiving Oxaliplatin, mean hands ESC changed from 73 ± 2 to 63 ± 2 µS and feet ESC from 77 ± 2 to 66 ± 3 µS (p < 0.001), while TNSc changed from 2.9 ± 0.5 to 4.3 ± 0.4. Similar results were observed in patients receiving Paclitaxel or another neurotoxic chemotherapy. During the follow-up, ESC values of both hands and feet with a corresponding TNSc < 2 were 70 ± 2 and 73 ± 2 µS respectively, while they were 59 ± 1.4 and 64 ± 1.5 µS with a corresponding TNSc ≥ 6 (p < 0.0001 and p = 0.0003 respectively). This preliminary study suggests that small fiber neuropathy could be screened and followed using Sudoscan in patients receiving chemotherapy.

  7. [High resolution peripheral quantitative computed tomography for the assessment of morphological and mechanical bone parameters].

    PubMed

    Fuller, Henrique; Fuller, Ricardo; Pereira, Rosa Maria R

    2015-01-01

    High resolution peripheral quantitative computed tomography (HR-pQCT) is a new technology commercially available for less than 10 years that allows performing in vivo assessment of bone parameters. HR-pQCT assesses the trabecular thickness, trabecular separation, trabecular number and connectivity density and, in addition, cortical bone density and thickness and total bone volume and density in high-definition mode, which additionally allows obtaining digital constructs of bone microarchitecture. The application of mathematics to captured data, a method called finite element analysis (FEA), allows the estimation of the physical properties of the tissue, simulating supported loads in a non-invasive way. Thus, HR-pQCT simultaneously acquires data previously provided separately by dual energy x-ray absorptiometry (DXA), magnetic resonance imaging and histomorphometry, aggregating biomechanical estimates previously only possible in extracted tissues. This method has a satisfactory reproducibility, with coefficients of variation rarely exceeding 3%. Regarding accuracy, the method shows a fair to good agreement (r² = 0.37–0.97). The main clinical application of this method is in the quantification and monitoring of metabolic bone disorders, more fully evaluating bone strength and fracture risk. In rheumatoid arthritis patients, this allows gauging the number and size of erosions and cysts, in addition to joint space. In osteoarthritis, it is possible to characterize the bone marrow edema-like areas that show a correlation with cartilage breakdown. Given its high cost, HR-pQCT is still a research tool, but the high resolution and efficiency of this method reveal advantages over the methods currently used for bone assessment, with a potential to become an important tool in clinical practice.

  8. Extrapolating cetacean densities to quantitatively assess human impacts on populations in the high seas.

    PubMed

    Mannocci, Laura; Roberts, Jason J; Miller, David L; Halpin, Patrick N

    2016-10-24

    As human activities expand beyond national jurisdictions to the high seas, there is increasing need to consider anthropogenic impacts to species that inhabit these waters. The current scarcity of scientific observations of cetaceans in the high seas impedes the assessment of population-level impacts of these activities. This study is directed towards an important management need in the high seas: the development of plausible density estimates to facilitate a quantitative assessment of anthropogenic impacts on cetacean populations in these waters. Our study region extends from a well-surveyed region within the United States Exclusive Economic Zone into a large region of the western North Atlantic sparsely surveyed for cetaceans. We modeled densities of 15 cetacean taxa using available line transect survey data and habitat covariates and extrapolated predictions to sparsely surveyed regions. We formulated models carefully to reduce the extent of extrapolation beyond covariate ranges, and constrained them to model simple and generalizable relationships. To evaluate confidence in the predictions, we performed several qualitative assessments, such as mapping where predictions were made outside sampled covariate ranges, and comparing them with maps of sightings from a variety of sources that could not be integrated into our models. Our study revealed a range of confidence levels for the model results depending on the taxon and geographic area, and highlights the need for additional surveying in environmentally distinct areas. Combined with their explicit confidence levels and necessary caution, our density estimates can inform a variety of management needs in the high seas, such as the quantification of potential cetacean interactions with military training exercises, shipping, fisheries, deep-sea mining, as well as delineation of areas of special biological significance in international waters. Our approach is generally applicable to other marine taxa and geographic

  9. A simplified method for quantitative assessment of the relative health and safety risk of environmental management activities

    SciTech Connect

    Eide, S.A.; Smith, T.H.; Peatross, R.G.; Stepan, I.E.

    1996-09-01

    This report presents a simplified method to assess the health and safety risk of Environmental Management activities of the US Department of Energy (DOE). The method applies to all types of Environmental Management activities including waste management, environmental restoration, and decontamination and decommissioning. The method is particularly useful for planning or tradeoff studies involving multiple conceptual options because it combines rapid evaluation with a quantitative approach. The method is also potentially applicable to risk assessments of activities other than DOE Environmental Management activities if rapid quantitative results are desired.

  10. Quantitative in Situ Assessment of the Somatostatin Receptor in Breast Cancer to Assess Response to Targeted Therapy With 111-In-Pentetreotide

    DTIC Science & Technology

    2006-05-01

    [Garbled DTIC report-form metadata; recoverable content follows.] Title: Quantitative in situ Assessment of the Somatostatin Receptor in Breast Cancer to Assess Response to Targeted Therapy with 111-In-Pentetreotide. Principal Investigator: Gina G... 31 Mar 2006. Abstract fragment: Somatostatin (SST) is a peptide hormone implicated in the growth and progression of cancers, and SSTR2 is the predominant receptor subtype expressed [truncated]

  11. Quantitative in situ Assessment of the Somatostatin Receptor in Breast Cancer to Assess Response to Targeted Therapy with 111-in-Pentetreotide

    DTIC Science & Technology

    2007-05-01

    [Garbled DTIC report-form metadata; recoverable content follows.] Title: Quantitative in situ Assessment of the Somatostatin Receptor in Breast Cancer to Assess Response to Targeted Therapy with 111-In-Pentetreotide. Principal Investigator: Gina G... Abstract fragment: Somatostatin (SST) is a peptide hormone implicated in the growth and progression of cancers, and SSTR2 is the predominant receptor subtype expressed in [truncated]

  12. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI

    PubMed Central

    Magnoni, Sandra; Mac Donald, Christine L.; Esparza, Thomas J.; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z.; Shimony, Joshua S.; Triulzi, Fabio; Stocchetti, Nino

    2015-01-01

    white matter regions. We interpret this result to mean that both microdialysis and diffusion tensor magnetic resonance imaging accurately reflect the same pathophysiological process: traumatic axonal injury. This cross-validation increases confidence in both methods for the clinical assessment of axonal injury. However, neither microdialysis nor diffusion tensor magnetic resonance imaging have been validated versus post-mortem histology in humans. Furthermore, future work will be required to determine the prognostic significance of these assessments of traumatic axonal injury when combined with other clinical and radiological measures. PMID:26084657

  13. Quantitative assessments of traumatic axonal injury in human brain: concordance of microdialysis and advanced MRI.

    PubMed

    Magnoni, Sandra; Mac Donald, Christine L; Esparza, Thomas J; Conte, Valeria; Sorrell, James; Macrì, Mario; Bertani, Giulio; Biffi, Riccardo; Costa, Antonella; Sammons, Brian; Snyder, Abraham Z; Shimony, Joshua S; Triulzi, Fabio; Stocchetti, Nino; Brody, David L

    2015-08-01

    regions. We interpret this result to mean that both microdialysis and diffusion tensor magnetic resonance imaging accurately reflect the same pathophysiological process: traumatic axonal injury. This cross-validation increases confidence in both methods for the clinical assessment of axonal injury. However, neither microdialysis nor diffusion tensor magnetic resonance imaging have been validated versus post-mortem histology in humans. Furthermore, future work will be required to determine the prognostic significance of these assessments of traumatic axonal injury when combined with other clinical and radiological measures.

  14. Combined visual and semi-quantitative assessment of ¹²³I-FP-CIT SPECT for the diagnosis of dopaminergic neurodegenerative diseases.

    PubMed

    Ueda, Jun; Yoshimura, Hajime; Shimizu, Keiji; Hino, Megumu; Kohara, Nobuo

    2017-04-07

    Visual and semi-quantitative assessments of ¹²³I-FP-CIT single-photon emission computed tomography (SPECT) are useful for the diagnosis of dopaminergic neurodegenerative diseases (dNDD), including Parkinson's disease, dementia with Lewy bodies, progressive supranuclear palsy, multiple system atrophy, and corticobasal degeneration. However, the diagnostic value of combined visual and semi-quantitative assessment in dNDD remains unclear. Among 239 consecutive patients with a newly diagnosed possible parkinsonian syndrome who underwent ¹²³I-FP-CIT SPECT in our medical center, 114 patients with a disease duration less than 7 years were diagnosed as dNDD with the established criteria or as non-dNDD according to clinical judgment. We retrospectively examined their clinical characteristics and visual and semi-quantitative assessments of ¹²³I-FP-CIT SPECT. The striatal binding ratio (SBR) was used as a semi-quantitative measure of ¹²³I-FP-CIT SPECT. We calculated the sensitivity and specificity of visual assessment alone, semi-quantitative assessment alone, and combined visual and semi-quantitative assessment for the diagnosis of dNDD. SBR was correlated with visual assessment. Some dNDD patients with a normal visual assessment had an abnormal SBR, and vice versa. There was no statistically significant difference between sensitivity of the diagnosis with visual assessment alone and semi-quantitative assessment alone (91.2 vs. 86.8%, respectively, p = 0.29). Combined visual and semi-quantitative assessment demonstrated superior sensitivity (96.7%) to visual assessment (p = 0.03) or semi-quantitative assessment (p = 0.003) alone with equal specificity. Visual and semi-quantitative assessments of ¹²³I-FP-CIT SPECT are helpful for the diagnosis of dNDD, and combined visual and semi-quantitative assessment shows superior sensitivity with equal specificity.

  15. Qualitative and quantitative assessment of degeneration of cervical intervertebral discs and facet joints.

    PubMed

    Walraevens, Joris; Liu, Baoge; Meersschaert, Joke; Demaerel, Philippe; Delye, Hans; Depreitere, Bart; Vander Sloten, Jos; Goffin, Jan

    2009-03-01

    Degeneration of intervertebral discs and facet joints is one of the most frequently encountered spinal disorders. In order to describe and quantify degeneration and evaluate a possible relationship between degeneration and biomechanical parameters, e.g., the intervertebral range of motion and intradiscal pressure, a scoring system for degeneration is mandatory. However, few scoring systems for the assessment of degeneration of the cervical spine exist. Therefore, two separate objective scoring systems to qualitatively and quantitatively assess the degree of cervical intervertebral disc and facet joint degeneration were developed and validated. The scoring system for cervical disc degeneration consists of three variables which are individually scored on neutral lateral radiographs: "height loss" (0-4 points), "anterior osteophytes" (0-3 points) and "endplate sclerosis" (0-2 points). The scoring system for facet joint degeneration consists of four variables which are individually scored on neutral computed tomography scans: "hypertrophy" (0-2 points), "osteophytes" (0-1 point), "irregularity" on the articular surface (0-1 point) and "joint space narrowing" (0-1 point). Each variable contributes with varying importance to the overall degeneration score (max 9 points for the scoring system of cervical disc degeneration and max 5 points for facet joint degeneration). Degeneration of 20 discs and facet joints of 20 patients was blindly assessed by four raters: two neurosurgeons (one senior and one junior) and two radiologists (one senior and one junior), first based on subjective impression and second using the scoring systems. Measurement errors and inter- and intra-rater agreement were determined. The measurement error of the scoring system for cervical disc degeneration was 11.1 versus 17.9% of the subjective impression results. This scoring system showed excellent intra-rater agreement (ICC = 0.86, 0.75-0.93) and excellent inter-rater agreement (ICC = 0

  16. Assessment of a quantitative metric for 4D CT artifact evaluation by observer consensus.

    PubMed

    Castillo, Sarah J; Castillo, Richard; Balter, Peter; Pan, Tinsu; Ibbott, Geoffrey; Hobbs, Brian; Yuan, Ying; Guerrero, Thomas

    2014-05-08

    The benefits of four-dimensional computed tomography (4D CT) are limited by the presence of artifacts that remain difficult to quantify. A correlation-based metric previously proposed for ciné 4D CT artifact identification was further validated as an independent artifact evaluator by using a novel qualitative assessment featuring a group of observers reaching a consensus decision on artifact location and magnitude. The consensus group evaluated ten ciné 4D CT scans for artifacts over each breathing phase of coronal lung views assuming one artifact per couch location. Each artifact was assigned a magnitude score of 1-5, 1 indicating lowest severity and 5 indicating highest severity. Consensus group results served as the ground truth for assessment of the correlation metric. The ten patients were split into two cohorts; cohort 1 generated an artifact identification threshold derived from receiver operating characteristic analysis using the Youden Index, while cohort 2 generated sensitivity and specificity values from application of the artifact threshold. The Pearson correlation coefficient was calculated between the correlation metric values and the consensus group scores for both cohorts. The average sensitivity and specificity values found with application of the artifact threshold were 0.703 and 0.476, respectively. The correlation coefficients of artifact magnitudes for cohort 1 and 2 were 0.80 and 0.61, respectively, (p < 0.001 for both); these correlation coefficients included a few scans with only two of the five possible magnitude scores. Artifact incidence was associated with breathing phase (p < 0.002), with presentation less likely near maximum exhale. Overall, the correlation metric allowed accurate and automated artifact identification. The consensus group evaluation resulted in efficient qualitative scoring, reduced interobserver variation, and provided consistent identification of artifact location and magnitudes.
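
    The ROC/Youden step in the two-cohort design is straightforward to reproduce. A minimal sketch, assuming larger metric values indicate artifacts (invert the score if the metric runs the other way) and using scikit-learn's ROC utilities:

```python
import numpy as np
from sklearn.metrics import roc_curve

def youden_threshold(labels, metric_values):
    """Pick the artifact-detection threshold maximizing the Youden index.

    `labels` are consensus artifact decisions (1 = artifact) and
    `metric_values` the correlation-based metric, per couch location.
    J = sensitivity + specificity - 1 = TPR - FPR, maximized over the ROC.
    """
    fpr, tpr, thresholds = roc_curve(labels, metric_values)
    return thresholds[np.argmax(tpr - fpr)]

# Cohort 1 sets the threshold; cohort 2 is then scored against it to
# report sensitivity/specificity, mirroring the two-cohort design.
```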

  17. A methodological approach to identify cheap and accurate indicators for biodiversity assessment: application to grazing management and two grassland bird species.

    PubMed

    Tichit, M; Barbottin, A; Makowski, D

    2010-06-01

    In response to environmental threats, numerous indicators have been developed to assess the impact of livestock farming systems on the environment. Some of them, notably those based on management practices, have been reported to have low accuracy. This paper reports the results of a study aimed at assessing whether accuracy can be increased at a reasonable cost by mixing individual indicators into models. We focused on proxy indicators representing an alternative to direct impact measurement on two grassland bird species, the lapwing Vanellus vanellus and the redshank Tringa totanus. Models were developed using stepwise selection procedures or Bayesian model averaging (BMA). Sensitivity, specificity, and probability of correctly ranking fields (area under the curve, AUC) were estimated for each individual indicator or model from observational data measured on 252 grazed plots during 2 years. The cost of implementation of each model was computed as a function of the number and types of input variables. Among all management indicators, 50% had an AUC lower than or equal to 0.50 and thus were not better than a random decision. Independently of the statistical procedure, models combining management indicators were always more accurate than individual indicators for lapwings only. In redshanks, models based either on BMA or some selection procedures were non-informative. Higher accuracy could be reached, for both species, with models mixing management and habitat indicators. However, this increase in accuracy was also associated with an increase in model cost. Models derived by BMA were more expensive and slightly less accurate than those derived with selection procedures. Analysing trade-offs between accuracy and cost of indicators opens promising application perspectives, as time-consuming and expensive indicators are likely to be of low practical utility.

  18. Linking quantitative microbial risk assessment and epidemiological data: informing safe drinking water trials in developing countries.

    PubMed

    Enger, Kyle S; Nelson, Kara L; Clasen, Thomas; Rose, Joan B; Eisenberg, Joseph N S

    2012-05-01

    Intervention trials are used extensively to assess household water treatment (HWT) device efficacy against diarrheal disease in developing countries. Using these data for policy, however, requires addressing issues of generalizability (relevance of one trial in other contexts) and systematic bias associated with design and conduct of a study. To illustrate how quantitative microbial risk assessment (QMRA) can address water safety and health issues, we analyzed a published randomized controlled trial (RCT) of the LifeStraw Family Filter in the Congo. The model accounted for bias due to (1) incomplete compliance with filtration, (2) unexpected antimicrobial activity by the placebo device, and (3) incomplete recall of diarrheal disease. Effectiveness was measured using the longitudinal prevalence ratio (LPR) of reported diarrhea. The Congo RCT observed an LPR of 0.84 (95% CI: 0.61, 1.14). Our model predicted LPRs, assuming a perfect placebo, ranging from 0.50 (2.5-97.5 percentile: 0.33, 0.77) to 0.86 (2.5-97.5 percentile: 0.68, 1.09) for high (but not perfect) and low (but not zero) compliance, respectively. The calibration step provided estimates of the concentrations of three pathogen types (modeled as diarrheagenic E. coli, Giardia, and rotavirus) in drinking water, consistent with the longitudinal prevalence of reported diarrhea measured in the trial, and constrained by epidemiological data from the trial. Use of a QMRA model demonstrated the importance of compliance in HWT efficacy, the need for pathogen data from source waters, the effect of quantifying biases associated with epidemiological data, and the usefulness of generalizing the effectiveness of HWT trials to other contexts.
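
    The compliance bias the model quantifies can be illustrated with a simple mixture: person-days split between filtered and unfiltered water. This toy calculation is not the paper's QMRA (which also handles placebo activity and recall bias), but it shows why incomplete compliance pulls the observed LPR toward 1:

```python
def observed_lpr(true_lpr, compliance):
    """Longitudinal prevalence ratio diluted by incomplete compliance.

    A simple mixture: on a fraction `compliance` of person-days the device
    is used (risk scaled by true_lpr); otherwise risk is unchanged. This is
    a toy version of the bias adjustment the QMRA model formalizes.
    """
    return compliance * true_lpr + (1.0 - compliance)

# A device that truly halves diarrheal prevalence looks much weaker
# when only 40% of drinking water is actually filtered:
print(observed_lpr(0.5, 1.0))   # 0.50 observed with perfect compliance
print(observed_lpr(0.5, 0.4))   # 0.80 observed with 40% compliance
```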

  19. Quantitative assessment of rotator cuff muscle elasticity: Reliability and feasibility of shear wave elastography.

    PubMed

    Hatta, Taku; Giambini, Hugo; Uehara, Kosuke; Okamoto, Seiji; Chen, Shigao; Sperling, John W; Itoi, Eiji; An, Kai-Nan

    2015-11-05

    Ultrasound imaging has been used to evaluate various shoulder pathologies, whereas, quantification of the rotator cuff muscle stiffness using shear wave elastography (SWE) has not been verified. The purpose of this study was to investigate the reliability and feasibility of SWE measurements for the quantification of supraspinatus (SSP) muscle elasticity. Thirty cadaveric shoulders (18 intact and 12 with torn rotator cuff) were used. Intra- and inter-observer reliability was evaluated on an established SWE technique for measuring the SSP muscle elasticity. To assess the effect of overlying soft tissues above the SSP muscle, SWE values were measured with the transducer placed on the skin, on the subcutaneous fat after removing the skin, on the trapezius muscle after removing the subcutaneous fat, and directly on the SSP muscle. In addition, SWE measurements on 4 shoulder positions (0°, 30°, 60°, and 90° abduction) were compared in those with/without rotator cuff tears. Intra- and inter-observer reliability of SWE measurements were excellent for all regions in SSP muscle. Also, removing the overlying soft tissue showed no significant difference on SWE values measured in the SSP muscle. The SSP muscle with 0° abduction showed large SWE values, whereas, shoulders with large-massive tear showed smaller variation throughout the adduction-abduction positions. SWE is a reliable and feasible tool for quantitatively assessing the SSP muscle elasticity. This study also presented SWE measurements on the SSP muscle under various shoulder abduction positions which might help characterize patterns in accordance to the size of rotator cuff tears.

  20. A Quantitative Microbiological Risk Assessment for Salmonella in Pigs for the European Union.

    PubMed

    Snary, Emma L; Swart, Arno N; Simons, Robin R L; Domingues, Ana Rita Calado; Vigre, Hakan; Evers, Eric G; Hald, Tine; Hill, Andrew A

    2016-03-01

    A farm-to-consumption quantitative microbiological risk assessment (QMRA) for Salmonella in pigs in the European Union has been developed for the European Food Safety Authority. The primary aim of the QMRA was to assess the impact of hypothetical reductions of slaughter-pig prevalence and the impact of control measures on the risk of human Salmonella infection. A key consideration during the QMRA development was the characterization of variability between E.U. Member States (MSs), and therefore a generic MS model was developed that accounts for differences in pig production, slaughterhouse practices, and consumption patterns. To demonstrate the parameterization of the model, four case study MSs were selected that illustrate the variability in production of pork meat and products across MSs. For the case study MSs the average probability of illness was estimated to be between 1 in 100,000 and 1 in 10 million servings given consumption of one of the three product types considered (pork cuts, minced meat, and fermented ready-to-eat sausages). Further analyses of the farm-to-consumption QMRA suggest that the vast majority of human risk derives from infected pigs with a high concentration of Salmonella in their feces (≥10⁴ CFU/g). Therefore, it is concluded that interventions should be focused on either decreasing the level of Salmonella in the feces of infected pigs, the introduction of a control step at the abattoir to reduce the transfer of feces to the exterior of the pig, or a control step to reduce the level of Salmonella on the carcass post-evisceration.

  1. A quantitative integrated assessment of pollution prevention achieved by integrated pollution prevention control licensing.

    PubMed

    Styles, David; O'Brien, Kieran; Jones, Michael B

    2009-11-01

    This paper presents an innovative, quantitative assessment of pollution avoidance attributable to environmental regulation enforced through integrated licensing, using Ireland's pharmaceutical-manufacturing sector as a case study. Emissions data reported by pharmaceutical installations were aggregated into a pollution trend using an Environmental Emissions Index (EEI) based on Lifecycle Assessment methodologies. Complete sectoral emissions data from 2001 to 2007 were extrapolated back to 1995, based on available data. Production volume data were used to derive a sectoral production index, and determine 'no-improvement' emission trends, whilst questionnaire responses from 20 industry representatives were used to quantify the contribution of integrated licensing to emission avoidance relative to these trends. Between 2001 and 2007, there was a 40% absolute reduction in direct pollution from 27 core installations, and 45% pollution avoidance relative to hypothetical 'no-improvement' pollution. It was estimated that environmental regulation avoided 20% of 'no-improvement' pollution, in addition to 25% avoidance under business-as-usual. For specific emissions, avoidance ranged from 14% and 30 kt a⁻¹ for CO₂ to 88% and 598 t a⁻¹ for SOx. Between 1995 and 2007, there was a 59% absolute reduction in direct pollution, and 76% pollution avoidance. Pollution avoidance was dominated by reductions in emissions of VOCs, SOx and NOx to air, and emissions of heavy metals to water. Pollution avoidance of 35% was attributed to integrated licensing, ranging from 8% and 2.9 t a⁻¹ for phosphorus emissions to water to 49% and 3143 t a⁻¹ for SOx emissions to air. Environmental regulation enforced through integrated licensing has been the major driver of substantial pollution avoidance achieved by Ireland's pharmaceutical sector - through emission limit values associated with Best Available Techniques, emissions monitoring and reporting requirements, and

  2. Quantitative Microbial Risk Assessment for Escherichia coli O157:H7 in Fresh-Cut Lettuce.

    PubMed

    Pang, Hao; Lambertini, Elisabetta; Buchanan, Robert L; Schaffner, Donald W; Pradhan, Abani K

    2017-02-01

    Leafy green vegetables, including lettuce, are recognized as potential vehicles for foodborne pathogens such as Escherichia coli O157:H7. Fresh-cut lettuce is potentially at high risk of causing foodborne illnesses, as it is generally consumed without cooking. Quantitative microbial risk assessments (QMRAs) are gaining more attention as an effective tool to assess and control potential risks associated with foodborne pathogens. This study developed a QMRA model for E. coli O157:H7 in fresh-cut lettuce and evaluated the effects of different potential intervention strategies on the reduction of public health risks. The fresh-cut lettuce production and supply chain was modeled from field production, with both irrigation water and soil as initial contamination sources, to consumption at home. The baseline model (with no interventions) predicted a mean probability of 1 illness per 10 million servings and a mean of 2,160 illness cases per year in the United States. All intervention strategies evaluated (chlorine, ultrasound and organic acid, irradiation, bacteriophage, and consumer washing) significantly reduced the estimated mean number of illness cases when compared with the baseline model prediction (from 11.4- to 17.9-fold reduction). Sensitivity analyses indicated that retail and home storage temperature were the most important factors affecting the predicted number of illness cases. The developed QMRA model provided a framework for estimating risk associated with consumption of E. coli O157:H7-contaminated fresh-cut lettuce and can guide the evaluation and development of intervention strategies aimed at reducing such risk.
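
    Structurally, such a QMRA chains sampled contamination levels through intervention reductions into a dose-response model and aggregates risk per serving. A compressed sketch of that pattern (the distributions and beta-Poisson parameters below are placeholders, not the study's fitted values):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 1_000_000  # simulated servings

# Illustrative farm-to-fork chain (log10 CFU/serving); not the paper's values.
log_dose = rng.normal(-4.0, 1.5, N)          # contamination at consumption
log_dose -= rng.uniform(0.0, 1.0, N)         # consumer washing (0-1 log kill)
dose = 10 ** log_dose

# Beta-Poisson dose-response, a common choice for E. coli O157:H7;
# alpha and beta here are placeholders, not fitted values.
alpha, beta = 0.16, 48.0
p_ill = 1 - (1 + dose / beta) ** (-alpha)

print(f"mean P(illness) per serving: {p_ill.mean():.2e}")
```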

  3. Quantitative assessment of infection risk from exposure to waterborne pathogens in urban floodwater.

    PubMed

    de Man, H; van den Berg, H H J L; Leenen, E J T M; Schijven, J F; Schets, F M; van der Vliet, J C; van Knapen, F; de Roda Husman, A M

    2014-01-01

    Flooding and heavy rainfall have been associated with outbreaks of waterborne infectious disease; however, it is unclear to what extent they pose a risk to public health. Here, risks of infection from exposure to urban floodwater were assessed using quantitative microbial risk assessment (QMRA). To that end, urban floodwaters were sampled in the Netherlands during 23 events in 2011 and 2012. The water contained Campylobacter jejuni (prevalence 61%, range 14 to >10³ MPN/l), Giardia spp. (35%, 0.1-142 cysts/l), Cryptosporidium (30%, 0.1-9.8 oocysts/l), noroviruses (29%, 10²-10⁴ pdu/l) and enteroviruses (35%, 10³-10⁴ pdu/l). Exposure data collected by questionnaire revealed that children swallowed 1.7 ml per exposure event (mean; 95% confidence interval 0-4.6 ml) and adults swallowed 0.016 ml (mean; 95% CI 0-0.068 ml) due to hand-mouth contact. The mean risk of infection per event for children exposed to floodwater originating from combined sewers, storm sewers and rainfall-generated surface runoff was 33%, 23% and 3.5%, respectively; for adults it was 3.9%, 0.58% and 0.039%. The annual risk of infection was calculated to compare flooding from different urban drainage systems. An exposure frequency of once every 10 years to flooding from combined sewers resulted in an annual risk of infection of 8%, equal to the annual risk from flooding caused by rainfall-generated surface runoff 2.3 times per year. These annual infection risks will increase, however, if urban flooding due to heavy rainfall becomes more frequent, as foreseen in climate change projections.
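
    Annualizing a per-event risk, as done above, conventionally assumes independent exposure events; the same formula handles fractional frequencies such as one flood per 10 years. A sketch (the published figures aggregate several pathogens, so these numbers are indicative only):

        def annual_risk(per_event_risk, events_per_year):
            # Independent events: P_annual = 1 - (1 - P_event) ** frequency.
            # Fractional frequencies encode rare events (once per 10 years -> 0.1).
            return 1.0 - (1.0 - per_event_risk) ** events_per_year

        # Children, per-event risks taken from the abstract (indicative use only):
        print(annual_risk(0.33, 0.1))    # combined-sewer flooding once per 10 years
        print(annual_risk(0.035, 2.3))   # surface-runoff flooding 2.3 times per year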

  4. The Strategic Environment Assessment bibliographic network: A quantitative literature review analysis

    SciTech Connect

    Caschili, Simone; De Montis, Andrea; Ganciu, Amedeo; Ledda, Antonio; Barra, Mario

    2014-07-01

    Academic literature has been growing at such a pace that it can be difficult to follow the progression of scientific achievements; hence the need for quantitative knowledge-support systems to analyze the literature of a subject. In this article we use network analysis tools to build a literature review of scientific documents published in the multidisciplinary field of Strategic Environment Assessment (SEA). The proposed approach helps researchers to build unbiased and comprehensive literature reviews. We collect information on 7662 SEA publications and build the SEA Bibliographic Network (SEABN), employing the basic idea that two publications are interconnected if one cites the other. We apply network analysis at the macroscopic (network architecture), mesoscopic (subgraph) and microscopic (node) levels in order to i) verify what network structure characterizes the SEA literature, ii) identify the authors, disciplines and journals that are contributing to the international discussion on SEA, and iii) scrutinize the most cited and important publications in the field. Results show that SEA is a multidisciplinary subject; the SEABN belongs to the class of real small-world networks, with a dominance of publications in Environmental studies over a total of 12 scientific sectors. Christopher Wood, Olivia Bina, Matthew Cashmore, and Andrew Jordan are found to be the leading authors, while Environmental Impact Assessment Review is by far the scientific journal with the highest number of publications in SEA studies. - Highlights: • We utilize network analysis to analyze scientific documents in the SEA field. • We build the SEA Bibliographic Network (SEABN) of 7662 publications. • We apply network analysis at macroscopic, mesoscopic and microscopic network levels. • We identify SEABN architecture, relevant publications, authors, subjects and journals.
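
    A citation-network analysis of this kind can be sketched with the networkx library (assumed available): edges point from citing to cited publication, clustering probes small-world structure, and in-degree ranks the most-cited papers. Toy data stand in for the 7662-publication SEABN:

        import networkx as nx

        # Toy citation network: an edge u -> v means publication u cites v.
        edges = [
            ("paperA", "paperB"), ("paperA", "paperC"), ("paperD", "paperB"),
            ("paperE", "paperB"), ("paperE", "paperC"), ("paperF", "paperA"),
        ]
        G = nx.DiGraph(edges)

        # Macroscopic level: clustering on the undirected skeleton (small-world check).
        print("avg clustering:", nx.average_clustering(G.to_undirected()))

        # Microscopic level: in-degree ranks the most-cited publications.
        most_cited = sorted(G.in_degree, key=lambda kv: kv[1], reverse=True)
        print("most cited:", most_cited[:3])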

  5. A quantitative methodology to assess the risks to human health from CO₂ leakage into groundwater

    NASA Astrophysics Data System (ADS)

    Siirila, Erica R.; Navarre-Sitchler, Alexis K.; Maxwell, Reed M.; McCray, John E.

    2012-02-01

    Leakage of CO₂ and associated gases into overlying aquifers as a result of geologic carbon capture and sequestration may have adverse impacts on aquifer drinking-water quality. Gas or aqueous-phase leakage may occur due to transport via faults and fractures, through faulty well bores, or through leaky confining materials. Contaminants of concern include aqueous salts and dissolved solids, gaseous or aqueous-phase organic contaminants, and acidic gas or aqueous-phase fluids that can liberate metals from aquifer minerals. Here we present a quantitative risk assessment framework to predict potential human health risk from CO₂ leakage into drinking-water aquifers. This framework incorporates the potential release of CO₂ into the drinking-water aquifer; mobilization of metals due to a decrease in pH; transport of these metals down gradient to municipal receptors; distribution of contaminated groundwater to multiple households; and exposure and health risk to individuals using this water for household purposes. Additionally, this framework is stochastic, incorporates detailed variations in geological and geostatistical parameters, and discriminates between uncertain and variable parameters using a two-stage, or nested, Monte Carlo approach. This approach is demonstrated using example simulations with hypothetical, yet realistic, aquifer characteristics and leakage scenarios. These example simulations show a greater risk for arsenic than for lead for both cancer and non-cancer endpoints, an unexpected finding. Higher background groundwater gradients also yield higher risk. The overall risk and the associated uncertainty are sensitive to the extent of aquifer stratification and the degree of local-scale dispersion. These results all highlight the importance of hydrologic modeling in risk assessment. A linear relationship between carcinogenic and noncarcinogenic risk was found for arsenic and suggests action levels for carcinogenic risk will be exceeded in exposure
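
    The two-stage, nested Monte Carlo separates uncertainty (sampled in an outer loop) from variability (sampled in an inner loop), so each outer draw yields a full distribution of individual risk. A minimal sketch with a hypothetical toy risk model in place of the hydrologic transport chain:

        import random

        def individual_risk(mean_leak, distance):
            # Stand-in for the transport + exposure + dose-response chain.
            return min(1.0, mean_leak * 1e-3 / (1.0 + distance))

        outer = []                                      # uncertainty loop
        for _ in range(200):
            mean_leak = random.lognormvariate(0.0, 0.5)     # uncertain parameter
            inner = []                                  # variability loop
            for _ in range(1000):
                distance = random.uniform(0.1, 10.0)        # variable parameter
                inner.append(individual_risk(mean_leak, distance))
            inner.sort()
            outer.append(inner[950])                    # 95th-percentile individual risk
        outer.sort()
        print("90% uncertainty band on 95th-percentile risk:",
              f"{outer[10]:.2e} .. {outer[190]:.2e}")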

  6. Digital tomosynthesis (DTS) for quantitative assessment of trabecular microstructure in human vertebral bone.

    PubMed

    Kim, Woong; Oravec, Daniel; Nekkanty, Srikant; Yerramshetty, Janardhan; Sander, Edward A; Divine, George W; Flynn, Michael J; Yeni, Yener N

    2015-01-01

    Digital tomosynthesis (DTS) provides slice images of an object using conventional radiographic methods with high in-plane resolution. The objective of this study was to explore the potential of DTS for describing the microstructural, stiffness and stress-distribution properties of vertebral cancellous bone. Forty vertebrae (T6, T8, T11, and L3) from 10 cadavers (63-90 years) were scanned using microCT and DTS. Anisotropy (μCT.DA) and the specimen-average and standard deviation of trabecular bone volume fraction (BV/TV), thickness (Tb.Th), number (Tb.N) and separation (Tb.Sp) were obtained using stereology. Apparent modulus (EFEM) and the magnitude (VMExp/σapp) and variability (VMCV) of trabecular stresses were calculated using microCT-based finite element modeling. Mean intercept length, line fraction deviation and fractal parameters were obtained from coronal DTS slices, then correlated with stereological and finite element parameters using linear regression models. Twenty-one of the 27 DTS parameters correlated with BV/TV, Tb.Th, Tb.N, Tb.Sp and/or μCT.DA (p<0.0001 to p<0.05). DTS parameters increased the variability explained in EFEM and VMCV (by 9-11% and 13-19%, respectively; p<0.0001 to p<0.04) over that explained by BV/TV alone. In conclusion, DTS has potential for quantitative assessment of cancellous bone and may be used as a modality complementary to those measuring bone mass for assessing spinal fracture risk.
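
    The incremental-variance claim (DTS parameters adding 9-19% explained variability over BV/TV alone) is a hierarchical regression comparison. A sketch on synthetic data, assuming numpy; the variable relationships are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 40                                          # one value per vertebra
        bvtv = rng.uniform(0.05, 0.25, n)               # bone volume fraction
        dts = 0.5 * bvtv + rng.normal(0, 0.02, n)       # a hypothetical DTS-derived parameter
        e_fem = 3.0 * bvtv + 1.5 * dts + rng.normal(0, 0.05, n)   # synthetic 'EFEM'

        def r_squared(predictors, y):
            X = np.column_stack([np.ones(len(y))] + predictors)
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            return 1 - np.var(y - X @ beta) / np.var(y)

        base = r_squared([bvtv], e_fem)
        full = r_squared([bvtv, dts], e_fem)
        print(f"R2: BV/TV alone {base:.2f}, with DTS parameter {full:.2f}")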

  7. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    PubMed

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate the casualty risk, combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, while the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of an estimate of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events in the event tree: age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S). Since the estimated probability of an intermediate event may carry large uncertainty, it is characterized by a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on Southeast Michigan work zone crash data is carried out. The numerical results show a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is reduced by 20%, but only a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, slowing down traffic is more effective than reducing ERT for casualty risk mitigation.
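
    The event-tree calculation multiplies the crash frequency by the conditional branch probabilities along each path to get scenario frequencies, which are then weighted by consequences. A minimal sketch with a reduced tree and illustrative probabilities (the paper's tree has seven intermediate events and treats uncertain branch probabilities as random variables):

        from itertools import product

        # Reduced event tree with illustrative branch probabilities.
        crash_freq = 50.0                        # work zone crashes per year (assumed)
        branches = {
            "severity": [("fatal", 0.01), ("injury", 0.30), ("pdo", 0.69)],
            "light": [("day", 0.7), ("night", 0.3)],
            "crash_type": [("rear_end", 0.6), ("other", 0.4)],
        }
        casualties = {"fatal": 1.0, "injury": 1.0, "pdo": 0.0}   # per scenario (assumed)

        expected = 0.0
        for path in product(*branches.values()):
            p = 1.0
            for _, prob in path:
                p *= prob                        # conditional probabilities along the path
            severity = path[0][0]                # first branch in the dict is severity
            expected += crash_freq * p * casualties[severity]
        print(f"expected casualties per year: {expected:.1f}")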

  8. Quantitative risk assessment of the New York State operated West Valley Radioactive Waste Disposal Area.

    PubMed

    Garrick, B John; Stetkar, John W; Bembia, Paul J

    2010-08-01

    This article is based on a quantitative risk assessment (QRA) performed on a radioactive waste disposal area within the Western New York Nuclear Service Center in western New York State. The QRA results were instrumental in the decision by the New York State Energy Research and Development Authority to support a strategy of in-place management of the disposal area for another decade. The QRA methodology adopted for this first-of-a-kind application was a scenario-based approach in the framework of the triplet definition of risk (scenarios, likelihoods, consequences). The measure of risk is the frequency of occurrence of different levels of radiation dose to humans at prescribed locations. The risk from each scenario is determined by (1) the frequency of disruptive events or natural processes that cause a release of radioactive materials from the disposal area; (2) the physical form, quantity, and radionuclide content of the material that is released during each scenario; (3) the distribution, dilution, and deposition of the released materials throughout the environment surrounding the disposal area; and (4) public exposure to the distributed material and the accumulated radiation dose from that exposure. The risks of the individual scenarios are assembled into a representation of the total risk from the disposal area. In addition to quantifying the total risk to the public, the analysis ranks the importance of each contributing scenario, which facilitates corrective action and effective risk management. Perhaps most importantly, quantification of the uncertainties is an intrinsic part of the risk results. This approach to safety analysis has demonstrated many advantages of applying QRA principles to assessing the risk of facilities involving hazardous materials.
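
    In the triplet framework, assembling scenario risks typically means building an exceedance (complementary cumulative frequency) curve: sort scenarios by consequence and accumulate frequencies. A sketch with invented scenarios, not the West Valley results:

        # Risk triplet: (scenario, frequency per year, consequence). Invented values.
        scenarios = [
            ("erosion gully", 1e-2, 0.5),        # dose in mSv
            ("slope failure", 1e-3, 5.0),
            ("extreme flood", 1e-4, 50.0),
        ]
        # Exceedance curve: frequency of receiving at least each dose level.
        scenarios.sort(key=lambda s: s[2], reverse=True)
        cumulative = 0.0
        for name, freq, dose in scenarios:
            cumulative += freq
            print(f"freq(dose >= {dose:5.1f} mSv) = {cumulative:.1e} /yr   ({name})")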

  9. Assessment of trabecular bone mineral density using quantitative computed tomography in normal cats.

    PubMed

    Cheon, Haengbok; Choi, Wooshin; Lee, Youngjae; Lee, Donghoon; Kim, Juhyung; Kang, Ji-Houn; Na, Kijeong; Chang, Jinhwa; Chang, Dongwoo

    2012-11-01

    The aim of this study was to assess age-related changes and anatomic variation in trabecular bone mineral density (tBMD) using quantitative computed tomography (QCT) in normal cats. Seventeen normal cats were included in this study and divided into three age groups: <6 months (n=4), 2-5 years (n=10) and >6 years (n=3). A computed tomographic scan of each vertebra from the 12th thoracic to the 7th lumbar spine and of the pelvis was performed with a bone-density phantom (50, 100 and 150 mg/cm³ calcium hydroxyapatite; CIRS phantom®). On the central transverse section, an elliptical region of interest (ROI) was drawn to measure the mean Hounsfield unit (HU) value. These values were converted to equivalent tBMD (mg/cm³) using the bone-density phantom and linear regression analysis (r² > 0.95). The mean tBMD of the thoracic vertebrae (369.4 ± 31.8 mg/cm³) was significantly higher than that of the lumbar vertebrae (285 ± 58.1 mg/cm³). The maximum tBMD occurred at the T12, T13 and L1 levels in all age groups. There were statistically significant differences in mean tBMD among the three age groups at the T12 (P<0.001), T13 (P<0.001) and L4 (P=0.013) levels. The present study suggests that age-related changes and anatomic variation in tBMD values should be considered when assessing tBMD using QCT in cats with bone disorders.
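
    The phantom-based HU-to-tBMD conversion is a linear calibration: regress the known insert densities on their measured HU, then apply the fit to each ROI. A sketch assuming numpy, with invented HU readings:

        import numpy as np

        # Known phantom densities vs measured HU (HU values invented).
        phantom_density = np.array([50.0, 100.0, 150.0])    # mg/cm^3
        phantom_hu = np.array([55.0, 118.0, 180.0])

        slope, intercept = np.polyfit(phantom_hu, phantom_density, 1)
        r2 = np.corrcoef(phantom_hu, phantom_density)[0, 1] ** 2   # should exceed 0.95

        roi_mean_hu = 420.0                                 # e.g. a T12 trabecular ROI
        tbmd = slope * roi_mean_hu + intercept
        print(f"tBMD ~ {tbmd:.0f} mg/cm^3 (calibration r^2 = {r2:.3f})")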

  10. Quantitative assessment of the enamel machinability in tooth preparation with dental diamond burs.

    PubMed

    Song, Xiao-Fei; Jin, Chen-Xin; Yin, Ling

    2015-01-01

    Enamel cutting using dental handpieces is a critical process in tooth preparation for dental restorations and treatment, but the machinability of enamel is poorly understood. This paper reports the first quantitative assessment of enamel machinability using computer-assisted numerical control, high-speed data acquisition, and force-sensing systems. Enamel machinability, in terms of cutting forces, force ratio, cutting torque, cutting speed and specific cutting energy, was characterized in relation to enamel surface orientation, specific material removal rate and diamond bur grit size. The results show that all three factors critically affected enamel cutting capability. Cutting buccal/lingual surfaces resulted in significantly higher tangential and normal forces, torques and specific energy (p<0.05) but lower cutting speeds than occlusal surfaces (p<0.05). Increasing the material removal rate for higher cutting efficiency using coarse burs yielded marked rises in cutting forces and torque (p<0.05) but significant reductions in cutting speed and specific cutting energy (p<0.05). In particular, large variations in cutting forces, torques and specific energy were observed at a specific material removal rate of 3 mm³/min/mm using coarse burs, indicating the cutting limit. This work provides fundamental data and a scientific understanding of enamel machinability for clinical dental practice.
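
    Specific cutting energy is cutting power divided by the volumetric removal rate. A sketch of that arithmetic with assumed torque and speed values (the paper's measured values are not reproduced here):

        import math

        torque_nm = 0.001          # cutting torque, N*m (assumed)
        rpm = 200_000              # handpiece speed (assumed)
        mrr_mm3_per_min = 3.0      # volumetric removal rate, mm^3/min

        power_w = torque_nm * 2 * math.pi * rpm / 60     # cutting power, W
        mrr_mm3_per_s = mrr_mm3_per_min / 60
        specific_energy = power_w / mrr_mm3_per_s        # J/mm^3
        print(f"specific cutting energy ~ {specific_energy:.0f} J/mm^3")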

  11. A method for rapid quantitative assessment of biofilms with biomolecular staining and image analysis.

    PubMed

    Larimer, Curtis; Winder, Eric; Jeters, Robert; Prowant, Matthew; Nettleship, Ian; Addleman, Raymond Shane; Bonheyo, George T

    2016-01-01

    The accumulation of bacteria in surface-attached biofilms can be detrimental to human health, dental hygiene, and many industrial processes. Natural biofilms are soft and often transparent, and they have heterogeneous biological composition and structure over micro- and macroscales. As a result, it is challenging to quantify the spatial distribution and overall intensity of biofilms. In this work, a new method was developed to enhance the visibility and quantification of bacterial biofilms. First, broad-spectrum biomolecular staining was used to enhance the visibility of the cells, nucleic acids, and proteins that make up biofilms. Then, an image analysis algorithm was developed to objectively and quantitatively measure biofilm accumulation from digital photographs, and the results were compared to independent measurements of cell density. This new method was used to quantify the growth intensity of Pseudomonas putida biofilms as they grew over time. The method is simple and fast, and can quantify biofilm growth over a large area with approximately the same precision as the more laborious cell-counting method. Stained and processed images facilitate assessment of the spatial heterogeneity of a biofilm across a surface. This new approach to biofilm analysis could be applied in studies of natural, industrial, and environmental biofilms.
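
    The core of such an image-analysis step is thresholding a stained image and reporting the covered-area fraction and stain intensity. A minimal sketch on a synthetic image, assuming numpy; the authors' actual algorithm is not reproduced here (Otsu's method is a common thresholding alternative):

        import numpy as np

        # Synthetic stand-in for a photograph of a stained biofilm.
        rng = np.random.default_rng(1)
        img = rng.uniform(0.0, 0.2, (512, 512))                       # background
        img[100:300, 150:400] += rng.uniform(0.3, 0.8, (200, 250))    # stained patch

        threshold = img.mean() + img.std()        # simple global threshold
        mask = img > threshold
        coverage = mask.mean()                    # fraction of surface covered
        intensity = img[mask].mean() if mask.any() else 0.0
        print(f"coverage {100 * coverage:.1f}%, mean stained intensity {intensity:.2f}")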

  12. Utilization of quantitative structure-activity relationships (QSARs) in risk assessment: Alkylphenols

    SciTech Connect

    Beck, B.D.; Toole, A.P.; Callahan, B.G.; Siddhanti, S.K.

    1991-12-01

    Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, the toxicity database on alkylphenols is limited. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. The enzyme-inhibiting ability of individual alkylphenols can be estimated from the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparing the inhibitory capacity of alkylphenols with that of acetylsalicylic acid (aspirin), a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption is predicted for both alkylphenols and aspirin, based on estimates of hydrophobicity and the fraction of charged molecules at gastrointestinal pH, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence' without correction for absorption differences. They recommend this method for assessing the risks of mixtures of alkylphenols, especially for compounds with no chronic toxicity data. (38 references)
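
    The 'milligram aspirin equivalence' of a mixture is the potency-weighted sum of the component doses. A sketch with hypothetical relative potencies (placeholders, not Dewhirst's QSAR estimates):

        # Hypothetical relative potencies (aspirin = 1.0) and doses; placeholders
        # only, not values derived from the QSAR.
        relative_potency = {"phenol": 0.05, "o-cresol": 0.12, "2,6-xylenol": 0.30}
        dose_mg = {"phenol": 10.0, "o-cresol": 4.0, "2,6-xylenol": 1.5}

        aspirin_equiv = sum(dose_mg[c] * relative_potency[c] for c in dose_mg)
        print(f"mixture ~ {aspirin_equiv:.2f} mg aspirin equivalents")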

  13. A quantitative assay for assessing the effects of DNA lesions on transcription.

    PubMed

    You, Changjun; Dai, Xiaoxia; Yuan, Bifeng; Wang, Jin; Wang, Jianshuang; Brooks, Philip J; Niedernhofer, Laura J; Wang, Yinsheng

    2012-10-01

    Most mammalian cells in nature are quiescent but actively transcribing mRNA for normal physiological processes; thus, it is important to investigate how endogenous and exogenous DNA damage compromises transcription in cells. Here we describe a new competitive transcription and adduct bypass (CTAB) assay to determine the effects of DNA lesions on the fidelity and efficiency of transcription. Using this strategy, we demonstrate that the oxidatively induced lesions 8,5'-cyclo-2'-deoxyadenosine (cdA) and 8,5'-cyclo-2'-deoxyguanosine (cdG) and the methylglyoxal-induced lesion N²-(1-carboxyethyl)-2'-deoxyguanosine (N²-CEdG) strongly inhibited transcription in vitro and in mammalian cells. In addition, cdA and cdG, but not N²-CEdG, induced transcriptional mutagenesis in vitro and in vivo. Furthermore, when located on the template DNA strand, all examined lesions were primarily repaired by transcription-coupled nucleotide excision repair in mammalian cells. This newly developed CTAB assay should be generally applicable for quantitatively assessing how other DNA lesions affect DNA transcription in vitro and in cells.

  14. A miniaturized technique for assessing protein thermodynamics and function using fast determination of quantitative cysteine reactivity.

    PubMed

    Isom, Daniel G; Marguet, Philippe R; Oas, Terrence G; Hellinga, Homme W

    2011-04-01

    Protein thermodynamic stability is a fundamental physical characteristic that determines biological function. Furthermore, alteration of thermodynamic stability by macromolecular interactions or biochemical modifications is a powerful tool for assessing the relationship between protein structure, stability, and biological function. High-throughput approaches for quantifying protein stability are beginning to emerge that enable thermodynamic measurements on small amounts of material, in short periods of time, and with readily accessible instrumentation. Here we present such a method, fast quantitative cysteine reactivity (fQCR), which exploits the linkage between protein stability, side-chain protection by protein structure, and structural dynamics to characterize the thermodynamic and kinetic properties of proteins. In this approach, the reaction of a protected cysteine with a thiol-reactive fluorogenic indicator is monitored over a gradient of temperatures after a short incubation time. These labeling data can be used to determine the midpoint of thermal unfolding, measure the temperature dependence of protein stability, quantify ligand-binding affinity, and, under certain conditions, estimate folding rate constants. We demonstrate the fQCR method by characterizing these thermodynamic and kinetic properties for variants of Staphylococcal nuclease and E. coli ribose-binding protein engineered to contain single, protected cysteines. These straightforward, information-rich experiments are likely to find applications in protein engineering and functional genomics.
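
    Extracting an unfolding midpoint from labeling-versus-temperature data amounts to fitting a two-state (van 't Hoff) sigmoid. A sketch on synthetic data, assuming numpy and scipy; this is a generic two-state fit, not the authors' exact analysis:

        import numpy as np
        from scipy.optimize import curve_fit

        R = 8.314e-3                                   # gas constant, kJ/(mol*K)

        def two_state(T, Tm, dH):
            # Fraction unfolded (hence labeled) under a two-state van 't Hoff model.
            return 1.0 / (1.0 + np.exp(dH / R * (1.0 / T - 1.0 / Tm)))

        T = np.linspace(290.0, 350.0, 25)              # temperature gradient, K
        rng = np.random.default_rng(2)
        labeled = two_state(T, 320.0, 300.0) + rng.normal(0, 0.02, T.size)   # synthetic

        popt, _ = curve_fit(two_state, T, labeled, p0=[315.0, 200.0])
        print(f"fitted Tm ~ {popt[0]:.1f} K, dH ~ {popt[1]:.0f} kJ/mol")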

  15. Quantitative risk assessment of noroviruses in drinking water based on qualitative data in Japan.

    PubMed

    Masago, Yoshifumi; Katayama, Hiroyuki; Watanabe, Toru; Haramoto, Eiji; Hashimoto, Atsushi; Omura, Tatsuo; Hirata, Tsuyoshi; Ohgaki, Shinichiro

    2006-12-01

    Noroviruses are one of the major causes of viral gastroenteritis in Japan. A quantitative risk assessment was conducted to evaluate the health risk posed by this virus in drinking water. A Monte Carlo analysis was used to calculate both the probability of infection and the disease burden in disability-adjusted life years (DALYs). The concentration of noroviruses in tap water was estimated from qualitative data using a most probable number (MPN) method with an assumed Poisson-lognormal distribution. This numerical method was evaluated using two available sets of Cryptosporidium count data: one collected from a river and one from tap water in Japan. The dose-response relationships for noroviruses were estimated using an assumed ID50 of 10 or 100. The annual risk was higher than the US EPA acceptable level (10⁻⁴ infections per person-year) but around the WHO level (10⁻⁶ DALYs per person-year). As suggested by others, since microbial concentrations are generally lognormally distributed, the arithmetic mean is directly related to the annual risk, suggesting that it is more useful than the geometric mean in representing the degree of microbial contamination.
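
    A minimal version of such a calculation draws daily concentrations from a lognormal distribution, applies an exponential dose-response anchored to the assumed ID50, and compounds over a year. All parameter values below are illustrative, not the study's estimates:

        import math
        import random

        ID50 = 10.0                                 # assumed, as in the abstract
        r = math.log(2) / ID50                      # exponential model: P = 1 - exp(-r*dose)
        intake_l_per_day = 2.0

        def daily_risk():
            conc = random.lognormvariate(math.log(1e-4), 1.0)   # viruses/L (assumed)
            return 1 - math.exp(-r * conc * intake_l_per_day)

        annual = [1 - math.prod(1 - daily_risk() for _ in range(365))
                  for _ in range(1000)]
        print(f"mean annual infection risk ~ {sum(annual) / len(annual):.2e}")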

  16. Assessment of Quantitative and Allelic MGMT Methylation Patterns as a Prognostic Marker in Glioblastoma.

    PubMed

    Kristensen, Lasse S; Michaelsen, Signe R; Dyrbye, Henrik; Aslan, Derya; Grunnet, Kirsten; Christensen, Ib J; Poulsen, Hans S; Grønbæk, Kirsten; Broholm, Helle

    2016-03-01

    Methylation of the O⁶-methylguanine-DNA methyltransferase (MGMT) gene is a predictive and prognostic marker in newly diagnosed glioblastoma patients treated with temozolomide, but how MGMT methylation should be assessed to ensure optimal detection accuracy is debated. We developed a novel quantitative methylation-specific PCR (qMSP) MGMT assay capable of providing allelic methylation data and analyzed 151 glioblastomas from patients receiving standard-of-care treatment (Stupp protocol). The samples were also analyzed by immunohistochemistry (IHC) and standard bisulfite pyrosequencing, and genotyped for the rs1690252 MGMT promoter single-nucleotide polymorphism. Monoallelic methylation was observed more frequently than biallelic methylation, and some cases with monoallelic methylation expressed the MGMT protein whereas others did not. The presence of MGMT methylation was associated with better overall survival (p = 0.006 by qMSP; p = 0.002 by standard pyrosequencing), and the presence of the protein was associated with worse overall survival (p = 0.009). Combined analyses of qMSP and standard pyrosequencing or IHC identified additional patients who benefited from temozolomide treatment. Finally, low methylation levels were also associated with better overall survival (p = 0.061 by qMSP; p = 0.02 by standard pyrosequencing). These data support the use of both MGMT methylation and MGMT IHC, but not allelic methylation data, as prognostic markers in patients with temozolomide-treated glioblastoma.

  17. In-hospital evaluation of contamination of duodenoscopes: a quantitative assessment of the effect of drying.

    PubMed

    Alfa, M J; Sitter, D L

    1991-10-01

    A prospective, quantitative assessment was undertaken of the effect of drying on the bacterial load in duodenoscopes that had been used for endoscopic retrograde cholangiopancreatography procedures. The endoscopes were washed and disinfected using an automatic washer, and samples were taken through the suction channel at 2, 24 and 48 h post-disinfection. Twenty-one of the 42 duodenoscopes tested were contaminated. For the contaminated duodenoscopes, the ratio of Gram-negative bacilli to Gram-positive cocci increased from 70:1 at 2 h to 4000:1 at 48 h. Pseudomonas species (6 of 12 contaminated endoscopes) and Acinetobacter species (7 of 21 contaminated endoscopes) were the most common isolates. There was visible moisture remaining in the suction channel despite the use of the complete recommended automatic washer cycle. Bacterial concentrations reached as high as 1 × 10⁷ colony-forming units (cfu) ml⁻¹. An additional 10 min of drying, using either an 'in house' air line or the washer's manual dry cycle, prevented bacterial overgrowth in all 19 endoscopes tested 48 h post-disinfection. If the additional 10 min of drying was used, no alcohol rinse was required. Although no infections related to the use of contaminated endoscopes were reported, it was apparent that Gram-negative bacilli were multiplying to unacceptably high concentrations and that this could be prevented by an additional 10 min of drying. The additional drying was only required at the end of the endoscopy list and not between patients.

  18. Quantitative CT assessment of bone mineral density in dogs with hyperadrenocorticism

    PubMed Central

    Lee, Donghoon; Lee, Youngjae; Choi, Wooshin; Chang, Jinhwa; Kang, Ji-Houn; Na, Ki-Jeong

    2015-01-01

    Canine hyperadrenocorticism (HAC) is one of the most common causes of generalized osteopenia. In this study, quantitative computed tomography (QCT) was used to compare bone mineral densities (BMD) between 39 normal dogs and 8 dogs with HAC diagnosed through hormonal assay (6 with pituitary-dependent hyperadrenocorticism [PDH] and 2 with adrenal-dependent hyperadrenocorticism [ADH]). A computed tomography scan from the 12th thoracic to the 7th lumbar vertebra was performed, and regions of interest were drawn in trabecular and cortical bone. Mean Hounsfield unit values were converted to equivalent BMD using a bone-density phantom and linear regression analysis. The converted mean trabecular BMDs of dogs with HAC were significantly lower than those of normal dogs, and dogs with ADH also showed significantly lower cortical BMDs than normal dogs. Mean trabecular BMDs of dogs with PDH measured using QCT were significantly lower than those of normal dogs, and both mean trabecular and cortical BMDs in dogs with ADH were significantly lower than those of normal dogs. Taken together, these findings indicate that QCT is useful for assessing BMD in dogs with HAC. PMID:26040613

  19. Quantitative microbial risk assessment applied to irrigation of salad crops with waste stabilization pond effluents.

    PubMed

    Pavione, D M S; Bastos, R K X; Bevilacqua, P D

    2013-01-01

    A quantitative microbial risk assessment model was constructed for estimating infection risks arising from consuming crops eaten raw that have been irrigated with effluents from stabilization ponds. A log-normal probability distribution function was fitted to a large database from comprehensive monitoring of an experimental pond system to account for variability in Escherichia coli concentration in irrigation water. Crop contamination levels were estimated using predic