Science.gov

Sample records for addition quantitative results

  1. Quantitative results from the focusing schlieren technique

    NASA Technical Reports Server (NTRS)

    Cook, S. P.; Chokani, Ndaona

    1993-01-01

    An iterative theoretical approach to obtain quantitative density data from the focusing schlieren technique is proposed. The approach is based on an approximate modeling of the focusing action in a focusing schlieren system, and an estimation of an appropriate focal plane thickness. The theoretical approach is incorporated in a computer program, and results obtained from a supersonic wind tunnel experiment are evaluated by comparison with CFD data. The density distributions compared favorably with CFD predictions. However, improvements to the system are required in order to reduce noise in the data, to improve specification of the depth of focus, and to refine the modeling of the focusing action.

  2. Genetic interactions contribute less than additive effects to quantitative trait variation in yeast

    PubMed Central

    Bloom, Joshua S.; Kotenko, Iulia; Sadhu, Meru J.; Treusch, Sebastian; Albert, Frank W.; Kruglyak, Leonid

    2015-01-01

    Genetic mapping studies of quantitative traits typically focus on detecting loci that contribute additively to trait variation. Genetic interactions are often proposed as a contributing factor to trait variation, but the relative contribution of interactions to trait variation is a subject of debate. Here we use a very large cross between two yeast strains to accurately estimate the fraction of phenotypic variance due to pairwise QTL–QTL interactions for 20 quantitative traits. We find that this fraction is 9% on average, substantially less than the contribution of additive QTL (43%). Statistically significant QTL–QTL pairs typically have small individual effect sizes, but collectively explain 40% of the pairwise interaction variance. We show that pairwise interaction variance is largely explained by pairs of loci at least one of which has a significant additive effect. These results refine our understanding of the genetic architecture of quantitative traits and help guide future mapping studies. PMID:26537231
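
    The additive-versus-interaction split described above can be illustrated on simulated data: regress the trait on marginal locus effects, then add pairwise product terms and compare the variance explained. A minimal Python sketch with a toy two-locus model (effect sizes, sample size, and the ordinary-least-squares approach are invented for illustration and are not the variance-component machinery a real mapping study would use):

      import numpy as np

      rng = np.random.default_rng(0)
      n = 5000

      # Two biallelic loci coded 0/1, a toy stand-in for haploid yeast segregant genotypes
      g1 = rng.integers(0, 2, n)
      g2 = rng.integers(0, 2, n)

      # Simulated trait: additive effects plus a smaller pairwise interaction plus noise
      y = 1.0 * g1 + 0.8 * g2 + 0.4 * (g1 * g2) + rng.normal(0.0, 1.0, n)

      def r_squared(design, y):
          """Fraction of trait variance explained by a least-squares fit to the design matrix."""
          beta, *_ = np.linalg.lstsq(design, y, rcond=None)
          resid = y - design @ beta
          return 1.0 - resid.var() / y.var()

      ones = np.ones(n)
      additive_only = np.column_stack([ones, g1, g2])
      with_interaction = np.column_stack([ones, g1, g2, g1 * g2])

      r2_add = r_squared(additive_only, y)
      r2_full = r_squared(with_interaction, y)
      print(f"additive R^2 = {r2_add:.3f}, extra variance from the QTL-QTL term = {r2_full - r2_add:.3f}")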

  3. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both, while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse graining technique to speed the registration of 2D histology sections to high resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). This technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants. PMID:27153828

  4. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)
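
    The quantitative half of the exercise amounts to a linear (Beer-Lambert) calibration: absorbance at 630 nm is regressed against glucose standards and the unknown is read off the fitted line. A minimal Python sketch with invented calibration values, not data from the article:

      import numpy as np

      # Hypothetical calibration standards: glucose concentration (mg/dL) vs. absorbance at 630 nm
      conc = np.array([0.0, 25.0, 50.0, 100.0, 200.0])
      absorbance = np.array([0.002, 0.061, 0.118, 0.242, 0.479])

      # Least-squares line A = m*C + b, assuming Beer-Lambert linearity over this range
      m, b = np.polyfit(conc, absorbance, 1)

      def conc_from_absorbance(a):
          """Invert the calibration line to estimate concentration from a measured absorbance."""
          return (a - b) / m

      sample_absorbance = 0.150  # illustrative reading for the urine sample
      print(f"estimated glucose: {conc_from_absorbance(sample_absorbance):.1f} mg/dL")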

  5. Modular Skeletal Evolution in Sticklebacks Is Controlled by Additive and Clustered Quantitative Trait Loci

    PubMed Central

    Miller, Craig T.; Glazer, Andrew M.; Summers, Brian R.; Blackman, Benjamin K.; Norman, Andrew R.; Shapiro, Michael D.; Cole, Bonnie L.; Peichel, Catherine L.; Schluter, Dolph; Kingsley, David M.

    2014-01-01

    Understanding the genetic architecture of evolutionary change remains a long-standing goal in biology. In vertebrates, skeletal evolution has contributed greatly to adaptation in body form and function in response to changing ecological variables like diet and predation. Here we use genome-wide linkage mapping in threespine stickleback fish to investigate the genetic architecture of evolved changes in many armor and trophic traits. We identify >100 quantitative trait loci (QTL) controlling the pattern of serially repeating skeletal elements, including gill rakers, teeth, branchial bones, jaws, median fin spines, and vertebrae. We use this large collection of QTL to address long-standing questions about the anatomical specificity, genetic dominance, and genomic clustering of loci controlling skeletal differences in evolving populations. We find that most QTL (76%) that influence serially repeating skeletal elements have anatomically regional effects. In addition, most QTL (71%) have at least partially additive effects, regardless of whether the QTL controls evolved loss or gain of skeletal elements. Finally, many QTL with high LOD scores cluster on chromosomes 4, 20, and 21. These results identify a modular system that can control highly specific aspects of skeletal form. Because of the general additivity and genomic clustering of major QTL, concerted changes in both protective armor and trophic traits may occur when sticklebacks inherit either marine or freshwater alleles at linked or possible “supergene” regions of the stickleback genome. Further study of these regions will help identify the molecular basis of both modular and coordinated changes in the vertebrate skeleton. PMID:24652999

  6. Modular skeletal evolution in sticklebacks is controlled by additive and clustered quantitative trait Loci.

    PubMed

    Miller, Craig T; Glazer, Andrew M; Summers, Brian R; Blackman, Benjamin K; Norman, Andrew R; Shapiro, Michael D; Cole, Bonnie L; Peichel, Catherine L; Schluter, Dolph; Kingsley, David M

    2014-05-01

    Understanding the genetic architecture of evolutionary change remains a long-standing goal in biology. In vertebrates, skeletal evolution has contributed greatly to adaptation in body form and function in response to changing ecological variables like diet and predation. Here we use genome-wide linkage mapping in threespine stickleback fish to investigate the genetic architecture of evolved changes in many armor and trophic traits. We identify >100 quantitative trait loci (QTL) controlling the pattern of serially repeating skeletal elements, including gill rakers, teeth, branchial bones, jaws, median fin spines, and vertebrae. We use this large collection of QTL to address long-standing questions about the anatomical specificity, genetic dominance, and genomic clustering of loci controlling skeletal differences in evolving populations. We find that most QTL (76%) that influence serially repeating skeletal elements have anatomically regional effects. In addition, most QTL (71%) have at least partially additive effects, regardless of whether the QTL controls evolved loss or gain of skeletal elements. Finally, many QTL with high LOD scores cluster on chromosomes 4, 20, and 21. These results identify a modular system that can control highly specific aspects of skeletal form. Because of the general additivity and genomic clustering of major QTL, concerted changes in both protective armor and trophic traits may occur when sticklebacks inherit either marine or freshwater alleles at linked or possible "supergene" regions of the stickleback genome. Further study of these regions will help identify the molecular basis of both modular and coordinated changes in the vertebrate skeleton. PMID:24652999

  7. Quantitative MR imaging in fracture dating-Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determinations of bone fractures in a forensic context (e.g. in cases of child abuse) improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology are of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34±15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895±607ms), which decreased over time to a value of 1094±182ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115±80ms) and decreased to 73±33ms within 21 days after the fracture event. After that time point, no significant changes
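
    The reported time course (an early T1 peak relaxing toward a lower plateau) can be summarized by fitting a simple exponential-decay model to the ROI values from the quantitative maps. The sketch below uses invented data points of the same order of magnitude as those quoted above; the model and its parameters are an assumption, not the study's analysis:

      import numpy as np
      from scipy.optimize import curve_fit

      # Illustrative T1 values (ms) in the fracture gap at several days after the fracture event
      days = np.array([2.0, 10.0, 30.0, 60.0, 120.0, 200.0])
      t1   = np.array([1890.0, 1700.0, 1450.0, 1280.0, 1150.0, 1090.0])

      def decay(t, amplitude, tau, baseline):
          """Exponential decay toward a baseline value."""
          return baseline + amplitude * np.exp(-t / tau)

      (amplitude, tau, baseline), _ = curve_fit(decay, days, t1, p0=(800.0, 50.0, 1100.0))
      print(f"amplitude = {amplitude:.0f} ms, time constant = {tau:.0f} days, baseline = {baseline:.0f} ms")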

  8. The quantitative surface analysis of an antioxidant additive in a lubricant oil matrix by desorption electrospray ionization mass spectrometry

    PubMed Central

    Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S

    2013-01-01

    RATIONALE Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm2 additive on spot with relative standard deviations in the range 3–14%. CONCLUSIONS The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398
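
    Quantitation here rests on an internal-standard calibration: the analyte/internal-standard peak-area ratio is regressed against the amount deposited per spot, and a detection limit is derived from the calibration noise. A Python sketch with invented peak-area ratios; the 3.3*s/slope convention for the LOD is an assumption, not something stated in the record:

      import numpy as np

      # Hypothetical eight-point calibration: amount per spot (ug) vs. analyte/IS peak-area ratio
      amount = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 40.0, 60.0, 80.0])
      ratio  = np.array([0.05, 0.11, 0.26, 0.52, 1.01, 2.05, 3.02, 4.10])

      slope, intercept = np.polyfit(amount, ratio, 1)

      # Residual standard deviation of the fit, used here as the noise estimate
      residuals = ratio - (slope * amount + intercept)
      s_res = residuals.std(ddof=2)

      # One common convention: LOD = 3.3 * s / slope, in amount-per-spot units
      lod = 3.3 * s_res / slope
      print(f"slope = {slope:.4f} per ug, LOD ~ {lod:.2f} ug/spot")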

  9. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo)

    PubMed Central

    Li, Yi; Kim, Jong-Joo

    2015-01-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo; a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance found in 64.1 to 64.9 Mb for Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb for BTA14 for CWT, 0.5 to 1.5 Mb for BTA6 for BFT and 26.3 to 33.4 Mb for BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our result suggests that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396
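
    The chromosome-wise false-discovery-rate threshold used for the first two methods can be implemented with the Benjamini-Hochberg step-up procedure. The abstract does not name the exact FDR procedure, so the sketch below is one plausible choice, shown on invented p-values:

      import numpy as np

      def benjamini_hochberg(pvals, q=0.01):
          """Boolean mask of p-values declared significant at FDR level q (Benjamini-Hochberg)."""
          p = np.asarray(pvals, dtype=float)
          order = np.argsort(p)
          ranked = p[order]
          m = p.size
          thresholds = q * np.arange(1, m + 1) / m
          below = ranked <= thresholds
          significant = np.zeros(m, dtype=bool)
          if below.any():
              k = np.nonzero(below)[0].max()        # largest rank meeting the step-up criterion
              significant[order[:k + 1]] = True
          return significant

      # Illustrative single-locus regression p-values for SNPs on one chromosome
      pvals = [1e-6, 3e-4, 0.002, 0.03, 0.2, 0.5, 0.8]
      print(benjamini_hochberg(pvals, q=0.01))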

  10. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on power of detection for quantitative trait loci (QTL) and precision for QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo; a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). Also the genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model, such that the top five windows, each of which comprised 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods had high concordance found in 64.1 to 64.9 Mb for Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb for BTA14 for CWT, 0.5 to 1.5 Mb for BTA6 for BFT and 26.3 to 33.4 Mb for BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our result suggests that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions. PMID:26104396

  11. Mars-GRAM 2010: Additions and Resulting Improvements

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Burns, K. Lee

    2013-01-01

    factors. The adjustment factors generated by this process had to satisfy the gas law as well as the hydrostatic relation and are expressed as a function of height (z), Latitude (Lat) and areocentric solar longitude (Ls). The greatest adjustments are made at large optical depths such as tau greater than 1. The addition of the adjustment factors has led to better correspondence to TES Limb data from 0-60 km altitude as well as better agreement with MGS, ODY and MRO data at approximately 90-130 km altitude. Improved Mars-GRAM atmospheric simulations for various locations, times and dust conditions on Mars will be presented at the workshop session. The latest results validating Mars-GRAM 2010 versus Mars Climate Sounder data will also be presented. Mars-GRAM 2010 updates have resulted in improved atmospheric simulations which will be very important when beginning systems design, performance analysis, and operations planning for future aerocapture, aerobraking or landed missions to Mars.
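
    The physical constraint mentioned above, that any density adjustment must remain consistent with the gas law (p = rho*R*T) and the hydrostatic relation (dp/dz = -rho*g), can be checked numerically by integrating the two together. A Python sketch with a made-up temperature profile and approximate Mars constants; this is an illustrative consistency check, not Mars-GRAM's actual adjustment procedure:

      import numpy as np

      R_CO2 = 188.9   # J/(kg K), specific gas constant for a CO2 atmosphere (approximate)
      g     = 3.71    # m/s^2, Mars surface gravity, held constant for simplicity

      z = np.linspace(0.0, 60e3, 601)         # altitude grid, 0-60 km
      T = 210.0 - 1.5e-3 * z                   # invented linear temperature profile (K)
      p = np.empty_like(z)
      p[0] = 600.0                             # illustrative surface pressure (Pa)

      # March upward: density from the gas law, pressure from the hydrostatic relation
      for i in range(1, z.size):
          rho = p[i - 1] / (R_CO2 * T[i - 1])
          p[i] = p[i - 1] - rho * g * (z[i] - z[i - 1])

      rho_profile = p / (R_CO2 * T)
      print(f"density: {rho_profile[0]:.4f} kg/m^3 at the surface, {rho_profile[-1]:.6f} kg/m^3 at 60 km")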

  12. Additional Results of Ice-Accretion Scaling at SLD Conditions

    NASA Technical Reports Server (NTRS)

    Bond, Thomas H. (Technical Monitor); Anderson, David N.; Tsao, Jen-Ching

    2005-01-01

    To determine scale velocity, an additional similarity parameter is needed to supplement the Ruff scaling method. A Weber number based on water droplet MVD has been included in several studies because the effect of droplet splashing on ice accretion was believed to be important, particularly for SLD conditions. In the present study, ice shapes recorded at Appendix-C conditions and recent results at SLD conditions are reviewed to show that droplet diameter cannot be important to main ice shape, and for low airspeeds splashing does not appear to affect SLD ice shapes. Evidence is presented to show that while a supplementary similarity parameter probably has the form of a Weber number, it must be based on a length proportional to model size rather than MVD. Scaling comparisons were made between SLD reference conditions and Appendix-C scale conditions using this Weber number. Scale-to-reference model size ratios were 1:1.7 and 1:3.4. The reference tests used a 91-cm-chord NACA 0012 model with a velocity of approximately 50 m/s and an MVD of 160 μm. Freezing fractions of 0.3, 0.4, and 0.5 were included in the study.
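
    The supplementary similarity parameter argued for above is a Weber number based on a model-proportional length, We_c = rho_w*V^2*c/sigma; matching it between reference and scale conditions then fixes the scale velocity. A Python sketch using nominal water properties and the model sizes quoted in the abstract; the property values and the use of chord as the length scale are illustrative assumptions:

      # Weber number based on a length proportional to model size (chord c):
      #   We_c = rho_w * V**2 * c / sigma
      rho_w = 1000.0    # kg/m^3, liquid water density
      sigma = 0.076     # N/m, approximate water surface tension near icing temperatures

      def weber(velocity, chord):
          return rho_w * velocity ** 2 * chord / sigma

      c_ref, v_ref = 0.91, 50.0        # reference: 91-cm chord, ~50 m/s (from the abstract)
      c_scale = c_ref / 1.7            # 1:1.7 scale model

      # Scale velocity that reproduces the reference Weber number
      v_scale = (weber(v_ref, c_ref) * sigma / (rho_w * c_scale)) ** 0.5
      print(f"We_ref = {weber(v_ref, c_ref):.3e}, matching scale velocity = {v_scale:.1f} m/s")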

  13. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  14. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  15. Small-Scale Spray Releases: Additional Aerosol Test Results

    SciTech Connect

    Schonewill, Philip P.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, G. N.; Mahoney, Lenna A.; Tran, Diana N.; Burns, Carolyn A.; Kurath, Dean E.

    2013-08-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are largely absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale. The small-scale testing and resultant data are described in Mahoney et al. (2012b) and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used to mimic the

  16. Large-Scale Spray Releases: Additional Aerosol Test Results

    SciTech Connect

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  17. Additional Results of Glaze Icing Scaling in SLD Conditions

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching

    2016-01-01

    New guidance on acceptable means of compliance with super-cooled large drop (SLD) conditions has been issued by the U.S. Department of Transportation's Federal Aviation Administration (FAA) in its Advisory Circular AC 25-28 in November 2014. Part 25, Appendix O was developed to define a representative icing environment for super-cooled large drops. Super-cooled large drops, which include freezing drizzle and freezing rain conditions, are not included in Appendix C. This paper reports results from recent glaze icing scaling tests conducted in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the scaling methods recommended for Appendix C conditions might apply to SLD conditions. The models were straight NACA 0012 wing sections. The reference model had a chord of 72 in. and the scale model had a chord of 21 in. Reference tests were run with airspeeds of 100 and 130.3 kn and with MVDs of 85 and 170 μm. Two scaling methods were considered. One was based on the modified Ruff method with scale velocity found by matching the Weber number We_L. The other was proposed and developed by Feo specifically for strong glaze icing conditions, in which the scale liquid water content and velocity were found by matching reference and scale values of the nondimensional water-film thickness expression and the film Weber number We_f. All tests were conducted at 0 deg AOA. Results will be presented for stagnation freezing fractions of 0.2 and 0.3. For nondimensional reference and scale ice shape comparison, a new post-scanning ice shape digitization procedure was developed for extracting 2-D ice shape profiles at any selected span-wise location from the high fidelity 3-D scanned ice shapes obtained in the IRT.

  18. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  19. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  20. Validation and Estimation of Additive Genetic Variation Associated with DNA Tests for Quantitative Beef Cattle Traits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The U.S. National Beef Cattle Evaluation Consortium (NBCEC) has been involved in the validation of commercial DNA tests for quantitative beef quality traits since their first appearance on the U.S. market in the early 2000s. The NBCEC Advisory Council initially requested that the NBCEC set up a syst...

  1. Quantitative analysis of rib kinematics based on dynamic chest bone images: preliminary results

    PubMed Central

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-01-01

    An image-processing technique for separating bones from soft tissue in static chest radiographs has been developed. The present study was performed to evaluate the usefulness of dynamic bone images in quantitative analysis of rib movement. Dynamic chest radiographs of 16 patients were obtained using a dynamic flat-panel detector and processed to create bone images by using commercial software (Clear Read BS, Riverain Technologies). Velocity vectors were measured in local areas on the dynamic images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as a reduced rib velocity field, resulting in an asymmetrical distribution of rib movement. Vector maps in all normal cases exhibited left/right symmetric distributions of the velocity field, whereas those in abnormal cases showed asymmetric distributions because of locally limited rib movements. Dynamic bone images were useful for accurate quantitative analysis of rib movements. The present method has potential as an additional functional examination in chest radiography. PMID:26158097
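
    The velocity vectors measured in local areas between consecutive frames can be obtained with simple block matching. The record does not describe the vendor algorithm, so the sketch below (sum-of-squared-differences matching in pure NumPy) is only one straightforward way to produce such a vector map:

      import numpy as np

      def block_velocity(frame_a, frame_b, y, x, block=16, search=8, dt=1.0):
          """Displacement (pixels per frame interval dt) of a square block centred at (y, x),
          estimated by exhaustive sum-of-squared-differences matching between two frames."""
          half = block // 2
          ref = frame_a[y - half:y + half, x - half:x + half].astype(float)
          best, best_dy, best_dx = np.inf, 0, 0
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  cand = frame_b[y - half + dy:y + half + dy, x - half + dx:x + half + dx].astype(float)
                  ssd = np.sum((ref - cand) ** 2)
                  if ssd < best:
                      best, best_dy, best_dx = ssd, dy, dx
          return best_dy / dt, best_dx / dt    # (vy, vx)

      # Tiny synthetic test: a bright square shifted down by two pixels between frames
      a = np.zeros((64, 64)); a[20:30, 20:30] = 1.0
      b = np.zeros((64, 64)); b[22:32, 20:30] = 1.0
      print(block_velocity(a, b, 25, 25))      # expect approximately (2.0, 0.0)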

  2. Additive effects of pollinators and herbivores result in both conflicting and reinforcing selection on floral traits.

    PubMed

    Sletvold, Nina; Moritz, Kim K; Agren, Jon

    2015-01-01

    Mutualists and antagonists are known to respond to similar floral cues, and may thus cause opposing selection on floral traits. However, we lack a quantitative understanding of their independent and interactive effects. In a population of the orchid Gymnadenia conopsea, we manipulated the intensity of pollination and herbivory in a factorial design to examine whether both interactions influence selection on flowering phenology, floral display, and morphology. Supplemental hand-pollination increased female fitness by 31% and one-quarter of all plants were damaged by herbivores. Both interactions contributed to selection. Pollinators mediated selection for later flowering and herbivores for earlier flowering, while both selected for longer spurs. The strength of selection was similar for both agents, and their effects were additive. As a consequence, there was no net selection on phenology, whereas selection on spur length was strong. The experimental results demonstrate that both pollinators and herbivores can markedly influence the strength of selection on flowering phenology and floral morphology, and cause both conflicting and reinforcing selection. They also indicate that the direction of selection on phenology will vary with the relative intensity of the mutualistic and antagonistic interaction, potentially resulting in both temporal and among-population variation in optimal flowering time. PMID:26236906

  3. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of subsequent landslide susceptibility maps while a particular emphasis was placed on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km2) which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation technique. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated by different inventories, classifiers and predictors appeared differently while holdout validation revealed similar high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for
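
    One of the four classifiers named above, logistic regression, together with single-holdout AUROC validation, can be sketched in a few lines of Python with scikit-learn. The predictors and landslide labels below are synthetic stand-ins, not the Lower Austria data, and a spatial cross-validation scheme would additionally group the folds by location:

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)

      # Synthetic stand-ins for terrain predictors (e.g. slope, curvature) and landslide presence/absence
      n = 2000
      X = rng.normal(size=(n, 4))
      logit = 1.5 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2]
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      # Single holdout validation, one of the schemes used in the study
      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      model = LogisticRegression().fit(X_tr, y_tr)
      auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
      print(f"holdout AUROC: {auc:.3f}")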

  4. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  5. The Modern U.S. High School Astronomy Course, Its Status and Makeup II: Additional Results

    ERIC Educational Resources Information Center

    Krumenaker, Larry

    2009-01-01

    A postal survey of high school astronomy teachers strongly confirms many results of an earlier electronic survey. Additional and new results include a measure of the level of inquiry (more structured inquiry and teacher-led) in the classroom as well as data showing that more emphasis is given to traditional topics than to contemporary astronomy…

  6. Quantitative CT for volumetric analysis of medical images: initial results for liver tumors

    NASA Astrophysics Data System (ADS)

    Behnaz, Alexander S.; Snider, James; Chibuzor, Eneh; Esposito, Giuseppe; Wilson, Emmanuel; Yaniv, Ziv; Cohen, Emil; Cleary, Kevin

    2010-03-01

    Quantitative CT for volumetric analysis of medical images is increasingly being proposed for monitoring patient response during chemotherapy trials. An integrated MATLAB GUI has been developed for an oncology trial at Georgetown University Hospital. This GUI allows for the calculation and visualization of the volume of a lesion. The GUI provides an estimate of the volume of the tumor using a semi-automatic segmentation technique. This software package features a fixed parameter adaptive filter from the ITK toolkit and a tumor segmentation algorithm to reduce inter-user variability and to facilitate rapid volume measurements. The system also displays a 3D rendering of the segmented tumor, allowing the end user to have not only a quantitative measure of the tumor volume, but a qualitative view as well. As an initial validation test, several clinical cases were hand-segmented, and then compared against the results from the tool, showing good agreement.
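
    Once a lesion has been segmented, the volume estimate itself is just the voxel count multiplied by the voxel volume. A minimal NumPy sketch with an invented mask and voxel spacing; the GUI described above additionally handles filtering, segmentation, and 3D rendering:

      import numpy as np

      def lesion_volume_ml(mask, spacing_mm):
          """Volume of a binary segmentation mask in millilitres.

          mask       : 3-D array, nonzero inside the lesion
          spacing_mm : (dz, dy, dx) voxel spacing in millimetres
          """
          voxel_volume_mm3 = float(np.prod(spacing_mm))
          return mask.astype(bool).sum() * voxel_volume_mm3 / 1000.0   # mm^3 -> mL

      # Illustrative mask: a 20 x 20 x 20 voxel cube at 1.0 x 0.8 x 0.8 mm spacing
      mask = np.zeros((64, 64, 64), dtype=np.uint8)
      mask[10:30, 10:30, 10:30] = 1
      print(f"{lesion_volume_ml(mask, (1.0, 0.8, 0.8)):.2f} mL")   # 8000 voxels * 0.64 mm^3 = 5.12 mL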

  7. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Pilot-Scale Test Results

    SciTech Connect

    Gary M. Blythe

    2006-03-01

    Texas Lignite Flue Gas; Task 3 - Full-scale FGD Additive Testing in High Sulfur Eastern Bituminous Flue Gas; Task 4 - Pilot Wet Scrubber Additive Tests at Yates; and Task 5 - Full-scale Additive Tests at Plant Yates. This topical report presents the results from the Task 2 and Task 4 pilot-scale additive tests. The Task 3 and Task 5 full-scale additive tests will be conducted later in calendar year 2006.

  8. Additional results on space environmental effects on polymer matrix composites: Experiment A0180

    NASA Technical Reports Server (NTRS)

    Tennyson, R. C.

    1992-01-01

    Additional experimental results on the atomic oxygen erosion of boron, Kevlar, and graphite fiber reinforced epoxy matrix composites are presented. Damage of composite laminates due to micrometeoroid/debris impacts is also examined with particular emphasis on the relationship between damage area and actual hole size due to particle penetration. Special attention is given to one micrometeoroid impact on an aluminum base plate which resulted in ejecta visible on an adjoining vertical flange structure.

  9. Crystal alignment of carbonated apatite in bone and calcified tendon: results from quantitative texture analysis.

    PubMed

    Wenk, H R; Heidelbach, F

    1999-04-01

    Calcified tissue contains collagen associated with minute crystallites of carbonated apatite. In this study, methods of quantitative X-ray texture analysis were used to determine the orientation distribution and texture strength of apatite in a calcified turkey tendon and in trabecular and cortical regions of osteonal bovine ankle bone (metacarpus). To resolve local heterogeneity, a 2 or 10 μm synchrotron microfocus X-ray beam (λ = 0.78 Å) was employed. Both samples revealed a strong texture. In the case of turkey tendon, 12 times more c axes of hexagonal apatite were parallel to the fibril axis than perpendicular, and a axes had rotational freedom about the c axis. In bovine bone, the orientation density of the c axes was three times higher parallel to the surface of collagen fibrils than perpendicular to it, and there was no preferential alignment with respect to the long axis of the bone (fiber texture). Whereas half of the apatite crystallites were strongly oriented, the remaining half had a random orientation distribution. The synchrotron X-ray texture results were consistent with previous analyses of mineral orientation in calcified tissues by conventional X-ray and neutron diffraction and electron microscopy, but gave, for the first time, a quantitative description. PMID:10221548

  10. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran.

    PubMed

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-03-01

    Social changes have rapidly removed arranged marriages and it seems the change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, initially, the factors affecting fertility must be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women 15-49 years old, living in the north of Iran were studied using a cluster sampling strategy. The results showed that there are no significant differences in reproductive behaviors across the three patterns of marriage in Babol city of Iran. It seems there is a convergence in childbearing despite the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  11. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran

    PubMed Central

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-01-01

    Social changes have rapidly removed arranged marriages and it seems the change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, initially, the factors affecting fertility must be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women 15-49 years old, living in the north of Iran were studied using a cluster sampling strategy. The results showed that there are no significant differences in reproductive behaviors across the three patterns of marriage in Babol city of Iran. It seems there is a convergence in childbearing despite the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  12. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  13. PRIORITIZING FUTURE RESEARCH ON OFF-LABEL PRESCRIBING: RESULTS OF A QUANTITATIVE EVALUATION

    PubMed Central

    Walton, Surrey M.; Schumock, Glen T.; Lee, Ky-Van; Alexander, G. Caleb; Meltzer, David; Stafford, Randall S.

    2015-01-01

    Background Drug use for indications not approved by the Food and Drug Administration exceeds 20% of prescribing. Available compendia indicate that a minority of off-label uses are well supported by evidence. Policy makers, however, lack information to identify where systematic reviews of the evidence or other research would be most valuable. Methods We developed a quantitative model for prioritizing individual drugs for future research on off-label uses. The base model incorporated three key factors, 1) the volume of off-label use with inadequate evidence, 2) safety, and 3) cost and market considerations. Nationally representative prescribing data were used to estimate the number of off-label drug uses by indication from 1/2005 through 6/2007 in the United States, and these indications were then categorized according to the adequacy of scientific support. Black box warnings and safety alerts were used to quantify drug safety. Drug cost, date of market entry, and marketing expenditures were used to quantify cost and market considerations. Each drug was assigned a relative value for each factor, and the factors were then weighted in the final model to produce a priority score. Sensitivity analyses were conducted by varying the weightings and model parameters. Results Drugs that were consistently ranked highly in both our base model and sensitivity analyses included quetiapine, warfarin, escitalopram, risperidone, montelukast, bupropion, sertraline, venlafaxine, celecoxib, lisinopril, duloxetine, trazodone, olanzapine, and epoetin alfa. Conclusion Future research into off-label drug use should focus on drugs used frequently with inadequate supporting evidence, particularly if further concerns are raised by known safety issues, high drug cost, recent market entry, and extensive marketing. Based on quantitative measures of these factors, we have prioritized drugs where targeted research and policy activities have high potential value. PMID:19025425
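
    The base model is a weighted combination of the three factors named above. The Python sketch below shows only that arithmetic; the weights, the 0-1 normalization, and the per-drug values are placeholders, not the published parameterization:

      # Weighted priority score over the three factors named in the abstract.
      # Weights and per-drug values are illustrative placeholders.
      weights = {"off_label_volume": 0.5, "safety": 0.3, "cost_market": 0.2}

      drugs = {
          # factor values assumed already normalized to 0-1 within the drug set (hypothetical)
          "drug_A": {"off_label_volume": 0.9, "safety": 0.7, "cost_market": 0.6},
          "drug_B": {"off_label_volume": 0.4, "safety": 0.9, "cost_market": 0.3},
          "drug_C": {"off_label_volume": 0.7, "safety": 0.2, "cost_market": 0.8},
      }

      def priority(values):
          return sum(weights[factor] * values[factor] for factor in weights)

      for name, values in sorted(drugs.items(), key=lambda item: priority(item[1]), reverse=True):
          print(f"{name}: priority score {priority(values):.2f}")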

  14. Radar Based Probabilistic Quantitative Precipitation Estimation: First Results of Large Sample Data Analysis

    NASA Astrophysics Data System (ADS)

    Ciach, G. J.; Krajewski, W. F.; Villarini, G.

    2005-05-01

    Large uncertainties in the operational precipitation estimates produced by the U.S. national network of WSR-88D radars are well-acknowledged. However, quantitative information about these uncertainties is not operationally available. In an effort to fill this gap, the U.S. National Weather Service (NWS) is supporting the development of a probabilistic approach to the radar precipitation estimation. The probabilistic quantitative precipitation estimation (PQPE) methodology that was selected for this development is based on the empirically-based modeling of the functional-statistical error structure in the operational WSR-88D precipitation products under different conditions. Our first goal is to deliver a realistic parameterization of the probabilistic error model describing its dependences on the radar-estimated precipitation value, distance from the radar, season, spatiotemporal averaging scale, and the setup of the precipitation processing system (PPS). In the long-term perspective, when large samples of relevant data are available, we will extend the model to include the dependences on different types of precipitation estimates (e.g. polarimetric and multi-sensor), geographic locations and climatic regimes. At this stage of the PQPE project, we organized a 6-year-long sample of the Level II data from the Oklahoma City radar station (KTLX), and processed it with Build 4 of the PPS that is currently used in the NWS operations. This first set of operational products was generated with the standard setup of the PPS parameters. The radar estimates are complemented with the corresponding raingauge data from the Oklahoma Mesonet, the ARS Little Washita Micronet and the EVAC PicoNet covering different spatial scales. The raingauge data are used as a ground reference (GR) to estimate the required uncertainty characteristics in the radar precipitation products. In this presentation, we describe the first results of the large-sample uncertainty analysis of the products

  15. Iterative reconstruction for quantitative computed tomography analysis of emphysema: consistent results using different tube currents

    PubMed Central

    Yamashiro, Tsuneo; Miyara, Tetsuhiro; Honda, Osamu; Tomiyama, Noriyuki; Ohno, Yoshiharu; Noma, Satoshi; Murayama, Sadayuki

    2015-01-01

    Purpose To assess the advantages of iterative reconstruction for quantitative computed tomography (CT) analysis of pulmonary emphysema. Materials and methods Twenty-two patients with pulmonary emphysema underwent chest CT imaging using identical scanners with three different tube currents: 240, 120, and 60 mA. Scan data were converted to CT images using Adaptive Iterative Dose Reduction using Three Dimensional Processing (AIDR3D) and a conventional filtered-back projection mode. Thus, six scans with and without AIDR3D were generated per patient. All other scanning and reconstruction settings were fixed. The percent low attenuation area (LAA%; < −950 Hounsfield units) and the lung density 15th percentile were automatically measured using a commercial workstation. Comparisons of LAA% and 15th percentile results between scans with and without using AIDR3D were made by Wilcoxon signed-rank tests. Associations between body weight and measurement errors among these scans were evaluated by Spearman rank correlation analysis. Results Overall, scan series without AIDR3D had higher LAA% and lower 15th percentile values than those with AIDR3D at each tube current (P<0.0001). For scan series without AIDR3D, lower tube currents resulted in higher LAA% values and lower 15th percentiles. The extent of emphysema was significantly different between each pair among scans when not using AIDR3D (LAA%, P<0.0001; 15th percentile, P<0.01), but was not significantly different between each pair among scans when using AIDR3D. On scans without using AIDR3D, measurement errors between different tube current settings were significantly correlated with patients’ body weights (P<0.05), whereas these errors between scans when using AIDR3D were insignificantly or minimally correlated with body weight. Conclusion The extent of emphysema was more consistent across different tube currents when CT scans were converted to CT images using AIDR3D than using a conventional filtered-back projection
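
    Both emphysema indices used above, the percent low attenuation area below -950 HU and the 15th percentile of the lung density histogram, are simple functions of the segmented lung voxel values. A NumPy sketch on synthetic HU values; a real analysis would operate on the segmented lung field of the reconstructed CT volume:

      import numpy as np

      def emphysema_indices(lung_hu, threshold=-950.0):
          """Return (LAA%, 15th-percentile HU) for an array of lung-voxel attenuation values."""
          lung_hu = np.asarray(lung_hu, dtype=float)
          laa_percent = 100.0 * np.mean(lung_hu < threshold)
          percentile_15 = np.percentile(lung_hu, 15)
          return laa_percent, percentile_15

      # Synthetic lung HU values standing in for a segmented lung field
      rng = np.random.default_rng(1)
      hu = rng.normal(loc=-870.0, scale=60.0, size=100_000)
      laa, p15 = emphysema_indices(hu)
      print(f"LAA% = {laa:.1f}%, 15th percentile = {p15:.0f} HU")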

  16. TANK 40 FINAL SB5 CHEMICAL CHARACTERIZATION RESULTS PRIOR TO NP ADDITION

    SciTech Connect

    Bannochie, C.; Click, D.

    2010-01-06

    A sample of Sludge Batch 5 (SB5) was pulled from Tank 40 in order to obtain radionuclide inventory analyses necessary for compliance with the Waste Acceptance Product Specifications (WAPS). This sample was also analyzed for chemical composition including noble metals. Prior to radionuclide inventory analyses, a final sample of the H-canyon Np stream will be added to bound the Np addition anticipated for Tank 40. These analyses along with the WAPS radionuclide analyses will help define the composition of the sludge in Tank 40 that is currently being fed to DWPF as SB5. At the Savannah River National Laboratory (SRNL) the 3-L Tank 40 SB5 sample was transferred from the shipping container into a 4-L high density polyethylene vessel and solids allowed to settle overnight. Supernate was then siphoned off and circulated through the shipping container to complete the transfer of the sample. Following thorough mixing of the 3-L sample, a 239 g sub-sample was removed. This sub-sample was then utilized for all subsequent analytical samples. Eight separate aliquots of the slurry were digested, four with HNO₃/HCl (aqua regia) in sealed Teflon® vessels and four in Na₂O₂ (alkali or peroxide fusion) using Zr crucibles. Due to the use of Zr crucibles and Na in the peroxide fusions, Na and Zr cannot be determined from this preparation. Additionally, other alkali metals, such as Li and K that may be contaminants in the Na₂O₂ are not determined from this preparation. Three Analytical Reference Glass-1 (ARG-1) standards were digested along with a blank for each preparation. The ARG-1 glass allows for an assessment of the completeness of each digestion. Each aqua regia digestion and blank was diluted to 1:100 mL with deionized water and submitted to Analytical Development (AD) for inductively coupled plasma - atomic emission spectroscopy (ICPAES) analysis, inductively coupled plasma - mass spectrometry (ICP-MS) analysis of masses 81-209 and 230

  17. TANK 40 FINAL SB5 CHEMICAL CHARACTERIZATION RESULTS PRIOR TO NP ADDITION

    SciTech Connect

    Bannochie, C.; Click, D.

    2009-02-26

    A sample of Sludge Batch 5 (SB5) was pulled from Tank 40 in order to obtain radionuclide inventory analyses necessary for compliance with the Waste Acceptance Product Specifications (WAPS). This sample was also analyzed for chemical composition including noble metals. Prior to radionuclide inventory analyses, a final sample of the H-canyon Np stream will be added to bound the Np addition anticipated for Tank 40. These analyses along with the WAPS radionuclide analyses will help define the composition of the sludge in Tank 40 that is currently being fed to DWPF as SB5. At the Savannah River National Laboratory (SRNL) the 3-L Tank 40 SB5 sample was transferred from the shipping container into a 4-L high density polyethylene vessel and solids allowed to settle overnight. Supernate was then siphoned off and circulated through the shipping container to complete the transfer of the sample. Following thorough mixing of the 3-L sample, a 239 g sub-sample was removed. This sub-sample was then utilized for all subsequent analytical samples. Eight separate aliquots of the slurry were digested, four with HNO₃/HCl (aqua regia) in sealed Teflon® vessels and four in Na₂O₂ (alkali or peroxide fusion) using Zr crucibles. Due to the use of Zr crucibles and Na in the peroxide fusions, Na and Zr cannot be determined from this preparation. Additionally, other alkali metals, such as Li and K that may be contaminants in the Na₂O₂ are not determined from this preparation. Three Analytical Reference Glass-1 (ARG-1) standards were digested along with a blank for each preparation. The ARG-1 glass allows for an assessment of the completeness of each digestion. Each aqua regia digestion and blank was diluted to 1:100 mL with deionized water and submitted to Analytical Development (AD) for inductively coupled plasma - atomic emission spectroscopy (ICPAES) analysis, inductively coupled plasma - mass spectrometry (ICP-MS) analysis of masses 81-209 and 230

  18. Effect of preservative addition on sensory and dynamic profile of Lucanian dry-sausages as assessed by quantitative descriptive analysis and temporal dominance of sensations.

    PubMed

    Braghieri, Ada; Piazzolla, Nicoletta; Galgano, Fernanda; Condelli, Nicola; De Rosa, Giuseppe; Napolitano, Fabio

    2016-12-01

    The quantitative descriptive analysis (QDA) was combined with temporal dominance of sensations (TDS) to assess the sensory properties of Lucanian dry-sausages manufactured either with added nitrate, nitrite and l-ascorbic acid (NS) or without (NNS). Both QDA and TDS differentiated the two groups of sausages. NNS products were perceived with higher intensity of hardness (P<0.05) and tended to be perceived with higher intensities of flavor (P<0.10), pepper (P<0.20), and oiliness (P<0.20), while being lower in chewiness (P<0.20). TDS showed that in all the sausages hardness was the first dominant attribute; then, in NNS products flavor remained dominant until the end of tasting, whereas in NS products oiliness prevailed. In conclusion, TDS showed that the perception of some textural parameters, such as oiliness, during mastication was more dominant in NS products, whereas using conventional QDA this attribute appeared higher in sausages manufactured without preservatives. Therefore, TDS provided additional information for the description and differentiation of Lucanian sausages. PMID:27486959

  19. Speech Perception Results for Children Using Cochlear Implants Who Have Additional Special Needs

    ERIC Educational Resources Information Center

    Dettman, Shani J.; Fiket, Hayley; Dowell, Richard C.; Charlton, Margaret; Williams, Sarah S.; Tomov, Alexandra M.; Barker, Elizabeth J.

    2004-01-01

    Speech perception outcomes in young children with cochlear implants are affected by a number of variables including the age of implantation, duration of implantation, mode of communication, and the presence of a developmental delay or additional disability. The aim of this study is to examine the association between degree of developmental delay…

  20. Researchers’ views on return of incidental genomic research results: qualitative and quantitative findings

    PubMed Central

    Klitzman, Robert; Appelbaum, Paul S.; Fyer, Abby; Martinez, Josue; Buquez, Brigitte; Wynn, Julia; Waldman, Cameron R.; Phelan, Jo; Parens, Erik; Chung, Wendy K.

    2013-01-01

    Purpose: Comprehensive genomic analysis including exome and genome sequencing is increasingly being utilized in research studies, leading to the generation of incidental genetic findings. It is unclear how researchers plan to deal with incidental genetic findings. Methods: We conducted a survey of the practices and attitudes of 234 members of the US genetic research community and performed qualitative semistructured interviews with 28 genomic researchers to understand their views and experiences with incidental genetic research findings. Results: We found that 12% of the researchers had returned incidental genetic findings, and an additional 28% planned to do so. A large majority of researchers (95%) believe that incidental findings for highly penetrant disorders with immediate medical implications should be offered to research participants. However, there was no consensus on returning incidental results for other conditions varying in penetrance and medical actionability. Researchers raised concerns that the return of incidental findings would impose significant burdens on research and could potentially have deleterious effects on research participants if not performed well. Researchers identified assistance needed to enable effective, accurate return of incidental findings. Conclusion: The majority of the researchers believe that research participants should have the option to receive at least some incidental genetic research results. PMID:23807616

  1. Quantitative mass spectrometric analysis of dipeptides in protein hydrolysate by a TNBS derivatization-aided standard addition method.

    PubMed

    Hanh, Vu Thi; Kobayashi, Yutaro; Maebuchi, Motohiro; Nakamori, Toshihiro; Tanaka, Mitsuru; Matsui, Toshiro

    2016-01-01

    The aim of this study was to establish, through a standard addition method, a convenient quantification assay for dipeptides (GY, YG, SY, YS, and IY) in soybean hydrolysate using 2,4,6-trinitrobenzene sulfonate (TNBS) derivatization-aided LC-TOF-MS. Soybean hydrolysate samples (25.0 mg mL⁻¹) spiked with target standards were subjected to TNBS derivatization. Under the optimal LC-MS conditions, five target dipeptides derivatized with TNBS were successfully detected. Examination of the standard addition curves, with a correlation coefficient of r² > 0.979, provided a reliable quantification of the target dipeptides, GY, YG, SY, YS, and IY, in soybean hydrolysate to be 424 ± 20, 184 ± 9, 2188 ± 199, 327 ± 16, and 2211 ± 133 μg g⁻¹ of hydrolysate, respectively. The proposed LC-MS assay is a reliable and convenient assay method, with no interference from matrix effects in hydrolysate, and with no requirement for the use of an isotope labeled internal standard. PMID:26212980
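
    The record above quantifies dipeptides by extrapolating a standard addition curve rather than by using an isotope-labeled internal standard. The sketch below illustrates only that arithmetic: fit signal against spiked amount and read the analyte concentration off the x-intercept. The spike levels, peak areas, and dilution handling are invented placeholders, not data from the paper.

    ```python
    # Standard addition extrapolation with hypothetical numbers (not the paper's data).
    import numpy as np

    added = np.array([0.0, 0.5, 1.0, 2.0, 4.0])             # spiked dipeptide, ug/mL (hypothetical)
    signal = np.array([120.0, 185.0, 248.0, 379.0, 642.0])  # LC-MS peak areas (hypothetical)

    slope, intercept = np.polyfit(added, signal, 1)          # linear standard addition curve
    r2 = np.corrcoef(added, signal)[0, 1] ** 2               # curve quality check

    conc_in_solution = intercept / slope                     # ug/mL in the prepared sample solution
    conc_per_gram = conc_in_solution / 0.025                 # ug per g of hydrolysate (25 mg/mL = 0.025 g/mL)

    print(f"r^2 = {r2:.4f}, estimated content = {conc_per_gram:.0f} ug/g hydrolysate")
    ```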

  2. Spinel dissolution via addition of glass forming chemicals. Results of preliminary experiments

    SciTech Connect

    Fox, K. M.; Johnson, F. C.

    2015-11-01

    Increased loading of high level waste in glass can lead to crystallization within the glass. Some crystalline species, such as spinel, have no practical impact on the chemical durability of the glass, and therefore may be acceptable from both a processing and a product performance standpoint. In order to operate a melter with a controlled amount of crystallization, options must be developed for remediating an unacceptable accumulation of crystals. This report describes preliminary experiments designed to evaluate the ability to dissolve spinel crystals in simulated waste glass melts via the addition of glass forming chemicals (GFCs).

  3. Prolonged durability of electroporation microarrays as a result of addition of saccharides to nucleic acids.

    PubMed

    Fujimoto, Hiroyuki; Kato, Koichi; Iwata, Hiroo

    2009-01-01

    The electroporation microarray is a useful tool for high-throughput analysis of gene functions. However, transfection efficiency is greatly impaired by storage of the microarrays, due to water evaporation from arrayed nucleotides. In this study, we aimed to evaluate the effect of saccharides and sugar alcohols added to solutions of plasmid DNA or small interfering RNA (siRNA). Microarrays loaded with plasmids and siRNAs were prepared with various polyols, including sugars and sugar alcohols. After storage of these microarrays at different temperatures for various time periods, transfection efficiency was evaluated using human embryonic kidney cells. In the case of plasmid-loaded microarrays, addition of monosaccharides (glucose, fructose), disaccharides (trehalose, sucrose), and a trisaccharide (raffinose) served to retain transfection efficiency at a reasonably high level after storage at -20 degrees C. These effects may arise because moisture retention maintains the solubility of the DNA. In contrast, a polysaccharide (dextran) and a sugar alcohol (glycerol) had insignificant effects on retention of transfection efficiency. On the other hand, addition of saccharides and sugar alcohols had insignificant effects on the transfection of siRNA after storage of a microarray at 25 degrees C for 7 days, presumably due to the intrinsically high solubility of siRNA, which consists of short nucleotides. PMID:18989662

  4. QUANTITATIVE EVALUATION OF ASR DETERIORATION LEVEL BASED ON SURVEY RESULT OF EXISTING STRUCTURE

    NASA Astrophysics Data System (ADS)

    Kawashima, Yasushi; Kosa, Kenji; Matsumoto, Shigeru; Miura, Masatsugu

    The relationship between crack density and the compressive strength of core cylinders drilled from an actual structure damaged by ASR was investigated. The results showed that even when the crack density increased by about 1.0 m/m², the compressive strength decreased by only 2 N/mm². A new method is then proposed for estimating future compressive strength from the crack density accumulated to date. In addition, the decline in compressive strength was initially proportional to the ASR expansion, and the reason the decline flattens into a gentler curve afterwards was examined. To do this, cylindrical test specimens were cut in the longitudinal direction and the ASR cracks that had arisen during the loading test were observed in detail on the cut plane. As a result, it was found that the rupture line rarely coincided with the ASR cracks, and that the load is resisted by interlocking between coarse aggregate and concrete across the crack plane.

  5. A Quantitative Study of the Resultant Differences between Additive Practices and Reductive Practices in Data Requirements Gathering

    ERIC Educational Resources Information Center

    Johnson, Gerald

    2016-01-01

    With the increase in technology in all facets of our lives and work, there is an ever increasing set of expectations that people have regarding information availability, response time, and dependability. While expectations are affected by gender, age, experience, industry, and other factors, people have expectations of technology, and from…

  6. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.
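
    A decorrelation stretch of the kind applied to the TIMS bands is a standard principal-component enhancement: rotate the bands onto their principal axes, equalize the variances, and rotate back before rescaling for display. The sketch below is a generic implementation under those assumptions; the synthetic cube and the band indices mapped to red, green, and blue are placeholders, not the actual TIMS processing chain.

    ```python
    # Generic decorrelation stretch for a (rows, cols, bands) image cube.
    import numpy as np

    def decorrelation_stretch(cube):
        rows, cols, bands = cube.shape
        flat = cube.reshape(-1, bands).astype(float)
        mean = flat.mean(axis=0)
        cov = np.cov(flat - mean, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)                   # principal axes of band covariance
        whiten = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + 1e-12)) @ eigvecs.T
        stretched = (flat - mean) @ whiten                       # decorrelate and equalize variances
        stretched = stretched * flat.std(axis=0) + mean          # restore per-band contrast and offset
        lo, hi = stretched.min(axis=0), stretched.max(axis=0)
        return ((stretched - lo) / (hi - lo) * 255.0).reshape(rows, cols, bands)

    # Hypothetical 6-band cube; bands 5, 3, 1 (0-based indices 4, 2, 0) mapped to R, G, B.
    tims = np.random.rand(64, 64, 6)
    rgb = decorrelation_stretch(tims[:, :, [4, 2, 0]])
    ```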

  7. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
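
    The diagnostic performance figures quoted above all derive from a 2 x 2 table of test result versus final diagnosis. The sketch below shows that arithmetic; the counts are made up to land near the quoted sensitivity and specificity and are not the study's underlying table.

    ```python
    # Sensitivity, specificity, PPV, NPV and accuracy from a hypothetical 2 x 2 table.
    def diagnostic_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)
        npv = tn / (tn + fn)
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        return sensitivity, specificity, ppv, npv, accuracy

    # Hypothetical counts for 460 joints, 241 of them with painful TMD.
    print(diagnostic_metrics(tp=151, fp=88, fn=90, tn=131))
    ```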

  8. Common standards for quantitative electrocardiography: goals and main results. CSE Working Party.

    PubMed

    Willems, J L; Arnaud, P; van Bemmel, J H; Degani, R; Macfarlane, P W; Zywietz, C

    1990-09-01

    Computer processing of electrocardiograms (ECGs) has increased rapidly over the last 15 years. Still, there are at present no standards for computer ECG interpretation. Different techniques are used not only for measurement and interpretation, but also for transmission and storage of data. In order to fill these gaps, a large international project, sponsored by the European Commission, was launched in 1980 to develop "Common Standards for Quantitative Electrocardiography (CSE)". The main objective of the first CSE study was to reduce the wide variation in wave measurements currently obtained by ECG computer programs. The second study was started in 1985 and aimed at the assessment and improvement of diagnostic classification of ECG interpretation programs. To this end, reference libraries of well documented ECGs have been developed and comprehensive reviewing schemes devised for the visual and computer analysis of ECGs. This task was performed by a board of cardiologists in a Delphi review process, and by 9 VCG and 10 standard 12-lead programs developed by university research groups and by industry. A third action was started in June 1989 to harmonize acquisition, encoding, interchange and storage of digital ECG data. The actions thus performed have become internationally recognized milestones for the standardization of quantitative electrocardiography. PMID:2233372

  9. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.
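
    The record reports lower confidence limits on the probability of failure-free operation without giving the algorithm. As a loosely related textbook illustration only, the sketch below computes a one-sided Clopper-Pearson lower bound on reliability from a hypothetical number of ground tests and observed sudden failures; it is not the TOPAZ-2 methodology.

    ```python
    # One-sided Clopper-Pearson lower confidence limit on reliability (illustrative only).
    from scipy.stats import beta

    def reliability_lower_limit(n_tests, n_failures, confidence=0.9):
        successes = n_tests - n_failures
        if successes == 0:
            return 0.0
        # Lower bound on the binomial success probability at the given one-sided confidence.
        return beta.ppf(1.0 - confidence, successes, n_failures + 1)

    # Hypothetical example: 25 ground tests of a subsystem with 1 sudden failure.
    print(f"R_lower = {reliability_lower_limit(25, 1):.3f}")
    ```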

  10. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.

  11. XLF deficiency results in reduced N-nucleotide addition during V(D)J recombination

    PubMed Central

    IJspeert, Hanna; Rozmus, Jacob; Schwarz, Klaus; Warren, René L.; van Zessen, David; Holt, Robert A.; Pico-Knijnenburg, Ingrid; Simons, Erik; Jerchel, Isabel; Wawer, Angela; Lorenz, Myriam; Patıroğlu, Turkan; Akar, Himmet Haluk; Leite, Ricardo; Verkaik, Nicole S.; Stubbs, Andrew P.; van Gent, Dik C.; van Dongen, Jacques J. M.

    2016-01-01

    Repair of DNA double-strand breaks (DSBs) by the nonhomologous end-joining pathway (NHEJ) is important not only for repair of spontaneous breaks but also for breaks induced in developing lymphocytes during V(D)J (variable [V], diversity [D], and joining [J] genes) recombination of their antigen receptor loci to create a diverse repertoire. Mutations in the NHEJ factor XLF result in extreme sensitivity for ionizing radiation, microcephaly, and growth retardation comparable to mutations in LIG4 and XRCC4, which together form the NHEJ ligation complex. However, the effect on the immune system is variable (mild to severe immunodeficiency) and less prominent than that seen in deficiencies of NHEJ factors ARTEMIS and DNA-dependent protein kinase catalytic subunit, with defects in the hairpin opening step, which is crucial and unique for V(D)J recombination. Therefore, we aimed to study the role of XLF during V(D)J recombination. We obtained clinical data from 9 XLF-deficient patients and performed immune phenotyping and antigen receptor repertoire analysis of immunoglobulin (Ig) and T-cell receptor (TR) rearrangements, using next-generation sequencing in 6 patients. The results were compared with XRCC4 and LIG4 deficiency. Both Ig and TR rearrangements showed a significant decrease in the number of nontemplated (N) nucleotides inserted by terminal deoxynucleotidyl transferase, which resulted in a decrease of 2 to 3 amino acids in the CDR3. Such a reduction in the number of N-nucleotides has a great effect on the junctional diversity, and thereby on the total diversity of the Ig and TR repertoire. This shows that XLF has an important role during V(D)J recombination in creating diversity of the repertoire by stimulating N-nucleotide insertion. PMID:27281794

  12. XLF deficiency results in reduced N-nucleotide addition during V(D)J recombination.

    PubMed

    IJspeert, Hanna; Rozmus, Jacob; Schwarz, Klaus; Warren, René L; van Zessen, David; Holt, Robert A; Pico-Knijnenburg, Ingrid; Simons, Erik; Jerchel, Isabel; Wawer, Angela; Lorenz, Myriam; Patıroğlu, Turkan; Akar, Himmet Haluk; Leite, Ricardo; Verkaik, Nicole S; Stubbs, Andrew P; van Gent, Dik C; van Dongen, Jacques J M; van der Burg, Mirjam

    2016-08-01

    Repair of DNA double-strand breaks (DSBs) by the nonhomologous end-joining pathway (NHEJ) is important not only for repair of spontaneous breaks but also for breaks induced in developing lymphocytes during V(D)J (variable [V], diversity [D], and joining [J] genes) recombination of their antigen receptor loci to create a diverse repertoire. Mutations in the NHEJ factor XLF result in extreme sensitivity for ionizing radiation, microcephaly, and growth retardation comparable to mutations in LIG4 and XRCC4, which together form the NHEJ ligation complex. However, the effect on the immune system is variable (mild to severe immunodeficiency) and less prominent than that seen in deficiencies of NHEJ factors ARTEMIS and DNA-dependent protein kinase catalytic subunit, with defects in the hairpin opening step, which is crucial and unique for V(D)J recombination. Therefore, we aimed to study the role of XLF during V(D)J recombination. We obtained clinical data from 9 XLF-deficient patients and performed immune phenotyping and antigen receptor repertoire analysis of immunoglobulin (Ig) and T-cell receptor (TR) rearrangements, using next-generation sequencing in 6 patients. The results were compared with XRCC4 and LIG4 deficiency. Both Ig and TR rearrangements showed a significant decrease in the number of nontemplated (N) nucleotides inserted by terminal deoxynucleotidyl transferase, which resulted in a decrease of 2 to 3 amino acids in the CDR3. Such a reduction in the number of N-nucleotides has a great effect on the junctional diversity, and thereby on the total diversity of the Ig and TR repertoire. This shows that XLF has an important role during V(D)J recombination in creating diversity of the repertoire by stimulating N-nucleotide insertion. PMID:27281794

  13. Quantitative trait loci with additive effects on growth and carcass traits in a Wagyu-Limousin F2 population.

    PubMed

    Alexander, L J; Geary, T W; Snelling, W M; Macneil, M D

    2007-08-01

    A whole-genome scan for carcass traits [average daily gain during the pre-weaning, growth and finishing periods; birth weight; hot carcass weight and longissimus muscle area (LMA)] was performed on 328 F(2) progeny produced from Wagyu x Limousin-cross parents derived from eight founder Wagyu bulls. Nine significant (P results provide insight into genetic differences between the Wagyu and Limousin breeds. PMID:17596127

  14. Aircraft-Produced Ice Particles (APIPs): Additional Results and Further Insights.

    NASA Astrophysics Data System (ADS)

    Woodley, William L.; Gordon, Glenn; Henderson, Thomas J.; Vonnegut, Bernard; Rosenfeld, Daniel; Detwiler, Andrew

    2003-05-01

    This paper presents new results from studies of aircraft-produced ice particles (APIPs) in supercooled fog and clouds. Nine aircraft, including a Beech King Air 200T cloud physics aircraft, a Piper Aztec, a Cessna 421-C, two North American T-28s, an Aero Commander, a Piper Navajo, a Beech Turbo Baron, and a second four-bladed King Air were involved in the tests. The instrumented King Air served as the monitoring aircraft for trails of ice particles created, or not created, when the other aircraft were flown through clouds at various temperatures and served as both the test and monitoring aircraft when it itself was tested. In some cases sulfur hexafluoride (SF6) gas was released by the test aircraft during its test run and was detected by the King Air during its monitoring passes to confirm the location of the test aircraft wake. Ambient temperatures for the tests ranged between -5° and -12°C. The results confirm earlier published results and provide further insights into the APIPs phenomenon. The King Air at ambient temperatures colder than -8°C can produce APIPs readily. The Piper Aztec and the Aero Commander also produced APIPs under the test conditions in which they were flown. The Cessna 421, Piper Navajo, and Beech Turbo Baron did not. The APIPs production potential of a T-28 is still indeterminate because a limited range of conditions was tested. Homogeneous nucleation in the adiabatically cooled regions where air is expanding around the rapidly rotating propeller tips is the cause of APIPs. An equation involving the propeller efficiency, engine thrust, and true airspeed of the aircraft is used along with the published thrust characteristics of the propellers to predict when the aircraft will produce APIPs. In most cases the predictions agree well with the field tests. Of all of the aircraft tested, the Piper Aztec, despite its small size and low horsepower, was predicted to be the most prolific producer of APIPs, and this was confirmed in field tests. The

  15. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years.

    PubMed

    Tapaltsyan, Vagan; Eronen, Jussi T; Lawing, A Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D

    2015-05-01

    The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem-cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3,500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine whether evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530
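
    The record does not specify the Markov model, so the following is only a toy sketch of the general idea: discrete crown-height states with transition probabilities biased toward taller crowns, iterated over many steps so that the hypselodont state gradually dominates. The states, probabilities, and step length are invented for illustration.

    ```python
    # Toy Markov-chain sketch of crown-height evolution (all numbers invented).
    import numpy as np

    states = ["brachydont", "mesodont", "hypsodont", "hypselodont"]
    # Rows: current state; columns: state one step (roughly 1 Myr) later.
    P = np.array([
        [0.95, 0.05, 0.00, 0.00],
        [0.01, 0.94, 0.05, 0.00],
        [0.00, 0.01, 0.94, 0.05],
        [0.00, 0.00, 0.00, 1.00],   # hypselodonty treated as effectively absorbing
    ])

    dist = np.array([1.0, 0.0, 0.0, 0.0])   # start with all lineages low-crowned
    for _ in range(50):                     # ~50 Myr of steps
        dist = dist @ P

    for name, frac in zip(states, dist):
        print(f"{name:12s} {frac:.2f}")
    ```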

  16. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years

    PubMed Central

    Mushegyan, Vagan; Eronen, Jussi T.; Lawing, A. Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D.

    2015-01-01

    The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine if evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic, and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem-cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530

  17. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improves sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although they may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959

  18. Flue gas conditioning for improved particle collection in electrostatic precipitators. First topical report, Results of laboratory screening of additives

    SciTech Connect

    Durham, M.D.

    1993-04-16

    Several tasks have been completed in a program to evaluate additives to improve fine particle collection in electrostatic precipitators. Screening tests and laboratory evaluations of additives are summarized in this report. Over 20 additives were evaluated; four were found to improve flyash precipitation rates. The Insitec particle analyzer was also evaluated; test results show that the analyzer will provide accurate sizing and counting information for particles in the size range of ≤ 10 μm dia.

  19. Nitrogen Addition as a Result of Long-Term Root Removal Affects Soil Organic Matter Dynamics

    NASA Astrophysics Data System (ADS)

    Crow, S. E.; Lajtha, K.

    2004-12-01

    A long-term field litter manipulation site was established in a mature coniferous forest stand at the H.J. Andrews Experimental Forest, OR, USA in 1997 in order to address how detrital inputs influence soil organic matter formation and accumulation. Soils at this site are Andisols and are characterized by high carbon (C) and low nitrogen (N) contents, due largely to the legacy of woody debris and extremely low atmospheric N deposition. Detrital treatments include trenching to remove roots, doubling wood and needle litter, and removing aboveground litter. In order to determine whether five years of detrital manipulation had altered organic matter quantity and lability at this site, soil from the top 0-5 cm of the A horizon was density fractionated to separate the labile light fraction (LF) from the more recalcitrant mineral soil in the heavy fraction (HF). Both density fractions and whole soils were incubated for one year in chambers designed such that repeated measurements of soil respiration and leachate chemistry could be made. Trenching resulted in the removal of labile root inputs from root exudates and turnover of fine roots and active mycorrhizal communities as well as an increase of available N by removing plant uptake. Since 1999, soil solution chemistry from tension lysimeters has shown greater total N and dissolved organic nitrogen (DON) flux and less dissolved organic carbon (DOC) flux to stream flow in the trenched plots relative to the other detrital treatments. C/N ratio and C content of both light and heavy fractions from the trenched plots were greater than other detrital treatments. In the lab incubation, over the course of a year C mineralization from these soils was suppressed. Cumulative DOC losses and CO2 efflux both were significantly less in soils from trenched plots than in other detrital treatments including controls. After day 150 of the incubation, leachates from the HF of plots with trenched treatments had a DOC/DON ratio significantly

  20. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... materials for animal feed and pet food. 570.14 Section 570.14 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  1. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... materials for animal feed and pet food. 570.14 Section 570.14 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  2. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... materials for animal feed and pet food. 570.14 Section 570.14 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  3. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... materials for animal feed and pet food. 570.14 Section 570.14 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  4. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... materials for animal feed and pet food. 570.14 Section 570.14 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  5. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  6. Detection of multivessel disease in patients with sustained myocardial infarction by thallium 201 myocardial scintigraphy: No additional value of quantitative analysis

    SciTech Connect

    Niemeyer, M.G.; Pauwels, E.K.; van der Wall, E.E.; Cramer, M.J.; Verzijlbergen, J.F.; Zwinderman, A.H.; Ascoop, C.A. )

    1989-01-01

    This study was performed to determine the value of visual and quantitative thallium 201 scintigraphy for the detection of multivessel disease in 67 patients with a sustained transmural myocardial infarction. Also the viability of the myocardial regions corresponding to pathologic Q-waves was evaluated. Of the 67 patients, 51 patients had multivessel coronary artery disease (76%). The sensitivity of the exercise test was 53%, of thallium scintigraphy 69%, when interpreted visually, and 67%, when analysed quantitatively. The specificity of these methods was 69%, 56%, and 50%, respectively. Sixty-two infarct-related flow regions were detected by visual analysis of the thallium scans; total redistribution was observed in 11/62 (18%) of these regions, partial redistribution in 26/62 (42%), and no redistribution in 25/62 (40%). The infarct-related areas with total redistribution on the thallium scintigrams were more likely to be associated with normal or hypokinetic wall motion (7/11: 64%) than the areas with a persistent defect (7/25: 28%) (P = 0.05), which were more often associated with akinetic or dyskinetic wall motion. Based on our results, it is concluded that (1) both visual and quantitative analysis of thallium exercise scintigraphy have limited value for predicting the presence or absence of multivessel coronary artery disease in patients with sustained myocardial infarction, and (2) exercise-induced thallium redistribution may occur within the infarct zone, suggesting the presence of viable but jeopardized myocardium in presumed fibrotic myocardial areas.

  7. Goals of Secondary Education as Perceived by Education Consumers. Volume IV, Quantitative Results.

    ERIC Educational Resources Information Center

    New Mexico Univ., Albuquerque. Inst. for Social Research and Development.

    The results of a study to determine attitudes of parents and professional educators toward educational goals for secondary school students are analyzed in this report. The survey was conducted in two communities--Albuquerque, New Mexico, and Philadelphia, Pennsylvania. The essential nature of the results is summarized by the following categories:…

  8. Quantitative assessment of port-wine stains using chromametry: preliminary results

    NASA Astrophysics Data System (ADS)

    Beacco, Claire; Brunetaud, Jean Marc; Rotteleur, Guy; Steen, D. A.; Brunet, F.

    1996-12-01

    Objective assessment of the efficacy of different lasers for the treatment of port wine stains remains difficult. Chromametry gives reproducible information on the color of PWS, but its data are of little direct use to a physician. Thus specific software was developed to allow graphic representation of PWS characteristics. Before the first laser treatment and after every treatment, tests were done using a chromameter on a marked zone of the PWS and on the contralateral normal zone, which represents the reference. The software calculates and represents graphically the difference of color between PWS and normal skin using data provided by the chromameter. Three parameters are calculated: ΔH is the difference of hue, ΔL is the difference of lightness, and ΔE is the total difference of color. Each measured zone is represented by its coordinates. Calculated initial values were compared with the subjective initial color assessed by the dermatologist. The variation of the color difference was calculated using the successive values of ΔE after n treatments and was compared with the subjective classification of fading. Since January 1995, forty-three locations have been measured before laser treatment. Purple PWS tended to differentiate from others, but red and dark pink PWS could not be differentiated. The evolution of the color after treatment was calculated in 29 PWS treated 3 or 4 times. Poor results corresponded to an increase of ΔE. Fair and good results were associated with a decrease of ΔE. We did not observe excellent results during this study. These promising preliminary results need to be confirmed in a larger group of patients.
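
    The total colour difference reported by a chromameter is conventionally the Euclidean distance between two CIELAB readings (the CIE76 formula, ΔE = sqrt(ΔL² + Δa² + Δb²)); the study's exact definitions of ΔH and ΔL may differ in detail. A minimal sketch with invented L*a*b* values:

    ```python
    # CIE76 colour difference between two hypothetical L*a*b* readings.
    import math

    def delta_e_cie76(lab_lesion, lab_reference):
        dl, da, db = [x - y for x, y in zip(lab_lesion, lab_reference)]
        return math.sqrt(dl**2 + da**2 + db**2)

    pws = (48.2, 24.1, 10.5)     # hypothetical L*, a*, b* of the port-wine stain
    normal = (62.7, 12.3, 14.8)  # hypothetical contralateral normal skin
    print(f"delta E = {delta_e_cie76(pws, normal):.1f}")
    ```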

  9. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-12-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  10. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  11. Quantitative Assessment of the CCMC's Experimental Real-time SWMF-Geospace Results

    NASA Astrophysics Data System (ADS)

    Liemohn, Michael; Ganushkina, Natalia; De Zeeuw, Darren; Welling, Daniel; Toth, Gabor; Ilie, Raluca; Gombosi, Tamas; van der Holst, Bart; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz

    2016-04-01

    Experimental real-time simulations of the Space Weather Modeling Framework (SWMF) are conducted at the Community Coordinated Modeling Center (CCMC), with results available there (http://ccmc.gsfc.nasa.gov/realtime.php), through the CCMC Integrated Space Weather Analysis (iSWA) site (http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/), and the Michigan SWMF site (http://csem.engin.umich.edu/realtime). Presently, two configurations of the SWMF are running in real time at CCMC, both focusing on the geospace modules, using the BATS-R-US magnetohydrodynamic model, the Ridley Ionosphere Model, and with and without the Rice Convection Model for inner magnetospheric drift physics. While both have been running for several years, nearly continuous results are available since July 2015. Dst from the model output is compared against the Kyoto real-time Dst, in particular the daily minimum value of Dst to quantify the ability of the model to capture storms. Contingency tables are presented, showing that the run with the inner magnetosphere model is much better at reproducing storm-time values. For disturbances with a minimum Dst lower than -50 nT, this version yields a probability of event detection of 0.86 and a Heidke Skill Score of 0.60. In the other version of the SWMF, without the inner magnetospheric module included, the modeled Dst never dropped below -50 nT during the examined epoch.
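
    The probability of event detection and the Heidke Skill Score quoted above are standard contingency-table verification scores. The sketch below shows the usual formulas; the counts are placeholders chosen only to land near the quoted values, not the actual SWMF-Geospace tallies.

    ```python
    # Probability of detection and Heidke Skill Score from a 2 x 2 contingency table.
    def pod_and_hss(hits, misses, false_alarms, correct_negatives):
        n = hits + misses + false_alarms + correct_negatives
        pod = hits / (hits + misses)
        # Number of correct forecasts expected by chance.
        expected = ((hits + misses) * (hits + false_alarms) +
                    (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
        hss = (hits + correct_negatives - expected) / (n - expected)
        return pod, hss

    # Hypothetical storm-event counts (Dst minimum below -50 nT).
    print(pod_and_hss(hits=18, misses=3, false_alarms=15, correct_negatives=150))
    ```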

  12. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Task 3 Full-scale Test Results

    SciTech Connect

    Gary Blythe

    2007-05-01

    in Texas Lignite Flue Gas; Task 3 - Full-scale FGD Additive Testing in High-sulfur Eastern Bituminous Flue Gas; Task 4 - Pilot Wet Scrubber Additive Tests at Plant Yates; and Task 5 - Full-scale Additive Tests at Plant Yates. The pilot-scale tests were completed in 2005 and have been previously reported. This topical report presents the results from the Task 3 full-scale additive tests, conducted at IPL's Petersburg Station Unit 2. The Task 5 full-scale additive tests will be conducted later in calendar year 2007.

  13. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  14. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and whether therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  15. 49 CFR 1155.23 - Additional requirements when filing after an unsatisfactory result from a State, local, or...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 8 2010-10-01 2010-10-01 false Additional requirements when filing after an unsatisfactory result from a State, local, or municipal authority affecting the siting of the facility. 1155.23 Section 1155.23 Transportation Other Regulations Relating to Transportation (Continued) SURFACE TRANSPORTATION BOARD, DEPARTMENT...

  16. Quantitative Results from Shockless Compression Experiments on Solids to Multi-Megabar Pressure

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul; Brown, Justin; Knudson, Marcus; Lemke, Raymond

    2015-03-01

    Quasi-isentropic, shockless ramp-wave experiments promise accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multi-megabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited pressure accuracy due to reliance on theoretical EOS of calibration standards, thus accurate quasi-isentropic compression data would help immensely in constraining EOS models. Multi-megabar shockless compression experiments using the Z Machine at Sandia as a magnetic drive with stripline targets continue to be performed on a number of solids. New developments will be presented in the design and analysis of these experiments, including topics such as 2-D and magneto-hydrodynamic (MHD) effects and the use of LiF windows. Results will be presented for tantalum and/or gold metals, with comparisons to independently developed EOS. * Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  17. Parents' decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results.

    PubMed

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9-10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents' general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine uptake for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  18. Parents’ decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results

    PubMed Central

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

    Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9–10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents’ general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine uptake for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  19. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  20. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose: To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods: With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results: Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion: Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation
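
    Bland-Altman limits of agreement, used above to compare density estimates across dose levels, are the mean paired difference plus or minus 1.96 standard deviations of the differences. A small sketch on synthetic percent-density values (not ACRIN PA 4006 data):

    ```python
    # Bland-Altman limits of agreement on synthetic percent-density pairs.
    import numpy as np

    def bland_altman_limits(x, y):
        diff = np.asarray(x) - np.asarray(y)
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    rng = np.random.default_rng(0)
    pd_standard = rng.uniform(5, 60, 200)               # percent density, standard-dose acquisition
    pd_low = pd_standard + rng.normal(0.5, 2.0, 200)    # low-dose estimate with small bias and noise
    print(bland_altman_limits(pd_standard, pd_low))
    ```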

  1. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Task 5 Full-Scale Test Results

    SciTech Connect

    Gary Blythe; MariJon Owens

    2007-12-01

    and reporting. The other four tasks involve field testing on FGD systems, either at pilot or full scale. The four tasks include: Task 2 - Pilot Additive Testing in Texas Lignite Flue Gas; Task 3 - Full-scale FGD Additive Testing in High-sulfur Eastern Bituminous Flue Gas; Task 4 - Pilot Wet Scrubber Additive Tests at Plant Yates; and Task 5 - Full-scale Additive Tests at Plant Yates. The pilot-scale tests and the full-scale test using high-sulfur coal were completed in 2005 and 2006 and have been previously reported. This topical report presents the results from the Task 5 full-scale additive tests, conducted at Southern Company's Plant Yates Unit 1. Both additives were tested there.

  2. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
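
    Of the two roughness parameters named above, Rq is the root-mean-square deviation of the measured heights about their mean. The sketch below computes it for a synthetic height profile; the study's window resizing analysis and its Hmm parameter are not reproduced here.

    ```python
    # RMS roughness (Rq) of a synthetic height profile.
    import numpy as np

    def rq_roughness(heights):
        z = np.asarray(heights, dtype=float)
        dev = z - z.mean()                 # deviations from the mean surface line
        return np.sqrt(np.mean(dev ** 2))

    profile = np.random.default_rng(1).normal(0.0, 0.35, 2048)   # heights in micrometres (synthetic)
    print(f"Rq = {rq_roughness(profile):.3f} um")
    ```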

  3. Quantitative Pleistocene calcareous nannofossil biostratigraphy: preliminary results from the IODP Site U1385 (Exp 339), the Shackleton Site

    NASA Astrophysics Data System (ADS)

    Balestra, B.; Flores, J. A.; Acton, G.; Alvarez Zarikian, C. A.; Grunert, P.; Hernandez-Molina, F. J.; Hodell, D. A.; Li, B.; Richter, C.; Sanchez Goni, M.; Sierro, F. J.; Singh, A.; Stow, D. A.; Voelker, A.; Xuan, C.

    2013-12-01

    In order to explore the effects of Mediterranean Outflow Water (MOW) on North Atlantic circulation and climate, Integrated Ocean Drilling Program (IODP) Expedition 339 (Mediterranean Outflow) cored a series of sites in the Gulf of Cadiz slope and off West Iberia (North East Atlantic). Site U1385 (37°48'N, 10°10'W, 3146 m water depth) was selected and drilled in the lower slope of the Portuguese margin, at a location close to the so-called Shackleton Site MD95-2042 (in honor of the late Sir Nicholas Shackleton), to provide a marine reference section of Pleistocene millennial-scale climate variability. Three holes were cored at Site U1385 using the Advanced Piston Corer (APC) to a depth of ~151 meters below seafloor in order to recover a continuous stratigraphic record covering the past 1.4 Ma. Here we present preliminary results of the succession of standard and alternative calcareous nannofossil events. Our quantitative study based on calcareous nannofossils shows well-preserved and abundant assemblages throughout the core. Most conventional Pleistocene events were recognized. Moreover, our quantitative investigations provide further data on the stratigraphic distribution of some species and groups, such as the large Emiliania huxleyi (>4 μm), the small Gephyrocapsa group, and Reticulofenestra cisnerosii. A preliminary calibration of the calcareous nannofossil events with the paleomagnetic and astronomical signal, estimated by comparison with geophysical and logging parameters, is also presented. *IODP Expedition 339 Scientists: Bahr, A., Ducassou, E., Flood, R., Furota, S., Jimenez-Espejo, F., Kim, J. K., Krissek, L., Kuroda, J., Llave, E., Lofi, J., Lourens, L., Miller, M., Nanayama, F., Nishida, N., Roque, C., Sloss, C., Takashimizu, Y., Tzanova, A., Williams, T.

  4. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…

  5. Lanthanum Tricyanide-Catalyzed Acyl Silane-Ketone Benzoin Additions and Kinetic Resolution of Resultant α-Silyloxyketones

    PubMed Central

    Tarr, James C.

    2010-01-01

    We report the full account of our efforts on the lanthanum tricyanide-catalyzed acyl silane-ketone benzoin reaction. The reaction exhibits a wide scope in both acyl silane (aryl, alkyl) and ketone (aryl-alkyl, alkyl-alkyl, aryl-aryl, alkenyl-alkyl, alkynyl-alkyl) coupling partners. The diastereoselectivity of the reaction has been examined in both cyclic and acyclic systems. Cyclohexanones give products arising from equatorial attack by the acyl silane. The diastereoselectivity of acyl silane addition to acyclic α-hydroxy ketones can be controlled by varying the protecting group to obtain either Felkin-Anh or chelation control. The resultant α-silyloxyketone products can be resolved with selectivity factors from 10 to 15 by subjecting racemic ketone benzoin products to CBS reduction. PMID:20392127

  6. Divergent targets of glycolysis and oxidative phosphorylation result in additive effects of metformin and starvation in colon and breast cancer.

    PubMed

    Marini, Cecilia; Bianchi, Giovanna; Buschiazzo, Ambra; Ravera, Silvia; Martella, Roberto; Bottoni, Gianluca; Petretto, Andrea; Emionite, Laura; Monteverde, Elena; Capitanio, Selene; Inglese, Elvira; Fabbi, Marina; Bongioanni, Francesca; Garaboldi, Lucia; Bruzzi, Paolo; Orengo, Anna Maria; Raffaghello, Lizzia; Sambuceti, Gianmario

    2016-01-01

    Emerging evidence demonstrates that targeting energy metabolism is a promising strategy to fight cancer. Here we show that combining metformin and short-term starvation markedly impairs metabolism and growth of colon and breast cancer. The impairment in glycolytic flux caused by starvation is enhanced by metformin through its interference with hexokinase II activity, as documented by measurement of 18F-fluorodeoxyglucose uptake. Oxidative phosphorylation is additively compromised by combined treatment: metformin virtually abolishes Complex I function; starvation determines an uncoupled status of OXPHOS and amplifies the activity of respiratory Complexes II and IV, thus combining a massive ATP depletion with a significant increase in reactive oxygen species. More importantly, the combined treatment profoundly impairs cancer glucose metabolism and virtually abolishes lesion growth in experimental models of breast and colon carcinoma. Our results strongly suggest that energy metabolism is a promising target to reduce cancer progression. PMID:26794854

  7. Divergent targets of glycolysis and oxidative phosphorylation result in additive effects of metformin and starvation in colon and breast cancer

    PubMed Central

    Marini, Cecilia; Bianchi, Giovanna; Buschiazzo, Ambra; Ravera, Silvia; Martella, Roberto; Bottoni, Gianluca; Petretto, Andrea; Emionite, Laura; Monteverde, Elena; Capitanio, Selene; Inglese, Elvira; Fabbi, Marina; Bongioanni, Francesca; Garaboldi, Lucia; Bruzzi, Paolo; Orengo, Anna Maria; Raffaghello, Lizzia; Sambuceti, Gianmario

    2016-01-01

    Emerging evidence demonstrates that targeting energy metabolism is a promising strategy to fight cancer. Here we show that combining metformin and short-term starvation markedly impairs metabolism and growth of colon and breast cancer. The impairment in glycolytic flux caused by starvation is enhanced by metformin through its interference with hexokinase II activity, as documented by measurement of 18F-fluorodeoxyglucose uptake. Oxidative phosphorylation is additively compromised by combined treatment: metformin virtually abolishes Complex I function; starvation determines an uncoupled status of OXPHOS and amplifies the activity of respiratory Complexes II and IV, thus combining a massive ATP depletion with a significant increase in reactive oxygen species. More importantly, the combined treatment profoundly impairs cancer glucose metabolism and virtually abolishes lesion growth in experimental models of breast and colon carcinoma. Our results strongly suggest that energy metabolism is a promising target to reduce cancer progression. PMID:26794854

  8. The bright knots at the tops of soft X-ray flare loops: Quantitative results from Yohkoh

    NASA Technical Reports Server (NTRS)

    Doschek, G. A.; Strong, K. T.; Tsuneta, S.

    1995-01-01

    Soft X-ray Telescope (SXT) observations from the Japanese Yohkoh spacecraft have shown that confined bright regions are common features at the tops of flare loops throughout most of the duration of the flares. In this paper we present quantitative results for these flare knots, in relation to other flare regions, for four relatively 'simple' flares. Emission measure distributions, electron temperatures, and electron densities are derived from SXT and Yohkoh Bragg Crystal Spectrometer (BCS) observations. The four flares selected are dominated by what appear to be single-loop structures, with bright knots at the loop tops. The flares are neither long-duration nor impulsive events. The spatial distributions of brightness and emission measure in the flares are found to be quite similar for all four events, even though there are significant differences in dynamical behavior between at least two of the events. Temperatures and densities calculated for these flares are consistent with previous results from many solar experiments. An investigation of intensity correlations between adjacent pixels at the tops of the loops suggests the existence of local disturbances in the magnetic loops that occur on spatial scales less than the radii of the loops.

  9. Model assessment of additional contamination of water bodies as a result of wildfires in the Chernobyl exclusion zone.

    PubMed

    Bondar, Yu I; Navumau, A D; Nikitin, A N; Brown, J; Dowdall, M

    2014-12-01

    Forest fires and wild fires are recognized as a possible cause of resuspension and redistribution of radioactive substances when occurring on lands contaminated with such materials, and as such are a matter of concern within the regions of Belarus and the Ukraine which were contaminated by the Chernobyl accident in 1986. Modelling the effects of such fires on radioactive contaminants is a complex matter given the number of variables involved. In this paper, a probabilistic model was developed using empirical data drawn from the Polessie State Radiation-Ecological Reserve (PSRER), Belarus, and the Maximum Entropy Method. Using the model, it was possible to derive estimates of the contribution of fire events to overall variability in the levels of (137)Cs and (239,240)Pu in ground air as well as estimates of the deposition of these radionuclides to specific water bodies within the contaminated areas of Belarus. Results indicate that fire events are potentially significant redistributors of radioactive contaminants within the study area and may result in additional contamination being introduced to water bodies. PMID:25240987

  10. Can homeopathy bring additional benefits to thalassemic patients on hydroxyurea therapy? Encouraging results of a preliminary study.

    PubMed

    Banerjee, Antara; Chakrabarty, Sudipa Basu; Karmakar, Susanta Roy; Chakrabarty, Amit; Biswas, Surjyo Jyoti; Haque, Saiful; Das, Debarsi; Paul, Saili; Mandal, Biswapati; Naoual, Boujedaini; Belon, Philippe; Khuda-Bukhsh, Anisur Rahman

    2010-03-01

    Several homeopathic remedies, namely, Pulsatilla Nigricans (30th potency), Ceanothus Americanus (both mother tincture and 6th potency) and Ferrum Metallicum (30th potency), selected as per similia principles, were administered to 38 thalassemic patients receiving Hydroxyurea (HU) therapy for a varying period of time. Levels of serum ferritin (SF), fetal hemoglobin (HbF), hemoglobin (Hb), platelet count (PC), mean corpuscular volume (MCV), mean corpuscular hemoglobin concentration (MCHC), mean corpuscular hemoglobin (MCH), white blood cell (WBC) count, bilirubin content, alanine amino transferase (ALT), aspartate amino transferase (AST) and serum total protein content of patients were determined before and 3 months after administration of the homeopathic remedies in combination with HU, to evaluate any additional benefits derived from the homeopathic remedies, by comparing the data with those of 38 subjects receiving only HU therapy. Preliminary results indicated that there was a significant decrease in SF and an increase in HbF levels in the subjects receiving the combined treatment. Although the changes in other parameters were not as significant, there was a significant decrease in spleen size in most patients with splenomegaly, an improvement in general health conditions, and an increased gap between transfusions in most patients receiving the combined homeopathic treatment. The homeopathic remedies, being inexpensive and without any known side-effects, seem to have great potential to bring additional benefits to thalassemic patients, particularly in the developing world where blood transfusions suffer from inadequate screening and fall short of the stringent safety standards followed in the developed countries. Further independent studies are encouraged. PMID:18955271

  11. Hemostatic assessment, treatment strategies, and hematology consultation in massive postpartum hemorrhage: results of a quantitative survey of obstetrician-gynecologists

    PubMed Central

    James, Andra H; Cooper, David L; Paidas, Michael J

    2015-01-01

    Objective To assess potential diagnostic and practice barriers to successful management of massive postpartum hemorrhage (PPH), emphasizing recognition and management of contributing coagulation disorders. Study design A quantitative survey was conducted to assess practice patterns of US obstetrician-gynecologists in managing massive PPH, including assessment of coagulation. Results Nearly all (98%) of the 50 obstetrician-gynecologists participating in the survey reported having encountered at least one patient with “massive” PPH in the past 5 years. Approximately half (52%) reported having previously discovered an underlying bleeding disorder in a patient with PPH, with disseminated intravascular coagulation (88%, n=23/26) being identified more often than von Willebrand disease (73%, n=19/26). All reported having used methylergonovine and packed red blood cells in managing massive PPH, while 90% reported performing a hysterectomy. A drop in blood pressure and ongoing visible bleeding were the most commonly accepted indications for rechecking a “stat” complete blood count and coagulation studies, respectively, in patients with PPH; however, 4% of respondents reported that they would not routinely order coagulation studies. Forty-two percent reported having never consulted a hematologist for massive PPH. Conclusion The survey findings highlight potential areas for improved practice in managing massive PPH, including earlier and more consistent assessment, monitoring of coagulation studies, and consultation with a hematologist. PMID:26604829

  12. Quantitative trait loci with additive effect on palatability and fatty acid composition of meat in a Wagyu-Limousin F2 population.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A whole genome scan was conducted on 328 F2 progeny in a Wagyu x Limousin cross to identify quantitative trait loci (QTL) affecting palatability and fatty acid composition of beef. We have identified seven QTL on four chromosomes involved in lipid metabolism and tenderness. These genomic regions are...

  13. Accuracy and Precision in the Southern Hemisphere Additional Ozonesondes (SHADOZ) Dataset in Light of the JOSIE-2000 Results

    NASA Technical Reports Server (NTRS)

    Witte, Jacquelyn C.; Thompson, Anne M.; Schmidlin, F. J.; Oltmans, S. J.; Smit, H. G. J.

    2004-01-01

    Since 1998 the Southern Hemisphere ADditional OZonesondes (SHADOZ) project has provided over 2000 ozone profiles over eleven southern hemisphere tropical and subtropical stations. Balloon-borne electrochemical concentration cell (ECC) ozonesondes are used to measure ozone. The data are archived at <http://croc.gsfc.nasa.gov/shadoz>. In an analysis of ozonesonde imprecision within the SHADOZ dataset [Thompson et al., JGR, 108, 8238, 2003], we pointed out that variations in ozonesonde technique (sensor solution strength, instrument manufacturer, data processing) could lead to station-to-station biases within the SHADOZ dataset. Imprecision and accuracy in the SHADOZ dataset are examined here in light of new data. First, SHADOZ total ozone column amounts are compared to version 8 TOMS (2004 release). As for TOMS version 7, satellite total ozone is usually higher than the integrated column amount from the sounding. Discrepancies between the sonde and satellite datasets decline two percentage points on average, compared to version 7 TOMS offsets. Second, the SHADOZ station data are compared to results of chamber simulations (JOSIE-2000, the Juelich Ozonesonde Intercomparison Experiment) in which the various SHADOZ techniques were evaluated. The range of JOSIE column deviations from a standard instrument (-10%) in the chamber resembles that of the SHADOZ station data. It appears that some systematic variations in the SHADOZ ozone record are accounted for by differences in solution strength, data processing and instrument type (manufacturer).

  14. A gene-free formulation of classical quantitative genetics used to examine results and interpretations under three standard assumptions.

    PubMed

    Taylor, Peter J

    2012-12-01

    Quantitative genetics (QG) analyses variation in traits of humans, other animals, or plants in ways that take account of the genealogical relatedness of the individuals whose traits are observed. "Classical" QG, where the analysis of variation does not involve data on measurable genetic or environmental entities or factors, is reformulated in this article using models that are free of hypothetical, idealized versions of such factors, while still allowing for defined degrees of relatedness among kinds of individuals or "varieties." The gene-free formulation encompasses situations encountered in human QG as well as in agricultural QG. This formulation is used to describe three standard assumptions involved in classical QG and provide plausible alternatives. Several concerns about the partitioning of trait variation into components and its interpretation, most of which have a long history of debate, are discussed in light of the gene-free formulation and alternative assumptions. That discussion is at a theoretical level, not dependent on empirical data in any particular situation. Additional lines of work to put the gene-free formulation and alternative assumptions into practice and to assess their empirical consequences are noted, but lie beyond the scope of this article. The three standard QG assumptions examined are: (1) partitioning of trait variation into components requires models of hypothetical, idealized genes with simple Mendelian inheritance and direct contributions to the trait; (2) all other things being equal, similarity in traits for relatives is proportional to the fraction shared by the relatives of all the genes that vary in the population (e.g., fraternal or dizygotic twins share half of the variable genes that identical or monozygotic twins share); (3) in analyses of human data, genotype-environment interaction variance (in the classical QG sense) can be discounted. The concerns about the partitioning of trait variation discussed include: the
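
    For reference, the "partitioning of trait variation into components" discussed above is conventionally written as follows (standard classical quantitative-genetics notation, offered only as background; it is not the article's gene-free formulation):

        \[
          V_P = V_A + V_D + V_I + V_E + V_{G\times E}, \qquad h^2 = \frac{V_A}{V_P},
        \]

    where V_P is total phenotypic variance, V_A additive genetic variance, V_D dominance variance, V_I epistatic (interaction) variance, V_E environmental variance, V_{G×E} genotype-environment interaction variance, and h^2 the narrow-sense heritability.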

  15. Contributions of 18 Additional DNA Sequence Variations in the Gene Encoding Apolipoprotein E to Explaining Variation in Quantitative Measures of Lipid Metabolism

    PubMed Central

    Stengård, Jari H.; Clark, Andrew G.; Weiss, Kenneth M.; Kardia, Sharon; Nickerson, Deborah A.; Salomaa, Veikko; Ehnholm, Christian; Boerwinkle, Eric; Sing, Charles F.

    2002-01-01

    Apolipoprotein E (ApoE) is a major constituent of many lipoprotein particles. Previous genetic studies have focused on six genotypes defined by three alleles, denoted ε2, ε3, and ε4, encoded by two variable exonic sites that segregate in most populations. We have reported studies of the distribution of alleles of 20 biallelic variable sites in the gene encoding the ApoE molecule within and among samples, ascertained without regard to health, from each of three populations: African Americans from Jackson, Miss.; Europeans from North Karelia, Finland; and non-Hispanic European Americans from Rochester, Minn. Here we ask (1) how much variation in blood levels of ApoE (lnApoE), of total cholesterol (TC), of high-density lipoprotein cholesterol (HDL-C), and of triglyceride (lnTG) is statistically explained by variation among APOE genotypes defined by the ε2, ε3, and ε4 alleles; (2) how much additional variation in these traits is explained by genotypes defined by combining the two variable sites that define these three alleles with one or more additional variable sites; and (3) what are the locations and relative allele frequencies of the sites that define multisite genotypes that significantly improve the statistical explanation of variation beyond that provided by the genotypes defined by the ε2, ε3, and ε4 alleles, separately for each of the six gender-population strata. This study establishes that the use of only genotypes defined by the ε2, ε3, and ε4 alleles gives an incomplete picture of the contribution that the variation in the APOE gene makes to the statistical explanation of interindividual variation in blood measurements of lipid metabolism. The addition of variable sites to the genotype definition significantly improved the ability to explain variation in lnApoE and in TC and resulted in the explanation of variation in HDL-C and in lnTG. The combination of additional sites that explained the greatest amount of trait variation was different for

  16. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  17. Acquisition and Retention of Quantitative Communication Skills in an Undergraduate Biology Curriculum: Long-Term Retention Results

    ERIC Educational Resources Information Center

    Chevalier, Cary D.; Ashley, David C.; Rushin, John W.

    2010-01-01

    The purpose of this study was to assess some of the effects of a nontraditional, experimental learning approach designed to improve rapid acquisition and long-term retention of quantitative communication skills (QCS) such as descriptive and inferential statistics, hypothesis formulation, experimental design, data characteristics, and data…

  18. Prognosis of Slagging and Fouling Properties of Coals Based on Widely Available Data and Results of Additional Measurements

    NASA Astrophysics Data System (ADS)

    Alekhnovich, Alexander N.; Artemjeva, Natalja V.; Bogomolov, Vladimir V.; Shchelokov, Vyacheslav I.; Petukhov, Vasilij G.

    Coals of similar type to previously investigated coals can be ranked according to their slagging properties on the basis of available reference data. However, to define the slagging and fouling properties of an arbitrary coal, it is necessary to carry out additional laboratory research.

  19. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2-storage was operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations have concentrated on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to keep acquisition conditions as identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and also to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of CO2-storage-related amplitude variations in the depth range of the storage reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimate summed to 20.5 kt, which is approximately 7% less than what had been injected by then. For the second repeat survey, the mass estimate came to approximately 10-15% less than what had been injected. The deviations may be explained by several factors.
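
    As a rough consistency check (the cumulative mass injected by the time of the first repeat survey is not stated above, so it is inferred here from the reported 7% deficit):

        \[
          M_{\mathrm{injected}} \approx \frac{20.5\ \mathrm{kt}}{1 - 0.07} \approx 22\ \mathrm{kt}.
        \]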

  20. Simple additive effects are rare: a quantitative review of plant biomass and soil process responses to combined manipulations of CO2 and temperature.

    PubMed

    Dieleman, Wouter I J; Vicca, Sara; Dijkstra, Feike A; Hagedorn, Frank; Hovenden, Mark J; Larsen, Klaus S; Morgan, Jack A; Volder, Astrid; Beier, Claus; Dukes, Jeffrey S; King, John; Leuzinger, Sebastian; Linder, Sune; Luo, Yiqi; Oren, Ram; De Angelis, Paolo; Tingey, David; Hoosbeek, Marcel R; Janssens, Ivan A

    2012-09-01

    In recent years, increased awareness of the potential interactions between rising atmospheric CO2 concentrations ([ CO2 ]) and temperature has illustrated the importance of multifactorial ecosystem manipulation experiments for validating Earth System models. To address the urgent need for increased understanding of responses in multifactorial experiments, this article synthesizes how ecosystem productivity and soil processes respond to combined warming and [ CO2 ] manipulation, and compares it with those obtained in single factor [ CO2 ] and temperature manipulation experiments. Across all combined elevated [ CO2 ] and warming experiments, biomass production and soil respiration were typically enhanced. Responses to the combined treatment were more similar to those in the [ CO2 ]-only treatment than to those in the warming-only treatment. In contrast to warming-only experiments, both the combined and the [ CO2 ]-only treatments elicited larger stimulation of fine root biomass than of aboveground biomass, consistently stimulated soil respiration, and decreased foliar nitrogen (N) concentration. Nonetheless, mineral N availability declined less in the combined treatment than in the [ CO2 ]-only treatment, possibly due to the warming-induced acceleration of decomposition, implying that progressive nitrogen limitation (PNL) may not occur as commonly as anticipated from single factor [ CO2 ] treatment studies. Responses of total plant biomass, especially of aboveground biomass, revealed antagonistic interactions between elevated [ CO2 ] and warming, i.e. the response to the combined treatment was usually less-than-additive. This implies that productivity projections might be overestimated when models are parameterized based on single factor responses. Our results highlight the need for more (and especially more long-term) multifactor manipulation experiments. Because single factor CO2 responses often dominated over warming responses in the combined treatments, our

  1. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in order to guarantee its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in order to evaluate the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques to this aim. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. With the first technique it is possible to produce scaffolds with random, non-regular, rounded pore geometry. The AM technique instead is able to produce scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted and the resulting model can be used for the validation of the applied imaging and image analysis protocols. It is here reported a SR μ-CT image analysis approach that is able to effectively and accurately reveal the differences in the pore- and throat-size distributions as well as connectivity of both AM and SCPL scaffolds.
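
    The quantitative assessment described above starts from a segmented (binary) reconstruction of the μ-CT volume. The authors' full pore- and throat-size analysis is more involved, but as a minimal sketch under that assumption (not the pipeline used in the paper), porosity and the number of connected pore regions can be computed as follows:

        import numpy as np
        from scipy import ndimage

        def porosity_and_pore_count(binary_volume):
            # binary_volume: 3D boolean array, True where a voxel belongs to pore space
            pores = np.asarray(binary_volume, dtype=bool)
            porosity = pores.mean()              # void fraction of the scaffold
            _, n_regions = ndimage.label(pores)  # number of face-connected pore regions
            return porosity, n_regions

        # Hypothetical 3 x 3 x 3 segmented volume with a single cubic pore in one corner
        volume = np.zeros((3, 3, 3), dtype=bool)
        volume[:2, :2, :2] = True
        print(porosity_and_pore_count(volume))   # -> (approximately 0.296, 1)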

  2. Quantitative trait loci with additive effects on palatability and fatty acid composition of meat in a Wagyu-Limousin F2 population.

    PubMed

    Alexander, L J; Macneil, M D; Geary, T W; Snelling, W M; Rule, D C; Scanga, J A

    2007-10-01

    A whole-genome scan was conducted on 328 F(2) progeny in a Wagyu x Limousin cross to identify quantitative trait loci (QTL) affecting palatability and fatty acid composition of beef at an age-constant endpoint. We have identified seven QTL on five chromosomes involved in lipid metabolism and tenderness. None of the genes encoding major enzymes involved in fatty acid metabolism, such as fatty acid synthase (FASN), acetyl-CoA carboxylase alpha (ACACA), solute carrier family 2 (facilitated glucose transporter) member 4 (SLC2A4), stearoyl-CoA desaturase (SCD) and genes encoding the subunits of fatty acid elongase, was located in these QTL regions. The present study may lead to a better-tasting and healthier product for consumers through improved selection for palatability and lipid content of beef. PMID:17894565

  3. Changes in lipid composition of Escherichia coli resulting from growth with organic solvents and with food additives.

    PubMed Central

    Ingram, L O

    1977-01-01

    Cells of Escherichia coli contain an altered fatty acid and phospholipid composition when grown in the presence of sublethal concentrations of a variety of organic solvents and food additives. The diversity of compounds examined which caused these changes indicates that no single catabolic pathway is involved. Many of the observed changes are consistent with the hypothesis that cells adapt their membrane lipids to compensate for the presence of these compounds in the environment. Both sodium benzoate and calcium propionate caused the synthesis of unusual fatty acids. PMID:327934

  4. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  5. Additional results on palaeomagnetic stratigraphy of the Koobi Fora Formation, east of Lake Turkana (Lake Rudolf), Kenya

    USGS Publications Warehouse

    Hillhouse, J.W.; Ndombi, J.W.M.; Cox, A.; Brock, A.

    1977-01-01

    The magnetostratigraphy of the hominid-bearing sediments exposed east of Lake Turkana has been strengthened by new palaeomagnetic results. Ages obtained from several tuffs by the 40Ar/39Ar method suggest an approximate match between the observed magnetozones and the geomagnetic polarity time scale; however, the palaeomagnetic results are also compatible with a younger chronology suggested by conventional K-Ar dating of the KBS Tuff. © 1977 Nature Publishing Group.

  6. Quantitative Correlation of in Vivo Properties with in Vitro Assay Results: The in Vitro Binding of a Biotin–DNA Analogue Modifier with Streptavidin Predicts the in Vivo Avidin-Induced Clearability of the Analogue-Modified Antibody

    PubMed Central

    Dou, Shuping; Virostko, John; Greiner, Dale L.; Powers, Alvin C.; Liu, Guozheng

    2016-01-01

    Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ~95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in

  7. Quantitative Correlation of in Vivo Properties with in Vitro Assay Results: The in Vitro Binding of a Biotin-DNA Analogue Modifier with Streptavidin Predicts the in Vivo Avidin-Induced Clearability of the Analogue-Modified Antibody.

    PubMed

    Dou, Shuping; Virostko, John; Greiner, Dale L; Powers, Alvin C; Liu, Guozheng

    2015-08-01

    Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ∼95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in
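
    The clearability figure quoted in both records above can be sketched numerically. One plausible reading of the abstract (an assumption on our part; the exact normalization is not given here) is that clearability is the fractional reduction in blood MORF concentration caused by avidin:

        def clearability_percent(conc_with_avidin, conc_without_avidin):
            # Fractional reduction in blood MORF concentration attributable to avidin, in percent
            return 100.0 * (1.0 - conc_with_avidin / conc_without_avidin)

        # Hypothetical blood concentrations (arbitrary units) giving the ~95% clearability quoted above
        print(clearability_percent(0.5, 10.0))   # -> 95.0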

  8. Estimation of daily aluminum intake in Japan based on food consumption inspection results: impact of food additives

    PubMed Central

    Sato, Kyoko; Suzuki, Ippei; Kubota, Hiroki; Furusho, Noriko; Inoue, Tomoyuki; Yasukouchi, Yoshikazu; Akiyama, Hiroshi

    2014-01-01

    Dietary aluminum (Al) intake by young children, children, youths, and adults in Japan was estimated using the market basket method. The Al content of food category (I–VII) samples for each age group was determined by inductively coupled plasma-atomic emission spectrometry (ICP-AES). The Al content in processed foods and unprocessed foods ranged from 0.40 to 21.7 mg/kg and from 0.32 to 0.54 mg/kg, respectively. For processed foods in all age groups, the Al content in food category VI samples, sugar and confections/savories, was the highest, followed by those in category II, cereals. The daily dietary Al intake from processed foods was much larger than that from unprocessed foods. The mean weekly percentages of the provisional tolerable weekly intake (PTWI, established by the joint FAO/WHO Expert Committee on Food Additives in 2011) from processed foods for all age groups are 43.1, 22.4, 17.6 and 15.1%, respectively. Only the highest consumer Al exposure value (>P95) of the young children group exceeded the PTWI. PMID:25473496
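
    The PTWI comparison reported above follows directly from weekly intake and body weight. A minimal sketch is given below; the JECFA (2011) PTWI for aluminium is 2 mg/kg body weight per week, while the daily intake and body weight used in the example are hypothetical illustrative values, not figures from the study:

        def percent_of_ptwi(daily_intake_mg, body_weight_kg, ptwi_mg_per_kg_bw=2.0):
            # Weekly aluminium intake expressed as a percentage of the PTWI allowance
            weekly_intake_mg = daily_intake_mg * 7.0
            weekly_allowance_mg = ptwi_mg_per_kg_bw * body_weight_kg
            return 100.0 * weekly_intake_mg / weekly_allowance_mg

        # Hypothetical example: a 16 kg young child ingesting 2.0 mg of Al per day from processed foods
        print(round(percent_of_ptwi(2.0, 16.0), 1))   # -> 43.8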

  9. A re-examination of paleomagnetic results from NA Jurassic sedimentary rocks: Additional evidence for proposed Jurassic MUTO?

    NASA Astrophysics Data System (ADS)

    Housen, B. A.

    2015-12-01

    Kent and Irving (2010) and Kent et al. (2015) propose a monster shift in the position of Jurassic (160 to 145 Ma) paleopoles for North America, defined by results from igneous rocks. This monster shift is likely an unrecognized true polar wander occurrence. Although subject to inclination error, results from sedimentary rocks from North America, if corrected for these effects, can be used to supplement the available data for this time period. Steiner (2003) reported results from 48 stratigraphic horizons sampled from the Callovian Summerville Fm of NE New Mexico. A recalculated mean of these results yields a mean direction of D = 332°, I = 39°, n = 48, k = 15, α95 = 5.4°. These data were analyzed for possible inclination error; although the dataset is small, the E-I results yielded a corrected I = 53°. This yields a corrected paleopole for NA at ~165 Ma located at 67° N and 168° E. Paleomagnetic results from the Black Hills, from Kilanowski (2002) for the Callovian Hulett Mbr of the Sundance Fm and from Gregiore (2001) for the Oxfordian-Tithonian Morrison Fm, have previously been interpreted to represent Eocene-aged remagnetizations, due to the nearly exact coincidence of the in-situ pole positions of these Jurassic units with the Eocene pole for NA. Both of the tilt-corrected results for these units have high-latitude poles (Sundance Fm: 79° N, 146° E; Morrison Fm: 89° N, 165° E). An E-I analysis of these data will be presented; using a provisional inclination error of 10°, the corrected paleopoles are: Sundance Fm: 76° N, 220° E; Morrison Fm: 77° N, 266° E. The Black Hills 165 Ma (Sundance Fm) and 145 Ma (Morrison Fm) poles, provisionally corrected for 10° inclination error, fall fairly close to the NA APWP proposed by Kent et al. (2015) using an updated set of results from kimberlites; the agreement between the Sundance Fm and the Triple-B (158 Ma) pole would be nearly exact with a slightly smaller inclination error. The Summerville Fm- which is

  10. Influence of a Dopamine Pathway Additive Genetic Efficacy Score on Smoking Cessation: Results from Two Randomized Clinical Trials of Bupropion

    PubMed Central

    David, Sean P.; Strong, David R.; Leventhal, Adam M.; Lancaster, Molly A.; McGeary, John E.; Munafò, Marcus R.; Bergen, Andrew W.; Swan, Gary E.; Benowitz, Neal L.; Tyndale, Rachel F.; Conti, David V.; Brown, Richard A.; Lerman, Caryn; Niaura, Raymond

    2013-01-01

    Aims To evaluate associations of treatment and an ‘additive genetic efficacy score’ (AGES) based on dopamine functional polymorphisms with time to first smoking lapse and point prevalence abstinence at end of treatment among participants enrolled in two randomized clinical trials of smoking cessation therapies. Design Double-blind pharmacogenetic efficacy trials randomizing participants to active or placebo bupropion. Study 1 also randomized participants to cognitive-behavioral smoking cessation treatment (CBT) or this treatment with CBT for depression. Study 2 provided standardized behavioural support. Setting Two Hospital-affiliated clinics (Study 1), and two University-affiliated clinics (Study 2). Participants N=792 self-identified white treatment-seeking smokers aged ≥18 years smoking ≥10 cigarettes per day over the last year. Measurements Age, gender, Fagerström Test for Nicotine Dependence, dopamine pathway genotypes (rs1800497 [ANKK1 E713K], rs4680 [COMT V158M], DRD4 exon 3 Variable Number of Tandem Repeats polymorphism [DRD4 VNTR], SLC6A3 3' VNTR) analyzed both separately and as part of an AGES, time to first lapse, and point prevalence abstinence at end of treatment. Findings Significant associations of the AGES (hazard ratio = 1.10, 95% Confidence Interval [CI] = 1.06–1.14], p=0.0099) and of the DRD4 VNTR (HR = 1.29, 95%CI 1.17–1.41, p=0.0073) were observed with time to first lapse. A significant AGES by pharmacotherapy interaction was observed (β [SE]=−0.18 [0.07], p=0.016), such that AGES predicted risk for time to first lapse only for individuals randomized to placebo. Conclusions A score based on functional polymorphisms relating to dopamine pathways appears to predict lapse to smoking following a quit attempt, and the association is mitigated in smokers using bupropion. PMID:23941313
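
    The abstract does not spell out how the additive genetic efficacy score (AGES) was constructed; additive scores of this kind are, however, typically built by summing (optionally weighted) risk-allele counts across the genotyped loci. A minimal sketch under that assumption (the allele counts below are hypothetical):

        def additive_genetic_score(risk_allele_counts, weights=None):
            # Sum of risk-allele counts (0, 1 or 2 per locus), optionally weighted per locus
            if weights is None:
                weights = [1.0] * len(risk_allele_counts)
            return sum(w * c for w, c in zip(weights, risk_allele_counts))

        # Hypothetical participant carrying 1, 2, 0 and 1 risk alleles at four dopamine-pathway loci
        print(additive_genetic_score([1, 2, 0, 1]))   # -> 4.0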

  11. Rheological behavior of FM-9 solutions and correlation with flammability test results and interpretations. [fuel thickening additive

    NASA Technical Reports Server (NTRS)

    Peng, S. T. J.; Landel, R. F.

    1983-01-01

    The rheological behavior of progressively shear thickening FM-9 solutions, a time-dependent shear thickening material with characteristics of threshold behavior, is investigated as part of a study of the rheological properties of antimisting jet fuel. Flammability test results and test configurations from various sources are evaluated. A correlation is obtained between the rheological behavior and the flammability tests such that, for a given system, such as a fixed solvent system and the FM-9 polymer system, the flammability criterion can be applied to a wide range of concentrations and temperatures.

  12. Additive reductions in zebrafish PRPS1 activity result in a spectrum of deficiencies modeling several human PRPS1-associated diseases

    PubMed Central

    Pei, Wuhong; Xu, Lisha; Varshney, Gaurav K.; Carrington, Blake; Bishop, Kevin; Jones, MaryPat; Huang, Sunny C.; Idol, Jennifer; Pretorius, Pamela R.; Beirl, Alisha; Schimmenti, Lisa A.; Kindt, Katie S.; Sood, Raman; Burgess, Shawn M.

    2016-01-01

    Phosphoribosyl pyrophosphate synthetase-1 (PRPS1) is a key enzyme in nucleotide biosynthesis, and mutations in PRPS1 are found in several human diseases including nonsyndromic sensorineural deafness, Charcot-Marie-Tooth disease-5, and Arts Syndrome. We utilized zebrafish as a model to confirm that mutations in PRPS1 result in phenotypic deficiencies in zebrafish similar to those in the associated human diseases. We found two paralogs in zebrafish, prps1a and prps1b and characterized each paralogous mutant individually as well as the double mutant fish. Zebrafish prps1a mutants and prps1a;prps1b double mutants showed similar morphological phenotypes with increasingly severe phenotypes as the number of mutant alleles increased. Phenotypes included smaller eyes and reduced hair cell numbers, consistent with the optic atrophy and hearing impairment observed in human patients. The double mutant also showed abnormal development of primary motor neurons, hair cell innervation, and reduced leukocytes, consistent with the neuropathy and recurrent infection of the human patients possessing the most severe reductions of PRPS1 activity. Further analyses indicated the phenotypes were associated with a prolonged cell cycle likely resulting from reduced nucleotide synthesis and energy production in the mutant embryos. We further demonstrated the phenotypes were caused by delays in the tissues most highly expressing the prps1 genes. PMID:27425195

  13. Additive reductions in zebrafish PRPS1 activity result in a spectrum of deficiencies modeling several human PRPS1-associated diseases.

    PubMed

    Pei, Wuhong; Xu, Lisha; Varshney, Gaurav K; Carrington, Blake; Bishop, Kevin; Jones, MaryPat; Huang, Sunny C; Idol, Jennifer; Pretorius, Pamela R; Beirl, Alisha; Schimmenti, Lisa A; Kindt, Katie S; Sood, Raman; Burgess, Shawn M

    2016-01-01

    Phosphoribosyl pyrophosphate synthetase-1 (PRPS1) is a key enzyme in nucleotide biosynthesis, and mutations in PRPS1 are found in several human diseases including nonsyndromic sensorineural deafness, Charcot-Marie-Tooth disease-5, and Arts Syndrome. We utilized zebrafish as a model to confirm that mutations in PRPS1 result in phenotypic deficiencies in zebrafish similar to those in the associated human diseases. We found two paralogs in zebrafish, prps1a and prps1b and characterized each paralogous mutant individually as well as the double mutant fish. Zebrafish prps1a mutants and prps1a;prps1b double mutants showed similar morphological phenotypes with increasingly severe phenotypes as the number of mutant alleles increased. Phenotypes included smaller eyes and reduced hair cell numbers, consistent with the optic atrophy and hearing impairment observed in human patients. The double mutant also showed abnormal development of primary motor neurons, hair cell innervation, and reduced leukocytes, consistent with the neuropathy and recurrent infection of the human patients possessing the most severe reductions of PRPS1 activity. Further analyses indicated the phenotypes were associated with a prolonged cell cycle likely resulting from reduced nucleotide synthesis and energy production in the mutant embryos. We further demonstrated the phenotypes were caused by delays in the tissues most highly expressing the prps1 genes. PMID:27425195

  14. Zoledronate prevents lactation induced bone loss and results in additional post-lactation bone mass in mice.

    PubMed

    Wendelboe, Mette Høegh; Thomsen, Jesper Skovhus; Henriksen, Kim; Vegger, Jens Bay; Brüel, Annemarie

    2016-06-01

    In rodents, lactation is associated with a considerable and very rapid bone loss, which almost completely recovers after weaning. The aim of the present study was to investigate whether the bisphosphonate Zoledronate (Zln) can inhibit lactation induced bone loss, and if Zln interferes with recovery of bone mass after lactation has ceased. Seventy-six 10-week-old NMRI mice were divided into the following groups: Baseline, Pregnant, Lactation, Lactation+Zln, Recovery, Recovery+Zln, and Virgin Control (age-matched). The lactation period was 12 days, then the pups were removed, and thereafter recovery took place for 28 days. Zln, 100 μg/kg, was given s.c. on the day of delivery, and again 4 and 8 days later. Mechanical testing, μCT, and dynamic histomorphometry were performed. At L4, lactation resulted in a substantial loss of bone strength (-55% vs. Pregnant, p<0.01), BV/TV (-40% vs. Pregnant, p<0.01), and trabecular thickness (Tb.Th) (-29% vs. Pregnant, p<0.001). Treatment with Zln completely prevented lactation induced loss of bone strength, BV/TV, and Tb.Th at L4. Full recovery of micro-architectural and mechanical properties was found 28 days after weaning in vehicle-treated mice. Interestingly, the recovery group treated with Zln during the lactation period had higher BV/TV (+45%, p<0.01) and Tb.Th (+16%, p<0.05) compared with virgin controls. Similar results were found at the proximal tibia and femur. This indicates that Zln did not interfere with the bone formation taking place after weaning. On this background, we conclude that post-lactation bone formation is not dependent on a preceding lactation induced bone loss. PMID:27021151

  15. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    EPA Science Inventory

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  16. Non starch polysaccharide hydrolyzing enzymes as feed additives: detection of enzyme activities and problems encountered with quantitative determination in complex samples.

    PubMed

    Vahjen, W; Gläser, K; Froeck, M; Simon, O

    1997-01-01

    Chromogenic substrates, an agar diffusion assay and viscosity reduction were used to estimate beta-glucanase and xylanase activities in water-soluble extracts of different feedstuffs and digesta supernatants. The dinitrosalicylic acid reducing sugar method was employed to calibrate results from different methods based on international units (IU, glucose equivalents). The detection of dye release from chromogenic substrates was a suitable method, allowing the detection of 0.05 IU of enzyme activity per ml of extract, although measurements in digesta supernatants were limited in linearity (0.1-0.5 IU/ml supernatant). With the agar diffusion assay the detection of enzyme activity was possible over a wider concentration range (extracts: 0.05-1 IU/ml, digesta supernatants: 0.1-1 IU/ml), but visual evaluation led to inaccurate measurement. Accuracy can be improved by computer-based evaluation of digital images. The use of viscosity reduction produced linear standard curves from 0.01 to 0.5 IU/ml in feed extracts, but reliability of measurements depended on modification of substrates. Quantification of enzyme activities was influenced by matrix effects of complex samples. Cereal-dependent differences were found in various extracts of feed mixtures and cereal extracts. Digesta supernatants partly inhibited enzyme activity, depending on the origin of the sample. Interaction of substrates with digesta components varied between methods. The sensitivity of the methods is comparable; however, all methods require specific calibrations to account for matrix- and enzyme-specific effects. PMID:9345597

  17. Flue gas conditioning for improved particle collection in electrostatic precipitators. Second topical report, Results of bench-scale screening of additives

    SciTech Connect

    Durham, M.D.

    1993-08-13

    ADA Technologies, Inc. (ADA) has completed the bench-scale testing phase of a program to evaluate additives that will improve the collection of fine particles in electrostatic precipitators (ESPs). A bench-scale ESP was installed at the Consolidation Coal Company (CONSOL) combustion research and development facility in Library, PA, in order to conduct the evaluation. During a two-week test, four candidate additives were injected into the flue gas ahead of a 100 acfm ESP to determine the effect on fly ash collectability. Two additives were found to reduce the emissions from the ESP. Additives "C" and "D" performed better than initially anticipated, reducing emissions by 17%. Emissions were reduced by 27% after the ESP was modified by the installation of baffles to minimize sneakage. In addition to the measured improvements in performance, no detrimental effects (i.e., electrode fouling) were observed in the operation of the ESP during the testing. The measures of success identified for the bench-scale phase of the program have been surpassed. Since the additives will affect only non-rapping reentrainment particle losses, it is expected that an even greater improvement in particle collection will be observed in larger-scale ESPs. Therefore, positive results are anticipated during the pilot-scale phase of the program and during a future full-scale demonstration test. A preliminary economic analysis was performed to evaluate the cost of the additive process and to compare its costs against alternative means for reducing emissions from ESPs. The results show that conditioning with additive C at a rate of 0.05% (wt. additive to wt. fly ash) is much less expensive than adding new ESP capacity, and more cost-competitive than existing chemical conditioning processes. Preliminary chemical analysis of conditioned fly ash shows that it passes the Toxicity Characteristic Leaching Procedure criteria.

  18. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

    Introduction Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely operator’s (acquisition) impact on the results obtained from image analysis and processing, has been shown on a few examples. Material and method The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200'000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. Results For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient’s back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects – error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18

  19. Longitudinal, intermodality registration of quantitative breast PET and MRI data acquired before and during neoadjuvant chemotherapy: Preliminary results

    SciTech Connect

    Atuegwu, Nkiruka C.; Williams, Jason M.; Li, Xia; Arlinghaus, Lori R.; Abramson, Richard G.; Chakravarthy, A. Bapsi; Abramson, Vandana G.; Yankeelov, Thomas E.

    2014-05-15

    Purpose: The authors propose a method whereby serially acquired DCE-MRI, DW-MRI, and FDG-PET breast data sets can be spatially and temporally coregistered to enable the comparison of changes in parameter maps at the voxel level. Methods: First, the authors aligned the PET and MR images at each time point rigidly and nonrigidly. To register the MR images longitudinally, the authors extended a nonrigid registration algorithm by including a tumor volume-preserving constraint in the cost function. After the PET images were aligned to the MR images at each time point, the authors then used the transformation obtained from the longitudinal registration of the MRI volumes to register the PET images longitudinally. The authors tested this approach on ten breast cancer patients by calculating a modified Dice similarity of tumor size between the PET and MR images as well as the bending energy and changes in the tumor volume after the application of the registration algorithm. Results: The median of the modified Dice in the registered PET and DCE-MRI data was 0.92. For the longitudinal registration, the median tumor volume change was −0.03% for the constrained algorithm, compared to −32.16% for the unconstrained registration algorithms (p = 8 × 10⁻⁶). The medians of the bending energy were 0.0092 and 0.0001 for the unconstrained and constrained algorithms, respectively (p = 2.84 × 10⁻⁷). Conclusions: The results indicate that the proposed method can accurately spatially align DCE-MRI, DW-MRI, and FDG-PET breast images acquired at different time points during therapy while preventing the tumor from being substantially distorted or compressed.
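
    The tumor-overlap metric used above is a modified Dice similarity; the modification itself is not described in the abstract, but the standard Dice coefficient between two binary tumor masks, on which it builds, is straightforward to compute (a generic sketch, not the authors' implementation):

        import numpy as np

        def dice_coefficient(mask_a, mask_b):
            # Standard Dice similarity between two binary segmentation masks (1.0 = identical)
            a = np.asarray(mask_a, dtype=bool)
            b = np.asarray(mask_b, dtype=bool)
            total = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / total if total > 0 else 1.0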

  20. Messages that increase women’s intentions to abstain from alcohol during pregnancy: results from quantitative testing of advertising concepts

    PubMed Central

    2014-01-01

    Background Public awareness-raising campaigns targeting alcohol use during pregnancy are an important part of preventing prenatal alcohol exposure and Fetal Alcohol Spectrum Disorder. Despite this, there is little evidence on what specific elements contribute to campaign message effectiveness. This research evaluated three different advertising concepts addressing alcohol and pregnancy: a threat appeal, a positive appeal promoting a self-efficacy message, and a concept that combined the two appeals. The primary aim was to determine the effectiveness of these concepts in increasing women’s intentions to abstain from alcohol during pregnancy. Methods Women of childbearing age and pregnant women residing in Perth, Western Australia participated in a computer-based questionnaire where they viewed either a control or one of the three experimental concepts. Following exposure, participants’ intentions to abstain from and reduce alcohol intake during pregnancy were measured. Other measures assessed included perceived main message, message diagnostics, and potential to promote defensive responses or unintended consequences. Results The concepts containing a threat appeal were significantly more effective at increasing women’s intentions to abstain from alcohol during pregnancy than the self-efficacy message and the control. The concept that combined threat and self-efficacy is recommended for development as part of a mass-media campaign as it has good persuasive potential, provides a balance of positive and negative emotional responses, and is unlikely to result in defensive or unintended consequences. Conclusions This study provides important insights into the components that enhance the persuasiveness and effectiveness of messages aimed at preventing prenatal alcohol exposure. The recommended concept has good potential for use in a future campaign aimed at promoting women’s intentions to abstain from alcohol during pregnancy. PMID:24410764

  1. SU-C-210-06: Quantitative Evaluation of Dosimetric Effects Resulting From Positional Variations of Pancreatic Tumor Volumes

    SciTech Connect

    Yu, S; Sehgal, V; Wei, R; Lawrenson, L; Kuo, J; Hanna, N; Ramsinghani, N; Daroui, P; Al-Ghazi, M

    2015-06-15

    Purpose: The aim of this study is to quantify the dosimetric effects resulting from variation in pancreatic tumor position as assessed by bony anatomy versus implanted fiducial markers. Methods: Twelve pancreatic cancer patients were retrospectively analyzed for this study. All patients received volumetric modulated arc therapy (VMAT) to the intact pancreas using fiducial-based image-guided radiation therapy (IGRT). Using daily orthogonal kV and/or cone-beam CT images, the shifts needed to co-register the daily pre-treatment images to the reference CT from fiducial to bone (Fid-Bone) were recorded in the left-right (LR), anterior-posterior (AP), and superior-inferior (SI) directions. The original VMAT plan isocenter was shifted based on the kV bone-matching positions at 5 evenly spaced fractions. Dose coverage of the planning target volumes (PTVs) (V100%) and mean dose to the liver, kidneys, and stomach/duodenum were assessed in the modified plans. Results: A total of 306 fractions were analyzed. The absolute fiducial-bone positional shifts were greatest in the SI direction (AP = 2.7 ± 3.0, LR = 2.8 ± 2.8, and SI = 6.3 ± 7.9 mm; mean ± SD). The V100% was significantly reduced, by 13.5% (Fid-Bone = 95.3 ± 2.0 vs. 82.3 ± 11.8%, p = 0.02). This varied widely among patients (Fid-Bone V100% range = 2–60%), and 33% of patients had a reduction in V100% of more than 10%. The impact on OARs was greatest for the liver (Fid-Bone = 14.6 vs. 16.1 Gy, 10%) and stomach (Fid-Bone = 23.9 vs. 25.5 Gy, 7%), but was not statistically significant (p = 0.10 for both). Conclusion: Compared with matching by fiducial markers, matching by bony anatomy would have substantially reduced PTV coverage, by 13.5%. This reinforces the importance of online position verification based on fiducial markers. Hence, implantation of fiducial markers is strongly recommended for pancreatic cancer patients undergoing intensity-modulated radiation therapy.
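
    As a rough illustration of the kind of per-axis shift summary and coverage-loss tally described above, the following sketch uses made-up numbers rather than the study data; the array names and values are hypothetical.

      import numpy as np

      # hypothetical fiducial-to-bone shifts (mm) recorded per fraction: columns LR, AP, SI
      shifts = np.array([[2.1, 3.0, 5.5],
                         [0.8, 1.9, 8.2],
                         [3.5, 2.2, 1.4]])
      means, sds = shifts.mean(axis=0), shifts.std(axis=0, ddof=1)
      for axis, m, s in zip(["LR", "AP", "SI"], means, sds):
          print(f"{axis}: {m:.1f} +/- {s:.1f} mm")

      # hypothetical per-patient PTV V100% for fiducial- vs. bone-based setup
      v100_fid = np.array([95.0, 96.2, 94.8, 95.5])
      v100_bone = np.array([93.1, 80.4, 60.2, 94.9])
      frac_large_loss = np.mean((v100_fid - v100_bone) > 10.0)
      print(f"Patients with >10% loss in V100%: {frac_large_loss:.0%}")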

  2. An approach for relating the results of quantitative nondestructive evaluation to intrinsic properties of high-performance materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1990-01-01

    One of the most difficult problems the manufacturing community has faced in recent years has been to accurately assess the physical state of anisotropic high-performance materials by nondestructive means. In order to advance the design of ultrasonic nondestructive testing systems, a more fundamental understanding of how ultrasonic waves travel and interact within an anisotropic material is needed. The relationship between the ultrasonic and engineering parameters needs to be explored to understand their mutual dependence. One common denominator is provided by the elastic constants. The preparation of specific graphite/epoxy samples to be used in the experimental investigation of the anisotropic properties (through the measurement of the elastic stiffness constants) is discussed. Accurate measurements of these constants will depend upon knowledge of refraction effects as well as the direction of group velocity propagation. The continuing effort to develop improved visualization techniques for physical parameters is discussed, and group velocity images are presented and discussed. In order to fully understand the relationship between the ultrasonic and the common engineering parameters, the physical interpretation of the linear elastic coefficients (the quantities that relate applied stresses to resulting strains) is discussed. This discussion builds a more intuitive understanding of how the ultrasonic parameters are related to the traditional engineering parameters.
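
    As one concrete link between the elastic stiffness constants and the traditional engineering parameters, the sketch below inverts an illustrative stiffness matrix (Voigt notation; the values are chosen to be plausible for a unidirectional fiber composite but are not measured data) to obtain the compliance matrix and the usual engineering constants.

      import numpy as np

      # Illustrative (not measured) stiffness constants in GPa, Voigt notation,
      # for a transversely isotropic composite with fibers along axis 1.
      C = np.array([[143.0,   6.0,   6.0, 0.0, 0.0, 0.0],
                    [  6.0,  13.0,   7.0, 0.0, 0.0, 0.0],
                    [  6.0,   7.0,  13.0, 0.0, 0.0, 0.0],
                    [  0.0,   0.0,   0.0, 3.0, 0.0, 0.0],
                    [  0.0,   0.0,   0.0, 0.0, 6.0, 0.0],
                    [  0.0,   0.0,   0.0, 0.0, 0.0, 6.0]])
      S = np.linalg.inv(C)                     # compliance matrix
      E1, E2 = 1.0 / S[0, 0], 1.0 / S[1, 1]    # Young's moduli (GPa)
      G12 = 1.0 / S[5, 5]                      # in-plane shear modulus (GPa)
      nu12 = -S[1, 0] / S[0, 0]                # major Poisson's ratio
      print(f"E1 = {E1:.1f} GPa, E2 = {E2:.1f} GPa, G12 = {G12:.1f} GPa, nu12 = {nu12:.2f}")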

  3. Attitudes towards the sharing of genetic information with at-risk relatives: results of a quantitative survey.

    PubMed

    Heaton, Timothy J; Chico, Victoria

    2016-01-01

    To investigate public attitudes towards receiving genetic information arising from a test on a relative, 955 University of Sheffield students and staff were surveyed using disease vignettes. Strength of attitude was measured on three questions: whether, in the event of relevant information being discovered, they, as the at-risk relative, would want to be informed; whether the at-risk relative's interest should override proband confidentiality; and, had they been the proband, their willingness to give up confidentiality to inform such relatives. Results indicated considerably more complexity to the decision-making than simple statistical risk. Desire for information increased only slightly with risk of disease manifestation [log odds 0.05 (0.04, 0.06) per percentage point increase in manifestation risk]. Condition preventability was the primary factor increasing desire [modifiable baseline, non-preventable log odds -1.74 (-2.04, -1.44); preventable 0.64 (0.34, 0.95)]. Disease seriousness also increased desire [serious baseline, non-serious log odds -0.89 (-1.19, -0.59); fatal 0.55 (0.25, 0.86)]. Individuals with lower education levels exhibited a much greater desire to be informed [GCSE log odds 1.67 (0.64, 2.66)]. Age did not affect desire. Our findings suggest that attitudes were influenced more by disease characteristics than by statistical risk. Respondents generally expressed strong attitudes, demonstrating that this was not an issue about which people felt ambivalent. We provide estimates of the proportions of the British population in favour of or against disclosure for various disease scenarios. PMID:26612611
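
    For readers less familiar with the log-odds scale, the short sketch below shows how coefficients like those reported translate into odds ratios and predicted probabilities; the baseline log-odds value is a hypothetical placeholder, not an estimate taken from the paper.

      import math

      def prob_from_log_odds(log_odds):
          """Convert log-odds to a probability via the logistic function."""
          return 1.0 / (1.0 + math.exp(-log_odds))

      baseline = 1.0          # hypothetical baseline log-odds of wanting to be informed
      per_point_risk = 0.05   # reported slope per percentage point of manifestation risk
      preventable = 0.64      # reported shift for a preventable condition

      print("odds ratio per 10-point risk increase:", round(math.exp(10 * per_point_risk), 2))
      print("P(informed), baseline:", round(prob_from_log_odds(baseline), 2))
      print("P(informed), preventable condition:", round(prob_from_log_odds(baseline + preventable), 2))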

  4. Preliminary results of a quantitative comparison of the spectral signatures of Landsat Thematic Mapper (TM) and Modular Optoelectronic Multispectral Scanner (MOMS).

    NASA Technical Reports Server (NTRS)

    Bodechtel, J.; Zilger, J.; Salomonson, V. V.

    1985-01-01

    Operationally acquired Thematic Mapper and experimental MOMS-01 data are evaluated quantitatively with respect to the systems' spectral response and their performance for geoscientific applications. Results show the two instruments to be similar in the spectral bands compared. Although the MOMS scanner has a smaller IFOV, it has lower modulation transfer function performance for small, low-contrast features than the Thematic Mapper. This deficiency does not occur only when MOMS is switched to the low-gain mode; it is attributed to the detector arrays used (ITEK CCPD 1728).

  5. Influence of binder properties, method of addition, powder type and operating conditions on fluid-bed melt granulation and resulting tablet properties.

    PubMed

    Abberger, T

    2001-12-01

    The aim of the study was to investigate melt granulation in a laboratory-scale fluid-bed granulator with respect to granule growth, granule properties, and the resulting tablet properties. The parameters investigated were the method of PEG addition (spray-on or addition as flakes), binder concentration, PEG type (3000, 4000 and 6000, sprayed on), flake size (PEG 4000, added as flakes of three different sizes), powder type (two lactose types of different particle size and corn starch), and operating conditions (volumetric air flow and heating temperature). Addition of the binder as flakes led to layering as the growth mechanism when the flakes were large; coalescence occurred when they were small. Coalescence also occurred when the binder was sprayed on. Owing to the greater viscosity of its melt, PEG 6000 produced larger granules than PEG 3000 or 4000. The influence of volumetric air flow was moderate, and the influence of heating temperature in the range of 70-90 degrees C was very low with both methods of addition. The disintegration time of tablets made from granules in which PEG was added as flakes was shorter than that of tablets from granules in which PEG was sprayed on. The latter method of binder addition led to tablets that did not disintegrate but eroded, apparently because a binder matrix formed that could not be destroyed by the disintegrant. PMID:11802658

  6. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions
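
    One small but central step in such a pipeline is body-weight SUV normalization, which reduces to a ratio once the activity concentration, the decay-corrected injected dose, and the patient weight are known. The sketch below is a simplified illustration of that calculation only; it is not the toolkit code contributed by the authors, and the numeric inputs are hypothetical.

      import math

      def suv_bw(activity_bq_per_ml, injected_dose_bq, patient_weight_kg,
                 minutes_since_injection, half_life_minutes=109.77):
          """Body-weight SUV: tissue activity concentration divided by the
          decay-corrected injected dose per gram of body weight (F-18 half-life default)."""
          decay_factor = math.exp(-math.log(2) * minutes_since_injection / half_life_minutes)
          dose_at_scan = injected_dose_bq * decay_factor
          return activity_bq_per_ml / (dose_at_scan / (patient_weight_kg * 1000.0))

      # hypothetical values for illustration only
      print(round(suv_bw(activity_bq_per_ml=12000, injected_dose_bq=370e6,
                         patient_weight_kg=80, minutes_since_injection=60), 2))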

  7. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions

  8. Binary neutron-star mergers with Whisky and SACRA: First quantitative comparison of results from independent general-relativistic hydrodynamics codes

    NASA Astrophysics Data System (ADS)

    Baiotti, Luca; Shibata, Masaru; Yamamoto, Tetsuro

    2010-09-01

    We present the first quantitative comparison of two independent general-relativistic hydrodynamics codes, the Whisky code and the SACRA code. We compare the output of simulations starting from the same initial data and carried out with the configuration (numerical methods, grid setup, resolution, gauges) that has been found, for each code, to give consistent and sufficiently accurate results, in particular in terms of the cleanness of the gravitational waveforms. We focus on the quantities that should be conserved during the evolution (rest mass, total mass energy, and total angular momentum) and on the gravitational-wave amplitude and frequency. We find that the results produced by the two codes agree at a reasonable level, with variations in the different quantities, but always to better than about 10%.
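
    A toy version of the kind of comparison described, using hypothetical arrays rather than Whisky or SACRA output: the relative drift of a conserved quantity within each code and the maximum relative amplitude difference between the two waveforms on a common time grid.

      import numpy as np

      t = np.linspace(0.0, 10.0, 500)                         # common time grid (arbitrary units)
      rest_mass_a = 1.0 - 1e-3 * t                            # hypothetical rest-mass history, code A
      rest_mass_b = 1.0 - 2e-3 * t                            # hypothetical rest-mass history, code B
      h_a = np.sin(2 * np.pi * t) * np.exp(0.05 * t)          # hypothetical GW strain, code A
      h_b = 1.05 * np.sin(2 * np.pi * t) * np.exp(0.05 * t)   # code B, 5% larger amplitude

      drift_a = abs(rest_mass_a[-1] - rest_mass_a[0]) / rest_mass_a[0]
      drift_b = abs(rest_mass_b[-1] - rest_mass_b[0]) / rest_mass_b[0]
      amp_diff = np.max(np.abs(h_a - h_b)) / np.max(np.abs(h_a))

      print(f"rest-mass drift: code A {drift_a:.2%}, code B {drift_b:.2%}")
      print(f"max relative amplitude difference: {amp_diff:.1%}")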

  9. An adapted mindfulness-based stress reduction program for elders in a continuing care retirement community: quantitative and qualitative results from a pilot randomized controlled trial.

    PubMed

    Moss, Aleezé S; Reibel, Diane K; Greeson, Jeffrey M; Thapar, Anjali; Bubb, Rebecca; Salmon, Jacqueline; Newberg, Andrew B

    2015-06-01

    The purpose of this study was to test the feasibility and effectiveness of an adapted 8-week Mindfulness-Based Stress Reduction (MBSR) program for elders in a continuing care community. This mixed-methods study used both quantitative and qualitative measures. A randomized waitlist control design was used for the quantitative aspect of the study. Thirty-nine elders were randomized to MBSR (n = 20) or a waitlist control group (n = 19); the mean age was 82 years. Both groups completed pre-post measures of health-related quality of life, acceptance and psychological flexibility, facets of mindfulness, self-compassion, and psychological distress. A subset of MBSR participants completed qualitative interviews. MBSR participants showed significantly greater improvement in acceptance and psychological flexibility and in role limitations due to physical health. In the qualitative interviews, MBSR participants reported increased awareness, less judgment, and greater self-compassion. Study results demonstrate the feasibility and potential effectiveness of an adapted MBSR program in promoting mind-body health for elders. PMID:25492049

  10. Results.

    ERIC Educational Resources Information Center

    Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.

    2001-01-01

    Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

  11. Food additives.

    PubMed

    Berglund, F

    1978-01-01

    The use of additives in food fulfils many purposes, as shown by the index issued by the Codex Committee on Food Additives: Acids, bases and salts; Preservatives; Antioxidants and antioxidant synergists; Anticaking agents; Colours; Emulsifiers; Thickening agents; Flour-treatment agents; Extraction solvents; Carrier solvents; Flavours (synthetic); Flavour enhancers; Non-nutritive sweeteners; Processing aids; Enzyme preparations. Many additives occur naturally in foods, but this does not exclude toxicity at higher levels. Some food additives are nutrients, or even essential nutrients, e.g. NaCl. Examples are known of food additives causing toxicity in man even when used according to regulations, e.g. cobalt in beer. In other instances, poisoning has been due to carry-over, e.g. by nitrate in cheese whey when used for artificial feed for infants. Poisonings also occur as the result of a permitted substance being added at too high a level, by accident or carelessness, e.g. nitrite in fish. Finally, there are examples of hypersensitivity to food additives, e.g. to tartrazine and other food colours. The toxicological evaluation, based on animal feeding studies, may be complicated by impurities, e.g. orthotoluene-sulfonamide in saccharin; by transformation or disappearance of the additive during food processing or storage, e.g. bisulfite in raisins; by reaction products with food constituents, e.g. formation of ethylurethane from diethyl pyrocarbonate; and by metabolic transformation products, e.g. formation in the gut of cyclohexylamine from cyclamate. Metabolic end products may differ in experimental animals and in man: guanylic acid and inosinic acid are metabolized to allantoin in the rat but to uric acid in man. The magnitude of the safety margin in man of the Acceptable Daily Intake (ADI) is not identical to the "safety factor" used when calculating the ADI. The symptoms of Chinese Restaurant Syndrome, although not hazardous, furthermore illustrate that the whole ADI

  12. Using qualitative research to facilitate the interpretation of quantitative results from a discrete choice experiment: insights from a survey in elderly ophthalmologic patients

    PubMed Central

    Vennedey, Vera; Danner, Marion; Evers, Silvia MAA; Fauser, Sascha; Stock, Stephanie; Dirksen, Carmen D; Hiligsmann, Mickaël

    2016-01-01

    Background Age-related macular degeneration (AMD) is the leading cause of visual impairment and blindness in industrialized countries. Currently, three main treatment options are available, all of which are intravitreal injections but which differ with regard to the frequency of injections needed, their approval status, and cost. This study aims to estimate patients’ preferences for the characteristics of treatment options for neovascular AMD. Methods An interviewer-assisted discrete choice experiment was conducted among patients suffering from AMD treated with intravitreal injections. A Bayesian efficient design was used for the development of 12 choice tasks. In each task, patients indicated their preference for one of two treatment scenarios described by the attributes side effects, approval status, effect on visual function, and injection and monitoring frequency. While answering the choice tasks, patients were asked to think aloud and explain the reasons for choosing or rejecting specific characteristics. Quantitative data were analyzed with a mixed multinomial logit model. Results Eighty-six patients completed the questionnaire. Patients significantly preferred treatments that improve visual function, are approved, are administered in a pro re nata regimen (as needed), and are accompanied by bimonthly monitoring. Patients significantly disliked less frequent monitoring visits (every 4 months) and explained that this was due to fear of deterioration going unnoticed and, in turn, of experiencing disease deterioration. Significant preference heterogeneity was found for all levels except bimonthly monitoring visits and severe, rare eye-related side effects. Patients gave clear explanations of their individual preferences during the interviews. Conclusion Significant preference trends were discernible for the overall sample, despite the preference heterogeneity for most treatment characteristics. Patients like to be monitored and treated regularly, but not too frequently
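
    Under a multinomial logit, the probability of choosing one scenario over another is a softmax over the scenario utilities implied by the attribute levels; the sketch below illustrates this with hypothetical part-worth utilities, not the study's estimated coefficients.

      import numpy as np

      def choice_probabilities(utilities):
          """Multinomial logit choice probabilities (softmax over alternatives)."""
          expu = np.exp(utilities - np.max(utilities))   # subtract max for numerical stability
          return expu / expu.sum()

      # hypothetical part-worth utilities for two treatment scenarios
      # (e.g., vision improvement + approved drug + pro-re-nata regimen vs. the alternative)
      u_scenario_a = 0.8 + 0.5 + 0.3
      u_scenario_b = 0.2 + 0.0 + 0.6
      print(choice_probabilities(np.array([u_scenario_a, u_scenario_b])))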

  13. Quantitatively Verifying the Results' Rationality for Farmland Quality Evaluation with Crop Yield, a Case Study in the Northwest Henan Province, China

    PubMed Central

    Huang, Junchang; Wang, Song

    2016-01-01

    Evaluating the rationality of assessment results for farmland quality (FQ) is usually qualitative, based on farmers' and experts' perceptions of soil quality and crop yield. Quantitative checking of such results remains difficult and is often neglected. In this paper, FQ in Xiuwu County, in the northwest of Henan Province, China, was evaluated by the gray relational analysis (GRA) method and the traditional analytic hierarchy process (AHP) method, and the consistency rate of the two results was analysed. The research focused on proposing a method of testing the rationality of FQ evaluation results based on crop yield: first, a grade map of crop yield is generated and overlaid with the FQ evaluation maps; then the consistency rate for each grade at the same spatial position is analysed; finally, the consistency is examined, allowing a decision on whether to adopt the results. The results showed that the area consistency rate and the proportion of matching evaluation units between the two methods were 84.68% and 87.29%, respectively, and the spatial distributions were approximately equal. The area consistency rates between the crop yield level and the FQ evaluation levels by GRA and AHP were 78.15% and 74.29%, respectively. Therefore, the verification results for GRA and AHP were similar, good, and acceptable, and the FQ results from both could reflect the crop yield levels. The evaluation results by GRA were, as a whole, slightly more rational than those by AHP. PMID:27490247
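
    A compact sketch of the overlay check described above, assuming the crop-yield grades and FQ grades have already been rasterized onto a common grid; the small example arrays are hypothetical.

      import numpy as np

      # hypothetical grade rasters on a common grid: 1 = best grade, 3 = worst
      yield_grade = np.array([[1, 1, 2],
                              [2, 2, 3],
                              [1, 3, 3]])
      fq_grade_gra = np.array([[1, 2, 2],
                               [2, 2, 3],
                               [1, 3, 2]])

      consistency_rate = np.mean(yield_grade == fq_grade_gra)
      print(f"overall area consistency: {consistency_rate:.1%}")

      for g in np.unique(yield_grade):                  # per-grade consistency
          mask = yield_grade == g
          print(f"grade {g}: {np.mean(fq_grade_gra[mask] == g):.1%}")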

  14. Overlapping Repressor Binding Sites Result in Additive Regulation of Escherichia coli FadH by FadR and ArcA▿

    PubMed Central

    Feng, Youjun; Cronan, John E.

    2010-01-01

    Escherichia coli fadH encodes a 2,4-dienoyl reductase that plays an auxiliary role in β-oxidation of certain unsaturated fatty acids. In the 2 decades since its discovery, FadH biochemistry has been studied extensively. However, the genetic regulation of FadH has been explored only partially. Here we report mapping of the fadH promoter and document its complex regulation by three independent regulators, the fatty acid degradation FadR repressor, the oxygen-responsive ArcA-ArcB two-component system, and the cyclic AMP receptor protein-cyclic AMP (CRP-cAMP) complex. Electrophoretic mobility shift assays demonstrated that FadR binds to the fadH promoter region and that this binding can be specifically reversed by long-chain acyl-coenzyme A (CoA) thioesters. In vivo data combining transcriptional lacZ fusion and real-time quantitative PCR (qPCR) analyses indicated that fadH is strongly repressed by FadR, in agreement with induction of fadH by long-chain fatty acids. Inactivation of arcA increased fadH transcription by >3-fold under anaerobic conditions. Moreover, fadH expression was increased 8- to 10-fold under anaerobic conditions upon deletion of both the fadR and the arcA gene, indicating that anaerobic expression is additively repressed by FadR and ArcA-ArcB. Unlike fadM, a newly reported member of the E. coli fad regulon that encodes another auxiliary β-oxidation enzyme, fadH was activated by the CRP-cAMP complex in a manner similar to those of the prototypical fad genes. In the absence of the CRP-cAMP complex, repression of fadH expression by both FadR and ArcA-ArcB was very weak, suggesting a possible interplay with other DNA binding proteins. PMID:20622065

  15. Biomechanical effects of teriparatide in women with osteoporosis treated previously with alendronate and risedronate: results from quantitative computed tomography-based finite element analysis of the vertebral body.

    PubMed

    Chevalier, Yan; Quek, Evelyn; Borah, Babul; Gross, Gary; Stewart, John; Lang, Thomas; Zysset, Philippe

    2010-01-01

    Previous antiresorptive treatment may influence the anabolic response to teriparatide. The OPTAMISE (Open-label Study to Determine How Prior Therapy with Alendronate or Risedronate in Postmenopausal Women with Osteoporosis Influences the Clinical Effectiveness of Teriparatide) study reported greater increases in biochemical markers of bone turnover and volumetric bone mineral density (BMD) when 12 months of teriparatide treatment was preceded by 2 years or more of risedronate versus alendronate treatment. The objective of this study was to use quantitative computed tomography (CT)-based nonlinear finite element modeling to evaluate how prior therapy with alendronate or risedronate in postmenopausal women with osteoporosis influences the biomechanical effectiveness of teriparatide. Finite element models of the L1 vertebra were created from quantitative CT scans, acquired before and after 12 months of therapy with teriparatide, from 171 patients from the OPTAMISE study. These models were subjected to uniaxial compression. Three quantities were calculated at each measurement time point: total BMD-derived bone volume fraction (BV/TV(d), i.e., bone volume [BV]/total volume [TV]) estimated from quantitative CT-based volumetric BMD, vertebral stiffness, and failure load (strength). The results of this study demonstrated that 12 months of treatment with teriparatide following prior treatment with either risedronate or alendronate increased BMD-derived BV/TV(d), the predicted vertebral stiffness, and failure load. However, the effects of teriparatide were more pronounced in patients treated previously with risedronate, which is consistent with the findings of the OPTAMISE study. The mean (+/-standard error) increase in stiffness was greater in the prior risedronate group than in the prior alendronate group (24.6+/-3.2% versus 14.4+/-2.8%, respectively; p=0.0073). Similarly, vertebral failure load increased by 27.2+/-3.5% in the prior risedronate group versus 15.3+/-3.1% in the prior

  16. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.

  17. Two cases of food additive-induced severe liver damage associated with positive results on lymphocyte stimulation test and for antinuclear antibodies.

    PubMed

    Kaneko, Rena; Ohishi, Chitose; Kim, Miniru; Shiina, Masaaki; Kusayanagi, Satoshi; Ogawa, Masazumi; Munakata, Kazuo; Mizuno, Kyoichi; Sato, Yuzuru

    2012-08-01

    Two cases of severe liver injury and positive result for antinuclear antibodies induced by food additives are reported. The first patient reported long-term intake of Mabo Ramen(®) noodle soup, nutritional supplements, and over-the-counter drugs. Total bilirubin, aspartate aminotransferase, and alanine aminotransferase were 9.6 mg/dL, 1,048, and 1,574 IU/L, respectively. Antinuclear antibody was 80×. The drug-induced lymphocyte stimulation test (DLST) was positive for Mabo Ramen(®) and its additives such as Xanthan gum, guar gum, and Doubanjiang. Histologic examination of a liver biopsy specimen showed lymphocyte infiltration and necrosis. The autoimmune hepatitis score was 3. The second patient reported intake of dietary supplements, including Bimore C(®) and Chokora BB(®). Laboratory tests revealed that total bilirubin was 9.8 mg/dL, aspartate aminotransferase was 1,130 IU/L, and alanine aminotransferase was 1,094 IU/L. Antinuclear antibody was 320×. Co-existing pancreatic damage was confirmed by the findings on abdominal CT and elevation of serum lipase, span-1, and DUPAN-2. DLSTs were positive for both supplements. These two supplements contained additives such as titanium oxide, magnesium stearate, and hydroxypropylcellulose. DLSTs for all three additives were positive. Histologic examination revealed periportal necrosis and lymphocyte infiltration of lobular and portal areas. These two cases demonstrate that repeating DLSTs is useful for identifying causative constituents in foods and supplements. PMID:26182392

  18. Quantitative High-Efficiency Cadmium-Zinc-Telluride SPECT with Dedicated Parallel-Hole Collimation System in Obese Patients: Results of a Multi-Center Study

    PubMed Central

    Nakazato, Ryo; Slomka, Piotr J.; Fish, Mathews; Schwartz, Ronald G.; Hayes, Sean W.; Thomson, Louise E.J.; Friedman, John D.; Lemley, Mark; Mackin, Maria L.; Peterson, Benjamin; Schwartz, Arielle M.; Doran, Jesse A.; Germano, Guido; Berman, Daniel S.

    2014-01-01

    Background Obesity is a common source of artifact on conventional SPECT myocardial perfusion imaging (MPI). We evaluated image quality and diagnostic performance of high-efficiency (HE) cadmium-zinc-telluride (CZT) parallel-hole SPECT-MPI for coronary artery disease (CAD) in obese patients. Methods and Results 118 consecutive obese patients at 3 centers (BMI 43.6±8.9 kg/m2, range 35–79.7 kg/m2) had upright/supine HE-SPECT and ICA >6 months (n=67) or low-likelihood of CAD (n=51). Stress quantitative total perfusion deficit (TPD) for upright (U-TPD), supine (S-TPD) and combined acquisitions (C-TPD) was assessed. Image quality (IQ; 5=excellent; <3 nondiagnostic) was compared among BMI 35–39.9 (n=58), 40–44.9 (n=24) and ≥45 (n=36) groups. ROC-curve areas for CAD detection (≥50% stenosis) for U-TPD, S-TPD, and C-TPD were 0.80, 0.80, and 0.87, respectively. Sensitivity/specificity was 82%/57% for U-TPD, 74%/71% for S-TPD, and 80%/82% for C-TPD. C-TPD had the highest specificity (P=.02). The C-TPD normalcy rate was higher than that of U-TPD (88% vs. 75%, P=.02). Mean IQ was similar among the BMI 35–39.9, 40–44.9 and ≥45 groups [4.6 vs. 4.4 vs. 4.5, respectively (P=.6)]. No patient had a non-diagnostic stress scan. Conclusions In obese patients, HE-SPECT MPI with dedicated parallel-hole collimation demonstrated high image quality, normalcy rate, and diagnostic accuracy for CAD by quantitative analysis of combined upright/supine acquisitions. PMID:25388380

  19. Amplitudes of Pain-Related Evoked Potentials Are Useful to Detect Small Fiber Involvement in Painful Mixed Fiber Neuropathies in Addition to Quantitative Sensory Testing – An Electrophysiological Study

    PubMed Central

    Hansen, Niels; Kahn, Ann-Kathrin; Zeller, Daniel; Katsarava, Zaza; Sommer, Claudia; Üçeyler, Nurcan

    2015-01-01

    The aim of this study was to investigate the usefulness of pain-related evoked potentials (PREP) elicited by electrical stimulation for the identification of small fiber involvement in patients with mixed fiber neuropathy (MFN). Eleven MFN patients with clinical signs of large fiber impairment and neuropathic pain and ten healthy controls underwent clinical and electrophysiological evaluation. Small fiber function, electrical conductivity, and morphology were examined by quantitative sensory testing (QST), PREP, and skin punch biopsy. MFN was diagnosed following clinical and electrophysiological examination (chronic inflammatory demyelinating neuropathy: n = 6; vasculitic neuropathy: n = 3; chronic axonal neuropathy: n = 2). The majority of patients with MFN characterized their pain by descriptors that mainly represent C-fiber-mediated pain. In QST, patients displayed elevated cold, warm, mechanical, and vibration detection thresholds and cold pain thresholds indicative of MFN. PREP amplitudes in patients correlated with cold (p < 0.05) and warm detection thresholds (p < 0.05). Burning pain and the presence of par-/dysesthesias correlated negatively with PREP amplitudes (p < 0.05). PREP amplitudes correlating with cold and warm detection thresholds, burning pain, and par-/dysesthesias support employing PREP amplitudes as an additional tool in conjunction with QST for detecting small fiber impairment in patients with MFN. PMID:26696950

  20. Pulsed addition of HMF and furfural to batch-grown xylose-utilizing Saccharomyces cerevisiae results in different physiological responses in glucose and xylose consumption phase

    PubMed Central

    2013-01-01

    Background Pretreatment of lignocellulosic biomass generates a number of undesired degradation products that can inhibit microbial metabolism. Two of these compounds, the furan aldehydes 5-hydroxymethylfurfural (HMF) and 2-furaldehyde (furfural), have been shown to be an impediment for viable ethanol production. In the present study, HMF and furfural were pulse-added during either the glucose or the xylose consumption phase in order to dissect the effects of these inhibitors on energy state, redox metabolism, and gene expression of xylose-consuming Saccharomyces cerevisiae. Results Pulsed addition of 3.9 g L-1 HMF and 1.2 g L-1 furfural during either the glucose or the xylose consumption phase resulted in distinct physiological responses. Addition of furan aldehydes in the glucose consumption phase was followed by a decrease in the specific growth rate and the glycerol yield, whereas the acetate yield increased 7.3-fold, suggesting that NAD(P)H for furan aldehyde conversion was generated by acetate synthesis. No change in the intracellular levels of NAD(P)H was observed 1 hour after pulsing, whereas the intracellular concentration of ATP increased by 58%. An investigation of the response at transcriptional level revealed changes known to be correlated with perturbations in the specific growth rate, such as protein and nucleotide biosynthesis. Addition of furan aldehydes during the xylose consumption phase brought about an increase in the glycerol and acetate yields, whereas the xylitol yield was severely reduced. The intracellular concentrations of NADH and NADPH decreased by 58 and 85%, respectively, hence suggesting that HMF and furfural drained the cells of reducing power. The intracellular concentration of ATP was reduced by 42% 1 hour after pulsing of inhibitors, suggesting that energy-requiring repair or maintenance processes were activated. Transcriptome profiling showed that NADPH-requiring processes such as amino acid biosynthesis and sulfate and

  1. High SO{sub 2} removal efficiency testing: Results of DBA and sodium formate additive tests at Southwestern Electric Power Company's Pirkey Station

    SciTech Connect

    1996-05-30

    Tests were conducted at Southwestern Electric Power Company's (SWEPCo) Henry W. Pirkey Station wet limestone flue gas desulfurization (FGD) system to evaluate options for achieving high sulfur dioxide removal efficiency. The Pirkey FGD system includes four absorber modules, each with dual slurry recirculation loops and with a perforated plate tray in the upper loop. The options tested involved the use of dibasic acid (DBA) or sodium formate as a performance additive. The effectiveness of other potential options was simulated with the Electric Power Research Institute's (EPRI) FGD PRocess Integration and Simulation Model (FGDPRISM) after it was calibrated to the system. An economic analysis was done to determine the cost effectiveness of the high-efficiency options. Results are summarized below.

  2. Quantitative Sodium MR Imaging at 7 T: Initial Results and Comparison with Diffusion-weighted Imaging in Patients with Breast Tumors.

    PubMed

    Zaric, Olgica; Pinker, Katja; Zbyn, Stefan; Strasser, Bernhard; Robinson, Simon; Minarikova, Lenka; Gruber, Stephan; Farr, Alex; Singer, Christian; Helbich, Thomas H; Trattnig, Siegfried; Bogner, Wolfgang

    2016-07-01

    Purpose To investigate the clinical feasibility of a quantitative sodium 23 ((23)Na) magnetic resonance (MR) imaging protocol developed for breast tumor assessment and to compare it with 7-T diffusion-weighted imaging (DWI). Materials and Methods Written informed consent in this institutional review board-approved study was obtained from eight healthy volunteers and 17 patients with 20 breast tumors (five benign, 15 malignant). To achieve the best image quality and reproducibility, the (23)Na sequence was optimized and tested on phantoms and healthy volunteers. For in vivo quantification of absolute tissue sodium concentration (TSC), an external phantom was used. Static magnetic field, or B0, and combined transmit and receive radiofrequency field, or B1, maps were acquired, and image quality, measurement reproducibility, and accuracy testing were performed. Bilateral (23)Na and DWI sequences were performed before contrast material-enhanced MR imaging in patients with breast tumors. TSC and apparent diffusion coefficient (ADC) were calculated and correlated for healthy glandular tissue and benign and malignant lesions. Results The (23)Na MR imaging protocol is feasible, with 1.5-mm in-plane resolution and 16-minute imaging time. Good image quality was achieved, with high reproducibility (mean TSC values ± standard deviation for the test, 36 mmol per kilogram of wet weight ± 2 [range, 34-37 mmol/kg]; for the retest, 37 mmol/kg ± 1 [range, 35-39 mmol/kg]; P = .610) and accuracy (r = 0.998, P < .001). TSC values in normal glandular and adipose breast tissue were 35 mmol/kg ± 3 and 18 mmol/kg ± 3, respectively. In malignant lesions (mean size, 31 mm ± 24; range, 6-92 mm), the TSC of 69 mmol/kg ± 10 was, on average, 49% higher than that in benign lesions (mean size, 14 mm ± 12; range, 6-35 mm), with a TSC of 47 mmol/kg ± 8 (P = .002). There were similar ADC differences between benign ([1.78 ± 0.23] × 10(-3) mm(2)/sec) and malignant ([1.03 ± 0.23] × 10(-3) mm
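
    In its simplest form, quantification against an external reference phantom scales the tissue signal by the phantom's known concentration. The sketch below shows only that schematic step, omits relaxation and B1 corrections, and uses made-up signal values, so it should not be read as the study's actual quantification procedure.

      def tissue_sodium_concentration(signal_tissue, signal_phantom, phantom_conc_mmol_per_kg):
          """Estimate TSC by scaling against a reference phantom of known concentration.
          Simplification: relaxation and B1 inhomogeneity corrections are omitted."""
          return phantom_conc_mmol_per_kg * signal_tissue / signal_phantom

      # hypothetical mean signals from regions of interest
      print(round(tissue_sodium_concentration(signal_tissue=420.0,
                                              signal_phantom=600.0,
                                              phantom_conc_mmol_per_kg=50.0), 1))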

  3. [Quantitative ultrasound].

    PubMed

    Barkmann, R; Glüer, C-C

    2006-10-01

    Methods of quantitative ultrasound (QUS) can be used to obtain information about bone fragility. Comprehensive study results exist showing the power of QUS for the estimation of osteoporotic fracture risk. Nevertheless, the variety of technologies, devices, and variables, as well as the differing degrees of validation of the individual devices, has to be taken into account. Using methods to simulate ultrasound propagation, the complex interaction between ultrasound and bone could be understood and the propagation could be visualized. Before widespread clinical use, it has to be clarified whether patients with low QUS values will profit from therapy, as has been shown for DXA. Moreover, the introduction of quality assurance measures is essential. The user should know the limitations of the methods and be able to interpret the results correctly. Applied in an adequate manner, QUS methods could then, owing to their lower costs and the absence of ionizing radiation, become important tools in osteoporosis management. PMID:16896637

  4. Combining an amyloid-beta (Aβ) cleaving enzyme inhibitor with a γ-secretase modulator results in an additive reduction of Aβ production.

    PubMed

    Strömberg, Kia; Eketjäll, Susanna; Georgievska, Biljana; Tunblad, Karin; Eliason, Kristina; Olsson, Fredrik; Radesäter, Ann-Cathrin; Klintenberg, Rebecka; Arvidsson, Per I; von Berg, Stefan; Fälting, Johanna; Cowburn, Richard F; Dabrowski, Michael

    2015-01-01

    A major hallmark of Alzheimer's disease (AD) is the deposition of amyloid-β (Aβ) peptides in amyloid plaques. Aβ peptides are produced by sequential cleavage of the amyloid precursor protein by the β amyloid cleaving enzyme (BACE) and the γ-secretase (γ-sec) complex. Pharmacological treatments that decrease brain levels of, in particular, the toxic Aβ42 peptide are thought to be promising approaches for AD disease modification. Potent and selective BACE1 inhibitors as well as γ-sec modulators (GSMs) have been designed. Pharmacological intervention in secretase function is not without risks of either on- or off-target adverse effects. One way of improving the therapeutic window could be to combine treatment on multiple targets, using smaller individual doses and thereby minimizing adverse effect liability. We show that combined treatment of primary cortical neurons with a BACE1 inhibitor and a GSM gives an additive effect on the change in Aβ42 levels compared with the individual treatments. We extend this finding to C57BL/6 mice, where the combined treatment results in a reduction of brain Aβ42 levels reflecting the sum of the individual treatment efficacies. These results show that pharmacological targeting of two amyloid precursor protein processing steps is feasible without negatively interfering with the mechanism of action on the individual targets. We conclude that targeting Aβ production by combining a BACE inhibitor and a GSM could be a viable approach for therapeutic intervention in AD modification. PMID:25303711

  5. Change in cardio-protective medication and health-related quality of life after diagnosis of screen-detected diabetes: Results from the ADDITION-Cambridge cohort

    PubMed Central

    Black, J.A.; Long, G.H.; Sharp, S.J.; Kuznetsov, L.; Boothby, C.E.; Griffin, S.J.; Simmons, R.K.

    2015-01-01

    Aims Establishing a balance between the benefits and harms of treatment is important among individuals with screen-detected diabetes, for whom the burden of treatment might be higher than the burden of the disease. We described the association between cardio-protective medication and health-related quality of life (HRQoL) among individuals with screen-detected diabetes. Methods 867 participants with screen-detected diabetes underwent clinical measurements at diagnosis and at one and five years. General HRQoL (EQ-5D) was measured at baseline and at one and five years, and diabetes-specific HRQoL (ADDQoL-AWI) and health status (SF-36) at one and five years. Multivariable linear regression was used to quantify the association between change in HRQoL and change in cardio-protective medication. Results The median (IQR) number of prescribed cardio-protective agents was 2 (1 to 3) at diagnosis, 3 (2 to 4) at one year, and 4 (3 to 5) at five years. Change in cardio-protective medication was not associated with change in HRQoL from diagnosis to one year. From one year to five years, change in cardio-protective agents was not associated with change in the SF-36 mental health score. One additional agent was associated with an increase in the SF-36 physical health score (2.1; 95%CI 0.4, 3.8) and an increase in the EQ-5D (0.05; 95%CI 0.02, 0.08). Conversely, one additional agent was associated with a decrease in the ADDQoL-AWI (−0.32; 95%CI −0.51, −0.13), compared to no change. Conclusions We found little evidence that increases in the number of cardio-protective medications impacted negatively on HRQoL among individuals with screen-detected diabetes over five years. PMID:25937542

  6. Vildagliptin in addition to metformin improves retinal blood flow and erythrocyte deformability in patients with type 2 diabetes mellitus – results from an exploratory study

    PubMed Central

    2013-01-01

    Numerous rheological and microvascular alterations characterize the vascular pathology in patients with type 2 diabetes mellitus (T2DM). This study investigated the effects of vildagliptin in comparison to glimepiride on retinal microvascular blood flow and erythrocyte deformability in T2DM. Forty-four patients with T2DM on metformin monotherapy were included in this randomized, exploratory study over 24 weeks. Patients were randomized to receive either vildagliptin (50 mg twice daily) or glimepiride individually titrated up to 4 mg in addition to ongoing metformin treatment. Retinal microvascular blood flow (RBF) and the arteriolar wall-to-lumen ratio (WLR) were assessed using a laser Doppler scanner. In addition, the erythrocyte elongation index (EI) was measured at different shear stresses using laser diffractoscopy. Both treatments improved glycaemic control (p < 0.05 vs. baseline, respectively). While only slight changes in RBF and the WLR could be observed during treatment with glimepiride, vildagliptin significantly increased retinal blood flow and decreased the arteriolar WLR (p < 0.05 vs. baseline, respectively). The EI increased during both treatments over a wide range of applied shear stresses (p < 0.05 vs. baseline). An inverse correlation was observed between improved glycaemic control (HbA1c) and EI (r = −0.524; p < 0.0001), but not with the changes in retinal microvascular measurements. Our results suggest that vildagliptin might exert beneficial effects on retinal microvascular blood flow beyond glucose control. In contrast, the improvement in erythrocyte deformability observed in both treatment groups seems to be a correlate of improved glycaemic control. PMID:23565740

  7. SU-E-J-06: Additional Imaging Guidance Dose to Patient Organs Resulting From X-Ray Tubes Used in CyberKnife Image Guidance System

    SciTech Connect

    Sullivan, A; Ding, G

    2015-06-15

    Purpose: The use of image-guided radiation therapy (IGRT) has become increasingly common, but the additional radiation exposure resulting from repeated image guidance procedures raises concerns. Although there are many studies reporting imaging dose from different image guidance devices, imaging dose for the CyberKnife Robotic Radiosurgery System is not available. This study provides estimated organ doses resulting from image guidance procedures on the CyberKnife system. Methods: Commercially available Monte Carlo software, PCXMC, was used to calculate average organ doses resulting from the x-ray tubes used in the CyberKnife system. There are seven imaging protocols, with tube voltages ranging from 60 to 120 kVp at 15 mAs, for treatment sites in the cranium, head and neck, thorax, and abdomen. The output of each imaging protocol was measured at the treatment isocenter. For each site and protocol, adult body sizes ranging from anorexic to extremely obese were simulated, since organ dose depends on patient size. Doses for all organs within the imaging field-of-view of each site were calculated for a single image acquisition from both of the orthogonal x-ray tubes. Results: Average organ doses were <1.0 mGy for every treatment site and imaging protocol. For a given organ, dose increases as kV increases or body size decreases. Higher doses are typically reported for skeletal components, such as the skull, ribs, or clavicles, than for soft-tissue organs. Typical organ doses due to a single exposure are estimated as 0.23 mGy to the brain, 0.29 mGy to the heart, 0.08 mGy to the kidneys, etc., depending on the imaging protocol and site. Conclusion: The organ doses vary with treatment site, imaging protocol, and patient size. Although the organ dose from a single image acquisition resulting from two orthogonal beams is generally insignificant, the sum of repeated image acquisitions (>100) could reach 10–20 cGy for a typical treatment fraction.
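
    The cumulative imaging dose is simply the per-acquisition organ dose summed over the number of orthogonal image pairs; the sketch below illustrates the scale of the effect with hypothetical values, not the study's estimates for any specific organ or protocol.

      def cumulative_imaging_dose_mGy(dose_per_tube_mGy, n_image_pairs):
          """Total organ dose from repeated orthogonal-pair acquisitions (two tubes per pair)."""
          return 2.0 * dose_per_tube_mGy * n_image_pairs

      # hypothetical: ~0.3 mGy per tube to a given organ, 150 orthogonal image pairs
      total = cumulative_imaging_dose_mGy(0.3, 150)
      print(f"{total:.0f} mGy  (= {total / 10:.0f} cGy)")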

  8. 1.0 Million Btu combustor testing: Test results: Part 2. [Hydrate Addition at Low Temperature for the removal of SO/sub 2/

    SciTech Connect

    Babu, M.; College, J.; Forsythe, R.; Kanary, D.

    1988-12-01

    ''Hydrate Addition at Low Temperature'' or HALT is a dry calcium-based hydrate injection process for the removal of SO/sub 2/ from flue gases off a sulfur-bearing fuel. In this process the hydrate is pneumatically conveyed and injected into the flue gas stream as a dry particulate. The flue gas is cooled downstream of the hydrate injection location by spraying the gas with a stream of finely atomized water droplets. The water is atomized into a fine spray mist by using air under pressure as the atomizing fluid. The waste product from this process consists of dry, disposable solids, which differ considerably from the wet cake solids obtained from a wet FGD process. The HALT test program currently being conducted at Dravo Lime Company and Ohio Edison Company is to be carried out in two stages: (1) parametric testing on a 1.0 MM BTU/hour combustor, and (2) follow-up long-term testing (six months) on a 5 MW unit. The first stage of the program, which involves the parametric testing, is now completed. Results are presented. 9 refs., 18 figs.

  9. Multi-nozzle humidification tests: Test results: Part 4. [Hydrate addition at low temperature for the removal of SO/sub 2/

    SciTech Connect

    Stouffer, M.

    1988-12-01

    ''Hydrate Addition at Low Temperature'' or HALT is a dry calcium-based hydrate injection process for the removal of SO/sub 2/ from flue gases off a sulfur bearing fuel. In this process the hydrate is pneumatically conveyed and injected into the flue gas stream as a dry particulate. The flue gas is cooled downstream of the hydrate injection by spraying the gas with a stream of finely atomized water droplets. The water is atomized into a fine spray by using air under pressure as the atomizing fluid. The spray nozzles are specially designed. Results are presented on nozzle array field tests conducted using the Dravo HALT unit at Ohio Edison's Toronto station. A method for humidifier scale-up from single-nozzle pilot test data was demonstrated. The method uses arrays of nozzles, with each individual nozzle operated at fixed conditions determined as optimum in the single-nozzle tests. By applying this method, the Consol 8.3-inch pilot humidifier operation with a single Spraying Systems 1/8JJ-J12 nozzle was successfully scaled up to operation of the Dravo 31 x 31-inch humidifier with arrays of up to 46 J12 nozzles. The tests provided data on nozzle deposition and solids dropout that may be useful for large-scale humidifier design. 4 refs., 16 figs., 10 tabs.

  10. Movement of tagged dredged sand at thalweg disposal sites in the Upper Mississippi River. Volume 3. Additional results at Gordon's Ferry and Whitney Island sites

    SciTech Connect

    McCown, D.L.; Paddock, R.A.

    1985-04-01

    During routine channel maintenance, hydraulically dredged sand was tagged with sand coated with fluorescent dye before being deposited as a pile in the thalweg at three sites on the Upper Mississippi River. As discussed in the first two volumes of this report, bathymetry was measured and surface sediments were sampled to study changes in the topography of the disposal pile and the downstream movement of the tagged sand. At all three sites, topographic evidence of the pile disappeared after the first period of high river flow, which was followed by redevelopment of dunes in the disposal area. The tagged sand did not migrate into nearby border areas, backwaters, or sloughs, remaining in the main channel as it moved downstream. This volume presents the results of additional surveys at the Gordon's Ferry and Whitney Island sites. At Gordon's Ferry, 25 bottom cores were taken to examine the three-dimensional distribution of tagged sand in the bottom sediments. The core analyses indicated that much of the tagged sand had been incorporated into the dune structure and that it resided primarily in the crests of the dunes.

  11. Accuracy and Precision in the Southern Hemisphere Additional Ozonesondes (SHADOZ) Dataset 1998-2000 in Light of the JOSIE-2000 Results

    NASA Technical Reports Server (NTRS)

    Witte, J. C.; Thompson, A. M.; Schmidlin, F. J.; Oltmans, S. J.; McPeters, R. D.; Smit, H. G. J.

    2003-01-01

    A network of 12 southern hemisphere tropical and subtropical stations in the Southern Hemisphere ADditional OZonesondes (SHADOZ) project has provided over 2000 profiles of stratospheric and tropospheric ozone since 1998. Balloon-borne electrochemical concentration cell (ECC) ozonesondes are used with standard radiosondes for pressure, temperature and relative humidity measurements. The archived data are available at: http://croc.gsfc.nasa.gov/shadoz. In Thompson et al., accuracies and imprecisions in the SHADOZ 1998-2000 dataset were examined using ground-based instruments and the TOMS total ozone measurement (version 7) as references. Small variations in ozonesonde technique introduced possible biases from station to station. SHADOZ total ozone column amounts are now compared to version 8 TOMS; discrepancies between the two datasets are reduced by 2% on average. An evaluation of ozone variations among the stations is made using the results of a series of chamber simulations of ozone launches (JOSIE-2000, Juelich Ozonesonde Intercomparison Experiment) in which a standard reference ozone instrument was employed with the various sonde techniques used in SHADOZ. A number of variations in SHADOZ ozone data are explained when differences in solution strength, data processing and instrument type (manufacturer) are taken into account.

  12. High SO{sub 2} removal efficiency testing. Topical report - results of sodium formate additive tests at New York State Electric & Gas Corporation's Kintigh Station

    SciTech Connect

    Murphy, J.

    1997-02-14

    Tests were conducted at New York State Electric & Gas Corporation's (NYSEG's) Kintigh Station to evaluate options for achieving high sulfur dioxide (SO{sub 2}) removal efficiency in the wet limestone flue gas desulfurization (FGD) system. This test program was one of six conducted by the U.S. Department of Energy to evaluate low-capital-cost upgrades to existing FGD systems as a means for utilities to comply with the requirements of the 1990 Clean Air Act Amendments. The upgrade option tested at Kintigh was sodium formate additive. Results from the tests were used to calibrate the Electric Power Research Institute's (EPRI) FGD PRocess Integration and Simulation Model (FGDPRISM) to the Kintigh scrubber configuration. FGDPRISM was then used to predict system performance under conditions other than those tested. An economic evaluation was then done to determine the cost effectiveness of various high-efficiency upgrade options. These costs can be compared with the estimated market value of SO{sub 2} allowances or the expected costs of allowances generated by other means, such as fuel switching or new scrubbers, to arrive at the most cost-effective strategy for Clean Air Act compliance.

  13. Prevalence of sexual desire and satisfaction among patients with screen-detected diabetes and impact of intensive multifactorial treatment: Results from the ADDITION-Denmark study

    PubMed Central

    Giraldi, Annamaria; Kristensen, Ellids; Lauritzen, Torsten; Sandbæk, Annelli; Charles, Morten

    2015-01-01

    Abstract Objective. Sexual problems are common in people with diabetes. It is unknown whether early detection of diabetes and subsequent intensive multifactorial treatment (IT) are associated with sexual health. We report the prevalence of low sexual desire and low sexual satisfaction among people with screen-detected diabetes and compare the impact of intensive multifactorial treatment with the impact of routine care (RC) on these measures. Design. A cross-sectional analysis of the ADDITION-Denmark trial cohort six years post-diagnosis. Setting. 190 general practices around Denmark. Subjects. A total of 968 patients with screen-detected type 2 diabetes. Main outcome measures. Low sexual desire and low sexual satisfaction. Results. Mean (standard deviation, SD) age was 64.9 (6.9) years. The prevalence of low sexual desire was 53% (RC) and 54% (IT) among women, and 24% (RC) and 25% (IT) among men. The prevalence of low sexual satisfaction was 23% (RC) and 18% (IT) among women, and 27% (RC) and 37% (IT) among men. Among men, the prevalence of low sexual satisfaction was significantly higher in the IT group than in the RC group, p = 0.01. Conclusion. Low sexual desire and low satisfaction are frequent among men and women with screen-detected diabetes, and IT may negatively impact men's sexual satisfaction. PMID:25659194

  14. Addition of niclosamide to palladium(II) saccharinate complex of terpyridine results in enhanced cytotoxic activity inducing apoptosis on cancer stem cells of breast cancer.

    PubMed

    Karakas, Didem; Cevatemre, Buse; Aztopal, Nazlihan; Ari, Ferda; Yilmaz, Veysel Turan; Ulukaya, Engin

    2015-09-01

    Wnt signaling is one of the core signaling pathways of cancer stem cells (CSCs). It is re-activated in CSCs and plays an essential role in the survival, self-renewal and proliferation of these cells. Therefore, we aimed to evaluate the cytotoxic effects of the palladium(II) complex formulated as [PdCl(terpy)](sac)2H2O, and of its combination with niclosamide, an inhibitor of the Wnt signaling pathway associated with breast cancer stem cells. Characteristic cell surface markers (CD44(+)/CD24(-)) were determined by flow cytometry in CSCs. The ATP viability assay was used to determine cytotoxic activity. The mode of cell death was evaluated morphologically using fluorescence microscopy and biochemically using the M30 ELISA assay as well as qPCR. Our study demonstrated that the combination of niclosamide (1.5 μM) and the Pd(II) complex (12.5, 25 and 50 μM) at 48 h had enhanced cytotoxic activity resulting from the induction of apoptosis (indicated by the presence of pyknotic nuclei, increases in M30 levels, and overexpression of the proapoptotic genes TNFRSF10A and FAS). Importantly, the addition of niclosamide resulted in the suppression of autophagy (shown by the decrease in ATG5 gene expression), which might have contributed to the enhanced cytotoxicity. In conclusion, the application of this combination may be regarded as a novel and effective approach for the treatment of breast cancer due to its promising cytotoxic effect on cancer stem cells, which cause recurrence of the disease. PMID:26234907

  15. High frequency transcutaneous electrical nerve stimulation with diphenidol administration results in an additive antiallodynic effect in rats following chronic constriction injury.

    PubMed

    Lin, Heng-Teng; Chiu, Chong-Chi; Wang, Jhi-Joung; Hung, Ching-Hsia; Chen, Yu-Wen

    2015-03-01

    The impact of coadministration of transcutaneous electrical nerve stimulation (TENS) and diphenidol is not well established. Here we estimated the effects of diphenidol in combination with TENS on mechanical allodynia and tumor necrosis factor-α (TNF-α) expression. Using a rat chronic constriction injury (CCI) model, animals were assessed for mechanical sensitivity via von Frey hair stimulation and for TNF-α expression in the sciatic nerve using an ELISA assay. High frequency (100 Hz) TENS or intraperitoneal injection of diphenidol (2.0 μmol/kg) was applied daily, starting on postoperative day 1 (POD1) and continuing for the next 13 days. We demonstrated that both the high frequency TENS and diphenidol groups had a 60% increase in mechanical withdrawal thresholds. Coadministration of high frequency TENS and diphenidol produced higher paw withdrawal thresholds than either high frequency TENS alone or diphenidol alone. Both the diphenidol group and the group coadministered high frequency TENS with diphenidol showed a significant reduction of the TNF-α level in the sciatic nerve on POD7 compared with the CCI or high frequency TENS group (P<0.05), whereas the CCI and high frequency TENS groups exhibited higher TNF-α levels than the sham group (P<0.05). Our data revealed that diphenidol alone, high frequency TENS alone, and the combination each produced a reduction of neuropathic allodynia. Both diphenidol and the combination of diphenidol with high frequency TENS inhibited TNF-α expression. A moderately effective dose of diphenidol appeared to have an additive effect with high frequency TENS. Therefore, multidisciplinary treatments could be considered for this kind of mechanical allodynia. PMID:25596445

  16. Quantitative Liver Function Tests Improve the Prediction of Clinical Outcomes in Chronic Hepatitis C: Results from the HALT-C Trial

    PubMed Central

    Everson, Gregory T.; Shiffman, Mitchell L.; Hoefs, John C.; Morgan, Timothy R.; Sterling, Richard K.; Wagner, David A.; Lauriski, Shannon; Curto, Teresa M.; Stoddard, Anne; Wright, Elizabeth C.

    2011-01-01

    Risk for future clinical outcomes is proportional to the severity of liver disease in patients with chronic hepatitis C. We measured disease severity by quantitative liver function tests (QLFTs) to determine cutoffs for QLFTs that identified patients who were at low and high risk for a clinical outcome. Two hundred twenty-seven participants in the Hepatitis C Antiviral Long-Term Treatment Against Cirrhosis (HALT-C) Trial underwent baseline QLFTs and were followed for a median of 5.5 years for clinical outcomes. QLFTs were repeated in 196 patients at month 24 and in 165 patients at month 48. Caffeine elimination rate (k), antipyrine (AP) clearance (Cl), MEGX concentration, methionine breath test (MBT), galactose elimination capacity (GEC), dual cholate (CA) clearances and shunt, and perfused hepatic mass (PHM) and liver and spleen volumes (SPECT) were measured. Baseline QLFTs were significantly worse (p=0.0017 to <0.0001) and spleen volumes larger (p<0.0001) in the 54 patients who subsequently experienced clinical outcomes. QLFT cutoffs that characterized patients as “low” and “high risk” for clinical outcome yielded hazard ratios ranging from 2.21 (95% CI 1.29–3.78) for GEC to 6.52 (95% CI 3.63–11.71) for oral cholate clearance (CA Cl oral). QLFTs independently predicted outcome in models with Ishak fibrosis score, platelet count, and standard laboratory tests. In serial studies, patients with “high risk” results for CA Cl oral or PHM had a nearly 15-fold increase in risk for clinical outcome. Less than 5% of patients with “low risk” QLFTs experienced a clinical outcome. Conclusion: QLFTs independently predict risk for future clinical outcomes. By improving risk assessment, QLFTs could enhance noninvasive monitoring, counseling, and management of patients with chronic hepatitis C. PMID:22030902

  17. Three-dimensional parametric mapping in quantitative micro-CT imaging of post-surgery femoral head-neck samples: preliminary results

    PubMed Central

    Giannotti, Stefano; Bottai, Vanna; Panetta, Daniele; De Paola, Gaia; Tripodi, Maria; Citarelli, Carmine; Dell’Osso, Giacomo; Lazzerini, Ilaria; Salvadori, Piero Antonio; Guido, Giulio

    2015-01-01

    Summary Osteoporosis and the pathologically increased occurrence of fractures are an important public health problem. They may affect patients’ quality of life and even increase mortality in osteoporotic patients, and consequently represent a heavy economic burden for national healthcare systems. The adoption of simple and inexpensive methods for mass screening of the population at risk may be the key to effective prevention. The current clinical standards for diagnosing osteoporosis and assessing the risk of an osteoporotic bone fracture are dual-energy X-ray absorptiometry (DXA) and quantitative computed tomography (QCT) for the measurement of bone mineral density (BMD). Micro-computed tomography (micro-CT) is a tomographic imaging technique with very high resolution that allows direct quantification of cancellous bone microarchitecture. The authors performed micro-CT analysis of the femoral heads harvested from 8 patients who had undergone hip replacement surgery for primary and secondary degenerative disease, to identify possible new morphometric parameters based on the distribution of intra-subject microarchitectural parameters through the creation of parametric images. Our results show that the micro-architectural metrics commonly used may not be sufficient for a realistic assessment of bone microarchitecture of the femoral head in patients with hip osteoarthritis. The innovative micro-CT approach considers the entire femoral head in its physiological shape with all its components, such as cartilage, the cortical layer and the trabecular region. Future use of these methods for a more detailed study of the reaction of trabecular bone to internal fixation or prostheses would be desirable. PMID:26811703

  18. Fully Automated Quantitative Estimation of Volumetric Breast Density from Digital Breast Tomosynthesis Images: Preliminary Results and Comparison with Digital Mammography and MR Imaging.

    PubMed

    Pertuz, Said; McDonald, Elizabeth S; Weinstein, Susan P; Conant, Emily F; Kontos, Despina

    2016-04-01

    Purpose To assess a fully automated method for volumetric breast density (VBD) estimation in digital breast tomosynthesis (DBT) and to compare the findings with those of full-field digital mammography (FFDM) and magnetic resonance (MR) imaging. Materials and Methods Bilateral DBT images, FFDM images, and sagittal breast MR images were retrospectively collected from 68 women who underwent breast cancer screening from October 2011 to September 2012 with institutional review board-approved, HIPAA-compliant protocols. A fully automated computer algorithm was developed for quantitative estimation of VBD from DBT images. FFDM images were processed with U.S. Food and Drug Administration-cleared software, and the MR images were processed with a previously validated automated algorithm to obtain corresponding VBD estimates. Pearson correlation and analysis of variance with Tukey-Kramer post hoc correction were used to compare the multimodality VBD estimates. Results Estimates of VBD from DBT were significantly correlated with FFDM-based and MR imaging-based estimates with r = 0.83 (95% confidence interval [CI]: 0.74, 0.90) and r = 0.88 (95% CI: 0.82, 0.93), respectively (P < .001). The corresponding correlation between FFDM and MR imaging was r = 0.84 (95% CI: 0.76, 0.90). However, statistically significant differences after post hoc correction (α = 0.05) were found among VBD estimates from FFDM (mean ± standard deviation, 11.1% ± 7.0) relative to MR imaging (16.6% ± 11.2) and DBT (19.8% ± 16.2). Differences between VBD estimates from DBT and MR imaging were not significant (P = .26). Conclusion Fully automated VBD estimates from DBT, FFDM, and MR imaging are strongly correlated but show statistically significant differences. Therefore, absolute differences in VBD between FFDM, DBT, and MR imaging should be considered in breast cancer risk assessment. (©) RSNA, 2015. Online supplemental material is available for this article. PMID:26491909
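
    The multimodality comparison described above can be reproduced in outline with standard statistical tools. The sketch below is illustrative only: the VBD values are simulated and the variable names are hypothetical, but it pairs Pearson correlation with a one-way analysis of variance followed by a Tukey-Kramer post hoc test, as the abstract describes.

      # Illustrative sketch (not the study's code): comparing VBD estimates
      # from three modalities on simulated data for 68 women.
      import numpy as np
      from scipy.stats import pearsonr
      from statsmodels.stats.multicomp import pairwise_tukeyhsd

      rng = np.random.default_rng(0)
      vbd_dbt = rng.uniform(5, 60, 68)                    # hypothetical DBT VBD (%)
      vbd_ffdm = 0.6 * vbd_dbt + rng.normal(0, 3, 68)     # hypothetical FFDM VBD (%)
      vbd_mri = 0.85 * vbd_dbt + rng.normal(0, 4, 68)     # hypothetical MR VBD (%)

      # Pairwise Pearson correlations against the DBT estimates
      for name, other in [("FFDM", vbd_ffdm), ("MRI", vbd_mri)]:
          r, p = pearsonr(vbd_dbt, other)
          print(f"DBT vs {name}: r = {r:.2f}, p = {p:.3g}")

      # One-way comparison of mean VBD across modalities with Tukey-Kramer correction
      values = np.concatenate([vbd_dbt, vbd_ffdm, vbd_mri])
      groups = ["DBT"] * 68 + ["FFDM"] * 68 + ["MRI"] * 68
      print(pairwise_tukeyhsd(values, groups, alpha=0.05))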

  19. The results with the addition of metronomic cyclophosphamide to palliative radiotherapy for the treatment of non-small cell lung carcinoma

    PubMed Central

    Joshi, Subhash Chandra; Pandey, Kailash Chandra; Rastogi, Madhup; Sharma, Mukesh; Gupta, Manoj

    2015-01-01

    Background A considerable proportion of non-small cell lung carcinoma (NSCLC) patients are ineligible for radical therapies. Many are too frail to tolerate intravenous palliative chemotherapy. These patients often receive palliative radiotherapy (RT) or supportive care alone. We aimed to compare outcomes with palliative RT alone versus palliative RT plus oral low-dose metronomic cyclophosphamide. Methods Data were mined from 139 eligible NSCLC patient records. Comparisons were made between 65 patients treated from January 2011 to March 2013 with palliative RT (20-30 Gy in 5-10 fractions) alone, and 74 patients treated from April 2013 to December 2014 with palliative RT plus oral metronomic cyclophosphamide (50 mg once daily from the day of initiation of RT until at least the day of disease progression). Response was assessed 1 month post-RT by computed tomography. Patients with complete or partial response were recorded as responders. For the determination of progression-free survival (PFS), progression was declared in case of an increase in the size of lesions, development of new lesions, or development of effusions. The proportions of responders were compared with the Fisher exact test, and the PFS curves were compared with the log-rank test. Results Differences in response rates were statistically insignificant. PFS was significantly higher when metronomic chemotherapy was added to RT in comparison with treatment with RT alone (mean PFS 3.1 vs. 2.55 months; P=0.0501). Further histological sub-group analysis revealed that the enhanced outcomes with the addition of metronomic cyclophosphamide to RT were limited to patients with adenocarcinoma histology (3.5 vs. 2.4 months; P=0.0053), while there was no benefit for those with squamous cell histology (2.6 vs. 2.6 months; P=1). At the dose of oral cyclophosphamide used, there was no recorded instance of any measurable hematological toxicity. Conclusions For pulmonary adenocarcinoma patients, the treatment
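
    The two significance tests named in the Methods can be sketched as follows; the counts and survival times are invented for illustration, and the third-party lifelines package is assumed to be available for the log-rank test (this is not the authors' analysis code).

      # Hedged sketch of a Fisher exact test on response counts and a log-rank
      # test on progression-free survival, using made-up data.
      import numpy as np
      from scipy.stats import fisher_exact
      from lifelines.statistics import logrank_test

      # Hypothetical 2x2 table: responders / non-responders per arm
      table = [[20, 45],   # RT alone
               [28, 46]]   # RT + metronomic cyclophosphamide
      _, p_response = fisher_exact(table)
      print(f"Fisher exact test: p = {p_response:.3f}")

      # Hypothetical PFS times (months); all patients assumed to have progressed
      rng = np.random.default_rng(1)
      pfs_rt = rng.exponential(2.55, 65)
      pfs_combo = rng.exponential(3.1, 74)
      result = logrank_test(pfs_rt, pfs_combo,
                            event_observed_A=np.ones(65),
                            event_observed_B=np.ones(74))
      print(f"Log-rank test: p = {result.p_value:.3f}")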

  20. Food additives

    MedlinePlus

    Food additives are substances that become part of a food product when they are added during the processing or making of that food. "Direct" food additives are often added during processing to: Add nutrients ...

  1. 5-MW Toronto HALT (Hydrate Addition at Low Temperature) pilot plant testing: Test results: Part 1-A. [Hydrate addition at low temperature for the removal of SO₂]

    SciTech Connect

    Babu, M.; College, J.; Forsythe, R.; Kerivan, D.; Lee, K.; Herbert, R.; Kanary, D.

    1988-12-01

    "Hydrate Addition at Low Temperature", or HALT, is a dry calcium-based hydrate injection process for the removal of SO₂ from the flue gases of a sulfur-bearing fuel. In this process the hydrate is pneumatically conveyed and injected into the flue gas stream as a dry particulate. The flue gas is cooled downstream of the hydrate injection location by spraying the gas with a stream of finely atomized water droplets. The water is atomized into a fine spray mist by using air under pressure as the atomizing fluid. The spray nozzles are specially designed. A 5-MW HALT pilot unit was designed, constructed and operated to demonstrate the viability of the HALT process. The unit was designed to use a baghouse for particulate removal. A rented ESP was used for a pre-scheduled test period for comparison with the baghouse. Tests were conducted to cover all of the following variables: humidification, stoichiometric ratio, approach temperature, flue gas velocity, inlet flue gas SO₂ concentration, and inlet flue gas temperature. Solid samples of the hydrates, disposal solids and ESP waste solids were chemically analyzed and are reported. Hydrate samples were analyzed for particle size distribution and surface area. A two-month duration test operating 24 hours/day was successfully concluded. EPA leachate tests were conducted on the solid waste. Corrosion tests were conducted on coupons installed in the baghouse. 79 figs., 5 tabs.

  2. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  3. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  4. Correlation between Herrold’s egg yolk medium culture results and quantitative real-time PCR for Mycobacterium avium subspecies paratuberculosis in pooled fecal and environmental slurry samples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Quantitative real-time PCR (qPCR) testing for Mycobacterium avium subspecies paratuberculosis (MAP) in fecal samples is a rapid alternative to culture on Herrold’s egg yolk medium (HEYM), the traditional ante-mortem reference test for MAP. Although the sensitivity and specificity of these two tests ...

  5. Need for a gender-sensitive human security framework: results of a quantitative study of human security and sexual violence in Djohong District, Cameroon

    PubMed Central

    2014-01-01

    Background Human security shifts traditional concepts of security from interstate conflict and the absence of war to the security of the individual. Broad definitions of human security include livelihoods and food security, health, psychosocial well-being, enjoyment of civil and political rights and freedom from oppression, and personal safety, in addition to absence of conflict. Methods In March 2010, we undertook a population-based health and livelihood study of female refugees from conflict-affected Central African Republic living in Djohong District, Cameroon and their female counterparts within the Cameroonian host community. Embedded within the survey instrument were indicators of human security derived from the Leaning-Arie model that defined three domains of psychosocial stability suggesting individuals and communities are most stable when their core attachments to home, community and the future are intact. Results While the female refugee human security outcomes describe a population successfully assimilated and thriving in their new environments based on these three domains, the ability of human security indicators to predict the presence or absence of lifetime and six-month sexual violence was inadequate. Using receiver operating characteristic (ROC) analysis, the study demonstrates that common human security indicators do not uncover either lifetime or recent prevalence of sexual violence. Conclusions These data suggest that current gender-blind approaches of describing human security are missing serious threats to the safety of one half of the population and that efforts to develop robust human security indicators should include those that specifically measure violence against women. PMID:24829613

  6. Addition of cetuximab to oxaliplatin-based first-line combination chemotherapy for treatment of advanced colorectal cancer: results of the randomised phase 3 MRC COIN trial

    PubMed Central

    Maughan, Timothy S; Adams, Richard A; Smith, Christopher G; Meade, Angela M; Seymour, Matthew T; Wilson, Richard H; Idziaszczyk, Shelley; Harris, Rebecca; Fisher, David; Kenny, Sarah L; Kay, Edward; Mitchell, Jenna K; Madi, Ayman; Jasani, Bharat; James, Michelle D; Bridgewater, John; Kennedy, M John; Claes, Bart; Lambrechts, Diether; Kaplan, Richard; Cheadle, Jeremy P

    2011-01-01

    Summary Background In the Medical Research Council (MRC) COIN trial, the epidermal growth factor receptor (EGFR)-targeted antibody cetuximab was added to standard chemotherapy in first-line treatment of advanced colorectal cancer with the aim of assessing effect on overall survival. Methods In this randomised controlled trial, patients who were fit for but had not received previous chemotherapy for advanced colorectal cancer were randomly assigned to oxaliplatin and fluoropyrimidine chemotherapy (arm A), the same combination plus cetuximab (arm B), or intermittent chemotherapy (arm C). The choice of fluoropyrimidine therapy (capecitabine or infused fluorouracil plus leucovorin) was decided before randomisation. Randomisation was done centrally (via telephone) by the MRC Clinical Trials Unit using minimisation. Treatment allocation was not masked. The comparison of arms A and C is described in a companion paper. Here, we present the comparison of arms A and B, for which the primary outcome was overall survival in patients with KRAS wild-type tumours. Analysis was by intention to treat. Further analyses with respect to NRAS, BRAF, and EGFR status were done. The trial is registered, ISRCTN27286448. Findings 1630 patients were randomly assigned to treatment groups (815 to standard therapy and 815 to addition of cetuximab). Tumour samples from 1316 (81%) patients were used for somatic molecular analyses; 565 (43%) had KRAS mutations. In patients with KRAS wild-type tumours (arm A, n=367; arm B, n=362), overall survival did not differ between treatment groups (median survival 17·9 months [IQR 10·3–29·2] in the control group vs 17·0 months [9·4–30·1] in the cetuximab group; HR 1·04, 95% CI 0·87–1·23, p=0·67). Similarly, there was no effect on progression-free survival (8·6 months [IQR 5·0–12·5] in the control group vs 8·6 months [5·1–13·8] in the cetuximab group; HR 0·96, 0·82–1·12, p=0·60). Overall response rate increased from 57% (n=209

  7. The impacts of tracer selection and corrections for organic matter and particle size on the results of quantitative sediment fingerprinting. A case study from the Nene basin, UK.

    NASA Astrophysics Data System (ADS)

    Pulley, Simon; Ian, Foster; Paula, Antunes

    2014-05-01

    In recent years, sediment fingerprinting methodologies have gained widespread adoption when tracing sediment provenance in geomorphological research. A wide variety of tracers have been employed in the published literature, with corrections for particle size and organic matter applied when the researcher judged them necessary. This paper aims to explore the errors associated with tracer use by a comparison of fingerprinting results obtained using fallout and lithogenic radionuclides, geochemical, and mineral magnetic tracers in a range of environments located in the Nene basin, UK. Specifically, fingerprinting was undertaken on lake, reservoir and floodplain sediment cores, on actively transported suspended sediment and on overbank and channel bed sediment deposits. Tracer groups were investigated both alone and in combination to determine the differences between their sediment provenance predictions and potential causes of these differences. Additionally, simple organic and particle size corrections were applied to determine if they improve the agreement between the tracer group predictions. Key results showed that when fingerprinting contributions from channel banks to actively transported or recently deposited sediments the tracer group predictions varied by 24% on average. These differences could not be clearly attributed to changes in the sediment during erosion or transport. Instead, the most likely cause of differences was the pre-existing spatial variability in tracer concentrations within sediment sources, combined with highly localised erosion. This resulted in the collected sediment source samples not being representative of the actual sediment sources. Average differences in provenance predictions between the different tracer groups in lake, reservoir and floodplain sediment cores were lowest in the reservoir core at 19% and highest in some floodplain cores, with differences in predictions in excess of 50%. In these latter samples organic enrichment of

  8. Significance of quantitative enzyme-linked immunosorbent assay (ELISA) results in evaluation of three ELISAs and Western blot tests for detection of antibodies to human immunodeficiency virus in a high-risk population.

    PubMed Central

    Nishanian, P; Taylor, J M; Korns, E; Detels, R; Saah, A; Fahey, J L

    1987-01-01

    The characteristics of primary (first) tests with three enzyme-linked immunosorbent assay (ELISA) kits for human immunodeficiency virus (HIV) antibody were determined. The three ELISAs were performed on 3,229, 3,130, and 685 specimens from high-risk individuals using the Litton (LT; Litton Bionetics Laboratory Products, Charleston, S.C.), Dupont (DP; E. I. du Pont de Nemours & Co., Inc., Wilmington, Del.), and Genetic Systems (GS; Genetic Systems, Seattle, Wash.) kits, respectively. Evaluation was based on the distribution of quantitative test results (such as optical densities), a comparison with Western blot (WB) results, reproducibility of the tests, and identification of seroconverters. The performances of the GS and the DP kits were good by all four criteria and exceeded that of the LT kit. Primary ELISA-negative results were not always confirmed with repeat ELISA and by WB testing. The largest percentage of these unconfirmed negative test results came from samples with quantitative results in the fifth percentile nearest the cutoff. Thus, supplementary testing was indicated for samples with test results in this borderline negative range. Similarly, borderline positive primary ELISA results that were quantitatively nearest (fifth percentile) the cutoff value were more likely to be antibody negative on supplementary testing than samples with high antibody values. In this study, results of repeated tests by GS ELISA showed the least change from first test results. DP ELISA showed more unconfirmed primary positive test results, and LT ELISA showed more unconfirmed primary negative test results. Designation of a specimen with a single ELISA quantitative level near the cutoff value as positive or negative should be viewed with skepticism. A higher than normal proportion of specimens with high negative optical densities by GS ELISA (fifth percentile nearest the cutoff) and also negative by WB were found to be from individuals in the process of seroconversion. PMID

  9. From provocative narrative scenarios to quantitative biophysical model results: Simulating plausible futures to 2070 in an urbanizing agricultural watershed in Wisconsin, USA

    NASA Astrophysics Data System (ADS)

    Booth, E.; Chen, X.; Motew, M.; Qiu, J.; Zipper, S. C.; Carpenter, S. R.; Kucharik, C. J.; Steven, L. I.

    2015-12-01

    Scenario analysis is a powerful tool for envisioning future social-ecological change and its consequences on human well-being. Scenarios that integrate qualitative storylines and quantitative biophysical models can create a vivid picture of these potential futures but the integration process is not straightforward. We present - using the Yahara Watershed in southern Wisconsin (USA) as a case study - a method for developing quantitative inputs (climate, land use/cover, and land management) to drive a biophysical modeling suite based on four provocative and contrasting narrative scenarios that describe plausible futures of the watershed to 2070. The modeling suite consists of an agroecosystem model (AgroIBIS-VSF), hydrologic routing model (THMB), and empirical lake water quality model and estimates several biophysical indicators to evaluate the watershed system under each scenario. These indicators include water supply, lake flooding, agricultural production, and lake water quality. Climate (daily precipitation and air temperature) for each scenario was determined using statistics from 210 different downscaled future climate projections for two 20-year time periods (2046-2065 and 2081-2100) and modified using a stochastic weather generator to allow flexibility for matching specific climate events within the scenario narratives. Land use/cover for each scenario was determined first by quantifying changes in areal extent every decade for 15 categories at the watershed scale to be consistent with the storyline events and theme. Next, these changes were spatially distributed using a rule-based framework based on land suitability metrics that determine transition probabilities. Finally, agricultural inputs including manure and fertilizer application rates were determined for each scenario based on the prevalence of livestock, water quality regulations, and technological innovations. Each scenario is compared using model inputs (maps and time-series of land use/cover and

  10. BPI-ANCA Provides Additional Clinical Information to Anti-Pseudomonas Serology: Results from a Cohort of 117 Swedish Cystic Fibrosis Patients.

    PubMed

    Lindberg, Ulrika; Carlsson, Malin; Hellmark, Thomas; Segelmark, Mårten

    2015-01-01

    Patients with cystic fibrosis (CF) colonized with Pseudomonas aeruginosa (P. aeruginosa) have a worse prognosis compared with patients who are not. BPI-ANCA is an anti-neutrophil cytoplasmic antibody against BPI (bactericidal/permeability increasing protein) that correlates with P. aeruginosa colonization and adverse long-term prognosis. Whether it provides additional information compared with standard anti-P. aeruginosa serology tests is not known. 117 nontransplanted CF patients at the CF centre in Lund, Sweden, were followed prospectively for ten years. Bacterial colonisation was classified according to the Leeds criteria. IgA BPI-ANCA was compared with assays for antibodies against alkaline protease (AP), Elastase (ELA), and Exotoxin A (ExoA). Lung function and patient outcome (alive, lung transplanted, or dead) were registered. BPI-ANCA showed the highest correlation with lung function impairment, with an r-value of 0.44. Forty-eight of the 117 patients were chronically colonized with P. aeruginosa. Twenty of these patients experienced an adverse outcome. Receiver operating characteristic (ROC) analysis revealed that this could be predicted by BPI-ANCA (AUC = 0.77, p = 0.002) to a better degree than by the serology tests. BPI-ANCA correlates better with lung function impairment and long-term prognosis than anti-P. aeruginosa serology and has a similar ability to identify patients with chronic P. aeruginosa. PMID:26273683

  11. Phosphazene additives

    SciTech Connect

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  12. Liquid crystal quantitative temperature measurement technique

    NASA Astrophysics Data System (ADS)

    Lu, Wei; Wu, Zongshan

    2001-10-01

    Quantitative temperature measurement using wide band thermochromic liquid crystals is an “area” thermal measurement technique. The technique exploits the fact that a liquid crystal changes the color of its reflected light with temperature, and applies an image capture and processing system to calibrate the liquid crystal’s color-temperature characteristic curve. This curve is then used to measure the temperature distribution on an experimental model. In this paper, each part of the liquid crystal quantitative temperature measurement system is illustrated and discussed. The technique is then employed in a long-duration hypersonic wind tunnel, and quantitative results for the heat transfer coefficient along a laminar flat plate are obtained. Some qualitative results are also given. Finally, the experimental results are compared with reference-enthalpy theoretical predictions to assess the accuracy of the thermal measurements.
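
    A minimal sketch of the calibration-and-lookup step is given below; the hue values, temperatures and the cubic fit are assumptions for illustration, not the paper's calibration procedure.

      # Fit a colour(hue)-temperature characteristic curve from calibration data,
      # then use it to convert a measured hue field into a temperature map.
      import numpy as np

      calib_temp = np.array([30.0, 32.0, 34.0, 36.0, 38.0])   # deg C (hypothetical)
      calib_hue = np.array([0.02, 0.18, 0.35, 0.55, 0.78])    # normalised hue (hypothetical)

      # Characteristic curve: temperature as a cubic polynomial in hue
      hue_to_temp = np.poly1d(np.polyfit(calib_hue, calib_temp, 3))

      measured_hue = np.array([[0.10, 0.40],                  # e.g. a 2x2 image patch
                               [0.25, 0.60]])
      temperature_map = hue_to_temp(measured_hue)
      print(temperature_map)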

  13. Quantitative Thinking.

    ERIC Educational Resources Information Center

    DuBridge, Lee A.

    An appeal for more research to determine how to educate children as effectively as possible is made. Mathematics teachers can readily examine the educational problems of today in their classrooms since learning progress in mathematics can easily be measured and evaluated. Since mathematics teachers have learned to think in quantitative terms and…

  14. On Quantitizing

    ERIC Educational Resources Information Center

    Sandelowski, Margarete; Voils, Corrine I.; Knafl, George

    2009-01-01

    "Quantitizing", commonly understood to refer to the numerical translation, transformation, or conversion of qualitative data, has become a staple of mixed methods research. Typically glossed are the foundational assumptions, judgments, and compromises involved in converting disparate data sets into each other and whether such conversions advance…

  15. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  16. Additional correction for energy transfer efficiency calculation in filter-based Förster resonance energy transfer microscopy for more accurate results

    NASA Astrophysics Data System (ADS)

    Sun, Yuansheng; Periasamy, Ammasi

    2010-03-01

    Förster resonance energy transfer (FRET) microscopy is commonly used to monitor protein interactions with filter-based imaging systems, which require spectral bleedthrough (or cross-talk) correction to accurately measure energy transfer efficiency (E). The double-label (donor+acceptor) specimen is excited at the donor wavelength; the acceptor emission provides the uncorrected FRET signal and the donor emission (the donor channel) represents the quenched donor (qD), the basis for the E calculation. Our results indicate this is not the most accurate determination of the quenched donor signal, as it fails to consider the donor spectral bleedthrough (DSBT) signals in the qD used for the E calculation, which our new model addresses, leading to a more accurate E result. This refinement improves E comparisons made with lifetime and spectral FRET imaging microscopy, as shown here using several genetic (FRET standard) constructs in which cerulean and venus fluorescent proteins are tethered by different amino acid linkers.
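
    For orientation, the conventional donor-quenching definition of transfer efficiency, which the bleedthrough-corrected model above refines (the corrected expression itself is not reproduced here), can be written as

      \[ E = 1 - \frac{F_{DA}}{F_{D}}, \]

    where F_DA is the donor emission measured in the presence of the acceptor (the qD channel) and F_D is the donor emission from a donor-only reference.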

  17. An Economic Evaluation of TENS in Addition to Usual Primary Care Management for the Treatment of Tennis Elbow: Results from the TATE Randomized Controlled Trial

    PubMed Central

    Lewis, Martyn; Chesterton, Linda S.; Sim, Julius; Mallen, Christian D.; Hay, Elaine M.; van der Windt, Daniëlle A.

    2015-01-01

    Background The TATE trial was a multicentre pragmatic randomized controlled trial of supplementing primary care management (PCM) - consisting of a GP consultation followed by information and advice on exercises - with transcutaneous electrical nerve stimulation (TENS), to reduce pain intensity in patients with tennis elbow. This paper reports the health economic evaluation. Methods and Findings Adults with a new diagnosis of tennis elbow were recruited from 38 general practices in the UK, and randomly allocated to PCM (n = 120) or PCM plus TENS (n = 121). Outcomes included reduction in pain intensity and quality-adjusted life-years (QALYs) based on the EQ5D and SF6D. Two economic perspectives were evaluated: (i) healthcare - inclusive of NHS and private health costs for the tennis elbow; (ii) societal - healthcare costs plus productivity losses through work absenteeism. Mean outcome and cost differences between the groups were evaluated using a multiply imputed dataset as the base case evaluation, with uncertainty represented in cost-effectiveness planes and through probabilistic cost-effectiveness acceptability curves. Incremental healthcare cost was £33 (95% CI -40, 106) and societal cost £65 (95% CI -307, 176) for PCM plus TENS. Mean differences in outcome were: 0.11 (95% CI -0.13, 0.35) for change in pain (0-10 pain scale); -0.015 (95% CI -0.058, 0.029) for QALY(EQ5D); 0.007 (95% CI -0.022, 0.035) for QALY(SF6D) (higher score differences denote greater benefit for PCM plus TENS). The ICER (incremental cost-effectiveness ratio) for the main evaluation of the mean difference in societal cost (£) relative to the mean difference in pain outcome was -582 (95% CI -8666, 8113). However, incremental ICERs show differences in the cost-effectiveness of additional TENS according to the outcome being evaluated. Conclusion Our findings do not provide evidence for or against the cost-effectiveness of TENS as an adjunct to primary care management of tennis elbow. PMID:26317528
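
    The ICER quoted above is the standard ratio of the between-group difference in mean cost to the difference in mean effect; written generically (the textbook definition, not the trial's bootstrap procedure),

      \[ \mathrm{ICER} = \frac{\bar{C}_{\mathrm{PCM+TENS}} - \bar{C}_{\mathrm{PCM}}}{\bar{E}_{\mathrm{PCM+TENS}} - \bar{E}_{\mathrm{PCM}}}, \]

    so a negative value can arise whenever the cost and effect differences have opposite signs, which is why cost-effectiveness planes and acceptability curves are used to express the surrounding uncertainty.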

  18. The addition of rituximab to fludarabine and cyclophosphamide chemotherapy results in a significant improvement in overall survival in patients with newly diagnosed mantle cell lymphoma: results of a randomized UK National Cancer Research Institute trial

    PubMed Central

    Rule, Simon; Smith, Paul; Johnson, Peter W.M.; Bolam, Simon; Follows, George; Gambell, Joanne; Hillmen, Peter; Jack, Andrew; Johnson, Stephen; Kirkwood, Amy A; Kruger, Anton; Pocock, Christopher; Seymour, John F.; Toncheva, Milena; Walewski, Jan; Linch, David

    2016-01-01

    Mantle cell lymphoma is an incurable and generally aggressive lymphoma that is more common in elderly patients. Whilst a number of different chemotherapeutic regimens are active in this disease, there is no established gold standard therapy. Rituximab has been used widely to good effect in B-cell malignancies but there is no evidence that it improves outcomes when added to chemotherapy in this disease. We performed a randomized, open-label, multicenter study looking at the addition of rituximab to the standard chemotherapy regimen of fludarabine and cyclophosphamide in patients with newly diagnosed mantle cell lymphoma. A total of 370 patients were randomized. With a median follow up of six years, rituximab improved the median progression-free survival from 14.9 to 29.8 months (P<0.001) and overall survival from 37.0 to 44.5 months (P=0.005). This equates to absolute differences of 9.0% and 22.1% for overall and progression-free survival, respectively, at two years. Overall response rates were similar, but complete response rates were significantly higher in the rituximab arm: 52.7% vs. 39.9% (P=0.014). There was no clinically significant additional toxicity observed with the addition of rituximab. Overall, approximately 18% of patients died of non-lymphomatous causes, most commonly infections. The addition of rituximab to fludarabine and cyclophosphamide chemotherapy significantly improves outcomes in patients with mantle cell lymphoma. However, these regimens have significant late toxicity and should be used with caution. This trial has been registered (ISRCTN81133184 and clinicaltrials.gov:00641095) and is supported by the UK National Cancer Research Network. PMID:26611473

  19. The addition of rituximab to fludarabine and cyclophosphamide chemotherapy results in a significant improvement in overall survival in patients with newly diagnosed mantle cell lymphoma: results of a randomized UK National Cancer Research Institute trial.

    PubMed

    Rule, Simon; Smith, Paul; Johnson, Peter W M; Bolam, Simon; Follows, George; Gambell, Joanne; Hillmen, Peter; Jack, Andrew; Johnson, Stephen; Kirkwood, Amy A; Kruger, Anton; Pocock, Christopher; Seymour, John F; Toncheva, Milena; Walewski, Jan; Linch, David

    2016-02-01

    Mantle cell lymphoma is an incurable and generally aggressive lymphoma that is more common in elderly patients. Whilst a number of different chemotherapeutic regimens are active in this disease, there is no established gold standard therapy. Rituximab has been used widely to good effect in B-cell malignancies but there is no evidence that it improves outcomes when added to chemotherapy in this disease. We performed a randomized, open-label, multicenter study looking at the addition of rituximab to the standard chemotherapy regimen of fludarabine and cyclophosphamide in patients with newly diagnosed mantle cell lymphoma. A total of 370 patients were randomized. With a median follow up of six years, rituximab improved the median progression-free survival from 14.9 to 29.8 months (P<0.001) and overall survival from 37.0 to 44.5 months (P=0.005). This equates to absolute differences of 9.0% and 22.1% for overall and progression-free survival, respectively, at two years. Overall response rates were similar, but complete response rates were significantly higher in the rituximab arm: 52.7% vs. 39.9% (P=0.014). There was no clinically significant additional toxicity observed with the addition of rituximab. Overall, approximately 18% of patients died of non-lymphomatous causes, most commonly infections. The addition of rituximab to fludarabine and cyclophosphamide chemotherapy significantly improves outcomes in patients with mantle cell lymphoma. However, these regimens have significant late toxicity and should be used with caution. This trial has been registered (ISRCTN81133184 and clinicaltrials.gov:00641095) and is supported by the UK National Cancer Research Network. PMID:26611473

  20. Structure of transition-metal cluster compounds: Use of an additional orbital resulting from the f, g character of spd bond orbitals*

    PubMed Central

    Pauling, Linus

    1977-01-01

    A general theory of the structure of complexes of the transition metals is developed on the basis of the enneacovalence of the metals and the requirements of the electroneutrality principle. An extra orbital may be provided through the small but not negligible amount of f and g character of spd bond orbitals, and an extra electron or electron pair may be accepted in this orbital for a single metal or a cluster to neutralize the positive electric charge resulting from the partial ionic character of the bonds with ligands, such as the carbonyl group. Examples of cluster compounds of cobalt, ruthenium, rhodium, osmium, and gold are discussed. PMID:16592470

  1. Large changes in the structure of the major histone H1 subtype result in small effects on quantitative traits in legumes.

    PubMed

    Berdnikov, Vladimir A; Bogdanova, Vera S; Gorel, Faina L; Kosterin, Oleg E; Trusov, Yurii A

    2003-10-01

    Electrophoretic analysis of the most abundant subtype of histone H1 (H1-1) of 301 accessions of grasspea (Lathyrus sativus) and 575 accessions of lentil (Lens culinaris) revealed allelic variants which most probably arose due to recent mutations. In each species, a single heterozygote for a mutation was taken for construction of isogenic lines carrying different H1-1 variants. Sequencing of the alleles encoding H1-1 in lentil, grasspea, pea and Lathyrus aphaca showed the presence of an extended region in the C-terminal tail which we termed the 'regular zone' (RZ). It consists of 14 six-amino-acid units, of which 12 (pea and Lathyrus species) or 13 (lentil) are represented by an AKPAAK sequence. The structure of the hypervariable unit 8 is species-specific. At the DNA level most AKPAAK units differ in the third codon positions, implying the action of natural selection preserving the RZ organization. In lentil, the fast variant lost two units (including unit 8), while one AKPAAK repeat of the slow variant is transformed into an anomalous SMPAAK. The mutant variant of the grasspea H1-1 differs from the standard one by duplication of an 11-amino-acid segment in the N-terminal tail. The isogenic lines of lentil and grasspea were compared for a number of quantitative traits, some of which showed small (1-8%) but significant differences. PMID:14620956

  2. Effect of Using Local Intrawound Vancomycin Powder in Addition to Intravenous Antibiotics in Posterior Lumbar Surgery: Midterm Result in a Single-Center Study

    PubMed Central

    Lee, Gun-Ill; Chun, Hyoung-Joon; Choi, Kyu-Sun

    2016-01-01

    Objective We conducted this study to report the efficacy of local application of vancomycin powder against surgical site infection (SSI) in posterior lumbar surgical procedures and to identify risk factors for SSI. Methods From February 2013 to December 2013, SSI rates were assessed for 275 posterior lumbar surgeries in which intrawound vancomycin powder was used in combination with intravenous antibiotics (Vanco group). These were compared with 296 posterior lumbar procedures performed with intravenous antibiotics only from February 2012 to December 2012 (non-Vanco group), and various infection rates were assessed. Univariate and multivariate analyses were performed to identify risk factors for infection in the Vanco group. Results A statistically significant reduction in SSI was confirmed in the Vanco group (5.5%) compared with the non-Vanco group (10.5%) (p=0.028). Mean follow-up period was 8 months. The rate of acute staphylococcal SSIs decreased significantly to 4%, compared with 7.4% in the non-Vanco group (p=0.041). Deep staphylococcal infections decreased to 2 compared with 8, and deep methicillin-resistant Staphylococcus aureus infections also decreased to 1 compared with 5 in the non-Vanco group. No systemic complication was observed. Statistically significant risk factors associated with SSI were diabetes mellitus, history of cardiovascular disease, length of hospital stay, number of instrumented levels and history of previous surgery. Conclusion In this series of 571 patients, intrawound vancomycin powder usage resulted in a significant decrease in SSI rates in our posterior lumbar surgical procedures. Patients at high risk of infection are strongly recommended as candidates for this technique. PMID:27437012

  3. Does early intensive multifactorial therapy reduce modelled cardiovascular risk in individuals with screen-detected diabetes? Results from the ADDITION-Europe cluster randomized trial

    PubMed Central

    Black, J A; Sharp, S J; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2014-01-01

    Aims Little is known about the long-term effects of intensive multifactorial treatment early in the diabetes disease trajectory. In the absence of long-term data on hard outcomes, we described change in 10-year modelled cardiovascular risk in the 5 years following diagnosis, and quantified the impact of intensive treatment on 10-year modelled cardiovascular risk at 5 years. Methods In a pragmatic, cluster-randomized, parallel-group trial in Denmark, the Netherlands and the UK, 3057 people with screen-detected Type 2 diabetes were randomized by general practice to receive (1) routine care of diabetes according to national guidelines (1379 patients) or (2) intensive multifactorial target-driven management (1678 patients). Ten-year modelled cardiovascular disease risk was calculated at baseline and 5 years using the UK Prospective Diabetes Study Risk Engine (version 3β). Results Among 2101 individuals with complete data at follow up (73.4%), 10-year modelled cardiovascular disease risk was 27.3% (sd 13.9) at baseline and 21.3% (sd 13.8) at 5-year follow-up (intensive treatment group difference –6.9, sd 9.0; routine care group difference –5.0, sd 12.2). Modelled 10-year cardiovascular disease risk was lower in the intensive treatment group compared with the routine care group at 5 years, after adjustment for baseline cardiovascular disease risk and clustering (–2.0; 95% CI –3.1 to –0.9). Conclusions Despite increasing age and diabetes duration, there was a decline in modelled cardiovascular disease risk in the 5 years following diagnosis. Compared with routine care, 10-year modelled cardiovascular disease risk was lower in the intensive treatment group at 5 years. Our results suggest that patients benefit from intensive treatment early in the diabetes disease trajectory, where the rate of cardiovascular disease risk progression may be slowed. PMID:24533664

  4. Addition of GM-CSF to a peptide/KLH vaccine results in increased frequencies of CXCR3-expressing KLH-specific T cells.

    PubMed

    Na, Il-Kang; Keilholz, Ulrich; Letsch, Anne; Bauer, Sandra; Asemissen, Anne Marie; Nagorsen, Dirk; Thiel, Eckhard; Scheibenbogen, Carmen

    2007-03-01

    T-cell trafficking is determined by expression patterns of chemokine receptors. The chemokine receptor CXCR3 is expressed on a subpopulation of type 1 T cells and plays an important role for migration of T cells into inflamed and tumor tissues. Here, we studied the chemokine receptor expression on specific T cells generated against the neoantigen keyhole limpet hemocyanin (KLH) in patients who had been immunized in the context of a tumor peptide vaccination trial with or without the adjuvant granulocyte-macrophage colony-stimulating factor (GM-CSF). In patients immunized in the presence of GM-CSF the fraction of CXCR3(+) KLH-specific T cells was significantly higher than in patients immunized in the absence of GM-CSF (median 45 vs. 20%, P = 0.001). In contrast, the chemokine receptor CCR4, associated with migration to the skin was found in both cohorts on less than 10% of KLH-specific T cells. These results show that CXCR3 expression on vaccine-induced T cells can be modulated by modifying the local vaccine milieu. PMID:16850346

  5. Technetium-99m labelled LDL as a tracer for quantitative LDL scintigraphy. II. In vivo validation, LDL receptor-dependent and unspecific hepatic uptake and scintigraphic results.

    PubMed

    Leitha, T; Staudenherz, A; Gmeiner, B; Hermann, M; Hüttinger, M; Dudczak, R

    1993-08-01

    The purpose of this study was to determine whether the hepatic uptake of dialysed technetium-99m labelled low-density lipoprotein (99mTc-LDL) reflects the hepatic LDL receptor activity and to what extent the non-LDL receptor-dependent 99mTc-LDL uptake by non-parenchymal cells relates to the diagnostic utility of quantitative 99mTc-LDL scintigraphy of the liver. New Zealand White rabbits and Watanabe Heritable Hyperlipidaemic rabbits, which were sacrificed 24 h after simultaneous injection of 99mTc-LDL and iodine-125 labelled LDL, were clearly discriminated by their hepatic 99mTc-LDL uptake according to their genetically different hepatic LDL receptor activity. Yet the hepatic 99mTc-LDL uptake exceeded the 125I-LDL uptake in all animals. The different hepatic uptake of the tracers was elucidated in the isolated perfused rat liver and was due to rapid intracellular degradation and the release of low molecular catabolites of 125I-LDL. In contrast, 99mTc activity was trapped in the liver. Analysis of biliary 99mTc activity provided evidence for the excretion of 99mTc-labelled apolipoprotein B. The amount of biliary excreted protein-bound 99mTc was linked to total hepatic 99mTc-LDL uptake and presumably reflected LDL receptor-mediated apolipoprotein excretion. Collagenase liver perfusion in Sprague-Dawley rats 90 min following simultaneous injection of 99mTc- and 125I-LDL and subsequent cell separation by gradient centrifugation revealed that 99mTc-LDL and 125I-LDL had a comparably low uptake into non-parenchymal cells; thus its contribution can be neglected for scintigraphic purposes. Planar scintigraphy was performed in New Zealand White and Watanabe Heritable Hyperlipidaemic rabbits.(ABSTRACT TRUNCATED AT 250 WORDS) PMID:8404953

  6. Quantitative photoacoustic tomography

    PubMed Central

    Yuan, Zhen; Jiang, Huabei

    2009-01-01

    In this paper, several algorithms that allow for quantitative photoacoustic reconstruction of tissue optical, acoustic and physiological properties are described in a finite-element method based framework. These quantitative reconstruction algorithms are compared, and the merits and limitations associated with these methods are discussed. In addition, a multispectral approach is presented for concurrent reconstructions of multiple parameters including deoxyhaemoglobin, oxyhaemoglobin and water concentrations as well as acoustic speed. Simulation and in vivo experiments are used to demonstrate the effectiveness of the reconstruction algorithms presented. PMID:19581254

  7. Soil Greenhouse Gas Fluxes in a Pacific Northwest Douglas-Fir Forest: Results from a Soil Fertilization and Biochar Addition Experiment

    NASA Astrophysics Data System (ADS)

    Hawthorne, I.; Johnson, M. S.; Jassal, R. S.; Black, T. A.

    2013-12-01

    evacuated 12-mL vials and analyzed by gas chromatography. Chamber headspace GHG mixing ratios vs. time data were fit to linear and exponential models in R (Version 2.14.0) and fluxes were calculated. Results showed high variability in GHG fluxes over time in all treatments. Higher CO₂ emissions were observed during early summer (119 μg CO₂ m⁻² s⁻¹ in the control plots), decreasing with drought (19 μg CO₂ m⁻² s⁻¹ in the control plots). CH₄ uptake by soil increased during summer months from -0.004 μg CH₄ m⁻² s⁻¹ to -0.089 μg CH₄ m⁻² s⁻¹ in the control plots, in response to drying conditions in the upper soil profile. N₂O was both consumed and emitted in all treatments, with fluxes ranging from -0.0009 to 0.0019 μg N₂O m⁻² s⁻¹ in the control plots. Analysis of variance indicated that there were significant differences in GHG fluxes between treatments over time. We also investigated the potential effects of large-volume headspace removal, and of H₂O vapour saturation leading to a dilution effect, by using a closed-path infra-red gas analyzer with an inline humidity sensor.
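
    A hedged sketch of the flux calculation from closed-chamber data is shown below; the chamber volume, footprint area and sample concentrations are assumptions, and only the linear model is illustrated (the study also fitted exponential models in R).

      # Fit mixing ratio vs. time since chamber closure, then convert the slope
      # to a mass flux with the ideal gas law. All values are hypothetical.
      import numpy as np

      t = np.array([0.0, 300.0, 600.0, 900.0, 1200.0])           # s since closure
      co2_ppm = np.array([400.0, 412.0, 423.5, 436.0, 447.0])    # umol mol^-1

      slope = np.polyfit(t, co2_ppm, 1)[0]     # ppm s^-1 from the linear model

      P, T, R = 101325.0, 293.15, 8.314        # Pa, K, J mol^-1 K^-1
      V, A = 0.015, 0.07                       # headspace volume (m^3), footprint area (m^2), assumed
      M_CO2 = 44.01                            # g mol^-1

      mol_air = P * V / (R * T)                            # mol of air in the headspace
      flux = slope * 1e-6 * mol_air * M_CO2 * 1e6 / A      # ug CO2 m^-2 s^-1
      print(f"CO2 flux ~ {flux:.1f} ug m^-2 s^-1")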

  8. POPULATION DYNAMICS OF COTTON RATS ACROSS A LANDSCAPE MANIPULATED BY NITROGEN ADDITIONS AND ENCLOSURE FENCING

    EPA Science Inventory

    Nitrogen additions in grasslands have produced qualitative and quantitative changes in vegetation resulting in an increase in biomass and decrease in plant species diversity. As with plants, we theorize that animal communities will decrease in species richness and become dominat...

  9. The Role of Introductory Geosciences in Students' Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Manduca, C.; Baer, E. M.

    2006-12-01

    Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are truthfully rife with data, making introductory geoscience topics excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty to increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, trouble-shooting advice and examples of quantitative activities. The goal in this section is to provide faculty with material that can be readily incorporated

  10. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey

    PubMed Central

    Borba, Christina P.C.; Ng, Lauren C.; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L.; Parnarouskis, Lindsey; Gray, Deborah A.; Carney, Julia R.; Domínguez, Silvia; Wang, Edward K.S.; Boxill, Ryan; Song, Suzan J.; Henderson, David C.

    2016-01-01

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5–22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy. PMID:26807147

  11. Quantitation of camptothecin and related compounds.

    PubMed

    Palumbo, M; Sissi, C; Gatto, B; Moro, S; Zagotto, G

    2001-11-25

    Camptothecin and congeners represent a clinically very useful class of anticancer agents. Proper identification and quantitation of the original compounds and their metabolites in biological fluids is fundamental to assess drug metabolism and distribution in animals and in man. In this paper we will review the recent literature available on the methods used for separation and quantitative determination of the camptothecin family of drugs. Complications arise from the fact that they are chemically labile, and the pharmacologically active lactone structure can undergo ring opening at physiological conditions. In addition, a number of metabolic changes usually occur, producing a variety of active or inactive metabolites. Hence, the conditions of extraction, pre-treatment and quantitative analysis are to be carefully calibrated in order to provide meaningful results. PMID:11817024

  12. Incorporating patient preferences into drug development and regulatory decision making: Results from a quantitative pilot study with cancer patients, carers, and regulators.

    PubMed

    Postmus, D; Mavris, M; Hillege, H L; Salmonson, T; Ryll, B; Plate, A; Moulon, I; Eichler, H-G; Bere, N; Pignatti, F

    2016-05-01

    Currently, patient preference studies are not required to be included in marketing authorization applications to regulatory authorities, and the role and methodology for such studies have not been agreed upon. The European Medicines Agency (EMA) conducted a pilot study to gain experience on how the collection of individual preferences can inform the regulatory review. Using a short online questionnaire, ordinal statements regarding the desirability of different outcomes in the treatment of advanced cancer were elicited from 139 participants (98 regulators, 29 patients or carers, and 12 healthcare professionals). This was followed by face-to-face meetings to gather feedback and validate the individual responses. In this article we summarize the EMA pilot study and discuss the role of patient preference studies within the regulatory review. Based on the results, we conclude that our preference elicitation instrument was easy to implement and sufficiently precise to learn about the distribution of the participants' individual preferences. PMID:26715217

  13. Early perfusion changes in patients with recurrent high-grade brain tumor treated with Bevacizumab: preliminary results by a quantitative evaluation

    PubMed Central

    2012-01-01

    Background To determine whether early monitoring of the effects of bevacizumab in patients with recurrent high-grade gliomas, by a Perfusion Computed Tomography (PCT), may be a predictor of the response to treatment assessed through conventional MRI follow-up. Methods Sixteen patients were enrolled in the present study. For each patient, two PCT examinations, before and after the first dose of bevacizumab, were acquired. Areas of abnormal Cerebral Blood Volume (CBV) were manually defined on the CBV maps, using co-registered T1-weighted images, acquired before treatment, as a guide to the tumor location. Different perfusion metrics were derived from the histogram analysis of the normalized CBV (nCBV) maps; both hyper- and hypo-perfused sub-volumes were quantified in the lesion, including tumor necrosis. A two-tailed Wilcoxon test was used to establish the significance of changes in the different perfusion metrics, observed at baseline and during treatment. The relationships between changes in perfusion and morphological MRI modifications at first follow-up were investigated. Results Significant reductions in mean and median nCBV were detected throughout the entire patient population, after only a single dose of bevacizumab. The nCBV histogram modifications indicated the normalization effect of bevacizumab on the tumor abnormal vasculature. An improvement in hypoxia after a single dose of bevacizumab was predictive of a greater reduction in T1-weighted contrast-enhanced volumes at first follow-up. Conclusions These preliminary results show that a quantification of changes in necrotic intra-tumoral regions could be proposed as a potential imaging biomarker of tumor response to anti-VEGF therapies. PMID:22494770
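
    The histogram-based workflow in this record lends itself to a compact illustration. The sketch below is not the authors' code: it computes simple nCBV histogram metrics and a paired two-tailed Wilcoxon test with NumPy/SciPy, and the sub-volume thresholds and per-patient values are invented placeholders.

        # Minimal sketch (not the study's code): histogram metrics from normalized CBV
        # maps and a paired two-tailed Wilcoxon test on per-patient summary values.
        import numpy as np
        from scipy.stats import wilcoxon

        def ncbv_metrics(ncbv, hyper_thresh=1.5, hypo_thresh=0.5):
            """Summarize an nCBV map restricted to the tumor ROI (1-D array of voxel values)."""
            return {
                "mean": float(np.mean(ncbv)),
                "median": float(np.median(ncbv)),
                "hyper_fraction": float(np.mean(ncbv > hyper_thresh)),  # hyper-perfused sub-volume
                "hypo_fraction": float(np.mean(ncbv < hypo_thresh)),    # hypo-perfused sub-volume
            }

        # Paired comparison of one metric at baseline vs. after the first dose (illustrative values)
        baseline = np.array([1.9, 2.3, 1.7, 2.1, 2.5])
        post_dose = np.array([1.4, 1.8, 1.6, 1.5, 2.0])
        stat, p = wilcoxon(baseline, post_dose)  # two-tailed by default
        print(f"Wilcoxon statistic={stat:.2f}, p={p:.3f}")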

  14. Drugs, Women and Violence in the Americas: U.S. Quantitative Results of a Multi-Centric Pilot Project (Phase 2)

    PubMed Central

    González-Guarda, Rosa María; Peragallo, Nilda; Lynch, Ami; Nemes, Susanna

    2011-01-01

    Objectives To explore the collective and individual experiences that Latin American females in the U.S. have with substance abuse, violence and risky sexual behaviors. Methods This study was conducted in two phases from July 2006 to June 2007 in south Florida. This paper covers Phase 2. In Phase 2, questionnaires were provided to women to test whether there is a relationship between demographics, acculturation, depression, self-esteem and substance use/abuse; whether there is a relationship between demographics, acculturation, depression, self-esteem and violence exposure and victimization; whether there is a relationship between demographics, acculturation, depression, self-esteem, HIV knowledge and STD and HIV/AIDS risks among respondents; and whether there is a relationship between substance abuse, violence victimization and HIV/AIDS risks among respondents. Results Participants reported high rates of alcohol and drug abuse among their current or most recent partners. This is a major concern because partner alcohol use and drug use were related to partner physical, sexual and psychological abuse. Only two factors were associated with lifetime drug use: income and acculturation. Over half of the participants reported being victims of at least one form of abuse during childhood and adulthood. A substantial component of abuse reported during adulthood was perpetrated by a current or recent intimate partner. Conclusions The results from this study suggest that substance abuse, violence and HIV should be addressed in an integrative and comprehensive manner. Recommendations for the development of policies, programs and services addressing substance abuse, violence and risk for HIV among Latinos are provided. PMID:22504304

  15. The CheMin XRD on the Mars Science Laboratory Rover Curiosity: Construction, Operation, and Quantitative Mineralogical Results from the Surface of Mars

    NASA Technical Reports Server (NTRS)

    Blake, David F.

    2015-01-01

    The Mars Science Laboratory mission was launched from Cape Canaveral, Florida on Nov. 26, 2011 and landed in Gale crater, Mars on Aug. 6, 2012. MSL's mission is to identify and characterize ancient "habitable" environments on Mars. MSL's precision landing system placed the Curiosity rover within 2 km of the center of its 20 × 6 km landing ellipse, next to Gale's central mound, a 5,000 meter high pile of laminated sediment which may contain 1 billion years of Mars history. Curiosity carries with it a full suite of analytical instruments, including the CheMin X-ray diffractometer, the first XRD flown in space. CheMin is essentially a transmission X-ray pinhole camera. A fine-focus Co source and collimator transmits a 50 µm beam through a powdered sample held between X-ray transparent plastic windows. The sample holder is shaken by a piezoelectric actuator such that the powder flows like a liquid, each grain passing in random orientation through the beam over time. Forward-diffracted and fluoresced X-ray photons from the sample are detected by an X-ray sensitive Charge Coupled Device (CCD) operated in single photon counting mode. When operated in this way, both the x,y position and the energy of each photon are detected. The resulting energy-selected Co Kα Debye-Scherrer pattern is used to determine the identities and amounts of minerals present via Rietveld refinement, and a histogram of all X-ray events constitutes an X-ray fluorescence analysis of the sample. The key role that definitive mineralogy plays in understanding the Martian surface is a consequence of the fact that minerals are thermodynamic phases, having known and specific ranges of temperature, pressure and composition within which they are stable. More than simple compositional analysis, definitive mineralogical analysis can provide information about pressure/temperature conditions of formation, past climate, water activity and the like. Definitive mineralogical analyses are necessary to establish

  16. Consumption of Antimicrobials in Pigs, Veal Calves, and Broilers in The Netherlands: Quantitative Results of Nationwide Collection of Data in 2011

    PubMed Central

    Bos, Marian E. H.; Taverne, Femke J.; van Geijlswijk, Ingeborg M.; Mouton, Johan W.; Mevius, Dik J.; Heederik, Dick J. J.

    2013-01-01

    In 2011, Dutch animal production sectors started recording veterinary antimicrobial consumption. These data are used by the Netherlands Veterinary Medicines Authority to create transparency in and define benchmark indicators for veterinary consumption of antimicrobials. This paper presents the results of sector wide consumption of antimicrobials, in the form of prescriptions or deliveries, for all pig, veal calf, and broiler farms. Data were used to calculate animal defined daily dosages per year (ADDD/Y) per pig or veal calf farm. For broiler farms, number of animal treatment days per year was calculated. Furthermore, data were used to calculate the consumption of specific antimicrobial classes per administration route per pig or veal calf farm. The distribution of antimicrobial consumption per farm varied greatly within and between farm categories. All categories, except for rosé starter farms, showed a highly right skewed distribution with a long tail. Median ADDD/Y values varied from 1.2 ADDD/Y for rosé finisher farms to 83.2 ADDD/Y for rosé starter farms, with 28.6 ADDD/Y for white veal calf farms. Median consumption in pig farms was 9.3 ADDD/Y for production pig farms and 3.0 ADDD/Y for slaughter pig farms. Median consumption in broiler farms was 20.9 ATD/Y. Regarding specific antimicrobial classes, fluoroquinolones were mainly used on veal calf farms, but in low quantities: P75 range was 0 – 0.99 ADDD/Y, and 0 – 0.04 ADDD/Y in pig farms. The P75 range for 3rd/4th-generation cephalosporins was 0 – 0.07 ADDD/Y for veal calf farms, and 0 – 0.1 ADDD/Y for pig farms. The insights obtained from these results, and the full transparency obtained by monitoring antimicrobial consumption per farm, will help reduce antimicrobial consumption and endorse antimicrobial stewardship. The wide and skewed distribution in consumption has important practical and methodological implications for benchmarking, surveillance and future analysis of trends. PMID:24204857

  17. Consumption of antimicrobials in pigs, veal calves, and broilers in the Netherlands: quantitative results of nationwide collection of data in 2011.

    PubMed

    Bos, Marian E H; Taverne, Femke J; van Geijlswijk, Ingeborg M; Mouton, Johan W; Mevius, Dik J; Heederik, Dick J J

    2013-01-01

    In 2011, Dutch animal production sectors started recording veterinary antimicrobial consumption. These data are used by the Netherlands Veterinary Medicines Authority to create transparency in and define benchmark indicators for veterinary consumption of antimicrobials. This paper presents the results of sector wide consumption of antimicrobials, in the form of prescriptions or deliveries, for all pig, veal calf, and broiler farms. Data were used to calculate animal defined daily dosages per year (ADDD/Y) per pig or veal calf farm. For broiler farms, number of animal treatment days per year was calculated. Furthermore, data were used to calculate the consumption of specific antimicrobial classes per administration route per pig or veal calf farm. The distribution of antimicrobial consumption per farm varied greatly within and between farm categories. All categories, except for rosé starter farms, showed a highly right skewed distribution with a long tail. Median ADDD/Y values varied from 1.2 ADDD/Y for rosé finisher farms to 83.2 ADDD/Y for rosé starter farms, with 28.6 ADDD/Y for white veal calf farms. Median consumption in pig farms was 9.3 ADDD/Y for production pig farms and 3.0 ADDD/Y for slaughter pig farms. Median consumption in broiler farms was 20.9 ATD/Y. Regarding specific antimicrobial classes, fluoroquinolones were mainly used on veal calf farms, but in low quantities: P75 range was 0 - 0.99 ADDD/Y, and 0 - 0.04 ADDD/Y in pig farms. The P75 range for 3rd/4th-generation cephalosporins was 0 - 0.07 ADDD/Y for veal calf farms, and 0 - 0.1 ADDD/Y for pig farms. The insights obtained from these results, and the full transparency obtained by monitoring antimicrobial consumption per farm, will help reduce antimicrobial consumption and endorse antimicrobial stewardship. The wide and skewed distribution in consumption has important practical and methodological implications for benchmarking, surveillance and future analysis of trends. PMID:24204857
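
    As a rough illustration of how a per-farm exposure indicator of this kind can be computed, the sketch below divides the treatable kg-days delivered in a year by the average kg of animal present. The exact ADDD/Y definition used by the Dutch authority differs in detail and is not reproduced here; the function name and all numbers are invented.

        def addd_per_year(mass_delivered_mg, dose_mg_per_kg_day, avg_kg_present):
            """Illustrative 'defined daily dosages per animal-year' for one product on one farm.

            mass_delivered_mg  -- total active substance delivered over the year (mg)
            dose_mg_per_kg_day -- assumed defined daily dose of that substance (mg per kg per day)
            avg_kg_present     -- average live weight present on the farm (kg)
            """
            treatable_kg_days = mass_delivered_mg / dose_mg_per_kg_day
            return treatable_kg_days / avg_kg_present

        # Example with invented numbers: 2.5 kg of product dosed at 20 mg/kg/day,
        # on a farm with an average of 50,000 kg of animal present.
        print(round(addd_per_year(2.5e6, 20.0, 50_000.0), 1))  # -> 2.5 treatment days per animal-year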

  18. Hand-to-mouth contacts result in greater ingestion of feces than dietary water consumption in Tanzania: a quantitative fecal exposure assessment model.

    PubMed

    Mattioli, Mia Catharine M; Davis, Jennifer; Boehm, Alexandria B

    2015-02-01

    Diarrheal diseases kill 1800 children under the age of five each day, and nearly half of these deaths occur in sub-Saharan Africa. Contaminated drinking water and hands are two important environmental transmission routes of diarrhea-causing pathogens to young children in low-income countries. The objective of this research is to evaluate the relative contribution of these two major exposure pathways in a low-income country setting. A Monte Carlo simulation was used to model the amount of human feces ingested by children under five years old from exposure via hand-to-mouth contacts and stored drinking water ingestion in Bagamoyo, Tanzania. Child-specific exposure data were obtained from the USEPA 2011 Exposure Factors Handbook, and fecal contamination was estimated using hand rinse and stored water fecal indicator bacteria concentrations from over 1200 Tanzanian households. The model outcome is a distribution of a child's daily dose of feces via each exposure route. The model results show that Tanzanian children ingest a significantly greater amount of feces each day from hand-to-mouth contacts than from drinking water, which may help elucidate why interventions focused on water without also addressing hygiene often see little to no effect on reported incidence of diarrhea. PMID:25559008
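
    A minimal sketch of the kind of Monte Carlo comparison described, not the published model: each route's daily dose of feces is the product of a contact or ingestion rate and a contamination level drawn from assumed lognormal distributions. All parameter values below are placeholders rather than the study's Tanzanian inputs.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 100_000  # simulated child-days

        # Hand-to-mouth route: contacts/day x fecal load transferred per contact (placeholder distributions)
        contacts_per_day = rng.lognormal(mean=2.5, sigma=0.5, size=N)
        hand_feces_mg = rng.lognormal(mean=-4.0, sigma=1.0, size=N)      # mg feces per contact
        dose_hands_mg = contacts_per_day * hand_feces_mg

        # Drinking-water route: volume ingested x fecal contamination of stored water (placeholders)
        water_L = rng.lognormal(mean=-0.5, sigma=0.4, size=N)            # litres per day
        water_feces_mg_per_L = rng.lognormal(mean=-5.0, sigma=1.2, size=N)
        dose_water_mg = water_L * water_feces_mg_per_L

        print("median hand-to-mouth dose (mg/day):", np.median(dose_hands_mg))
        print("median drinking-water dose (mg/day):", np.median(dose_water_mg))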

  19. Can Patient Safety Incident Reports Be Used to Compare Hospital Safety? Results from a Quantitative Analysis of the English National Reporting and Learning System Data

    PubMed Central

    2015-01-01

    claims per bed were significantly negatively associated with incident reports. Patient satisfaction and mortality outcomes were not significantly associated with reporting rates. Staff survey responses revealed that keeping reports confidential, keeping staff informed about incidents and giving feedback on safety initiatives increased reporting rates [r = 0.26 (p<0.01), r = 0.17 (p = 0.04), r = 0.23 (p = 0.01), r = 0.20 (p = 0.02)]. Conclusion The NRLS is the largest patient safety reporting system in the world. This study did not find that many hospital characteristics significantly influence the overall reporting rate. There was no association between size of hospital, number of staff, mortality outcomes or patient satisfaction outcomes and incident reporting rate. The study did show that hospitals where staff reported more incidents had fewer litigation claims, and that when clinician staffing was increased fewer incidents reporting patient harm were reported, whilst near-miss reports remained the same. Certain specialties report more near misses than others, and doctors report more harm incidents than near misses. Staff survey results showed that open environments and reduced fear of punitive response increase incident reporting. We suggest that reporting rates should not be used to assess hospital safety. Different healthcare professionals focus on different types of safety incidents, and focusing on these areas whilst creating a responsive, confidential learning environment will increase staff engagement with error disclosure. PMID:26650823

  20. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. R.; St. Clair, T. L.; Burks, H. D.; Stoakley, D. M.

    1987-01-01

    A method has been found for enhancing the melt flow of thermoplastic polyimides during processing. A high molecular weight 422 copoly(amic acid) or copolyimide was fused with approximately 0.05 to 5 pct by weight of a low molecular weight amic acid or imide additive, and this melt was studied by capillary rheometry. Excellent flow and improved composite properties on graphite resulted from the addition of a PMDA-aniline additive to LARC-TPI. Solution viscosity studies imply that amic acid additives temporarily lower molecular weight and, hence, enlarge the processing window. Thus, compositions containing the additive have a lower melt viscosity for a longer time than those unmodified.

  1. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  2. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  3. Berkeley Quantitative Genome Browser

    SciTech Connect

    Hechmer, Aaron

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and has been built to run on Linux, OSX and MS Windows operating systems.

  4. Berkeley Quantitative Genome Browser

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and has been built to run on Linux, OSX and MS Windows operating systems.
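
    The data handling described above, position-keyed quantitative tracks that can be filtered or mathematically transformed once loaded, can be illustrated with a short sketch. This is not BBrowse code; the SGR-style tab-delimited layout and the file name signal.sgr are assumptions for the example.

        import csv
        import math

        def read_sgr(path):
            """Yield (sequence, position, value) records from a tab-delimited SGR-style file."""
            with open(path) as fh:
                for seq, pos, value in csv.reader(fh, delimiter="\t"):
                    yield seq, int(pos), float(value)

        def transform(records, fn=math.log1p, min_value=0.0):
            """Drop records below a threshold and apply a mathematical transformation to the rest."""
            for seq, pos, value in records:
                if value >= min_value:
                    yield seq, pos, fn(value)

        # Example (assumed file name): log-transform all signal of at least 5.0
        # for seq, pos, v in transform(read_sgr("signal.sgr"), min_value=5.0):
        #     print(seq, pos, round(v, 3))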

  5. Multiplicative and additive Adelson's snake illusions.

    PubMed

    Petrini, Karin

    2008-01-01

    Two different versions of Adelson's snake lightness illusion are quantitatively investigated. In one experiment an additive version of the illusion is investigated by varying the additive component of the atmosphere transfer function (ATF) introduced by Adelson [2000, in The New Cognitive Neuroscience Ed. M Gazzaniga (Cambridge, MA: MIT Press) pp 339-351]. In the other, a multiplicative version of the illusion is examined by varying the multiplicative component of the ATF. In both experiments four observers matched the targets' lightness of the snake patterns with Munsell samples. Increasing the additive or the multiplicative component elicited an approximately equal increase in the magnitude of the lightness illusion. The results show that both components, in the absence of other kinds of information, can be used as heuristics by our visual system to anchor luminance of the object when converting it into lightness. PMID:19189728

  6. Unraveling Additive from Nonadditive Effects Using Genomic Relationship Matrices

    PubMed Central

    Muñoz, Patricio R.; Resende, Marcio F. R.; Gezan, Salvador A.; Resende, Marcos Deon Vilela; de los Campos, Gustavo; Kirst, Matias; Huber, Dudley; Peter, Gary F.

    2014-01-01

    The application of quantitative genetics in plant and animal breeding has largely focused on additive models, which may also capture dominance and epistatic effects. Partitioning genetic variance into its additive and nonadditive components using pedigree-based models (pedigree-based best linear unbiased prediction, P-BLUP) is difficult with most commonly available family structures. However, the availability of dense panels of molecular markers makes possible the use of additive- and dominance-realized genomic relationships for the estimation of variance components and the prediction of genetic values (G-BLUP). We evaluated height data from a multifamily population of the tree species Pinus taeda with a systematic series of models accounting for additive, dominance, and first-order epistatic interactions (additive by additive, dominance by dominance, and additive by dominance), using either pedigree- or marker-based information. We show that, compared with the pedigree, use of realized genomic relationships in marker-based models yields a substantially more precise separation of additive and nonadditive components of genetic variance. We conclude that the marker-based relationship matrices in a model including additive and nonadditive effects performed better, improving breeding value prediction. Moreover, our results suggest that, for tree height in this population, the additive and nonadditive components of genetic variance are similar in magnitude. This novel result improves our current understanding of the genetic control and architecture of a quantitative trait and should be considered when developing breeding strategies. PMID:25324160
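
    For readers unfamiliar with marker-based relationship matrices, the sketch below builds a VanRaden-style additive genomic relationship matrix from a 0/1/2 marker matrix. It is a generic illustration with random toy data, not the matrices or data used in the study; the dominance and epistatic matrices discussed there are constructed analogously.

        import numpy as np

        def additive_grm(M):
            """M: individuals x markers, coded as 0/1/2 copies of the minor allele."""
            p = M.mean(axis=0) / 2.0                 # estimated allele frequencies
            Z = M - 2.0 * p                          # center by expected allele dosage
            denom = 2.0 * np.sum(p * (1.0 - p))      # VanRaden scaling factor
            return Z @ Z.T / denom

        rng = np.random.default_rng(1)
        M = rng.integers(0, 3, size=(6, 500))        # 6 toy individuals, 500 toy markers
        G = additive_grm(M)
        print(np.round(G, 2))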

  7. The NIST Quantitative Infrared Database

    PubMed Central

    Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.

    1999-01-01

    With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S.EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S.EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST prepared primary gas standards. Currently, absorption coefficient data has been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm−1 resolution and the Beer’s law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10−4 (μmol/mol)−1 m−1 the average relative expanded uncertainty is 2.2 %. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
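
    The absorption-coefficient calculation described, a Beer's-law regression of several transmittance spectra measured at known concentration-pathlength products, can be sketched as follows. This is a simplified illustration (single component, slope through the origin, base-e absorbance), not NIST's processing code, and the numbers are synthetic.

        import numpy as np

        def absorption_coefficient(transmittance, conc_times_path):
            """transmittance: (n_spectra, n_points) array; conc_times_path: (n_spectra,) in (umol/mol)*m."""
            y = -np.log(transmittance)                    # absorbance (base e) at every spectral point
            x = np.asarray(conc_times_path)[:, None]
            return (x * y).sum(axis=0) / (x ** 2).sum()   # least-squares slope through the origin

        x = np.array([100.0, 200.0, 300.0])               # synthetic concentration x pathlength values
        true_alpha = np.array([1e-4, 5e-4, 2e-4])         # synthetic band, (umol/mol)^-1 m^-1
        T = np.exp(-np.outer(x, true_alpha))              # Beer's law: T = exp(-alpha * c * L)
        print(absorption_coefficient(T, x))               # recovers true_alpha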

  8. Primary enzyme quantitation

    DOEpatents

    Saunders, G.C.

    1982-03-04

    The disclosure relates to the quantitation of a primary enzyme concentration by utilizing a substrate for the primary enzyme labeled with a second enzyme which is an indicator enzyme. Enzyme catalysis of the substrate occurs and results in release of the indicator enzyme in an amount directly proportional to the amount of primary enzyme present. By quantifying the free indicator enzyme one determines the amount of primary enzyme present.
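
    Because the released indicator enzyme is directly proportional to the primary enzyme present, quantitation reduces to reading an unknown off a linear standard curve. The sketch below shows that step with invented calibration numbers; it is not taken from the patent.

        import numpy as np

        known_primary = np.array([0.0, 1.0, 2.0, 4.0, 8.0])          # standards, e.g. ng/mL (illustrative)
        indicator_signal = np.array([0.02, 0.51, 1.01, 2.03, 3.98])  # measured indicator-enzyme activity

        slope, intercept = np.polyfit(known_primary, indicator_signal, 1)

        def primary_from_signal(signal):
            """Invert the linear standard curve: signal = slope * primary + intercept."""
            return (signal - intercept) / slope

        print(round(primary_from_signal(1.50), 2))                   # unknown sample -> roughly 3.0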

  9. Human Brain Atlas-based Multimodal MRI Analysis of Volumetry, Diffusimetry, Relaxometry and Lesion Distribution in Multiple Sclerosis Patients and Healthy Adult Controls: Implications for understanding the Pathogenesis of Multiple Sclerosis and Consolidation of Quantitative MRI Results in MS

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Datta, Sushmita; Wolinsky, Jerry S.; Narayana, Ponnada A.

    2011-01-01

    Multiple sclerosis (MS) is the most common immune-mediated disabling neurological disease of the central nervous system. The pathogenesis of MS is not fully understood. Histopathology implicates both demyelination and axonal degeneration as the major contributors to the accumulation of disability. The application of several in vivo quantitative magnetic resonance imaging (MRI) methods to both lesioned and normal-appearing brain tissue has not yet provided a solid conclusive support of the hypothesis that MS might be a diffuse disease. In this work, we adopted FreeSurfer to provide standardized macrostructure or volumetry of lesion free normal-appearing brain tissue in combination with multiple quantitative MRI metrics (T2 relaxation time, diffusion tensor anisotropy and diffusivities) that characterize tissue microstructural integrity. By incorporating a large number of healthy controls, we have attempted to separate the natural age-related change from the disease-induced effects. Our work shows elevation in diffusivity and relaxation times and reduction in volume in a number of normal-appearing white matter and gray matter structures in relapsing-remitting multiple sclerosis patients. These changes were related in part with the spatial distribution of lesions. The whole brain lesion load and age-adjusted expanded disability status score showed strongest correlations in regions such as corpus callosum with qMRI metrics that are believed to be specific markers of axonal dysfunction, consistent with histologic data of others indicating axonal loss that is independent of focal lesions. Our results support that MS at least in part has a neurodegenerative component. PMID:21978603

  10. Recovery of clinical but not radiographic outcomes by the delayed addition of adalimumab to methotrexate-treated Japanese patients with early rheumatoid arthritis: 52-week results of the HOPEFUL-1 trial

    PubMed Central

    Ishiguro, Naoki; Takeuchi, Tsutomu; Miyasaka, Nobuyuki; Mukai, Masaya; Matsubara, Tsukasa; Uchida, Shoji; Akama, Hideto; Kupper, Hartmut; Arora, Vipin; Tanaka, Yoshiya

    2014-01-01

    Objective. The aim of this study was to compare efficacy outcomes of initial treatment with adalimumab + MTX vs adalimumab addition following 26 weeks of MTX monotherapy in Japanese early RA patients naive to MTX with high disease activity. Methods. Patients completing the 26-week, randomized, placebo-controlled trial of adalimumab + MTX were eligible to receive 26 weeks of open-label adalimumab + MTX. Patients were assessed for mean change from baseline in the 28-joint DAS with ESR (DAS28-ESR) and modified total Sharp score (mTSS), and for the proportions of patients achieving clinical, functional or radiographic remission. Results. Of 333 patients assessed, 278 (137 from the initial adalimumab + MTX and 141 from the initial placebo + MTX groups) completed the 52-week study. Significant differences in clinical and functional parameters observed during the 26-week blinded period were not apparent following the addition of open-label adalimumab to MTX. Open-label adalimumab + MTX slowed radiographic progression through week 52 in both groups, but patients who received adalimumab + MTX throughout the study exhibited less radiographic progression than those who received placebo + MTX during the first 26 weeks (mean ΔmTSS at week 52 = 2.56 vs 3.30, P < 0.001). Conclusion. Delayed addition of adalimumab in Japanese MTX-naive early RA patients did not impact clinical and functional outcomes at week 52 compared with the earlier addition of adalimumab. However, the accrual of significant structural damage during blinded placebo + MTX therapy contributed to the persistence of differences between the treatment strategies, suggesting that Japanese patients at risk for aggressive disease should benefit from the early inclusion of adalimumab + MTX combination therapy. Trial registration. ClinicalTrials.gov (http://clinicaltrials.gov/), NCT00870467. PMID:24441150

  11. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described of controlling, reducing, or eliminating ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases, comprising the addition of iodine or compounds of iodine to hydrocarbon-base fuels prior to or during combustion in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel, by weight, to be accomplished by: (a) the addition of these inhibitors during or after the refining or manufacturing process of liquid fuels; (b) the production of these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) the addition of these inhibitors into combustion chambers of equipment utilizing solid fuels for the purpose of reducing ozone.

  12. Boron addition to alloys

    SciTech Connect

    Coad, B. C.

    1985-08-20

    A process for the addition of boron to an alloy, which involves forming a melt of the alloy, adding boric oxide and a reactive metal selected from the group consisting of aluminum, titanium, zirconium and mixtures thereof to the melt, maintaining the resulting reactive mixture in the molten state and reacting the boric oxide with the reactive metal to convert at least a portion of the boric oxide to boron, which dissolves in the resulting melt, and to convert at least a portion of the reactive metal to the reactive metal oxide, which oxide remains with the resulting melt, and pouring the resulting melt into a gas stream to form a first atomized powder, which is subsequently remelted with further addition of boric oxide, re-atomized, and thus reprocessed to convert essentially all the reactive metal to metal oxide, producing a powdered alloy containing specified amounts of boron.

  13. Using quantitative phase petrology to understand metamorphism

    NASA Astrophysics Data System (ADS)

    White, Richard

    2015-04-01

    Quantitative phase petrology has become one of the mainstay methods for interpreting metamorphic rocks and processes. Its increased utility has been driven by improvements to end-member thermodynamics, activity-composition relationships and computer programs to undertake calculations. Such improvements now allow us to undertake calculations in increasingly complex chemical systems that more closely reflect those of rocks. Recent progress in activity-composition (a-x) relationships is aimed at developing suites of a-x relationships in large chemical systems that are calibrated together, which will allow a more direct application of the method to metamorphic rocks. In addition, considerable progress has been made in how quantitative phase diagrams can be used to understand features, including chemical potential diagrams for reaction textures, methods for fractionating bulk compositions and methods for modelling open system processes. One feature of calculated phase diagrams is that they present us with a great amount of information, such as mineral assemblages, mineral proportions, phase compositions, volume or density etc. An important aspect to using this information is to understand the potential uncertainties associated with these, which are significant. These uncertainties require that calculated phase diagrams be used with caution to interpret observed features in rocks. Features such as mineral zoning and reaction textures should still be interpreted in a semi-quantitative way, even if based on a fully quantitative diagram. Exercises such as the interpretation of reaction overstepping based on relating phase diagrams to observed mineral core compositions are likely to give spurious results given the infelicities in existing a-x models. Despite these limitations, quantitative phase petrology remains the most useful approach to interpreting the metamorphic history of rocks in that it provides a theoretical framework in which to interpret observed features rather

  14. Quantitative velocity modulation spectroscopy.

    PubMed

    Hodges, James N; McCall, Benjamin J

    2016-05-14

    Velocity Modulation Spectroscopy (VMS) is arguably the most important development in the 20th century for spectroscopic study of molecular ions. For decades, interpretation of VMS lineshapes has presented challenges due to the intrinsic covariance of fit parameters including velocity modulation amplitude, linewidth, and intensity. This limitation has stifled the growth of this technique into the quantitative realm. In this work, we show that subtle changes in the lineshape can be used to help address this complexity. This allows for determination of the linewidth, intensity relative to other transitions, velocity modulation amplitude, and electric field strength in the positive column of a glow discharge. Additionally, we explain the large homogeneous component of the linewidth that has been previously described. Using this component, the ion mobility can be determined. PMID:27179476

  15. Quantitative velocity modulation spectroscopy

    NASA Astrophysics Data System (ADS)

    Hodges, James N.; McCall, Benjamin J.

    2016-05-01

    Velocity Modulation Spectroscopy (VMS) is arguably the most important development in the 20th century for spectroscopic study of molecular ions. For decades, interpretation of VMS lineshapes has presented challenges due to the intrinsic covariance of fit parameters including velocity modulation amplitude, linewidth, and intensity. This limitation has stifled the growth of this technique into the quantitative realm. In this work, we show that subtle changes in the lineshape can be used to help address this complexity. This allows for determination of the linewidth, intensity relative to other transitions, velocity modulation amplitude, and electric field strength in the positive column of a glow discharge. Additionally, we explain the large homogeneous component of the linewidth that has been previously described. Using this component, the ion mobility can be determined.

  16. Phenylethynyl Containing Reactive Additives

    NASA Technical Reports Server (NTRS)

    Connell, John W. (Inventor); Smith, Joseph G., Jr. (Inventor); Hergenrother, Paul M. (Inventor)

    2002-01-01

    Phenylethynyl containing reactive additives were prepared from aromatic diamine, containing phenylethynyl groups and various ratios of phthalic anhydride and 4-phenylethynylphthalic anhydride in glacial acetic acid to form the imide in one step or in N-methyl-2-pyrrolidinone to form the amide acid intermediate. The reactive additives were mixed in various amounts (10% to 90%) with oligomers containing either terminal or pendent phenylethynyl groups (or both) to reduce the melt viscosity and thereby enhance processability. Upon thermal cure, the additives react and become chemically incorporated into the matrix and effect an increase in crosslink density relative to that of the host resin. This resultant increase in crosslink density has advantageous consequences on the cured resin properties such as higher glass transition temperature and higher modulus as compared to that of the host resin.

  17. Electrophilic addition of astatine

    SciTech Connect

    Norseev, Yu.V.; Vasaros, L.; Nhan, D.D.; Huan, N.K.

    1988-03-01

    It has been shown for the first time that astatine is capable of undergoing addition reactions to unsaturated hydrocarbons. A new compound of astatine, viz., ethylene astatohydrin, has been obtained, and its retention numbers on squalane, Apiezon, and tricresyl phosphate have been found. The influence of various factors on the formation of ethylene astatohydrin has been studied. It has been concluded on the basis of the results obtained that the univalent cation of astatine in an acidic medium is protonated hypoastatous acid.

  18. Addition of rapamycin and hydroxychloroquine to metronomic chemotherapy as a second line treatment results in high salvage rates for refractory metastatic solid tumors: a pilot safety and effectiveness analysis in a small patient cohort

    PubMed Central

    Chi, Kwan-Hwa; Ko, Hui-Ling; Yang, Kai-Lin; Lee, Cheng-Yen; Chi, Mau-Shin; Kao, Shang-Jyh

    2015-01-01

    BACKGROUND Autophagy is an important oncotarget that can be modulated during anti-cancer therapy. Enhancing autophagy using chemotherapy and rapamycin (Rapa) treatment and then inhibiting it using hydroxychloroquine (HCQ) could synergistically improve therapy outcome in cancer patients. It is still unclear whether addition of Rapa and HCQ to chemotherapy could be used for reversing drug resistance. PATIENTS AND METHODS Twenty-five stage IV cancer patients were identified. They had no clinical response to first-line metronomic chemotherapy; the patients were salvaged by adding an autophagy inducer (Rapa, 2 mg/day) and an autophagosome inhibitor (HCQ, 400 mg/day) to their current metronomic chemotherapy for at least 3 months. Patients included 4 prostate, 4 bladder, 4 lung, 4 breast, 2 colon, and 3 head and neck cancer patients as well as 4 sarcoma patients. RESULTS Chemotherapy was administered for a total of 137 months. The median duration of chemotherapy cycles per patient was 4 months (95% confidence interval, 3–7 months). The overall response rate to this treatment was 40%, with an 84% disease control rate. The most frequent and clinically significant toxicities were myelotoxicities. Grade ≥3 leucopenia occurred in 6 patients (24%), grade ≥3 thrombocytopenia in 8 (32%), and anemia in 3 (12%). None of them developed febrile neutropenia. Non-hematologic toxicities were fatigue (total 32%, with 1 patient developing grade 3 fatigue), diarrhea (total 20%, 1 patient developed grade 3 diarrhea), reversible grade 3 cardiotoxicity (1 patient), and grade V liver toxicity from hepatitis B reactivation (1 patient). CONCLUSION Our results of Rapa, HCQ and chemotherapy triplet combination suggest autophagy is a promising oncotarget and warrants further investigation in phase II studies. PMID:25944689

  19. Health effects models for nuclear power plant accident consequence analysis. Modification of models resulting from addition of effects of exposure to alpha-emitting radionuclides: Revision 1, Part 2, Scientific bases for health effects models, Addendum 2

    SciTech Connect

    Abrahamson, S.; Bender, M.A.; Boecker, B.B.; Scott, B.R.; Gilbert, E.S.

    1993-05-01

    The Nuclear Regulatory Commission (NRC) has sponsored several studies to identify and quantify, through the use of models, the potential health effects of accidental releases of radionuclides from nuclear power plants. The Reactor Safety Study provided the basis for most of the earlier estimates related to these health effects. Subsequent efforts by NRC-supported groups resulted in improved health effects models that were published in the report entitled "Health Effects Models for Nuclear Power Plant Consequence Analysis", NUREG/CR-4214, 1985 and revised further in the 1989 report NUREG/CR-4214, Rev. 1, Part 2. The health effects models presented in the 1989 NUREG/CR-4214 report were developed for exposure to low-linear energy transfer (LET) (beta and gamma) radiation based on the best scientific information available at that time. Since the 1989 report was published, two addenda to that report have been prepared to (1) incorporate other scientific information related to low-LET health effects models and (2) extend the models to consider the possible health consequences of the addition of alpha-emitting radionuclides to the exposure source term. The first addendum report, entitled "Health Effects Models for Nuclear Power Plant Accident Consequence Analysis, Modifications of Models Resulting from Recent Reports on Health Effects of Ionizing Radiation, Low LET Radiation, Part 2: Scientific Bases for Health Effects Models," was published in 1991 as NUREG/CR-4214, Rev. 1, Part 2, Addendum 1. This second addendum addresses the possibility that some fraction of the accident source term from an operating nuclear power plant comprises alpha-emitting radionuclides. Consideration of chronic high-LET exposure from alpha radiation as well as acute and chronic exposure to low-LET beta and gamma radiations is a reasonable extension of the health effects model.

  20. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  1. Vinyl capped addition polyimides

    NASA Technical Reports Server (NTRS)

    Vannucci, Raymond D. (Inventor); Malarik, Diane C. (Inventor); Delvigs, Peter (Inventor)

    1991-01-01

    Polyimide resins (PMR) are generally useful where high strength and temperature capabilities are required (at temperatures up to about 700 F). Polyimide resins are particularly useful in applications such as jet engine compressor components, for example, blades, vanes, air seals, air splitters, and engine casing parts. Aromatic vinyl capped addition polyimides are obtained by reacting a diamine, an ester of tetracarboxylic acid, and an aromatic vinyl compound. Low void materials with improved oxidative stability when exposed to 700 F air may be fabricated as fiber reinforced high molecular weight capped polyimide composites. The aromatic vinyl capped polyimides are provided with a more aromatic nature and are more thermally stable than highly aliphatic, norbornenyl-type end-capped polyimides employed in PMR resins. The substitution of aromatic vinyl end-caps for norbornenyl end-caps in addition polyimides results in polymers with improved oxidative stability.

  2. Overview of differences between microbial feed additives and probiotics for food regarding regulation, growth promotion effects and health properties and consequences for extrapolation of farm animal results to humans.

    PubMed

    Bernardeau, M; Vernoux, J-P

    2013-04-01

    For many years, microbial adjuncts have been used to supplement the diets of farm animals and humans. They have evolved since the 1990s to become known as probiotics, i.e. functional food with health benefits. After the discovery of a possible link between manipulation of gut microflora in mice and obesity, a focus on the use of these beneficial microbes that act on gut microflora in animal farming was undertaken and compared with the use of probiotics for food. Beneficial microbes added to feed are classified at a regulatory level as zootechnical additives, in the category of gut flora stabilizers for healthy animals and are regulated up to strain level in Europe. Intended effects are improvement of performance characteristics, which are strain dependent and growth enhancement is not a prerequisite. In fact, increase of body weight is not commonly reported and its frequency is around 25% of the published data examined here. However, when a Body Weight Gain (BWG) was found in the literature, it was generally moderate (lower than or close to 10%) and this over a reduced period of their short industrial life. When it was higher than 10%, it could be explained as an indirect consequence of the alleviation of the weight losses linked to stressful intensive rearing conditions or health deficiency. However, regulations on feed do not consider the health effects because animals are supposed to be healthy, so there is no requirement for reporting healthy effects in the standard European dossier. The regulations governing the addition of beneficial microorganisms to food are less stringent than for feed and no dossier is required if a species has a Qualified Presumption of Safety status. The microbial strain marketed is not submitted to any regulation and its properties (including BWG) do not need to be studied. Only claims for functional or healthy properties are regulated and again growth effect is not included. However, recent studies on probiotic effects showed that BWG

  3. The impact on outcome of the addition of all-trans retinoic acid to intensive chemotherapy in younger patients with nonacute promyelocytic acute myeloid leukemia: overall results and results in genotypic subgroups defined by mutations in NPM1, FLT3, and CEBPA.

    PubMed

    Burnett, Alan K; Hills, Robert K; Green, Claire; Jenkinson, Sarah; Koo, Kenneth; Patel, Yashma; Guy, Carol; Gilkes, Amanda; Milligan, Donald W; Goldstone, Anthony H; Prentice, Archibald G; Wheatley, Keith; Linch, David C; Gale, Rosemary E

    2010-02-01

    We investigated the benefit of adding all-trans retinoic acid (ATRA) to chemotherapy for younger patients with nonacute promyelocytic acute myeloid leukemia and high-risk myelodysplastic syndrome, and considered interactions between treatment and molecular markers. Overall, 1075 patients less than 60 years of age were randomized to receive or not receive ATRA in addition to daunorubicin/Ara-C/thioguanine chemotherapy with Ara-C at standard or double standard dose. There were data on FLT3 internal tandem duplications and NPM1 mutations (n = 592), CEBPA mutations (n = 423), and MN1 expression (n = 195). The complete remission rate was 68% with complete remission with incomplete count recovery in an additional 16%; 8-year overall survival was 32%. There was no significant treatment effect for any outcome, with no significant interactions between treatment and demographics, or cytarabine randomization. Importantly, there were no interactions by FLT3/internal tandem duplications, NPM1, or CEBPA mutation. There was a suggestion that ATRA reduced relapse in patients with lower MN1 levels, but no significant effect on overall survival. Results were consistent when restricted to patients with normal karyotype. ATRA has no overall effect on treatment outcomes in this group of patients. The study did not identify any subgroup of patients likely to derive a significant survival benefit from the addition of ATRA to chemotherapy. PMID:19965647

  4. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to implementation and use of risk analysis in the petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in the activities and carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company to meet rules and regulations and to assess and describe the environmental risk in a systematic manner. In the environmental risk analysis the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  5. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease with devastating complications and high morbidity and mortality. Calciphylaxis is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which includes a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions using different equipment and photographers cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on that scale. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free hand photography is assessed regarding geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation yield 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will impact a better understanding of this rare but fatal disease.
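
    The normalization step described, fitting an affine color transform that maps the 24 measured patches of the reference card onto their nominal values by least squares, can be sketched as follows. The patch values and camera distortion used here are synthetic, not data from the registry.

        import numpy as np

        def fit_affine_color(measured, reference):
            """measured, reference: (24, 3) arrays of patch colors. Returns (A, b) with reference ~ measured @ A + b."""
            X = np.hstack([measured, np.ones((measured.shape[0], 1))])  # append a constant column
            coeffs, *_ = np.linalg.lstsq(X, reference, rcond=None)
            return coeffs[:3, :], coeffs[3, :]

        def apply_affine_color(image, A, b):
            """image: (H, W, 3) float array; returns the color-normalized image."""
            return (image.reshape(-1, 3) @ A + b).reshape(image.shape)

        rng = np.random.default_rng(2)
        reference = rng.uniform(0.0, 1.0, size=(24, 3))                 # nominal card colors (placeholder)
        A_true = np.array([[0.9, 0.05, 0.0], [0.0, 1.1, 0.02], [0.03, 0.0, 0.95]])
        measured = (reference - 0.05) @ np.linalg.inv(A_true)           # synthetic camera distortion
        A, b = fit_affine_color(measured, reference)
        print(np.allclose(measured @ A + b, reference, atol=1e-6))      # -> True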

  6. New addition curing polyimides

    NASA Technical Reports Server (NTRS)

    Frimer, Aryeh A.; Cavano, Paul

    1991-01-01

    In an attempt to improve the thermal-oxidative stability (TOS) of PMR-type polymers, the use of 1,4-phenylenebis (phenylmaleic anhydride), PPMA, was evaluated. Two series of nadic end-capped addition curing polyimides were prepared by imidizing PPMA with either 4,4'-methylene dianiline or p-phenylenediamine. The first resulted in improved solubility and increased resin flow while the latter yielded a compression molded neat resin sample with a Tg of 408 C, close to 70 C higher than PMR-15. The performance of these materials in long term weight loss studies was below that of PMR-15, independent of post-cure conditions. These results can be rationalized in terms of the thermal lability of the pendant phenyl groups and the incomplete imidization of the sterically congested PPMA. The preparation of model compounds as well as future research directions are discussed.

  7. Quantitative autoradiography of neurochemicals

    SciTech Connect

    Rainbow, T.C.; Biegon, A.; Bleisch, W.V.

    1982-05-24

    Several new methods have been developed that apply quantitative autoradiography to neurochemistry. These methods are derived from the 2-deoxyglucose (2DG) technique of Sokoloff (1), which uses quantitative autoradiography to measure the rate of glucose utilization in brain structures. The new methods allow the measurement of the rate of cerebral protein synthesis and the levels of particular neurotransmitter receptors by quantitative autoradiography. As with the 2DG method, the new techniques can measure molecular levels in micron-sized brain structures, and can be used in conjunction with computerized systems of image processing. It is possible that many neurochemical measurements could be made by computerized analysis of quantitative autoradiograms.

  8. Addition of docetaxel, zoledronic acid, or both to first-line long-term hormone therapy in prostate cancer (STAMPEDE): survival results from an adaptive, multiarm, multistage, platform randomised controlled trial

    PubMed Central

    James, Nicholas D; Sydes, Matthew R; Clarke, Noel W; Mason, Malcolm D; Dearnaley, David P; Spears, Melissa R; Ritchie, Alastair W S; Parker, Christopher C; Russell, J Martin; Attard, Gerhardt; de Bono, Johann; Cross, William; Jones, Rob J; Thalmann, George; Amos, Claire; Matheson, David; Millman, Robin; Alzouebi, Mymoona; Beesley, Sharon; Birtle, Alison J; Brock, Susannah; Cathomas, Richard; Chakraborti, Prabir; Chowdhury, Simon; Cook, Audrey; Elliott, Tony; Gale, Joanna; Gibbs, Stephanie; Graham, John D; Hetherington, John; Hughes, Robert; Laing, Robert; McKinna, Fiona; McLaren, Duncan B; O'Sullivan, Joe M; Parikh, Omi; Peedell, Clive; Protheroe, Andrew; Robinson, Angus J; Srihari, Narayanan; Srinivasan, Rajaguru; Staffurth, John; Sundar, Santhanam; Tolan, Shaun; Tsang, David; Wagstaff, John; Parmar, Mahesh K B

    2016-01-01

    Summary Background Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone. Methods Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m2) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for hazard ratio (HR) 0·75, requiring roughly 400 control arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544). Findings 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60–71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6

  9. Robust quantitative scratch assay

    PubMed Central

    Vargas, Andrea; Angeli, Marc; Pastrello, Chiara; McQuaid, Rosanne; Li, Han; Jurisicova, Andrea; Jurisica, Igor

    2016-01-01

    The wound healing assay (or scratch assay) is a technique frequently used to quantify the dependence of cell motility—a central process in tissue repair and evolution of disease—on various treatment conditions. However, processing the resulting data is a laborious task due to its high throughput and variability across images. The Robust Quantitative Scratch Assay (RQSA) algorithm introduces statistical outputs in which migration rates are estimated, cellular behaviour is distinguished and outliers are identified among groups of unique experimental conditions. Furthermore, the RQSA decreased measurement errors and increased accuracy in the wound boundary at comparable processing times compared with the previously developed method (TScratch). Availability and implementation: The RQSA is freely available at: http://ophid.utoronto.ca/RQSA/RQSA_Scripts.zip. The image sets used for training and validation and results are available at: (http://ophid.utoronto.ca/RQSA/trainingSet.zip, http://ophid.utoronto.ca/RQSA/validationSet.zip, http://ophid.utoronto.ca/RQSA/ValidationSetResults.zip, http://ophid.utoronto.ca/RQSA/ValidationSet_H1975.zip, http://ophid.utoronto.ca/RQSA/ValidationSet_H1975Results.zip, http://ophid.utoronto.ca/RQSA/RobustnessSet.zip). Supplementary material is provided for a detailed description of the development of the RQSA. Contact: juris@ai.utoronto.ca Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26722119
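    A migration rate in a scratch assay is commonly summarized as the rate of wound closure over time. The sketch below is a minimal illustration (not the RQSA implementation) that fits a line to hypothetical wound-area measurements.

```python
import numpy as np

# Hypothetical wound-area measurements (fraction of the field of view) over time.
hours = np.array([0, 6, 12, 18, 24], dtype=float)
wound_area = np.array([0.42, 0.35, 0.27, 0.20, 0.14])

# Migration (closure) rate estimated as the slope of a least-squares line.
slope, intercept = np.polyfit(hours, wound_area, 1)
print(f"closure rate: {-slope:.4f} area fraction per hour")
```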

  10. A Quantitative Infrared Spectroscopy Experiment.

    ERIC Educational Resources Information Center

    Krahling, Mark D.; Eliason, Robert

    1985-01-01

    Although infrared spectroscopy is used primarily for qualitative identifications, it is possible to use it as a quantitative tool as well. The use of a standard curve to determine percent methanol in a 2,2,2-trifluoroethanol sample is described. Background information, experimental procedures, and results obtained are provided. (JN)
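    The standard-curve approach described here amounts to fitting absorbance against known methanol concentrations and inverting the fit for the unknown sample. A minimal sketch with made-up calibration values (not the published data) follows.

```python
import numpy as np

# Hypothetical calibration standards: percent methanol vs. IR absorbance at the analytical band.
percent_methanol = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
absorbance = np.array([0.06, 0.12, 0.23, 0.47, 0.93])

# Linear standard curve (Beer-Lambert behaviour assumed over this range).
slope, intercept = np.polyfit(percent_methanol, absorbance, 1)

# Invert the curve for an unknown sample's measured absorbance.
unknown_abs = 0.31
estimated_percent = (unknown_abs - intercept) / slope
print(f"estimated methanol: {estimated_percent:.2f} %")
```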

  11. Quantitative evolutionary design

    PubMed Central

    Diamond, Jared

    2002-01-01

    The field of quantitative evolutionary design uses evolutionary reasoning (in terms of natural selection and ultimate causation) to understand the magnitudes of biological reserve capacities, i.e. excesses of capacities over natural loads. Ratios of capacities to loads, defined as safety factors, fall in the range 1.2-10 for most engineered and biological components, even though engineered safety factors are specified intentionally by humans while biological safety factors arise through natural selection. Familiar examples of engineered safety factors include those of buildings, bridges and elevators (lifts), while biological examples include factors of bones and other structural elements, of enzymes and transporters, and of organ metabolic performances. Safety factors serve to minimize the overlap zone (resulting in performance failure) between the low tail of capacity distributions and the high tail of load distributions. Safety factors increase with coefficients of variation of load and capacity, with capacity deterioration with time, and with cost of failure, and decrease with costs of initial construction, maintenance, operation, and opportunity. Adaptive regulation of many biological systems involves capacity increases with increasing load; several quantitative examples suggest sublinear increases, such that safety factors decrease towards 1.0. Unsolved questions include safety factors of series systems, parallel or branched pathways, elements with multiple functions, enzyme reaction chains, and equilibrium enzymes. The modest sizes of safety factors imply the existence of costs that penalize excess capacities. Those costs are likely to involve wasted energy or space for large or expensive components, but opportunity costs of wasted space at the molecular level for minor components. PMID:12122135
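    The "overlap zone" between load and capacity distributions can be made concrete with a simple normal-distribution model. The sketch below uses illustrative parameters (not values from the paper) to compute the failure probability P(capacity < load) alongside the safety factor.

```python
from math import sqrt
from scipy.stats import norm

# Illustrative normal models for load and capacity (arbitrary units).
mu_load, sd_load = 10.0, 2.0
mu_cap, sd_cap = 25.0, 4.0

safety_factor = mu_cap / mu_load  # ratio of mean capacity to mean load

# P(failure) = P(capacity - load < 0) for independent normal load and capacity.
p_fail = norm.cdf(0, loc=mu_cap - mu_load, scale=sqrt(sd_cap**2 + sd_load**2))

print(f"safety factor ~ {safety_factor:.1f}, P(failure) ~ {p_fail:.4f}")
```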

  12. Quantitative Spectroscopy of Distant Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Bronder, T. J.; Hook, I.; Howell, D. A.; Sullivan, M.; Perrett, K.; Conley, A.; Astier, P.; Basa, S.; Carlberg, R. G.; Guy, J.; Pain, R.; Pritchet, C. J.; Neill, James D.

    2007-08-01

    Quantitative analysis of 24 high-z (zmed = 0.81) Type Ia supernovae (SNe Ia) spectra observed at the Gemini Telescopes for the Supernova Legacy Survey (SNLS) is presented. This analysis includes equivalent width measurements of SNe Ia-specific absorption features with methods tailored to the reduced signal-to-noise and host galaxy contamination present in these distant spectra. The results from this analysis are compared to corresponding measurements of a large set of low-z SNe Ia from the literature. This comparison showed no significant difference (less than 2σ) between the spectroscopic features of the distant and nearby SNe; a result that supports the assumption that SNe Ia are not evolving with redshift. Additionally, a new correlation between SiII absorption (observed near peak luminosity) and SNe Ia peak magnitudes is presented.
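    The equivalent width of an absorption feature is the integral of the fractional flux deficit relative to the (pseudo-)continuum. The sketch below is a minimal illustration with a synthetic Gaussian dip and a flat continuum; it is not the SNLS measurement pipeline.

```python
import numpy as np

def equivalent_width(wavelength, flux, continuum):
    """EW = integral of (1 - F/F_continuum) d(lambda) over the feature."""
    return np.trapz(1.0 - flux / continuum, wavelength)

# Hypothetical spectrum slice around an absorption feature (angstroms, arbitrary flux units).
wl = np.linspace(6000, 6200, 201)
cont = np.full_like(wl, 1.0)                                 # flat pseudo-continuum for illustration
flux = cont - 0.4 * np.exp(-0.5 * ((wl - 6100) / 20) ** 2)   # Gaussian absorption dip

print(f"EW ~ {equivalent_width(wl, flux, cont):.1f} angstroms")
```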

  13. Quantitative phase imaging of arthropods

    NASA Astrophysics Data System (ADS)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  14. Quantitative phase imaging of arthropods

    PubMed Central

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-01-01

    Abstract. Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy. PMID:26334858

  15. A traceable quantitative infrared spectral database of chemical agents

    NASA Astrophysics Data System (ADS)

    Samuels, Alan C.; Williams, Barry R.; Ben-David, Avishai; Hulet, Melissa; Roelant, Geoffrey J.; Miles, Ronald W., Jr.; Green, Norman; Zhu, Changjiang

    2004-12-01

    Recent experimental field trials have demonstrated the ability of both Fourier transform infrared (FTIR) and active light detection and ranging (LIDAR) sensors to detect particulate matter, including simulants for biological materials. Both systems require a reliable, validated, quantitative database of the mid infrared spectra of the targeted threat agents. While several databases are available, none are validated and traceable to primary standards for reference quality reliability. Most of the existing chemical agent databases have been developed using a bubbler or syringe-fed vapor generator, and all are fraught with errors and uncertainties as a result. In addition, no quantitative condensed phase data on the low volatility chemicals and biological agents have been reported. We are filling this data gap through the systematic measurement of gas phase chemical agent materials generated using a unique vapor-liquid equilibrium approach that allows the quantitation of the cross-sections using a mass measurement calibrated to primary National Institute of Standards and Technology (NIST) standards. In addition, we have developed quantitative methods for the measurement of condensed phase materials in both transmission and diffuse reflectance modes. The latter data are valuable for the development of complex index of refraction data, which is required for both system modeling and algorithm development of both FTIR and LIDAR based sensor systems. We will describe our measurement approach and progress toward compiling the first known comprehensive and validated database of both vapor and condensed phase chemical warfare agents.
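    Gas-phase absorption cross-sections of the kind compiled here are typically related to measured absorbance through the Beer-Lambert law. The sketch below is a minimal worked example with hypothetical numbers (not the authors' calibration chain), using the ideal-gas law to get the number density.

```python
# Beer-Lambert relation for a gas-phase band: A_e = sigma * N * L  (base-e absorbance).
k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 298.15                # cell temperature, K (assumed)
p_partial = 10.0          # analyte partial pressure, Pa (assumed)
path_length_cm = 10.0     # optical path length (assumed)

# Number density of analyte molecules per cm^3 from the ideal-gas law.
N_per_cm3 = p_partial / (k_B * T) * 1e-6

absorbance_e = 0.35       # hypothetical measured ln(I0/I) at the band centre
sigma_cm2 = absorbance_e / (N_per_cm3 * path_length_cm)
print(f"cross-section ~ {sigma_cm2:.2e} cm^2 per molecule")
```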

  16. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects. PMID:24772784

  17. Intensification of antiretroviral therapy through addition of enfuvirtide in naive HIV-1-infected patients with severe immunosuppression does not improve immunological response: results of a randomized multicenter trial (ANRS 130 Apollo).

    PubMed

    Joly, Véronique; Fagard, Catherine; Grondin, Carine; Descamps, Diane; Yazdanpanah, Yazdan; Charpentier, Charlotte; Colin de Verdiere, Nathalie; Tabuteau, Sophie; Raffi, François; Cabie, André; Chene, Geneviève; Yeni, Patrick

    2013-02-01

    We studied whether addition of enfuvirtide (ENF) to a background combination antiretroviral therapy (cART) would improve the CD4 cell count response at week 24 in naive patients with advanced HIV disease. ANRS 130 Apollo is a randomized study, conducted in naive HIV-1-infected patients, either asymptomatic with CD4 counts of <100/mm(3) or stage B/C disease with CD4 counts of <200/mm(3). Patients received tenofovir-emtricitabine with lopinavir-ritonavir (LPV/r) or efavirenz and were randomized to receive ENF for 24 weeks (ENF arm) or not (control arm). The primary endpoint was the proportion of patients with CD4 counts of ≥ 200/mm(3) at week 24. A total of 195 patients were randomized: 73% had stage C disease, 78% were male, the mean age was 44 years, the median CD4 count was 30/mm(3), and the median HIV-1 RNA load was 5.4 log(10) copies/ml. Eighty-one percent of patients received LPV/r. One patient was lost to follow-up, and eight discontinued the study (four in each arm). The proportions of patients with CD4 counts of ≥ 200/mm(3) at week 24 were 34% and 38% in the ENF and control arms, respectively (P = 0.53). The proportions of patients with HIV-1 RNA loads of <50 copies/ml were 74% and 58% at week 24 in the ENF and control arms, respectively (P < 0.02), and the proportion reached 79% in both arms at week 48. Twenty (20%) and 12 patients (13%) in the ENF and control arms, respectively, experienced at least one AIDS event during follow-up (P = 0.17). Although inducing a more rapid virological response, addition of ENF to a standard cART does not improve the immunological outcome in naive HIV-infected patients with severe immunosuppression. PMID:23165467

  18. Intensification of Antiretroviral Therapy through Addition of Enfuvirtide in Naive HIV-1-Infected Patients with Severe Immunosuppression Does Not Improve Immunological Response: Results of a Randomized Multicenter Trial (ANRS 130 Apollo)

    PubMed Central

    Fagard, Catherine; Grondin, Carine; Descamps, Diane; Yazdanpanah, Yazdan; Charpentier, Charlotte; Colin de Verdiere, Nathalie; Tabuteau, Sophie; Raffi, François; Cabie, André; Chene, Geneviève; Yeni, Patrick

    2013-01-01

    We studied whether addition of enfuvirtide (ENF) to a background combination antiretroviral therapy (cART) would improve the CD4 cell count response at week 24 in naive patients with advanced HIV disease. ANRS 130 Apollo is a randomized study, conducted in naive HIV-1-infected patients, either asymptomatic with CD4 counts of <100/mm3 or stage B/C disease with CD4 counts of <200/mm3. Patients received tenofovir-emtricitabine with lopinavir-ritonavir (LPV/r) or efavirenz and were randomized to receive ENF for 24 weeks (ENF arm) or not (control arm). The primary endpoint was the proportion of patients with CD4 counts of ≥200/mm3 at week 24. A total of 195 patients were randomized: 73% had stage C disease, 78% were male, the mean age was 44 years, the median CD4 count was 30/mm3, and the median HIV-1 RNA load was 5.4 log10 copies/ml. Eighty-one percent of patients received LPV/r. One patient was lost to follow-up, and eight discontinued the study (four in each arm). The proportions of patients with CD4 counts of ≥200/mm3 at week 24 were 34% and 38% in the ENF and control arms, respectively (P = 0.53). The proportions of patients with HIV-1 RNA loads of <50 copies/ml were 74% and 58% at week 24 in the ENF and control arms, respectively (P < 0.02), and the proportion reached 79% in both arms at week 48. Twenty (20%) and 12 patients (13%) in the ENF and control arms, respectively, experienced at least one AIDS event during follow-up (P = 0.17). Although inducing a more rapid virological response, addition of ENF to a standard cART does not improve the immunological outcome in naive HIV-infected patients with severe immunosuppression. PMID:23165467

  19. Electric utility use of fireside additives. Final report

    SciTech Connect

    Locklin, D.W.; Krause, H.H.; Anson, D.; Reid, W.

    1980-01-01

    Fireside additives have been used or proposed for use in fossil-fired utility boilers to combat a number of problems related to boiler performance and reliability. These problems include corrosion, fouling, superheat control, and acidic emissions. Fuel additives and other fireside additives have been used mainly with oil firing; however, there is growing experience with additives in coal-firing, especially for flyash conditioning to improve the performance of electrostatic precipitators. In decisions regarding the selection and use of additives, utilities have had to rely extensively on empiricism, due partly to an incomplete understanding of processes involved and partly to the limited amount of quantitative data. The study reported here was sponsored by the Electric Power Research Institute to assemble and analyze pertinent operating experience and to recommend guidelines for utility decisions on the use of additives. The combined results of the state-of-the-art review of technical literature and a special survey of utility experience are reported. A total of 38 utilities participated in the survey, providing information on trials conducted on 104 units in 93 different plants. Altogether, 445 separate trials were reported, each representing a unit/additive/fuel combination. Additives used in these trials included 90 different additive formulations, both pure compounds and proprietary products. These formulations were categorized into 37 generic classes according to their chemical constituents, and the results of the survey are presented by these generic classes. The findings are organized according to the operating problems for which fireside additives are used. Guidelines are presented for utility use in additive selection and in planning additive trials.

  20. A Quantitative Gas Chromatographic Ethanol Determination.

    ERIC Educational Resources Information Center

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  1. Visual Constraints for the Perception of Quantitative Depth from Temporal Interocular Unmatched Features

    PubMed Central

    Ni, Rui; Chen, Lin; Andersen, George J.

    2010-01-01

    Previous research (Brooks & Gillam, 2006) has found that temporal interocular unmatched (IOUM) features generate a perception of subjective contours and can result in a perception of quantitative depth. In the present study we examine in detail the factors important for quantitative depth perception from IOUM features. In Experiments 1 and 2 observers were shown temporal IOUM features based on three dots that disappeared behind an implicit surface. Subjects reported a perception of a subjective surface and were able to perceive qualitative depth. In Experiments 3 and 4 metrical depth was perceived when binocular disparity features were added to the display. These results suggest that quantitative depth from IOUM information is perceived when binocular matched information is present in regions adjacent to the surface. In addition, the perceived depth of the subjective surface decreased with an increase in the width of the subjective surface suggesting a limitation in the propagation of quantitative depth to surface regions where qualitative depth information is available. PMID:20493899

  2. Quantitative Analysis of Glaciated Landscapes

    NASA Astrophysics Data System (ADS)

    Huerta, A. D.

    2005-12-01

    The evolution of glaciated mountains is at the heart of the debate over Late Cenozoic linkages between climate and tectonics. Traditionally, the development of high summit elevations is attributed to tectonic processes. However, much of the high elevation of the Transantarctic Mountains can be attributed solely to uplift in response to glacial erosion (Stern et al., 2005). The Transantarctic Mountains (TAM) provide an unparalleled opportunity to study glacial erosion. The mountain range has experienced glacial conditions since Oligocene time. In the higher and drier regions of the TAM there is only a thin veneer of ice and snow draping the topography. In these regions landforms that were shaped during earlier climatic conditions are preserved. In fact, both glacial and fluvial landforms dating as far back as 18 Ma are preserved locally. In addition, the TAM are ideal for studying glacial erosion since the range has experienced minimal tectonic uplift since late Oligocene time, thus isolating the erosion signal from any tectonic signal. With the advent of digital data sets and GIS methodologies, quantitative analysis can identify key aspects of glaciated landscape morphology, and thus develop powerful analytical techniques for objective study of glaciation. Inspection of USGS topographic maps of the TAM reveals that mountain tops display an extreme range of glacial modification. For example, in the Mt. Rabot region (83°-84° S), mountain peaks are strongly affected by glaciation; cirque development is advanced with cirque diameters on the range of several kilometers, and cirque confluence has resulted in the formation of "knife-edge" arêtes up to 10 km long. In contrast, in the Mt. Murchison area (73°-74° S) cirque development is youthful, and there is minimal development of arêtes. Preliminary work indicates that analysis of DEM's and contour lines can be used to distinguish degree of glaciation. In particular, slope, curvature, and power spectrum analysis
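    Slope and curvature of a gridded DEM, two of the metrics mentioned here, can be computed from finite differences. The sketch below uses a hypothetical elevation grid and a simple Laplacian as a curvature proxy; it is an illustration, not the authors' workflow.

```python
import numpy as np

def slope_and_curvature(dem, cell_size):
    """Gradient magnitude (slope, degrees) and Laplacian (a simple curvature proxy) of a DEM grid."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)
    slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    d2z_dy2 = np.gradient(dz_dy, cell_size, axis=0)
    d2z_dx2 = np.gradient(dz_dx, cell_size, axis=1)
    return slope_deg, d2z_dx2 + d2z_dy2

# Hypothetical 100 m resolution elevation grid (random terrain for illustration only).
dem = np.random.default_rng(0).normal(2000.0, 50.0, size=(64, 64))
slope, curvature = slope_and_curvature(dem, cell_size=100.0)
print(slope.mean(), curvature.std())
```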

  3. Recapturing Quantitative Biology.

    ERIC Educational Resources Information Center

    Pernezny, Ken; And Others

    1996-01-01

    Presents a classroom activity on estimating animal populations. Uses shoe boxes and candies to emphasize the importance of mathematics in biology while introducing the methods of quantitative ecology. (JRH)

  4. On Quantitative Rorschach Scales.

    ERIC Educational Resources Information Center

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  5. Quantitative film radiography

    SciTech Connect

    Devine, G.; Dobie, D.; Fugina, J.; Hernandez, J.; Logan, C.; Mohr, P.; Moss, R.; Schumacher, B.; Updike, E.; Weirup, D.

    1991-02-26

    We have developed a system of quantitative radiography in order to produce quantitative images displaying homogeneity of parts. The materials that we characterize are synthetic composites and may contain important subtle density variations not discernible by examining a raw film x-radiograph. In order to quantitatively interpret film radiographs, it is necessary to digitize, interpret, and display the images. Our integrated system of quantitative radiography displays accurate, high-resolution pseudo-color images in units of density. We characterize approximately 10,000 parts per year in hundreds of different configurations and compositions with this system. This report discusses: the method; film processor monitoring and control; verifying film and processor performance; and correction of scatter effects.

  6. Additive usage levels.

    PubMed

    Langlais, R

    1996-01-01

    With the adoption of the European Parliament and Council Directives on sweeteners, colours and miscellaneous additives, the Commission is now embarking on the project of coordinating the activities of the European Union Member States in the collection of the data that are to make up the report on food additive intake requested by the European Parliament. This presentation looks at the inventory of available sources on additive use levels and concludes that for the time being national legislation is still the best source of information considering that the directives have yet to be transposed into national legislation. Furthermore, this presentation covers the correlation of the food categories as found in the additives directives with those used by national consumption surveys and finds that in a number of instances this correlation still leaves a lot to be desired. The intake of additives via food ingestion and the intake of substances which are chemically identical to additives but which occur naturally in fruits and vegetables is found in a number of cases to be higher than the intake of additives added during the manufacture of foodstuffs. While the difficulties are recognized in contributing to the compilation of food additive intake data, industry as a whole, i.e. the food manufacturing and food additive manufacturing industries, are confident that in a concerted effort, use data on food additives by industry can be made available. Lastly, the paper points out that with the transposition of the additives directives into national legislation and the time by which the food industry will be able to make use of the new food legislative environment, several years will still go by; food additives use data by the food industry will thus have to be reviewed at the beginning of the next century. PMID:8792135

  7. Quantitative receptor autoradiography

    SciTech Connect

    Boast, C.A.; Snowhill, E.W.; Altar, C.A.

    1986-01-01

    Quantitative receptor autoradiography addresses the topic of technical and scientific advances in the sphere of quantitative autoradiography. The volume opens with an overview of the field from a historical and critical perspective. Following is a detailed discussion of in vitro data obtained from a variety of neurotransmitter systems. The next section explores applications of autoradiography, and the final two chapters consider experimental models. Methodological considerations are emphasized, including the use of computers for image analysis.

  8. An additional middle cuneiform?

    PubMed Central

    Brookes-Fazakerley, S.D.; Jackson, G.E.; Platt, S.R.

    2015-01-01

    Additional cuneiform bones of the foot have been described in reference to the medial bipartite cuneiform or as small accessory ossicles. An additional middle cuneiform has not been previously documented. We present the case of a patient with an additional ossicle that has the appearance and location of an additional middle cuneiform. Recognizing such an anatomical anomaly is essential for ruling out second metatarsal base or middle cuneiform fractures and for the preoperative planning of arthrodesis or open reduction and internal fixation procedures in this anatomical location. PMID:26224890

  9. Quantitative DNA Fiber Mapping

    SciTech Connect

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.

  10. Quantitative phase imaging with programmable illumination

    NASA Astrophysics Data System (ADS)

    Kim, Taewoo; Edwards, Chris; Goddard, Lynford L.; Popescu, Gabriel

    2015-03-01

    Even with the recent rapid advances in the field of microscopy, non-laser light sources used for light microscopy have not developed significantly. Most current optical microscopy systems use halogen bulbs as their light sources to provide a white-light illumination. Due to the confined shapes and finite filament size of the bulbs, little room is available for modification in the light source, which prevents further advances in microscopy. By contrast, commercial projectors provide a high power output that is comparable to halogen lamps while allowing for great flexibility in patterning the illumination. In addition to their high brightness, the illumination can be patterned to have arbitrary spatial and spectral distributions. Therefore, commercial projectors can be adopted as a flexible light source for an optical microscope by careful alignment to the existing optical path. In this study, we employed a commercial projector as the light source for a quantitative phase imaging system called spatial light interference microscopy (SLIM), which is an add-on module for an existing phase contrast (PC) microscope. By replacing the ring illumination of PC with a ring-shaped pattern projected onto the condenser plane, we were able to recover the same result as the original SLIM. Furthermore, the ring illumination is replaced with multiple dots aligned along the same ring to minimize the overlap between the scattered and unscattered fields. This new method minimizes the halo artifact of the imaging system, which allows for a halo-free high-resolution quantitative phase microscopy system.

  11. ImatraNMR: novel software for batch integration and analysis of quantitative NMR spectra.

    PubMed

    Mäkelä, A V; Heikkilä, O; Kilpeläinen, I; Heikkinen, S

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for analysis of various mixtures. Recently, in addition to traditional quantitative 1D (1)H and (13)C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request. PMID:21705250
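    Batch integration over pre-saved regions, the kind of task ImatraNMR automates, reduces to summing intensities inside each ppm window for every spectrum and writing the table out as CSV. The sketch below uses hypothetical arrays and is not ImatraNMR's own code.

```python
import csv
import numpy as np

def integrate_regions(ppm, intensity, regions):
    """Sum spectral intensity inside each (low, high) ppm window."""
    return [float(intensity[(ppm >= lo) & (ppm <= hi)].sum()) for lo, hi in regions]

# Hypothetical batch of 1D spectra on a common ppm axis.
ppm = np.linspace(0, 10, 2000)
spectra = {f"sample_{i}": np.random.default_rng(i).random(2000) for i in range(3)}
regions = [(0.8, 1.2), (3.2, 3.6), (7.0, 7.4)]   # pre-saved integration areas

with open("integrals.csv", "w", newline="") as fh:
    writer = csv.writer(fh)
    writer.writerow(["spectrum"] + [f"{lo}-{hi} ppm" for lo, hi in regions])
    for name, inten in spectra.items():
        writer.writerow([name] + integrate_regions(ppm, inten, regions))
```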

  12. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    NASA Astrophysics Data System (ADS)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain actual usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs, such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, currently it has only been tested with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.

  13. Carbamate deposit control additives

    SciTech Connect

    Honnen, L.R.; Lewis, R.A.

    1980-11-25

    Deposit control additives for internal combustion engines are provided which maintain cleanliness of intake systems without contributing to combustion chamber deposits. The additives are poly(oxyalkylene) carbamates comprising a hydrocarbyloxy-terminated poly(oxyalkylene) chain of 2-5 carbon oxyalkylene units bonded through an oxycarbonyl group to a nitrogen atom of ethylenediamine.

  14. The late addition of core lipids to nascent apolipoprotein B100, resulting in the assembly and secretion of triglyceride-rich lipoproteins, is independent of both microsomal triglyceride transfer protein activity and new triglyceride synthesis.

    PubMed

    Pan, Meihui; Liang Js, Jun-shan; Fisher, Edward A; Ginsberg, Henry N

    2002-02-01

    Although microsomal triglyceride transfer protein (MTP) and newly synthesized triglyceride (TG) are critical for co-translational targeting of apolipoprotein B (apoB100) to lipoprotein assembly in hepatoma cell lines, their roles in the later stages of lipoprotein assembly remain unclear. Using N-acetyl-Leu-Leu-norleucinal to prevent proteasomal degradation, HepG2 cells were radiolabeled and chased for 0-90 min (chase I). The medium was changed and cells chased for another 150 min (chase II) in the absence (control) or presence of Pfizer MTP inhibitor CP-10447 (CP). As chase I was extended, inhibition of apoB100 secretion by CP during chase II decreased from 75.9% to only 15% of control (no CP during chase II). Additional studies were conducted in which chase I was either 0 or 90 min, and chase II was in the presence of [(3)H]glycerol and either BSA (control), CP (inhibits both MTP activity and TG synthesis), BMS-1976360-1 (BMS) (inhibits only MTP activity), or triacsin C (TC) (inhibits only TG synthesis). When chase I was 0 min, CP, BMS, and TC reduced apoB100 secretion during chase II by 75.3, 73.9, and 53.9%. However, when chase I was 90 min, those agents reduced apoB100 secretion during chase II by only 16.0, 19.2, and 13.9%. Of note, all three inhibited secretion of newly synthesized TG during chase II by 80, 80, and 40%, whether chase I was 0 or 90 min. In both HepG2 cells and McA-RH7777 cells, if chase I was at least 60 min, inhibition of TG synthesis and/or MTP activity did not affect the density of secreted apoB100-lipoproteins under basal conditions. Oleic acid increased secretion of TG-enriched apoB100-lipoproteins similarly in the absence or presence of either of CP, BMS, or TC. We conclude that neither MTP nor newly synthesized TG is necessary for the later stages of apoB100-lipoprotein assembly and secretion in either HepG2 or McA-RH7777 cells. PMID:11704664

  15. Fast quantitation of 5-hydroxymethylfurfural in honey using planar chromatography.

    PubMed

    Chernetsova, Elena S; Revelsky, Igor A; Morlock, Gertrud E

    2011-07-01

    An approach for rapid quantitation of 5-hydroxymethylfurfural (HMF) in honey using planar chromatography is suggested for the first time. In high-performance thin-layer chromatography (HPTLC) the migration time is approximately 5 min. Detection is performed by absorbance measurement at 290 nm. Polynomial calibration in the matrix over a range of 1:80 showed correlation coefficients, r, of ≥ 0.9997 for peak areas and ≥ 0.9996 for peak heights. Repeatability in the matrix confirmed the suitability of HPTLC-UV for quantitation of HMF in honey. The relative standard deviation (RSD, %, n = 6) of HMF at 10 ng/band was 2.9% (peak height) and 5.2% (peak area); it was 0.6% and 1.0%, respectively, at 100 ng/band. Other possible detection modes, for example fluorescence measurement after post-chromatographic derivatization and mass spectrometric detection, were also evaluated; HPTLC-MS coupling can be used as an additional tool when it is necessary to confirm the results of prior quantitation by HPTLC-UV. The confirmation is provided by monitoring the HMF sodium adduct [M + Na](+) at m/z 149 followed by quantitation in TIC or SIM mode. Detection limits for HPTLC-UV, HPTLC-MS (TIC), and HPTLC-MS (SIM) were 0.8 ng/band, 4 ng/band, and 0.9 ng/band, respectively. If 12 μL honey solution was applied to an HPTLC plate, the respective detection limits for HMF in honey corresponded to 0.6 mg kg(-1). Thus, the developed method was highly suitable for quantitation of HMF in honey at the strictest regulated level of 15 mg kg(-1). Comparison of HPTLC-UV detection with HPTLC-MS showed findings were comparable, with a mean deviation of 5.1 mg kg(-1) for quantitation in SIM mode and 6.1 mg kg(-1) for quantitation in TIC mode. The mean deviation of the HPTLC method compared with the HPLC method was 0.9 mg kg(-1) HMF in honey. Re-evaluation of the same HPTLC plate after one month showed a deviation of 0.5 mg kg(-1) HMF in honey. It was demonstrated that the proposed

  16. Quantitative Luminescence Imaging System

    SciTech Connect

    Batishko, C.R.; Stahl, K.A.; Fecht, B.A.

    1992-12-31

    The goal of the MEASUREMENT OF CHEMILUMINESCENCE project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  17. Is herpes zoster an additional complication in old age alongside comorbidity and multiple medications? Results of the post hoc analysis of the 12-month longitudinal prospective observational ARIZONA cohort study

    PubMed Central

    Pickering, Gisèle; Gavazzi, Gaëtan; Gaillat, Jacques; Paccalin, Marc; Bloch, Karine; Bouhassira, Didier

    2016-01-01

    Objectives To examine the burden of comorbidity, polypharmacy and herpes zoster (HZ), an infectious disease, and its main complication post-herpetic neuralgia (PHN) in young (50–70 years of age: 70−) and old (≥70 years of age: 70+) patients. Design Post hoc analysis of the results of the 12-month longitudinal prospective multicentre observational ARIZONA cohort study. Settings and participants The study took place in primary care in France from 20 November 2006 to 12 September 2008. Overall, 644 general practitioners (GPs) collected data from 1358 patients aged 50 years or more with acute eruptive HZ. Outcome measures Presence of HZ-related pain or PHN (pain persisting >3 months) was documented at day 0 and at months 3, 6, and 12. To investigate HZ and PHN burden, pain, quality of life (QoL) and mood were self-assessed using validated questionnaires (Zoster Brief Pain Inventory, 12-item Short-Form health survey and Hospital Anxiety and Depression Scale, respectively). Results As compared with younger patients, older patients more frequently presented with comorbidities, more frequently took analgesics and had poorer response on all questionnaires, indicating greater burden, at inclusion. Analgesics were more frequently prescribed to relieve acute pain or PHN in 70+ than 70− patients. Despite higher levels of medication prescription, poorer pain relief and poorer response to all questionnaires were reported in 70+ than 70− patients. Conclusions Occurrence of HZ and progression to PHN adds extra burden on top of pharmacological treatment and impaired quality of life, especially in older patients who already have health problems to cope with in everyday life. PMID:26892790

  18. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  19. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  20. Quantitative approaches to computational vaccinology.

    PubMed

    Doytchinova, Irini A; Flower, Darren R

    2002-06-01

    This article reviews the newly released JenPep database and two new powerful techniques for T-cell epitope prediction: (i) the additive method; and (ii) a 3D-Quantitative Structure Activity Relationships (3D-QSAR) method, based on Comparative Molecular Similarity Indices Analysis (CoMSIA). The JenPep database is a family of relational databases supporting the growing need of immunoinformaticians for quantitative data on peptide binding to major histocompatibility complexes and to the Transporters associated with Antigen Processing (TAP). It also contains an annotated list of T-cell epitopes. The database is available free via the Internet (http://www.jenner.ac.uk/JenPep). The additive prediction method is based on the assumption that the binding affinity of a peptide depends on the contributions from each amino acid as well as on the interactions between the adjacent and every second side-chain. In the 3D-QSAR approach, the influence of five physicochemical properties (steric bulk, electrostatic potential, local hydrophobicity, hydrogen-bond donor and hydrogen-bond acceptor abilities) on the affinity of peptides binding to MHC molecules was considered. Both methods were exemplified through their application to the well-studied problem of peptides binding to the human class I MHC molecule HLA-A*0201. PMID:12067414
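    The additive method as described here scores a peptide as the sum of position-specific amino-acid contributions plus terms for adjacent and every-second side-chain interactions. The toy sketch below uses hypothetical coefficient tables purely to illustrate that scoring structure; a real model would fit these coefficients to measured affinities.

```python
# Hypothetical coefficient tables (illustrative values, not trained parameters).
position_contrib = {(1, "L"): 0.8, (2, "M"): 0.5, (3, "F"): 0.3}  # (position, residue) -> contribution
adjacent_contrib = {(1, "L", "M"): 0.1}                            # interaction of side chains at i, i+1
skip_one_contrib = {(1, "L", "F"): -0.05}                          # interaction of side chains at i, i+2

def additive_score(peptide):
    """Sum of per-position contributions plus adjacent and every-second interaction terms."""
    score = sum(position_contrib.get((i + 1, aa), 0.0) for i, aa in enumerate(peptide))
    score += sum(adjacent_contrib.get((i + 1, peptide[i], peptide[i + 1]), 0.0)
                 for i in range(len(peptide) - 1))
    score += sum(skip_one_contrib.get((i + 1, peptide[i], peptide[i + 2]), 0.0)
                 for i in range(len(peptide) - 2))
    return score

print(additive_score("LMF"))   # 0.8 + 0.5 + 0.3 + 0.1 - 0.05 = 1.65
```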

  1. Microbial phytase addition resulted in a greater increase in phosphorus digestibility in dry-fed compared with liquid-fed non-heat-treated wheat-barley-maize diets for pigs.

    PubMed

    Blaabjerg, K; Thomassen, A-M; Poulsen, H D

    2015-02-01

    The objective was to evaluate the effect of microbial phytase (1250 FTU/kg diet with 88% dry matter (DM)) on apparent total tract digestibility (ATTD) of phosphorus (P) in pigs fed a dry or soaked diet. Twenty-four pigs (65±3 kg) from six litters were used. Pigs were housed in metabolism crates and fed one of four diets for 12 days; 5 days for adaptation and 7 days for total, but separate collection of feces and urine. The basal diet was composed of wheat, barley, maize, soybean meal and no mineral phosphate. Dietary treatments were: basal dry-fed diet (BDD), BDD with microbial phytase (BDD+phy), BDD soaked for 24 h at 20°C before feeding (BDS) and BDS with microbial phytase (BDS+phy). Supplementation of microbial phytase increased ATTD of DM and crude protein (N×6.25) by 2 and 3 percentage units (P<0.0001; P<0.001), respectively. The ATTD of P was affected by the interaction between microbial phytase and soaking (P=0.02). This was due to a greater increase in ATTD of P by soaking of the diet containing solely plant phytase compared with the diet supplemented with microbial phytase: 35%, 65%, 44% and 68% for BDD, BDD+phy, BSD and BSD+phy, respectively. As such, supplementation of microbial phytase increased ATTD of P in the dry-fed diet, but not in the soaked diet. The higher ATTD of P for BDS compared with BDD resulted from the degradation of 54% of the phytate in BDS by wheat and barley phytases during soaking. On the other hand, soaking of BDS+phy did not increase ATTD of P significantly compared with BDD+phy despite that 76% of the phytate in BDS+phy was degraded before feeding. In conclusion, soaking of BDS containing solely plant phytase provided a great potential for increasing ATTD of P. However, this potential was not present when microbial phytase (1250 FTU/kg diet) was supplemented, most likely because soaking of BDS+phy for 24 h at 20°C did not result in a complete degradation of phytate before feeding. PMID:25245085
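    Apparent total tract digestibility is calculated from total nutrient intake and fecal excretion over the collection period. A minimal worked sketch with hypothetical values (not the study's data) follows.

```python
def attd_percent(nutrient_intake_g, fecal_output_g):
    """Apparent total tract digestibility = (intake - fecal excretion) / intake * 100."""
    return (nutrient_intake_g - fecal_output_g) / nutrient_intake_g * 100

# Hypothetical 7-day collection: 35 g phosphorus ingested, 12 g recovered in feces.
print(f"ATTD of P: {attd_percent(35.0, 12.0):.0f} %")   # ~66 %
```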

  2. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  3. Additional Types of Neuropathy

    MedlinePlus

    Charcot's Joint, also called neuropathic arthropathy, ... can stop bone destruction and aid healing. Cranial neuropathy affects the 12 pairs of nerves ...

  4. Electric utility use of fireside additives. Final report

    SciTech Connect

    Locklin, D.W.; Krause, H.H.; Anson, D.; Reid, W.

    1980-01-01

    Fireside additives have been used or proposed for use in fossil-fired utility boilers to combat a number of problems related to boiler performance and reliability. These problems include corrosion, fouling, superheat control, and acidic emissions. Fuel additives and other fireside additives have been used mainly with oil firing; however, there is growing experience with additives in coal-firing, especially for flyash conditioning to improve the performance of electrostatic precipitators. In decisions regarding the selection and use of additives, utilities have had to rely extensively on empiricism, due partly to our incomplete understanding of processes involved and partly to the limited amount of quantitative data. The study reported here was sponsored by the Electric Power Research Institute to assemble and analyze pertinent operating experience and to recommend guidelines for utility decisions on the use of additives. This report describes the combined results of the state-of-the-art review of technical literature and a special survey of utility experience. A total of 38 utilities participated in the survey, providing information on trials conducted on 104 units in 93 different plants. Altogether, 445 separate trials were reported, each representing a unit/additive/fuel combination. 90 different additive formulations, both pure compounds and proprietary products, were categorized into 37 generic classes according to their chemical constituents, and the results of the survey are presented by these generic classes. This report is organized according to the operating problems for which fireside additives are used. Guidelines are presented for utility use in additive selection and in planning additive trials.

  5. Comparative study of different venous reflux duplex quantitation parameters.

    PubMed

    Valentín, L I; Valentín, W H

    1999-09-01

    The objective of this study was to compare different quantitation parameters of venous reflux by duplex scan in different venous disease manifestations. Duplex scan is a new modality to quantify venous reflux. Several studies propose different parameters. In addition, there is controversy about the importance of deep and superficial involvement in different disease manifestations. It is not clear whether there is an increased venous reflux associated with varied clinical stages. Venous conditions were classified in seven stages and their differences for several quantitation variables studied. Most quantitation variables, such as average and peak velocity, average and peak flow, and reflux volume disclosed significantly increased reflux from normal, pain only, and edema group to varicose vein, with or without edema, to lipodermatosclerosis and ulcer groups at every location in the lower extremity. Reflux time was not as consistent as other variables. Totalization of the results of every parameter for the whole extremity points to an increased reflux from pain only to edema and from lipodermatosclerosis to ulcer group. Chronic edema is not usually associated with increased venous reflux. The greater saphenous vein (superficial system) seems to be the main contributor to reflux in all stages of disease. Different quantitation methods of venous reflux are equivalent. Increased deep and superficial reflux and its totalization are associated with a more advanced disease stage. Reflux time may be the least useful variable. Chronic edema is frequently not associated with venous reflux. Greater saphenectomy may be the most useful intervention, even in the presence of deep vein reflux. PMID:10496498

  6. Quantitative statistical methods for image quality assessment.

    PubMed

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  7. Quantitative Statistical Methods for Image Quality Assessment

    PubMed Central

    Dutta, Joyita; Ahn, Sangtae; Li, Quanzheng

    2013-01-01

    Quantitative measures of image quality and reliability are critical for both qualitative interpretation and quantitative analysis of medical images. While, in theory, it is possible to analyze reconstructed images by means of Monte Carlo simulations using a large number of noise realizations, the associated computational burden makes this approach impractical. Additionally, this approach is less meaningful in clinical scenarios, where multiple noise realizations are generally unavailable. The practical alternative is to compute closed-form analytical expressions for image quality measures. The objective of this paper is to review statistical analysis techniques that enable us to compute two key metrics: resolution (determined from the local impulse response) and covariance. The underlying methods include fixed-point approaches, which compute these metrics at a fixed point (the unique and stable solution) independent of the iterative algorithm employed, and iteration-based approaches, which yield results that are dependent on the algorithm, initialization, and number of iterations. We also explore extensions of some of these methods to a range of special contexts, including dynamic and motion-compensated image reconstruction. While most of the discussed techniques were developed for emission tomography, the general methods are extensible to other imaging modalities as well. In addition to enabling image characterization, these analysis techniques allow us to control and enhance imaging system performance. We review practical applications where performance improvement is achieved by applying these ideas to the contexts of both hardware (optimizing scanner design) and image reconstruction (designing regularization functions that produce uniform resolution or maximize task-specific figures of merit). PMID:24312148

  8. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  9. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using the metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist such as Coordinate Measurement Machines (CMM), Laser Scanners, Structured Light Scanning Systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation will document the development of a process for real-time dimensional inspection technique and digital quality record of the additive manufacturing process using Infrared camera imaging and processing techniques.

  10. Quantitative Decision Support Requires Quantitative User Guidance

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? or the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of the (each) climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output

  11. Additives in plastics.

    PubMed Central

    Deanin, R D

    1975-01-01

    The polymers used in plastics are generally harmless. However, they are rarely used in pure form. In almost all commercial plastics, they are "compounded" with monomeric ingredients to improve their processing and end-use performance. In order of total volume used, these monomeric additives may be classified as follows: reinforcing fibers, fillers, and coupling agents; plasticizers; colorants; stabilizers (halogen stabilizers, antioxidants, ultraviolet absorbers, and biological preservatives); processing aids (lubricants, others, and flow controls); flame retardants, peroxides; and antistats. Some information is already available, and much more is needed, on potential toxicity and safe handling of these additives during processing and manufacture of plastics products. PMID:1175566

  12. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
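
    The hierarchical weighted average described above rolls subjective criterion scores up through weighted levels into a single figure of merit for each alternative. The Python sketch below is a minimal, hypothetical illustration of that general idea; the criteria, weights, and scores are invented and do not come from the cited report.

```python
# Minimal sketch of a hierarchical weighted-average evaluation of alternatives.
# Criteria, weights, and scores are hypothetical illustrations only.

criteria = {
    "performance": {"weight": 0.5, "sub": {"throughput": 0.6, "latency": 0.4}},
    "cost":        {"weight": 0.3, "sub": {"hardware": 0.7, "maintenance": 0.3}},
    "risk":        {"weight": 0.2, "sub": {"maturity": 1.0}},
}

# Scores (0-10) assigned to each alternative for each leaf criterion.
scores = {
    "design_A": {"throughput": 8, "latency": 6, "hardware": 5, "maintenance": 7, "maturity": 9},
    "design_B": {"throughput": 6, "latency": 9, "hardware": 8, "maintenance": 6, "maturity": 7},
}

def hierarchical_weighted_average(alt_scores):
    """Roll leaf scores up through the two-level criterion hierarchy."""
    total = 0.0
    for crit in criteria.values():
        sub_total = sum(w * alt_scores[name] for name, w in crit["sub"].items())
        total += crit["weight"] * sub_total
    return total

for alt, s in scores.items():
    print(alt, round(hierarchical_weighted_average(s), 2))
```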

  13. Nuclear medicine and imaging research: Quantitative studies in radiopharmaceutical science

    SciTech Connect

    Copper, M.; Beck, R.N.

    1991-06-01

    During the past three years the program has undergone a substantial revitalization. There has been no significant change in the scientific direction of this grant, in which emphasis continues to be placed on developing new or improved methods of obtaining quantitative data from radiotracer imaging studies. However, considerable scientific progress has been made in the three areas of interest: Radiochemistry, Quantitative Methodologies, and Experimental Methods and Feasibility Studies, resulting in a sharper focus of perspective and improved integration of the overall scientific effort. Changes in Faculty and staff, including development of new collaborations, have contributed to this, as has acquisition of additional and new equipment and renovations and expansion of the core facilities. 121 refs., 30 figs., 2 tabs.

  14. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules maintained orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484

  15. Non-interferometric quantitative phase imaging of yeast cells

    NASA Astrophysics Data System (ADS)

    Poola, Praveen K.; Pandiyan, Vimal Prabhu; John, Renu

    2015-12-01

    Real-time imaging of live cells is quite difficult without the addition of external contrast agents. Various methods for quantitative phase imaging of living cells have been proposed, such as digital holographic microscopy and diffraction phase microscopy. In this paper, we report theoretical and experimental results of quantitative phase imaging of live yeast cells with nanometric precision using the transport of intensity equation (TIE). We demonstrate nanometric depth sensitivity in imaging live yeast cells using this technique. This technique, being non-interferometric, does not need any coherent light sources and images can be captured through a regular bright-field microscope. This real-time imaging technique would deliver the depth or 3-D volume information of cells and is highly promising in real-time digital pathology applications, screening of pathogens and staging of diseases like malaria as it does not need any preprocessing of samples.
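
    The transport of intensity equation relates the through-focus change in intensity to the Laplacian of the phase, so the phase can be recovered from two slightly defocused bright-field images. The sketch below is a generic FFT-based TIE solver under a uniform in-focus intensity assumption; it is an illustration of the technique, not the authors' implementation, and all parameter names and values are assumptions.

```python
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, pixel_size, eps=1e-6):
    """Recover phase from two defocused intensity images via the
    transport of intensity equation, using an FFT Poisson solver.
    Assumes a nearly uniform in-focus intensity (illustrative sketch)."""
    k = 2 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2 * dz)          # axial intensity derivative
    I0 = 0.5 * (I_plus + I_minus)                 # proxy for in-focus intensity
    ny, nx = dIdz.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    q2 = (2 * np.pi) ** 2 * (FX ** 2 + FY ** 2)   # |q|^2 for the Laplacian
    # TIE with uniform intensity: laplacian(phi) = -(k / I0) * dI/dz
    lap_phi_hat = np.fft.fft2(-k * dIdz / np.maximum(I0, eps))
    phi_hat = -lap_phi_hat / np.maximum(q2, eps)  # invert the Laplacian
    phi_hat[0, 0] = 0.0                           # undefined DC term set to zero
    return np.real(np.fft.ifft2(phi_hat))

# Hypothetical usage with synthetic defocused frames (values are placeholders).
rng = np.random.default_rng(0)
I_minus = 1.0 + 0.01 * rng.standard_normal((256, 256))
I_plus = 1.0 + 0.01 * rng.standard_normal((256, 256))
phase = tie_phase(I_minus, I_plus, dz=2e-6, wavelength=550e-9, pixel_size=0.2e-6)
print(phase.shape)
```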

  16. Dual function microscope for quantitative DIC and birefringence imaging

    NASA Astrophysics Data System (ADS)

    Li, Chengshuai; Zhu, Yizheng

    2016-03-01

    A spectral multiplexing interferometry (SXI) method is presented for integrated birefringence and phase gradient measurement on label-free biological specimens. With SXI, the retardation and orientation of sample birefringence are simultaneously encoded onto two separate spectral carrier waves, generated by a crystal retarder oriented at a specific angle. Thus sufficient information for birefringence determination can be obtained from a single interference spectrum, eliminating the need for multiple acquisitions with mechanical rotation or electrical modulation. In addition, with the insertion of a Nomarski prism, the setup can then acquire quantitative differential interference contrast images. Red blood cells infected by malaria parasites are imaged for birefringence retardation as well as phase gradient. The results demonstrate that the SXI approach can achieve both quantitative phase imaging and birefringence imaging with a single, high-sensitivity system.

  17. The Changing Context of Critical Quantitative Inquiry

    ERIC Educational Resources Information Center

    Rios-Aguilar, Cecilia

    2014-01-01

    The author provides a framework to help scholars in the field of higher education to be critical. Additionally, the author reflects and comments on the chapters included in this special volume. Finally, this chapter ends with a discussion of the opportunities and challenges of critical quantitative inquiry.

  18. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  19. Quantitative Simulation Games

    NASA Astrophysics Data System (ADS)

    Černý, Pavol; Henzinger, Thomas A.; Radhakrishna, Arjun

    While a boolean notion of correctness is given by a preorder on systems and properties, a quantitative notion of correctness is defined by a distance function on systems and properties, where the distance between a system and a property provides a measure of “fit” or “desirability.” In this article, we explore several ways in which the simulation preorder can be generalized to a distance function. This is done by equipping the classical simulation game between a system and a property with quantitative objectives. In particular, for systems that satisfy a property, a quantitative simulation game can measure the “robustness” of the satisfaction, that is, how much the system can deviate from its nominal behavior while still satisfying the property. For systems that violate a property, a quantitative simulation game can measure the “seriousness” of the violation, that is, how much the property has to be modified so that it is satisfied by the system. These distances can be computed in polynomial time, since the computation reduces to the value problem in limit average games with constant weights. Finally, we demonstrate how the robustness distance can be used to measure how many transmission errors are tolerated by error correcting codes.

  20. Additional results on orbits of Hilda-type asteroids

    NASA Astrophysics Data System (ADS)

    Schubart, J.

    1991-01-01

    The long period evolution of the Hilda-type orbits is studied by numerical integration. Three characteristic parameters are derived for Hildas numbered during the 1982-89 period. The distribution of orbits and subgroups of orbits is considered with respect to these parameters. Special attention is given to low-eccentricity orbits and to the observation conditions. The numerical integrations depend on a model of the forces due to Jupiter and Saturn.

  1. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-01

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: the standard addition, the background subtraction, the surrogate matrix, and the surrogate analyte methods. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and the surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. However, in the surrogate matrix approach, various matrices such as artificial, stripped, and neat matrices are used as surrogate matrices for the actual matrix of study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantify endogenous compounds and, regardless of which approach is followed, it has to be shown that none of the validation criteria have been compromised due to the indirect analyses. PMID
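
    Of the four approaches listed, the standard addition method has a particularly compact numerical form: known amounts of analyte are spiked into aliquots of the study sample, a response line is fitted, and the endogenous concentration is read from the magnitude of the x-intercept. The sketch below illustrates that calculation with made-up numbers; it is not taken from the cited review.

```python
import numpy as np

# Hypothetical standard-addition data: spiked concentrations (ng/mL) and the
# corresponding LC-MS/MS peak-area responses measured in the same matrix.
added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
response = np.array([120.0, 180.0, 242.0, 365.0, 608.0])

# Fit response = slope * added + intercept; the endogenous concentration is the
# magnitude of the x-intercept, i.e. intercept / slope.
slope, intercept = np.polyfit(added, response, 1)
endogenous = intercept / slope
print(f"estimated endogenous concentration: {endogenous:.1f} ng/mL")
```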

  2. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  3. Estimation of Variance Components of Quantitative Traits in Inbred Populations

    PubMed Central

    Abney, Mark; McPeek, Mary Sara; Ober, Carole

    2000-01-01

    Summary Use of variance-component estimation for mapping of quantitative-trait loci in humans is a subject of great current interest. When only trait values, not genotypic information, are considered, variance-component estimation can also be used to estimate heritability of a quantitative trait. Inbred pedigrees present special challenges for variance-component estimation. First, there are more variance components to be estimated in the inbred case, even for a relatively simple model including additive, dominance, and environmental effects. Second, more identity coefficients need to be calculated from an inbred pedigree in order to perform the estimation, and these are computationally more difficult to obtain in the inbred than in the outbred case. As a result, inbreeding effects have generally been ignored in practice. We describe here the calculation of identity coefficients and estimation of variance components of quantitative traits in large inbred pedigrees, using the example of HDL in the Hutterites. We use a multivariate normal model for the genetic effects, extending the central-limit theorem of Lange to allow for both inbreeding and dominance under the assumptions of our variance-component model. We use simulated examples to give an indication of under what conditions one has the power to detect the additional variance components and to examine their impact on variance-component estimation. We discuss the implications for mapping and heritability estimation by use of variance components in inbred populations. PMID:10677322

  4. Biobased lubricant additives

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fully biobased lubricants are those formulated using all biobased ingredients, i.e. biobased base oils and biobased additives. Such formulations provide the maximum environmental, safety, and economic benefits expected from a biobased product. Currently, there are a number of biobased base oils that...

  5. Multifunctional fuel additives

    SciTech Connect

    Baillargeon, D.J.; Cardis, A.B.; Heck, D.B.

    1991-03-26

    This paper discusses a composition comprising a major amount of a liquid hydrocarbyl fuel and a minor low-temperature flow properties improving amount of an additive product of the reaction of a suitable diol and product of a benzophenone tetracarboxylic dianhydride and a long-chain hydrocarbyl aminoalcohol.

  6. Localization of load sensitivity of working memory storage: Quantitatively and qualitatively discrepant results yielded by single-subject and group-averaged approaches to fMRI group analysis

    PubMed Central

    Feredoes, Eva; Postle, Bradley R.

    2007-01-01

    The impetus for the present report is the evaluation of competing claims of two classes of working memory models: Memory systems models hold working memory to be supported by a network of prefrontal cortex (PFC)-based domain-specific buffers that act as workspaces for the storage and manipulation of information; emergent processes models, in contrast, hold that the contributions of PFC to working memory do not include the temporary storage of information. Empirically, each of these perspectives is supported by seemingly mutually incompatible results from functional magnetic resonance imaging (fMRI) studies that either do or do not find evidence for delay-period sensitivity to memory load, an index of storage, in PFC. We hypothesized that these empirical discrepancies may be due, at least in part, to methodological factors, because studies reporting delay-period load sensitivity in PFC typically employ spatially normalized group-averaged analyses, whereas studies that do not find PFC load sensitivity typically use a single-subject “case-study” approach. Experiment 1 applied these two analysis approaches to the same data set, and the results were consistent with this hypothesis. Experiment 2 evaluated one characteristic of the single-subject results from Experiment 1 – considerable topographical variability across subjects – by evaluating its test-retest reliability with a new group of subjects. Each subject was scanned twice, and the results indicated that, for each of several contrasts, test-retest reliability was significantly greater than chance. Together, these results raise the possibility that the brain bases of delay-period load sensitivity may be characterized by considerable intersubject topographical variability. Our results highlight how the selection of fMRI analysis methods can produce discrepant results, each of which is consistent with different, incompatible theoretical interpretations. PMID:17296315

  7. Quantitative characterisation of sedimentary grains

    NASA Astrophysics Data System (ADS)

    Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.

    2016-04-01

    Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed, producing high-quality results that support the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments from the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till from County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
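
    Several of the shape parameters named above have standard closed-form definitions once a grain boundary has been traced. The sketch below computes polygon area and perimeter with the shoelace formula and derives circularity and equivalent diameter; it is a generic illustration, not the authors' Mathematica package, and the boundary coordinates are hypothetical.

```python
import numpy as np

def circularity(area, perimeter):
    """Standard isoperimetric circularity: 1.0 for a circle, < 1 otherwise."""
    return 4 * np.pi * area / perimeter ** 2

def equivalent_diameter(area):
    """Diameter of the circle with the same area as the grain."""
    return 2 * np.sqrt(area / np.pi)

# Hypothetical traced grain boundary (closed polygon, pixel coordinates).
x = np.array([0.0, 4.0, 6.0, 5.0, 2.0])
y = np.array([0.0, 1.0, 4.0, 6.0, 5.0])

# Shoelace formula for the polygon area; summed edge lengths for the perimeter.
area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
perimeter = np.sum(np.hypot(np.diff(np.append(x, x[0])), np.diff(np.append(y, y[0]))))

print(f"circularity = {circularity(area, perimeter):.3f}")
print(f"equivalent diameter = {equivalent_diameter(area):.2f} px")
```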

  8. Interpretation of Quantitative Shotgun Proteomic Data.

    PubMed

    Aasebø, Elise; Berven, Frode S; Selheim, Frode; Barsnes, Harald; Vaudel, Marc

    2016-01-01

    In quantitative proteomics, large lists of identified and quantified proteins are used to answer biological questions in a systemic approach. However, working with such extensive datasets can be challenging, especially when complex experimental designs are involved. Here, we demonstrate how to post-process large quantitative datasets, detect proteins of interest, and annotate the data with biological knowledge. The protocol presented can be achieved without advanced computational knowledge thanks to the user-friendly Perseus interface (available from the MaxQuant website, www.maxquant.org ). Various visualization techniques facilitating the interpretation of quantitative results in complex biological systems are also highlighted. PMID:26700055

  9. Towards a Quantitative OCT Image Analysis

    PubMed Central

    Garcia Garrido, Marina; Beck, Susanne C.; Mühlfriedel, Regine; Julien, Sylvie; Schraermeyer, Ulrich; Seeliger, Mathias W.

    2014-01-01

    Background Optical coherence tomography (OCT) is an invaluable diagnostic tool for the detection and follow-up of retinal pathology in patients and experimental disease models. However, as morphological structures and layering in health as well as their alterations in disease are complex, segmentation procedures have not yet reached a satisfactory level of performance. Therefore, raw images and qualitative data are commonly used in clinical and scientific reports. Here, we assess the value of OCT reflectivity profiles as a basis for a quantitative characterization of the retinal status in a cross-species comparative study. Methods Spectral-Domain Optical Coherence Tomography (OCT), confocal Scanning-Laser Ophthalmoscopy (SLO), and Fluorescein Angiography (FA) were performed in mice (Mus musculus), gerbils (Gerbillus perpadillus), and cynomolgus monkeys (Macaca fascicularis) using the Heidelberg Engineering Spectralis system, and additional SLOs and FAs were obtained with the HRA I (same manufacturer). Reflectivity profiles were extracted from 8-bit greyscale OCT images using the ImageJ software package (http://rsb.info.nih.gov/ij/). Results Reflectivity profiles obtained from OCT scans of all three animal species correlated well with ex vivo histomorphometric data. Each of the retinal layers showed a typical pattern that varied in relative size and degree of reflectivity across species. In general, plexiform layers showed a higher level of reflectivity than nuclear layers. A comparison of reflectivity profiles from specialized retinal regions (e.g. visual streak in gerbils, fovea in non-human primates) with respective regions of human retina revealed multiple similarities. In a model of Retinitis Pigmentosa (RP), the value of reflectivity profiles for the follow-up of therapeutic interventions was demonstrated. Conclusions OCT reflectivity profiles provide a detailed, quantitative description of retinal layers and structures including specialized retinal regions
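
    A reflectivity profile of the kind described above is essentially a depth profile of mean grey value averaged over a lateral window of an OCT B-scan. The abstract credits ImageJ for this step; the NumPy sketch below is only an illustration of the idea, with an assumed array orientation and synthetic data.

```python
import numpy as np

def reflectivity_profile(bscan, col_start, col_end):
    """Average an 8-bit greyscale OCT B-scan over a lateral window of columns,
    yielding one reflectivity value per depth position (row).
    Assumes depth runs along axis 0 and lateral position along axis 1."""
    window = bscan[:, col_start:col_end].astype(float)
    return window.mean(axis=1)

# Hypothetical 496 (depth) x 512 (lateral) B-scan filled with noise for illustration.
rng = np.random.default_rng(0)
bscan = rng.integers(0, 256, size=(496, 512), dtype=np.uint8)
profile = reflectivity_profile(bscan, 200, 300)
print(profile.shape)  # (496,): one mean reflectivity value per depth position
```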

  10. Tackifier for addition polyimides

    NASA Technical Reports Server (NTRS)

    Butler, J. M.; St.clair, T. L.

    1980-01-01

    A modification to the addition polyimide, LaRC-160, was prepared to improve tack and drape and increase prepreg out-time. The essentially solventless, high viscosity laminating resin is synthesized from low cost liquid monomers. The modified version takes advantage of a reactive, liquid plasticizer which is used in place of solvent and helps solve a major problem of maintaining good prepreg tack and drape, or the ability of the prepreg to adhere to adjacent plies and conform to a desired shape during the lay-up process. This alternate solventless approach allows both longer life of the polymer prepreg and the processing of low void laminates. This approach appears to be applicable to all addition polyimide systems.

  11. [Biologically active food additives].

    PubMed

    Velichko, M A; Shevchenko, V P

    1998-07-01

    More than half of the 40 projects for the development of medical science to the year 2000 are connected with biologically active food additives, which have been called "the food of the XXI century" and serve as non-pharmacological means against many diseases. Most of these additives, nutraceuticals and parapharmaceuticals, are intended for the enrichment of food rations of sick or healthy people. The ecologically safest and most effective are combined domestic adaptogens with immunomodulating and antioxidant action that give an anabolic and stimulating effect: "leveton", "phytoton" and "adapton". The MKTs-229 tablets are a means of residue discharge. For atherosclerosis and general adiposity, "tsar tablets" and "aiconol (ikhtien)", based on cod-liver oil, or "splat", made from seaweed (algae), are recommended. All these preparations have been clinically tested and have received hygiene certificates from the Institute of Dietology of the Russian Academy of Medical Science. PMID:9752776

  12. Hydrocarbon fuel additive

    SciTech Connect

    Ambrogio, S.

    1989-02-28

    This patent describes the method of fuel storage or combustion, wherein the fuel supply contains small amounts of water, the step of adding to the fuel supply an additive comprising a blend of a hydrophilic agent chosen from the group of ethylene glycol, n-butyl alcohol, and cellosolve in the range of 22-37% by weight; ethoxylated nonylphenol in the range of 26-35% by weight; nonylphenol polyethylene glycol ether in the range of 32-43% by weight.

  13. Quantitative measurements in capsule endoscopy.

    PubMed

    Keuchel, M; Kurniawan, N; Baltes, P; Bandorski, D; Koulaouzidis, A

    2015-10-01

    This review summarizes several approaches for quantitative measurement in capsule endoscopy. Video capsule endoscopy (VCE) typically provides wireless imaging of the small bowel. Currently, a variety of quantitative measurements are implemented in commercially available hardware/software. The majority are proprietary and hence undisclosed algorithms. Measurement of the amount of luminal contamination allows calculating scores from whole VCE studies. Other scores express the severity of small bowel lesions in Crohn's disease or the degree of villous atrophy in celiac disease. Image processing with numerous algorithms for textural and color feature extraction is a further research focus for automated image analysis. These tools aim to select single images with relevant lesions such as blood, ulcers, polyps and tumors, or to omit images showing only luminal contamination. Analysis of motility patterns, size measurement and determination of capsule localization are additional topics. Non-visual wireless capsules transmitting data acquired with specific sensors from the gastrointestinal (GI) tract are available for clinical routine. This includes pH measurement in the esophagus for the diagnosis of acid gastro-esophageal reflux. A wireless motility capsule provides GI motility analysis on the basis of pH, pressure, and temperature measurement. Electromagnetic tracking of another motility capsule allows visualization of motility. Measurement of substances by GI capsules is of great interest but still at an early stage of development. PMID:26299419

  14. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online. PMID:24729671

  15. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

    In biofluids (e.g., blood plasma) nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled systematic variation of properties using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles with three different surface chemistries and two sizes, as well as four different exposure times (for a total of 24 different samples), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in a distinctive pattern that classifies the nanoparticles based on their surface properties and size. In addition, data from the temporal study indicated that the stable protein 'corona' that was isolated for the quantitative analysis appeared to be formed in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications for predicting nanoparticle biocompatibility.

  16. Energy & Climate: Getting Quantitative

    NASA Astrophysics Data System (ADS)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action, under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published 2nd edition of the author's textbook Energy, Environment, and Climate.
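
    The poster's point is that such claims can be checked with back-of-envelope arithmetic. The sketch below compares the extra fuel energy of an SUV over seven years against the energy penalty of an open refrigerator door over the same period; every input number is an assumption chosen for illustration, not a value from the poster, so the comparison only shows how such a check would be done.

```python
# Back-of-envelope check of the SUV-vs-refrigerator claim quoted above.
# All input numbers are assumptions for illustration, not values from the poster.

GASOLINE_ENERGY_MJ_PER_L = 34.0      # approximate energy content of gasoline

def extra_suv_energy_mj(km_per_year, years, car_l_per_100km, suv_l_per_100km):
    extra_litres = (suv_l_per_100km - car_l_per_100km) / 100.0 * km_per_year * years
    return extra_litres * GASOLINE_ENERGY_MJ_PER_L

def open_fridge_energy_mj(extra_watts, years):
    seconds = years * 365.25 * 24 * 3600
    return extra_watts * seconds / 1e6

suv_penalty = extra_suv_energy_mj(km_per_year=15000, years=7,
                                  car_l_per_100km=7.0, suv_l_per_100km=11.0)
fridge_penalty = open_fridge_energy_mj(extra_watts=100, years=7)
print(f"extra SUV fuel energy over 7 years: {suv_penalty:,.0f} MJ")
print(f"open-door refrigerator penalty over 7 years: {fridge_penalty:,.0f} MJ")
```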

  17. Study on the performance evaluation of quantitative precipitation estimation and quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chang, K.; Suk, M.; cha, J.; Choi, Y.

    2011-12-01

    Rainfall estimation and short-term (several hours) quantitative prediction of precipitation based on meteorological radar data are intensely studied topics. The Korean Peninsula has a narrow land area and complex, mountainous topography, so rainfall systems frequently change. Quantitative precipitation estimation (QPE) and quantitative precipitation forecasts (QPF) provide crucial information for severe weather and water management. We have conducted a performance evaluation of the QPE/QPF of the Korea Meteorological Administration (KMA), which is the first step toward optimizing the QPE/QPF system in South Korea. The real-time adjusted RAR (Radar-AWS-Rainrate) system gives better agreement with the observed rain rate than the fixed Z-R relation, and the additional bias correction of RAR yields slightly better results. A correlation coefficient of R2 = 0.84 is obtained between the daily accumulated observed and RAR-estimated rainfall. The RAR will be available for hydrological applications such as the water budget. The VSRF (Very Short Range Forecast) shows better performance than MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) within 40 minutes, but MAPLE performs better than the VSRF after 40 minutes. For hourly forecasts, MAPLE shows better performance than the VSRF. QPE and QPF are most meaningful for nowcasting (1-2 hours), beyond which model forecasts take over. Forecasts longer than 3 hours from a meteorological model are especially meaningful for applications such as water management.
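
    The R2 = 0.84 quoted above is a standard verification statistic. One common definition is the coefficient of determination between observed gauge rainfall and the radar-based estimate, as in the sketch below; the rainfall values are hypothetical.

```python
import numpy as np

def r_squared(observed, estimated):
    """Coefficient of determination between observed and estimated rainfall."""
    observed = np.asarray(observed, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    ss_res = np.sum((observed - estimated) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Hypothetical daily accumulated rainfall (mm) at a set of gauges vs. radar estimates.
gauge = [12.0, 3.5, 48.0, 0.0, 22.5, 7.0]
radar = [10.5, 4.0, 44.0, 1.0, 25.0, 6.0]
print(f"R^2 = {r_squared(gauge, radar):.2f}")
```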

  18. Computational vaccinology: quantitative approaches.

    PubMed

    Flower, Darren R; McSparron, Helen; Blythe, Martin J; Zygouri, Christianna; Taylor, Debra; Guan, Pingping; Wan, Shouzhan; Coveney, Peter V; Walshe, Valerie; Borrow, Persephone; Doytchinova, Irini A

    2003-01-01

    The immune system is hierarchical and has many levels, exhibiting much emergent behaviour. However, at its heart are molecular recognition events that are indistinguishable from other types of biomacromolecular interaction. These can be addressed well by quantitative experimental and theoretical biophysical techniques, and particularly by methods from drug design. We review here our approach to computational immunovaccinology. In particular, we describe the JenPep database and two new techniques for T cell epitope prediction. One is based on quantitative structure-activity relationships (a 3D-QSAR method based on CoMSIA and another 2D method based on the Free-Wilson approach) and the other on atomistic molecular dynamic simulations using high performance computing. JenPep (http://www.jenner.ar.uk/JenPep) is a relational database system supporting quantitative data on peptide binding to major histocompatibility complexes, TAP transporters, TCR-pMHC complexes, and an annotated list of B cell and T cell epitopes. Our 2D-QSAR method factors the contribution to peptide binding from individual amino acids as well as 1-2 and 1-3 residue interactions. In the 3D-QSAR approach, the influence of five physicochemical properties (volume, electrostatic potential, hydrophobicity, hydrogen-bond donor and acceptor abilities) on peptide affinity was considered. Both methods are exemplified through their application to the well-studied problem of peptide binding to the human class I MHC molecule HLA-A*0201. PMID:14712934

  19. Evolutionary quantitative genetics of nonlinear developmental systems.

    PubMed

    Morrissey, Michael B

    2015-08-01

    In quantitative genetics, the effects of developmental relationships among traits on microevolution are generally represented by the contribution of pleiotropy to additive genetic covariances. Pleiotropic additive genetic covariances arise only from the average effects of alleles on multiple traits, and therefore the evolutionary importance of nonlinearities in development is generally neglected in quantitative genetic views on evolution. However, nonlinearities in relationships among traits at the level of whole organisms are undeniably important to biology in general, and therefore critical to understanding evolution. I outline a system for characterizing key quantitative parameters in nonlinear developmental systems, which yields expressions for quantities such as trait means and phenotypic and genetic covariance matrices. I then develop a system for quantitative prediction of evolution in nonlinear developmental systems. I apply the system to generating a new hypothesis for why direct stabilizing selection is rarely observed. Other uses will include separation of purely correlative from direct and indirect causal effects in studying mechanisms of selection, generation of predictions of medium-term evolutionary trajectories rather than immediate predictions of evolutionary change over single generation time-steps, and the development of efficient and biologically motivated models for separating additive from epistatic genetic variances and covariances. PMID:26174586

  20. The quantitative potential for breast tomosynthesis imaging

    SciTech Connect

    Shafer, Christina M.; Samei, Ehsan; Lo, Joseph Y.

    2010-03-15

    Purpose: Due to its limited angular scan range, breast tomosynthesis has lower resolution in the depth direction, which may limit its accuracy in quantifying tissue density. This study assesses the quantitative potential of breast tomosynthesis using relatively simple reconstruction and image processing algorithms. This quantitation could allow improved characterization of lesions as well as image processing to present tomosynthesis images with the familiar appearance of mammography by preserving more low-frequency information. Methods: All studies were based on a Siemens prototype MAMMOMAT Novation TOMO breast tomo system with a 45 deg. total angular span. This investigation was performed using both simulations and empirical measurements. Monte Carlo simulations were conducted using the breast tomosynthesis geometry and tissue-equivalent, uniform, voxelized phantoms with cuboid lesions of varying density embedded within. Empirical studies were then performed using tissue-equivalent plastic phantoms which were imaged on the actual prototype system. The material surrounding the lesions was set to either fat-equivalent or glandular-equivalent plastic. From the simulation experiments, the effects of scatter, lesion depth, and background material density were studied. The empirical experiments studied the effects of lesion depth, background material density, x-ray tube energy, and exposure level. Additionally, the proposed analysis methods were independently evaluated using a commercially available QA breast phantom (CIRS Model 11A). All image reconstruction was performed with a filtered backprojection algorithm. Reconstructed voxel values within each slice were corrected to reduce background nonuniformities. Results: The resulting lesion voxel values varied linearly with known glandular fraction (correlation coefficient R^2 > 0.90) under all simulated and empirical conditions, including for the independent tests with the QA phantom. Analysis of variance performed

  1. Theoretical study of the nuclear spin-molecular rotation coupling for relativistic electrons and non-relativistic nuclei. II. Quantitative results in HX (X=H,F,Cl,Br,I) compounds

    NASA Astrophysics Data System (ADS)

    Aucar, I. Agustín; Gómez, Sergio S.; Melo, Juan I.; Giribet, Claudia C.; Ruiz de Azúa, Martín C.

    2013-04-01

    In the present work, numerical results of the nuclear spin-rotation (SR) tensor in the series of compounds HX (X=H,F,Cl,Br,I) within relativistic 4-component expressions obtained by Aucar et al. [J. Chem. Phys. 136, 204119 (2012), 10.1063/1.4721627] are presented. The SR tensors of both the H and X nuclei are discussed. Calculations were carried out within the relativistic Linear Response formalism at the Random Phase Approximation with the DIRAC program. For the halogen nucleus X, correlation effects on the non-relativistic values are shown to be of similar magnitude and opposite sign to relativistic effects. For the light H nucleus, by means of the linear response within the elimination of the small component approach it is shown that the whole relativistic effect is given by the spin-orbit operator combined with the Fermi contact operator. Comparison of "best estimate" calculated values with experimental results yield differences smaller than 2%-3% in all cases. The validity of "Flygare's relation" linking the SR tensor and the NMR nuclear magnetic shielding tensor in the present series of compounds is analyzed.

  2. Theoretical study of the nuclear spin-molecular rotation coupling for relativistic electrons and non-relativistic nuclei. II. Quantitative results in HX (X = H,F,Cl,Br,I) compounds.

    PubMed

    Aucar, I Agustín; Gómez, Sergio S; Melo, Juan I; Giribet, Claudia C; Ruiz de Azúa, Martín C

    2013-04-01

    In the present work, numerical results of the nuclear spin-rotation (SR) tensor in the series of compounds HX (X = H,F,Cl,Br,I) within relativistic 4-component expressions obtained by Aucar et al. [J. Chem. Phys. 136, 204119 (2012)] are presented. The SR tensors of both the H and X nuclei are discussed. Calculations were carried out within the relativistic Linear Response formalism at the Random Phase Approximation with the DIRAC program. For the halogen nucleus X, correlation effects on the non-relativistic values are shown to be of similar magnitude and opposite sign to relativistic effects. For the light H nucleus, by means of the linear response within the elimination of the small component approach it is shown that the whole relativistic effect is given by the spin-orbit operator combined with the Fermi contact operator. Comparison of "best estimate" calculated values with experimental results yield differences smaller than 2%-3% in all cases. The validity of "Flygare's relation" linking the SR tensor and the NMR nuclear magnetic shielding tensor in the present series of compounds is analyzed. PMID:23574208

  3. Siloxane containing addition polyimides

    NASA Technical Reports Server (NTRS)

    Maudgal, S.; St. Clair, T. L.

    1984-01-01

    Addition polyimide oligomers have been synthesized from bis(gamma-aminopropyl) tetramethyldisiloxane and 3, 3', 4, 4'-benzophenonetetracarboxylic dianhydride using a variety of latent crosslinking groups as endcappers. The prepolymers were isolated and characterized for solubility (in amide, chlorinated and ether solvents), melt flow and cure properties. The most promising systems, maleimide and acetylene terminated prepolymers, were selected for detailed study. Graphite cloth reinforced composites were prepared and properties compared with those of graphite/Kerimid 601, a commercially available bismaleimide. Mixtures of the maleimide terminated system with Kerimid 601, in varying proportions, were also studied.

  4. Oil additive process

    SciTech Connect

    Bishop, H.

    1988-10-18

    This patent describes a method of making an additive comprising: (a) adding 2 parts by volume of 3% sodium hypochlorite to 45 parts by volume of diesel oil fuel to form a sulphur free fuel, (b) removing all water and foreign matter formed by the sodium hypochlorite, (c) blending 30 parts by volume of 24% lead naphthanate with 15 parts by volume of the sulphur free fuel, 15 parts by volume of light-weight material oil to form a blended mixture, and (d) heating the blended mixture slowly and uniformly to 152F.

  5. Quantitative SPECT brain imaging: Effects of attenuation and detector response

    SciTech Connect

    Gilland, D.R.; Jaszczak, R.J.; Bowsher, J.E.; Turkington, T.G.; Liang, Z.; Greer, K.L.; Coleman, R.E. . Dept. of Radiology)

    1993-06-01

    Two physical factors that substantially degrade quantitative accuracy in SPECT imaging of the brain are attenuation and detector response. In addition to the physical factors, random noise in the reconstructed image can greatly affect the quantitative measurement. The purpose of this work was to implement two reconstruction methods that compensate for attenuation and detector response, a 3D maximum likelihood-EM method (ML) and a filtered backprojection method (FB) with Metz filter and Chang attenuation compensation, and compare the methods in terms of quantitative accuracy and image noise. The methods were tested on simulated data of the 3D Hoffman brain phantom. The simulation incorporated attenuation and distance-dependent detector response. Bias and standard deviation of reconstructed voxel intensities were measured in the gray and white matter regions. The results with ML showed that in both the gray and white matter regions as the number of iterations increased, bias decreased and standard deviation increased. Similar results were observed with FB as the Metz filter power increased. In both regions, ML had smaller standard deviation than FB for a given bias. Reconstruction times for the ML method have been greatly reduced through efficient coding, limited source support, and by computing attenuation factors only along rays perpendicular to the detector.
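
    The ML-EM reconstruction compared above has a well-known multiplicative update: the current image estimate is scaled by the backprojection of the ratio of measured to forward-projected data, normalized by the detector sensitivity. The sketch below shows that update with a tiny dense system matrix standing in for the projector; in a full implementation the attenuation and distance-dependent detector response would be folded into that matrix. All sizes and values are hypothetical.

```python
import numpy as np

def mlem(system_matrix, projections, n_iter=20, eps=1e-12):
    """Basic ML-EM reconstruction.
    system_matrix[i, j]: probability that a photon emitted in voxel j is
    detected in projection bin i (attenuation and detector-response modeling
    would be folded into this matrix in a full implementation)."""
    n_bins, n_voxels = system_matrix.shape
    image = np.ones(n_voxels)
    sensitivity = system_matrix.sum(axis=0)          # backprojection of ones
    for _ in range(n_iter):
        expected = system_matrix @ image             # forward projection
        ratio = projections / np.maximum(expected, eps)
        image *= (system_matrix.T @ ratio) / np.maximum(sensitivity, eps)
    return image

# Tiny hypothetical example: 3 projection bins, 2 voxels, noiseless data.
A = np.array([[0.9, 0.1],
              [0.5, 0.5],
              [0.1, 0.9]])
true_image = np.array([2.0, 5.0])
measured = A @ true_image
print(mlem(A, measured, n_iter=200))   # converges toward [2.0, 5.0]
```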

  6. Evolutionary Quantitative Genomics of Populus trichocarpa

    PubMed Central

    McKown, Athena D.; La Mantia, Jonathan; Guy, Robert D.; Ingvarsson, Pär K.; Hamelin, Richard; Mansfield, Shawn D.; Ehlting, Jürgen; Douglas, Carl J.; El-Kassaby, Yousry A.

    2015-01-01

    Forest trees generally show high levels of local adaptation and efforts focusing on understanding adaptation to climate will be crucial for species survival and management. Here, we address fundamental questions regarding the molecular basis of adaptation in undomesticated forest tree populations to past climatic environments by employing an integrative quantitative genetics and landscape genomics approach. Using this comprehensive approach, we studied the molecular basis of climate adaptation in 433 Populus trichocarpa (black cottonwood) genotypes originating across western North America. Variation in 74 field-assessed traits (growth, ecophysiology, phenology, leaf stomata, wood, and disease resistance) was investigated for signatures of selection (comparing QST-FST) using clustering of individuals by climate of origin (temperature and precipitation). 29,354 SNPs were investigated employing three different outlier detection methods and marker-inferred relatedness was estimated to obtain the narrow-sense estimate of population differentiation in wild populations. In addition, we compared our results with previously assessed selection of candidate SNPs using the 25 topographical units (drainages) across the P. trichocarpa sampling range as population groupings. Narrow-sense QST for 53% of distinct field traits was significantly divergent from expectations of neutrality (indicating adaptive trait variation); 2,855 SNPs showed signals of diversifying selection and of these, 118 SNPs (within 81 genes) were associated with adaptive traits (based on significant QST). Many SNPs were putatively pleiotropic for functionally uncorrelated adaptive traits, such as autumn phenology, height, and disease resistance. Evolutionary quantitative genomics in P. trichocarpa provides an enhanced understanding regarding the molecular basis of climate-driven selection in forest trees and we highlight that important loci underlying adaptive trait variation also show relationship to
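
    The QST-FST comparison mentioned above rests on a simple expression for quantitative trait differentiation among populations, QST = VB / (VB + 2 VA,within). The sketch below evaluates that expression for hypothetical variance components; it is not the estimation pipeline used in the study.

```python
def q_st(var_between, var_additive_within):
    """Quantitative trait differentiation among populations:
    Q_ST = V_B / (V_B + 2 * V_A,within)."""
    return var_between / (var_between + 2.0 * var_additive_within)

# Hypothetical variance components for one field-assessed trait.
print(round(q_st(var_between=0.8, var_additive_within=1.5), 3))
```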

  7. Quantitative nondestructive characterization of visco-elastic materials at high pressure

    SciTech Connect

    Aizawa, Tatsuhiko; Kihara, Junji; Ohno, Jun

    1995-11-01

    A new anvil apparatus was developed to realize a high-pressure environment suitable for investigating the viscoelastic behavior of soft materials such as polymers, lubricants, and proteins. In addition, an ultrasonic spectroscopy system was newly constructed for quantitative nondestructive evaluation of the elasticity and viscosity of soft materials at high pressure. To demonstrate the validity and effectiveness of the developed system and methodology for quantitative nondestructive viscoelastic characterization, various silicone oils are employed, and measured spectra are compared with the theoretical results calculated from the three-linear-element model.

  8. Performance Boosting Additive

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Mainstream Engineering Corporation was awarded Phase I and Phase II contracts from Goddard Space Flight Center's Small Business Innovation Research (SBIR) program in early 1990. With support from the SBIR program, Mainstream Engineering Corporation has developed a unique low cost additive, QwikBoost (TM), that increases the performance of air conditioners, heat pumps, refrigerators, and freezers. Because of the energy and environmental benefits of QwikBoost, Mainstream received the Tibbetts Award at a White House Ceremony on October 16, 1997. QwikBoost was introduced at the 1998 International Air Conditioning, Heating, and Refrigeration Exposition. QwikBoost is packaged in a handy 3-ounce can (pressurized with R-134a) and will be available for automotive air conditioning systems in summer 1998.

  9. Sewage sludge additive

    NASA Technical Reports Server (NTRS)

    Kalvinskas, J. J.; Mueller, W. A.; Ingham, J. D. (Inventor)

    1980-01-01

    The additive is for a raw sewage treatment process of the type where settling tanks are used to permit the suspended matter in the raw sewage to settle and to permit adsorption of the dissolved contaminants in the sewage water. The sludge, which settles to the bottom of the settling tank, is extracted, pyrolyzed, and activated to form activated carbon and ash, which is mixed with the sewage prior to its introduction into the settling tank. The sludge does not provide all of the activated carbon and ash required for adequate treatment of the raw sewage. It is therefore necessary to add carbon to the process, and instead of expensive commercial carbon, coal is used to provide the carbon supplement.

  10. Perspectives on Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Bourell, David L.

    2016-07-01

    Additive manufacturing (AM) has skyrocketed in visibility commercially and in the public sector. This article describes the development of this field from early layered manufacturing approaches of photosculpture, topography, and material deposition. Certain precursors to modern AM processes are also briefly described. The growth of the field over the last 30 years is presented. Included is the standard delineation of AM technologies into seven broad categories. The economics of AM part generation is considered, and the impacts of the economics on application sectors are described. On the basis of current trends, the future outlook will include a convergence of AM fabricators, mass-produced AM fabricators, enabling of topology optimization designs, and specialization in the AM legal arena. Long-term developments with huge impact are organ printing and volume-based printing.

  11. White-light quantitative phase imaging unit

    NASA Astrophysics Data System (ADS)

    Baek, YoonSeok; Lee, KyeoReh; Yoon, Jonghee; Kim, Kyoohyun; Park, YongKeun

    2016-05-01

    We introduce the white light quantitative phase imaging unit (WQPIU) as a practical realization of quantitative phase imaging (QPI) on standard microscope platforms. The WQPIU is a compact stand-alone unit which measures sample induced phase delay under white-light illumination. It does not require any modification of the microscope or additional accessories for its use. The principle of the WQPIU based on lateral shearing interferometry and phase shifting interferometry provides a cost-effective and user-friendly use of QPI. The validity and capacity of the presented method are demonstrated by measuring quantitative phase images of polystyrene beads, human red blood cells, HeLa cells and mouse white blood cells. With speckle-free imaging capability due to the use of white-light illumination, the WQPIU is expected to expand the scope of QPI in biological sciences as a powerful but simple imaging tool.

  12. White-light quantitative phase imaging unit.

    PubMed

    Baek, YoonSeok; Lee, KyeoReh; Yoon, Jonghee; Kim, Kyoohyun; Park, YongKeun

    2016-05-01

    We introduce the white-light quantitative phase imaging unit (WQPIU) as a practical realization of quantitative phase imaging (QPI) on standard microscope platforms. The WQPIU is a compact stand-alone unit which measures sample induced phase delay under white-light illumination. It does not require any modification of the microscope or additional accessories for its use. The principle of the WQPIU based on lateral shearing interferometry and phase shifting interferometry provides a cost-effective and user-friendly use of QPI. The validity and capacity of the presented method are demonstrated by measuring quantitative phase images of polystyrene beads, human red blood cells, HeLa cells and mouse white blood cells. With speckle-free imaging capability due to the use of white-light illumination, the WQPIU is expected to expand the scope of QPI in biological sciences as a powerful but simple imaging tool. PMID:27137546
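
    The phase-shifting interferometry underlying the WQPIU has a textbook retrieval step: with four frames shifted by 90 degrees, the wrapped phase follows from an arctangent of frame differences. The sketch below demonstrates that four-step formula on a synthetic phase object; it is a generic illustration, not the WQPIU processing chain, and the modulation parameters are assumptions.

```python
import numpy as np

def four_step_phase(I0, I90, I180, I270):
    """Wrapped phase from four interferograms with 90-degree phase shifts."""
    return np.arctan2(I270 - I90, I0 - I180)

# Hypothetical interferograms of a synthetic phase object.
x = np.linspace(-1, 1, 256)
true_phase = np.pi * np.exp(-x ** 2 / 0.1)          # a smooth "cell-like" bump
bias, mod = 1.0, 0.8                                  # assumed background and modulation
frames = [bias + mod * np.cos(true_phase + d)
          for d in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]
wrapped = four_step_phase(*frames)

# Residual after accounting for phase wrapping should be near zero.
residual = np.angle(np.exp(1j * (wrapped - true_phase)))
print(float(np.max(np.abs(residual))))
```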

  13. Quantitative biomedical mass spectrometry

    NASA Astrophysics Data System (ADS)

    de Leenheer, André P.; Thienpont, Linda M.

    1992-09-01

    The scope of this contribution is an illustration of the capabilities of isotope dilution mass spectrometry (IDMS) for quantification of target substances in the biomedical field. After a brief discussion of the general principles of quantitative MS in biological samples, special attention will be paid to new technological developments or trends in IDMS from selected examples from the literature. The final section will deal with the use of IDMS for accuracy assessment in clinical chemistry. Methodological aspects considered crucial for avoiding sources of error will be discussed.

  14. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the back of men's heads, viewed from the side, into 4 shapes, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in quantitatively analyzing the shape of parts of the head and face; in articles previously published or to be published in this magazine, the shapes of the nose, ear conch, and human eye were analyzed quantitatively. In addition, the shapes of the leg toes were also analyzed. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of which are bald. PMID:23714907

  15. Enhancer additivity and non-additivity are determined by enhancer strength in the Drosophila embryo

    PubMed Central

    Bothma, Jacques P; Garcia, Hernan G; Ng, Samuel; Perry, Michael W; Gregor, Thomas; Levine, Michael

    2015-01-01

    Metazoan genes are embedded in a rich milieu of regulatory information that often includes multiple enhancers possessing overlapping activities. In this study, we employ quantitative live imaging methods to assess the function of pairs of primary and shadow enhancers in the regulation of key patterning genes (knirps, hunchback, and snail) in developing Drosophila embryos. The knirps enhancers exhibit additive, sometimes even super-additive activities, consistent with classical gene fusion studies. In contrast, the hunchback enhancers function sub-additively in anterior regions containing saturating levels of the Bicoid activator, but function additively in regions where there are diminishing levels of the Bicoid gradient. Strikingly, sub-additive behavior is also observed for snail, whereby removal of the proximal enhancer causes a significant increase in gene expression. Quantitative modeling of enhancer-promoter interactions suggests that weakly active enhancers function additively while strong enhancers behave sub-additively due to competition with the target promoter. DOI: http://dx.doi.org/10.7554/eLife.07956.001 PMID:26267217

  16. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  17. Quantitative Hyperspectral Reflectance Imaging

    PubMed Central

    Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.

    2008-01-01

    Hyperspectral imaging is a non-destructive optical analysis technique that can, for instance, be used to obtain information from cultural heritage objects that is unavailable with conventional colour or multi-spectral photography. The technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation, and to study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). Because a wavelength-tunable, narrow-bandwidth light source is used, the light energy needed to illuminate the measured object is minimal, so that light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document where different types of ink have been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects in artificial samples and original documents exposed in their permanent exhibition area or stored in their deposit rooms.

  18. Quantitative volumetric breast density estimation using phase contrast mammography

    NASA Astrophysics Data System (ADS)

    Wang, Zhentian; Hauser, Nik; Kubik-Huch, Rahel A.; D'Isidoro, Fabio; Stampanoni, Marco

    2015-05-01

    Phase contrast mammography using a grating interferometer is an emerging technology for breast imaging. It provides information complementary to conventional absorption-based methods. Additional diagnostic value can be obtained by retrieving quantitative information from the three physical signals (absorption, differential phase, and small-angle scattering) yielded simultaneously. We report a non-parametric quantitative volumetric breast density estimation method that exploits the ratio (dubbed the R value) of the absorption signal to the small-angle scattering signal. The R value is used to determine breast composition, and the volumetric breast density (VBD) of the whole breast is obtained analytically by deducing the relationship between the R value and the pixel-wise breast density. The proposed method is tested in a phantom study and on a group of 27 mastectomy samples. In the clinical evaluation, the estimated VBD values from both cranio-caudal (CC) and anterior-posterior (AP) views are compared with the ACR scores given by radiologists to the pre-surgical mammograms. The results show that the VBD estimates obtained with the proposed method are consistent with the pre-surgical ACR scores, indicating the effectiveness of this method for breast density estimation. A positive correlation is found between the estimated VBD and the diagnostic ACR score for both the CC view (p = 0.033) and the AP view (p = 0.001). A linear regression between the CC-view and AP-view results showed a correlation coefficient γ = 0.77, which indicates the robustness of the proposed method and the quantitative character of the additional information obtained with our approach.
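
    As an illustration only, the following Python sketch mirrors the ratio-based idea described above: a per-pixel R value is computed from the absorption and small-angle scattering images and mapped to a glandular fraction. The calibration function r_to_density is a hypothetical placeholder; the published method derives the R-to-density relationship analytically.

        import numpy as np

        def estimate_vbd(absorption, scattering, r_to_density):
            """Sketch of R-value-based volumetric breast density (VBD) estimation.

            absorption, scattering : 2-D arrays of the two physical signals.
            r_to_density           : calibration mapping the R value to a per-pixel
                                     glandular fraction (hypothetical placeholder;
                                     the published method derives this analytically)."""
            absorption = np.asarray(absorption, dtype=float)
            scattering = np.asarray(scattering, dtype=float)
            r = np.divide(absorption, scattering,
                          out=np.zeros_like(absorption), where=scattering > 0)
            density = np.clip(r_to_density(r), 0.0, 1.0)   # per-pixel glandular fraction
            return 100.0 * density.mean()                  # VBD as a percentage

        # Example with synthetic signals and a made-up linear calibration
        vbd = estimate_vbd(np.random.rand(64, 64) + 1.0,
                           np.random.rand(64, 64) + 1.0,
                           r_to_density=lambda r: 0.5 * (r - 0.5))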

  19. A quantitative assessment of results with the Angelchik prosthesis.

    PubMed Central

    Wyllie, J. H.; Edwards, D. A.

    1985-01-01

    The Angelchik antireflux prosthesis was assessed in 15 unpromising patients, 12 of whom had peptic strictures of the oesophagus. Radiological techniques were used to show the effect of the device on gastro-oesophageal reflux, and on the bore and length of strictures. Twelve months later (range 6-24 months), most patients were well satisfied with the operation, and all considered it had been worthwhile; there was radiological evidence of a reduction in reflux and remission of strictures. The device never surrounded the oesophageal sphincter; in all but one case it encircled a tube of stomach. PMID:4037629

  20. Quantitative imaging of volcanic plumes - Results, needs, and future trends

    NASA Astrophysics Data System (ADS)

    Platt, Ulrich; Lübcke, Peter; Kuhn, Jonas; Bobrowski, Nicole; Prata, Fred; Burton, Mike; Kern, Christoph

    2015-07-01

    Recent technology allows two-dimensional "imaging" of trace gas distributions in plumes. In contrast to older, one-dimensional remote sensing techniques, which are only capable of measuring total column densities, the new imaging methods give insight into details of transport and mixing processes as well as chemical transformation within plumes. We give an overview of gas imaging techniques already being applied at volcanoes (SO2 cameras, imaging DOAS, FT-IR imaging), present techniques for which first field experiments have been conducted (LED-LIDAR, tomographic mapping), and describe some techniques for which only theoretical studies with application to volcanology exist (e.g. Fabry-Pérot Imaging, Gas Correlation Spectroscopy, bi-static LIDAR). Finally, we discuss current needs and future trends in imaging technology.

  1. FACE Experiments with Crops: A Quantitative Review of Results

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Two global changes that directly alter crop productivity are rising carbon dioxide concentration ([CO2]) and rising tropospheric ozone concentration ([O3]). While elevated [CO2] directly stimulates productivity in C3 crops, rising tropospheric [O3] negatively impacts photosynthesis and subsequent gr...

  2. Whole cell, label free protein quantitation with data independent acquisition: quantitation at the MS2 level.

    PubMed

    McQueen, Peter; Spicer, Vic; Schellenberg, John; Krokhin, Oleg; Sparling, Richard; Levin, David; Wilkins, John A

    2015-01-01

    Label free quantitation by measurement of peptide fragment signal intensity (MS2 quantitation) is a technique that has seen limited use due to the stochastic nature of data dependent acquisition (DDA). However, data independent acquisition has the potential to make large scale MS2 quantitation a more viable technique. In this study we used an implementation of data independent acquisition, SWATH, to perform label free protein quantitation in the model bacterium Clostridium stercorarium. Four tryptic digests analyzed by SWATH were probed by an ion library containing information on peptide mass and retention time obtained from DDA experiments. Application of this ion library to SWATH data quantified 1030 proteins with at least two peptides each (∼40% of the predicted proteins in the C. stercorarium genome) in each replicate. Quantitative results were very consistent between biological replicates (R² ∼ 0.960). Protein quantitation by summation of peptide fragment signal intensities was also highly consistent between biological replicates (R² ∼ 0.930), indicating that this approach may have increased viability compared with recent applications in label free protein quantitation. SWATH-based quantitation was able to consistently detect differences in relative protein quantity, and it provided coverage for a number of proteins that were missed in some samples by DDA analysis. PMID:25348682
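
    The protein quantitation by summation of fragment intensities mentioned above can be sketched as follows. This is a generic illustration, not the authors' pipeline; the (protein, peptide, intensity) input format and the two-peptide filter are assumptions.

        from collections import defaultdict

        def protein_intensities(fragments, min_peptides=2):
            """Label-free MS2 quantitation sketch: sum fragment-ion intensities per
            peptide, then per protein, keeping proteins with >= min_peptides peptides.

            fragments: iterable of (protein_id, peptide_sequence, fragment_intensity)."""
            peptide_sums = defaultdict(float)
            for protein, peptide, intensity in fragments:
                peptide_sums[(protein, peptide)] += intensity

            protein_sums = defaultdict(float)
            peptides_seen = defaultdict(set)
            for (protein, peptide), total in peptide_sums.items():
                protein_sums[protein] += total
                peptides_seen[protein].add(peptide)

            return {p: s for p, s in protein_sums.items()
                    if len(peptides_seen[p]) >= min_peptides}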

  3. Does Preoperative Measurement of Cerebral Blood Flow with Acetazolamide Challenge in Addition to Preoperative Measurement of Cerebral Blood Flow at the Resting State Increase the Predictive Accuracy of Development of Cerebral Hyperperfusion after Carotid Endarterectomy? Results from 500 Cases with Brain Perfusion Single-photon Emission Computed Tomography Study

    PubMed Central

    OSHIDA, Sotaro; OGASAWARA, Kuniaki; SAURA, Hiroaki; YOSHIDA, Koji; FUJIWARA, Shunro; KOJIMA, Daigo; KOBAYASHI, Masakazu; YOSHIDA, Kenji; KUBO, Yoshitaka; OGAWA, Akira

    2015-01-01

    The purpose of the present study was to determine whether preoperative measurement of cerebral blood flow (CBF) with acetazolamide in addition to preoperative measurement of CBF at the resting state increases the predictive accuracy of the development of cerebral hyperperfusion after carotid endarterectomy (CEA). CBF at the resting state and cerebrovascular reactivity (CVR) to acetazolamide were quantitatively assessed using the N-isopropyl-p-[123I]-iodoamphetamine (IMP) autoradiography method with single-photon emission computed tomography (SPECT) before CEA in 500 patients with ipsilateral internal carotid artery stenosis (≥ 70%). CBF measurement using 123I-IMP SPECT was also performed immediately and 3 days after CEA. A region of interest (ROI) was automatically placed in the middle cerebral artery territory in the affected cerebral hemisphere using a three-dimensional stereotactic ROI template. Preoperative decreases in CBF at the resting state [95% confidence intervals (CIs), 0.855 to 0.967; P = 0.0023] and preoperative decreases in CVR to acetazolamide (95% CIs, 0.844 to 0.912; P < 0.0001) were significant independent predictors of post-CEA hyperperfusion. The area under the receiver operating characteristic curve for prediction of the development of post-CEA hyperperfusion was significantly greater for CVR to acetazolamide than for CBF at the resting state (difference between areas, 0.173; P < 0.0001). Sensitivity, specificity, and positive and negative predictive values for the prediction of the development of post-CEA hyperperfusion were significantly greater for CVR to acetazolamide than for CBF at the resting state (P < 0.05 for each). The present study demonstrated that preoperative measurement of CBF with acetazolamide in addition to preoperative measurement of CBF at the resting state increases the predictive accuracy of the development of post-CEA hyperperfusion. PMID:25746308
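
    For reference, cerebrovascular reactivity to acetazolamide is conventionally expressed as the percentage change of CBF from the resting value; the sketch below uses that standard definition and does not reproduce the study's ROI pipeline or threshold values.

        def cerebrovascular_reactivity(cbf_rest, cbf_acz):
            """CVR to acetazolamide (%) using the conventional definition:
            (CBF_acetazolamide - CBF_rest) / CBF_rest * 100."""
            if cbf_rest <= 0:
                raise ValueError("resting CBF must be positive")
            return (cbf_acz - cbf_rest) / cbf_rest * 100.0

        # Example: resting CBF of 32 ml/100 g/min rising to 38 ml/100 g/min
        print(cerebrovascular_reactivity(32.0, 38.0))   # 18.75 %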

  4. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
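
    A minimal illustration of the Bayesian updating the author advocates: the posterior probability of a hypothesis (for example, a particular failure mode) is obtained from a prior and the likelihood of the observed evidence. The numbers in the example are invented.

        def posterior(prior, likelihood_if_true, likelihood_if_false):
            """Bayes' theorem for a binary hypothesis H given evidence E:
            P(H|E) = P(E|H) P(H) / [P(E|H) P(H) + P(E|not H) P(not H)]."""
            evidence = likelihood_if_true * prior + likelihood_if_false * (1.0 - prior)
            return likelihood_if_true * prior / evidence

        # Example: a failure mode with prior probability 1e-3, and an inspection that
        # detects it 95% of the time with a 2% false-positive rate.
        print(posterior(1e-3, 0.95, 0.02))   # ~0.045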

  5. Quantitative comparisons of in vitro assays for estrogenic activities.

    PubMed Central

    Fang, H; Tong, W; Perkins, R; Soto, A M; Prechtl, N V; Sheehan, D M

    2000-01-01

    Substances that may act as estrogens show a broad chemical structural diversity. To thoroughly address the question of possible adverse estrogenic effects, reliable methods are needed to detect and identify the chemicals of these diverse structural classes. We compared three assays--in vitro estrogen receptor competitive binding assays (ER binding assays), yeast-based reporter gene assays (yeast assays), and the MCF-7 cell proliferation assay (E-SCREEN assay)--to determine their quantitative agreement in identifying structurally diverse estrogens. We examined assay performance for relative sensitivity, detection of active/inactive chemicals, and estrogen/antiestrogen activities. In this examination, we combined individual data sets in a specific, quantitative data mining exercise. Data sets for at least 29 chemicals from five laboratories were analyzed pair-wise by X-Y plots. The ER binding assay was a good predictor for the other two assay results when the antiestrogens were excluded (r² = 0.78 for the yeast assays and 0.85 for the E-SCREEN assays). Additionally, the examination strongly suggests that biologic information that is not apparent from any of the individual assays can be discovered by quantitative pair-wise comparisons among assays. Antiestrogens are identified as outliers in the ER binding/yeast assay, while complete antagonists are identified in the ER binding and E-SCREEN assays. Furthermore, the presence of outliers may be explained by different mechanisms that induce an endocrine response, different impurities in different batches of chemicals, different species sensitivity, or limitations of the assay techniques. Although these assays involve different levels of biologic complexity, the major conclusion is that they generally provided consistent information in quantitatively determining estrogenic activity for the five data sets examined. The results should provide guidance for expanded data mining examinations and the selection of appropriate

  6. POEM: Identifying Joint Additive Effects on Regulatory Circuits

    PubMed Central

    Botzman, Maya; Nachshon, Aharon; Brodt, Avital; Gat-Viks, Irit

    2016-01-01

    Motivation: Expression Quantitative Trait Locus (eQTL) mapping tackles the problem of identifying variation in DNA sequence that has an effect on the transcriptional regulatory network. Major computational efforts are aimed at characterizing the joint effects of several eQTLs acting in concert to govern the expression of the same genes. Yet, progress toward a comprehensive prediction of such joint effects is limited. For example, existing eQTL methods commonly discover interacting loci affecting the expression levels of a module of co-regulated genes. Such “modularization” approaches, however, are focused on epistatic relations and thus have limited utility for the case of additive (non-epistatic) effects. Results: Here we present POEM (Pairwise effect On Expression Modules), a methodology for identifying pairwise eQTL effects on gene modules. POEM is specifically designed to achieve high performance in the case of additive joint effects. We applied POEM to transcription profiles measured in bone marrow-derived dendritic cells across a population of genotyped mice. Our study reveals widespread additive, trans-acting pairwise effects on gene modules, characterizes their organizational principles, and highlights high-order interconnections between modules within the immune signaling network. These analyses elucidate the central role of additive pairwise effects in regulatory circuits, and provide computational tools for future investigations into the interplay between eQTLs. Availability: The software described in this article is available at csgi.tau.ac.il/POEM/. PMID:27148351

  7. [Accounting for Expected Linkage in Biometric Analysis of Quantitative Traits].

    PubMed

    Mikhailov, M E

    2015-08-01

    The problem of accounting for the linkage expected among randomly positioned loci was solved for the additive-dominant model. The Comstock-Robinson estimates of the sum of squares of dominance effects, the sum of squares of additive effects, and the average degree of dominance were modified. Wright's estimate of the number of loci controlling the variation of a quantitative trait was also modified, and its range of application was extended. Formulas that eliminate the effect of linkage, on average, were derived for these estimates. The unbiased estimates were applied to the analysis of maize data. Our results showed that the most likely cause of heterosis is dominance rather than overdominance and that the main part of the heterotic effect is provided by dozens of genes. PMID:26601496

  8. Subsurface imaging and cell refractometry using quantitative phase/ shear-force feedback microscopy

    NASA Astrophysics Data System (ADS)

    Edward, Kert; Farahi, Faramarz

    2009-10-01

    Over the last few years, several novel quantitative phase imaging techniques have been developed for the study of biological cells. However, many of these techniques are encumbered by inherent limitations, including 2π phase ambiguities and diffraction-limited spatial resolution. In addition, subsurface information in the phase data is not exploited. We present a novel quantitative phase imaging system without 2π ambiguities, which also allows for subsurface imaging and cell refractometry studies. This is accomplished by utilizing simultaneously obtained shear-force topography information. We will demonstrate how the quantitative phase and topography data can be used for subsurface and cell refractometry analysis and will present results for a fabricated structure and a malaria-infected red blood cell.

  9. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
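
    A minimal sketch of the computation described in the abstract, assuming a one-dimensional antenna array with known positions; the negative-gradient sign convention noted in the comment is a standard physics convention rather than part of the patent text.

        def field_quantities(voltages, positions):
            """Per the description above, divide each voltage difference between
            neighbouring antennas by their known separation. With the usual
            quasi-static convention the field component is approximately -dV/dx."""
            quantities = []
            for i in range(len(voltages) - 1):
                dv = voltages[i + 1] - voltages[i]
                dx = positions[i + 1] - positions[i]
                quantities.append(dv / dx)
            return quantities

        # Example: three antennas spaced 0.1 m apart along one axis (volts, metres)
        print(field_quantities([0.00, 0.12, 0.25], [0.0, 0.1, 0.2]))   # ~[1.2, 1.3] V/m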

  10. A quantitative assay for intercellular aggregation

    NASA Technical Reports Server (NTRS)

    Neelamegham, S.; Zygourakis, K.; McIntire, L. V. (Principal Investigator)

    1997-01-01

    In an earlier communication (Munn et al., J Immunol. Methods 166: 11-25, 1993), we presented the initial development of a quantitative assay for monitoring the rates of cellular aggregation based on digital image processing and video microscopy. This study describes some important enhancements and modifications to the procedure. A new index is introduced to characterize the three-dimensional morphology of the aggregates. This index is based on temporal changes in the projected area of the cells and cell aggregates during the course of the experiment. By drawing an analogy with the kinetic theory of gases, we have also introduced a procedure to normalize for variations in cell seeding density among different experiments. In addition, the image analysis technique has been improved by introducing a background subtraction algorithm to remove illumination defects and an adaptive segmentation procedure. These improvements allowed us to completely automate the image analysis procedure, thus minimizing user intervention and improving the reproducibility of the measurements. The enhanced visual assay is evaluated using some recent results from our studies on homotypic lymphocyte aggregation.
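
    The image-analysis steps described above (background subtraction followed by segmentation of cells and aggregates) can be sketched roughly as follows; the uniform-filter background estimate and the mean-plus-k-standard-deviations threshold are simple stand-ins for the illumination correction and adaptive segmentation used in the published assay.

        import numpy as np
        from scipy import ndimage

        def projected_cell_area(frame, background_size=51, k=2.0):
            """Rough sketch: estimate a smooth illumination background, subtract it,
            segment bright objects, and return the total projected area (in pixels)
            of cells/aggregates plus the number of connected objects."""
            frame = np.asarray(frame, dtype=float)
            background = ndimage.uniform_filter(frame, size=background_size)
            corrected = frame - background
            mask = corrected > corrected.mean() + k * corrected.std()
            _, n_objects = ndimage.label(mask)
            return int(mask.sum()), n_objects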

  11. Extracting Quantitative Data from Lunar Soil Spectra

    NASA Technical Reports Server (NTRS)

    Noble, S. K.; Pieters, C. M.; Hiroi, T.

    2005-01-01

    Using the modified Gaussian model (MGM) developed by Sunshine et al. [1] we compared the spectral properties of the Lunar Soil Characterization Consortium (LSCC) suite of lunar soils [2,3] with their petrologic and chemical compositions to obtain quantitative data. Our initial work on Apollo 17 soils [4] suggested that useful compositional data could be elicited from high quality soil spectra. We are now able to expand upon those results with the full suite of LSCC soils that allows us to explore a much wider range of compositions and maturity states. The model is shown to be sensitive to pyroxene abundance and can evaluate the relative portion of high-Ca and low-Ca pyroxenes in the soils. In addition, the dataset has provided unexpected insights into the nature and causes of absorption bands in lunar soils. For example, it was found that two distinct absorption bands are required in the 1.2 μm region of the spectrum. Neither of these bands can be attributed to plagioclase or agglutinates, but both appear to be largely due to pyroxene.

  12. Multiple quantitative trait analysis using bayesian networks.

    PubMed

    Scutari, Marco; Howell, Phil; Balding, David J; Mackay, Ian

    2014-09-01

    Models for genome-wide prediction and association studies usually target a single phenotypic trait. However, in animal and plant genetics it is common to record information on multiple phenotypes for each individual that will be genotyped. Modeling traits individually disregards the fact that they are most likely associated due to pleiotropy and shared biological basis, thus providing only a partial, confounded view of genetic effects and phenotypic interactions. In this article we use data from a Multiparent Advanced Generation Inter-Cross (MAGIC) winter wheat population to explore Bayesian networks as a convenient and interpretable framework for the simultaneous modeling of multiple quantitative traits. We show that they are equivalent to multivariate genetic best linear unbiased prediction (GBLUP) and that they are competitive with single-trait elastic net and single-trait GBLUP in predictive performance. Finally, we discuss their relationship with other additive-effects models and their advantages in inference and interpretation. MAGIC populations provide an ideal setting for this kind of investigation because the very low population structure and large sample size result in predictive models with good power and limited confounding due to relatedness. PMID:25236454

  13. Geographical Variation in a Quantitative Character

    PubMed Central

    Nagylaki, T.

    1994-01-01

    A model for the evolution of the local averages of a quantitative character under migration, selection, and random genetic drift in a subdivided population is formulated and investigated. Generations are discrete and nonoverlapping; the monoecious, diploid population mates at random in each deme. All three evolutionary forces are weak, but the migration pattern and the local population numbers are otherwise arbitrary. The character is determined by purely additive gene action and a stochastically independent environment; its distribution is Gaussian with a constant variance; and it is under Gaussian stabilizing selection with the same parameters in every deme. Linkage disequilibrium is neglected. Most of the results concern the covariances of the local averages. For a finite number of demes, explicit formulas are derived for (i) the asymptotic rate and pattern of convergence to equilibrium, (ii) the variance of a suitably weighted average of the local averages, and (iii) the equilibrium covariances when selection and random drift are much weaker than migration. Essentially complete analyses of equilibrium and convergence are presented for random outbreeding and site homing, the Levene and island models, the circular habitat and the unbounded linear stepping-stone model in the diffusion approximation, and the exact unbounded stepping-stone model in one and two dimensions. PMID:8138171

  14. Quantitative assessment of visual behavior in disorders of consciousness.

    PubMed

    Trojano, L; Moretta, P; Loreto, V; Cozzolino, A; Santoro, L; Estraneo, A

    2012-09-01

    The study of eye behavior is of paramount importance in the differential diagnosis of disorders of consciousness (DoC). In spite of this, assessment of eye movement patterns in patients in a vegetative state (VS) or minimally conscious state (MCS) relies only on clinical evaluation. In this study we aimed to provide a quantitative assessment of visual tracking behavior in response to moving stimuli in DoC patients. Nine VS patients and nine MCS patients were recruited in a Neurorehabilitation Unit for patients with chronic DoC; 11 matched healthy subjects were tested as the control group. All participants underwent a quantitative evaluation of their eye-tracking pattern by means of a computerized infrared eye-tracker system; stimuli were represented by a red circle or a small color picture slowly moving on a PC monitor. The proportion of on- or off-target fixations differed significantly between MCS and VS. Most importantly, the distribution of fixations on or off the target in all VS patients was at or below the chance level, whereas in the MCS group seven out of nine patients showed a proportion of on-target fixations significantly higher than the chance level. Fixation length did not differ significantly among the three groups. The present quantitative assessment of visual behavior in a tracking task demonstrated that MCS and VS patients differ in the proportion of on-target fixations. These results could have important clinical implications since the quantitative analysis of visual behavior might provide additional elements in the differential diagnosis of DoC. PMID:22302277

  15. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  16. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells.

    PubMed

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R² > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10⁶ cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10⁶ letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422

  17. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R² > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10⁶ cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10⁶ letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422

  18. Quantitative thermal imaging of aircraft structures

    NASA Astrophysics Data System (ADS)

    Cramer, K. Elliott; Howell, Patricia A.; Syed, Hazari I.

    1995-03-01

    Aircraft structural integrity is a major concern for airlines and airframe manufacturers. To remain economically competitive, airlines are looking at ways to retire older aircraft, not when some fixed number of flight hours or cycles has been reached, but when true structural need dictates. This philosophy is known as 'retirement for cause'. The need to extend the life of commercial aircraft has increased the desire to develop nondestructive evaluation (NDE) techniques capable of detecting critical flaws such as disbonding and corrosion. These subsurface flaws are of major concern in bonded lap joints. Disbonding in such a joint can provide an avenue for moisture to enter the structure leading to corrosion. Significant material loss due to corrosion can substantially reduce the structural strength, load bearing capacity and ultimately reduce the life of the structure. The National Aeronautics and Space Administration's Langley Research Center has developed a thermal NDE system designed for application to disbonding and corrosion detection in aircraft skins. By injecting a small amount of heat into the front surface of an aircraft skin, and recording the time history of the resulting surface temperature variations using an infrared camera, quantitative images of both bond integrity and material loss due to corrosion can be produced. This paper presents a discussion of the development of the thermal imaging system as well as the techniques used to analyze the resulting thermal images. The analysis techniques presented represent a significant improvement in the information available over conventional thermal imaging due to the inclusion of data from both the heating and cooling portion of the thermal cycle. Results of laboratory experiments on fabricated disbond and material loss samples are presented to determine the limitations of the system. Additionally, the results of actual aircraft inspections are shown, which help to establish the field applicability for this

  19. Quantitative photoacoustic elastography in humans

    NASA Astrophysics Data System (ADS)

    Hai, Pengfei; Zhou, Yong; Gong, Lei; Wang, Lihong V.

    2016-06-01

    We report quantitative photoacoustic elastography (QPAE) capable of measuring Young's modulus of biological tissue in vivo in humans. By combining conventional PAE with a stress sensor having known stress-strain behavior, QPAE can simultaneously measure strain and stress, from which Young's modulus is calculated. We first demonstrate the feasibility of QPAE in agar phantoms with different concentrations. The measured Young's modulus values fit well with both the empirical expectation based on the agar concentrations and those measured in an independent standard compression test. Next, QPAE was applied to quantify the Young's modulus of skeletal muscle in vivo in humans, showing a linear relationship between muscle stiffness and loading. The results demonstrated the capability of QPAE to assess the absolute elasticity of biological tissue noninvasively in vivo in humans, indicating its potential for tissue biomechanics studies and clinical applications.
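
    Since QPAE obtains Young's modulus from simultaneously measured stress and strain, the underlying arithmetic is simply E = stress / strain in the linear-elastic regime; the values in the example below are illustrative only.

        def youngs_modulus(stress_pa, strain):
            """Young's modulus E = stress / strain (linear-elastic regime);
            stress in pascals, strain dimensionless."""
            if strain == 0:
                raise ValueError("strain must be non-zero")
            return stress_pa / strain

        # Example: 2 kPa of compressive stress producing 5% strain in soft tissue
        print(youngs_modulus(2.0e3, 0.05))   # 40000.0 Pa, i.e. 40 kPa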

  20. Quantitative patterns in drone wars

    NASA Astrophysics Data System (ADS)

    Garcia-Bernardo, Javier; Dodds, Peter Sheridan; Johnson, Neil F.

    2016-02-01

    Attacks by drones (i.e., unmanned combat air vehicles) continue to generate heated political and ethical debates. Here we examine the quantitative nature of drone attacks, focusing on how their intensity and frequency compare with that of other forms of human conflict. Instead of the power-law distribution found recently for insurgent and terrorist attacks, the severity of attacks is more akin to lognormal and exponential distributions, suggesting that the dynamics underlying drone attacks lie beyond these other forms of human conflict. We find that the pattern in the timing of attacks is consistent with one side having almost complete control, an important if expected result. We show that these novel features can be reproduced and understood using a generative mathematical model in which resource allocation to the dominant side is regulated through a feedback loop.

  1. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus. PMID:26456933
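
    The 5 log10 reduction criterion used above translates into a simple calculation on viable counts; the sketch below is generic and the example counts are invented.

        import math

        def log10_reduction(count_control, count_treated):
            """log10 reduction in viable count after antiseptic exposure; the EN 1276
            criterion quoted above requires at least 5 within the contact time."""
            if count_control <= 0 or count_treated <= 0:
                raise ValueError("counts must be positive (substitute the detection limit for zero counts)")
            return math.log10(count_control / count_treated)

        # Example: 1e7 CFU/ml reduced to 50 CFU/ml after exposure
        print(log10_reduction(1e7, 50))   # ~5.3, i.e. passes the 5-log criterion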

  2. Quantitative comparison of analysis methods for spectroscopic optical coherence tomography: reply to comment

    PubMed Central

    Bosschaart, Nienke; van Leeuwen, Ton G.; Aalders, Maurice C.G.; Faber, Dirk J.

    2014-01-01

    We reply to the comment by Kraszewski et al on “Quantitative comparison of analysis methods for spectroscopic optical coherence tomography.” We present additional simulations evaluating the proposed window function. We conclude that our simulations show good qualitative agreement with the results of Kraszewski, in support of their conclusion that SOCT optimization should include window shape, next to choice of window size and analysis algorithm. PMID:25401016

  3. Additive manufacturing of hybrid circuits

    DOE PAGES

    Bell, Nelson S.; Sarobol, Pylin; Cook, Adam; Clem, Paul G.; Keicher, David M.; Hirschfeld, Deidre; Hall, Aaron Christopher

    2016-03-26

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. As a result, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  4. Programmable Quantitative DNA Nanothermometers.

    PubMed

    Gareau, David; Desrosiers, Arnaud; Vallée-Bélisle, Alexis

    2016-07-13

    Developing molecules, switches, probes or nanomaterials that are able to respond to specific temperature changes should prove of utility for several applications in nanotechnology. Here, we describe bioinspired strategies to design DNA thermoswitches with programmable linear response ranges that can provide either a precise ultrasensitive response over a desired, small temperature interval (±0.05 °C) or an extended linear response over a wide temperature range (e.g., from 25 to 90 °C). Using structural modifications or inexpensive DNA stabilizers, we show that we can tune the transition midpoints of DNA thermometers from 30 to 85 °C. Using multimeric switch architectures, we are able to create ultrasensitive thermometers that display large quantitative fluorescence gains within small temperature variation (e.g., > 700% over 10 °C). Using a combination of thermoswitches of different stabilities or a mix of stabilizers of various strengths, we can create extended thermometers that respond linearly over a temperature range of up to 50 °C. Here, we demonstrate the reversibility, robustness, and efficiency of these programmable DNA thermometers by monitoring temperature change inside individual wells during polymerase chain reactions. We discuss the potential applications of these programmable DNA thermoswitches in various nanotechnology fields including cell imaging, nanofluidics, nanomedicine, nanoelectronics, nanomaterials, and synthetic biology. PMID:27058370

  5. Quantitative Electron Nanodiffraction.

    SciTech Connect

    Spence, John

    2015-01-30

    This final report summarizes progress under this award for the final reporting period 2002-2013 in our development of quantitative electron nanodiffraction and its application to materials problems, especially atomistic processes in semiconductors and electronic oxides such as the new artificial oxide multilayers, where our microdiffraction is complemented with energy-loss spectroscopy (ELNES) and aberration-corrected STEM imaging (9). The method has also been used to map out the chemical bonds in the important GaN semiconductor (1) used for solid-state lighting, and to understand the effects of stacking-sequence variations and interfaces in digital oxide superlattices (8). Other projects include the development of a laser-beam Zernike phase plate for cryo-electron microscopy (5) (based on the Kapitza-Dirac effect), work on the reconstruction of molecular images using the scattering from many identical molecules lying in random orientations (4), a review article on space-group determination for the International Tables for Crystallography (10), the observation of energy-loss spectra with millivolt energy resolution and sub-nanometer spatial resolution from individual point defects in an alkali halide, a review article for the centenary of X-ray diffraction (17), and the development of a new method of electron-beam lithography (12). We briefly summarize here the work on GaN, on oxide superlattice ELNES, and on lithography by STEM.

  6. Quantitative Phase Retrieval in Transmission Electron Microscopy

    NASA Astrophysics Data System (ADS)

    McLeod, Robert Alexander

    Phase retrieval in the transmission electron microscope offers the unique potential to collect quantitative data regarding the electric and magnetic properties of materials at the nanoscale. Substantial progress in the field of quantitative phase imaging was made by improvements to the technique of off-axis electron holography. In this thesis, several breakthroughs have been achieved that improve the quantitative analysis of phase retrieval. An accurate means of measuring the electron wavefront coherence in two dimensions was developed and practical applications demonstrated. The detector modulation-transfer function (MTF) was assessed by the slanted-edge, noise, and novel holographic techniques. It was shown that the traditional slanted-edge technique underestimates the MTF. In addition, progress was made in dark and gain reference normalization of images, and it was shown that incomplete read-out is a concern for slow-scan CCD detectors. Last, the phase error due to electron shot noise was reduced by the technique of summation of hologram series. The phase error, which limits the finest electric and magnetic phenomena that can be investigated, was reduced by over 900% with no loss of spatial resolution. Quantitative agreement between the experimental root-mean-square phase error and the analytical prediction of phase error was achieved.

  7. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps. Meanwhile, singleplex detection cannot meet the demands of the absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. We tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD), and the specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicate that this method is suitable for the routine detection and quantitation of GMO events. PMID:27016439
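
    As background, digital PCR quantitation typically applies a Poisson correction to the fraction of positive partitions, and a duplex assay then reports GMO content as the ratio of event-specific to reference-gene copies. The sketch below illustrates that general calculation; it is not the authors' validated workflow, and it ignores assay-specific factors such as copy number per haploid genome.

        import math

        def copies_per_partition(positives, total):
            """Poisson-corrected mean copies per partition: lambda = -ln(1 - positives/total)."""
            if not 0 <= positives < total:
                raise ValueError("need 0 <= positives < total")
            return -math.log(1.0 - positives / total)

        def gmo_content_percent(event_positives, reference_positives, total_partitions):
            """GMO content (%) as the ratio of event-specific to reference-gene copies
            estimated from the same duplex digital PCR run."""
            event = copies_per_partition(event_positives, total_partitions)
            reference = copies_per_partition(reference_positives, total_partitions)
            return 100.0 * event / reference

        # Example: 180 event-positive and 9000 reference-positive partitions out of 20000
        print(gmo_content_percent(180, 9000, 20000))   # roughly 1.5 %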

  8. Nitrogen as a friendly addition to steel

    SciTech Connect

    Rawers, J.C.

    2006-01-01

    Interstitial alloying with nitrogen or carbon is a common means of enhancing properties of iron-based alloys. Interstitial nitrogen addition to fcc-phase Fe-Cr-Mn/Ni alloys results in improved mechanical properties, whereas addition of carbon can result in the formation of unwanted carbides. Carbon addition to low alloy, bcc-phase iron alloys significantly improves strength through the formation of carbides, whereas addition of nitrogen in bcc-phase iron alloys can result in porous casting and reduced mechanical properties. This study will show that alloying iron-based alloys with both nitrogen and carbon can produce positive results. Nitrogen addition to Fe-C and Fe-Cr-C alloys, and both nitrogen and nitrogen-carbon additions to Fe-Cr-Mn/Ni alloys altered the microstructure, improved mechanical properties, increased hardness, and reduced wear by stabilizing the fcc-phase and altering (possibly eliminating) precipitate formation.

  9. On the Additive and Dominant Variance and Covariance of Individuals Within the Genomic Selection Scope

    PubMed Central

    Vitezica, Zulma G.; Varona, Luis; Legarra, Andres

    2013-01-01

    Genomic evaluation models can fit additive and dominant SNP effects. Under quantitative genetics theory, additive or “breeding” values of individuals are generated by substitution effects, which involve both “biological” additive and dominant effects of the markers. Dominance deviations include only a portion of the biological dominant effects of the markers. Additive variance includes variation due to the additive and dominant effects of the markers. We describe a matrix of dominant genomic relationships across individuals, D, which is similar to the G matrix used in genomic best linear unbiased prediction. This matrix can be used in a mixed-model context for genomic evaluations or to estimate dominant and additive variances in the population. From the “genotypic” value of individuals, an alternative parameterization defines additive and dominance as the parts attributable to the additive and dominant effect of the markers. This approach underestimates the additive genetic variance and overestimates the dominance variance. Transforming the variances from one model into the other is trivial if the distribution of allelic frequencies is known. We illustrate these results with mouse data (four traits, 1884 mice, and 10,946 markers) and simulated data (2100 individuals and 10,000 markers). Variance components were estimated correctly in the model, considering breeding values and dominance deviations. For the model considering genotypic values, the inclusion of dominant effects biased the estimate of additive variance. Genomic models were more accurate for the estimation of variance components than their pedigree-based counterparts. PMID:24121775
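
    For illustration, one standard construction of additive (G) and dominance (D) genomic relationship matrices from 0/1/2 genotype codes is sketched below, assuming allele frequencies estimated from the data. Scaling conventions vary in the literature, and this sketch should not be taken as the exact parameterization analyzed in the paper.

        import numpy as np

        def genomic_relationship_matrices(genotypes):
            """One standard construction of additive (G) and dominance (D) genomic
            relationship matrices from an (individuals x markers) array of 0/1/2
            genotype codes. G follows a VanRaden-type scaling; D uses the
            dominance-deviation coding {-2p^2, 2pq, -2q^2}. Illustrative only."""
            m = np.asarray(genotypes, dtype=float)
            p = m.mean(axis=0) / 2.0                 # frequency of the counted allele
            q = 1.0 - p

            z = m - 2.0 * p                          # additive (breeding-value) coding
            g = z @ z.T / np.sum(2.0 * p * q)

            w = np.where(m == 2, -2.0 * q ** 2,      # dominance-deviation coding
                np.where(m == 1, 2.0 * p * q, -2.0 * p ** 2))
            d = w @ w.T / np.sum((2.0 * p * q) ** 2)
            return g, d

        # Example with a small random genotype matrix
        g, d = genomic_relationship_matrices(np.random.randint(0, 3, size=(10, 200)))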

  10. Quantitative Literacy: Geosciences and Beyond

    NASA Astrophysics Data System (ADS)

    Richardson, R. M.; McCallum, W. G.

    2002-12-01

    Quantitative literacy seems like such a natural for the geosciences, right? The field has gone from its origin as a largely descriptive discipline to one where it is hard to imagine failing to bring a full range of mathematical tools to the solution of geological problems. Although there are many definitions of quantitative literacy, we have proposed one that is analogous to the UNESCO definition of conventional literacy: "A quantitatively literate person is one who, with understanding, can both read and represent quantitative information arising in his or her everyday life." Central to this definition is the concept that a curriculum for quantitative literacy must go beyond the basic ability to "read and write" mathematics and develop conceptual understanding. It is also critical that a curriculum for quantitative literacy be engaged with a context, be it everyday life, humanities, geoscience or other sciences, business, engineering, or technology. Thus, our definition works both within and outside the sciences. What role do geoscience faculty have in helping students become quantitatively literate? Is it our role, or that of the mathematicians? How does quantitative literacy vary between different scientific and engineering fields? Or between science and nonscience fields? We will argue that successful quantitative literacy curricula must be an across-the-curriculum responsibility. We will share examples of how quantitative literacy can be developed within a geoscience curriculum, beginning with introductory classes for nonmajors (using the Mauna Loa CO2 data set) through graduate courses in inverse theory (using singular value decomposition). We will highlight six approaches to across-the curriculum efforts from national models: collaboration between mathematics and other faculty; gateway testing; intensive instructional support; workshops for nonmathematics faculty; quantitative reasoning requirement; and individual initiative by nonmathematics faculty.

  11. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  12. Quantitative Spectroscopy of Deneb

    NASA Astrophysics Data System (ADS)

    Schiller, Florian; Przybilla, N.

    We use the visually brightest A-type supergiant Deneb (A2 Ia) as benchmark for testing a spectro- scopic analysis technique developed for quantitative studies of BA-type supergiants. Our NLTE spectrum synthesis technique allows us to derive stellar parameters and elemental abundances with unprecedented accuracy. The study is based on a high-resolution and high-S/N spectrum obtained with the Echelle spectrograph FOCES on the Calar Alto 2.2 m telescope. Practically all inconsistencies reported in earlier studies are resolved. A self-consistent view of Deneb is thus obtained, allowing us to discuss its evolutionary state in detail by comparison with the most recent generation of evolution models for massive stars. The basic atmospheric parameters Teff = 8525 ± 75 K and log g = 1.10 ± 0.05 dex (cgs) and the distance imply the following fundamental parameters for Deneb: M spec = 17 ± 3 M⊙ , L = 1.77 ± 0.29 · 105 L⊙ and R = 192 ± 16 R⊙ . The derived He and CNO abundances indicate mixing with nuclear processed matter. The high N/C ratio of 4.64 ± 1.39 and a N/O ratio of 0.88 ± 0.07 (mass fractions) could in principle be explained by evolutionary models with initially very rapid rotation. A mass of ˜ 22 M⊙ is implied for the progenitor on the zero-age main se- quence, i.e. it was a late O-type star. Significant mass-loss has occurred, probably enhanced by pronounced centrifugal forces. The observational constraints favour a scenario for the evolu- tion of Deneb where the effects of rotational mixing may be amplified by an interaction with a magnetic field. Analogous analyses of such highly luminous BA-type supergiants will allow for precision studies of different galaxies in the Local Group and beyond.

  13. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  14. Getting Over the Quantitative-Qualitative Debate.

    ERIC Educational Resources Information Center

    Howe, Kenneth R.

    1992-01-01

    Describes the evolution of the qualitative-quantitative debate, and suggests that educational researchers learn to live with the necessary tensions resulting from accepting elements of each approach. The proposed critical educational research model is illustrated through examples that go beyond a positivist-interpretivist split. (SLD)

  15. An Additive Manufacturing Test Artifact

    PubMed Central

    Moylan, Shawn; Slotwinski, John; Cooke, April; Jurrens, Kevin; Donmez, M Alkan

    2014-01-01

    A test artifact, intended for standardization, is proposed for the purpose of evaluating the performance of additive manufacturing (AM) systems. A thorough analysis of previously proposed AM test artifacts as well as experience with machining test artifacts have inspired the design of the proposed test artifact. This new artifact is designed to provide a characterization of the capabilities and limitations of an AM system, as well as to allow system improvement by linking specific errors measured in the test artifact to specific sources in the AM system. The proposed test artifact has been built in multiple materials using multiple AM technologies. The results of several of the builds are discussed, demonstrating how the measurement results can be used to characterize and improve a specific AM system. PMID:26601039

  16. An Additive Manufacturing Test Artifact.

    PubMed

    Moylan, Shawn; Slotwinski, John; Cooke, April; Jurrens, Kevin; Donmez, M Alkan

    2014-01-01

    A test artifact, intended for standardization, is proposed for the purpose of evaluating the performance of additive manufacturing (AM) systems. A thorough analysis of previously proposed AM test artifacts as well as experience with machining test artifacts have inspired the design of the proposed test artifact. This new artifact is designed to provide a characterization of the capabilities and limitations of an AM system, as well as to allow system improvement by linking specific errors measured in the test artifact to specific sources in the AM system. The proposed test artifact has been built in multiple materials using multiple AM technologies. The results of several of the builds are discussed, demonstrating how the measurement results can be used to characterize and improve a specific AM system. PMID:26601039

  17. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  18. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  19. Optimization of quantitative infrared analysis

    NASA Astrophysics Data System (ADS)

    Duerst, Richard W.; Breneman, W. E.; Dittmar, Rebecca M.; Drugge, Richard E.; Gagnon, Jim E.; Pranis, Robert A.; Spicer, Colleen K.; Stebbings, William L.; Westberg, J. W.; Duerst, Marilyn D.

    1994-01-01

    A number of industrial processes, especially quality assurance procedures, accept information on relative quantities of components in mixtures, whenever absolute values for the quantitative analysis are unavailable. These relative quantities may be determined from infrared intensity ratios even though known standards are unavailable. Repeatability (vs precision) in quantitative analysis is a critical parameter for meaningful results. In any given analysis, multiple runs provide "answers" with a certain standard deviation. Obviously, the lower the standard deviation, the better the precision. In attempting to minimize the standard deviation and thus improve precision, we need to delineate which contributing factors we have control over (such as sample preparation techniques, data analysis methodology) and which factors we have little control over (environmental and instrument noise, for example). For a given set of conditions, the best instrumental precision achievable on an IR instrument should be determinable. Traditionally, the term "signal-to-noise" (S/N) has been used for a single spectrum, realizing that S/N improves with an increase in the number of scans coadded for generation of that single spectrum. However, the S/N ratio does not directly reflect the precision achievable for an absorbing band. We prefer to use the phrase "maximum achievable instrument precision" (MAIP), which is equivalent to the minimum relative standard deviation for a given peak (either height or area) in spectra. For a specific analysis, the analyst should have in mind the desired precision. Only if the desired precision is less than the MAIP will the analysis be feasible. Once the MAIP is established, other experimental procedures may be modified to improve the analytical precision, if it is below that which is expected (the MAIP).
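
    As a rough illustration of the MAIP idea, the sketch below estimates the instrument-limited precision as the relative standard deviation of a peak measurement across replicate spectra and compares it with the analyst's target; the replicate values and the target are invented for the example, not taken from the paper.

        import numpy as np

        def relative_std(peak_values):
            """Relative standard deviation (percent) of repeated peak measurements."""
            peak_values = np.asarray(peak_values, dtype=float)
            return 100.0 * peak_values.std(ddof=1) / peak_values.mean()

        # Hypothetical replicate peak areas from repeated scans of the same sample.
        replicate_peak_areas = [0.512, 0.515, 0.509, 0.514, 0.511]

        maip_estimate = relative_std(replicate_peak_areas)  # proxy for the MAIP under fixed conditions
        target_rsd = 1.0                                    # analyst's desired precision, as a percent RSD

        print(f"estimated instrument-limited RSD: {maip_estimate:.2f}%")
        # Feasible only when the requested precision is no tighter than the instrument limit.
        print("analysis feasible:", target_rsd >= maip_estimate)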

  20. Quantitation and detection of vanadium in biologic and pollution materials

    NASA Technical Reports Server (NTRS)

    Gordon, W. A.

    1974-01-01

    A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.

  1. Workshop on quantitative dynamic stratigraphy

    SciTech Connect

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  2. Helping Students Become Quantitatively Literate

    ERIC Educational Resources Information Center

    Piatek-Jimenez, Katrina; Marcinek, Tibor; Phelps, Christine M.; Dias, Ana

    2012-01-01

    In recent years, the term "quantitative literacy" has become a buzzword in the mathematics community. But what does it mean, and is it something that should be incorporated into the high school mathematics classroom? In this article, the authors will define quantitative literacy (QL), discuss how teaching for QL differs from teaching a traditional…

  3. QUANTITATIVE 15N NMR SPECTROSCOPY

    EPA Science Inventory

    Line intensities in 15N NMR spectra are strongly influenced by spin-lattice and spin-spin relaxation times, relaxation mechanisms and experimental conditions. Special care has to be taken in using 15N spectra for quantitative purposes. Quantitative aspects are discussed for the 1...

  4. Sensitivity, noise and quantitative model of Laser Speckle Contrast Imaging

    NASA Astrophysics Data System (ADS)

    Yuan, Shuai

    based on our model. In our experimental results, we saw significant improvements in data analyses using our model and calibration procedure, though they are still not as large as we had hoped. (2) The major noise source affecting the quantitative model is CCD systematic noise, which can add additional contrast in the image. We studied this separately to understand its nature. We proposed several methods to reduce CCD noise based on our noise model. Beyond those studies, we also did the following: (1) we performed several studies of the statistical properties of laser speckle images. Our results show that intensities in static speckle images and dynamic speckle patterns follow gamma probability distributions. (2) For future implementation and instrumentation of LSCI, we studied different approximation algorithms to speed the speckle contrast (SC) processing in software and hardware as well as the requirements of the camera. (3) The study of polarization effect shows that the experimental result is consistent with theoretical analyses. (4) By comparing different models, we found that the Brownian motion model can be used as a general model for most biomedical applications and Durian's new model is slightly better than Briers' original model, though the latter is still applicable to general theory analyses. (5) The new technique combining LSCI with phosphorescence lifetime imaging (PLI) can provide simultaneous 2D maps of partial pressure of oxygen (pO2) and cerebral blood flow (CBF). The capability of the system was demonstrated by monitoring the propagation of cortical spreading depression (CSD) waves through the sealed cranial window. This technique has the potential to be a novel tool for quantitative analysis of the dynamic delivery of oxygen and brain tissue metabolism.
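
    The speckle contrast (SC) processing referred to above reduces, in its simplest form, to a local standard-deviation-to-mean ratio; the sketch below shows that standard calculation with an optional subtraction of a camera-noise variance term, not the authors' full model. The window size and noise value are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def speckle_contrast(raw_image, window=7, camera_noise_var=0.0):
            """Local speckle contrast K = sigma / mean over a sliding window.

            camera_noise_var (intensity units squared) is optionally subtracted
            from the local variance as a crude CCD-noise correction.
            """
            img = raw_image.astype(float)
            mean = uniform_filter(img, size=window)
            mean_sq = uniform_filter(img**2, size=window)
            var = np.clip(mean_sq - mean**2 - camera_noise_var, 0.0, None)
            return np.sqrt(var) / np.clip(mean, 1e-9, None)

        # Synthetic fully developed speckle (negative-exponential intensities) should give K close to 1.
        rng = np.random.default_rng(0)
        static_speckle = rng.exponential(scale=100.0, size=(256, 256))
        K = speckle_contrast(static_speckle, window=7)
        print(f"median contrast: {np.median(K):.2f}")  # expected near 1 for static speckle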

  5. Comparison of official methods for 'readily oxidizable substances' in propionic acid as a food additive.

    PubMed

    Ishiwata, H; Takeda, Y; Kawasaki, Y; Kubota, H; Yamada, T

    1996-01-01

    The official methods for 'readily oxidizable substances (ROS)' in propionic acid as a food additive were compared. The methods examined were those adopted in the Compendium of Food Additive Specifications (CFAS) by the Joint FAO-WHO Expert Committee on Food Additives, FAO, the Japanese Standards for Food Additives (JSFA) by the Ministry of Health and Welfare, Japan, and the Food Chemicals Codex (FCC) by the National Research Council, USA. The methods given in CFAS and JSFA are the same (potassium permanganate consumption). However, by this method, manganese(VII) in potassium permanganate was readily reduced to colourless manganese(II) by some substances contained in the propionic acid before reacting with aldehydes, which are generally considered 'readily oxidizable substances', to form brown manganese(IV) oxide. The FCC method (bromine consumption) for 'ROS' could be recommended because it yielded quantitative results for 'ROS', including aldehydes. PMID:8647299

  6. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.
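
    The study fit a threshold model of latent liability to the twin data; as a simplified illustration of how twin correlations separate additive from dominance variance, the sketch below uses the textbook ADE decomposition rather than the authors' threshold model, with made-up correlation values.

        def ade_from_twin_correlations(r_mz, r_dz):
            """Classical ADE decomposition from MZ/DZ twin correlations.

            Model: r_MZ = a2 + d2,  r_DZ = a2/2 + d2/4
            =>     a2 = 4*r_DZ - r_MZ,  d2 = 2*r_MZ - 4*r_DZ,  e2 = 1 - r_MZ
            A negative a2 estimate (r_DZ < r_MZ/4) points toward dominance with
            little or no additive variance, the pattern reported for injury liability.
            """
            a2 = 4 * r_dz - r_mz
            d2 = 2 * r_mz - 4 * r_dz
            e2 = 1 - r_mz
            return a2, d2, e2

        # Illustrative (made-up) liability correlations, not the paper's estimates.
        print(ade_from_twin_correlations(r_mz=0.48, r_dz=0.08))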

  7. Next Generation Quantitative Genetics in Plants

    PubMed Central

    Jiménez-Gómez, José M.

    2011-01-01

    Most characteristics in living organisms show continuous variation, which suggests that they are controlled by multiple genes. Quantitative trait loci (QTL) analysis can identify the genes underlying continuous traits by establishing associations between genetic markers and observed phenotypic variation in a segregating population. The new high-throughput sequencing (HTS) technologies greatly facilitate QTL analysis by providing genetic markers at genome-wide resolution in any species without previous knowledge of its genome. In addition HTS serves to quantify molecular phenotypes, which aids to identify the loci responsible for QTLs and to understand the mechanisms underlying diversity. The constant improvements in price, experimental protocols, computational pipelines, and statistical frameworks are making feasible the use of HTS for any research group interested in quantitative genetics. In this review I discuss the application of HTS for molecular marker discovery, population genotyping, and expression profiling in QTL analysis. PMID:22645550

  8. Principles of Quantitative Research.

    ERIC Educational Resources Information Center

    Kitao, S. Kathleen

    Research results should not be taken at face value; some research is not well designed, and readers must be able to assess whether the research carried out actually supports the results or may be explained otherwise. Research reports are usually divided into introduction or literature review, methods, results, and discussion and conclusions. Basic…

  9. Quantitative microbial ecology through stable isotope probing.

    PubMed

    Hungate, Bruce A; Mau, Rebecca L; Schwartz, Egbert; Caporaso, J Gregory; Dijkstra, Paul; van Gestel, Natasja; Koch, Benjamin J; Liu, Cindy M; McHugh, Theresa A; Marks, Jane C; Morrissey, Ember M; Price, Lance B

    2015-11-01

    Bacteria grow and transform elements at different rates, and as yet, quantifying this variation in the environment is difficult. Determining isotope enrichment with fine taxonomic resolution after exposure to isotope tracers could help, but there are few suitable techniques. We propose a modification to stable isotope probing (SIP) that enables the isotopic composition of DNA from individual bacterial taxa after exposure to isotope tracers to be determined. In our modification, after isopycnic centrifugation, DNA is collected in multiple density fractions, and each fraction is sequenced separately. Taxon-specific density curves are produced for labeled and nonlabeled treatments, from which the shift in density for each individual taxon in response to isotope labeling is calculated. Expressing each taxon's density shift relative to that taxon's density measured without isotope enrichment accounts for the influence of nucleic acid composition on density and isolates the influence of isotope tracer assimilation. The shift in density translates quantitatively to isotopic enrichment. Because this revision to SIP allows quantitative measurements of isotope enrichment, we propose to call it quantitative stable isotope probing (qSIP). We demonstrated qSIP using soil incubations, in which soil bacteria exhibited strong taxonomic variations in (18)O and (13)C composition after exposure to [(18)O]water or [(13)C]glucose. The addition of glucose increased the assimilation of (18)O into DNA from [(18)O]water. However, the increase in (18)O assimilation was greater than expected based on utilization of glucose-derived carbon alone, because the addition of glucose indirectly stimulated bacteria to utilize other substrates for growth. This example illustrates the benefit of a quantitative approach to stable isotope probing. PMID:26296731
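
    A minimal sketch of the central qSIP bookkeeping described above: compute each taxon's abundance-weighted mean buoyant density from the sequenced density fractions, then take the shift between labeled and unlabeled treatments. Converting the shift to atom fraction excess requires additional constants from the paper and is omitted; all numbers are hypothetical.

        import numpy as np

        def taxon_weighted_density(fraction_densities, taxon_abundances):
            """Abundance-weighted mean buoyant density (g/mL) of one taxon's DNA
            across the density fractions of a single SIP tube."""
            w = np.asarray(taxon_abundances, dtype=float)
            d = np.asarray(fraction_densities, dtype=float)
            return np.sum(w * d) / np.sum(w)

        # Hypothetical per-fraction data for one taxon (densities in g/mL,
        # abundances = copies of that taxon recovered in each fraction).
        densities = [1.685, 1.695, 1.705, 1.715, 1.725]
        unlabeled = [5, 40, 35, 15, 5]
        labeled = [2, 20, 35, 30, 13]

        shift = taxon_weighted_density(densities, labeled) - taxon_weighted_density(densities, unlabeled)
        print(f"density shift: {shift:.4f} g/mL")  # larger shift => more tracer assimilated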

  10. NSCLC tumor shrinkage prediction using quantitative image features.

    PubMed

    Hunter, Luke A; Chen, Yi Pei; Zhang, Lifei; Matney, Jason E; Choi, Haesun; Kry, Stephen F; Martel, Mary K; Stingo, Francesco; Liao, Zhongxing; Gomez, Daniel; Yang, Jinzhong; Court, Laurence E

    2016-04-01

    The objective of this study was to develop a quantitative image feature model to predict non-small cell lung cancer (NSCLC) volume shrinkage from pre-treatment CT images. 64 stage II-IIIB NSCLC patients with similar treatments were all imaged using the same CT scanner and protocol. For each patient, the planning gross tumor volume (GTV) was deformed onto the week 6 treatment image, and tumor shrinkage was quantified as the deformed GTV volume divided by the planning GTV volume. Geometric, intensity histogram, absolute gradient image, co-occurrence matrix, and run-length matrix image features were extracted from each planning GTV. Prediction models were generated using principal component regression with simulated annealing subset selection. Performance was quantified using the mean squared error (MSE) between the predicted and observed tumor shrinkages. Permutation tests were used to validate the results. The optimal prediction model gave a strong correlation between the observed and predicted tumor shrinkages with r=0.81 and MSE=8.60×10⁻³. Compared to predictions based on the mean population shrinkage, this resulted in a 2.92-fold reduction in MSE. In conclusion, this study indicated that quantitative image features extracted from existing pre-treatment CT images can successfully predict tumor shrinkage and provide additional information for clinical decisions regarding patient risk stratification, treatment, and prognosis. PMID:26878137
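
    The sketch below illustrates principal component regression of shrinkage on image features and the comparison against a mean-population baseline, evaluated in-sample on synthetic data; the simulated-annealing subset selection and permutation testing used in the study are omitted.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(1)

        # Synthetic stand-ins: 64 patients x 20 image features, and observed shrinkage
        # defined as (deformed GTV volume at week 6) / (planning GTV volume).
        X = rng.normal(size=(64, 20))
        shrinkage = 0.6 + 0.1 * X[:, 0] - 0.05 * X[:, 3] + rng.normal(scale=0.05, size=64)

        pcr = make_pipeline(PCA(n_components=5), LinearRegression())
        pcr.fit(X, shrinkage)
        pred = pcr.predict(X)  # in-sample fit, for illustration only

        mse_model = np.mean((pred - shrinkage) ** 2)
        mse_mean = np.mean((shrinkage.mean() - shrinkage) ** 2)  # mean-population baseline
        print(f"model MSE: {mse_model:.4f}, baseline MSE: {mse_mean:.4f}")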

  11. Performance Assessment in Fingerprinting and Multi Component Quantitative NMR Analyses.

    PubMed

    Gallo, Vito; Intini, Nicola; Mastrorilli, Piero; Latronico, Mario; Scapicchio, Pasquale; Triggiani, Maurizio; Bevilacqua, Vitoantonio; Fanizzi, Paolo; Acquotti, Domenico; Airoldi, Cristina; Arnesano, Fabio; Assfalg, Michael; Benevelli, Francesca; Bertelli, Davide; Cagliani, Laura R; Casadei, Luca; Cesare Marincola, Flaminia; Colafemmina, Giuseppe; Consonni, Roberto; Cosentino, Cesare; Davalli, Silvia; De Pascali, Sandra A; D'Aiuto, Virginia; Faccini, Andrea; Gobetto, Roberto; Lamanna, Raffaele; Liguori, Francesca; Longobardi, Francesco; Mallamace, Domenico; Mazzei, Pierluigi; Menegazzo, Ileana; Milone, Salvatore; Mucci, Adele; Napoli, Claudia; Pertinhez, Thelma; Rizzuti, Antonino; Rocchigiani, Luca; Schievano, Elisabetta; Sciubba, Fabio; Sobolev, Anatoly; Tenori, Leonardo; Valerio, Mariacristina

    2015-07-01

    An interlaboratory comparison (ILC) was organized with the aim to set up quality control indicators suitable for multicomponent quantitative analysis by nuclear magnetic resonance (NMR) spectroscopy. A total of 36 NMR data sets (corresponding to 1260 NMR spectra) were produced by 30 participants using 34 NMR spectrometers. The calibration line method was chosen for the quantification of a five-component model mixture. Results show that quantitative NMR is a robust quantification tool and that 26 out of 36 data sets resulted in statistically equivalent calibration lines for all considered NMR signals. The performance of each laboratory was assessed by means of a new performance index (named Qp-score) which is related to the difference between the experimental and the consensus values of the slope of the calibration lines. Laboratories endowed with a Qp-score falling within the suitable acceptability range are qualified to produce NMR spectra that can be considered statistically equivalent in terms of relative intensities of the signals. In addition, the specific response of nuclei to the experimental excitation/relaxation conditions was addressed by means of the parameter named NR. NR is related to the difference between the theoretical and the consensus slopes of the calibration lines and is specific for each signal produced by a well-defined set of acquisition parameters. PMID:26020452
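
    The abstract does not reproduce the Qp-score formula, so the sketch below only computes the quantity it is described as depending on: each laboratory's calibration-line slope and its deviation from a consensus slope. The laboratory data are invented.

        import numpy as np

        # Hypothetical calibration data: for each lab, NMR signal integrals measured at
        # the same set of known concentrations (mM) of one model-mixture component.
        concentrations = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
        lab_integrals = {
            "lab_A": np.array([0.99, 2.02, 4.01, 8.05, 16.1]),
            "lab_B": np.array([1.05, 2.10, 4.25, 8.40, 16.9]),
            "lab_C": np.array([0.90, 1.85, 3.70, 7.40, 14.8]),
        }

        slopes = {lab: np.polyfit(concentrations, y, 1)[0] for lab, y in lab_integrals.items()}
        consensus = np.median(list(slopes.values()))

        for lab, slope in slopes.items():
            deviation = 100.0 * (slope - consensus) / consensus  # % deviation from consensus slope
            print(f"{lab}: slope={slope:.4f}, deviation from consensus={deviation:+.1f}%")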

  12. Quantitating Metabolites in Protein Precipitated Serum Using NMR Spectroscopy

    PubMed Central

    2015-01-01

    Quantitative NMR-based metabolite profiling is challenged by the deleterious effects of abundant proteins in the intact blood plasma/serum, which underscores the need for alternative approaches. Protein removal by ultrafiltration using low molecular weight cutoff filters thus represents an important step. However, protein precipitation, an alternative and simple approach for protein removal, lacks detailed quantitative assessment for use in NMR based metabolomics. In this study, we have comprehensively evaluated the performance of protein precipitation using methanol, acetonitrile, perchloric acid, and trichloroacetic acid and ultrafiltration approaches using 1D and 2D NMR, based on the identification and absolute quantitation of 44 human blood metabolites, including a few identified for the first time in the NMR spectra of human serum. We also investigated the use of a “smart isotope tag,” 15N-cholamine for further resolution enhancement, which resulted in the detection of a number of additional metabolites. 1H NMR of both protein precipitated and ultrafiltered serum detected all 44 metabolites with comparable reproducibility (average CV, 3.7% for precipitation; 3.6% for filtration). However, nearly half of the quantified metabolites in ultrafiltered serum exhibited 10–74% lower concentrations; specifically, tryptophan, benzoate, and 2-oxoisocaproate showed much lower concentrations compared to protein precipitated serum. These results indicate that protein precipitation using methanol offers a reliable approach for routine NMR-based metabolomics of human blood serum/plasma and should be considered as an alternative to ultrafiltration. Importantly, protein precipitation, which is commonly used by mass spectrometry (MS), promises avenues for direct comparison and correlation of metabolite data obtained from the two analytical platforms to exploit their combined strength in the metabolomics of blood. PMID:24796490

  13. Addition polyimide end cap study

    NASA Technical Reports Server (NTRS)

    St. Clair, T. L.

    1980-01-01

    The characterization of addition polyimides with various end caps for adhesive applications in 120-250 C environments is discussed. Oligomeric polyimides were prepared from 3,3',4,4'-benzophenone tetracarboxylic dianhydride and 3,3'-methylenedianiline which were end-capped with functionally reactive moieties that cause crosslinking when the oligomers are heated to 200-400 C. The syntheses of the oligomers are outlined. The thermolysis of the oligomers was studied by differential scanning calorimetry and the resulting polymers were characterized by differential thermal analysis and adhesive performance. The adhesive data include lap shear strengths on titanium 6-4 adherends both before and after aging for 1000 hours at 121 C and/or 232 C.

  14. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research. PMID:23346707

  15. Evaluation of quantitative accuracy in CZT-based pre-clinical SPECT for various isotopes

    NASA Astrophysics Data System (ADS)

    Park, S.-J.; Yu, A. R.; Kim, Y.-s.; Kang, W.-S.; Jin, S. S.; Kim, J.-S.; Son, T. J.; Kim, H.-J.

    2015-05-01

    In vivo pre-clinical single-photon emission computed tomography (SPECT) is a valuable tool for functional small animal imaging, but several physical factors, such as scatter radiation, limit the quantitative accuracy of conventional scintillation crystal-based SPECT. Semiconductor detectors such as CZT overcome these deficiencies through superior energy resolution. To our knowledge, little scientific information exists regarding the accuracy of quantitative analysis in CZT-based pre-clinical SPECT systems for different isotopes. The aim of this study was to assess the quantitative accuracy of CZT-based pre-clinical SPECT for four isotopes: 201Tl, 99mTc, 123I, and 111In. The quantitative accuracy of the CZT-based Triumph X-SPECT (Gamma-Medica Ideas, Northridge, CA, U.S.A.) was compared with that of a conventional SPECT using GATE simulation. Quantitative errors due to the attenuation and scatter effects were evaluated for all four isotopes with energy windows of 5%, 10%, and 20%. A spherical source containing the isotope was placed at the center of the air-or-water-filled mouse-sized cylinder phantom. The CZT-based pre-clinical SPECT was more accurate than the conventional SPECT. For example, in the conventional SPECT with an energy window of 10%, scatter effects degraded quantitative accuracy by up to 11.52%, 5.10%, 2.88%, and 1.84% for 201Tl, 99mTc, 123I, and 111In, respectively. However, with the CZT-based pre-clinical SPECT, the degradations were only 9.67%, 5.45%, 2.36%, and 1.24% for 201Tl, 99mTc, 123I, and 111In, respectively. As the energy window was increased, the quantitative errors increased in both SPECT systems. Additionally, the isotopes with lower energy of photon emissions had greater quantitative error. Our results demonstrated that the CZT-based pre-clinical SPECT had lower overall quantitative errors due to reduced scatter and high detection efficiency. Furthermore, the results of this systematic assessment quantifying the accuracy of these SPECT

  16. Incorporation of additives into polymers

    DOEpatents

    McCleskey, T. Mark; Yates, Matthew Z.

    2003-07-29

    There has been invented a method for incorporating additives into polymers comprising: (a) forming an aqueous or alcohol-based colloidal system of the polymer; (b) emulsifying the colloidal system with a compressed fluid; and (c) contacting the colloidal polymer with the additive in the presence of the compressed fluid. The colloidal polymer can be contacted with the additive by having the additive in the compressed fluid used for emulsification or by adding the additive to the colloidal system before or after emulsification with the compressed fluid. The invention process can be carried out either as a batch process or as a continuous on-line process.

  17. Deciphering the roles of multiple additives in organocatalyzed Michael additions.

    PubMed

    Günler, Z Inci; Companyó, Xavier; Alfonso, Ignacio; Burés, Jordi; Jimeno, Ciril; Pericàs, Miquel A

    2016-05-21

    The synergistic effects of multiple additives (water and acetic acid) on the asymmetric Michael addition of acetone to nitrostyrene catalyzed by primary amine-thioureas (PAT) were precisely determined. Acetic acid facilitates hydrolysis of the imine intermediates, thus leading to catalytic behavior, and minimizes the formation of the double addition side product. In contrast, water slows down the reaction but minimizes catalyst deactivation, eventually leading to higher final yields. PMID:27128165

  18. Use of quantitative approaches in plan development.

    PubMed

    Palmer, B Z

    1978-04-01

    Health Planning as mandated by P.L. 93-641 requires considerable emphasis on technical procedures, especially during the development of the 5-year Health Systems Plans (HSP) and the one-year Annual Implementation Plans (AIP). In addition, the State Health Plans and the State Medical Facilities Plans, which are to be developed in part on the basis of HSPs and AIPs of the Health Systems Agencies (HSAs) in each state, are expected to have solid quantitative documentation. The gap between these expectations and the state-of-the-art reality is reviewed in this article. PMID:10307191

  19. Quantitative Radiological Diagnosis Of The Temporomandibular Joint

    NASA Astrophysics Data System (ADS)

    Jordan, Steven L.; Heffez, Leslie B.

    1989-05-01

    Recent impressive technological advances in imaging techniques for the human temporomandibular (tm) joint, and in enabling geometric algorithms have outpaced diagnostic analyses. The authors present a basis for systematic quantitative diagnoses that exploit the imaging advancements. A reference line, coordinate system, and transformations are described that are appropriate for tomography of the tm joint. These yield radiographic measurements (disk displacement) and observations (beaking of radiopaque dye and disk shape) that refine diagnostic classifications of anterior displacement of the condylar disk. The relevance of these techniques has been clinically confirmed. Additional geometric invariants and procedures are proposed for future clinical verification.

  20. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills that are required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  1. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by immediate detection and the composition similarities have been calculated, LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  2. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated to the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by immediate detection and the composition similarities have been calculated, LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
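
    As an illustration of the two kinds of fingerprint similarity discussed in this record, the sketch below computes a correlation-type qualitative similarity and a content-sensitive (no-intercept slope) quantitative similarity between a sample and a reference fingerprint; these are generic stand-ins, not the published LQLS/LQTS formulas.

        import numpy as np

        def qualitative_similarity(sample, reference):
            """Correlation-type similarity: insensitive to the overall content level."""
            s, r = np.asarray(sample, float), np.asarray(reference, float)
            return np.dot(s, r) / (np.linalg.norm(s) * np.linalg.norm(r))

        def quantitative_similarity(sample, reference):
            """Slope of the no-intercept fit of sample vs. reference peak areas:
            close to 1 when total chemical content matches the reference."""
            s, r = np.asarray(sample, float), np.asarray(reference, float)
            return np.dot(s, r) / np.dot(r, r)

        reference_fp = np.array([120.0, 45.0, 300.0, 15.0, 60.0])  # reference peak areas
        sample_fp = 0.8 * reference_fp + np.array([1.0, -0.5, 3.0, 0.2, -1.0])

        print(f"qualitative similarity:  {qualitative_similarity(sample_fp, reference_fp):.3f}")
        print(f"quantitative similarity: {quantitative_similarity(sample_fp, reference_fp):.3f}")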

  3. Enhancing the Quantitative Representation of Socioeconomic Conditions in the Shared Socio-economic Pathways (SSPs) using the International Futures Model

    NASA Astrophysics Data System (ADS)

    Rothman, D. S.; Siraj, A.; Hughes, B.

    2013-12-01

    The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios is the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly by the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.

  4. 75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ...) to Appendix A of its Respiratory Protection Standard (see 69 FR 46986). OSHA also published on... qualitative fit-testing protocol (see 72 FR 72971). Subsequently, OSHA withdrew, without prejudice, this fit... further research addressing issues described in the withdrawal notice (see 74 FR 30250). II. Summary...

  5. Quantitative blood flow velocity imaging using laser speckle flowmetry.

    PubMed

    Nadort, Annemarie; Kalkman, Koen; van Leeuwen, Ton G; Faber, Dirk J

    2016-01-01

    Laser speckle flowmetry suffers from a debated quantification of the inverse relation between decorrelation time (τc) and blood flow velocity (V), i.e. 1/τc = αV. Using a modified microcirculation imager (integrated sidestream dark field - laser speckle contrast imaging [SDF-LSCI]), we experimentally investigate the influence of the optical properties of scatterers on α in vitro and in vivo. We found good agreement with theoretical predictions within certain limits for scatterer size and multiple scattering. We present a practical model-based scaling factor to correct for multiple scattering in microcirculatory vessels. Our results show that SDF-LSCI offers a quantitative measure of flow velocity in addition to vessel morphology, enabling the quantification of the clinically relevant blood flow, velocity and tissue perfusion. PMID:27126250

  6. Quantitative blood flow velocity imaging using laser speckle flowmetry

    NASA Astrophysics Data System (ADS)

    Nadort, Annemarie; Kalkman, Koen; van Leeuwen, Ton G.; Faber, Dirk J.

    2016-04-01

    Laser speckle flowmetry suffers from a debated quantification of the inverse relation between decorrelation time (τc) and blood flow velocity (V), i.e. 1/τc = αV. Using a modified microcirculation imager (integrated sidestream dark field - laser speckle contrast imaging [SDF-LSCI]), we experimentally investigate the influence of the optical properties of scatterers on α in vitro and in vivo. We found good agreement with theoretical predictions within certain limits for scatterer size and multiple scattering. We present a practical model-based scaling factor to correct for multiple scattering in microcirculatory vessels. Our results show that SDF-LSCI offers a quantitative measure of flow velocity in addition to vessel morphology, enabling the quantification of the clinically relevant blood flow, velocity and tissue perfusion.

  7. Quantitative blood flow velocity imaging using laser speckle flowmetry

    PubMed Central

    Nadort, Annemarie; Kalkman, Koen; van Leeuwen, Ton G.; Faber, Dirk J.

    2016-01-01

    Laser speckle flowmetry suffers from a debated quantification of the inverse relation between decorrelation time (τc) and blood flow velocity (V), i.e. 1/τc = αV. Using a modified microcirculation imager (integrated sidestream dark field - laser speckle contrast imaging [SDF-LSCI]), we experimentally investigate the influence of the optical properties of scatterers on α in vitro and in vivo. We found good agreement with theoretical predictions within certain limits for scatterer size and multiple scattering. We present a practical model-based scaling factor to correct for multiple scattering in microcirculatory vessels. Our results show that SDF-LSCI offers a quantitative measure of flow velocity in addition to vessel morphology, enabling the quantification of the clinically relevant blood flow, velocity and tissue perfusion. PMID:27126250
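
    A minimal sketch of the inversion implied by 1/τc = αV, with an optional correction factor standing in for the model-based multiple-scattering scaling described above; α, τc and the correction value are placeholders, not results from the paper.

        def velocity_from_decorrelation(tau_c, alpha, multiple_scattering_factor=1.0):
            """Invert 1/tau_c = alpha * V for the flow velocity V.

            alpha is the (calibrated) proportionality constant; the optional factor
            stands in for a model-based correction for multiple scattering in a vessel.
            """
            return 1.0 / (alpha * tau_c) / multiple_scattering_factor

        # Illustrative numbers only: tau_c in seconds, alpha in 1/mm.
        tau_c = 2.0e-4
        alpha = 5.0e2
        print(f"estimated velocity: {velocity_from_decorrelation(tau_c, alpha):.2f} mm/s")
        print(f"with scattering correction: {velocity_from_decorrelation(tau_c, alpha, 1.5):.2f} mm/s")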

  8. A New Simple Interferometer for Obtaining Quantitatively Evaluable Flow Patterns

    NASA Technical Reports Server (NTRS)

    Erdmann, S F

    1953-01-01

    The method described in the present report makes it possible to obtain interferometer records with the aid of any one of the available Schlieren optics by the addition of very simple expedients, and these records fundamentally need not be inferior to those obtained by other methods, such as the Mach-Zehnder interferometer, for example. The method is based on the fundamental concept of the phase-contrast process developed by Zernike, which in principle has been enlarged to such an extent that it practically represents an independent interference method for general applications. Moreover, the method offers the possibility, in case of necessity, of superposing any apparent wedge field on the density field to be gauged. The theory is explained on a purely physical basis and illustrated and proved by experimental data. A number of typical cases are cited and some quantitative results reported.

  9. The Additive Coloration of Alkali Halides

    ERIC Educational Resources Information Center

    Jirgal, G. H.; and others

    1969-01-01

    Describes the construction and use of an inexpensive, vacuum furnace designed to produce F-centers in alkali halide crystals by additive coloration. The method described avoids corrosion or contamination during the coloration process. Examination of the resultant crystals is discussed and several experiments using additively colored crystals are…

  10. Developing Multiplicative Thinking from Additive Reasoning

    ERIC Educational Resources Information Center

    Tobias, Jennifer M.; Andreasen, Janet B.

    2013-01-01

    As students progress through elementary school, they encounter mathematics concepts that shift from additive to multiplicative situations (NCTM 2000). When they encounter fraction problems that require multiplicative thinking, they tend to incorrectly extend additive properties from whole numbers (Post et al. 1985). As a result, topics such as …

  11. Quantitative micro-CT

    NASA Astrophysics Data System (ADS)

    Prevrhal, Sven

    2005-09-01

    Micro-CT for bone structural analysis has progressed from an in-vitro laboratory technique to devices for in-vivo assessment of small animals and the peripheral human skeleton. Currently, topological parameters of bone architecture are the primary goals of analysis. Additional measurement of the density or degree of mineralization (DMB) of trabecular and cortical bone at the microscopic level is desirable to study effects of disease and treatment progress. This information is not commonly extracted because of the challenges of accurate measurement and calibration at the tissue level. To assess the accuracy of micro-CT DMB measurements in a realistic but controlled situation, we prepared bone-mimicking aqueous solutions of K2HPO4 at concentrations of 100 to 600 mg/cm3 and scanned them with micro-CT, both in glass vials and in microcapillary tubes with inner diameters of 50, 100 and 150 μm to simulate trabecular thickness. Values of the linear attenuation coefficient μ in the reconstructed image are commonly affected by beam hardening effects for larger samples and by partial volume effects for small volumes. We implemented an iterative reconstruction technique to reduce beam hardening. We sought to reduce partial volume effects by excluding voxels near the tube wall. With these two measures, improvement in the constancy of the reconstructed voxel values and in their linearity with solution concentration could be observed, to over 90% accuracy. However, since the expected change in real bone is small, more measurements are needed to confirm that micro-CT can indeed be adapted to assess bone mineralization at the tissue level.
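
    A minimal sketch of the calibration step implied above: fit a line of reconstructed attenuation against the known phantom concentrations, then convert a bone voxel value to an equivalent mineral density. The phantom readings below are invented.

        import numpy as np

        # Hypothetical phantom data: known K2HPO4 concentrations (mg/cm^3) and the mean
        # reconstructed attenuation value measured in each vial (arbitrary units).
        concentration = np.array([100.0, 200.0, 300.0, 400.0, 500.0, 600.0])
        mean_mu = np.array([0.215, 0.261, 0.309, 0.355, 0.402, 0.449])

        slope, intercept = np.polyfit(concentration, mean_mu, 1)
        r = np.corrcoef(concentration, mean_mu)[0, 1]
        print(f"calibration: mu = {slope:.5f}*C + {intercept:.3f}, r^2 = {r**2:.4f}")

        # Convert a trabecular-bone voxel value back to an equivalent mineral density.
        voxel_mu = 0.38
        equivalent_density = (voxel_mu - intercept) / slope
        print(f"equivalent mineral density: {equivalent_density:.0f} mg/cm^3")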

  12. A quantitative philology of introspection

    PubMed Central

    Diuk, Carlos G.; Slezak, D. Fernandez; Raskovsky, I.; Sigman, M.; Cecchi, G. A.

    2012-01-01

    The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the “Axial Age,” saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy—which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single “arrow of time” in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the twentieth century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus. PMID:23015783

  13. Performance of calibration standards for antigen quantitation with flow cytometry.

    PubMed

    Lenkei, R; Gratama, J W; Rothe, G; Schmitz, G; D'hautcourt, J L; Arekrans, A; Mandy, F; Marti, G

    1998-10-01

    Within the framework of the activities initiated by the Task Force for Antigen Quantitation of the European Working Group on Clinical Cell Analysis (EWGCCA), an experiment was conducted to evaluate microbead standards used for quantitative flow cytometry (QFCM). A unified window of analysis (UWA) was established on three different instruments (EPICS XL [Coulter Corporation, Miami, FL], FACScan and FACS Calibur [Becton Dickinson, San Jose, CA]) with QC3 microbeads (FCSC, PR). By using this defined fluorescence intensity scale, the performance of several monoclonal antibodies directed to CD3, CD4, and CD8 (conjugated and unconjugated), from three manufacturers (BDIS, Coulter [Immunotech], and DAKO) was tested. In addition, the QIFI system (DAKO) and QuantiBRITE (BDIS), and a method of relative fluorescence intensity (RFI, method of Giorgi), were compared. mAbs reacting with three more antigens, CD16, CD19, and CD38, were tested on the FACScan instrument. Quantitation was carried out using a single batch of cryopreserved peripheral blood leukocytes, and all tests were performed as single-color analyses. Significant correlations were observed between the antibody-binding capacity (ABC) values of the same CD antigen measured with various calibrators and with antibodies differing with respect to vendor, labeling and possible epitope recognition. Despite the significant correlations, the ABC values of most monoclonal antibodies differed by 20-40% when determined by the different fluorochrome conjugates and different calibrators. The results of this study indicate that, at the present stage of QFCM, consistent ABC values may be attained between laboratories provided that a specific calibration system is used, including specific calibrators, reagents, and protocols. PMID:9773879
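
    Bead-based quantitation of this kind usually amounts to interpolating a cell population's fluorescence on a calibration curve built from beads with assigned binding capacities; the sketch below shows that generic step with invented bead values, not the QIFI, QuantiBRITE or RFI procedures themselves.

        import numpy as np

        # Hypothetical calibration beads: assigned ABC values and measured median fluorescence.
        bead_abc = np.array([5_000, 20_000, 80_000, 300_000], dtype=float)
        bead_mfi = np.array([120.0, 450.0, 1_800.0, 6_900.0])

        # Fit log(ABC) vs log(MFI); flow calibration curves are close to linear on log-log axes.
        coef = np.polyfit(np.log10(bead_mfi), np.log10(bead_abc), 1)

        def mfi_to_abc(mfi):
            """Convert a cell population's median fluorescence to an ABC estimate."""
            return 10 ** np.polyval(coef, np.log10(mfi))

        cd4_mfi = 1_250.0  # hypothetical lymphocyte CD4 median fluorescence
        print(f"estimated CD4 ABC: {mfi_to_abc(cd4_mfi):,.0f} antibodies/cell")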

  14. Quantitative Mineralogical Characterization of Oregon Erionite

    NASA Astrophysics Data System (ADS)

    Dogan, A.; Dogan, M.; Ballirano, P.

    2006-12-01

    Erionite has been classified as a Group 1 human carcinogen by the IARC Working Group. The fibrogenic potential of erionite varies from low to high yield of mesothelioma. This may require quantitative characterization of the physicochemical properties of erionite before any experimental design. The toxicity of the mineral is such that quantitative characterization of erionite is extremely important. Yet erionite specimens have often been incompletely or incorrectly characterized, throwing doubt on the results of the work. For example, none of the Turkish erionites published until recently had a balance error (E%) of less than 10%, and the Mg cation content of the type specimen of erionite-Ca from Maze, Niigata Prefecture, Japan is more than 0.8. In the present study, an erionite sample from near Rome, Oregon has been quantitatively characterized using powder X-ray diffraction, Rietveld refinement, scanning electron microscopy, energy dispersive spectroscopy, inductively coupled plasma mass spectrometry, and Mössbauer spectroscopy. The cell parameters of the erionite-K from Oregon are computed as a=13.2217(2) Å and c=15.0671 Å; the chemical composition of the erionite, as major oxides, rare earth elements and other trace elements, is characterized quantitatively. Crystal chemistries of the erionite are computed based upon the guidelines of the IMA zeolite report of 1997.

  15. The Effect of an Experimental Bottleneck upon Quantitative Genetic Variation in the Housefly

    PubMed Central

    Bryant, Edwin H.; McCommas, Steven A.; Combs, Lisa M.

    1986-01-01

    Effects of a population bottleneck (founder-flush cycle) upon quantitative genetic variation of morphometric traits were examined in replicated experimental lines of the housefly founded with one, four or 16 pairs of flies. Heritability and additive genetic variances for eight morphometric traits generally increased as a result of the bottleneck, but the pattern of increase among bottleneck sizes differed among traits. Principal axes of the additive genetic correlation matrix for the control line yielded two suites of traits, one associated with general body size and another set largely independent of body size. In the former set containing five of the traits, additive genetic variance was greatest in the bottleneck size of four pairs, whereas in the latter set of two traits the largest additive genetic variance occurred in the smallest bottleneck size of one pair. One trait exhibited changes in additive genetic variance intermediate between these two major responses. These results were inconsistent with models of additive effects of alleles within loci or of additive effects among loci. An observed decline in viability measures and body size in the bottleneck lines also indicated that there was nonadditivity of allelic effects for these traits. Several possible nonadditive models were explored that increased additive genetic variance as a result of a bottleneck. These included a model with complete dominance, a model with overdominance and a model incorporating multiplicative epistasis. PMID:17246359

  16. Composition of fingermark residue: a qualitative and quantitative review.

    PubMed

    Girod, Aline; Ramotowski, Robert; Weyermann, Céline

    2012-11-30

    This article describes the composition of fingermark residue as being a complex system with numerous compounds coming from different sources and evolving over time from the initial composition (corresponding to the composition right after deposition) to the aged composition (corresponding to the evolution of the initial composition over time). This complex system will additionally vary due to effects of numerous influence factors grouped in five different classes: the donor characteristics, the deposition conditions, the substrate nature, the environmental conditions and the applied enhancement techniques. The initial and aged compositions as well as the influence factors are thus considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue up to now. The analytical techniques used to obtain these data are also enumerated. This review highlights the fact that despite the numerous analytical processes that have already been proposed and tested to elucidate fingermark composition, advanced knowledge is still missing. Thus, there is a real need to conduct future research on the composition of fingermark residue, focusing particularly on quantitative measurements, aging kinetics and effects of influence factors. The results of future research are particularly important for advances in fingermark enhancement and dating technique developments. PMID:22727572

  17. Quantitative proteomics for identifying biomarkers for tuberculous meningitis

    PubMed Central

    2012-01-01

    Introduction Tuberculous meningitis is a frequent extrapulmonary disease caused by Mycobacterium tuberculosis and is associated with high mortality rates and severe neurological sequelae. In an earlier study employing DNA microarrays, we had identified genes that were differentially expressed at the transcript level in human brain tissue from cases of tuberculous meningitis. In the current study, we used a quantitative proteomics approach to discover protein biomarkers for tuberculous meningitis. Methods To compare brain tissues from confirmed cases of tuberculous meningitis with uninfected brain tissue, we carried out quantitative protein expression profiling using iTRAQ labeling and LC-MS/MS analysis of SCX fractionated peptides on Agilent’s accurate mass QTOF mass spectrometer. Results and conclusions Through this approach, we identified both known and novel differentially regulated molecules. Those described previously included signal-regulatory protein alpha (SIRPA) and protein disulfide isomerase family A, member 6 (PDIA6), which have been shown to be overexpressed at the mRNA level in tuberculous meningitis. The novel overexpressed proteins identified in our study included amphiphysin (AMPH) and neurofascin (NFASC) while ferritin light chain (FTL) was found to be downregulated in TBM. We validated amphiphysin, neurofascin and ferritin light chain using immunohistochemistry which confirmed their differential expression in tuberculous meningitis. Overall, our data provides insights into the host response in tuberculous meningitis at the molecular level in addition to providing candidate diagnostic biomarkers for tuberculous meningitis. PMID:23198679

  18. Quantitative reconstructions in palaeolimnology: new paradigm or sick science?

    NASA Astrophysics Data System (ADS)

    Juggins, Steve

    2013-03-01

    Quantitative reconstructions from biological proxies have revolutionised palaeolimnology but the methodology is not without problems. The most important of these result from attempts to reconstruct non-causal environmental variables and from the effects of secondary variables. Non-causal variables act as surrogates for often unknown or unquantified ecological factors and the method assumes that these relationships are invariant in space and time. This assumption is almost never met and examples of diatom models for water depth and summer temperature demonstrate how violation leads to spurious and misleading reconstructions. In addition, comparison of published species optima indicate that a number of models have little or no predictive power outside their current spatial setting. Finally, experiments using simulated training sets of known properties demonstrate how changes in secondary "nuisance" variables can lead to large, consistent, and interpretable trends in a reconstruction that are completely spurious and independent of any real change in the reconstructed variable. These problems pervade many quantitative reconstructions in palaeolimnology and other disciplines. Palaeoecologists must give greater attention to what can and cannot be reconstructed and explicitly address the dangers of reconstructing surrogate and confounded variables if our reconstructions are to remain credible.

  19. Quantitative Proteomics Using Ultralow Flow Capillary Electrophoresis–Mass Spectrometry

    PubMed Central

    2015-01-01

    In this work, we evaluate the incorporation of an ultralow flow interface for coupling capillary electrophoresis (CE) and mass spectrometry (MS), in combination with reversed-phase high-pressure liquid chromatography (HPLC) fractionation as an alternate workflow for quantitative proteomics. Proteins, extracted from a SILAC (stable isotope labeling by amino acids in cell culture) labeled and an unlabeled yeast strain were mixed and digested enzymatically in solution. The resulting peptides were fractionated using RP-HPLC and analyzed by CE–MS yielding a total of 28 538 quantified peptides that correspond to 3 272 quantified proteins. CE–MS analysis was performed using a neutral capillary coating, providing the highest separation efficiency at ultralow flow conditions (<10 nL/min). Moreover, we were able to demonstrate that CE–MS is a powerful method for the identification of low-abundance modified peptides within the same sample. Without any further enrichment strategies, we succeeded in quantifying 1 371 phosphopeptides present in the CE–MS data set and found 49 phosphopeptides to be differentially regulated in the two yeast strains. Including acetylation, phosphorylation, deamidation, and oxidized forms, a total of 8 106 modified peptides could be identified in addition to 33 854 unique peptide sequences found. The work presented here shows the first quantitative proteomics approach that combines SILAC labeling with CE–MS analysis. PMID:25839223

  20. Quantitatively Probing the Means of Controlling Nanoparticle Assembly on Surfaces

    SciTech Connect

    Patete, J.M.; Wong, S.; Peng, X.; Serafin, J.M.

    2011-05-17

    As a means of developing a simple, cost-effective, and reliable method for probing nanoparticle behavior, we have used atomic force microscopy to gain a quantitative 3D visual representation of the deposition patterns of citrate-capped Au nanoparticles on a substrate as a function of (a) sample preparation, (b) the choice of substrate, (c) the dispersion solvent, and (d) the number of loading steps. Specifically, we have found that all four parameters can be independently controlled and manipulated in order to alter the resulting pattern and quantity of as-deposited nanoparticles. From these data, the sample preparation technique appears to influence deposition patterns most broadly, and the dispersion solvent is the most convenient parameter to use in tuning the quantity of nanoparticles deposited onto the surface under spin-coating conditions. Indeed, we have quantitatively measured the effect of surface coverage for both mica and silicon substrates under preparation techniques associated with (i) evaporation under ambient air, (ii) heat treatment, and (iii) spin-coating preparation conditions. In addition, we have observed a decrease in nanoparticle adhesion to a substrate when the ethylene glycol content of the colloidal dispersion solvent is increased, which had the effect of decreasing interparticle-substrate interactions. Finally, we have shown that substrates prepared by these diverse techniques have potential applicability in surface-enhanced Raman spectroscopy.

  1. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular. PMID:23650936

  2. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve as an initial assessment of dermal exposure, amongst others, resulting in a ranking of tasks and subsequently jobs. DREAM consists of an inventory and evaluation part. Two examples of dermal exposure of workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and determine most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative, dermal exposure assessment. PMID:12505908

  3. Enantioselective Michael Addition of Water

    PubMed Central

    Chen, Bi-Shuang; Resch, Verena; Otten, Linda G; Hanefeld, Ulf

    2015-01-01

    The enantioselective Michael addition using water as both nucleophile and solvent has to date proved beyond the ability of synthetic chemists. Herein, the direct, enantioselective Michael addition of water in water to prepare important β-hydroxy carbonyl compounds using whole cells of Rhodococcus strains is described. Good yields and excellent enantioselectivities were achieved with this method. Deuterium labeling studies demonstrate that a Michael hydratase catalyzes the water addition exclusively with anti-stereochemistry. PMID:25529526

  4. Gasoline additives, emissions, and performance

    SciTech Connect

    1995-12-31

    The papers included in this publication deal with the influence of fuel, additive, and hardware changes on a variety of vehicle performance characteristics. Advanced techniques for measuring these performance parameters are also described. Contents include: Fleet test evaluation of gasoline additives for intake valve and combustion chamber deposit clean up; A technique for evaluating octane requirement additives in modern engines on dynamometer test stands; A fleet test of two additive technologies comparing their effects on tailpipe emissions; Investigation into the vehicle exhaust emissions of high percentage ethanol blends; Variability in hydrocarbon speciation measurements at low emission (ULEV) levels; and more.

  5. Quantitative rainbow schlieren deflectometry.

    PubMed

    Greenberg, P S; Klimek, R B; Buchele, D R

    1995-07-01

    In the rainbow schlieren apparatus, a continuously graded rainbow filter is placed in the back focal plane of the decollimating lens. Refractive-index gradients in the test section thus appear as gradations in hue rather than irradiance. A simple system is described wherein a conventional color CCD array and video digitizer are used to quantify accurately the color attributes of the resulting image, and hence the associated ray deflections. The present system provides a sensitivity comparable with that of conventional interferometry, while being simpler to implement and less sensitive to mechanical misalignment. PMID:21052205

  6. Quantitative rainbow schlieren deflectometry

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Klimek, Robert B.; Buchele, Donald R.

    1995-01-01

    In the rainbow schlieren apparatus, a continuously graded rainbow filter is placed in the back focal plane of the decollimating lens. Refractive-index gradients in the test section thus appear as gradations in hue rather than irradiance. A simple system is described wherein a conventional color CCD array and video digitizer are used to quantify accurately the color attributes of the resulting image, and hence the associated ray deflections. The present system provides a sensitivity comparable with that of conventional interferometry, while being simpler to implement and less sensitive to mechanical misalignment.
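
    Because hue in a rainbow schlieren image encodes the position of the deflected ray in the filter plane, converting a measured hue to a deflection angle reduces to applying the filter calibration. The sketch below assumes a linear hue-versus-position calibration and invented optical parameters; none of the numbers come from the cited system.

```python
# Sketch: converting measured hue to ray deflection angle in a rainbow
# schlieren system, assuming a linear hue-versus-position calibration of the
# rainbow filter. All numbers are illustrative assumptions.
import numpy as np

hue_min, hue_max = 0.05, 0.60   # hue range spanned by the graded filter
filter_height_mm = 3.0          # extent of the filter in the focal plane
focal_length_mm = 500.0         # decollimating lens focal length

def deflection_from_hue(hue):
    """Return ray deflection (radians) from a measured hue value."""
    # Position of the deflected ray in the filter plane, relative to its center.
    y_mm = (hue - 0.5 * (hue_min + hue_max)) / (hue_max - hue_min) * filter_height_mm
    # Small-angle deflection: focal-plane displacement over focal length.
    return y_mm / focal_length_mm

print(deflection_from_hue(np.array([0.10, 0.32, 0.55])))
```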

  7. Defining Breast Cancer Intrinsic Subtypes by Quantitative Receptor Expression

    PubMed Central

    Cheang, Maggie C.U.; Martin, Miguel; Nielsen, Torsten O.; Prat, Aleix; Voduc, David; Rodriguez-Lescure, Alvaro; Ruiz, Amparo; Chia, Stephen; Shepherd, Lois; Ruiz-Borrego, Manuel; Calvo, Lourdes; Alba, Emilio; Carrasco, Eva; Caballero, Rosalia; Tu, Dongsheng; Pritchard, Kathleen I.; Levine, Mark N.; Bramwell, Vivien H.; Parker, Joel; Bernard, Philip S.; Ellis, Matthew J.; Perou, Charles M.; Di Leo, Angelo

    2015-01-01

    Purpose. To determine intrinsic breast cancer subtypes represented within categories defined by quantitative hormone receptor (HR) and HER2 expression. Methods. We merged 1,557 cases from three randomized phase III trials into a single data set. These breast tumors were centrally reviewed in each trial for quantitative ER, PR, and HER2 expression by immunohistochemistry (IHC) stain and by reverse transcription-quantitative polymerase chain reaction (RT-qPCR), with intrinsic subtyping by research-based PAM50 RT-qPCR assay. Results. Among 283 HER2-negative tumors with <1% HR expression by IHC, 207 (73%) were basal-like; other subtypes, particularly HER2-enriched (48, 17%), were present. Among the 1,298 HER2-negative tumors, borderline HR (1%–9% staining) was uncommon (n = 39), and these tumors were heterogeneous: 17 (44%) luminal A/B, 12 (31%) HER2-enriched, and only 7 (18%) basal-like. Including them in the definition of triple-negative breast cancer significantly diminished enrichment for basal-like cancer (p < .05). Among 106 HER2-positive tumors with <1% HR expression by IHC, the HER2-enriched subtype was the most frequent (87, 82%), whereas among 127 HER2-positive tumors with strong HR (>10%) expression, only 69 (54%) were HER2-enriched and 55 (43%) were luminal (39 luminal B, 16 luminal A). Quantitative HR expression by RT-qPCR gave similar results. Regardless of methodology, basal-like cases seldom expressed ER/ESR1 or PR/PGR and were associated with the lowest expression level of HER2/ERBB2 relative to other subtypes. Conclusion. Significant discordance remains between clinical assay-defined subsets and intrinsic subtype. For identifying basal-like breast cancer, the optimal HR IHC cut point was <1%, matching the American Society of Clinical Oncology and College of American Pathologists guidelines. Tumors with borderline HR staining are molecularly diverse and may require additional assays to clarify underlying biology. PMID:25908555

  8. Collection of quantitative chemical release field data.

    SciTech Connect

    Demirgian, J.; Macha, S.; Loyola Univ.

    1999-01-01

    Detection and quantitation of chemicals in the environment requires Fourier-transform infrared (FTIR) instruments that are properly calibrated and tested. This calibration and testing requires field testing using matrices that are representative of actual instrument use conditions. Three methods commonly used for developing calibration files and training sets in the field are a closed optical cell or chamber, a large-scale chemical release, and a small-scale chemical release. There is no best method. The advantages and limitations of each method should be considered in evaluating field results. Proper calibration characterizes the sensitivity of an instrument, its ability to detect a component in different matrices, and the quantitative accuracy and precision of the results.

  9. Additively manufactured porous tantalum implants.

    PubMed

    Wauthle, Ruben; van der Stok, Johan; Amin Yavari, Saber; Van Humbeeck, Jan; Kruth, Jean-Pierre; Zadpoor, Amir Abbas; Weinans, Harrie; Mulier, Michiel; Schrooten, Jan

    2015-03-01

    The medical device industry's interest in open porous, metallic biomaterials has increased in response to additive manufacturing techniques enabling the production of complex shapes that cannot be produced with conventional techniques. Tantalum is an important metal for medical devices because of its good biocompatibility. In this study selective laser melting technology was used for the first time to manufacture highly porous pure tantalum implants with fully interconnected open pores. The architecture of the porous structure in combination with the material properties of tantalum result in mechanical properties close to those of human bone and allow for bone ingrowth. The bone regeneration performance of the porous tantalum was evaluated in vivo using an orthotopic load-bearing bone defect model in the rat femur. After 12 weeks, substantial bone ingrowth, good quality of the regenerated bone and a strong, functional implant-bone interface connection were observed. Compared to identical porous Ti-6Al-4V structures, laser-melted tantalum shows excellent osteoconductive properties, has a higher normalized fatigue strength and allows for more plastic deformation due to its high ductility. It is therefore concluded that this is a first step towards a new generation of open porous tantalum implants manufactured using selective laser melting. PMID:25500631

  10. Color Addition and Subtraction Apps

    ERIC Educational Resources Information Center

    Ruiz, Frances; Ruiz, Michael J.

    2015-01-01

    Color addition and subtraction apps in HTML5 have been developed for students as an online hands-on experience so that they can more easily master principles introduced through traditional classroom demonstrations. The evolution of the additive RGB color model is traced through the early IBM color adapters so that students can proceed step by step…

  11. 75 FR 27313 - Proposed Additions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-14

    ... FROM PEOPLE WHO ARE BLIND OR SEVERELY DISABLED PROCUREMENT LIST Proposed Additions AGENCY: Committee for Purchase From People Who Are Blind or Severely Disabled. ACTION: Proposed additions to the... or Severely Disabled, Jefferson Plaza 2, Suite 10800, 1421 Jefferson Davis Highway,...

  12. Quantitative interferometric microscopy cytometer based on regularized optical flow algorithm

    NASA Astrophysics Data System (ADS)

    Xue, Liang; Vargas, Javier; Wang, Shouyu; Li, Zhenhua; Liu, Fei

    2015-09-01

    Cell detection and analysis are important in various fields, such as medical observation and disease diagnosis. In order to analyze cell parameters as well as observe the samples directly, we present in this paper an improved quantitative interferometric microscopy cytometer, which can monitor the quantitative phase distributions of bio-samples and compile cellular parameter statistics. The proposed system is able to recover the phase image of biological samples over an expanded field of view via a regularized optical flow demodulation algorithm. This algorithm reconstructs the phase distribution with high accuracy from only two interferograms acquired at different time points, simplifying the scanning system. Additionally, the method is fully automatic, and therefore convenient for establishing a quantitative phase cytometer. Moreover, the phase retrieval approach is robust against noise and background. As a demonstration, red blood cells are readily investigated with the quantitative interferometric microscopy cytometer system.

  13. Display considerations for quantitative radiology.

    PubMed

    Badano, Aldo

    2007-01-01

    The early prediction of the response to treatment using quantitative imaging holds great promise for streamlining the development, assessment, approval and personalization of new therapies. However, to realize this potential, quantitative radiology needs to develop an understanding of several limitations that might hinder the application of quantitation tools and techniques. Among these limitations, the fidelity of the display device used to interpret the image data is a significant factor that affects the accuracy and precision of quantitative visual tasks, particularly those involving large, volumetric, multi-dimensional and multi-modality image sets. This paper reviews several aspects of display performance and display image quality that are likely to contribute negatively to the robustness of quantitative imaging methods. Display characteristics that will be addressed include the grayscale and color performance of different classes of display devices, the angular distribution of the emissions of liquid crystal technologies, and the temporal response for stack mode viewing. The paper will also summarize current efforts for the metrology, standardization and image quality assessment methods for display devices. PMID:24980719

  14. Evaluation of additive element to improve PZT piezoelectricity by using first-principles calculation

    NASA Astrophysics Data System (ADS)

    Yasoda, Yutaka; Uetsuji, Yasutomo; Tsuchiya, Kazuyoshi

    2015-12-01

    Piezoelectric materials are important functional materials for Bio-MEMS (Biological Micro Electro Mechanical Systems) actuators and sensors. For implementation in Bio-MEMS, piezoelectric thin films are typically fabricated by sputtering with a view to miniaturization. Among piezoelectric materials, perovskite-type compounds with the ABO3 structure exhibit high piezoelectricity. PZT (lead zirconate titanate), a perovskite-type piezoelectric with zirconium or titanium on the B site, is widely used because it is easy to produce and strongly piezoelectric, and its physical properties change markedly with the B-site composition ratio of zirconium to titanium. Because the B site strongly influences the physical properties, performance improvement through additive elements has been attempted widely; however, the prevailing experimental approaches lack economy and quantitativeness, which makes their results difficult to apply, so a new method of evaluating B-site additive elements for sputter-deposited films is needed. Accordingly, in this research, additive elements were screened quantitatively and at low cost, from an energetic viewpoint, using first-principles calculations. First, additive elements capable of substituting on the B site of PZT were identified. Next, the change in piezoelectricity was evaluated from the change in crystal structure of PZT systems into which each B-site-substitutable additive element was introduced. As a result, additive elements for the PZT B site capable of improving piezoelectricity were determined.

  15. Quantitative imaging of the optical near field.

    PubMed

    Kühler, Paul; García de Abajo, F Javier; Leiprecht, Philipp; Kolloch, Andreas; Solis, Javier; Leiderer, Paul; Siegel, Jan

    2012-09-24

    When exposing small particles on a substrate to a light plane wave, the scattered optical near field is spatially modulated and highly complex. We show, for the particular case of dielectric microspheres, that it is possible to image these optical near-field distributions in a quantitative way. By placing a single microsphere on a thin film of the photosensitive phase change material Ge(2)Sb(5)Te(5) and exposing it to a single short laser pulse, the spatial intensity modulation of the near field is imprinted into the film as a pattern of different material phases. The resulting patterns are investigated by using optical as well as high-resolution scanning electron microscopy. Quantitative information on the local optical near field at each location is obtained by calibrating the material response to pulsed laser irradiation. We discuss the influence of polarization and angle of incidence of the laser beam as well as particle size on the field distribution. The experimental results are in good quantitative agreement with a model based on a rigorous solution of Maxwell's equations. Our results have potential application to near-field optical lithography and experimental determination of near fields in complex nanostructures. PMID:23037356

  16. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, and the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here were developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.

  17. The Use of a Quantitative Cysteinyl-peptide Enrichment Technology for High-Throughput Quantitative Proteomics

    SciTech Connect

    Liu, Tao; Qian, Weijun; Camp, David G.; Smith, Richard D.

    2007-01-02

    Quantitative proteomic measurements are of significant interest in studies aimed at discovering disease biomarkers and providing new insights into biological pathways. A quantitative cysteinyl-peptide enrichment technology (QCET) can be employed to achieve higher efficiency, greater dynamic range, and higher throughput in quantitative proteomic studies that utilize stable-isotope labeling techniques combined with high-resolution liquid chromatography (LC)-mass spectrometry (MS) measurements. The QCET approach involves specific 16O/18O labeling of tryptic peptides, high-efficiency enrichment of cysteinyl-peptides, and confident protein identification and quantification from high resolution LC-Fourier transform ion cyclotron resonance mass spectrometry (FTICR) measurements and a previously established database of accurate mass and elution time information. This methodology is demonstrated by using proteome profiling of naïve and in vitro-differentiated human mammary epithelial cells (HMEC) as an example, which initially resulted in the identification and quantification of 603 proteins in a single LC-FTICR analysis. QCET provides not only highly efficient enrichment of cysteinyl-peptides for more extensive proteome coverage and improved labeling efficiency for better quantitative measurements, but more importantly, a high-throughput strategy suitable for quantitative proteome analysis where extensive or parallel proteomic measurements are required, such as in time course studies of specific pathways and clinical sample analyses for biomarker discovery.

  18. Simulation of Laser Additive Manufacturing and its Applications

    NASA Astrophysics Data System (ADS)

    Lee, Yousub

    Laser and metal powder based additive manufacturing (AM), a key category of advanced Direct Digital Manufacturing (DDM), produces metallic components directly from a digital representation of the part such as a CAD file. It is well suited for the production of high-value, customizable components with complex geometry and the repair of damaged components. Currently, the main challenges for laser and metal powder based AM include the formation of defects (e.g., porosity), low surface finish quality, and spatially non-uniform properties of material. Such challenges stem largely from the limited knowledge of complex physical processes in AM especially the molten pool physics such as melting, molten metal flow, heat conduction, vaporization of alloying elements, and solidification. Direct experimental measurement of melt pool phenomena is highly difficult since the process is localized (on the order of 0.1 mm to 1 mm melt pool size) and transient (on the order of 1 m/s scanning speed). Furthermore, current optical and infrared cameras are limited to observe the melt pool surface. As a result, fluid flows in the melt pool, melt pool shape and formation of sub-surface defects are difficult to be visualized by experiment. On the other hand, numerical simulation, based on rigorous solution of mass, momentum and energy transport equations, can provide important quantitative knowledge of complex transport phenomena taking place in AM. The overarching goal of this dissertation research is to develop an analytical foundation for fundamental understanding of heat transfer, molten metal flow and free surface evolution. Two key types of laser AM processes are studied: a) powder injection, commonly used for repairing of turbine blades, and b) powder bed, commonly used for manufacturing of new parts with complex geometry. In the powder injection simulation, fluid convection, temperature gradient (G), solidification rate (R) and melt pool shape are calculated using a heat transfer

  19. Qualitative and quantitative analysis of steroidal saponins in crude extract and bark powder of Yucca schidigera Roezl.

    PubMed

    Kowalczyk, Mariusz; Pecio, Łukasz; Stochmal, Anna; Oleszek, Wiesław

    2011-08-10

    Steroidal saponins in commercial stem syrup and in a bark extract of Yucca schidigera were identified with high-performance liquid chromatography ion trap mass spectrometry and quantitated using ultraperformance liquid chromatography with quadrupole mass spectrometric detection. Fragmentation patterns of yucca saponins were generated using collision-induced dissociation and compared with fragmentation of authentic standards as well as with published spectrometric information. In addition to detection of twelve saponins known to occur in Y. schidigera, the collected fragmentation data led to tentative identifications of seven new saponins. A quantitation method for all 19 detected compounds was developed and validated. Samples derived from the syrup and the bark of yucca were quantitatively measured and compared. The results indicate that yucca bark accumulates polar, bidesmosidic saponins, whereas steroidal glycosides with medium- and short-length saccharide chains predominate in the stem. The newly developed method provides an opportunity to evaluate the composition of yucca products available on the market. PMID:21721553

  20. A symmetrical fluorous dendron-cyanine dye-conjugated bimodal nanoprobe for quantitative 19F MRI and NIR fluorescence bioimaging.

    PubMed

    Wang, Zhe; Yue, Xuyi; Wang, Yu; Qian, Chunqi; Huang, Peng; Lizak, Marty; Niu, Gang; Wang, Fu; Rong, Pengfei; Kiesewetter, Dale O; Ma, Ying; Chen, Xiaoyuan

    2014-08-01

    (19)F MRI and optical imaging are two powerful noninvasive molecular imaging modalities in biomedical applications. (19)F MRI has great potential for high resolution in vivo imaging, while fluorescent probes enable ultracontrast cellular/tissue imaging with high accuracy and sensitivity. A bimodal probe is developed that integrates the merits of (19)F MRI and fluorescence imaging into a single synthetic molecule, which is further engineered into a nanoprobe; by addressing shortcomings of conventional contrast agents, it enables quantitative (19)F MRI, fluorescence imaging, and cell tracking. Results show that this bimodal imaging nanoprobe presents a high correlation between (19)F MR signal and NIR fluorescence intensity in vitro and in vivo. Additionally, this nanoprobe enables quantitative (19)F MR analysis, confirmed by a complementary fluorescence analysis. This unique feature can hardly be obtained by traditional (19)F MRI contrast agents. It is envisioned that this nanoprobe can hold great potential for quantitative and sensitive multi-modal molecular imaging. PMID:24789108

  1. Quantitative Proteome Mapping of Nitrotyrosines

    SciTech Connect

    Bigelow, Diana J.; Qian, Weijun

    2008-02-10

    An essential first step in understanding disease and environmental perturbations is the early and quantitative detection of increased levels of the inflammatory marker nitrotyrosine, as compared with its endogenous levels within the tissue or cellular proteome. Thus, methods that successfully address proteome-wide quantitation of nitrotyrosine and related oxidative modifications can provide early biomarkers of risk and progression of disease as well as effective strategies for therapy. Multidimensional LC separations coupled with tandem mass spectrometry (LC-MS/MS) have, in recent years, significantly expanded our knowledge of human (and mammalian model system) proteomes, including some nascent work on the identification of post-translational modifications. In the following review, we discuss the application of LC-MS/MS for quantitation and identification of nitrotyrosine-modified proteins within the context of complex protein mixtures presented in mammalian proteomes.

  2. Energy Education: The Quantitative Voice

    NASA Astrophysics Data System (ADS)

    Wolfson, Richard

    2010-02-01

    A serious study of energy use and its consequences has to be quantitative. It makes little sense to push your favorite renewable energy source if it can't provide enough energy to make a dent in humankind's prodigious energy consumption. Conversely, it makes no sense to dismiss alternatives---solar in particular---that supply Earth with energy at some 10,000 times our human energy consumption rate. But being quantitative---especially with nonscience students or the general public---is a delicate business. This talk draws on the speaker's experience presenting energy issues to diverse audiences through single lectures, entire courses, and a textbook. The emphasis is on developing a quick, "back-of-the-envelope" approach to quantitative understanding of energy issues.

  3. Quantitative genetic models for describing simultaneous and recursive relationships between phenotypes.

    PubMed Central

    Gianola, Daniel; Sorensen, Daniel

    2004-01-01

    Multivariate models are of great importance in theoretical and applied quantitative genetics. We extend quantitative genetic theory to accommodate situations in which there is linear feedback or recursiveness between the phenotypes involved in a multivariate system, assuming an infinitesimal, additive, model of inheritance. It is shown that structural parameters defining a simultaneous or recursive system have a bearing on the interpretation of quantitative genetic parameter estimates (e.g., heritability, offspring-parent regression, genetic correlation) when such features are ignored. Matrix representations are given for treating a plethora of feedback-recursive situations. The likelihood function is derived, assuming multivariate normality, and results from econometric theory for parameter identification are adapted to a quantitative genetic setting. A Bayesian treatment with a Markov chain Monte Carlo implementation is suggested for inference and developed. When the system is fully recursive, all conditional posterior distributions are in closed form, so Gibbs sampling is straightforward. If there is feedback, a Metropolis step may be embedded for sampling the structural parameters, since their conditional distributions are unknown. Extensions of the model to discrete random variables and to nonlinear relationships between phenotypes are discussed. PMID:15280252

  4. Quantitative modeling of transcription factor binding specificities using DNA shape.

    PubMed

    Zhou, Tianyin; Shen, Ning; Yang, Lin; Abe, Namiko; Horton, John; Mann, Richard S; Bussemaker, Harmen J; Gordân, Raluca; Rohs, Remo

    2015-04-14

    DNA binding specificities of transcription factors (TFs) are a key component of gene regulatory processes. Underlying mechanisms that explain the highly specific binding of TFs to their genomic target sites are poorly understood. A better understanding of TF-DNA binding requires the ability to quantitatively model TF binding to accessible DNA as its basic step, before additional in vivo components can be considered. Traditionally, these models were built based on nucleotide sequence. Here, we integrated 3D DNA shape information derived with a high-throughput approach into the modeling of TF binding specificities. Using support vector regression, we trained quantitative models of TF binding specificity based on protein binding microarray (PBM) data for 68 mammalian TFs. The evaluation of our models included cross-validation on specific PBM array designs, testing across different PBM array designs, and using PBM-trained models to predict relative binding affinities derived from in vitro selection combined with deep sequencing (SELEX-seq). Our results showed that shape-augmented models compared favorably to sequence-based models. Although both k-mer and DNA shape features can encode interdependencies between nucleotide positions of the binding site, using DNA shape features reduced the dimensionality of the feature space. In addition, analyzing the feature weights of DNA shape-augmented models uncovered TF family-specific structural readout mechanisms that were not revealed by the DNA sequence. As such, this work combines knowledge from structural biology and genomics, and suggests a new path toward understanding TF binding and genome function. PMID:25775564
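
    A minimal sketch of the shape-augmented modeling idea is given below: sequences are one-hot encoded, per-position shape features are appended, and a support vector regression is fit to binding signals. The toy sequences, shape values, and signals are assumptions for illustration; real PBM intensities and DNA-shape tables would replace them.

```python
# Sketch: shape-augmented binding-specificity model via support vector
# regression. Feature encoding and data are toy illustrations, not the
# cited study's PBM data or high-throughput shape predictions.
import numpy as np
from sklearn.svm import SVR

def one_hot(seq):
    order = "ACGT"
    return np.array([[1.0 if b == n else 0.0 for n in order] for b in seq]).ravel()

# Toy 4-mer probes with hypothetical per-position shape features
# (e.g., minor groove width) and measured binding signals.
seqs = ["ACGT", "AAGT", "TCGA", "GGCC"]
shape = np.array([[5.1, 4.8, 5.0, 5.3],
                  [5.0, 4.9, 5.1, 5.2],
                  [4.7, 5.2, 5.3, 4.9],
                  [5.4, 5.5, 5.2, 5.1]])
signal = np.array([0.9, 0.7, 0.3, 0.1])

X = np.hstack([np.array([one_hot(s) for s in seqs]), shape])
model = SVR(kernel="linear").fit(X, signal)
print(model.predict(X))
```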

  5. Color Addition and Subtraction Apps

    NASA Astrophysics Data System (ADS)

    Ruiz, Frances; Ruiz, Michael J.

    2015-10-01

    Color addition and subtraction apps in HTML5 have been developed for students as an online hands-on experience so that they can more easily master principles introduced through traditional classroom demonstrations. The evolution of the additive RGB color model is traced through the early IBM color adapters so that students can proceed step by step in understanding mathematical representations of RGB color. Finally, color addition and subtraction are presented for the X11 colors from web design to illustrate yet another real-life application of color mixing.

  6. Quantitative analysis of sandstone porosity

    SciTech Connect

    Ferrell, R.E. Jr.; Carpenter, P.K.

    1988-01-01

    A quantitative analysis of changes in porosity associated with sandstone diagenesis was accomplished with digital back-scattered electron image analysis techniques. The volume percent (vol. %) of macroporosity, quartz, clay minerals, feldspar, and other constituents combined with stereological parameters, such as the size and shape of the analyzed features, permitted the determination of cement volumes, the ratio of primary to secondary porosity, and the relative abundance of detrital and authigenic clay minerals. The analyses were produced with a JEOL 733 Superprobe and a TRACOR/NORTHERN 5700 Image Analyzer System. The results provided a numerical evaluation of sedimentological facies controls and diagenetic effects on the permeabilities of potential reservoirs. In a typical application, subtle differences in the diagenetic development of porosity were detected in Wilcox sandstones from central Louisiana. Mechanical compaction of these shoreface sandstones has reduced the porosity to approximately 20%. In most samples with permeabilities greater than 10 md, the measured ratio of macroporosity to microporosity associated with pore-filling kaolinite was 3:1. In other sandstones with lower permeabilities, the measured ratio was higher, but the volume of pore-filling clay was essentially the same. An analysis of the frequency distribution of pore diameters and shapes revealed that the latter samples contained 2-3 vol% of grain-dissolution or moldic porosity. Fluid entry to these large pores was restricted and the clays produced from the grain dissolution products reduced the observed permeability. The image analysis technique provided valuable data for the distinction of productive and nonproductive intervals in this reservoir.

  7. Quantitative nature of overexpression experiments

    PubMed Central

    Moriya, Hisao

    2015-01-01

    Overexpression experiments are sometimes considered as qualitative experiments designed to identify novel proteins and study their function. However, in order to draw conclusions regarding protein overexpression through association analyses using large-scale biological data sets, we need to recognize the quantitative nature of overexpression experiments. Here I discuss the quantitative features of two different types of overexpression experiment: absolute and relative. I also introduce the four primary mechanisms involved in growth defects caused by protein overexpression: resource overload, stoichiometric imbalance, promiscuous interactions, and pathway modulation associated with the degree of overexpression. PMID:26543202

  8. Quantitative intracerebral brain hemorrhage analysis

    NASA Astrophysics Data System (ADS)

    Loncaric, Sven; Dhawan, Atam P.; Cosic, Dubravko; Kovacevic, Domagoj; Broderick, Joseph; Brott, Thomas

    1999-05-01

    In this paper, a system for 3-D quantitative analysis of human spontaneous intracerebral brain hemorrhage (ICH) is described. The purpose of the developed system is to perform quantitative 3-D measurements of the parameters of the ICH region from computed tomography (CT) images. The parameter measured in this phase of the system's development is the volume of the hemorrhage region. The goal of the project is to measure these parameters for a large number of patients with ICH and to correlate the measured parameters with patient morbidity and mortality.
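
    Once the hemorrhage region has been segmented from the CT volume, its volume follows from counting voxels and scaling by the voxel dimensions. The sketch below uses an assumed synthetic mask and voxel spacing for illustration; it is not the cited system's segmentation pipeline.

```python
# Sketch: computing hemorrhage volume from a binary 3-D segmentation of CT
# data by counting voxels and scaling by voxel size. The mask and voxel
# dimensions are illustrative assumptions.
import numpy as np

voxel_mm = (0.5, 0.5, 5.0)                 # in-plane spacing and slice thickness (mm)
mask = np.zeros((50, 256, 256), dtype=bool)
mask[20:30, 100:140, 110:150] = True       # stand-in for a segmented ICH region

voxel_volume_mm3 = voxel_mm[0] * voxel_mm[1] * voxel_mm[2]
volume_ml = mask.sum() * voxel_volume_mm3 / 1000.0   # 1 mL = 1000 mm^3
print(f"ICH volume: {volume_ml:.1f} mL")
```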

  9. Software for quantitative trait analysis

    PubMed Central

    2005-01-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed. PMID:16197737

  10. Quantitative measurement of nanomechanical properties in composite materials

    NASA Astrophysics Data System (ADS)

    Zhao, Wei

    results significantly, and new, power-law body of revolution models of the probe tip geometry have been applied. Due to the low yield strength of polymers compared with other engineering materials, elastic-plastic contact is considered to better represent the epoxy surface response and was used to acquire more accurate quantitative measurements. Visco-elastic contact response was introduced in the boundary condition of the AFAM cantilever vibration model, due to the creep nature of epoxy, to determine time-dependent effects. These methods have direct impact on the quantitative measurement capabilities of near-filler interphase regions in polymers and composites and the long-term influence of environmental conditions on composites. In addition, quantitative AFAM scans were made on distal surfaces of human bicuspids and molars, to determine the microstructural and spatial variation in nanomechanical properties of the enamel biocomposite. Single point AFAM measurements were performed on individual enamel prism and sheath locations to determine spatial elastic modulus. Mechanical property variation of enamel is associated to the differences in the mineral to organic content and the apatite crystal orientations within the enamel microstructure. Also, variation in the elastic modulus of the enamel ultrastructure was observed in measurements at the outer enamel versus near the dentine enamel junction (DEJ).

  11. Calculators and Computers: Graphical Addition.

    ERIC Educational Resources Information Center

    Spero, Samuel W.

    1978-01-01

    A computer program is presented that generates problem sets involving sketching graphs of trigonometric functions using graphical addition. The students use calculators to sketch the graphs and a computer solution is used to check it. (MP)

  12. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  13. Quantitative matrix assisted plasma desorption mass spectrometry

    NASA Astrophysics Data System (ADS)

    Jungclas, Hartmut; Schmidt, Lothar; Köhl, Peter; Fritsch, Hans-Walter

    1993-07-01

    The development of optimized sample preparation methods has accompanied the history of successful applications of 252Cf-PDMS. To study the pharmacokinetics of the antineoplastic agent etoposide, serum samples from cancer patients were spiked with the homologous compound teniposide as an internal standard for quantitative PDMS analysis. Sample purification by chloroform extraction and by thin-layer chromatography proved insufficient to guarantee satisfactory final PDMS results. Embedding the purified sample in a matrix of suitable substances on the target reduced the negative influence of impurities, raised the signal-to-noise ratio of the molecular ions, and improved the reproducibility of calibration. This preparation method was again employed successfully for the quantitative analysis of the cytostatic drug doxorubicin. The application of a different matrix, optimized for the preparation of this anthracycline and its homologous compound daunorubicin, improved the sensitivity, linearity, and detection limit.
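
    Quantitation against an internal standard such as teniposide ultimately reduces to calibrating the analyte-to-standard signal ratio against known concentrations. The sketch below illustrates that calculation with invented calibration points and sample readings; it is not the calibration used in the cited work.

```python
# Sketch: internal-standard quantitation of the kind described above
# (etoposide quantified against a teniposide internal standard). The
# calibration points and sample reading are illustrative numbers only.
import numpy as np

# Calibration: analyte/IS peak-area ratio versus known analyte concentration.
conc_ug_ml = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
area_ratio = np.array([0.24, 0.51, 0.98, 2.55, 5.02])
slope, intercept = np.polyfit(conc_ug_ml, area_ratio, 1)

# Unknown serum sample spiked with the same amount of internal standard.
sample_ratio = 1.84
sample_conc = (sample_ratio - intercept) / slope
print(f"estimated analyte concentration: {sample_conc:.2f} ug/mL")
```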

  14. Germ cell quantitation in human testicular biopsy.

    PubMed

    Sinha Hikim, A P; Chakraborty, J; Jhunjhunwala, J S

    1985-01-01

    Quantitative analysis of human seminiferous epithelium was carried out using an improved method of glutaraldehyde and osmium fixation with plastic embedding. Part of each biopsy specimen was fixed in Bouin's fixative and embedded in paraffin for comparison. Epon embedded tissue had very little artifactual damage compared with paraffin embedded tissue sections. The germ cell to Sertoli cell ratios were determined by counting the various germ cells per "unit" tubular area. Data obtained by this method reflect a remarkable stability of Sertoli cell number and germ cell-Sertoli cell ratios both between biopsies from different individuals and between biopsies from right and left testes from the same individual. Agreement between the present results and those of earlier studies based on paraffin embedded testicular specimens supports the validity of this method of germ cell quantitation of human testicular biopsy samples. PMID:3927550

  15. Nanostructured surfaces investigated by quantitative morphological studies.

    PubMed

    Perani, Martina; Carapezzi, Stefania; Mutta, Geeta Rani; Cavalcoli, Daniela

    2016-05-01

    The morphology of different surfaces has been investigated by atomic force microscopy and quantitatively analyzed in this paper. Two different tools have been employed for this purpose: the analysis of the height-height correlation function and the determination of the mean grain size, which have been combined to obtain a complete characterization of the surfaces. Different materials have been analyzed: SiOxNy, InGaN/GaN quantum wells and Si nanowires, grown with different techniques. Notwithstanding the presence of grain-like structures on all the samples analyzed, they present very diverse surface designs, underlining that this procedure can be of general use. Our results show that the quantitative analysis of nanostructured surfaces allows us to obtain interesting information, such as grain clustering, from the comparison of the lateral correlation length and the grain size. PMID:27004458
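
    A height-height correlation function can be estimated directly from an AFM height map as the mean-squared height difference at increasing lateral lags, from which the roughness and lateral correlation length follow. The sketch below uses a synthetic random surface and a simple plateau criterion; both are illustrative assumptions rather than the authors' analysis.

```python
# Sketch: 1-D height-height correlation function g(r) along the fast-scan
# axis of an AFM height image, with roughness and correlation length read
# off. The surface is synthetic and the criterion is an assumption.
import numpy as np

rng = np.random.default_rng(1)
z = rng.normal(0.0, 1.0, size=(256, 256))   # stand-in height map (nm)
z -= z.mean()

max_lag = 64
g = np.array([np.mean((z[:, r:] - z[:, :z.shape[1] - r]) ** 2)
              for r in range(1, max_lag)])

sigma2 = z.var()   # g(r) saturates near 2*sigma^2 at large lags
# Correlation length: first lag where g(r) reaches (1 - 1/e) of its plateau.
xi_index = np.argmax(g >= 2 * sigma2 * (1 - np.exp(-1))) + 1
print(f"sigma^2 = {sigma2:.2f}, correlation length ~ {xi_index} pixels")
```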

  16. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  17. Nanostructured surfaces investigated by quantitative morphological studies

    NASA Astrophysics Data System (ADS)

    Perani, Martina; Carapezzi, Stefania; Rani Mutta, Geeta; Cavalcoli, Daniela

    2016-05-01

    The morphology of different surfaces has been investigated by atomic force microscopy and quantitatively analyzed in this paper. Two different tools have been employed for this purpose: the analysis of the height-height correlation function and the determination of the mean grain size, which have been combined to obtain a complete characterization of the surfaces. Different materials have been analyzed: SiOxNy, InGaN/GaN quantum wells and Si nanowires, grown with different techniques. Notwithstanding the presence of grain-like structures on all the samples analyzed, they present very diverse surface designs, underlining that this procedure can be of general use. Our results show that the quantitative analysis of nanostructured surfaces allows us to obtain interesting information, such as grain clustering, from the comparison of the lateral correlation length and the grain size.

  18. Polyolefins as additives in plastics

    SciTech Connect

    Deanin, R.D.

    1993-12-31

    Polyolefins are not only major commodity plastics - they are also very useful as additives, both in other polyolefins and also in other types of plastics. This review covers ethylene, propylene, butylene and isobutylene polymers, in blends with each other, and as additives to natural rubber, styrene/butadiene rubber, polystyrene, polyvinyl chloride, polymethyl methacrylate, polyphenylene oxide, polycarbonate, thermoplastic polyesters, polyurethanes, polyamides, and mixed automotive plastics recycling.

  19. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371

  20. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography.

    PubMed

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-07-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371
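
    The multigrid idea of solving the inverse problem at a coarse resolution first, then using the upsampled coarse solution to initialize the fine-resolution fixed-point iteration, can be sketched generically as below. The fixed-point map here is a toy smoothing-plus-data update, not the photoacoustic forward model of the papers above, and all values are illustrative assumptions.

```python
# Sketch: coarse-to-fine fixed-point iteration in the spirit of a multigrid
# inversion scheme. The update rule and data are toy stand-ins.
import numpy as np

def fixed_point_solve(data, x0, n_iter=50):
    x = x0.copy()
    for _ in range(n_iter):
        # Toy update: pull the estimate toward the data with mild smoothing.
        smoothed = np.convolve(x, [0.25, 0.5, 0.25], mode="same")
        x = 0.5 * smoothed + 0.5 * data
    return x

rng = np.random.default_rng(2)
true_profile = np.sin(np.linspace(0, np.pi, 256)) ** 2
data_fine = true_profile + 0.05 * rng.normal(size=256)

# Solve on a coarse grid first, then upsample the result as the fine-grid
# initial guess so fewer fine-grid iterations are needed.
data_coarse = data_fine[::4]
x_coarse = fixed_point_solve(data_coarse, np.zeros_like(data_coarse), n_iter=50)
x_init_fine = np.interp(np.arange(256), np.arange(0, 256, 4), x_coarse)
x_fine = fixed_point_solve(data_fine, x_init_fine, n_iter=10)
print(f"fine-grid RMS error: {np.sqrt(np.mean((x_fine - true_profile) ** 2)):.3f}")
```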

  1. Simulating heat addition via mass addition in constant area compressible flows

    NASA Astrophysics Data System (ADS)

    Heiser, W. H.; McClure, W. B.; Wood, C. W.

    1995-01-01

    The study demonstrated the striking similarity between the influence of heat addition and that of mass addition on compressible flows. These results encourage the belief that relatively modest laboratory experiments employing mass addition can be devised that will reproduce the leading phenomena of heat addition, such as the axial variation of properties, choking, and wall-boundary-layer separation. These findings suggest that some aspects of the complex behavior of dual-mode ramjet/scramjet combustors could be experimentally evaluated or demonstrated by replacing combustion with less expensive, more easily controlled, and safer mass addition.

  2. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
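
    Stiffness as the slope of the moment–displacement curve is obtained from a straight-line fit over the initial elastic region. A sketch with invented test data and an assumed fitting window is shown below.

```python
# Sketch: estimating bending stiffness as the slope of the moment-displacement
# curve from a 4-point bend test, via a linear fit over the elastic region.
# The data values and the fitting window are illustrative assumptions.
import numpy as np

displacement_mm = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25, 0.30])
moment_nmm      = np.array([0.0,  2.1,  4.0,  6.2,  8.1,  9.0,  9.3])

# Fit only the initial, approximately linear portion of the curve.
elastic = displacement_mm <= 0.20
slope, intercept = np.polyfit(displacement_mm[elastic], moment_nmm[elastic], 1)
print(f"stiffness ~ {slope:.1f} N*mm per mm of displacement")
```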

  3. Evaluation of Quantitative Performance of Sequential Immobilized Metal Affinity Chromatographic Enrichment for Phosphopeptides

    PubMed Central

    Sun, Zeyu; Hamilton, Karyn L.; Reardon, Kenneth F.

    2014-01-01

    We evaluated a sequential elution protocol from immobilized metal affinity chromatography (SIMAC) employing gallium-based immobilized metal affinity chromatography (IMAC) in conjunction with titanium-dioxide-based metal oxide affinity chromatography (MOAC). The quantitative performance of this SIMAC enrichment approach, assessed in terms of repeatability, dynamic range, and linearity, was evaluated using a mixture composed of tryptic peptides from caseins, bovine serum albumin, and phosphopeptide standards. While our data demonstrate the overall consistent performance of the SIMAC approach under various loading conditions, the results also revealed that the method had limited repeatability and linearity for most phosphopeptides tested, and different phosphopeptides were found to have different linear ranges. These data suggest that, unless additional strategies are used, SIMAC should be regarded as a semi-quantitative method when used in large-scale phosphoproteomics studies in complex backgrounds. PMID:24096195

  4. Quantitative three-dimensional photoacoustic tomography of the finger joints: an in vivo study

    NASA Astrophysics Data System (ADS)

    Sun, Yao; Sobel, Eric; Jiang, Huabei

    2009-11-01

    We present for the first time in vivo full three-dimensional (3-D) photoacoustic tomography (PAT) of the distal interphalangeal joint in a human subject. Both absorbed energy density and absorption coefficient images of the joint are quantitatively obtained using our finite-element-based photoacoustic image reconstruction algorithm coupled with the photon diffusion equation. The results show that major anatomical features in the joint along with the side arteries can be imaged with a 1-MHz transducer in a spherical scanning geometry. In addition, the cartilages associated with the joint can be quantitatively differentiated from the phalanx. This in vivo study suggests that the 3-D PAT method described has the potential to be used for early diagnosis of joint diseases such as osteoarthritis and rheumatoid arthritis.

  5. Key Parameters Affecting Quantitative Analysis of STEM-EDS Spectrum Images

    SciTech Connect

    Brewer, Luke; Parish, Chad M

    2010-06-01

    In this article, we use simulated and experimental data to explore how three operator-controllable parameters - (1) signal level, (2) detector resolution, and (3) number of factors chosen for analysis - affect quantitative analyses of scanning transmission electron microscopy-energy dispersive X-ray spectroscopy spectrum images processed by principal component analysis (PCA). We find that improvements in both signal level and detector resolution improve the precision of quantitative analyses, but that signal level is the most important. We also find that if the rank of the PCA solution is not chosen properly, it may be possible to improperly fit the underlying data and degrade the accuracy of results. Additionally, precision is degraded in the case when too many factors are included in the model.
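
    In practice, a spectrum image is unfolded into a pixels-by-channels matrix before PCA, and the analyst chooses how many factors (the rank) to retain. The sketch below shows that workflow on synthetic Poisson-noise data with an assumed rank of three; the data and rank are illustrative only.

```python
# Sketch: PCA of a spectrum image after unfolding it into (pixels x channels).
# Synthetic counts and the chosen rank are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
ny, nx, n_channels = 32, 32, 1024
cube = rng.poisson(lam=5.0, size=(ny, nx, n_channels)).astype(float)

X = cube.reshape(ny * nx, n_channels)            # one spectrum per row
pca = PCA(n_components=3).fit(X)                 # rank chosen by the analyst

scores = pca.transform(X).reshape(ny, nx, 3)     # component maps
loadings = pca.components_                       # spectral signatures
print("explained variance ratios:", pca.explained_variance_ratio_)
```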

  6. The evolution and extinction of the ichthyosaurs from the perspective of quantitative ecospace modelling.

    PubMed

    Dick, Daniel G; Maxwell, Erin E

    2015-07-01

    The role of niche specialization and narrowing in the evolution and extinction of the ichthyosaurs has been widely discussed in the literature. However, previous studies have concentrated on a qualitative discussion of these variables only. Here, we use the recently developed approach of quantitative ecospace modelling to provide a high-resolution quantitative examination of the changes in dietary and ecological niche experienced by the ichthyosaurs throughout their evolution in the Mesozoic. In particular, we demonstrate that despite recent discoveries increasing our understanding of taxonomic diversity among the ichthyosaurs in the Cretaceous, when viewed from the perspective of ecospace modelling, a clear trend of ecological contraction is visible as early as the Middle Jurassic. We suggest that this ecospace redundancy, if carried through to the Late Cretaceous, could have contributed to the extinction of the ichthyosaurs. Additionally, our results suggest a novel model to explain ecospace change, termed the 'migration model'. PMID:26156130

  7. The evolution and extinction of the ichthyosaurs from the perspective of quantitative ecospace modelling

    PubMed Central

    Dick, Daniel G.; Maxwell, Erin E.

    2015-01-01

    The role of niche specialization and narrowing in the evolution and extinction of the ichthyosaurs has been widely discussed in the literature. However, previous studies have concentrated on a qualitative discussion of these variables only. Here, we use the recently developed approach of quantitative ecospace modelling to provide a high-resolution quantitative examination of the changes in dietary and ecological niche experienced by the ichthyosaurs throughout their evolution in the Mesozoic. In particular, we demonstrate that despite recent discoveries increasing our understanding of taxonomic diversity among the ichthyosaurs in the Cretaceous, when viewed from the perspective of ecospace modelling, a clear trend of ecological contraction is visible as early as the Middle Jurassic. We suggest that this ecospace redundancy, if carried through to the Late Cretaceous, could have contributed to the extinction of the ichthyosaurs. Additionally, our results suggest a novel model to explain ecospace change, termed the ‘migration model’. PMID:26156130

  8. NASA Intellectual Property Negotiation Practices and their Relationship to Quantitative Measures of Technology Transfer

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.

    1997-01-01

    In the current political climate, NASA must be able to show reliable measures demonstrating successful technology transfer. The currently available quantitative data on intellectual property technology transfer efforts portray a less than successful performance. In this paper, the use of only quantitative values for measurement of technology transfer is shown to undervalue the effort. In addition, NASA's current policy in negotiating intellectual property rights results in undervalued royalty rates. NASA has maintained that its position of providing public good precludes it from negotiating fair market value for its technology and instead has negotiated for reasonable cost in order to recover processing fees. This measurement issue is examined and recommendations are made, including a new policy regarding intellectual property rights negotiation and two measures to supplement the intellectual property measures.

  9. Quantitative analysis of the relationship between nucleotide sequence and functional activity.

    PubMed Central

    Stormo, G D; Schneider, T D; Gold, L

    1986-01-01

    Matrices can be used to evaluate sequences for functional activity. Multiple regression can solve for the matrix that gives the best fit between sequence evaluations and quantitative activities. This analysis shows that the best model for context effects on suppression by su2 involves primarily the two nucleotides 3' to the amber codon, and that their contributions are independent and additive. Context effects on 2AP mutagenesis also involve the two nucleotides 3' to the 2AP insertion, but their effects are not independent. In a construct for producing beta-galactosidase, the effects on translational yields of the tri-nucleotide 5' to the initiation codon are dependent on the entire triplet. Models based on these quantitative results are presented for each of the examples. PMID:3092188
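
    The matrix evaluation described above is equivalent to a linear (additive) regression on a one-hot encoding of the sequence positions. The sketch below fits such a matrix by least squares to toy context sequences and activities; the sequences, activity values, and RNA alphabet are assumptions for illustration.

```python
# Sketch: fitting a position weight matrix to quantitative activities by
# multiple regression on a one-hot sequence encoding, in the spirit of the
# matrix evaluation described in the abstract. Data are toy values.
import numpy as np

def one_hot(seq, order="ACGU"):
    return np.array([1.0 if n == b else 0.0 for b in seq for n in order])

# Context dinucleotides 3' of a site, with hypothetical measured activities.
seqs = ["AA", "AC", "CA", "GU", "UU", "GG"]
activity = np.array([0.90, 0.75, 0.60, 0.30, 0.20, 0.25])

X = np.array([one_hot(s) for s in seqs])
weights, *_ = np.linalg.lstsq(X, activity, rcond=None)   # matrix entries
predicted = X @ weights                                   # additive model
print(np.round(predicted, 2))
```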

  10. Profiling and Quantitation of Bacterial Carotenoids by Liquid Chromatography and Photodiode Array Detection

    PubMed Central

    Nelis, H. J.; De Leenheer, A. P.

    1989-01-01

    An analytical method for the profiling and quantitative determination of carotenoids in bacteria is described. Exhaustive extraction of the pigments from four selected bacterial strains required treatment of the cells with potassium hydroxide or liquefied phenol or both before the addition of the extracting solvent (methanol or diethyl ether). The carotenoids in the extracts were separated by nonaqueous reversed-phase liquid chromatography in conjunction with photodiode array absorption detection. The identity of a peak was considered definitive only when both its retention time and absorption spectrum, before and after chemical reactions, matched those of a reference component. In the absence of the latter, most peaks could be tentatively identified. Two examples illustrate how in the analysis of pigmented bacteria errors may result from using nonchromatographic procedures or liquid chromatographic methods lacking sufficient criteria for peak identification. Carotenoids of interest were determined quantitatively when the authentic reference substance was available or, alternatively, were determined semiquantitatively. PMID:16348068

  11. Quantitative wake analysis of a freely swimming fish using 3D synthetic aperture PIV

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra H.

    2015-07-01

    Synthetic aperture PIV (SAPIV) is used to quantitatively analyze the wake behind a giant danio ( Danio aequipinnatus) swimming freely in a seeded quiescent tank. The experiment is designed with minimal constraints on animal behavior to ensure that natural swimming occurs. The fish exhibits forward swimming and turning behaviors at speeds between 0.9 and 1.5 body lengths/second. Results show clearly isolated and linked vortex rings in the wake structure, as well as the thrust jet coming off of a visual hull reconstruction of the fish body. As a benchmark for quantitative analysis of volumetric PIV data, the vortex circulation and impulse are computed using methods consistent with those applied to planar PIV data. Volumetric momentum analysis frameworks are discussed for linked and asymmetric vortex structures, laying a foundation for further volumetric studies of swimming hydrodynamics with SAPIV. Additionally, a novel weighted refocusing method is presented as an improvement to SAPIV reconstruction.
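
    For orientation, the circulation and hydrodynamic impulse computed in this study are standard quantities; the general definitions (an illustration, not necessarily the exact expressions used by the authors) are

        \Gamma = \oint_C \mathbf{u}\cdot d\boldsymbol{\ell}
               = \iint_A \boldsymbol{\omega}\cdot\hat{\mathbf{n}}\,\mathrm{d}A,
        \qquad
        \mathbf{I} \approx \rho\,\Gamma\,\pi R^{2}\,\hat{\mathbf{n}},

    where the circulation is evaluated on a plane cutting the vortex ring, R is the ring radius, rho is the fluid density, and the impulse expression assumes a thin-cored, isolated ring.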

  12. Resolving the Quantitative-Qualitative Dilemma: A Critical Realist Approach

    ERIC Educational Resources Information Center

    Scott, David

    2007-01-01

    The philosophical issues underpinning the quantitative-qualitative divide in educational research are examined. Three types of argument which support a resolution are considered: pragmatism, false duality and warranty through triangulation. In addition a number of proposed strategies--alignment, sequencing, translation and triangulation--are…

  13. Quantitative genomics of female reproduction

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Numerous quantitative trait loci (QTL) for reproductive traits in domestic livestock have been described in the literature. In this chapter, the components needed for detection of reproductive trait QTL are described, including collection of phenotypes, genotypes, and the appropriate statistical ana...

  14. Quantitative Research in Written Composition.

    ERIC Educational Resources Information Center

    Gebhard, Ann O.

    Offered as an introductory guide to teachers interested in approaching written English as a "second dialect" that students must master, this review covers quantitative investigations of written language. The first section deals with developmental studies, describing how a variety of researchers have related written structure to writer maturity.…

  15. Equilibria in Quantitative Reachability Games

    NASA Astrophysics Data System (ADS)

    Brihaye, Thomas; Bruyère, Véronique; de Pril, Julie

    In this paper, we study turn-based quantitative multiplayer non-zero-sum games played on finite graphs with reachability objectives. In this framework each player aims at reaching his own goal as soon as possible. We prove the existence of finite-memory Nash (resp. secure) equilibria in multiplayer (resp. two-player) games.

  16. Quantitative Literacy for Social Justice

    ERIC Educational Resources Information Center

    Wiest, Lynda R.; Higgins, Heidi J.; Frost, Janet Hart

    2007-01-01

    In this article, we argue that many adults lack the "numeracy" needed to function in a maximally effective manner in their vocational, civic, and personal lives. We believe schools need to foster skills in quantitative literacy (QL), an inclination and ability to make reasoned decisions using general world knowledge and fundamental mathematics in…

  17. Quantitative Genomics of Male Reproduction

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objective of the review was to establish the current status of quantitative genomics for male reproduction. Genetic variation exists for male reproduction traits. These traits are expensive and time consuming traits to evaluate through conventional breeding schemes. Genomics is an alternative to...

  18. Quantitative Reasoning in Problem Solving

    ERIC Educational Resources Information Center

    Ramful, Ajay; Ho, Siew Yin

    2015-01-01

    In this article, Ajay Ramful and Siew Yin Ho explain the meaning of quantitative reasoning, describing how it is used to solve mathematical problems. They also describe a diagrammatic approach to represent relationships among quantities and provide examples of problems and their solutions.

  19. Quantitative assessment of scientific quality

    NASA Astrophysics Data System (ADS)

    Heinzl, Harald; Bloching, Philipp

    2012-09-01

    Scientific publications, authors, and journals are commonly evaluated with quantitative bibliometric measures. Frequently-used measures will be reviewed and their strengths and weaknesses will be highlighted. Reflections about conditions for a new, research paper-specific measure will be presented.

  20. Pharmacological and Chemical Effects of Cigarette Additives

    PubMed Central

    Rabinoff, Michael; Caskey, Nicholas; Rissling, Anthony; Park, Candice

    2007-01-01

    We investigated tobacco industry documents and other sources for evidence of possible pharmacological and chemical effects of tobacco additives. Our findings indicated that more than 100 of 599 documented cigarette additives have pharmacological actions that camouflage the odor of environmental tobacco smoke emitted from cigarettes, enhance or maintain nicotine delivery, could increase the addictiveness of cigarettes, and mask symptoms and illnesses associated with smoking behaviors. Whether such uses were specifically intended for these agents is unknown. Our results provide a clear rationale for regulatory control of tobacco additives. PMID:17666709

  1. ADDITIVITY ASSESSMENT OF TRIHALOMETHANE MIXTURES BY PROPORTIONAL RESPONSE ADDITION

    EPA Science Inventory

    If additivity is known or assumed, the toxicity of a chemical mixture may be predicted from the dose response curves of the individual chemicals comprising the mixture. As single chemical data are abundant and mixture data sparse, mixture risk methods that utilize single chemical...

  2. Formation Of Cometary Hydrocarbons By Hydrogen Addition Reactions On Cold Grains

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hitomi; Watanabe, N.; Kawakita, H.; Fukushima, T.

    2012-10-01

    Hydrogen addition reactions on cold grains are considered to play an important role in forming many kinds of volatiles under low-temperature conditions such as those in molecular clouds or the early solar nebula. We can investigate the physical conditions (e.g., temperature, gas density, etc.) of the early solar nebula via the chemical properties of pristine bodies such as comets. Hydrocarbons such as C2H2 and C2H6 have been studied so far, and C2H6 might be a product of successive hydrogen addition to C2H2 on cold grains. To evaluate the efficiency of hydrogen addition reactions from C2H2 to C2H6 quantitatively, we conducted laboratory measurements of those reactions under multiple sample conditions (on H2O ice) at different temperatures (10, 20, 30 K) with the LASSIE apparatus at Hokkaido University. Our results provide more detailed information about these reactions than previous quantitative studies. We discuss the reaction rates obtained with the different samples and conditions.

  3. Dynamic quantitative photothermal monitoring of cell death of individual human red blood cells upon glucose depletion

    NASA Astrophysics Data System (ADS)

    Vasudevan, Srivathsan; Chen, George Chung Kit; Andika, Marta; Agarwal, Shuchi; Chen, Peng; Olivo, Malini

    2010-09-01

    Red blood cells (RBCs) have been found to undergo "programmed cell death," or eryptosis, and understanding this process can provide more information about apoptosis of nucleated cells. Photothermal (PT) response, a label-free noninvasive technique, is proposed as a tool to monitor the cell death process of living human RBCs upon glucose depletion. Since the physiological status of the dying cells is highly sensitive to photothermal parameters (e.g., thermal diffusivity, absorption, etc.), we applied linear PT response to continuously monitor the death mechanism of RBCs when depleted of glucose. The kinetics of the assay, in which the cell's PT response transforms from the linear to the nonlinear regime, is reported. In addition, quantitative monitoring was performed by extracting the relevant photothermal parameters from the PT response. A twofold increase in thermal diffusivity and a reduction in cell size were found in the linear PT response during cell death. Our results reveal that photothermal parameters change earlier than phosphatidylserine externalization (used for fluorescence studies), allowing us to detect the initial stage of eryptosis in a quantitative manner. Hence, the proposed tool, in addition to detecting eryptosis earlier than fluorescence, could also reveal the physiological status of the cells through quantitative photothermal parameter extraction.

  4. Gas Chromatographic Determination of Methyl Salicylate in Rubbing Alcohol: An Experiment Employing Standard Addition.

    ERIC Educational Resources Information Center

    Van Atta, Robert E.; Van Atta, R. Lewis

    1980-01-01

    Provides a gas chromatography experiment that exercises the quantitative technique of standard addition to the analysis for a minor component, methyl salicylate, in a commercial product, "wintergreen rubbing alcohol." (CS)
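
    Standard addition, the quantitative technique exercised in this experiment, reduces to a simple extrapolation. The sketch below (hypothetical peak areas, not data from the article) fits signal versus added analyte and reads the unknown concentration from the magnitude of the x-intercept:

        # Minimal sketch of a standard-addition calculation (hypothetical numbers).
        import numpy as np

        added = np.array([0.0, 1.0, 2.0, 3.0])      # added methyl salicylate, mg/mL
        signal = np.array([4.1, 8.0, 12.2, 15.9])   # corresponding peak areas

        slope, intercept = np.polyfit(added, signal, 1)
        c_unknown = intercept / slope                # |x-intercept| = original concentration
        print(f"estimated concentration: {c_unknown:.2f} mg/mL")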

  5. [INVITED] Lasers in additive manufacturing

    NASA Astrophysics Data System (ADS)

    Pinkerton, Andrew J.

    2016-04-01

    Additive manufacturing is a topic of considerable ongoing interest, with forecasts predicting it to have major impact on industry in the future. This paper focusses on the current status and potential future development of the technology, with particular reference to the role of lasers within it. It begins by making clear the types and roles of lasers in the different categories of additive manufacturing. This is followed by concise reviews of the economic benefits and disadvantages of the technology, current state of the market and use of additive manufacturing in different industries. Details of these fields are referenced rather than expanded in detail. The paper continues, focusing on current indicators to the future of additive manufacturing. Barriers to its development, trends and opportunities in major industrial sectors, and wider opportunities for its development are covered. Evidence indicates that additive manufacturing may not become the dominant manufacturing technology in all industries, but represents an excellent opportunity for lasers to increase their influence in manufacturing as a whole.

  6. Evaluation of certain food additives.

    PubMed

    2015-01-01

    This report represents the conclusions of a Joint FAO/WHO Expert Committee convened to evaluate the safety of various food additives, including flavouring agents, and to prepare specifications for identity and purity. The first part of the report contains a general discussion of the principles governing the toxicological evaluation of and assessment of dietary exposure to food additives, including flavouring agents. A summary follows of the Committee's evaluations of technical, toxicological and dietary exposure data for eight food additives (Benzoe tonkinensis; carrageenan; citric and fatty acid esters of glycerol; gardenia yellow; lutein esters from Tagetes erecta; octenyl succinic acid-modified gum arabic; octenyl succinic acid-modified starch; paprika extract; and pectin) and eight groups of flavouring agents (aliphatic and alicyclic hydrocarbons; aliphatic and aromatic ethers; ionones and structurally related substances; miscellaneous nitrogen-containing substances; monocyclic and bicyclic secondary alcohols, ketones and related esters; phenol and phenol derivatives; phenyl-substituted aliphatic alcohols and related aldehydes and esters; and sulfur-containing heterocyclic compounds). Specifications for the following food additives were revised: citric acid; gellan gum; polyoxyethylene (20) sorbitan monostearate; potassium aluminium silicate; and Quillaia extract (Type 2). Annexed to the report are tables summarizing the Committee's recommendations for dietary exposures to and toxicological evaluations of all of the food additives and flavouring agents considered at this meeting. PMID:26118220

  7. Limits of quantitation - Yet another suggestion

    NASA Astrophysics Data System (ADS)

    Carlson, Jill; Wysoczanski, Artur; Voigtman, Edward

    2014-06-01

    The work presented herein suggests that the limit of quantitation concept may be rendered substantially less ambiguous and ultimately more useful as a figure of merit by basing it upon the significant figure and relative measurement error ideas due to Coleman, Auses and Gram, coupled with the correct instantiation of Currie's detection limit methodology. Simple theoretical results are presented for a linear, univariate chemical measurement system with homoscedastic Gaussian noise, and these are tested against both Monte Carlo computer simulations and laser-excited molecular fluorescence experimental results. Good agreement among experiment, theory and simulation is obtained and an easy extension to linearly heteroscedastic Gaussian noise is also outlined.
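
    For orientation only, the conventional background definition that such proposals seek to make less ambiguous ties the limit of quantitation to the blank standard deviation sigma_0 and the calibration slope m:

        x_{\mathrm{LOQ}} = \frac{k_Q\,\sigma_0}{m}, \qquad k_Q = 10 \text{ by convention.}

    The paper's contribution, as summarized above, is to base the figure of merit on significant-figure and relative-measurement-error arguments combined with Currie's detection-limit framework rather than on an arbitrary factor.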

  8. Manipulating crystallization with molecular additives.

    PubMed

    Shtukenberg, Alexander G; Lee, Stephanie S; Kahr, Bart; Ward, Michael D

    2014-01-01

    Given the importance of organic crystals in a wide range of industrial applications, the chemistry, biology, materials science, and chemical engineering communities have focused considerable attention on developing methods to control crystal structure, size, shape, and orientation. Tailored additives have been used to control crystallization to great effect, presumably by selectively binding to particular crystallographic surfaces and sites. However, substantial knowledge gaps still exist in the fundamental mechanisms that govern the formation and growth of organic crystals in both the absence and presence of additives. In this review, we highlight research discoveries that reveal the role of additives, either introduced by design or present adventitiously, on various stages of formation and growth of organic crystals, including nucleation, dislocation spiral growth mechanisms, growth inhibition, and nonclassical crystal morphologies. The insights from these investigations and others of their kind are likely to guide the development of innovative methods to manipulate crystallization for a wide range of materials and applications. PMID:24579880

  9. Additive Manufacturing of Hybrid Circuits

    NASA Astrophysics Data System (ADS)

    Sarobol, Pylin; Cook, Adam; Clem, Paul G.; Keicher, David; Hirschfeld, Deidre; Hall, Aaron C.; Bell, Nelson S.

    2016-07-01

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. Finally, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  10. Tougher Addition Polyimides Containing Siloxane

    NASA Technical Reports Server (NTRS)

    St. Clair, T. L.; Maudgal, S.

    1986-01-01

    Laminates show increased impact resistances and other desirable mechanical properties. Bismaleamic acid extended by reaction of diaminosiloxane with maleic anhydride in 1:1 molar ratio, followed by reaction with half this molar ratio of aromatic dianhydride. Bismaleamic acid also extended by reaction of diaminosiloxane with maleic anhydride in 1:2 molar ratio, followed by reaction with half this molar ratio of aromatic diamine (Michael-addition reaction). Impact resistances improved over those of unmodified bismaleimide, showing significant increase in toughness. Aromatic addition polyimides developed as both matrix and adhesive resins for applications on future aircraft and spacecraft.

  11. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. PMID:26763302
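
    Two common normalization strategies of the kind surveyed in this review can be sketched in a few lines. The example below (generic methods with hypothetical intensities; it does not reproduce any specific method evaluated in the review) shows total-sum normalization and a median-quotient, PQN-style normalization:

        # Minimal sketch: rows = samples, columns = metabolites (hypothetical data).
        import numpy as np

        X = np.array([[10., 20., 5.],      # sample 1
                      [30., 60., 15.],     # sample 2 (3x more total material)
                      [12., 18., 6.]])     # sample 3

        # 1) Total-sum normalization: scale every sample to the same total signal.
        X_sum = X / X.sum(axis=1, keepdims=True)

        # 2) Median-quotient (PQN-style) normalization: scale each sample by the
        #    median ratio of its intensities to a reference profile.
        reference = np.median(X, axis=0)
        quotients = X / reference
        X_pqn = X / np.median(quotients, axis=1, keepdims=True)

        print(X_sum)
        print(X_pqn)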

  12. NMR quantitation: influence of RF inhomogeneity

    PubMed Central

    Mo, Huaping; Harwood, John; Raftery, Daniel

    2016-01-01

    The NMR peak integral is ideally linearly dependent on the sine of excitation angle (θ), which has provided unsurpassed flexibility in quantitative NMR by allowing the use of a signal of any concentration as the internal concentration reference. Controlling the excitation angle is particularly critical for solvent proton concentration referencing to minimize the negative impact of radiation damping, and to reduce the risk of receiver gain compression. In practice, due to the influence of RF inhomogeneity for any given probe, the observed peak integral is not exactly proportional to sin θ. To evaluate the impact quantitatively, we introduce a RF inhomogeneity factor I(θ) as a function of the nominal pulse excitation angle and propose a simple calibration procedure. Alternatively, I(θ) can be calculated from the probe’s RF profile, which can be readily obtained as a gradient image of an aqueous sample. Our results show that without consideration of I(θ), even for a probe with good RF homogeneity, up to 5% error can be introduced due to different excitation pulse angles used for the analyte and the reference. Hence, a simple calibration of I(θ) can eliminate such errors and allow an accurate description of the observed NMR signal’s dependence on the excitation angle in quantitative analysis. PMID:21919056
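
    One way to picture the calibration of I(theta) described above is to compare measured peak integrals against the ideal sin(theta) dependence. The sketch below uses hypothetical integrals and an assumed normalization I(90 deg) = 1; it illustrates the idea rather than the paper's exact procedure:

        # Minimal sketch: estimate the RF-inhomogeneity factor from integrals
        # measured at several nominal pulse angles (hypothetical values).
        import numpy as np

        theta_deg = np.array([10., 30., 45., 60., 90.])
        integrals = np.array([0.176, 0.505, 0.700, 0.845, 0.960])

        theta = np.radians(theta_deg)
        A = integrals[theta_deg == 90.][0]            # fixes I(90 deg) = 1
        I_theta = integrals / (A * np.sin(theta))

        for t, i in zip(theta_deg, I_theta):
            print(f"I({t:.0f} deg) = {i:.3f}")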

  13. Quantitative Ultrasound Assessment of the Rat Cervix

    PubMed Central

    McFarlin, Barbara L.; O’Brien, William D.; Oelze, Michael L.; Zachary, James F.; White-Traut, Rosemary C.

    2009-01-01

    Objective The purpose of this research was to detect cervical ripening with a new quantitative ultrasound technique. Methods Cervices of 13 nonpregnant and 65 timed pregnant (days 15, 17, 19, 20, and 21 of pregnancy) Sprague Dawley rats were scanned ex vivo with a 70-MHz ultrasound transducer. Ultrasound scatterer property estimates (scatterer diameter [SD], acoustic concentration [AC], and scatterer strength factor [SSF]) from the cervices were quantified and then compared to hydroxyproline and water content. Insertion loss (attenuation) was measured in 3 rats in each of the 6 groups. Discriminant analysis was used to predict gestational age group (cervical ripening) from the ultrasound variables SD, SSF, and AC. Results Differences were observed between the groups (SD, AC, and SSF; P < .0001). Quantitative ultrasound measures changed as the cervix ripened: (1) SD increased from days 15 to 21; (2) AC decreased from days 15 to 21; and (3) SSF was the greatest in the nonpregnant group and the least in the day 21 group. Cervix hydroxyproline content increased as pregnancy progressed (P < .003) and correlated with group, SD, AC, and SSF (P < .001). Discriminant analysis of ultrasound variables predicted 56.4% of gestational group assignment (P < .001) and increased to 77% within 2 days of the predicted analysis. Cervix insertion loss was greatest for the nonpregnant group and least for the day 21 group. Conclusions Quantitative ultrasound predicted cervical ripening in the rat cervix, but before use in humans, quantitative ultrasound will need to predict gestational age in the later days of gestation with more precision. PMID:16870896

  14. Quantitative risk assessment of durable glass fibers.

    PubMed

    Fayerweather, William E; Eastes, Walter; Cereghini, Francesco; Hadley, John G

    2002-06-01

    This article presents a quantitative risk assessment for the theoretical lifetime cancer risk from the manufacture and use of relatively durable synthetic glass fibers. More specifically, we estimate levels of exposure to respirable fibers or fiberlike structures of E-glass and C-glass that, assuming a working lifetime exposure, pose a theoretical lifetime cancer risk of not more than 1 per 100,000. For comparability with other risk assessments we define these levels as nonsignificant exposures. Nonsignificant exposure levels are estimated from (a) the Institute of Occupational Medicine (IOM) chronic rat inhalation bioassay of durable E-glass microfibers, and (b) the Research Consulting Company (RCC) chronic inhalation bioassay of durable refractory ceramic fibers (RCF). Best estimates of nonsignificant E-glass exposure exceed 0.05-0.13 fibers (or shards) per cubic centimeter (cm3) when calculated from the multistage nonthreshold model. Best estimates of nonsignificant C-glass exposure exceed 0.27-0.6 fibers/cm3. Estimates of nonsignificant exposure increase markedly for E- and C-glass when non-linear models are applied and rapidly exceed 1 fiber/cm3. Controlling durable fiber exposures to an 8-h time-weighted average of 0.05 fibers/cm3 will assure that the additional theoretical lifetime risk from working lifetime exposures to these durable fibers or shards is kept below the 1 per 100,000 level. Measured airborne exposures to respirable, durable glass fibers (or shards) in glass fiber manufacturing and fabrication operations were compared with the nonsignificant exposure estimates described. Sampling results for B-sized respirable E-glass fibers at facilities that manufacture or fabricate small-diameter continuous-filament products, from those that manufacture respirable E-glass shards from PERG (process to efficiently recycle glass), from milled fiber operations, and from respirable C-glass shards from Flakeglass operations indicate very low median exposures of 0

  15. Does finger sense predict addition performance?

    PubMed

    Newman, Sharlene D

    2016-05-01

    The impact of fingers on numerical and mathematical cognition has received a great deal of attention recently. However, the precise role that fingers play in numerical cognition is unknown. The current study explores the relationship between finger sense, arithmetic and general cognitive ability. Seventy-six children between the ages of 5 and 12 participated in the study. The results of stepwise multiple regression analyses demonstrated that while general cognitive ability including language processing was a predictor of addition performance, finger sense was not. The impact of age on the relationship between finger sense, and addition was further examined. The participants were separated into two groups based on age. The results showed that finger gnosia score impacted addition performance in the older group but not the younger group. These results appear to support the hypothesis that fingers provide a scaffold for calculation and that if that scaffold is not properly built, it has continued differential consequences to mathematical cognition. PMID:26993292

  16. Thermal diffusivity estimation with quantitative pulsed phase thermography

    NASA Astrophysics Data System (ADS)

    Ospina-Borras, J. E.; Florez-Ospina, Juan F.; Benitez-Restrepo, H. D.; Maldague, X.

    2015-05-01

    Quantitative Pulsed Phase Thermography (PPT) has so far been used only to estimate defect parameters such as depth and thermal resistance. Here, we propose a thermal-quadrupole-based method that extends quantitative pulsed phase thermography. This approach estimates thermal diffusivity by solving an inverse problem based on non-linear least-squares estimation. The approach is tested with pulsed thermography data acquired from a composite sample. We compare our results with another technique established in the time domain. The proposed quantitative analysis with PPT provides estimates of thermal diffusivity close to those obtained with the time-domain approach. This estimation requires only a priori knowledge of the sample thickness.
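
    The inverse-problem structure described above can be illustrated with a generic non-linear least-squares fit. The sketch below uses a placeholder one-parameter forward model (a simple exponential phase-contrast term), not the thermal-quadrupole model of the paper; the frequencies, phases, and assumed depth are hypothetical:

        # Minimal sketch: recover a diffusivity-like parameter by non-linear
        # least squares against phase data (placeholder forward model).
        import numpy as np
        from scipy.optimize import least_squares

        freqs = np.array([0.05, 0.1, 0.2, 0.4, 0.8])           # modulation frequencies, Hz
        phase_meas = np.array([0.52, 0.38, 0.26, 0.17, 0.11])   # measured contrast (hypothetical)
        z = 1.0e-3                                               # assumed depth, m

        def forward(alpha):
            mu = np.sqrt(alpha / (np.pi * freqs))   # thermal diffusion length
            return np.exp(-z / mu)                  # placeholder contrast model

        res = least_squares(lambda a: forward(a[0]) - phase_meas,
                            x0=[1e-7], bounds=([1e-9], [1e-4]))
        print("estimated diffusivity (m^2/s):", res.x[0])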

  17. Natural bacterial communities serve as quantitative geochemical biosensors

    DOE PAGES Beta

    Smith, Mark B.; Rocha, Andrea M.; Smillie, Chris S.; Olesen, Scott W.; Paradis, Charles; Wu, Liyou; Campbell, James H.; Fortney, Julian L.; Mehlhorn, Tonia L.; Lowe, Kenneth A.; et al

    2015-05-12

    Biological sensors can be engineered to measure a wide range of environmental conditions. Here we show that statistical analysis of DNA from natural microbial communities can be used to accurately identify environmental contaminants, including uranium and nitrate at a nuclear waste site. In addition to contamination, sequence data from the 16S rRNA gene alone can quantitatively predict a rich catalogue of 26 geochemical features collected from 93 wells with highly differing geochemistry characteristics. We extend this approach to identify sites contaminated with hydrocarbons from the Deepwater Horizon oil spill, finding that altered bacterial communities encode a memory of prior contamination, even after the contaminants themselves have been fully degraded. We show that the bacterial strains that are most useful for detecting oil and uranium are known to interact with these substrates, indicating that this statistical approach uncovers ecologically meaningful interactions consistent with previous experimental observations. Future efforts should focus on evaluating the geographical generalizability of these associations. Taken as a whole, these results indicate that ubiquitous, natural bacterial communities can be used as in situ environmental sensors that respond to and capture perturbations caused by human impacts. These in situ biosensors rely on environmental selection rather than directed engineering, and so this approach could be rapidly deployed and scaled as sequencing technology continues to become faster, simpler, and less expensive. Here we show that DNA from natural bacterial communities can be used as a quantitative biosensor to accurately distinguish unpolluted sites from those contaminated with uranium, nitrate, or oil. These results indicate that bacterial communities can be used as environmental sensors that respond to and capture perturbations caused by human impacts.

  18. Natural bacterial communities serve as quantitative geochemical biosensors

    SciTech Connect

    Smith, Mark B.; Rocha, Andrea M.; Smillie, Chris S.; Olesen, Scott W.; Paradis, Charles; Wu, Liyou; Campbell, James H.; Fortney, Julian L.; Mehlhorn, Tonia L.; Lowe, Kenneth A.; Earles, Jennifer E.; Phillips, Jana; Techtmann, Steve M.; Joyner, Dominique C.; Elias, Dwayne A.; Bailey, Kathryn L.; Hurt, Richard A.; Preheim, Sarah P.; Sanders, Matthew C.; Yang, Joy; Mueller, Marcella A.; Brooks, Scott; Watson, David B.; Zhang, Ping; He, Zhili; Dubinsky, Eric A.; Adams, Paul D.; Arkin, Adam P.; Fields, Matthew W.; Zhou, Jizhong; Alm, Eric J.; Hazen, Terry C.

    2015-05-12

    Biological sensors can be engineered to measure a wide range of environmental conditions. Here we show that statistical analysis of DNA from natural microbial communities can be used to accurately identify environmental contaminants, including uranium and nitrate at a nuclear waste site. In addition to contamination, sequence data from the 16S rRNA gene alone can quantitatively predict a rich catalogue of 26 geochemical features collected from 93 wells with highly differing geochemistry characteristics. We extend this approach to identify sites contaminated with hydrocarbons from the Deepwater Horizon oil spill, finding that altered bacterial communities encode a memory of prior contamination, even after the contaminants themselves have been fully degraded. We show that the bacterial strains that are most useful for detecting oil and uranium are known to interact with these substrates, indicating that this statistical approach uncovers ecologically meaningful interactions consistent with previous experimental observations. Future efforts should focus on evaluating the geographical generalizability of these associations. Taken as a whole, these results indicate that ubiquitous, natural bacterial communities can be used as in situ environmental sensors that respond to and capture perturbations caused by human impacts. These in situ biosensors rely on environmental selection rather than directed engineering, and so this approach could be rapidly deployed and scaled as sequencing technology continues to become faster, simpler, and less expensive. Here we show that DNA from natural bacterial communities can be used as a quantitative biosensor to accurately distinguish unpolluted sites from those contaminated with uranium, nitrate, or oil. These results indicate that bacterial communities can be used as environmental sensors that respond to and capture perturbations caused by human impacts.

  19. Promoting Additive Acculturation in Schools.

    ERIC Educational Resources Information Center

    Gibson, Margaret A.

    1995-01-01

    A study focusing on 113 ninth graders of Mexican descent indicates that most students and their parents adhere to a strategy of additive acculturation (incorporating skills of the new culture and language), but that the school curriculum and general school climate devalue Mexican culture. (SLD)

  20. Individualized Additional Instruction for Calculus

    ERIC Educational Resources Information Center

    Takata, Ken

    2010-01-01

    College students enrolling in the calculus sequence have a wide variance in their preparation and abilities, yet they are usually taught from the same lecture. We describe another pedagogical model of Individualized Additional Instruction (IAI) that assesses each student frequently and prescribes further instruction and homework based on the…

  1. Out of bounds additive manufacturing

    SciTech Connect

    Holshouser, Chris; Newell, Clint; Palas, Sid; Love, Lonnie J.; Kunc, Vlastimil; Lind, Randall F.; Lloyd, Peter D.; Rowe, John C.; Blue, Craig A.; Duty, Chad E.; Peter, William H.; Dehoff, Ryan R.

    2013-03-01

    Lockheed Martin and Oak Ridge National Laboratory are working on an additive manufacturing system capable of manufacturing components measured not in terms of inches or feet, but multiple yards in all dimensions with the potential to manufacture parts that are completely unbounded in size.

  2. The Additive Property of Energy.

    ERIC Educational Resources Information Center

    Tsaoussis, Dimitris S.

    1995-01-01

    Presents exercises that analyze the additive property of energy. Concludes that if a body has more than one component of energy depending on the same physical quantity, the body's total energy will be the algebraic sum of the components if a linear relationship exists between the energy components and that physical quantity. (JRH)

  3. Tinkertoy Color-Addition Device.

    ERIC Educational Resources Information Center

    Ferguson, Joe L.

    1995-01-01

    Describes construction and use of a simple home-built device, using an overhead projector, for use in demonstrations of the addition of various combinations of red, green, and blue light. Useful in connection with discussions of color, color vision, or color television. (JRH)

  4. Silage Additives and Management Issues

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Inoculants are the most common silage additives in the United States. These products contain lactic acid bacteria to supplement the lactic acid bacteria naturally on the crop and help insure a consistent fermentation in the silo. There are three types of inoculants: homofermentative lactic acid bact...

  5. Tetrasulfide extreme pressure lubricant additives

    SciTech Connect

    Gast, L.E.; Kenney, H.E.; Schwab, A.W.

    1980-08-19

    A novel class of compounds has been prepared comprising the tetrasulfides of C18 hydrocarbons, C18 fatty acids, and C18 fatty, alkyl, and triglyceride esters. These tetrasulfides are useful as extreme pressure lubricant additives and show potential as replacements for sulfurized sperm whale oil.

  6. Thermographic detection and quantitative characterization of corrosion by application of thermal line source

    NASA Astrophysics Data System (ADS)

    Cramer, K. Elliott; Winfree, William P.; Reid, Dan; Johnson, Jane

    1999-02-01

    Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor intensive and slow. This has resulted in a 'spot check' approach to inspections, making thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in the inspection speed for large structures such as boiler waterwalls while still providing high-resolution thickness measurements. A theoretical basis for the technique will be presented, thus demonstrating the quantitative nature of the technique. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions will be presented to demonstrate the capabilities of the technique. Additionally, the results of applying this technology to actual waterwall tubing samples will be presented.

  7. Quantitative Species Measurements In Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.

    2003-01-01

    The capability of models and theories to accurately predict and describe the behavior of low gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flames species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry) was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and at 0-g in a vortex ring flame. In this paper, we report additional normal and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser. This is an external cavity diode laser (ECDL) which has the unique capability of high frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.

  8. Evidence for dose-additive effects of a type II pyrethroid mixture. In vitro assessment.

    PubMed

    Romero, A; Ares, I; Ramos, E; Castellano, V; Martínez, M; Martínez-Larrañaga, M R; Anadón, A; Martínez, M A

    2015-04-01

    Despite the widespread use of pyrethroid insecticides, which has led to common exposure in the population, few studies have been conducted to quantitatively assess dose-additive effects of pyrethroids using a functional measure involved in the common toxic mode of action. The aim of this study was to evaluate the potency and efficacy of 6 Type II pyrethroids (α-cypermethrin, cyfluthrin, λ-cyhalothrin, deltamethrin, cyphenothrin and esfenvalerate) to evoke induction of both nitric oxide and lipid peroxide levels, measured as malondialdehyde, in three in vitro models (SH-SY5Y, HepG2 and Caco-2 human cells), as well as to test the hypothesis of dose additivity for mixtures of these same 6 pyrethroids. Concentration-responses for the 6 pyrethroids were determined, as well as the response to mixtures of all 6 pyrethroids. Additivity was tested assuming a dose-additive model. The human neuroblastoma SH-SY5Y cell line was the most sensitive in vitro model. The rank order of potency in the SH-SY5Y cell viability MTT assay was deltamethrin>cyphenothrin>λ-cyhalothrin>cyfluthrin>esfenvalerate>α-cypermethrin. When the 6 pyrethroids were present in the mixture at an equitoxic mixing ratio, the action on nitric oxide (NO) and lipid peroxide (measured as malondialdehyde, MDA) production was consistent with a dose-additive model. The results of the present study are consistent with previous reports of additivity of pyrethroids in vivo and in vitro. PMID:25688004
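
    A generic statement of the dose-additive (Loewe additivity) null model that mixture studies of this kind test, given here for orientation rather than as the paper's specific statistical model: a mixture containing dose d_i of each component i is predicted to produce effect level x when

        \sum_{i=1}^{n} \frac{d_i}{\mathrm{ED}_{x,i}} = 1,

    where ED_{x,i} is the dose of component i alone that produces effect level x; sums below 1 at the observed equi-effective mixture dose suggest synergy, and sums above 1 suggest antagonism.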

  9. Evaluation of certain food additives.

    PubMed

    2012-01-01

    This report represents the conclusions of a Joint FAO/WHO Expert Committee convened to evaluate the safety of various food additives, including flavouring agents, with a view to concluding as to safety concerns and to preparing specifications for identity and purity. The first part of the report contains a general discussion of the principles governing the toxicological evaluation of and assessment of dietary exposure to food additives, including flavouring agents. A summary follows of the Committee's evaluations of technical, toxicological and dietary exposure data for five food additives (magnesium dihydrogen diphosphate; mineral oil (medium and low viscosity) classes II and III; 3-phytase from Aspergillus niger expressed in Aspergillus niger; serine protease (chymotrypsin) from Nocardiopsis prasina expressed in Bacillus licheniformis; and serine protease (trypsin) from Fusarium oxysporum expressed in Fusarium venenatum) and 16 groups of flavouring agents (aliphatic and aromatic amines and amides; aliphatic and aromatic ethers; aliphatic hydrocarbons, alcohols, aldehydes, ketones, carboxylic acids and related esters, sulfides, disulfides and ethers containing furan substitution; aliphatic linear alpha,beta-unsaturated aldehydes, acids and related alcohols, acetals and esters; amino acids and related substances; epoxides; furfuryl alcohol and related substances; linear and branched-chain aliphatic, unsaturated, unconjugated alcohols, aldehydes, acids and related esters; miscellaneous nitrogen-containing substances; phenol and phenol derivatives; pyrazine derivatives; pyridine, pyrrole and quinoline derivatives; saturated aliphatic acyclic branched-chain primary alcohols, aldehydes and acids; simple aliphatic and aromatic sulfides and thiols; sulfur-containing heterocyclic compounds; and sulfur-substituted furan derivatives). Specifications for the following food additives were revised: ethyl cellulose, mineral oil (medium viscosity), modified starches and titanium

  10. Active mineral additives of sapropel ashes

    NASA Astrophysics Data System (ADS)

    Khomich, V. A.; Danilina, E. V.; Krivonos, O. I.; Plaksin, G. V.

    2015-01-01

    The goal of the presented research is to establish a scientific rationale for the use of sapropel ashes as an active mineral additive. The research included producing active mineral additives from sapropels by thermal treatment at 850–900 °C and subsequent powdering, investigating the properties of a paste matrix with an ash additive, and studying the influence of the ash on the cement bonding agent. Thermogravimetric analysis and X-ray investigations allowed us to establish that during burning, organic substances are removed and clay minerals are dehydrated and their structure is broken. The chemical composition of the sapropel ashes was determined. The amorphous ash constituent is mainly formed from silica of the mineral sapropel fraction and from aluminosilicate gels resulting from clay mineral decomposition. Properties of PC 400 and PC 500A0 sapropel ash additives were studied. Adding ashes containing Glenium plasticizer to the cement increases paste matrix strength and considerably reduces its water absorption. X-ray phase analysis data show changes in the phase composition of the paste matrix with an ash additive. Ash additives produce a pozzolanic effect on the cement bonding agent. In addition, an ash additive, owing to its aluminosilicate gel content, causes a transformation from unstable calcium aluminate forms to stable ones.

  11. Measuring additive interaction using odds ratios

    PubMed Central

    Kalilani, Linda; Atashili, Julius

    2006-01-01

    Interaction measured on the additive scale has been argued to be better correlated with biologic interaction than when measured on the multiplicative scale. Measures of interaction on the additive scale have been developed using risk ratios. However, in studies that use odds ratios as the sole measure of effect, the calculation of these measures of additive interaction is usually performed by directly substituting odds ratios for risk ratios. Yet assessing additive interaction based on replacing risk ratios by odds ratios in formulas that were derived using the former may be erroneous. In this paper, we evaluate the extent to which three measures of additive interaction – the interaction contrast ratio (ICR), the attributable proportion due to interaction (AP), and the synergy index (S), estimated using odds ratios versus using risk ratios differ as the incidence of the outcome of interest increases in the source population and/or as the magnitude of interaction increases. Our analysis shows that the difference between the two depends on the measure of interaction used, the type of interaction present, and the baseline incidence of the outcome. Substituting odds ratios for risk ratios, when calculating measures of additive interaction, may result in misleading conclusions. Of the three measures, AP appears to be the most robust to this direct substitution. Formulas that use stratum specific odds and odds ratios to accurately calculate measures of additive interaction are presented. PMID:16620385
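
    The three measures discussed above have simple closed forms when stratum odds ratios are substituted directly for risk ratios, which is the practice whose accuracy the paper evaluates. The sketch below uses the standard textbook formulas with hypothetical odds ratios:

        # Minimal sketch: additive-interaction measures computed from stratum
        # odds ratios (hypothetical values, standard formulas).
        def additive_interaction(or11, or10, or01):
            icr = or11 - or10 - or01 + 1                  # interaction contrast ratio (RERI)
            ap = icr / or11                               # attributable proportion due to interaction
            s = (or11 - 1) / ((or10 - 1) + (or01 - 1))    # synergy index
            return icr, ap, s

        # or11: both exposures; or10: exposure A only; or01: exposure B only.
        icr, ap, s = additive_interaction(or11=4.5, or10=2.0, or01=1.8)
        print(f"ICR = {icr:.2f}, AP = {ap:.2f}, S = {s:.2f}")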

  12. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Its applications range from quantification of cellular, tissue, organismal and behavioral phenotypes of model organisms, to human facial phenotypes. The bio-imaging approaches to automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses. PMID:26850283

  13. Quantitative Imaging Biomarkers of NAFLD

    PubMed Central

    Kinner, Sonja; Reeder, Scott B.

    2016-01-01

    Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination—a virtual biopsy has become a clinical reality. In this article, we will review the utility and limitation of conventional US, CT, and MR imaging for the diagnosis NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed with an emphasis in multi-parametric quantitative MRI. PMID:26848588

  14. Quantitative Imaging Biomarkers of NAFLD.

    PubMed

    Kinner, Sonja; Reeder, Scott B; Yokoo, Takeshi

    2016-05-01

    Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination-a virtual biopsy has become a clinical reality. In this article, we will review the utility and limitation of conventional US, CT, and MR imaging for the diagnosis NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed with an emphasis in multi-parametric quantitative MRI. PMID:26848588

  15. Image analysis and quantitative morphology.

    PubMed

    Mandarim-de-Lacerda, Carlos Alberto; Fernandes-Santos, Caroline; Aguila, Marcia Barbosa

    2010-01-01

    Quantitative studies are increasingly found in the literature, particularly in the fields of development/evolution, pathology, and neurosciences. Image digitalization converts tissue images into a numeric form by dividing them into very small regions termed picture elements or pixels. Image analysis allows automatic morphometry of digitalized images, and stereology aims to understand the structural inner three-dimensional arrangement based on the analysis of slices showing two-dimensional information. To quantify morphological structures in an unbiased and reproducible manner, appropriate isotropic and uniform random sampling of sections, and updated stereological tools are needed. Through the correct use of stereology, a quantitative study can be performed with little effort; efficiency in stereology means as little counting as possible (little work), low cost (section preparation), but still good accuracy. This short text provides a background guide for non-expert morphologists. PMID:19960334

  16. Two additional principles for determining which species to monitor.

    PubMed

    Wilson, Howard B; Rhodes, Jonathan R; Possingham, Hugh P

    2015-11-01

    Monitoring to detect population declines is widespread, but also costly. There is, consequently, a need to optimize monitoring to maximize cost-effectiveness. Here we develop a quantitative decision analysis framework for how to optimally allocate resources for monitoring among species. By keeping the framework simple, we analytically establish two new principles about which species are optimal to monitor for detecting declines: (1) those that lie on the boundary between species being allocated resources for conservation action and species that are not and (2) those with the greatest uncertainty in whether they are declining. These two principles are in addition to other factors that are also important in monitoring decisions, such as complementarity. We demonstrate the efficacy of these principles when other factors are not present, and show how the two principles can be combined. This analysis demonstrates that the most cost-effective species to monitor are ones where the information gained from monitoring is most likely to change the allocation of funds for action, not necessarily the most vulnerable or endangered. We suggest these results are general and apply to all ecological monitoring, not just of biological species: monitoring and information are only valuable when they are likely to change how people act. PMID:27070020

  17. Quantitative computed tomography–derived clusters: Redefining airway remodeling in asthmatic patients

    PubMed Central

    Gupta, Sumit; Hartley, Ruth; Khan, Umair T.; Singapuri, Amisha; Hargadon, Beverly; Monteiro, William; Pavord, Ian D.; Sousa, Ana R.; Marshall, Richard P.; Subramanian, Deepak; Parr, David; Entwisle, James J.; Siddiqui, Salman; Raj, Vimal; Brightling, Christopher E.

    2014-01-01

    Background Asthma heterogeneity is multidimensional and requires additional tools to unravel its complexity. Computed tomography (CT)–assessed proximal airway remodeling and air trapping in asthmatic patients might provide new insights into underlying disease mechanisms. Objectives The aim of this study was to explore novel, quantitative, CT-determined asthma phenotypes. Methods Sixty-five asthmatic patients and 30 healthy subjects underwent detailed clinical, physiologic characterization and quantitative CT analysis. Factor and cluster analysis techniques were used to determine 3 novel, quantitative, CT-based asthma phenotypes. Results Patients with severe and mild-to-moderate asthma demonstrated smaller mean right upper lobe apical segmental bronchus (RB1) lumen volume (LV) in comparison with healthy control subjects (272.3 mm3 [SD, 112.6 mm3], 259.0 mm3 [SD, 53.3 mm3], 366.4 mm3 [SD, 195.3 mm3], respectively; P = .007) but no difference in RB1 wall volume (WV). Air trapping measured based on mean lung density expiratory/inspiratory ratio was greater in patients with severe and mild-to-moderate asthma compared with that seen in healthy control subjects (0.861 [SD, 0.05)], 0.866 [SD, 0.07], and 0.830 [SD, 0.06], respectively; P = .04). The fractal dimension of the segmented airway tree was less in asthmatic patients compared with that seen in control subjects (P = .007). Three novel, quantitative, CT-based asthma clusters were identified, all of which demonstrated air trapping. Cluster 1 demonstrates increased RB1 WV and RB1 LV but decreased RB1 percentage WV. On the contrary, cluster 3 subjects have the smallest RB1 WV and LV values but the highest RB1 percentage WV values. There is a lack of proximal airway remodeling in cluster 2 subjects. Conclusions Quantitative CT analysis provides a new perspective in asthma phenotyping, which might prove useful in patient selection for novel therapies. PMID:24238646

  18. Imaging Performance of Quantitative Transmission Ultrasound

    PubMed Central

    Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott

    2015-01-01

    Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918

  19. A quantitative ELISA for dystrophin.

    PubMed

    Morris, G E; Ellis, J M; Nguyen, T M

    1993-05-01

    A novel approach to the quantitation of the muscular dystrophy protein, dystrophin, in muscle extracts is described. The two-site ELISA uses two monoclonal antibodies against dystrophin epitopes which lie close together in the rod domain of the dystrophin molecule in order to minimize the effects of dystrophin degradation. Dystrophin is assayed in its native form by extracting with non-ionic detergents and avoiding the use of SDS. PMID:8486926

  20. Quantitative wave-particle duality

    NASA Astrophysics Data System (ADS)

    Qureshi, Tabish

    2016-07-01

    The complementary wave and particle character of quantum objects (or quantons) was pointed out by Niels Bohr. This wave-particle duality, in the context of the two-slit experiment, is here described not just as two extreme cases of wave and particle characteristics, but in terms of quantitative measures of these characteristics, known to follow a duality relation. A very simple and intuitive derivation of a closely related duality relation is presented, which should be understandable to the introductory student.
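
    The best-known relation of this kind, due to Greenberger and Yasin and to Englert, is quoted here for reference (the closely related relation derived in the paper may differ in detail):

        \mathcal{D}^{2} + \mathcal{V}^{2} \le 1,

    where V is the visibility of the interference fringes, D is the which-path distinguishability, and equality holds in the ideal pure-state case.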